Researchers analyzing complex multidimensional images may be able to save hundreds of terabytes of disk space, a team from PSC reported at the XSEDE16 supercomputing conference in Miami today. Their “virtual file system” software, now in development, will carry out image processing on the fly for any viewing software, saving vast amounts of storage by eliminating the need to maintain multiple copies of processed datasets.

“Let’s say you have 100 terabytes of electron microscopy data,” says Arthur Wetzel, principal computer scientist at PSC and first author of the peer-reviewed paper accompanying the presentation. Users begin analyzing the images as soon as they are available, but as the image processing progresses, improved versions of those images are produced. “Yet there’s something that they want to keep before working on the new images … Pretty soon this 100 terabytes has multiplied by at least eight times. That’s not practical for long-term storage.”

The virtual file system will solve this problem by keeping the raw images unchanged and storing only the data required to reproduce a processed image rather than the image itself, according to coauthor Jennifer Bakal, PSC public health applications programmer. It does this while producing output that can be read by any application expecting ordinary files. The software will in effect trade computational power for storage space, regenerating desired processed images on the fly instead of storing them.

Learn more at http://www.psc.edu/index.php/news-and-media/press-releases/2368-virtual-file-system-will-save-vast-computer-storage-space
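To make the storage trade concrete, here is a minimal sketch of the idea of storing a processing “recipe” instead of processed pixels. The file names, parameter names, and the gain/offset/crop pipeline are hypothetical illustrations, not details taken from the PSC paper:

```python
# Sketch: store the raw image once plus a small "recipe" of processing
# parameters, and recompute the processed image whenever it is needed.
import json
import numpy as np

def save_recipe(path, params):
    """Persist only the few bytes needed to reproduce a processed image."""
    with open(path, "w") as f:
        json.dump(params, f)

def regenerate(raw, recipe_path):
    """Recompute a processed image on the fly from raw data plus recipe."""
    with open(recipe_path) as f:
        p = json.load(f)
    # Hypothetical processing: brightness/contrast adjustment and a crop.
    out = np.clip(raw.astype(np.float32) * p["gain"] + p["offset"], 0, 255)
    r0, r1, c0, c1 = p["crop"]
    return out[r0:r1, c0:c1].astype(np.uint8)

# Stand-in for one slice of electron microscopy data.
raw = np.random.randint(0, 256, (1024, 1024), dtype=np.uint8)
save_recipe("slice_0001.json", {"gain": 1.2, "offset": -10.0,
                                "crop": [0, 512, 0, 512]})
processed = regenerate(raw, "slice_0001.json")
```

The recipe occupies tens of bytes, while the processed pixels it reproduces can run to megabytes per slice; repeated over many processing passes across 100 terabytes of raw data, that asymmetry is the storage saving the team describes.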
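The claim that output works with “any application expecting files” suggests a userspace file system that synthesizes file contents at read time. The sketch below uses the third-party fusepy library as one plausible way to express that; this is an assumption for illustration, and the paper does not say how PSC’s implementation presents its files:

```python
# Sketch, assuming fusepy: reads of a virtual file trigger regeneration
# of the processed image, so no processed copy is ever stored on disk.
import errno
import stat
from fuse import FUSE, FuseOSError, Operations

class OnTheFlyImages(Operations):
    def __init__(self, render):
        self.render = render            # callable: name -> processed bytes
        self.names = ["slice_0001.png"] # hypothetical virtual files

    def readdir(self, path, fh):
        return [".", ".."] + self.names

    def getattr(self, path, fh=None):
        if path == "/":
            return {"st_mode": stat.S_IFDIR | 0o755, "st_nlink": 2}
        if path.lstrip("/") in self.names:
            data = self.render(path.lstrip("/"))
            return {"st_mode": stat.S_IFREG | 0o444, "st_nlink": 1,
                    "st_size": len(data)}
        raise FuseOSError(errno.ENOENT)

    def read(self, path, size, offset, fh):
        # Regenerate rather than store; a real system would likely cache.
        data = self.render(path.lstrip("/"))
        return data[offset:offset + size]

def demo_render(name):
    # Hypothetical stand-in: real code would rebuild the processed image
    # from the raw data plus its stored recipe, as in the sketch above.
    return b"processed image bytes for " + name.encode()

# Mounting (requires a FUSE-capable OS and an empty mount point):
# FUSE(OnTheFlyImages(demo_render), "/mnt/virtual_images", foreground=True)
```

Because the mounted directory looks like ordinary files, unmodified viewers and analysis tools can open the regenerated images directly, which is what makes the compute-for-storage trade transparent to users.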