Gary Grider, left, and Brad Settlemyer discuss the new Los Alamos and Carnegie Mellon software product, DeltaFS, released to the software distribution site GitHub this week.

Handling Trillions of Supercomputer Files Just Got Simpler

April 2, 2019
DeltaFS, an exascale file system, breaks the "metadata bottleneck" by handling extreme numbers of files and amounts of data with unprecedented performance.

A new distributed file system for high-performance computing, DeltaFS, was developed by computer scientists at Los Alamos National Laboratory and Carnegie Mellon University and will be distributed via GitHub, a software collaboration site. DeltaFS is expected to vastly simplify creating, updating, and managing extreme numbers of files.

“We designed it to enable the creation of trillions of files,” says Brad Settlemyer, a Los Alamos computer scientist and project leader. “Such a tool helps researchers solve classical problems in high-performance computing, such as particle-trajectory tracking or vortex detection.”
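The one-file-per-particle idea behind that quote can be sketched with a small toy example: if each particle's trajectory lives in its own tiny file, a single particle's full history can be read back without scanning the whole simulation output. This is purely illustrative, assuming an invented directory layout and invented function names (`record_step`, `read_trajectory`); it is not the DeltaFS interface.

```python
import os

def record_step(root, particle_id, step, position):
    """Append one timestep's position to the particle's own file."""
    path = os.path.join(root, f"particle_{particle_id}.traj")
    with open(path, "a") as f:
        f.write(f"{step} {position[0]} {position[1]} {position[2]}\n")

def read_trajectory(root, particle_id):
    """Fetch one particle's full trajectory with a single file read."""
    path = os.path.join(root, f"particle_{particle_id}.traj")
    with open(path) as f:
        return [tuple(float(x) for x in line.split()) for line in f]

# Simulate two timesteps for three particles, then query particle 1.
os.makedirs("traj", exist_ok=True)
for step in range(2):
    for pid in range(3):
        record_step("traj", pid, step, (pid + step, 0.0, 0.0))

print(read_trajectory("traj", 1))
```

At trillion-file scale, the hard part is not writing the files but the metadata load of creating and finding them, which is the bottleneck DeltaFS is designed to remove.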

DeltaFS presents users with a file system that looks like any other, requires no specialized hardware, and is specifically tailored to helping scientists and engineers make new discoveries on high-performance computers.

“One of the foremost challenges, and primary goals of DeltaFS, was getting it to work across thousands of servers without requiring a portion of each be dedicated to the file system,” says George Amvrosiadis, assistant research professor at Carnegie Mellon University. “This frees administrators from having to decide how to allocate resources for the file system, which will become a necessity when exascale machines become a reality.”

The file system brings about two important changes in high-performance computing. First, DeltaFS makes it possible to deploy new strategies for designing supercomputers, dramatically lowering the cost of creating and managing files. In addition, DeltaFS radically improves performance on highly selective queries, reducing the time needed to make scientific discoveries.

DeltaFS is a transient, software-defined service that lets data be accessed from a handful of computers or from hundreds of thousands, depending on the user's performance requirements.

“Storage techniques used in DeltaFS apply in many scientific domains, but by alleviating the metadata bottleneck we have shown a way to design and procure much more efficient HPC storage,” Settlemyer says.

