[Photo: Gary Grider and Brad Settlemyer]

Handling Trillions of Supercomputer Files Just Got Simpler

April 2, 2019
DeltaFS, an exascale file system, breaks the “metadata bottleneck” by handling extreme numbers of files and amounts of data with unprecedented performance.

Computer scientists at Los Alamos National Laboratory and Carnegie Mellon University have developed DeltaFS, a new distributed file system for high-performance computing, which will be distributed via GitHub, a software collaboration site. DeltaFS is expected to vastly improve the tasks of creating, updating, and managing extreme numbers of files.

“We designed it to enable the creation of trillions of files,” says Brad Settlemyer, a Los Alamos computer scientist and project leader. “Such a tool helps researchers solve classical problems in high-performance computing, such as particle-trajectory tracking or vortex detection.”

DeltaFS builds a file system that appears to the user just like any other file system, requires no specialized hardware, and is tailored specifically to help scientists and engineers make new discoveries when using high-performance computers.
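To make the file-per-particle idea concrete, here is a minimal sketch of the access pattern the article describes: because DeltaFS looks like an ordinary file system, an application can give every simulated particle its own file through standard file calls. This is not the DeltaFS API itself; the mount point /deltafs/run42, the file naming scheme, and the record format are all hypothetical, chosen only for illustration.

```python
# Sketch of a per-particle output pattern on a DeltaFS-like mount.
# The path "/deltafs/run42" is a hypothetical mount point, not a real
# DeltaFS interface; only the ordinary file-system calls are the point.
import os

MOUNT = "/deltafs/run42"  # hypothetical DeltaFS mount point
os.makedirs(MOUNT, exist_ok=True)

def write_timestep(particle_id: int, step: int, position: tuple) -> None:
    """Append one timestep of a particle's trajectory to its own file."""
    path = os.path.join(MOUNT, f"particle-{particle_id:012d}")
    with open(path, "a") as f:
        f.write(f"{step} {position[0]} {position[1]} {position[2]}\n")

# Each simulation rank would make calls like this for its own particles.
# At trillions of particles, creating this many files is exactly the
# metadata load that conventional parallel file systems cannot absorb.
write_timestep(7, step=0, position=(0.0, 1.5, -2.3))
```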

“One of the foremost challenges, and primary goals of DeltaFS, was getting it to work across thousands of servers without requiring a portion of each be dedicated to the file system,” says George Amvrosiadis, assistant research professor at Carnegie Mellon University. “This frees administrators from having to decide how to allocate resources for the file system, which will become a necessity when exascale machines become a reality.”

The file system brings two important changes to high-performance computing. First, DeltaFS makes it possible to deploy new strategies for designing supercomputers, dramatically lowering the cost of creating and managing files. Second, DeltaFS radically improves performance when executing highly selective queries, reducing the time needed to make scientific discoveries.
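A sketch of what such a highly selective query looks like under the layout above: with one file per particle, retrieving a single particle's full trajectory is a single open-and-read rather than a scan through a monolithic multi-terabyte checkpoint. Again, the paths and record format continue the hypothetical example from the earlier sketch.

```python
# Sketch of a selective query: read back one particle's trajectory
# directly from its own file, rather than scanning all output.
# Paths and file format are hypothetical, continuing the sketch above.
import os

MOUNT = "/deltafs/run42"  # hypothetical DeltaFS mount point

def read_trajectory(particle_id: int) -> list:
    """Return (step, x, y, z) tuples for one particle of interest."""
    path = os.path.join(MOUNT, f"particle-{particle_id:012d}")
    trajectory = []
    with open(path) as f:
        for line in f:
            step, x, y, z = line.split()
            trajectory.append((int(step), float(x), float(y), float(z)))
    return trajectory

# e.g., trace a handful of particles of interest after a run:
for pid in (7, 4242, 90001):
    if os.path.exists(os.path.join(MOUNT, f"particle-{pid:012d}")):
        print(pid, len(read_trajectory(pid)), "timesteps")
```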

DeltaFS is a transient, software-defined service that lets data be accessed from anywhere between a handful of computers and hundreds of thousands of them, based on the user’s performance requirements.

“Storage techniques used in DeltaFS apply in many scientific domains, but by alleviating the metadata bottleneck we have shown a way of designing and procuring much more efficient HPC storage,” Settlemyer says.

