How Will National Labs Handle Big Data Pouring from Next-Gen Particle Accelerators?

Jan. 27, 2018
The good news: next-generation accelerators will deliver an onslaught of previously unattainable data, true Big Data. The challenge will be pulling useful and accurate information from it.

The Dept. of Energy and a network of national labs and researchers spread across the U.S. are planning the next generation of accelerators and other powerful instruments for X-ray science, astronomy, and a host of other fields. Computer scientists at one such lab, the Stanford Linear Accelerator Center (SLAC), have been tasked with figuring out how the labs will handle all the information coming from those tools.

For example, a scheduled upgrade to the Linac Coherent Light Source (LCLS-II) will make the facility's X-ray laser fire 8,000 times faster as researchers use it to probe matter at the atomic scale. But as the firing rate jumps, so too will the data it spits out.

Two fairly recent PhD graduates, Alan Heirich and Elliott Slaughter, will work closely with Alex Aiken, director of SLAC's Computer Science Div., to work out how the lab will cope with Big Data's challenges. Here's what they have to say.

Here are members of SLAC’s Computer Science Div. From left: Alex Aiken, Elliott Slaughter, and Alan Heirich. (Photo courtesy of Dawn Harmer/SLAC National Accelerator Lab)

What are the computing challenges you’re trying to solve?

Heirich: The major challenge we’re looking at now is that LCLS-II will produce so much more data than the current X-ray laser. Data rates will increase 10,000 times, from about 100 megabytes per second today to a terabyte per second in a few years. We need to think about the computing tools and infrastructure necessary to take control over that enormous future data stream.
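As a back-of-the-envelope check on the factor Heirich quotes, using the round numbers from the interview (not exact facility specs):

```python
# Back-of-the-envelope check of the quoted data-rate jump.
# The figures are the round numbers from the interview, not exact specs.
current_rate_mb_s = 100        # ~100 megabytes per second today
future_rate_mb_s = 1_000_000   # 1 terabyte per second = 1,000,000 MB/s

factor = future_rate_mb_s / current_rate_mb_s
print(f"Increase: {factor:,.0f}x")  # 10,000x, as stated

# At 1 TB/s, one hour of continuous collection would produce:
seconds_per_hour = 3600
tb_per_hour = 1 * seconds_per_hour  # terabytes
print(f"One hour at full rate: {tb_per_hour:,} TB (~{tb_per_hour / 1000:.1f} PB)")
```

Even a single hour at the full rate works out to petabytes, which is why the infrastructure question matters so much.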

Slaughter: Our development of new computing architectures is aimed at analyzing LCLS-II data on the fly, providing initial results within a minute or two. This lets researchers quickly evaluate the data’s quality, make adjustments, and collect data in the most efficient way. However, real-time data analysis is quite challenging if you collect data with an X-ray laser that fires a million pulses per second.

How can real-time analysis be done?

Slaughter: We won’t be able to do all this with just the computing capabilities we have on site. The plan is to send some of the most challenging LCLS-II data analyses to the National Energy Research Scientific Computing Center (NERSC) at DOE’s Lawrence Berkeley National Laboratory, where extremely fast supercomputers can analyze the data and send results back to us within minutes.

Our team has joined forces with Amedeo Perazzo, the man leading the LCLS Controls and Data Systems Division, to develop the system that will run the analysis. Scientists doing experiments at LCLS will be able to define details of that analysis, depending on what their scientific questions are.

Our goal is to be able to do the analysis in a flexible way using all kinds of high-performance computers that have completely different hardware and architectures. In the future, these will also include exascale supercomputers that perform more than a billion billion calculations per second—up to a hundred times more than today’s most powerful machines.

Is it difficult to build such a flexible computing system?

Heirich: Yes. Supercomputers are highly complex with millions of processors running in parallel, and we need to figure out how to use their individual architectures most efficiently. So, at Stanford, we’re developing a programming system, called Legion, that lets people write programs that are portable across different high-performance computer architectures.

Traditionally, if you wanted to run a program with the best possible performance on a new computer system, you often had to rewrite significant parts of it to match the new architecture. That's labor- and cost-intensive. Legion, on the other hand, is designed to be used on diverse architectures and requires only relatively small tweaks when moving from one system to another. This approach prepares us for whatever the future of computing looks like. At SLAC, we're now starting to adapt Legion to the needs of LCLS-II.
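Legion itself is a C++ runtime (with the Regent language layered on top), so the following is only a loose, hypothetical illustration of the task-based idea Heirich describes: the science code is written once as tasks over blocks of data, and only a thin mapping layer decides how those tasks land on the machine at hand.

```python
# Hypothetical sketch of the task-based idea behind Legion (NOT its real
# API). The analysis task is written once; switching machines only means
# switching the mapping layer, not rewriting the science code.
from concurrent.futures import ThreadPoolExecutor

def analyze_block(block):
    # Stand-in for a per-block analysis task.
    return sum(x * x for x in block)

def run_serial(blocks):
    # Mapping for a single-core machine.
    return [analyze_block(b) for b in blocks]

def run_parallel(blocks, workers=4):
    # Mapping for a multicore machine; the tasks themselves are unchanged.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyze_block, blocks))

blocks = [list(range(i, i + 100)) for i in range(0, 1000, 100)]
assert run_serial(blocks) == run_parallel(blocks)  # same answer either way
```

The real system goes much further (distributed memory, GPUs, data placement), but the separation of concerns is the same: tasks stay portable while the mapping is tuned per architecture.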

We’re also looking into how we can visualize the scientific data after they are analyzed at NERSC. The analysis will be done on thousands of processors, and it’s challenging to orchestrate this process and put it together into one coherent visual picture. We just presented one way to approach this problem at the supercomputing conference SC17 in November.

What’s the goal for the coming year?

Slaughter: We’re working with the LCLS team on building a data-analysis prototype. One goal is to get a test case running on the new system. This will be done with X-ray crystallography data from LCLS, which are used to reconstruct the 3D atomic structures of important biomolecules such as proteins. The new system will be much more responsive than the old one and will be able to read and analyze data at the same time. The old system can only do one or the other at any given moment.
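The responsiveness gain Slaughter describes, reading and analyzing simultaneously instead of alternating between them, is classic pipelining. A minimal sketch with a producer thread feeding a consumer through a bounded queue (purely illustrative, not the actual LCLS data system):

```python
# Minimal producer/consumer pipeline: one thread "reads" events while
# another analyzes them, instead of finishing all reads before any
# analysis starts. Purely illustrative; not the actual LCLS-II code.
import queue
import threading

def reader(q, n_events):
    for i in range(n_events):
        q.put(list(range(i, i + 10)))  # stand-in for a detector event
    q.put(None)                        # sentinel: no more data

def analyzer(q, results):
    while True:
        event = q.get()
        if event is None:
            break
        results.append(sum(event))     # stand-in analysis

q = queue.Queue(maxsize=8)  # bounded queue applies backpressure to the reader
results = []
t_read = threading.Thread(target=reader, args=(q, 100))
t_anal = threading.Thread(target=analyzer, args=(q, results))
t_read.start(); t_anal.start()
t_read.join(); t_anal.join()
print(len(results))  # all 100 events analyzed while reading continued
```

With a system that can only read or analyze at any given moment, the two phases would run back to back; overlapping them is what makes near-real-time feedback to experimenters feasible.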

Will other research areas besides X-ray science profit from your work?

Slaughter: Yes. Alex is working on growing our division, identifying potential projects across the lab and expanding our research portfolio. Although we’re concentrating on LCLS-II right now, we’re interested in joining other projects, such as the Large Synoptic Survey Telescope (LSST). SLAC is building the LSST camera, a 3.2-gigapixel digital camera that will capture unprecedented images of the night sky. But it will also produce enormous piles of data—millions of gigabytes per year. Progress in computer science is needed to efficiently handle these data volumes.

Heirich: SLAC and its close partnership with Stanford Computer Science make for a great research environment. There is also a lot of interest in machine learning. In this form of artificial intelligence, computer programs get better and more efficient over time by learning from the tasks they performed in the past. It’s an active research field that has seen a lot of growth over the past five years, and machine learning has become remarkably effective in solving complex problems that previously needed to be done by human beings.

Many groups at SLAC and Stanford are exploring how they can exploit machine learning, including teams working in X-ray science, particle physics, astrophysics, accelerator research, and more. But there are fundamental computer science problems to solve. As machine learning replaces some conventional analysis methods, one big question is, for example, whether the solutions it generates are as reliable as those obtained in the conventional way.
