(Image: Argonne researchers Joseph Libera and Anthony Stark. Credit: Argonne National Laboratory)

Leveraging AI and Statistical Methods to Improve Flame Spray Pyrolysis

March 2, 2021
Researchers are refining flame spray pyrolysis so that it can consistently make similarly sized nanoparticles of various materials.

Flame spray pyrolysis has long been used to make small particles that serve as paint pigments. Now, researchers at Argonne National Laboratory are refining the process to make even smaller, nano-sized particles of various materials, including nano-powders for low-cobalt battery cathodes, solid-state electrolytes and platinum/titanium dioxide catalysts for turning biomass into fuel.

To streamline and shorten the development process, the researchers are using artificial intelligence and advanced statistical methods.

Industrial flame spray pyrolysis typically involves feeding a gaseous precursor into a flame and collecting the combusted material. This creates extremely fine powders composed of particles as small as a few nanometers in diameter. To make carbon black for ink, for example, a hydrocarbon such as oil or gas is burned under controlled conditions, resulting in a fine carbon powder.

Expanding the range of materials that flame spray pyrolysis can turn into nanoparticles requires using a liquid precursor rather than a gas.

Running flame spray pyrolysis with atomized droplets is much more difficult than using a gas. It involves a host of processing variables related to flame temperature, flow rates and nozzle shape.

“Gas combustion is inherently more controllable,” says Joe Libera, a principal materials scientist who operates Argonne’s flame spray pyrolysis reactor. “The main problem with the wider adoption of liquid-feed flame spray is that it’s much more complex than gas-based combustion. The number of elements in the target materials can also add another layer of complexity.”

In the past, fine-tuning flame spray pyrolysis to reliably and efficiently churn out nanoparticles would have taken painstaking trial-and-error experimentation. But now AI can identify the best control parameters for a given process much faster, even if it has little data to work with.

During their investigations, the researchers tried to make silica nanoparticles all with roughly the same diameter. They chose silica because it is easy to simulate on computers.

The team started with a solution of tetraethyl orthosilicate in alcohol. The solution was passed through an oxygen atomizing nozzle, with a secondary oxygen flow enveloping the central spray. At a rate of up to 10 mL per minute, the spray passed through a flame initiated with a mix of methane and oxygen. The ignited material condensed into solid silica particles, accumulating at about 20 grams per hour.

Which settings in this scenario would produce the most uniform particle size? One could vary the concentration of the solution, its flow rate, the methane flow rate and three different oxygen flow rates. Rather than exhaustively testing combinations of these six variables, the researchers explored them across just 17 experiments using a type of artificial intelligence called active learning, which lets the computer continuously learn from new data and request experiments as needed.

Most machine learning algorithms extract insights from large datasets filled with thousands of examples. But the flame spray reactor at Argonne’s Materials Engineering Research Facility is one of a kind in its instrumentation, which means training data is limited. Active learning is well suited to this type of situation.

For the study, researchers had just a few dozen data points to work with, all collected over the course of 17 experiments. “That’s a small dataset for machine learning,” explains Noah Paulson, a computational materials scientist. Researchers would have had to run countless experiments over weeks or months on that one reactor setup to accumulate a larger dataset for a machine learning model to analyze.

Instead, the team took a bottom-up approach, using statistical methods to explore different possible parameters with a relatively small number of experiments. First, they identified a minimum set of input variables expected to affect the size distribution of the silica particles (the six mentioned above), and then they set a total allowable number of experiments.

To define a set of preliminary experiments to perform, the team used Latin hypercube sampling, a way to sift across the range of settings. “That gave us a rough sense of what the shape of the response is, or how results change with different input variables,” Paulson says.
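As a sketch of how Latin hypercube sampling spreads a small experiment budget evenly across a multi-dimensional parameter space, the snippet below uses SciPy’s quasi-Monte Carlo module. The variable ranges and the number of preliminary experiments are illustrative assumptions, not the study’s actual settings.

```python
# Latin hypercube sampling over the six process variables.
# Ranges below are illustrative assumptions, not Argonne's actual settings.
from scipy.stats import qmc

# [precursor concentration, solution flow, methane flow, oxygen flow x3]
lower = [0.1, 1.0, 2.0, 5.0, 5.0, 5.0]
upper = [1.0, 10.0, 10.0, 30.0, 30.0, 30.0]

sampler = qmc.LatinHypercube(d=6, seed=0)
unit_samples = sampler.random(n=8)               # 8 preliminary experiments
designs = qmc.scale(unit_samples, lower, upper)  # map [0, 1) to physical units
# Each row is one experiment; each column covers its range evenly,
# so even a handful of runs probes the whole parameter space.
```

Unlike a full grid (which would need thousands of runs for six variables), each variable’s range is guaranteed to be stratified even with only a handful of samples.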

The initial results were then fed into a Gaussian process, which evaluates the similarity between data points and predicts the values of new ones. The Gaussian process model allowed the use of Bayesian optimization, a well-established statistical method, to select the next point to evaluate. If the computer has seen a similar input set with a known data point, it can assign a high confidence rank to the Gaussian process prediction. But if the computer is asked about a combination of input variables not like ones it has seen before, it will assign its prediction a ranking of “low confidence.”
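The confidence behavior described above can be illustrated with a small Gaussian process regression in scikit-learn. The data here is synthetic and one-dimensional for clarity; the kernel choice and length scale are assumptions, not the study’s model.

```python
# Gaussian process surrogate sketch: predictions carry an uncertainty that is
# small near observed points and large far from them. Data is synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

X = np.array([[0.2], [0.4], [0.6], [0.8]])   # observed settings (1-D for clarity)
y = np.array([0.9, 0.5, 0.4, 0.7])           # measured size spread at each setting

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                              alpha=1e-6, optimizer=None)
gp.fit(X, y)

near, far = np.array([[0.41]]), np.array([[3.0]])
_, std_near = gp.predict(near, return_std=True)
_, std_far = gp.predict(far, return_std=True)
# High confidence (small std) next to a known data point,
# low confidence (large std) far from anything seen before.
print(std_near[0] < std_far[0])  # True
```

The predictive standard deviation is exactly the “confidence rank” the article describes: it quantifies how similar a proposed input is to inputs the model has already seen.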

For example, given four prototype experiments that yield four outcomes, the model can look at what it knows and suggest a new set of processing conditions. The outcome from that fifth experiment gets added to the dataset, and the model reevaluates what it knows, suggesting another experiment—and so on, until the desired outcome is reached.

The AI engine improved flame spray pyrolysis results by more than 25%. In other words, compared to a baseline of the first four experiments, the spread of silica particle sizes decreased by 25.5% within 15 experiments, with most of the improvement gained by the 10th run. The fact that the improvement changed by less than one percentage point between the 10th and 15th experiments showed the strategy achieved a near-optimal result. The team was able to run the experiments in a single morning, work that otherwise would have taken weeks.

The AI approach used in the study is not particularly complicated from a computer scientist’s perspective. “The hard part is connecting it to the actual physical process,” Paulson says. “How do you process the data so that you can interpret those results, use them to perform your improvements, and understand where you are with respect to your objectives?”

AI can also be applied to other manufacturing processes. For example, the Argonne team is working on an application for atomic layer deposition, a technique for adding thin films to surfaces that is commonly used to make semiconductors.

The team now wants to clearly understand what happens in the flame by using computational fluid dynamics. It also wants to use images from laser diagnostic equipment to analyze the flame’s characteristics.

While the initial optimization study focused on particle size, the technology could be extended to many other manufacturing metrics for industrial materials, such as yield, chemical purity and flame stability. It could also be adapted to monitor a process in addition to optimizing it.

Having demonstrated the technology on silica fabrication, the researchers want to expand it to more complex chemistries like those needed to make advanced battery materials. One such material, lithium lanthanum zirconium oxide—a potential solid-state battery electrolyte—is expensive and difficult to produce using current methods; the most common of these involves ball-milling oxides of zirconium, lanthanum, lithium and aluminum, then treating them at high temperature. But it might be made less expensively and more easily with flame spray pyrolysis, leading to safer batteries with twice the energy density of today’s lithium-ion counterparts.

For now, the team is focused on collecting and using as much raw data as possible from experiments to make the algorithm’s predictions more accurate and reliable.
