April 26, 2024 — Scientists often turn to X-ray microscopes when they want to see tiny structures within materials, even just a few atoms in size.
Schematic diagram showing the experimental setup. Credit: Argonne National Laboratory
X-ray microscopes have advanced to the point where they generate more data than scientists can efficiently process, even with large supercomputers. As a result, researchers are exploring new techniques that can process data on the fly. This means analyzing data as it is collected and feeding the results back into experiments, ultimately building autonomous discovery pathways.
Scientists at the U.S. Department of Energy's (DOE) Argonne National Laboratory recently developed a new method that incorporates machine learning, in the form of neural networks, into X-ray microscopy at the Advanced Photon Source (APS). The new process allows researchers to spend less time sampling a material, increases data processing speed by more than 100 times, and reduces the amount of data collected by a factor of 25. The APS is a DOE Office of Science user facility.
“The problem is that traditional analysis tools can't keep up with the data rate,” said Mathew Cherukara, a computational scientist and group leader at Argonne and an author of the study. “We find ourselves in a situation where we have incredibly complex and extraordinary hardware, but we don't have the means to analyze all the data it produces.”
Cherukara says that without supercomputers, analyzing data from these studies can take days or weeks, and even with supercomputers, it can still take hours.
“New neural networks mean many of these experiments can be run within minutes at the full speed of the instrument,” he said.
Argonne group leader Antonino Miceli, another author of the research paper, said the ability to perform these experiments quickly and adjust conditions on the fly would allow scientists, and eventually autonomous instruments, to decide in the moment how to analyze their samples.
“You can’t make decisions like this without the ability to analyze data on the fly,” he said.
The new technology could ultimately free up time at the APS for more and better experiments, said Argonne physicist Tao Zhou, another author of the study.
“Most people who come to the APS travel here, spend a week running experiments, and leave with their data to analyze at home,” Zhou said. “If they find something interesting during that analysis and want to take further measurements, they usually have to wait for the next APS experimental cycle. This technology essentially allows the analysis to be performed at the beamline in real time, so if something unexpected, new, and interesting turns up in the sample, you are ready and able to adapt immediately.”
This new technology is called streaming ptychography (ty-KAH-grah-fee). The streaming part works a lot like a video streaming app such as Netflix, except that scientists, or machine-learning agents, can change the experiment itself in response to what they are seeing. Imagine a cat adjusting its leaps to catch the red dot of a laser pointer and you have the idea. Because the data are analyzed in near real time, researchers can focus their experiments on interesting phenomena as they are discovered.
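The streaming feedback loop can be sketched in a few lines. The sketch below is a toy illustration, not the authors' pipeline: the detector stream, per-frame metric, and steering rule (`acquire_frames`, `quick_metric`, `steer`) are all invented stand-ins for the real acquisition and inference code.

```python
import numpy as np

rng = np.random.default_rng(0)

def acquire_frames(n_frames, size=16):
    """Stand-in for the detector stream: yields one diffraction
    pattern at a time, as frames would arrive during a scan."""
    for _ in range(n_frames):
        yield rng.poisson(5.0, size=(size, size)).astype(float)

def quick_metric(frame):
    """Toy per-frame analysis: total scattered intensity."""
    return frame.sum()

def steer(metric, threshold):
    """Toy feedback rule: flag bright regions for a closer look."""
    return "zoom_in" if metric > threshold else "continue"

# Analyze each frame as it arrives and decide what to do next,
# instead of saving everything for offline processing.
decisions = [steer(quick_metric(f), threshold=1400.0)
             for f in acquire_frames(100)]
print(decisions.count("zoom_in"), "frames flagged for follow-up")
```

The point is structural: the decision is made inside the acquisition loop, so the experiment can react while the sample is still in the beam.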
“It is becoming possible to analyze data while it is being generated. The AI learns as the experiment progresses,” Cherukara said.
“By placing this embedded computing close to the beamline, we can make adjustments instantly while an experiment is in progress, without having to send data to the cloud or a supercomputing cluster,” added Anakha Babu, another author of the study and now a researcher at KLA-Tencor. “Beyond ptychography, this type of setup could potentially be used in a wide range of experiments that require real-time, data-driven adjustments.”
Ptychography is an imaging technique widely used in X-ray, optical, and electron microscopy, and has long been recognized for its ability to image centimeter-sized objects at high resolution with minimal sample preparation. However, traditional methods used for image processing are time-consuming and computationally intensive, which can hinder real-time imaging.
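One reason traditional processing is so expensive is that the detector records only diffraction intensities; the lost phase must be recovered by an algorithm that iterates between detector space and sample space, often hundreds of times per image. The toy below runs classic error-reduction phase retrieval, a simpler single-pattern relative of the algorithms used in ptychography, on simulated data; all sizes and iteration counts are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Ground-truth object confined to a known support region.
n = 32
support = np.zeros((n, n), dtype=bool)
support[8:24, 8:24] = True
obj = np.where(support, rng.random((n, n)), 0.0)

# The detector records only the magnitude of the far-field
# diffraction pattern; the phase information is lost.
measured_mag = np.abs(np.fft.fft2(obj))

def fourier_error(est):
    """Relative mismatch between estimated and measured magnitudes."""
    diff = np.abs(np.fft.fft2(est)) - measured_mag
    return np.linalg.norm(diff) / np.linalg.norm(measured_mag)

# Error reduction: alternately enforce the measured magnitudes in
# Fourier space and positivity/support in real space.
estimate = rng.random((n, n)) * support
err_start = fourier_error(estimate)
for _ in range(500):
    F = np.fft.fft2(estimate)
    F = measured_mag * np.exp(1j * np.angle(F))        # keep phase, fix magnitude
    estimate = np.fft.ifft2(F).real
    estimate = np.clip(estimate, 0.0, None) * support  # real-space constraints
err_end = fourier_error(estimate)

print(f"relative error: {err_start:.3f} -> {err_end:.3f}")
```

Hundreds of iterations for one small array is already noticeable; at full detector frame rates and image sizes, this per-image cost is what pushes conventional analysis onto supercomputers.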
The streaming process works as follows. The developers first used the Argonne Leadership Computing Facility (ALCF) to perform a round of computationally intensive phase retrieval calculations on diffraction patterns produced by the X-ray beam. These reconstructions were then used to train a neural network that can perform comparable calculations accurately, far more quickly, and at much lower computational cost on hardware close to the beamline. The ALCF is also a DOE Office of Science user facility.
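This two-stage pattern, expensive reconstructions used as training data and then a cheap learned surrogate for inference, can be shown with a deliberately simplified stand-in. The real system trains a deep neural network on ALCF reconstructions; the sketch below substitutes a linear least-squares model so it runs anywhere, and all array shapes and variable names are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in data: 200 "diffraction patterns" (inputs) paired with
# the "phase images" (targets) that expensive phase retrieval on the
# supercomputer would have produced for them.
n_train, n_pix = 200, 64
true_map = rng.normal(size=(n_pix, n_pix))   # hidden relationship
patterns = rng.normal(size=(n_train, n_pix))
phases = patterns @ true_map + 0.01 * rng.normal(size=(n_train, n_pix))

# "Training": fit a cheap surrogate model to the supercomputer
# results (here plain least squares; the real work trains a CNN).
surrogate, *_ = np.linalg.lstsq(patterns, phases, rcond=None)

# Fast inference near the beamline: one matrix multiply per frame
# instead of an iterative phase-retrieval run.
new_pattern = rng.normal(size=(1, n_pix))
predicted_phase = new_pattern @ surrogate

# Sanity check: the surrogate should recover the hidden relationship.
err = np.abs(surrogate - true_map).max()
print(f"max coefficient error: {err:.4f}")
```

Once trained, inference costs a single cheap forward pass per frame, which is what lets the analysis keep up with the instrument.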
“We need to make sure the neural network is properly trained,” said Argonne computer scientist Tekin Bicer, another author of the study. “Once the model is trained during an experiment, it produces estimates close to those achieved on supercomputers, at a fraction of the computational cost and with significantly reduced lag time.”
Bicer explained that the initial training of the neural network does not take long.
“When you first turn on the instrument, you can't use the neural network because it hasn't been trained yet,” he said. “But once you have a little data from the traditional analysis, you can start using the neural network right away.”
Because the machine learning model works on individual X-ray diffraction patterns, the workflow eliminates the strict overlap-sampling constraints inherent in traditional ptychography and significantly reduces the required beam dose. This reduces the risk of sample damage, making the approach well suited to imaging delicate materials.
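The data-reduction effect of dropping the overlap requirement can be made concrete with back-of-the-envelope counting. The overlap fraction, field of view, and probe size below are illustrative choices, not numbers from the paper.

```python
def n_scan_points(fov_um, probe_um, overlap):
    """Positions in a square grid scan when neighboring probe
    positions must overlap by the given fraction."""
    step = probe_um * (1.0 - overlap)     # allowed step between exposures
    per_axis = round(fov_um / step) + 1   # positions along one axis
    return per_axis ** 2

# Conventional ptychography: heavy overlap between exposures.
conventional = n_scan_points(fov_um=10.0, probe_um=1.0, overlap=0.75)
# Per-frame inference: each pattern is inverted on its own,
# so adjacent exposures no longer need to overlap.
streaming = n_scan_points(fov_um=10.0, probe_um=1.0, overlap=0.0)

print(conventional, streaming, round(conventional / streaming, 1))
```

Fewer exposures means less dose delivered to the sample and proportionally less raw data to move and store, which is the effect the workflow exploits.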
Neural networks can also be applied to electron and light microscopy, Cherukara explained. “The world's most advanced microscopes will no longer be held back by a lack of analytical capability and will be able to reach their full potential,” he said.
A paper based on this research has been published in the journal Nature Communications.
This work was funded by the DOE Office of Science's Office of Basic Energy Sciences and by Argonne's Laboratory Directed Research and Development program. The study used the hard X-ray nanoprobe beamline at the APS and the Center for Nanoscale Materials, another DOE Office of Science user facility at Argonne.
About Argonne Leadership Computing Facility
The Argonne Leadership Computing Facility (ALCF) provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a wide range of fields. Supported by the U.S. Department of Energy (DOE) Office of Science's Advanced Scientific Computing Research (ASCR) program, ALCF is one of two DOE Leadership Computing Facilities in the nation dedicated to open science.
About Argonne National Laboratory
Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation's first national research laboratory, Argonne conducts cutting-edge basic and applied scientific research in virtually every scientific field. Argonne researchers work closely with researchers at hundreds of companies, universities, and federal, state, and local agencies to solve specific problems, advance America's scientific leadership, and prepare the nation for a better future. With employees from more than 60 countries, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy's Office of Science.
Source: Jared Sagoff, Argonne
