Electron microscopy in the age of automation
by Kimberlee Papich, Pacific Northwest National Laboratory
“Many of the greatest challenges of our time, from clean energy to environmental justice, require new approaches to the craft of scientific experimentation. This is exceedingly apparent in the field of electron microscopy. As researchers utilize this powerful window to peer into the atomic machinery behind today’s technologies, they are increasingly inundated with data and constrained by traditional operating models. We must leverage artificial intelligence and machine learning in our scientific instruments if we are to unlock breakthrough discoveries.”
This is Steven R. Spurgeon’s forward-looking assessment of the present and future state of microscopy and instrumentation in scientific experimentation. Spurgeon, a materials scientist at Pacific Northwest National Laboratory (PNNL), is an international expert in the study of nanomaterials using electron microscopy. At PNNL, he and his colleagues are working to reimagine the discovery and design of new material and chemical systems by applying state-of-the-art computing and data analytics to instrumentation.
Accordingly, academia and industry are turning to PNNL's electron microscopy experts for solutions. PNNL is a thought leader in this growing research area and is now bringing advanced technologies to market to accelerate scientific discovery.
An evolution in scientific experimentation
Spurgeon and his colleagues are attempting to address a challenge that is ubiquitous throughout multiple industries—experts are deluged with large volumes of data and hampered by outmoded operating models, making knowledge extraction difficult. From new battery development to emerging quantum computing technologies, all domains are grappling with this burden.
In advanced manufacturing, the opportunity for automation in instrumentation is keenly evident—modernization would have an immediate and transformative impact. In the semiconductor industry, failure analysis is conducted on an immense scale—24 hours a day, 7 days a week. Microscopes and other systems must screen millions of transistors to assure the quality and reliability of microelectronics. Experts are increasingly concerned with how to translate these large data streams into rapid and explainable decisions that ultimately drive down costs.
The solution they seek requires hardware and software architectures that can emulate human cognition—evaluating novel scenarios while tapping into the ability of computers to tirelessly scale analysis to different types and volumes of data.
Industry-driven technology transfer
In an October 2020 Nature Materials commentary, a team co-led by Spurgeon shared its vision for electron microscopy infused with the latest advances in data science and artificial intelligence. Fast-forward to present day and this vision is being realized inside PNNL’s Radiological Microscopy Suite. There, researchers have developed a prototype of a next-generation microscope platform, and industry players are taking note.
PNNL and Japan Electron Optics Laboratory/Integrated Dynamic Electron Solutions (JEOL/IDES), a world leader in electron microscopy, recently signed a licensing and co-development agreement to commercialize the application. Together, they will bring to market the platform’s core concept—applying minimal, or ‘sparse,’ data analytics to perform image classification—an important step toward instrument automation. Technologies developed under this partnership will be further refined and made available to research organizations and private industry. Accessing the platform will allow these experts to process microscopy data without the need for entirely new instrumentation hardware.
“JEOL/IDES sees the clear need for improvement in the way microscopy data is acquired and analyzed. This doesn’t just mean automated instruments, but smart automated instruments that can acquire data expertly and effectively,” said Tom Isabell, vice president for product management for JEOL/IDES. “We need to develop a new paradigm in which data is acquired efficiently and the vast amounts of data are analyzed intelligently, in turn leading to an even more efficient way to collect further data. PNNL has shown world leadership in taking on this smart microscopy model and JEOL/IDES looks forward to partnering with PNNL to develop and implement these new technologies.”
The broad application of this platform reflects the intent of PNNL's Office of Technology Deployment and Outreach and the early work of commercialization manager Jennifer Lee in spearheading the licensing agreement with JEOL/IDES. She was motivated by her interactions with industry partners, where she heard a clear theme: the labor-intensive, manual work involved in processing large volumes of microscopy data was simply too onerous. Industry partners were looking for a research organization that could quickly deliver a solution, one with multi-faceted expertise not only in materials and electron microscopy science, but especially in data science.
“At PNNL, we take an industry-driven approach to all of our technology transfer efforts. We work hard to understand the industry’s pain points and bring those concerns back to our scientists to address,” said Lee. “In our work with JEOL/IDES, for example, there was immediate support and palpable expertise for developing an approach that could replicate the human brain’s decision-making capabilities, resulting in the quickest laboratory-directed commercialization effort, from start to finish, yet.”

Video: Fully automated data collection and classification of MoO3 nanoparticles in the PNNL transmission electron microscope. Credit: Steven Spurgeon and Stephanie King | Pacific Northwest National Laboratory
Automation meets electron microscopy
PNNL’s next-generation microscope platform implements a never-before-seen analytics and control architecture. Experts are redesigning the electron microscope’s foundation, leveraging low-level system automation, domain-grounded data pre-processing, and emerging sparse data analytics to rapidly extract statistical information. They’re making significant progress toward the microscope of tomorrow, one that is highly integrated and automated, which can target challenges in energy storage, quantum information science, and more.
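In outline, such an architecture couples low-level stage and acquisition control to lightweight analytics that decide which fields of view matter. The sketch below is purely illustrative, not PNNL's actual software: every function is a hypothetical stand-in (including a synthetic "detector"), but it mimics the acquire, pre-process, classify, and montage cycle described above.

```python
import numpy as np

def acquire(x, y):
    """Stand-in for moving the stage to grid position (x, y) and reading
    the detector. A synthetic bright feature appears only in the central
    field of view."""
    image = np.zeros((16, 16))
    if (x, y) == (1, 1):
        image[4:12, 4:12] = 1.0
    return image

def scan_and_classify(grid=(3, 3), threshold=0.1):
    """Raster over a grid of stage positions, flag tiles that contain
    features, and assemble all tiles into a large-area montage."""
    rows, hits = [], []
    for gx in range(grid[0]):
        row = []
        for gy in range(grid[1]):
            tile = acquire(gx, gy)       # low-level automation: move + acquire
            if tile.mean() > threshold:  # stand-in for sparse data analytics
                hits.append((gx, gy))    # positions fed back to the stage plan
            row.append(tile)
        rows.append(np.concatenate(row, axis=1))
    montage = np.concatenate(rows, axis=0)
    return montage, hits

montage, hits = scan_and_classify()
print(montage.shape, hits)  # (48, 48) [(1, 1)]
```

In a real instrument, the classification step would run fast enough to steer the next acquisition, closing the loop between analysis and stage control.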
“Steven and his team are addressing an age-old problem in the control and operation of electron microscopes. Their approach has the potential to greatly impact the scientific community by helping researchers conduct richer and more efficient analyses at scale,” explains Sergei V. Kalinin, a corporate fellow at Oak Ridge National Laboratory and a leader in machine learning and automated experiments in electron and scanning probe microscopies, who was not involved in this research.
To bring the microscopy platform to life, Spurgeon assembled a team from inside and outside PNNL, including fellow materials scientist Matthew Olszta, statistician Sarah Akers, computer scientist Derek Hopkins, and Kevin Fiedler, a mathematician from Washington State University. Spurgeon and Olszta’s microscopy expertise was an ideal match for Akers’ few-shot machine learning, which represents a new kind of data analytics that can make decisions using very limited examples. To build a centralized instrument controller, Spurgeon tapped Hopkins, who specializes in hardware/software integration and lab automation. Hopkins and Fiedler designed an architecture to process and analyze incoming images to enable large-area montaging and stage feedback.
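Few-shot classification of this kind is often implemented as a nearest-prototype scheme: each class is summarized by the average embedding of a handful of labeled examples, and new data is assigned to the closest prototype. The snippet below is an illustrative sketch under that assumption, not the team's published method; the toy embedding and all names are hypothetical (a real system would embed patches with a pretrained neural network).

```python
import numpy as np

def embed(patch):
    """Toy embedding: mean intensity and variance of an image patch."""
    p = np.asarray(patch, dtype=float)
    return np.array([p.mean(), p.var()])

def fit_prototypes(support):
    """Average the embeddings of the few labeled examples per class."""
    return {label: np.mean([embed(p) for p in patches], axis=0)
            for label, patches in support.items()}

def classify(patch, prototypes):
    """Assign the patch to the class whose prototype is nearest."""
    e = embed(patch)
    return min(prototypes, key=lambda c: np.linalg.norm(e - prototypes[c]))

# Two labeled examples per class are enough to build usable prototypes.
support = {
    "particle":   [np.full((8, 8), 200), np.full((8, 8), 180)],
    "background": [np.full((8, 8), 20),  np.full((8, 8), 40)],
}
protos = fit_prototypes(support)
print(classify(np.full((8, 8), 190), protos))  # -> particle
```

The appeal for microscopy is that an operator can label a few regions of a single image and immediately segment the rest of a session's data, rather than curating the thousands of labeled examples conventional deep learning requires.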
The team’s resulting machine learning work is currently under review in an article led by Akers, titled “Rapid and Flexible Semantic Segmentation of Electron Microscopy Data Using Few-Shot Machine Learning,” with a more detailed article on the system to follow. Several joint appointments are also in the works for Spurgeon.
The prototype microscopy system is now being deployed at PNNL on two flagship transmission electron microscopes—a JEOL GrandARM 300F and a JEOL ARM 200CF—with the eventual goal to extend it to other instruments. This unique capability will enable richer, around-the-clock statistical analysis to take advantage of the laboratory’s best-in-class instrumentation.
Democratizing data-driven analysis
“The true potential of this work is that it can be extended to many other areas, drawing on PNNL’s expertise across multiple scientific disciplines,” said Spurgeon. “We have the opportunity to move the conversation away from simply buying higher-powered instruments toward more informed modes of operation and analysis. We can think of this as a democratization of best-in-class analysis capabilities.”
To accelerate this transition, and in support of science, technology, engineering, and mathematics workforce development, the PNNL team recently advised a group of students through the University of Washington’s Data Intensive Research Enabling Clean Technologies (DIRECT) capstone program. The students were tasked with developing a graphical user interface for interacting with the few-shot model. This web-based application allows end users to intuitively process their data and export the results for further use. The students completed a publication, released their codebase, and will present a poster at the Microscopy and Microanalysis Virtual Meeting in early August.
On the road to the future
In addition to the team’s publications and the licensing agreement, other upcoming activities speak to broad enthusiasm for the microscopy platform, namely an invited tutorial and four talks planned for the Microscopy & Microanalysis Virtual Meeting. Hosted by the Microscopy Society of America, the annual meeting is open to its 3,000 members and is considered the premier event covering original microscopy research.
Cumulatively, these activities are helping propagate the current and future potential of this new platform. This will lead to unlocking experimentation at scale and deriving richer, more meaningful physical models for technologically relevant systems. The team’s work has only just begun, as they plan for the full implementation of the system and build on their machine learning work to increase the power and generalizability of their approach.
Concluded Spurgeon, “We started with a new approach to classifying data in the microscope, but we’ve grown beyond that to addressing how we as a community approach experimentation. Traditional approaches are very manual and labor-intensive, but, most importantly, they can’t keep pace with the latest generation of hardware. We believe our platform is a first step in that direction. The feedback we’ve received from the scientific community and industry has been very positive, which is extremely gratifying.”
More information: Sarah Akers et al, Rapid and Flexible Semantic Segmentation of Electron Microscopy Data Using Few-Shot Machine Learning, Research Square (2021). DOI: 10.21203/rs.3.rs-346102/v1

Journal information: Nature Materials

Provided by Pacific Northwest National Laboratory