Scanning transmission electron microscopy (STEM) has become a uniquely powerful method for structural and functional imaging of materials, allowing researchers to study the chemical composition of samples. In recent years, electron microscopy has undergone a methodological shift aimed at overcoming the limitations of its analytical workflows. Machine learning and artificial intelligence address these limitations by offering tools for data exploration, automation, and method development.
How Does STEM Work?
STEM is a conventional transmission electron microscope (TEM) equipped with additional detectors, scanning coils, and the necessary circuitry. In conventional TEM, images are formed by passing electrons through a sufficiently thin specimen, whereas STEM focuses the electron beam into a fine probe that is scanned over the sample in a raster pattern.
A STEM image is formed by counting the scattered electrons at every point in the raster pattern and assembling the raw electron counts into an image. Because the beam is rastered across the sample, STEM lends itself well to analytical techniques, and it can generate contrast without the need to defocus.
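As a rough illustration of this image-formation step, the sketch below assembles per-pixel detector counts from a raster scan into a 2D image array. The scan dimensions and the detector_counts placeholder are purely hypothetical and stand in for the instrument's actual read-out, not any real microscope interface.

```python
import numpy as np

# Hypothetical raster scan: 128 x 128 probe positions.
scan_rows, scan_cols = 128, 128

# In a real instrument, detector_counts(row, col) would return the number of
# electrons collected (e.g., by an annular dark-field detector) while the
# probe dwells at that position. Here it is faked with a placeholder signal.
rng = np.random.default_rng(0)
def detector_counts(row, col):
    return rng.poisson(50)  # placeholder value, not real data

# Assemble the image: one intensity value per probe position.
image = np.zeros((scan_rows, scan_cols))
for r in range(scan_rows):
    for c in range(scan_cols):
        image[r, c] = detector_counts(r, c)
```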
AI-Enhanced Electron Microscope Images
STEM works by scanning a focused electron beam across a thin sample and detecting the transmitted electrons to create an image. However, generating high-resolution images comes at a cost: higher resolution is usually achieved with high-intensity electron beams that can damage or even destroy samples.
Reducing the electron dose would limit sample damage, but the images generated from the reduced signal have worse resolution. This is due to Poisson noise, a basic form of counting uncertainty that arises from the quantized nature of the beam and the statistical independence of individual detection events.

Also known as shot noise (or photon noise in optical imaging), Poisson noise is a fundamental property of electron beams. Unlike other noise sources, it cannot be corrected by adjusting the STEM instrument.
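To see why a lower dose means noisier images, the short sketch below simulates Poisson (shot) noise at a few illustrative doses: the relative noise shrinks roughly as one over the square root of the mean electron count per pixel. The image size and dose values are arbitrary examples, not actual instrument settings.

```python
import numpy as np

rng = np.random.default_rng(42)

# A clean "ground truth" image: normalized intensity per pixel (illustrative).
truth = np.ones((64, 64)) * 0.5

for dose in (10, 100, 1000):                # mean electrons per pixel (hypothetical)
    expected_counts = truth * dose
    noisy = rng.poisson(expected_counts)     # Poisson (shot) noise realization
    # For Poisson statistics, std = sqrt(mean), so SNR grows as sqrt(dose).
    mean_counts = expected_counts.mean()
    snr = mean_counts / np.sqrt(mean_counts)
    print(f"dose={dose:5d}  theoretical SNR ~ {snr:.1f}")
```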
Given how electron microscopes operate, fine-tuning the acquisition conditions and post-processing the data are the two main approaches to acquiring the best possible EM data. Noise removal, or data cleaning, is an important factor in data reproducibility and in obtaining statistically meaningful quantitative analysis, which makes machine learning a good option for denoising STEM data.
In 2023, a group of scientists in Ireland conducted a study using machine learning for quantified resolvability enhancement of STEM data. The team, led by Laura Gambini of Trinity College Dublin, explored training machine learning algorithms to reduce the noise in STEM images acquired with low-dose electron beams.
They specifically trained an autoencoder on a dataset of simulated STEM images, ensuring that the relevant microscope settings and specimen variety were reflected in the dataset. The autoencoder was then tested for its ability to reduce Poisson noise by processing both real and artificially generated STEM images.
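The paper's exact network architecture and training setup are not detailed here, so the following is only a generic sketch of a convolutional denoising autoencoder trained on (noisy, clean) image pairs, with placeholder tensors standing in for simulated STEM patches. Layer sizes, patch shapes, and training parameters are all assumptions for illustration, not the authors' model.

```python
import torch
import torch.nn as nn

# Generic convolutional denoising autoencoder (illustrative, not the published model).
class DenoisingAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 2, stride=2),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Hypothetical training loop on simulated (noisy, clean) patch pairs.
model = DenoisingAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

noisy = torch.rand(8, 1, 64, 64)   # placeholder low-dose (noisy) patches
clean = torch.rand(8, 1, 64, 64)   # placeholder ground-truth patches

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(noisy), clean)  # learn to map noisy patches to clean ones
    loss.backward()
    optimizer.step()
```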
The researchers visually compared the images before and after processing and observed a clear reduction in noise. The study demonstrated that the denoised images contain more precise information than both the original images and images processed using other methods.
Check out more news and information on Microscopy in Science Times.