Insights from industry

Introduction to Particle Analysis via AZtecFeature

Dr Matt Hiscock, Product Analyst, Oxford Instruments NanoAnalysis

In this interview, AZoNano talks to Dr Matt Hiscock about AZtecFeature and its role in particle analysis.

What is particle analysis, and how is it used?

Particle analysis, or feature analysis, is an approach that seeks to characterize the constituent parts of a material. 

Materials analyzed with particle analysis often include additive manufacturing metal powders or the NCM powders used in battery manufacture. They can also include materials with significant lateral variation, such as the grains in a rock.

In short, particle analysis can be applied to an array of different sample types and applications.

In SEM-based particle analysis, we often use the images generated by the SEM to identify which features to analyze in samples and then automatically analyze their composition using Energy Dispersive X-Ray Spectrometry (EDS). We can then classify the analyzed features using a classification scheme – which adds a layer of interpretation to our analysis.
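
To make the classification idea concrete, here is a minimal rule-based scheme in Python. The class names, thresholds, and data layout are illustrative assumptions, not the scheme AZtecFeature actually ships with:

    # A sketch of a rule-based classification scheme. All class names,
    # thresholds, and field names are hypothetical.
    def classify(feature: dict) -> str:
        """Assign a class from quantified composition (wt%) and size (um)."""
        wt = feature["wt"]
        # Rules are checked in priority order; the first match wins.
        if min(wt.get(e, 0) for e in ("Pb", "Sb", "Ba")) > 5:
            return "GSR candidate"   # classic Pb-Sb-Ba gunshot-residue signature
        if wt.get("Fe", 0) > 50 and feature["diameter_um"] > 10:
            return "Large ferrous contaminant"
        if wt.get("Al", 0) > 10 and wt.get("O", 0) > 20:
            return "Alumina-rich inclusion"
        return "Unclassified"

    print(classify({"wt": {"Pb": 8.0, "Sb": 6.5, "Ba": 7.1}, "diameter_um": 1.3}))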

Feature analysis can be used in different ways depending on the specific problem to be solved. In some cases, such as the analysis of gunshot residue (GSR) in forensics, we use particle analysis to perform “needle in a haystack” searches – finding rare and important particles amongst many unimportant ones.

Alternatively, it can be used to work out the characteristics of a population – determining the relative proportions of the parts of that population. Some applications fall somewhere in between these two end-members – such as in the case of technical cleanliness in automotive manufacture – where we are looking at contaminants of varying severity and size.

Image Credit: Oxford Instruments NanoAnalysis

What are some of the challenges that exist in the field of particle analysis, and how does AZtecFeature help address them?

There are three key challenges when it comes to feature analysis.

The Number

One of the major challenges in feature analysis is that we are typically analyzing thousands or tens of thousands of features per analysis, meaning that it becomes impossible for us to manually review each one. 

This means that we need to place a lot of trust in, and become reliant on, algorithms such as AZtecLive's TruQ to ensure that we are detecting elements correctly and taking into account common physical effects that could otherwise give us incorrect results.

The Variation

Another potentially difficult scenario relates to analyzing samples with a wide range of feature sizes. When this happens, we normally set the magnification for our analysis relatively high so that we can image the small features accurately, but this can cause problems for larger features, which then get cut by field boundaries.

We have a solution to this in AZtecFeature where we can automatically reconstruct these broken features, thereby ensuring that total particle counts and morphology measurements are all correct.
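
As a rough sketch of the idea (not Oxford Instruments' actual algorithm), broken-feature reconstruction can be thought of as merging fragments whose stage-coordinate bounding boxes touch across a shared field boundary:

    # Merging fragments of a feature cut by a field boundary. A simplified
    # illustration only; all names and tolerances are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Fragment:
        x0: float   # bounding box in stage coordinates (microns)
        y0: float
        x1: float
        y1: float
        area: float

    def touches(a: Fragment, b: Fragment, tol: float = 0.5) -> bool:
        """True if the two boxes abut or overlap to within `tol` microns."""
        return (a.x0 - tol <= b.x1 and b.x0 - tol <= a.x1 and
                a.y0 - tol <= b.y1 and b.y0 - tol <= a.y1)

    def merge(a: Fragment, b: Fragment) -> Fragment:
        return Fragment(min(a.x0, b.x0), min(a.y0, b.y0),
                        max(a.x1, b.x1), max(a.y1, b.y1), a.area + b.area)

    def reconstruct(fragments: list[Fragment]) -> list[Fragment]:
        """Greedily merge touching fragments until no merges remain."""
        feats = fragments[:]
        merged = True
        while merged:
            merged = False
            for i in range(len(feats)):
                for j in range(i + 1, len(feats)):
                    if touches(feats[i], feats[j]):
                        feats[i] = merge(feats[i], feats[j])
                        del feats[j]
                        merged = True
                        break
                if merged:
                    break
        return feats

    # Two halves of one particle, split across a field boundary at x = 500:
    halves = [Fragment(480, 10, 500, 30, 210.0), Fragment(500, 12, 515, 28, 120.0)]
    print(len(reconstruct(halves)))   # 1 feature, with the areas combined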

The Phases

The final problem to mention relates to how we detect particles. We normally use backscattered electron (BSE) images for particle detection, as this image type gives a good indication of where particles begin and end. This is because the contrast in a BSE image relates to the mean atomic number (Z) of the phase being imaged.

Sometimes, either due to bad luck or due to the way that we set up an image, we can find that multiple phases end up having the same grey level in an image – perhaps both being saturated white. 

This means that if we have a particle containing one or both of these phases, it will not be clear what is present. To overcome this, we use an approach where specific features can be mapped with EDS, and then the features are identified using a phase detection algorithm that is automatically applied to the maps.
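A stripped-down illustration of that detection step, assuming an 8-bit BSE image and purely illustrative thresholds; real setups tune these per sample, and the phase detection applied to the EDS maps is far more involved than this flagging step:

    # Grey-level particle detection on a BSE image, plus flagging of
    # saturated features that need EDS mapping to separate phases.
    import numpy as np
    from scipy import ndimage

    def detect_particles(bse: np.ndarray, threshold: int = 80):
        """Label connected bright regions in an 8-bit BSE image."""
        labels, n = ndimage.label(bse > threshold)
        return labels, n

    def needs_eds_mapping(bse, labels, n, saturation: int = 250) -> list[int]:
        """Flag features dominated by saturated pixels: two different phases
        can both clip to white, so grey level alone cannot separate them."""
        return [i for i in range(1, n + 1)
                if (bse[labels == i] >= saturation).mean() > 0.5]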

What are some of the qualities that users expect and desire from a particle analysis system?

There are multiple qualities that a good particle analysis system should possess, but it comes down to a few key points. I normally break it down to three things: a system needs to be smart, fast and accurate.

Being smart means that the system must be capable of doing its job in an efficient way. That is to say, it must be able to perform the task of particle analysis accurately, giving the correct results for the features that are analyzed. This means taking advantage of advanced algorithms like those previously mentioned. Furthermore, it should be able to deal with common issues experienced in particle analysis, such as:

  • Offering an intelligent, flexible and adaptable process for particle detection that can deal with a wide variety of samples
  • Providing the ability to effectively and routinely deal with small particles
  • Giving the user flexibility in how and what to analyze, enabling them to do things like terminate a run early if sufficient data has been acquired (a simple stopping rule is sketched after this list)
  • Being able to deal with broken particles, among other things
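
As an example of the early-termination point above, a generic statistical stopping rule (not AZtecFeature's own logic) might halt a run once the uncertainty on a class proportion is small enough:

    # Stop when the binomial standard error of the class proportion
    # p = n_class / n_total falls below a target. Illustrative only.
    import math

    def can_stop(n_class: int, n_total: int, target_se: float = 0.01) -> bool:
        if n_total == 0:
            return False
        p = n_class / n_total
        return math.sqrt(p * (1 - p) / n_total) < target_se

    # e.g. 120 inclusions in 4,000 particles: p ~ 3%, SE ~ 0.27% -> stop
    print(can_stop(120, 4000))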

Why is speed such an important element of particle analysis?

I do not think I have ever met a feature analysis customer who did not desire better throughput.

The whole reason for performing feature analysis is to make more efficient use of time. We turn to feature analysis when we want analysis automated for us – implying that we have probably already realized that we do not want or need to spend as much time sitting at the microscope performing manual analyses.

Fundamentally, being able to go fast means that the operator of the instrument can run more samples in a given period of time and therefore complete more work. This means that the instrument provides a greater return on investment. 

What factors can hinder the speed of the particle analysis process?

The speed at which you can perform feature analysis will depend on a variety of factors. One of the biggest is the nature of the sample itself – whether we are trying to differentiate between significantly different types of features or very similar ones.

If we are trying to measure subtle differences, then we will need to acquire longer duration spectra than if we are differentiating between very different phases. Another major factor that can increase the time of analysis is the size of the features to be analyzed. 

If we need to measure very small features, then we will need to increase the magnification significantly in order to image them effectively. This, of course, means that, in order to cover a given area, we need more fields of view and, therefore, more stage moves and image acquisitions. 

Depending on the precise nature of the sample, these two activities can take the largest proportion of time and significantly slow our overall run speed down.
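
A back-of-envelope calculation shows how quickly the field count grows: halving the field width (roughly, doubling the magnification) quadruples the number of fields, stage moves, and image acquisitions needed to cover the same area. The numbers here are purely illustrative:

    # Fields needed to tile a square sample with square fields of view.
    import math

    def fields_needed(sample_mm: float, field_um: float) -> int:
        per_side = math.ceil(sample_mm * 1000 / field_um)
        return per_side * per_side

    for field_um in (500, 250, 125):      # higher magnification -> smaller field
        print(f"{field_um} um fields: {fields_needed(10, field_um)}")
    # 500 um: 400 fields; 250 um: 1,600; 125 um: 6,400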

Can you provide an overview of the speed advantage that AZtecFeature delivers and how it does this?

AZtecFeature is able to run at feature acquisition rates (not including image acquisition and stage moves) in excess of 120,000 particles per hour – or over 33 particles per second. This speed includes the detection of particles, measuring their morphology, acquiring EDS data from each particle, processing that data with the TruQ algorithms, quantifying the particle composition, combining the morphology and compositional measurements on a particle-by-particle basis, and classifying and displaying the result – in other words, an awful lot.
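
It is worth restating that rate as a per-particle time budget:

    particles_per_hour = 120_000
    per_second = particles_per_hour / 3600    # ~33.3 particles per second
    ms_per_particle = 1000 / per_second       # ~30 ms for everything listed above
    print(f"{per_second:.1f} particles/s -> {ms_per_particle:.0f} ms per particle")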

This is achieved by a combination of optimizations across the measurement chain – including using large area detectors in order to get the EDS data we need quickly; by using specially optimized microscope control and pulse processing systems to ensure that communication between systems is as rapid as possible; and by utilizing smart algorithms to ensure that we only analyze what we need or want to. 

Finally, all of this is packaged into intuitive workflows that guide the user through the analysis process, making it faster to learn and run. In short, a lot of factors contribute to overall throughput.

Image Credit: Oxford Instruments NanoAnalysis

How do you ensure speed without impacting accuracy? 

Ensuring accuracy when running fast is absolutely essential for automated particle analysis. The first thing that is key to ensuring accuracy is the use of our TruQ algorithms. These are written in such a way that they ensure that you can rely on the quality of results at any speed. A good example of this is our pulse pile-up correction algorithm, which forms part of TruQ.

This algorithm deals with the fact that multiple X-rays may arrive at the detector at the same time and be counted as a single X-ray of, for example, double the energy (if two identical X-rays arrive at once).

If we did not correct for this effect, our compositional measurements would be wrong, and we would run the risk of identifying elements that are not there – as we would have extra peaks in our spectra. 

The pulse pile-up correction removes pile-up peaks and puts the counts back in the correct place, ensuring correct element identification and quantification even at very high speeds. Another example is our second-pass imaging approach – which locates particles with the accuracy of a slowly acquired image but the speed of a rapidly acquired one.
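
The physics of pile-up is easy to demonstrate in miniature. The sketch below applies a first-order correction, estimating the pile-up component as a scaled self-convolution of the spectrum and subtracting it; TruQ's actual algorithm is proprietary and more complete (for instance, it restores the removed counts to their parent peaks, which is omitted here for brevity):

    # Toy first-order pile-up correction: two X-rays of energies E1 and E2
    # arriving together register as one count at E1 + E2, so the pile-up
    # component is shaped like the self-convolution of the true spectrum.
    import numpy as np

    def pileup_correct(measured: np.ndarray, pileup_fraction: float) -> np.ndarray:
        """Treat `pileup_fraction` of all counts as piled-up pairs, model their
        spectrum as the normalized self-convolution of the measured spectrum,
        and subtract it."""
        n = measured.size
        sum_spectrum = np.convolve(measured, measured)[:n]
        sum_spectrum *= pileup_fraction * measured.sum() / max(sum_spectrum.sum(), 1e-12)
        return np.clip(measured - sum_spectrum, 0.0, None)

    # One true peak at channel 100 plus its spurious sum peak at channel 200:
    spec = np.zeros(512)
    spec[100], spec[200] = 10_000, 200
    corrected = pileup_correct(spec, 0.02)
    print(int(spec[200]), "->", int(corrected[200]))   # sum-peak counts mostly removed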

What benefits does TruQ technology grant AZtecFeature?

TruQ is right at the core of AZtecFeature and is essential for correct element identification and quantification. It corrects for a wide range of physical phenomena, including the absorption of X-rays by the material in which they are generated and effects on the spectrum around peaks.

It has been tested against alternative approaches, and we are absolutely confident in the quality of the results that it gives. Fundamentally, it gives better results, more of the time.

What adaptations can be made in relation to a client’s budget?

The main adaptation comes down to the hardware that you choose to use AZtecFeature with. AZtecFeature works with both Oxford Instruments NanoAnalysis' Xplore and Ultim Max ranges of detectors, which are available with sensor sizes from 15 mm² to 170 mm².

The larger the sensor, the more X-Rays can be detected and measured per unit of time, and therefore the shorter the amount of time we need to collect sufficient data to identify a particle. As such, with a larger area detector, we can work more quickly – i.e., have higher throughput. As a result, I would always recommend going with the largest detector that the budget will allow for.
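
To first order, the detected count rate scales with the sensor's active area (for fixed geometry, the solid angle subtended scales with area), so the time to reach a given number of counts falls in proportion. A toy calculation, using a hypothetical per-area count rate:

    def acquisition_time_ms(target_counts: float, area_mm2: float,
                            cps_per_mm2: float = 300.0) -> float:
        """Time to collect `target_counts` X-ray counts, assuming the count
        rate is proportional to sensor area (`cps_per_mm2` is hypothetical)."""
        return 1000.0 * target_counts / (cps_per_mm2 * area_mm2)

    for area in (15, 170):    # sensor sizes quoted above, in mm^2
        print(f"{area} mm^2 -> {acquisition_time_ms(5000, area):.0f} ms per particle")
    # 15 mm^2: ~1,111 ms; 170 mm^2: ~98 ms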

How easy is this system to set up and use?

AZtecFeature has been designed to be as easy as possible to use. We make use of guided workflows to take the user through the analysis process and provide "Step Notes" – in-interface instructions that can be customized to specific tasks – so that there is always a helping hand when it is needed.

We also combine all of the settings and optimizations made by users into recipes for easy recall later down the line – we even try to make it impossible to lose classifications.

Finally, for a range of industries, we offer pre-defined analysis recipes that are fully optimized for specific tasks, including the analysis of gunshot residue in forensics, the analysis of non-metallic inclusions in steel, understanding battery cleanliness, analyzing metal additive manufacturing powders and technical cleanliness in automotive manufacture.

What are some future improvements you plan on making to AZtecFeature?

As with so many approaches that involve image analysis, the ever-growing capabilities of machine learning-based approaches for identifying features are of great interest – watch this space.

About Dr Matt Hiscock

Dr Matt Hiscock holds an MSci in geology from Bristol University and a PhD in geochemistry from Edinburgh University. He has worked in the mining industry in Australia and has helped run an academic SEM facility focusing on geological analyses. Matt oversees our particle analysis products and works globally on applications – working to understand Oxford Instruments NanoAnalysis customers' needs and showing how our products address them.

This information has been sourced, reviewed and adapted from materials provided by Oxford Instruments NanoAnalysis.

For more information on this source, please visit Oxford Instruments NanoAnalysis.

Disclaimer: The views expressed here are those of the interviewee and do not necessarily represent the views of AZoM.com Limited (T/A) AZoNetwork, the owner and operator of this website. This disclaimer forms part of the Terms and Conditions of use of this website.
