LCGC Europe
LCGC Europe spoke to Leon Barron and Matteo Gallidabino to discuss novel nontargeted approaches to analyze explosive materials using ion chromatography (IC) with high resolution mass spectrometry (HRMS), and the challenges and solutions analysts can encounter when developing nontargeted methods.
Q. Nontargeted analysis (NTA) is currently gaining wider acceptance. What is NTA and where is it being used in your area of research?
Leon Barron: Most analyses are targeted in nature. That is, a number of specific compounds are selected before the analysis occurs. Nontargeted analysis (NTA) refers to applications where no specific analytes are shortlisted beforehand and the instrument captures everything it can detect, so that the data can be reviewed in a flexible way later. There are several ways to perform NTA, including: (a) using all of the data generated by the instrument to classify or differentiate samples as a whole from each other by, for example, principal component analysis (PCA); or (b) identifying specific “features” in the data that change significantly, for example following exposure to a toxic substance, while still not necessarily knowing their identities. As an extension of NTA, suspect screening is the identification of new compounds in the sample by matching measured data to one or more databases or by manual searching using theoretical accurate ion m/z. For liquid chromatography (LC)- and gas chromatography (GC)-based techniques, NTA has been most useful when coupled to high resolution mass spectrometry (HRMS) instruments, which comprise either time-of-flight (TOF) or orbital ion trap-based mass analyzers. The use of HRMS helps immensely to resolve significantly larger numbers of features and arguably represents the best means to identify these most rapidly afterwards too. In my area of environmental and forensic chemistry, NTA is becoming more commonly used. For example, environmental metabolomics is now emerging to identify endogenous metabolite features that change in aquatic species as a result of exposure to contamination or specific environmental conditions in rivers (1–4).
We have also extensively used suspect screening to identify new organic contaminants, including pharmaceuticals, illicit drugs, explosives, their metabolites, precursors, and transformation products in complex samples such as wastewater and river water to monitor community-scale activities (5–7). In forensic science, the ability to retrospectively mine such large datasets is very useful to go back and assess whether an analyte might have been present (8).
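The suspect screening described above boils down to comparing measured feature m/z values against theoretical values within a mass-accuracy window. A minimal Python sketch of that idea is below; it is illustrative only, not the authors' software, and the two suspect entries use approximate monoisotopic [M–H]⁻ m/z values for common explosive-related anions.

```python
# Illustrative sketch (not the authors' workflow): match measured feature m/z
# values against a small suspect database using a ppm mass-accuracy window.
SUSPECTS = {
    # approximate theoretical m/z values, for illustration only
    "nitrate": 61.9884,
    "perchlorate": 98.9485,
}

def ppm_error(measured, theoretical):
    """Signed mass error in parts per million."""
    return (measured - theoretical) / theoretical * 1e6

def screen(features, tolerance_ppm=5.0):
    """Return (feature m/z, suspect name, ppm error) for every match."""
    hits = []
    for mz in features:
        for name, theo in SUSPECTS.items():
            err = ppm_error(mz, theo)
            if abs(err) <= tolerance_ppm:
                hits.append((mz, name, round(err, 2)))
    return hits

print(screen([61.9886, 150.0412, 98.9482]))
```

In a real workflow the database would hold thousands of entries and matching would also consider isotope patterns, retention time, and fragmentation data, but the ppm-window comparison is the core step.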
Q. Have there been any major technological breakthroughs in nontargeted analysis?
LB: Arguably the most significant technological breakthrough that has pushed NTA forwards is the increased commercial availability of HRMS instruments that can be coupled to separation techniques such as LC or GC, for example. Mass accuracies of <1 ppm are now readily achievable with resolutions up to 140,000 full-width at half maximum (FWHM), providing elemental composition-level information in many cases. Similarly, the ability to perform different modes of data-independent acquisition (DIA) offers extra flexibility for NTA, including “all ion” fragmentation and sequential window acquisition of all theoretical mass spectra (SWATH), for example. Along with the ability to perform traditional targeted analysis too (data-dependent acquisition [DDA]), it has become possible to perform targeted analysis, NTA, and suspect screening using the same instrument and, in certain cases, simultaneously. As a result, published targeted methods are generally growing with respect to the number of analytes they include as new compounds are discovered or added continually (9,10). This has presented analysts with a new challenge: the scale and treatment of data. Given the amount of data these instruments acquire, data analysis is now the bottleneck. It can take significantly longer to review and interpret the results generated for a single sample than it takes to run it in the laboratory! Whilst excellent processing tools and databases exist for MS data, there is less focus on separations data in my opinion, and machine learning has recently proved useful here (10,11). I see massive potential in the use of machine learning generally moving forwards, not only for suspect screening but also for NTA, for example, for prediction of changes in ‘omics datasets or linking these to effects following toxicant exposure.
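To put the quoted figures in perspective, resolving power (R = m/Δm at FWHM) and ppm mass accuracy translate directly into peak widths and matching tolerances. This short back-of-the-envelope sketch, using the figures mentioned above purely as examples, shows the arithmetic:

```python
# Back-of-the-envelope sketch of what typical HRMS figures mean in practice.
def fwhm_peak_width(mz, resolution):
    """Peak width (Da) at half maximum, from resolving power R = m/dm."""
    return mz / resolution

def mass_window(mz, ppm):
    """Absolute m/z tolerance (Da) corresponding to a ppm mass accuracy."""
    return mz * ppm / 1e6

mz = 200.0
print(fwhm_peak_width(mz, 140_000))  # ~0.0014 Da wide peak at m/z 200
print(mass_window(mz, 1.0))          # 1 ppm at m/z 200 -> 0.0002 Da tolerance
```

At these widths, isobaric species that a unit-resolution instrument would merge into one peak become separable, which is why HRMS resolves so many more features for NTA.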
Q. You recently developed a method using both targeted and nontargeted gradient ion chromatography (IC) with HRMS to profile black powder substitutes and gunpowder residues. What were the aims of this research?
LB: The aim of this work was to develop and validate a new gradient IC-HRMS method that would be broadly applicable to quantitative determination of trace concentrations of low-molecular-weight inorganic and organic anions, but primarily that would be suitable for forensic casework in energetic materials analysis, including ammunition and explosives (12). We also wanted to exploit HRMS to potentially offer us more information about the sample. First, we focused on identifying a black powder substitute in fingermarks and sweat deposits from a donor using IC-HRMS. Using a mixture of targeted analysis and NTA with PCA, we investigated the time since the materials were handled, which was very exciting! Following this, we identified features in the data that drove any temporal trends to potentially serve as a new way to include or exclude similar residues of such materials found at a crime scene that were relevant to the case. We also examined gunshot residue to see whether we could use NTA to classify it by the original ammunition used.
Matteo Gallidabino: Samples submitted to forensic analysis are usually characterized by a higher number of species than those usually targeted by traditional methods. This means that they also potentially contain complementary information, which may be helpful to track back the trace origin and deposition mechanism. Hence, a complementary aim of this project was to assess if enhanced intelligence could actually be extracted from forensic samples through the judicious combination of NTA with modern data analytics.
Q. What were the main obstacles you encountered in this project and how did you overcome them?
LB: I have been working with IC-MS since around 2001 and coupling the two techniques has often been cumbersome in comparison to LC–MS in my experience (13). Thankfully, modern IC-MS systems are now available with integrated instrument control and data analysis. However, in this particular application there were two main challenges because IC generally operates using purely aqueous eluents. First, IC eluates are not very volatile and auxiliary pumps are often needed to deliver organic solvent into the eluate to aid gas phase transfer in electrospray ionization (ESI), especially for trace analysis. Second, extraction of complex samples typically encountered in forensic casework normally uses organic solvents not always compatible with IC separations. Therefore, we have been trying to find ways to circumvent these two issues by using organic solvent in the eluent itself, which presents its own challenges (14). So far, we have tried a number of additives, including methanol, acetonitrile, and ethanol, which have each removed the need for auxiliary pumps, making the coupling process much simpler and keeping the system cleaner at the same time. The trade-off is IC selectivity, which changes markedly and, in some cases, unpredictably. Furthermore, some organic solvents transform under alkaline conditions, which leads to interference; for example, acetonitrile can hydrolyze in hydroxide-based eluents to yield acetate ions in the background signals. In this work, we aligned the IC eluent with that of the sample extraction solvent directly, that is, 50:50 (v/v) ethanol–water. We also improved the selectivity over previous IC methods by introducing a gradient separation using carbonate–bicarbonate as an eluent and this worked very well.
As we do not usually know the identity of an energetic material in forensic science, it is necessary to perform both organic and inorganic screening. This extract would therefore normally be analyzed directly by an LC–HRMS method for a large suite of organic high-order explosives, precursors, and transformation products and then, following solvent exchange to remove ethanol, by IC-MS. By developing the separation in ethanolic eluents, our aim was that the sample extract could be analyzed directly, which could increase throughput and robustness.
MG: Challenges to overcome during this project were numerous, and not just limited to the decision of which strategy to adopt. A priority for the IC-HRMS method was that it could easily be aligned with current forensic practices. As 50:50 (v/v) ethanol–water is often used to extract explosive samples in casework, we decided to adopt this as the eluent. Preliminary tests on ESI performance supported the choice because they showed that this mixture led to the same or better signals than conventional eluents used in liquid-based chromatographic techniques. That was promising, but the viscosity of the mixture obviously also brought some challenges with the column back pressure, which had to be addressed by increasing the column temperature. Also, ESI using 50:50 (v/v) ethanol–water has rarely been investigated before, so the best conditions were essentially unknown. Therefore, the implementation of the method basically became a problem of fine-tuning all the parameters involved in the chromatographic, ionization, and detection steps! We eventually used a statistics-based design-of-experiments (DOE) approach to deconvolute this complexity and properly investigate analyte separation and responses to find optimal conditions. The use of predictive modelling methods was therefore not just limited to the evaluation of the new approach in an operational context, but also to its optimization. Thanks to this, we were able to achieve excellent analytical performance.
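A DOE approach like the one described starts by enumerating the combinations of factor levels to be tested before modelling the responses. The sketch below builds a small full-factorial grid; the factor names and levels are hypothetical examples, not the actual design used in this work.

```python
import itertools

# Hypothetical sketch of a small full-factorial DOE grid (factors and levels
# are invented for illustration, not the authors' actual design): enumerate
# every combination of candidate settings so responses can be modelled and
# optimal conditions located.
factors = {
    "column_temp_C": [30, 40, 50],
    "spray_voltage_kV": [2.5, 3.0, 3.5],
    "eluent_flow_uL_min": [250, 380],
}
design = [dict(zip(factors, combo)) for combo in itertools.product(*factors.values())]
print(len(design))  # 3 * 3 * 2 = 18 experimental runs
for run in design[:2]:
    print(run)
```

In practice a fractional-factorial or response-surface design would usually be chosen to reduce the run count, but the full grid shows the principle.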
Q. What is novel about your approach and what benefits does it offer over previous techniques?
LB: The use of IC-MS is not new, but its use in forensic science is really only emerging now. The benefits of this particular method were that a larger number of anions could be detected (n = 19) than previously possible as a result of the optimization of gradient conditions. The lower limit of detection lies in the low µg/L range, making it suitable for direct trace analysis across a range of applications if needed. One of the main benefits this method offered is obviously its direct integration into standard workflows, making the analytical process far simpler and practical for the analysis of organic-solvent-based extracts. As well as keeping the system clean, it also enables elution of hydrophobic/non-charged species, which would otherwise be fully retained by IC, thereby widening its scope.
While several applications have been reported in environmental science using IC-MS, for example (15), the use of IC-HRMS is quite rare. This technique offers obvious advantages in forensic science, especially for NTA and suspect screening. Here, our approach allowed us to show how both the targeted anions and the NTA profile of the rest of the contaminated sweat sample changed over a period of hours following contact with the black powder substitute, even after the donor washed their hands! Similarly, for a range of different gunshot residues collected after firing a gun, we were able to link these with the three original ammunition brands that were used. Using NTA, we were able to tentatively identify several new potential compounds afterwards that could provide additional linkages between different evidence types. The ability to provide some degree of source apportionment is a major advantage of any technique in forensic science and we established this proof of principle here.
MG: This novel approach is quite revolutionary and has all the characteristics to have a large impact on forensic practice. We proved that the combination of NTA-based techniques and advanced data analysis could actually provide enhanced intelligence for use in crime investigation. Not only does the method allow the main components in the submitted samples to be rapidly identified, but it can also extract additional information such as the time since handling and potential origin of the analyzed traces. Some supplementary work is still needed to truly implement these possibilities in actual casework, but our method is an effective step forwards. In this regard, it has the potential to unlock a range of new possibilities in forensic profiling, and also to further highlight the value of forensic science in crime investigation.
Q. Are you planning to develop this research further?
LB: Yes, we have just started a new project that will combine LC and IC-HRMS analysis together to identify precursors and indicative reactant species related to threat agent manufacture, including explosives and drugs, for example. In 2017, we were the first laboratory to identify residues of high-order explosives in municipal sewage using LC–HRMS, having performed drug-based wastewater epidemiology for many years (7,16). This project seems like an obvious way to integrate both techniques for application to a very complex matrix, such as wastewater, using NTA. Lastly, confirmatory analysis even of simple anions and cations is very much needed in other areas of forensic science, for example, in support of “acid attack” investigations. I also plan to extend my previous research into the analysis of disinfectants and their by-products. We have already made some progress recently using IC-HRMS for drinking water (17), but it will be good to extend our knowledge on the breadth of toxic species formed following disinfection processes in several other areas too.
MG: We previously showed how machine learning could help to associate different gunshot traces found at the scene of a gun crime (18). The approach worked well, but it could be further improved if coupled with NTA data and, thus, integrated into an ‘omics workflow. We are working on that and are going to test this hypothesis, also for the analysis of arson accelerants. The final objective is to develop a transversal profiling approach that can be applied across different fields and better support the criminal justice system in decision-making.
Q. Do you have any practical advice for chromatographers who are embarking on developing a nontargeted analysis method?
LB: First, try to make your NTA method as generic as possible so that it can capture a wide chemical space. This may mean developing a longer, shallower gradient to separate as many features as possible. Also, be aware that a single NTA method will not cover everything. You may need to identify species that could fall outside its scope and consider whether you need multiple separation modes for full coverage, for example. This is exactly why we put LC and IC-HRMS together for NTA of explosives-related evidence. Lastly, when setting up your sequences, make sure to randomize your samples, controls, and quality controls (QCs). NTA can often produce deceptively nice trends or classifications, but it is important that groupings or observed changes in the data are actually real, and not just a product of instrument performance drift. You may also want to think about your mass analyzer because you may not always get very high mass accuracy, resolution, and data acquisition speed all in one HRMS instrument.
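The advice on randomizing run order and interleaving QCs can be scripted in a few lines. This is an illustrative sketch (the QC interval and naming are arbitrary choices, not a prescribed protocol):

```python
import random

# Illustrative sketch: build a randomized injection sequence with a pooled QC
# injected every `qc_interval` runs, so instrument drift can be monitored and
# is not confounded with sample groupings.
def build_sequence(samples, qc_interval=5, seed=42):
    order = list(samples)
    random.Random(seed).shuffle(order)      # randomize run order
    sequence = ["QC"]                       # start with a QC injection
    for i, s in enumerate(order, start=1):
        sequence.append(s)
        if i % qc_interval == 0:
            sequence.append("QC")           # interleave pooled QCs
    if sequence[-1] != "QC":
        sequence.append("QC")               # finish with a QC injection
    return sequence

samples = [f"sample_{i}" for i in range(1, 11)]
print(build_sequence(samples))
```

Fixing the seed keeps the sequence reproducible for the batch record while still breaking any association between sample group and injection order.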
In my opinion, visualizing, manipulating, and interpreting the data are actually the hardest parts, rather than the laboratory science. Vendor-licensed and open-source software is available and helps with data normalization and chromatogram alignment, and a range of online databases support new compound identification if needed later on. Also, to give you added flexibility, try to learn a coding language such as R or Python. In many cases, freely available codes have already been written for complex tasks and these can be a very useful resource, not only for NTA but also other areas such as PCA and machine learning. It is not as hard as you think!
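As a taste of the kind of script worth learning, here is a toy PCA computed from scratch in pure Python (power iteration on the covariance matrix). It is purely pedagogical; real NTA work would use established packages, and the five-sample, two-feature dataset is invented for illustration.

```python
import math

# Toy sketch of PCA via power iteration, pure Python, for learning purposes
# only; real NTA datasets would be processed with established libraries.
def first_principal_component(data):
    """Return the leading eigenvector of the covariance matrix of `data`
    (rows = samples, columns = features) by power iteration."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    centered = [[row[j] - means[j] for j in range(p)] for row in data]
    # sample covariance matrix (p x p)
    cov = [[sum(centered[i][a] * centered[i][b] for i in range(n)) / (n - 1)
            for b in range(p)] for a in range(p)]
    v = [1.0] * p
    for _ in range(100):                   # power iteration to convergence
        w = [sum(cov[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# two strongly correlated features: PC1 should point along the diagonal
data = [[1.0, 1.1], [2.0, 1.9], [3.0, 3.2], [4.0, 3.9], [5.0, 5.1]]
print(first_principal_component(data))
```

Seeing the algorithm written out once makes the PCA score plots produced by vendor or open-source software much less of a black box.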
References
Leon Barron is a senior lecturer in forensic science at King’s College London, UK. He received his Ph.D. in analytical chemistry from Dublin City University, Ireland, in 2005. His expertise lies in analytical chemistry, particularly in separation science, mass spectrometry, and machine learning for targeted and nontargeted applications in environmental, forensic, and biological systems analysis.
Matteo D. Gallidabino is currently a senior lecturer in forensic science at Northumbria University in Newcastle, UK. He has a comprehensive background in criminalistics that he obtained at the School of Criminal Justice of the University of Lausanne, Switzerland. His work focuses on next-generation analytical techniques, successfully combining separation methods, mass spectrometry, and advanced data analytics to provide enhanced information in a forensic context and better support the court in the decision-making process.