LCGC North America
In this interview, Thomas Letzel of the Technical University of Munich discusses recent developments and trends in environmental water analysis, addressing topics such as the current state of analytical technologies for water analysis, the need to complement reversed-phase liquid chromatography (LC) with hydrophilic-interaction chromatography (HILIC) and supercritical fluid chromatography (SFC), and recommended approaches for water screening.
Do you think that existing analytical techniques and methods are sufficient to address the current challenges in water analysis?
I would say yes. We have very robust systems available, and it's impressive what analytical chemistry can do nowadays. The techniques are very sensitive, and we always want them to become even more sensitive. But to be honest, if a hair falls into the sample, then my result will show the proteins of that hair, and not the molecules of the sample I am trying to analyze, particularly if I have highly sensitive instruments. That means sample preparation is more important than ever.
But I think where we have room to advance is in terms of flexibility. We have so many techniques available, but they often cannot communicate with each other. For example, there are fluorescence screening systems that can simultaneously measure both absorbance spectra and fluorescence excitation–emission matrices and provide a nice fingerprint (1), but you cannot correlate that with mass spectrometry fingerprints, because there is no software that enables the two systems to communicate. You cannot transfer the data from one system to the other.
I think that kind of flexibility and communication between techniques will really be needed in the future.
In an interview with LCGC a few years ago (2), you noted that the leading technique for water analysis is LC–MS/MS with reversed-phase LC, but that there's a need to complement that with other techniques, like HILIC or even SFC. Has the use of those other techniques advanced or expanded recently?
These complementary approaches are needed more than ever, because water-soluble molecules, by definition, are very polar, and we cannot see polar molecules very well with reversed-phase LC combined with mass spectrometry.
Some people describe this as an analytical gap. But, to be honest, it's not an analytical gap. We just need to look at alternatives, like HILIC. HILIC is a good technique to separate very polar to moderately polar molecules.
The problem is that the mechanism of this technique is a little bit different from reversed-phase LC. You have to be sure that on the surface there is a water layer, for example. And the pH has to be robust, and the salt content is also important. But if you're an experienced chromatographer, you can learn these things easily.
In recent years I have given a lot of courses on HILIC. In the beginning of a course, I often ask, "Who has done HILIC in the past?" And everybody says, "I have. I tried." And then I ask, "Who still does it now?" And almost nobody does. But then I explain the mechanisms, and they always smile when they realize, "Oh, that is what I did wrong." When you understand the mechanism, it goes quite well.
The same is true for ion chromatography. This technique has been known for a long time, but you need charged molecules to use ion chromatography. HILIC can analyze both charged and uncharged molecules. Therefore, I think HILIC is the more universal technique for the future compared with ion chromatography. Ion chromatography is established in many labs, but analysts should not be afraid to implement HILIC.
Capillary electrophoresis (CE) is also quite good for very polar molecules, and we should also consider supercritical fluid chromatography (SFC). Formerly, a lot of pharmaceutical companies used it for preparative purposes, and more recently it has become more popular for analytical use. But the coupling of analytical SFC with MS is relatively young.
Many labs are a little hesitant to implement unfamiliar techniques, but it's not about the techniques themselves, because they are not overly complex. If you're an experienced analytical chemist, sometimes it's nice when not everything is familiar. Learning the details of an unfamiliar technique makes things interesting.
You can exercise your expertise as a chromatographer.
Exactly, yes, that's true. But my point is that there is not an analytical gap. The techniques are there, and labs should try to use them. There is a lag in education. But that is another topic.
What are some of your recommendations for LC–MS/MS screening and quantitation of contaminants in water?
There are different views on that. On the one hand, if we speak about water analysis, we speak typically about anthropogenic contamination. This may come from surface runoffs, sewage treatment plants, or other sources.
For the ecological environment, direct effluent from wastewater treatment plants is important. For example, we can look at diclofenac in Europe. Typically, it's not degraded in sewage treatment plants (in the United States, we don't see it, because the drug is not used much there).
So we know that this drug can be in the water. If we want to do a quantitative analysis, we use an internal standard with most techniques, or with mass spectrometry, you use an isotope-labeled standard. You inject that, or you spike your sample with that. Then you can easily calculate the recovery, and then do quantitative analysis.
With modern triple-quadrupole systems, you can quantify up to several hundred molecules in one run if you have the internal standards available. This type of work is called targeted screening (formerly called quantitative analysis), and we can do very sensitive and robust measurements this way.
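To make that recovery and quantitation calculation concrete, here is a minimal Python sketch assuming peak areas from an LC–MS/MS run with a spiked isotope-labeled standard; the function names, the diclofenac-d4 example, and all numbers are hypothetical illustrations, not a specific workflow from the interview.

```python
# Minimal sketch of internal-standard quantitation with an isotope-labeled standard.
# All names and numbers are illustrative only.

def recovery(measured_istd_area: float, expected_istd_area: float) -> float:
    """Fraction of the spiked isotope-labeled standard recovered from the sample."""
    return measured_istd_area / expected_istd_area

def quantify(analyte_area: float, istd_area: float,
             istd_conc_ng_per_l: float, response_factor: float = 1.0) -> float:
    """Analyte concentration relative to the co-eluting labeled standard.

    response_factor corrects for any difference in ionization efficiency
    between the analyte and its isotope-labeled analogue.
    """
    return (analyte_area / istd_area) * istd_conc_ng_per_l * response_factor

# Example: diclofenac quantified against diclofenac-d4 spiked at 100 ng/L.
rec = recovery(measured_istd_area=8.2e5, expected_istd_area=1.0e6)   # ~82 % recovery
conc = quantify(analyte_area=4.1e5, istd_area=8.2e5, istd_conc_ng_per_l=100.0)
print(f"recovery: {rec:.0%}, estimated concentration: {conc:.0f} ng/L")
```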
But you also see a problem with this, because a lot of this analysis has become routine. For example, if you sell fruit, then you want to know that there is no pesticide in the fruit. That means everybody wants this to be measured, and therefore, we have a lot of labs doing this kind of work. But only the big labs can survive nowadays, because they can do these analyses more cheaply than small labs can.
But we can look at this from another point of view, in terms of nontargeted screening, which means we do not have a specific target in mind to identify or quantify. With nontargeted screening using time-of-flight mass spectrometry (TOF-MS), we can measure a huge mass range in one scan. You don't have to focus on one molecule. You can measure whatever comes out of the column at that moment and can be ionized. And you can do that very sensitively and very accurately, thus obtaining an empirical formula. And if you have a tandem mass spectrometer (MS/MS), you can also fragment the molecules to get structural information and identify them.
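As an illustration of how an accurate mass is matched to an empirical formula, the short Python sketch below compares a measured mass against a few candidate monoisotopic masses by parts-per-million error; the candidate list and the 5 ppm tolerance are assumptions chosen for the example.

```python
# Sketch: match a measured accurate mass against candidate empirical formulas
# by parts-per-million (ppm) mass error. Candidates and tolerance are illustrative.

CANDIDATES = {
    "C14H11Cl2NO2 (diclofenac)": 295.01668,   # monoisotopic neutral mass
    "C13H18O2 (ibuprofen)":      206.13068,
    "C8H10N4O2 (caffeine)":      194.08038,
}

def ppm_error(measured: float, theoretical: float) -> float:
    return (measured - theoretical) / theoretical * 1e6

def match_formula(measured_mass: float, tolerance_ppm: float = 5.0):
    """Return all candidate formulas within the ppm tolerance."""
    hits = []
    for formula, theo in CANDIDATES.items():
        err = ppm_error(measured_mass, theo)
        if abs(err) <= tolerance_ppm:
            hits.append((formula, round(err, 2)))
    return hits

print(match_formula(295.0165))   # -> [('C14H11Cl2NO2 (diclofenac)', -0.61)]
```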
I have a picture that describes nicely the difference between targeted and nontargeted analysis. It is the cover picture of a book that came out last year, edited by Jörg Drewes and me (3). In this picture, there are two boats. One reflects nontargeted screening: you catch the fish with a net, capturing everything in one place. In the other boat you have fly fishing, catching one known fish at a time and counting each individual fish you catch. That is like targeted analysis.
In environmental chemistry, at the moment we have the mind-set that we want to know all the molecules by name. In targeted screening, we know them. We know that they are relevant. Therefore, we look at them.
On one hand, that's exactly what we should do in environmental analysis. Because we have to know how anthropogenic compounds are entering the environment and affecting the environment. We want to know if a compound is a metabolite, if it's a transformation product, or if it's an original molecule that is passing through a sewage treatment plant. We want to know all the compounds by name.
So in one sense, this is realistically the way we should go. But in another sense, maybe we are thinking in an old-fashioned way, because nontargeted screening can deliver more.
I think that the future of nontargeted screening is that we won't even want to know all the molecules by name, because in each sample we measure, we see hundreds and thousands of molecules. They are not all anthropogenic. They are not all "bad guys." Some of them, or even most of them, are biogenic. If a leaf falls into the river and degrades, then you have those biogenic molecules in the water, and we measure them in the same way we measure the anthropogenic ones. Sometimes the biogenic molecules dominate, and then you have a picture full of molecules.
In nontargeted screening, I want to measure all the molecules that occur in the sample. This is not only true for water, it's also true for other matrices and other disciplines. Metabolomics uses this approach, analyzing all the molecules in blood, in urine, and in other samples. I have a PhD scientist in my group at the moment running plant metabolite studies where we are doing nontargeted screening.
With a nontargeted screening approach you can monitor if something is changing. If we have retention times, then we can say something about the polarity of the molecules, for example. If we are using mass spectrometry, we have some information about the masses of the compounds. If I oxidize molecules, they become more polar and they become heavier because of oxygen addition, for example.
That means I can look into these data and get more information. There are physicochemical parameters behind the data. Sometimes we may just want to look at the changes from a statistical point of view. There are various questions you can answer with nontargeted screening.
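As one concrete way to exploit that reasoning, here is a hedged Python sketch that flags feature pairs in a nontargeted data set whose masses differ by one oxygen atom and whose retention time shifts toward higher polarity on reversed-phase LC; the feature list, tolerances, and variable names are illustrative assumptions only.

```python
# Sketch: flag possible oxidation products in nontargeted screening data.
# A suspected oxidation product is ~15.9949 Da heavier (one oxygen added) and,
# on reversed-phase LC, usually elutes earlier (more polar). Values are illustrative.

O_MASS = 15.994915          # monoisotopic mass of oxygen
MASS_TOL = 0.005            # Da, assumed tolerance

# (feature id, neutral mass in Da, retention time in min) -- made-up example features
features = [
    ("F1", 295.0167, 12.4),
    ("F2", 311.0116, 10.8),   # F1 + O, elutes earlier -> oxidation candidate
    ("F3", 206.1307, 14.9),
]

def oxidation_pairs(feats):
    pairs = []
    for pid, pmass, prt in feats:            # possible parent
        for cid, cmass, crt in feats:        # possible transformation product
            if abs(cmass - (pmass + O_MASS)) <= MASS_TOL and crt < prt:
                pairs.append((pid, cid))
    return pairs

print(oxidation_pairs(features))   # -> [('F1', 'F2')]
```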
Then we have to combine this screening with functional assays, with bioassays, because then we need to look at molecules at that point in terms of functionality, to see what is toxic or what may be causing environmental problems.
You and a group of scientists created an analytical software platform, called FOR-IDENT, using chemical compound databases for molecule identification (4). It can be applied to samples ranging from water, to plants, to house dust. Can you tell us about this platform and why it's useful?
We realized that if we wanted to do nontargeted screening, we needed more tools. In nontargeted screening in proteomics or metabolomics, we refer to the accurate mass data that we can get from our high-resolution mass spectrometers.
That is also true in nontargeted screening for environmental analysis. If you have accurate mass, you know something about the empirical formula of the molecules, and if you have a database that contains these molecules, you can say, "This empirical formula may be this compound."
But with HPLC and GC, you also have other information. For example, you have information about the polarity of the molecules. In LC, you can also say something about the octanol–water distribution coefficient, logD. If you inject ten reference compounds, you can build a retention time index: your molecule elutes at a specific point, and that point on the retention time index corresponds to a logD value. That means we have physicochemical information about the molecule.
We cannot predict retention times with that, but based on the accurate mass data, a compound database may give us three suggestions: one is very polar, with a negative logD; another is around zero; and the third is strongly positive. Because you know the retention time from your chromatography, you also have the experimental logD value. Then you can easily say, "It's more likely that my compound is the nonpolar one," or "it's more likely the very polar one." This works as an easy and very efficient filter.
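A minimal sketch of that filtering step is shown below, assuming a simple linear retention time index calibrated against reference compounds of known logD; the calibration points, window width, and candidate logD values are hypothetical.

```python
# Sketch: use retention time to estimate logD, then filter database candidates.
# The linear calibration and all values below are illustrative assumptions.

# Reference compounds: (retention time in min, logD at the working pH)
calibration = [(2.0, -2.5), (6.0, 0.0), (10.0, 2.5), (14.0, 5.0)]

def estimate_logd(rt: float) -> float:
    """Simple linear interpolation of logD from the retention time index."""
    pts = sorted(calibration)
    for (rt1, ld1), (rt2, ld2) in zip(pts, pts[1:]):
        if rt1 <= rt <= rt2:
            return ld1 + (rt - rt1) * (ld2 - ld1) / (rt2 - rt1)
    raise ValueError("retention time outside calibrated range")

def filter_candidates(rt: float, candidates: dict, window: float = 1.5) -> list:
    """Keep only candidates whose predicted logD lies near the observed one."""
    observed = estimate_logd(rt)
    return [name for name, logd in candidates.items() if abs(logd - observed) <= window]

# Three isobaric suggestions from a compound database (hypothetical logD values)
suggestions = {"candidate A": -3.0, "candidate B": 0.2, "candidate C": 4.1}
print(filter_candidates(rt=6.5, candidates=suggestions))   # -> ['candidate B']
```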
And then we realized that we have accurate mass and retention time information from an analysis, but if you use tandem mass spectrometry you also have mass spectra, and the compound identification suggestions you get from retention time and accurate mass might include some isomers that are very similar in polarity. They are similar in mass, but maybe different in their mass spectra.
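To illustrate how MS/MS spectra can separate isomers that accurate mass and polarity cannot, here is a hedged sketch of a simple cosine spectral similarity score; the binning width and the spectra themselves are made up for the example and do not come from any particular library.

```python
# Sketch: cosine similarity between a measured MS/MS spectrum and library spectra,
# to distinguish isomers with identical mass and similar polarity. Illustrative only.
import math

def cosine_score(spec_a: dict, spec_b: dict, bin_width: float = 0.01) -> float:
    """Spectra are {fragment m/z: intensity}; m/z values are binned before comparison."""
    def binned(spec):
        out = {}
        for mz, inten in spec.items():
            key = round(mz / bin_width)
            out[key] = out.get(key, 0.0) + inten
        return out
    a, b = binned(spec_a), binned(spec_b)
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

measured = {215.05: 100.0, 250.02: 40.0, 178.08: 15.0}
library = {
    "isomer 1": {215.05: 95.0, 250.02: 45.0, 178.08: 10.0},
    "isomer 2": {199.07: 80.0, 231.04: 60.0, 151.02: 30.0},
}
for name, spec in library.items():
    print(name, round(cosine_score(measured, spec), 3))
# isomer 1 scores near 1.0 (shared fragments); isomer 2 scores 0.0 (none shared)
```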
And the last thing is that, in the last 10 or 20 years, we have seen a lot of stand-alone software solutions come out to help with this type of identification. We decided to create a database that also does that but is modular and includes a lot of different parameters that communicate with each other. That is what we implemented with our FOR-IDENT platform. It's a modular platform to handle nontargeted screening data in a way that helps scientists identify, or potentially identify, compounds found in nontargeted screening.
That means we have three identifiers for one measurement, all from the physicochemical information you get out of LC–MS, and you can do the same with GC–MS. There, you don't have polarity, but you have the Henry constant, vapor pressure, and the retention time index.
So, GC is a bit different. To use the FOR-IDENT database, you need the molecular ion and the fragment spectrum, but in GC–MS, we use electron impact as the ionization technique. That means you fragment your molecules in the ion source.
That's where new techniques, such as soft ionization techniques, can be valuable. We have a nice cooperation at the moment in our lab where we coupled a new cold plasma ionization source (5) to our SFC, our LC, and our GC; we coupled one source for all of these to the same mass spectrometer, and we generate molecular ions and MS/MS spectra from it.
That gives us a new capability to identify the compounds. Typically, if you want GC spectra, you go to a NIST library where the spectra are available, but if a spectrum is not in the library, you cannot identify the compound that way.
But if you have a compound database like STOFF-IDENT, you can do this. You have a molecular ion. You have MS/MS information, and you have the retention time information. Then you can use FOR-IDENT-type databases or platforms.
Your work on the FOR-IDENT platform and other databases is very collaborative. How does that collaborative work fit into a broader context across Europe or even worldwide?
There is a really powerful community doing this pioneering work, on the techniques themselves and also in handling the data, and we have agreed on an overall classification procedure. A whole community all over the world agrees on that.
This is really a good feeling. One thing I forgot to mention before is that the FOR-IDENT platform also makes it possible to upload supplier data. That had been needed for years. In the past, many suppliers had been reluctant to share these data, but then they realized that this is a solution needed by the community. I'm really proud of what the community has done so far.
For me personally, another important outcome of this work is that last year we formed a new analytical research and educational institution for nontargeted screening, called AFIN-TS GmbH (www.afin-ts.de). This is outside the university, because we really want to educate industry. I mentioned before that the so-called analytical gap is not really about technological solutions; it's more about education. We realized that sometimes using new technologies is not that easy for industrial or government laboratories. Therefore, we created a research institution, because industry sometimes needs this type of help. Hopefully, this work will help nontargeted screening become not only popular, but also more routine, so that it helps us all answer future questions. We are hosting a workshop this fall on nontargeted screening in Munich (6). I encourage people to attend and share ideas.
(1) Aqualog. Information available at https://www.horiba.com/en_en/products/detail/action/show/Product/aqualog-1578/
(2) A. Matheson, LCGC E-Separation Solutions, December 10, 2015.
(3) J.E. Drewes and T. Letzel, Eds., Assessing Transformation Products of Chemicals by Non-Target and Suspect Screening: Strategies and Workflows, Vol. 2 (American Chemical Society, Washington, DC, 2018).
(4) FOR-IDENT. Information available at https://www.for-ident.org/
(5) Sicrit ionization devices. Information available at http://www.plasmion.de/
(6) Solutions and Workflows in (Environmental) Molecular Screening and Analysis (SWEMSA 2019), October 21–23, 2019. www.swemsa.eu.
ABOUT THE INTERVIEWEE
Thomas Letzel
Thomas Letzel is an associate professor and the Chair of Urban Water Systems Engineering at the Technical University of Munich (TUM), in Germany, and a co-founder of AFIN-TS GmbH.