A snapshot of key trends and developments in the chromatography sector according to selected panellists from companies that exhibited at Analytica 2018.
GC/GC–MS
LCGC: What trends do you see emerging in GC or GC–MS?
Phillip James: Increasing levels of regulation across many sectors have resulted in growing numbers of samples requiring analysis, and this has led to several trends in gas chromatography (GC). A shortage of skilled chromatographers means there is a greater need for instruments that are simpler to operate and maintain, for sample preparation that is simplified and automated, and for more sophisticated software for interpreting results. The greater number of samples also creates a demand for faster analysis times.
I also believe we will see a trend towards greener chromatography. This increased level of testing means a much larger CO2 footprint for the analytical testing industry. I think a move towards greatly reducing the CO2 footprint for each sample is going to become more relevant.
Lorne Fell: An emerging trend is certainly nontarget analysis: the ability to truly discover what is holistically in a sample has been recognized in recent articles, conferences, and workshops. The need for nontarget analyses will continue to branch beyond the typical markets of metabolomics and food, flavour, and fragrance into petroleum, food safety (foodomics), and environmental exposure (exposomics).
Another newly developing trend is dual detection for GC and multidimensional GC (GC×GC), particularly the combination of mass spectrometry (MS) with flame ionization detection (FID). These technologies allow researchers to quantify with the FID, whose near-universal response removes the need for an internal standard for each analyte, while confirming identity through MS deconvolution and library searching.
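As a simple illustration of why a near-universal response simplifies quantification, relative amounts can be estimated from FID areas alone by normalization. The following minimal Python sketch assumes a roughly uniform response and uses invented compound names and areas; it illustrates the principle only and is not any vendor's workflow.

```python
# Minimal sketch: semi-quantification from FID peak areas alone.
# Because FID response is roughly proportional to the carbon content of most
# hydrocarbons, relative amounts can be estimated by area normalization
# without a calibration standard for every analyte. Values are illustrative.

peak_areas = {          # hypothetical integrated FID areas (arbitrary units)
    "toluene": 152_300,
    "ethylbenzene": 48_900,
    "o-xylene": 61_750,
}

total = sum(peak_areas.values())
for analyte, area in peak_areas.items():
    print(f"{analyte:12s} {100 * area / total:5.1f} % (area-normalized)")
```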
Ulrich Meier: Instruments that are more versatile, easier to use and service, with smaller environmental footprints and less bench space, are emerging. Instruments with a more compact design and easily exchangeable injection and detection techniques are another trend.
Laura McGregor: In recent years, we have seen GC×GC becoming a more routine technique. Until now, users have often shied away from the technology, convinced that it was too expensive, or too difficult, to implement in their laboratories. However, more affordable hardware and simpler workflows mean it is now starting to be adopted by high-throughput laboratories.
Jack Cochran: On the MS hardware side, the trend for improvements in selectivity and sensitivity continues. There seems to be a move in some application areas from highly selective tandem MS (MS/MS) systems towards accurate mass instruments that can provide selectivity and universal detection simultaneously.
Some chemists argue that further MS sensitivity improvements are not necessary but there are select ultratrace applications, such as determination of brominated dioxins in environmental samples, that do demand better sensitivity. Given the popularity of “just enough” sample preparation methods like QuEChERS, extra MS sensitivity also supports injecting less dirty sample on the GC instrument to reduce the amount of maintenance required on the inlet and column. That maintenance leads to loss of sample throughput.
I do not want to overlook the development of powerful data analysis software as a trend either, especially for accurate mass MS instruments. Calculation of chemical formula, automated searching of web chemical structure databases, generation of Kendrick mass defect plots, and statistical analysis programs really extend the power of these instruments.
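As a concrete example of one of the capabilities mentioned, a Kendrick mass defect value is obtained by rescaling each measured mass to the CH2 base and taking the defect. The short Python sketch below uses one common convention and invented m/z values for a homologous series; it is illustrative only.

```python
# Minimal sketch of a Kendrick mass defect (KMD) calculation on the CH2 base,
# the kind of plot-building step such software automates. Members of a
# homologous series (differing by CH2) share nearly the same KMD.

CH2_NOMINAL, CH2_EXACT = 14.00000, 14.01565

def kendrick_mass_defect(measured_mass: float) -> float:
    """Return the CH2-based Kendrick mass defect (one common convention)."""
    kendrick_mass = measured_mass * CH2_NOMINAL / CH2_EXACT
    return round(kendrick_mass) - kendrick_mass   # nominal KM minus KM

for mz in (254.2479, 268.2635, 282.2792):         # hypothetical homologous series
    print(f"m/z {mz:9.4f}  KMD = {kendrick_mass_defect(mz):+.4f}")
```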
LCGC: What is the future of GC or GC–MS?
Phillip James: One area in which I believe the future of GC lies is ultra-fast GC (UFGC). The technology has the potential to answer many of the issues facing analysts today. Greatly reduced power consumption for each sample, faster cycle times, smaller instrument footprints, and greater portability are some of the benefits. Greater adoption of this technique will help drive the development of improved detection systems that are both faster and more sensitive.
Rapid advances in computing power have allowed for new types of data processing that were previously impossible. This will allow new types of detector and vast improvements in the automated interpretation of results and data. I also believe that there is still scope for further advances in GC column technology to complement UFGC.
Lorne Fell: As I mentioned previously, the technologies that are best suited for the nontarget analysis markets will continue to evolve and become more mainstream. These technologies will include multidimensional chromatographic separations and high-speed, high-resolution mass spectrometers. The future, however, will be dominated by the software's ability to easily and reliably generate chemical and biochemical information out of mountains of data. I believe GC and GC–MS are having a resurgence in popularity with several significant technological advances, and GC is once again achieving prominence at major conferences.
Ulrich Meier: GC is migrating to GC–MS, and GC–MS is migrating to GC–MS/MS. MS systems are increasingly replacing systems with specific detectors, such as the electron capture detector (ECD), the phosphorus-nitrogen detector (PND), the flame photometric detector (FPD), or the FID, because of tighter regulatory conditions and the acceptance of MS by many GC users. There is also a limited range of components detectable by non-MS detectors. Positive identification is still a main motive, and more GC laboratories are investing in MS-based configurations.
Laura McGregor: There is a high demand for faster, fully-automated methods with simple reporting and minimal review. Basically, “push button” systems that answer a particular question. For example, analysts commonly want to know: “What’s the difference between sample A and sample B?” Instrument manufacturers will, of course, be striving to provide instruments that answer such questions, although I doubt we will ever be able to match the omniscient analytical systems in TV forensics shows!
Jack Cochran: The future of GC and GC–MS looks very good because some molecules are just always going to be more efficiently analyzed by gas chromatography. And when I say “efficiently”, I’m not only talking chromatographic separation power, but also ionizability, spectral library availability, quantification accuracy, and the cost of analysis.
LCGC: What is the GC or GC–MS application area that you see growing the fastest?
Phillip James: Although not a specific industry or application, the increase in regulatory testing means that only a small percentage of the samples analyzed contain compounds of interest above regulatory levels. The majority of samples are either blank or below levels of interest but still tie up expensive laboratory resources. From this we have seen a requirement for more intelligent systems for the rapid screening of samples, with only the positive samples then passed for detailed re-analysis.
The cannabis industry is an area of big growth in the need for testing throughout the supply chain. With more and more customers realizing the benefits analytical testing can bring right across this industry, this will only grow as further territories legalize its use and the industry matures.
Lorne Fell: Metabolomics is certainly still on a significant growth curve, but the analysis of cannabis is out-pacing all others right now. It will be very interesting to see how it plays out; regulations, legal matters, and methods are still in flux, but there is a large degree of momentum behind it right now.
Metabolomics is such a broad category that one needs to uncover the areas of higher growth. “Exposomics” is certainly gaining a lot of attention and importance.
Ulrich Meier: Hyphenated systems are gaining more interest. The possibility of obtaining analytical information by, for example, hyphenating thermogravimetric analysis (TGA), infrared (IR), and GC–MS in a single run using a limited amount of sample makes it an attractive proposition for material characterization.
GC×GC is an interesting field in academia or for very specialized users, but it still has to gain more acceptance in routine analysis.
Laura McGregor: GC×GC has been long-established as the technique of choice for the petrochemical industry, but recently we have seen a stronger uptake by the environmental sector. For example, environmental contract laboratories are now replacing time-consuming off-line fractionation steps and multiple analyses with a single solvent extract run by GC×GC with a FID.
In some cases, new legislation is actually driving the need for these advanced analytical techniques, for example, the Canadian Ministry of the Environment has now listed GC×GC as an acceptable analytical technique and published multiple regulatory methods on its use with a micro-electron capture detector (µECD) for monitoring chlorinated species, such as polychlorinated biphenyls (PCBs) and chloroparaffins.
Jack Cochran: Probably life sciences, which seems a bit counter-intuitive given that I often think of big molecules and liquid chromatography (LC) when I think of life sciences. But metabolomics, which involves the determination of small molecule metabolites in a biological system, has exploded over the last few years. Metabolomics research areas include disease diagnosis, fruit flavour improvements, drug development, and environmental impact, just to name a few.
Interestingly, the use of GC–MS for metabolomics overcomes an obstacle for the analysis of polar molecules, by using derivatization to make the compounds GC-amenable. It is worth the extra sample preparation trouble to use GC–MS because LC–MS can suffer from inaccurate quantification in complex metabolomic samples as a result of charge competition effects in electrospray ionization.
LCGC: What obstacles stand in the way of GC or GC–MS development?
Phillip James: I don’t see any real barriers to GC or GC–MS technology development. The advances in computing and production methods mean it is now possible for new concepts to be introduced. There is scope for some truly innovative and disruptive GC technology to appear in the next decade.
Lorne Fell: Obstacles are interesting to discuss because GC and GC–MS have been in the forefront of analytical capabilities for so long. Certainly spectral expansion of existing libraries (NIST etc.) is still necessary because many chemicals are still being discovered and need to be more quickly entered into libraries and published for all to use. For GC specifically, column developments for higher temperature (beyond 400 °C) analytical phases for petroleum applications would fulfil an unmet need in today’s marketplace.
Furthermore, clarity and speed for analytical method approval from governmental regulatory bodies would be highly beneficial.
Lastly, the impression (or myth) that GC and GC–MS are old technology and no longer useful is certainly an obstacle to development and continued adoption, and it is certainly not true!
Ulrich Meier: Running multiple instruments in the laboratory with specific software is a limiting factor. GC and GC–MS running on different software platforms require a higher level of training for the user. Solutions integrating all available sample introduction techniques, GC, GC–MS, and MS/MS on a single software platform, with easy data exchangeability, that also fit a regulated environment are, in my opinion, still more a wish than a reality.
Laura McGregor: Instrument hardware is constantly evolving, but it seems that the software aspects struggle to keep up with the huge volumes of data that may now be generated from these advanced techniques, especially when using high-resolution mass spectrometers. This so-called "Big Data" and the associated need to streamline processing workflows is one of the main obstacles to routine adoption of advanced GC–MS techniques.
For GC×GC, “omics”-type workflows are a key challenge. These require trends and differences to be spotted across huge sample batches, each containing hundreds, if not thousands, of variables. Currently, there is no fast and foolproof solution to this.
Jack Cochran: A major obstacle is the GC system itself, in the context of the analytes and the samples. Only relatively volatile analytes can be determined with GC and GC–MS. Highly polar or highâmolecular-weight compounds can only be chromatographed with difficulty, if at all. Samples containing nonvolatile matrix components can take the GC system down quickly. If you cannot get the compound through the GC, the power of the detector does not matter.
Another obstacle is the inability of GC–MS to distinguish isomers from each other, especially when they coelute on a GC column. This is important because one isomer may be toxic, or contribute most to a flavour or fragrance, versus another isomer. Stationary phase selectivity research seems mostly to be a thing of the past, and this would be a good obstacle to overcome.
A relatively new GC detection system based on vacuum ultraviolet (VUV) spectroscopy offers a potential solution to the isomer determination obstacle because absorbance spectra are based on a molecule's specific shape. This results in a unique fingerprint for a compound that allows for its spectral deconvolution from coeluting peaks even when they are isomers.
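The deconvolution described here relies on absorbances adding linearly, so a measured spectrum of coeluting isomers can be fitted as a weighted sum of reference spectra. The following minimal sketch (with invented spectra and weights) illustrates that least-squares idea; it is not the detector vendor's algorithm.

```python
# Minimal sketch of absorbance-spectrum deconvolution for two coeluting
# components. Assumes absorbances add linearly (Beer-Lambert), so the measured
# spectrum is fitted as a weighted sum of reference spectra by least squares.
# All spectra here are invented for illustration.
import numpy as np

# Reference spectra of two isomers sampled at the same wavelength points.
ref_a = np.array([0.10, 0.45, 0.80, 0.35, 0.05])
ref_b = np.array([0.05, 0.20, 0.40, 0.70, 0.30])

# "Measured" spectrum of the coeluting pair: 0.6 parts A + 0.3 parts B + noise.
measured = 0.6 * ref_a + 0.3 * ref_b + np.array([0.01, -0.01, 0.0, 0.01, -0.01])

# Solve measured ~ c_a * ref_a + c_b * ref_b for the contributions c_a, c_b.
basis = np.column_stack([ref_a, ref_b])
coeffs, *_ = np.linalg.lstsq(basis, measured, rcond=None)
print(f"estimated contributions: A = {coeffs[0]:.2f}, B = {coeffs[1]:.2f}")
```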
LCGC: What was the biggest accomplishment or news in 2017/2018 for GC or GC–MS?
Phillip James: The growing acceptance of UFGC is the most interesting development. Whilst the technique has been around for many years, the increasing number of companies offering UFGC systems or accessories means it is now being taken seriously. The availability of systems that combine UFGC with conventional air-bath GC will help drive adoption of the technique by allowing users to move to UFGC methods with the safety net of still being able to fall back on existing methods if required. Further adoption of this technique has the potential to act as a catalyst for other future developments.
Lorne Fell: The most innovative accomplishment that I witnessed this year was the development and description of comprehensive three-dimensional (3D) GC by Robert E. Synovec from the University of Washington.
Ulrich Meier: Hyphenation of instruments such as TGA–IR–GC–MS, GC–inductively coupled plasma (ICP)–MS, or high performance liquid chromatography (HPLC)–ICP–MS. These techniques are becoming available in a wider market.
Laura McGregor: As far as GC×GC is concerned, the biggest change over the past couple of years is the significant expansion of the application range, driven partly by more flexible instrument configurations. For example, GC×GC has now been applied to challenges as diverse as identification of cancer biomarkers, characterization of paper, monitoring human decomposition odour, and profiling illicit drugs. Looking forward, we are expecting this trend to continue, with advances in sample introduction, parallel detection incorporating novel detectors, soft ionization, and development of new stationary phases all helping to provide more information than ever before on the composition of complex samples.
Jack Cochran: For me it's seeing applications of what I consider two exciting new approaches to GC detection: atmospheric pressure chemical ionization (APCI)-MS and VUV spectroscopy. APCI-MS can offer 10 times (or more) sensitivity improvement for some analytes compared with electron ionization MS. This type of mass spectrometry's soft ionization promotes molecular ion formation, which is very important for compound identification. The environmental community is embracing this new technology for analysis of halogenated persistent organic pollutants, just one example of its utility.
VUV spectroscopy offers a unique selectivity compared with other detectors currently in the GC space. Strong applications include analysis of gasoline-range samples, and determination of terpenes in flavours, fragrances, and cannabis. Both applications benefit from absorbance spectra deconvolution to simplify complex samples.
Phillip James is the Managing Director of Ellutia.
Lorne Fell is a Separation Science Product Manager at Leco Corporation.
Ulrich Meier is the Business Line Leader, Chromatography, at PerkinElmer LAS GmbH.
Laura McGregor is a Product Marketing Manager at SepSolve Analytical.
Jack Cochran is the Senior Director of Applications at VUV Analytics.
LC/LC–MS
LCGC: What trends do you see emerging in LC or LC–MS?
Hansjörg Majer: Hydrophilic interaction liquid chromatography (HILIC) broadens the range of target compounds to small polar molecules. Although HILIC has been discussed for a long time, the method itself has become more routine because of better tools to perform this technique with confidence. Comprehensive two-dimensional liquid chromatography (LC×LC) will leave the research field and enter routine analysis if more tools are developed to make this powerful technology simpler.
Higher peak capacities can now be obtained by hyphenating multidimensional liquid chromatography with ion mobility spectrometry, and high resolution mass spectrometry is driving the “omics” scene.
Ashley Sage: I see a continued growth in the use of liquid chromatography with tandem mass spectrometry (LC–MS/MS) technology where previously other types of analytical instrumentation would have been preferable. This is because the improvement in MS technology, in both hardware and software, allows highly selective and sensitive methods to be developed for applications including food safety, food authenticity, and security; environmental protection, including water quality testing; biopharmaceutical research and development; and clinical research, forensics, and “omics”-related research. The use of high-resolution accurate mass LC–MS in all these applications is a growing trend.
Atsuhiko Toyama: Automation has always been the trend in LC and LC–MS and has been driving the development of various “analyzers” for alleviating the need for optimization and streamlining the sample preparation workflow. I predict that this trend will continue but with increased sophistication. Now instrument vendors are beginning to provide fully automated, “sample-to-result” analytical platforms.
Kai Scheffler: On the LC side the trend towards ultrahigh-performance liquid chromatography (UHPLC) using sub-2-µm particle sizes is ongoing, whereas for low-flow applications the trend splits into two directions: towards very low flow (sub-nanolitre) rates, and towards capillary flow LC. Very low flow is required where applications demand extremely high sensitivity. The demand for increased sensitivity, throughput, and robustness has seen capillary flow LC becoming more important because of its ability to provide increased MS sensitivity compared to typical analytical flow LC–MS, with the additional advantages of lower solvent consumption and higher throughput while maintaining similar sensitivity to nanoflow LC. For routine and quality control (QC) markets we see demand for increased productivity, robustness, reliability, and accuracy with high selectivity and sensitivity; this can be addressed by LC–MS, with a continuing trend towards the use of high-resolution accurate mass (HRAM) MS within these environments.
LCGC: What is the future of LC or LC–MS?
Hansjörg Majer: Nano-LC and capillary LC are the way forward for greener and faster liquid chromatography technology.
Fast targeted analysis using fast and specific columns is the way forward for routine analysis, while increasing peak capacity is the way forward for understanding complex samples.
The routine analyst will ask for automation of the complete workflow, including sample preparation as a kit or as a specific analyzer.
Ashley Sage: The future of LC–MS is the continued growth and adoption of the technology to solve challenging analytical problems. Mass spectrometry development over the past 20 years or so has meant that smaller, faster, more selective, and more sensitive instrumentation is being designed and implemented for analytical assays that can be considered extremely complicated. The analysis of multiple analytes with minimal sample preparation or the identification of protein structure is no longer a complex challenge. One of the biggest challenges going forwards is processing the data and understanding what all this data means to scientists.
Atsuhiko Toyama: Although much effort has been put into simplifying data processing and review, these factors still remain far from full automation, consuming a lot of time and analytical expertise. It is easy to envisage the implementation of artificial intelligence as an integral part of an analytical platform to assist data processing and review, as well as enabling self-diagnosis, self-tuning, and self-maintenance to further reduce human intervention in a routine operation.
Kai Scheffler: In the future, LC will continue to be a separation technique of choice, and we expect to see increasing connectivity with MS, in particular for workflows in the routine applied markets and quality control. MS, and in particular HRAM MS, provides an additional level of confidence that most optical detectors simply cannot provide. In addition, with recent technological advances, both triple quadrupole (QqQ) and HRAM MS can now provide a level of sensitivity that opens doors for replacing other technologies, for example, costly immunoassays in clinical laboratories. This does, however, require very easy-to-use, robust instruments and workflows providing reliable, high-quality data, regardless of the user's experience.
LCGC: What is the LC or LC–MS application area that you see growing the fastest?
Hansjörg Majer: Food safety continues to be one of the fastest growing areas. Fast biomarker analysis will become a more prominent area in clinical applications when drug monitoring becomes more routine.
Multidimensional LC combined with multidimensional spectroscopy will drive all the “omics” sectors, but will become important in other complex areas, including food fraud.
Ashley Sage: Several application areas are growing fast with the use of LC–MS. These include food safety (protection of the consumer from contaminants and adulteration of food ingredients and products); pharmaceutical development for health protection, particularly the design and development of new biopharmaceutical-related therapeutic medicines; clinical research, especially related to disease identification and patient treatment; and metabolomics and proteomics and the understanding of how human health is affected by external factors and how debilitating diseases, such as cancer and Parkinson's, can be better understood to hopefully find better treatments and potentially cures in the future.
Atsuhiko Toyama: Multitarget screening and general unknown screening for food safety, toxicology, and drug-of-abuse testing have been showing dramatic changes in the past few years. Recently developed data-independent acquisition schemes have been implemented in this area that enable both quantification and qualification in an attempt to replace high-sensitivity targeted quantification by QqQ MS. Further improvements in the scanning speed of QqQ MS have enabled thousands of multiple reaction monitoring (MRM) transitions to be programmed into a single analysis to keep up with the number of compounds to be screened and quantified. High speed contributes not only to increased compound coverage, but also to improved identification confidence through a type of MRM that decreases false-positive reporting.
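To illustrate how thousands of transitions can fit into one run, acquisition is typically scheduled so that each MRM transition is monitored only within a window around its expected retention time. The sketch below models that scheduling logic with hypothetical compounds, transitions, and times; it does not represent any particular vendor's acquisition software.

```python
# Minimal sketch of retention-time-scheduled MRM: each transition is only
# monitored inside a window around its expected retention time, so thousands
# of transitions can be programmed while only a few are active at any instant.
# Compounds, transitions, and times below are hypothetical.

transitions = [
    # (compound, precursor m/z, product m/z, expected RT in min)
    ("compound_A", 304.2, 162.1, 4.8),
    ("compound_B", 290.1, 148.0, 5.1),
    ("compound_C", 412.3, 191.2, 9.7),
]

WINDOW = 0.5  # +/- minutes around the expected retention time

def active_transitions(run_time_min: float):
    """Return the transitions the instrument would monitor at a given time."""
    return [t for t in transitions if abs(t[3] - run_time_min) <= WINDOW]

for t_min in (5.0, 9.6):
    names = [c for c, *_ in active_transitions(t_min)]
    print(f"t = {t_min:4.1f} min -> monitoring {names}")
```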
Kai Scheffler: The biopharmaceutical market is a rapidly growing market with an increasing need for full characterization and monitoring of critical attributes of biopharmaceuticals from development to production, requiring high-performance separation coupled with HRAM MS. The patent cliff facing biotherapeutics opens opportunities for biosimilars. This, combined with the ability to address stringent regulatory requirements, is a key growth driver for LC–MS technology. Whereas most biopharmaceutical QC laboratories are currently using LC–ultraviolet (UV)-only methods next to a whole variety of other technologies, there is strong interest in applying LC–MS to monitor multiple critical quality attributes within a single assay, thus increasing analysis speed and confidence, while reducing the number of required assays. Also, demand for targeted quantitative assays in DMPK is growing (both regulated and nonregulated), requiring robust, reliable, sensitive, reproducible workflows addressing all regulatory requirements.
LCGC: What obstacles stand in the way of LC or LC–MS development?
Hansjörg Majer: Instrument developments that improve micro-LC methods, for example, by decreasing dead volumes to make them more robust, will evolve this technology.
Software developments to improve “feature recognition” in multidimensional LC combined with multidimensional spectroscopy should be investigated further.
Mass spectrometry has become prevalent, and at times so good, that the LC part is regarded in some quarters as being unnecessary. However, the lower the concentration of the compounds of interest, the more complex the composition of the sample, and the more that is expected to be seen by MS, the greater the benefit of having a tool to de-complex the sample before it enters the mass spectrometer inlet. This is often neglected when new direct injection techniques for MS systems are discussed. But the struggle between sensitivity, speed, and the technological limitations that exist today makes LC a great way to ensure the MS system can handle the entire contents of the sample.
Ashley Sage: Smaller, faster, more sensitive, and higher performing instruments are always the holy grail of LC–MS development. However, improvements in sample analysis workflows to streamline and simplify processes are always being challenged. These include simpler and faster sample preparation routines, developments in chromatography systems, particularly the implementation of easier-to-use micro-flow LC systems for efficient and sensitive methods, newer phases for improved separation power, and also developments in MS technology in terms of sample introduction techniques, such as ion sources, and MS scan functions, such as mass analyzers. You should also include software developments here too. As analytical scientists generate more data, better software routines are required to process and understand what the data is saying.
Atsuhiko Toyama: The main obstacle is the long-unsolved trade-off between good LC separation and LC–MS sensitivity. Researchers are forced to choose between nanoflow or conventional LC–MS, where nanoflow can achieve 100 times higher sensitivity than conventional flow but in turn sacrifices robustness, flexibility, throughput, and resolution of chromatographic separation. Achieving the same level of analyte ionization in the presence of a high volume of mobile phase, or in a miniaturized LC system, would be groundbreaking and would dramatically expand the application of LC–MS in all fields. Unravelling the enigma of electrospray ionization might be the key to a major breakthrough.
Kai Scheffler: As the trend towards increased speed and sensitivity continues for both LC and MS, instrument developments have to balance this with system robustness and reliability, ensuring high overall quality of data as well as matching the acquired MS data points to the narrow UHPLC peak widths.
Bringing LC–MS to routine markets and QC requires a holistic solution that is easy to use, offering intuitive and compliant software for all analytical requirements, as well as instrumentation with smaller footprints as a result of limited space in these high-throughput environments. LC–MS is still perceived as a challenging and complex technology requiring high levels of expertise. Lastly, for analysis of intact proteins, MS instrument platforms have significantly improved; however, more column chemistries providing the separation capabilities required for complex protein mixtures would benefit analysts.
LCGC: What was the biggest accomplishment or news in 2017/2018 for LC or LC–MS?
Hansjörg Majer: The commercialization of LC–MS/MS systems as analyzers, dedicated to automating a complete workflow, has ramifications in the industry. While these new analyzers can make routine jobs easier, they lessen the skills of LC–MS/MS practitioners at a time when more knowledge and understanding is necessary.
Ashley Sage: A very good question! From my perspective, it is the increased use of LC–MS for all the applications discussed, from ensuring the safety of our food during crises such as the contamination of eggs with fipronil through to the characterization of new biotherapeutic proteins to help treat diseases. I believe that LC–MS will continue to be a routine front-line analytical technique, and new applications for its use are yet to come. Maybe we might even see some new MS developments over the coming years to advance the technology even further. As a science, we have even put mass spectrometers in space, so who knows what may come next!
Atsuhiko Toyama: Micro-pillar array columns can be seen as one exciting technology that has been delivered to market in the past year. I see this as an interfacing technology between conventional nanoflow HPLC and next-generation microfluidics, though the real impact still needs to be demonstrated in more applications and pioneering research areas. Nowadays, the speed improvement of LC–MS/MS allows the number of MRM transitions acquired to be increased and their responses to be converted into a pseudo-MS/MS spectrum, which provides higher confidence in compound identification than measuring a full MS/MS spectrum.
Kai Scheffler: The market introduction of different MS instruments with advances for triple quadrupoles as well as high resolution-based platforms provided increased sensitivity, scan speeds up to 40 Hz, resolving power up to 1,000,000 FWHM, and ultraviolet photodissociation (UVPD), a novel fragmentation technique previously only available as a customized instrument modification. Together these provide a variety of tools for significantly improved qualitative and quantitative analysis of different molecules, in particular of proteins at the intact and peptide level in complex biological samples, showcased by the record of more than 1200 unique peptides identified per gradient minute (1).
References
Hansjörg Majer is the European Business Development Manager at Restek Corporation.
Ashley Sage is a Senior Manager, Applied Markets Development, EMEAI at Sciex.
Atsuhiko Toyama is a Manager, Marketing Innovation Centre at Shimadzu Corporation.
Kai Scheffler is a Product Manager, Chromatography and Mass Spectrometry Division at Thermo Fisher Scientific.
Sample Preparation
LCGC: What trends do you see emerging in sample preparation?
Paul H. Roberts: Simplifying workflows by combining or eliminating steps and reducing sample preparation time, alongside improved automation and throughput, are major trends. A reduction in sample volumes, with a consequent reduction in solvent use and evaporation time, is also a trend. Matrix scavenging techniques instead of analyte-targeted sample preparation is a growing area.
Alicia Douglas Stell: Some of the fastest growing trends in sample preparation include the need for faster and easier-to-use systems that yield repeatable results without a technician dedicating significant time to their setup and use. As analysis techniques become more sophisticated and can report lower detection limits, rapid and consistent sample preparation becomes essential for accurate analyses.
Peter Dawes: The need for automation of sample preparation processes is obvious and is often stated as necessary, but broad adoption has been very slow: automation is seen as expensive, requires skilled operators to implement, and most automation platforms are intended for high-throughput applications. The broader need is for systems that are flexible and easy to set up for the many different jobs in a laboratory's day. Typically, sample runs are between 10 and 100 samples. Systems designed with this in mind are now becoming available.
The drive for development of automated sample preparation systems is extreme ease-of-use with a reduction in cost for sample preparation and a greater reliability in the results.
Oliver Lerch: The trend towards miniaturization and automation of sample preparation that we have seen recently will continue. This means that less sample and less solvent is required for the analysis and less waste is produced. Virtually all core sample preparation techniques are influenced by this trend. As more companies develop innovative sample preparation equipment, more users will start adopting this equipment for their daily laboratory routine.
Nonselective sample preparation techniques are also needed for the emerging trend of nontargeted analysis, but selective workflows are necessary for target compound analysis. The trend towards multitarget methods with more than 100 analytes will continue.
The rising interest in extraction of polar compounds from water matrices, as well as nonpolar compounds from fatty matrices, poses a challenge in sample preparation.
Matt Brusius: While the technologies that feed into sample preparation have generally remained unchanged, certain applications and types of equipment are becoming more popular, which has ultimately changed the overall landscape of “sample preparation”.
Sample preparation of large molecules has increased, and the need for a more streamlined workflow at the commercial level is helping to drive innovation in this space as scientists diversify beyond small molecules. In addition, further adoption of automated sample processing platforms (robotics), along with devices like positive pressure manifolds, has provided a better, more effective, and reliable way to process samples, providing a quicker route to the instrument and downstream analysis.
Danielle Mackowsky: In the past year, sample preparation techniques traditionally used in a specific industry have begun to have broader appeal across multiple disciplines. For example, QuEChERS was developed in 2003 for pesticide residue testing in food-based matrices. With the emergence of the cannabis industry worldwide, QuEChERS is now being introduced to a new subsection of scientists. In addition, this technology has expanded into use on samples that are not of an agricultural realm. Numerous post-mortem forensic toxicology laboratories are starting to incorporate this methodology for the universal extraction of drugs of abuse from a variety of sample types.
LCGC: In your opinion, what is the future of sample preparation?
Paul H. Roberts: Sample preparation often follows analytical instrumentation developments. Although liquid chromatography–mass spectrometry (LC–MS) systems are becoming more sensitive, and detection limits may be met by using simple "dilute and shoot"-type sample preparation, this can also lead to problems with the analytical system as matrix components accumulate over time. So, sample preparation needs to effectively clean up samples, but be balanced with simplicity and ease of workflow.
Alicia Douglas Stell: The future of sample preparation frees technicians up from manual and time-consuming setup and extraction techniques and yields repeatable results in a fraction of the time when compared to existing techniques.
Peter Dawes: The future of sample preparation must first be to enable a move away from the large amount of manual, repetitive liquid handling. The liquid handling aspect of sample preparation is the easiest to automate and enables the introduction of a much greater degree of precision and reliability. However, too often we are asked to simply automate legacy sample preparation methods, which can be done, but it is grossly inefficient. The future lies with a whole raft of new tools and techniques designed for efficient automation, providing faster and more reliable sample preparation than the current manual processes. These include, but are not limited to, special solid-phase extraction (SPE) cartridges, enzymatic reactors, affinity chromatography, derivatization, filtering, extreme accuracy of liquid handling, and better controlled reaction kinetics.
Oliver Lerch: Sample preparation will have its place in the world of chromatographic mass spectrometric analysis. The key point is to find the right balance between the effort spent on sample preparation and the robustness of the analysis method. In this sense, the “just enough” approach will replace the “dilute and shoot” approach. This will lead to many simple sample preparation techniques being implemented, including dilution, centrifugation, filtration, protein precipitation, and various combinations of these techniques.
Comprehensively optimized and sophisticated sample preparation workflows combined with the most sensitive gas chromatography (GC)–MS and LC–MS equipment will be needed to meet the requirements of certain legislation, for example, for the analysis of baby food or in the context of the EU water framework directive.
The acceptance of automated sample preparation will increase both in dedicated systems that integrate sample preparation and GC–MS or LC–MS analysis, and in off-line workstations on the laboratory bench that prepare samples for multiple instrument types.
Matt Brusius: This depends on the application. In general, I subscribe to the notion that the future of sample preparation looks like a "quick-and-dirty" clean-up directly in front of a very powerful high resolution mass spectrometer. It is also possible that things like MS prefilters, that is, ion mobility, could eventually eliminate the need for sample preparation and chromatography entirely. For a more conservative prediction, as instruments become more powerful, I think that more matrix-specific filters and streamlined SPE procedures will be implemented to reduce the total amount of time spent performing sample preparation.
Danielle Mackowsky: The future of this discipline will allow scientists to use minimal sample volume to achieve their analysis goals. As instrument detectors become more sensitive and less cost prohibitive, less sample is required for each extraction. In pain management laboratories, it is not uncommon to use as little as 100 µL for each patient sample and this could possibly become the norm across multiple disciplines. Reducing sample volume also cuts down the overall amount of organic solvent needed, making sample preparation a greener and more cost-effective choice for laboratories as a whole. To keep up with this trend, sample preparation consumables will be adjusted accordingly to accommodate smaller sample volumes.
LCGC: What one recent development in the area of sample preparation would you say is the most important?
Paul H. Roberts: Simple, user-friendly automation that is tailored specifically for techniques, such as SPE or supported-liquid extraction (SLE), rather than "bolted on" to liquid handling devices.
Alicia Douglas Stell: Rapid automated sample preparation is the most significant development realized in more than 15 years. With traditional sample preparation taking 62% of the time spent on typical chromatographic analysis, sample preparation is the bottleneck to rapid LC and GC analysis. Systems that can cut typical sample preparation times down from over 30 min for each sample to 5 min or less are well positioned to eliminate the bottleneck.
Peter Dawes: Apart from the rapidly growing range of automatable processes in sample preparation, the increasing availability of ion mobility mass spectrometry systems is important in my opinion. It is understood that there are no universal truths in analytical chemistry that cover all sample types, but there are many situations where ion mobility mass spectrometry indirectly streamlines the requirements of sample preparation workflows where difficult matrices are involved. Along the same lines as the huge power that can be achieved through orthogonal multidimensional chromatography separations, the application of the same principles of orthogonal mechanisms to sample preparation or detection can reduce the need for high performance and critical-to-operate analytical separation systems.
Oliver Lerch: This is difficult to say; in my view no single groundbreaking new development has been made recently. The QuEChERS method in its multiple variations for all kinds of matrices is definitely one of the main achievements over the last decade.
Matt Brusius: I would say that generally it has been more about the increase in instrument capability than one vital breakthrough in sample preparation. Most of the commercially available developments have been incremental and are mostly still based on technology that is 10 to 15 years old. For example, polymeric-based SPE media has undergone some improvements in that some products provide additional application-based benefits, such as in-well hydrolysis, or phases that are truly water wettable, which improves overall processing time. Ultimately, I think the most interesting developments are the ones that combine some type of clean-up device directly with ambient ionization mass spectrometry to bypass chromatography entirely.
Danielle Mackowsky: In the United States, opioid addiction has claimed the lives of millions of Americans. Forensic toxicologists need to stay on top of what analogues are being abused in their region in an accurate and rapid way. Development of fast and efficient extraction methodologies has allowed them to distinguish between the emerging compounds and more traditional drugs of abuse. In the environmental sector, per- and polyfluoroalkyl substances (PFASs) are a diverse group of synthetic organofluorine compounds that have been widely used in industrial applications and consumer products. PFASs are bioaccumulative in wildlife and humans and can now be tested for accurately in drinking water and other available matrices thanks to sample preparation technologies.
LCGC: What obstacles do you think stand in the way of sample preparation development?
Paul H. Roberts: The focus on analytical instrumentation developments can lead to loss of expertise and innovation within the “chemistry” of sample preparation.
Alicia Douglas Stell: The greatest obstacle that stands in the way of dramatic sample preparation development is fear of change. In order to make a dramatic step forward, sample preparation will have to abandon the "tried and true" methods of the past 50 years and design, from the ground up, technologies that meet the demands of today's busy laboratories, including speed, simplicity, and reproducibility in an effective platform with a small footprint.
Peter Dawes: As with the introduction of any new technology, there is the burden of personal and financial investment in existing methods that holds back the adoption of better and more robust methods. For example, the new automated sample preparation techniques work better with smaller sample sizes, achieving better limits of detection and limits of quantitation, but so many regulated analytical methods specify that large volumes be analyzed. It is obviously important to ensure a sample being analyzed is representative, but automation does not lend itself well to using a litre of sample when a few millilitres is more than adequate, which we recently had to deal with for fire retardant analysis (PFAS) in water.
Oliver Lerch: From the users' perspective, sample preparation method development is increasingly a challenge. Laboratory resources are limited and qualified laboratory personnel are increasingly hard to find. This means that we may see method development being shifted from the end user to suppliers of analysis equipment in the future.
Manufacturers of sample preparation tools and systems have invented several new techniques over the past years, but customer acceptance is lower than expected and this may slow down further innovations. Many users take a lot of convincing and stick to tried and trusted products and concepts. Generally, sample preparation only gets limited attention in analytical chemistry research and this may also be a factor that slows down new development.
Matt Brusius: The cost for each sample is always something that limits the new types of sample preparation technology. I think to justify significant up-front cost and time in sample preparation the technology must provide some type of intangible benefit, such as peace of mind that your system will not fail overnight in the middle of a run. I think it is the other components of the workflow, such as the column and the MS system, that drive sample preparation development, and not the other way around.
Danielle Mackowsky: Laboratories are constantly under pressure to reduce their turnaround times. Because of this, the development of universal sample preparation materials and extraction methodologies is critical. It can be challenging from a product and method development standpoint to create an all-encompassing sample preparation solution without compromising any results from a specific analyte class. In short, there is a definite struggle at times to find a "one-size-fits-all" solution that is amenable to a variety of compounds and settings.
LCGC: What was the biggest accomplishment or news in 2017/2018 for sample preparation?
Paul H. Roberts: Simplified workflows and matrix scavenging techniques for urine samples.
Alicia Douglas Stell: The biggest accomplishment for sample preparation in 2017/2018 was the introduction of a new technology for rapid automated solvent extraction of samples for analysis by GC and LC that included built-in methods optimized for a wide variety of sample types. This takes the guesswork and much of the method development time out of the sample preparation equation and produces repeatable results in a “hands-off” automated fashion.
Peter Dawes: I have been very impressed with the clinical analyzers that are now on the market (with more coming) that automate the entire workflow from sample preparation to LC and MS detection. It will be even more impressive when the LC part of the process, which after all in many analyses is just sample clean-up for the mass spectrometer, can be eliminated. This will be possible as a result of better sample preparation processes, such as small particles (less than 3 µm instead of 50 µm) and high-resolution SPE cartridges that make the kinetics of the process less critical and more effectively remove matrix and fractionate the sample. If we can already do it reliably with difficult matrices like whole blood and urine for targeted analysis directly into a mass spectrometer, it should be much simpler for targeted environmental analysis.
Oliver Lerch: As I mentioned previously, there are some obstacles to new approaches being adopted. The implementation of nanomaterials with unique physical and chemical properties will definitely boost sample preparation techniques going forwards. Applications of nanomaterials for dispersive µ-SPE in the form of (magnetic) molecularly imprinted polymers, magnetic beads, and coatings were reported in 2017 and 2018.
Matt Brusius: I don’t know if you can necessarily pinpoint one single accomplishment, but overall, sample preparation is always changing and improving. Product advances are making it even easier to get cleaner samples in less time. Coupled with the other advances in LC-, GC-, and MS instruments, this has been an exciting year for sample preparation and many advances have been developed to bridge the gap into full workflow solutions. I think the immediate future holds further incremental improvements while the instrumentation platform will most likely drive the true innovation.
Danielle Mackowsky: Sample preparation purists are constantly up against the enemy that is “dilute and shoot”. Within the past year, the tide has begun to turn away from this methodology and back to sample preparation solutions because the required detection limits are becoming lower and lower. Constant replacement of guard and high performance liquid chromatography (HPLC) columns can quickly become a large financial setback for laboratories that have multiple LC–MS/MS instruments. In addition, the time it can take to clean a source that has been exposed to diluted matrices, such as urine or even plant material, can be detrimental to laboratories up against tight deadlines. Investing in sample preparation on the front end of your sample workflow can pay off dramatically downstream, and many scientists are now starting to re-embrace this mentality.
Paul H. Roberts is the Global Product Manager, Analytical Consumables & Systems at Biotage.
Alicia Douglas Stell is a Senior Scientist, Molecular Sample Preparation Division, at CEM Corporation.
Peter Dawes is the President of Eprep Pty Ltd.
Oliver Lerch is a Senior Application Scientist at Gerstel GmbH & Co. KG.
Matt Brusius is a Product Manager, Sample Preparation at Phenomenex.
Danielle Mackowsky is a Forensic Technical Specialist at UCT.
Data Handling
LCGC: What is currently the biggest problem in data management for chromatographers?
Andrew Anderson: While the extent of this problem depends on the discrete responsibilities of each chromatographer, we would posit that the "oppression of transcription" between different foundational systems presents the greatest challenge. We define this as the effort of having to use different IT systems (and consequently, transcribe information between them) to progress through a set of chromatographer tasks.
Consider the new paradigm of quality-by-design (QbD) for chromatographic method development. Different data handling systems are used at different stages of this process. For example, compositional data (chemical information), definition of the statistical design of experiments (DoE), experiment execution, and project reporting all require different pieces of software. Significant human effort is spent transcribing information between these systems.
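As an illustration of the DoE stage referred to above, here is a minimal sketch of a full-factorial design over chromatographic method parameters. The factor names and levels are invented, and a real QbD study might use fractional-factorial or response-surface designs generated in dedicated software.

```python
# Minimal sketch of defining a full-factorial design of experiments (DoE) for
# chromatographic method development. Factors and levels are invented; a real
# QbD study would typically use dedicated DoE software and may use fractional
# or response-surface designs instead.
from itertools import product

factors = {
    "gradient_time_min": [10, 20, 30],
    "column_temp_C": [30, 40],
    "mobile_phase_pH": [2.5, 4.0, 6.5],
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(f"{len(runs)} experiments in the full-factorial design")
print(runs[0])   # first run, e.g. 10 min gradient, 30 C, pH 2.5
```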
The core mission of a chromatography data system (CDS), the informatics system most widely used by chromatographers, is the use of methods for instrument control and data acquisition. Rather than attempting to extend CDS capabilities beyond those essentials, it is logical that their interfacing capabilities be sufficient to connect with other informatics systems that consolidate and assemble data appropriately from multiple CDS systems and experimental studies.
We believe that transcription and documentation time exceeds the time chromatographers are able to spend doing what they trained for, and applying their expertise in performing design, experimentation, and analysis.
John Sadler: Chromatographers generate a large volume of data and vendors have responded with tools that improve the ability to find, and share, the appropriate results required for their job. Today, we see data review as a bigger challenge. Historically, chromatographers have reviewed every peak in a chromatogram or every compound in a target list. Innovative data analysis tools, specifically designed to present chromatographic data in a format optimized for visualization by the human eye, allow rapid detection of anomalies to enable the chromatographer to review by exception and dramatically improve the speed of data review.
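A minimal sketch of the review-by-exception idea follows: only results falling outside their acceptance limits are surfaced for manual review, rather than every compound being inspected. The compound names, results, and limits are hypothetical.

```python
# Minimal sketch of review by exception: instead of inspecting every compound,
# only results outside their acceptance limits are surfaced for review.
# Compound names, results, and limits are hypothetical.

results = {"caffeine": 101.3, "impurity_A": 0.42, "impurity_B": 0.07}
limits = {"caffeine": (95.0, 105.0), "impurity_A": (0.0, 0.20), "impurity_B": (0.0, 0.20)}

exceptions = {
    name: value
    for name, value in results.items()
    if not (limits[name][0] <= value <= limits[name][1])
}
print("flag for manual review:", exceptions)   # -> {'impurity_A': 0.42}
```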
Heather Longden: The biggest problem in data management is dealing with chromatographic data in multiple proprietary formats. Companies and regulators are looking for a way to compare data across multiple analytical techniques, but are struggling even to find a common format for a single technique such as chromatography. Current solutions that rely on printed reports, or on data exported into some common, generic, human-readable format, suffer from incompleteness (missing data, missing versions of data, methods, and audit trails) or from a lack of security around the file format (how do you know that the exported data has not been altered, particularly if it is human readable?). The only current solution is to print or export the data and then verify it once it lands in its final location, making sure that the values match the original and nothing was lost in the conversion.
LCGC: What is the future of data handling solutions for chromatographers?
Andrew Anderson: We anticipate that increased standardization in method ontologies, IT system interoperability, and integrated decision support interfaces will provide chromatographers with much-needed productivity enhancements.
John Sadler: Further automation of the analytical tasks. Automated data review and analysis will continue to reduce the need for manual data interaction. The systems will continue to improve their ability to recognize peaks and patterns. The interfaces will alert the analyst to review a specific compound and provide guidance when necessary. Ultimately, this will not only reduce the time to results, but will also improve confidence that test results are accurate.
Heather Longden: The future of data handling is making the data review process complete, guided, and documented to prevent errors and omissions. Review processes should be streamlined to increase efficiency and focus on the major areas of concern. Leveraging “across laboratory” analytics to understand the overall quality of the data generation cycle should be designed into this review process.
LCGC: What one recent development in Big Data is most important for chromatographers from a practical perspective?
Andrew Anderson: We anticipate that integrating Big Data repositories with machine learning or deep learning systems will give chromatographers insights into retention or separation phenomena. We believe that this may afford a reduction in the number of physical experiments required for design space mapping and, correspondingly, for method robustness validation experiments.
John Sadler: I believe there are existing aspects of data analytics that are very powerful, but not commonly used today. Pattern recognition and peak deconvolution are two that come to mind. The use of data fusion may enable deeper insight from the combination of multiple chromatographic techniques, with mass spectrometry and spectroscopy. However, the chromatography industry has been slow to adopt new data science-based solutions.
Heather Longden: The biggest recent development is the focus on results trending across large data sets, especially as it applies to continually monitoring not only "out of specification" but also "out of trend" results. This focus requires re-examining tools like control charting and other metrics-gathering approaches. As mentioned above, in order to gain meaningful metrics, the data needs to reside in a single application or location. Solutions that are cloud deployable allow data from multiple chromatographic laboratories to be managed in a single location, whether inside one regulated company or across company borders. In a world where the global supply chain is increasingly fragmented, gathering the data from contract research organizations (CROs), contract manufacturing organizations (CMOs), and contract testing organizations (CTOs) into one data pool is essential before trends can be observed.
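As an illustration of the control-charting approach to spotting out-of-trend results, the sketch below computes Shewhart-style limits (mean plus or minus three standard deviations) from pooled historical results and flags new values that drift outside them. The assay values are invented.

```python
# Minimal sketch of the control-charting idea described above: pool results
# from many runs, compute Shewhart-style limits (mean +/- 3 sigma), and flag
# points that drift out of trend even if they are still within specification.
# The assay values below are invented.
from statistics import mean, stdev

history = [99.8, 100.2, 100.1, 99.9, 100.3, 99.7, 100.0, 100.4, 99.6, 100.1]
centre, sigma = mean(history), stdev(history)
lcl, ucl = centre - 3 * sigma, centre + 3 * sigma

new_results = [100.2, 101.4, 99.9]
for value in new_results:
    status = "out of trend" if not (lcl <= value <= ucl) else "in control"
    print(f"{value:6.1f}  ({status}; limits {lcl:.2f}-{ucl:.2f})")
```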
LCGC: What obstacles do you think stand in the way of chromatographers adopting new data solutions?
Andrew Anderson: Across a variety of industries, chromatographic methods serve a fundamental purpose for ensuring product quality. With this purpose in mind, we must recognize that overall quality assurance and regulatory compliance comes with a significant documentation and validation effort. While new technological advances will create productivity and innovation opportunities, we must be mindful to also provide documentation and validation capabilities to ensure efficient implementation. The main obstacle to the adoption of these new data solutions that will reduce data transcription and reporting efforts for separation scientists, while providing the scientific tools essential to method development, will likely be ease of integration into the current informatics environment. A typical separations laboratory includes a variety of instruments from different vendors with disparate software on top of all the other informatics systems that support R&D. Integration of these systems for a seamless workflow will be a challenge many organizations will need to overcome.
John Sadler: Unlike spectroscopists, who have embraced mathematical data transformation for decades, chromatographers have been reluctant to broadly adopt these techniques. To unlock the power of new technology, chromatographers are going to need to change their mindset and embrace these advances.
Heather Longden: In regulated companies, the challenge of any change to registered methodologies is one significant obstacle to adopting new data solutions. Not only does this often require validation overhead, but it also necessitates completely reworking standard operating procedures (SOPs) and retraining both users and reviewers in the new process. If software applications already in use can be adopted for wider, more globally harmonized deployment (potentially even in business partner environments), then validation, SOP, and training burdens can be minimized.
Andrew Anderson is the Vice President of Innovation and Informatics Strategy at ACD/Labs.
John Sadler is the VP/GM, Software & Informatics Division at Agilent Technologies.
Heather Longden is the Senior Marketing Manager, Informatics and Regulatory Compliance, at Waters Corporation.