A snapshot of the key trends and developments in the separation science sector according to selected panelists from chromatography technology companies.
LCGC International spoke with Tony Edge, R&D Leader at Avantor; Stephane Moreau, Manager, LC–MS & Life Sciences at Shimadzu Europa GmbH; Carsten Paul, Senior Staff Product Manager of LC Systems at Thermo Fisher Scientific; and Fraser McLeod, Vice President of QA/QC and Wyatt Technology at Waters Corporation.
What trends do you see emerging in LC or LC–MS?
Edge: We are seeing a growing use of the technology in fields that previously would not have used LC–MS. More and more, LC–MS is seen as an underpinning technology that allows for the determination of the molecular constituents of a sample at very low levels. This presents many opportunities but also many challenges as more non-specialists become LC–MS users. It also means that the technology will start to move into the mainstream, with the wider general population starting to come into contact with sampling devices that are shipped to LC–MS laboratories, and, who knows, one day we may even have LC–MS instruments that we can hold in our hands.
Moreau: After a decade in which instrument improvements have drastically expanded what is possible on the hardware side—such as ultrahigh-pressure liquid chromatography (UHPLC) for fast chromatography, analytical intelligence to automate start-up and shut-down, fast polarity switching, fast scanning, and fast multiple reaction monitoring (MRM) acquisition on the MS side—the quantity of data generated in a shorter time is now putting pressure on analysts and software. The use of artificial intelligence (AI) for automatic integration and data processing is starting to reduce the burden on people for large batch analyses as well as reducing variability in data processing. An additional benefit of such tools is the ability to exploit this volume of data to anticipate deviations and future breakdowns in systems. The other part is the automation of sample preparation. In the post-Covid era, laboratories are struggling to hire staff, and automation is the way to compensate for the shortage.
Paul: Overall, it seems that the race among vendors to increase the performance of the UHPLC system, often measured in terms of column efficiency or maximum available system pressure, is over. HPLC is now only one of the many tools available in a laboratory, which emphasizes the need to provide a logical and consistent user experience and reduces the need for in-depth expertise in every piece of laboratory equipment. Examples include providing better instrument diagnostics or "at-instrument" guidance for common procedures such as exchanging wear parts. Lastly, automation of full workflows from sample preparation through data analysis has become a reality and offers benefits in terms of freeing up valuable analyst time and increasing result reproducibility.
McLeod: What we are seeing overall in the pharmaceutical industry is organizations trying to use information more efficiently across their pipeline, whether it be from a method life cycle perspective, instrument utilization, or cross-correlating data for molecule attributes. The overall theme is driving better efficiency to reduce costs. In quality control (QC), we are seeing an increased focus on tracking, trending, and eliminating causes for method failure.
In terms of modalities, new, more challenging molecules are in discovery and development, many of which are metal-sensitive. Without the use of instruments and column chemistries that have specific bio-inert surfaces, scientists have to contend with the frustration of having to passivate the equipment, redo analytical runs, or worse, miss peaks and signals that otherwise would be detectable with the right equipment.
What is the LC or LC–MS application area that you see growing the fastest?
Edge: The area that has seen the greatest growth in recent years is in situ measurement, although it is not the biggest area of use. This can be applied to a range of market sectors, including production lines, surgeries, environmental monitoring, homeland security, and even space exploration. Making mass spectrometers smaller and simpler to use opens up whole new markets, and I see this as the most exciting area of development. It will need appropriate extraction and chromatography systems to support the application of the LC–MS instrumentation, but advances in high-throughput LC and improvements in more robust solid-phase extraction (SPE) suggest that more applications will be done in this growing area.
Moreau: Currently, the penetration of mass spectrometry, such as matrix-assisted laser desorption ionization (MALDI) and LC–MS, in the field of clinical analysis is at a turning point with the implementation of the new In Vitro Diagnostic (IVD) Regulation. The systems are becoming more and more robust, and the number of applications is increasing in therapeutic drug monitoring, oncology, toxicology, and microbiology. The benefits of the data given by LC–MS (selectivity, multiplexing, and precision) for doctors' decisions are recognized. However, this also requires LC–MS suppliers to simplify workflows with automation in sample preparation, laboratory automation system (LAS) connections, and standardization with ready-to-use clinical reagent kits. In the post-Covid era, laboratories are struggling to hire staff, and automation is the way to compensate for the lack of people and ensure continuity.
Paul: The landscape of chemical medicines and biologic entities used for pharmaceutical development is constantly evolving. Over the last decade, we saw strong uptake in the development of protein-based therapeutics. Today, we see much more complex developments of biotherapeutics, such as antibody–drug conjugates (ADCs), bi- or tri-specific antibodies, or even complex mixtures of antibodies. In the last few years, we have also seen strong momentum for the development and commercialization of nucleotide-based therapeutics, including oligonucleotides and messenger ribonucleic acid (mRNA). Other nucleotide-based modalities, such as oligonucleotide conjugates or small interfering RNA, have also seen strong uptake, signaling a future of therapies heavily leveraging biotherapeutics. The complexity of these types of medicines requires more sophisticated analytical tools for characterization and quality control, including increasing use of MS.
McLeod: From a modality perspective, there has been a shift to cell and gene therapy in the Americas and European Union (EU), and also to ADCs, particularly in China. Customers are also very much engaged in oligonucleotides and CRISPR-Cas9 as the next "big" thing.
What obstacles stand in the way of LC or LC–MS development?
Edge: The obstacle that I foresee in this area is ensuring that the technology does not become too accessible. As a greater number of users engage with the technology, there is a tendency for manufacturers to dumb down the operation of the instrumentation, and this will ultimately lead to scientists reporting the data generated by the mass spectrometer without necessarily being aware of whether the data make sense. The technology is not yet robust enough to be widely spread amongst the non-analytical community without a degree of education.
Moreau: With current trends towards further implementation of MS to accelerate and refine clinical decisions, several points must be improved without compromising existing benefits. For LC–MS, one of them is the variability from system to system and the perceived complexity of configurations. Compared with immunoassay analysers, which look like "black boxes," LC and LC–MS still look complex despite the big improvements realized. The software workflow, the laboratory information system (LIS) connections, and the graphical user interface (GUI) will also help adoption of the technology and better integration in each laboratory. LC–MS can not only contribute to improving diagnostics for patients, but it also helps laboratories to improve their workflow and return on investment by multiplexing analyses, reducing plastic waste, and reducing energy consumption.
Paul: With a weaker economic outlook in 2024, it is more important than ever to ensure any new instrument technologies have the user in mind. Without this focus, customers will simply not be interested in stretching their budgets, given the greater scrutiny required to justify the investment.
New technologies, such as AI applications, might also be entering the field of HPLC, for instance for data analysis or user support, even though acceptance by users and regulatory agencies, particularly in regulated environments, is not fully defined.
Another aspect is the need for better data sharing and integration in a highly connected world between sites, platforms, and software packages. Here, industry standards are still lacking.
McLeod: Although the toolbox is growing, the molecules are more complex than traditional therapies, and to some extent, customers are still defining which attributes are important in terms of analysis. The challenges include developing the required techniques for separations and analysis quickly enough to help our customers. Supporting our solutions with the informatics tools to analyze data efficiently and effectively is always a priority.
What was the biggest accomplishment or news in 2023–2024 for LC or LC–MS?
Edge: There have been no truly groundbreaking developments in the field of LC or LC–MS. However, small steps have been made to move the technology forward, particularly in the resolution and applicability of ion mobility to LC–MS. In the field of chromatography, the development of lithography-based columns that do not use spherical particles has improved the batch-to-batch robustness of columns, and it has also given theoreticians the opportunity to develop new designs of flow paths through a column.
Moreau: Lipidomics is a very important area of research with crucial implications in many areas, such as health and diet, to name a few. But analyzing lipids remains a very complex topic because of the structure of these compounds. Recently, new technologies for the fragmentation of ions have appeared in MS, which considerably simplify the interpretation of spectra and allow much easier differentiation between different structures. This will certainly give a boost to lipidomics research, with translation into future diagnostics.
Paul: Low-flow HPLC, spanning from nano to capillary flow rates, historically required greater user expertise to operate such delicate systems. This has dramatically changed, and now any HPLC user can run such an instrument and potentially benefit from the advantages of low-flow HPLC, such as higher MS sensitivity and low solvent and sample consumption. In combination with the newest developments in high-resolution accurate-mass analyzers, higher throughput, higher sensitivity, and deeper coverage are achievable in proteomics experiments.
Finally, it seems that USP <621> now emphasizes the modernization of methods more than before, leading to higher throughput, higher sensitivity, and reduced environmental impact.
McLeod: We are seeing that the value proposition of error reduction and ease of use is resonating with our customers, and in 2024 we expect a broader range of workflows focusing on this value proposition.
LCGC International spoke with Jim Gearing, Associate Vice President of Marketing, Agilent Gas Phase Separations Division; Massimo Santoro, Group Business Development Director at Markes International; Ed Connor, GC Product Manager at Peak Scientific; and Bruce Richter, Vice President of Research & Development at Restek Corporation.
What trends do you see emerging in GC or GC–MS?
Gearing: Over the years there have been helium shortages that have caused concern for GC and GC–MS testing. Many laboratories look to both conserve their use of helium and move to an alternate carrier gas to save money and avoid future problems. For helium conservation, the trend is to switch to a different gas when the system is not in use. For example, GC and GC–MS systems can be programmed to switch to either H2 or N2 during a sleep mode when not in use. The same systems can be used to switch carrier gases while in operation. Nitrogen is a good carrier gas choice for GC systems when the chromatographic separation allows.
For GC–MS and when critical separations are required, hydrogen is the preferred alternate carrier gas choice. Laboratories are increasingly focused on safe operation with hydrogen. An integrated hydrogen sensor allows the GC to catch any leaks before they become a problem, shut the system down, and let the operator know what has happened and how to fix it.
Santoro: GC and GC–MS continue to be the techniques of choice when it comes to very complex samples, or when the user wants reliable, trustworthy data. Green chemistry, especially when it comes to sample preparation prior to GC and GC–MS analysis, faster analysis times, and reduced costs per sample are the key trends we observe most frequently when speaking to customers around the world.
Connor: We obviously pay close attention to trends of carrier gas usage, and in particular use of alternatives to helium (H2/N2). We have seen a continued growth of hydrogen carrier gas adoption thanks to the ongoing pressure faced by laboratories in either finding supply of helium, or in combatting spiralling costs of helium cylinders. These supply-cost pressures, along with significant advances in technologies and guidance by industry leaders to facilitate use of hydrogen and nitrogen carrier gas, have driven adoption, which looks set to continue for the foreseeable future.
Richter: The evolution and improvement of instrumentation is changing the way that many scientists are doing their work. Improved resolution, sensitivity, and scan speeds of mass spectrometers (MS) are having interesting impacts in many laboratories. For example, increased scan speeds can shorten analysis times. Improved resolution and sensitivity of the MS can mean less sample preparation is needed for some sample types. Multiple methods can be combined into single analyses to improve laboratory productivity. For example, polycyclic aromatic hydrocarbons (PAHs) and polychlorinated biphenyls (PCBs) have been traditionally analyzed using separate conditions and instrument configurations. Now, with the proper instrumentation and conditions, they can be analyzed using the same instrument and same method conditions, thus saving time and increasing productivity. Triple-quadrupole MS systems are being used now in areas where high-resolution sector MS systems dominated in the past.
In your opinion what is the future of GC or GC–MS?
Gearing: The future of GC and GC–MS falls into a couple of categories:
Connor: Despite GC and GC–MS now being well into middle age, the use of these techniques is intrinsically tied to so many aspects of the testing of commodities, air quality, medicines, and health that they are going to be around for quite a while longer. As with most other areas of our lives involving technology, the development of new instruments is likely to centre around miniaturization of hardware, and, for operation and analysis, an increased reliance on the internet of things (IoT) and artificial intelligence (AI) to reduce analysis time and improve the robustness of data.
Santoro: GC and GC–MS will remain the gold-standard reference techniques for anyone who is dealing with volatile and semi-volatile organic species. While they will be pushed to their limits, challenged with new contaminants at increasingly challenging low detection limits, faster analysis times, and reduced cost, I believe gas chromatography will always maintain its status in any modern analytical laboratory. If I were to predict, developments are more likely to happen on the software side of GC and GC–MS, and I believe they are necessary given the huge amount of data we already produce from every analysis and given the scarcity of operators in many modern laboratories. AI, machine learning, and higher degrees of automation will be key partners in the future development of GC and GC–MS.
What is the GC or GC–MS application area that you see growing the fastest?
Gearing: The energy market, specifically around alternative energy, is undergoing a renaissance. There is growth in H2 energy production, transport, and qualification testing, as well as in end-use application qualification and testing, such as transportation fuel cells and blending with natural gas for residential use.
There is also an expansion and evolution of the battery market. Although batteries are currently a challenging end market, vehicle use continues to grow, complemented by other end uses such as battery storage. Production, research, and recycling continue to expand, increasing analytical testing needs.
Interest in and use of sustainable aviation fuel (SAF) continues to increase. Commercial acceptance of biofuels only gets closer to reality with the recent trans-Atlantic flight using 100% SAF. Adoption of a new ASTM GC×GC method allows laboratories to test these fuels efficiently.
Lastly, a move to hydrogen as a carrier gas for GC–MS applications is of increasing interest. But this requires method redevelopment.
Santoro: I would say that environmental and food-related applications continue to be the most common application areas, where GC–MS is used to measure emerging contaminants. The list of emerging contaminants continues to grow, from PFAS to odorants and malodorants, and from microplastics to adulterants, and many more. One application area that is receiving a lot of attention, and is growing fast, is the application of GC–MS to clinical analysis. For example, hopefully in the not-too-distant future, we'll be able to see breath analysis as a routine screening technique for early disease detection. Many new projects exist in this area, so a breakthrough should be coming soon.
Connor: All the noise at the moment seems to centre around PFAS analysis. GC–MS is an important component in the PFAS analysis toolkit, especially as volatile and semi-volatile compounds constitute a significant proportion of these environmental pollutants. We expect to see a growing list of regulated PFAS compounds in the coming months and years, which will maintain the focus on their analysis.
Richter: The first area that comes to mind is PFAS. While liquid chromatography–mass spectrometry (LC–MS) is the method of choice for these compounds, GC–MS can be used for many of these compounds as well. It seems everywhere we read, we hear about new methods or findings dealing with these compounds. I believe these compounds will be areas of focus for analytical chemists for some time.
Secondly, microplastics come to mind as well. For these substances, there are two areas of concern. First, their presence in the environment is of concern from a health standpoint, especially for nanoparticles, because they have many routes into an organism and do not exit readily. In addition, I believe we need to better understand the extent to which persistent organic pollutants (POPs) are absorbed into microplastics. It stands to reason that if microplastics are present in soil or water, the POPs will tend to accumulate at higher concentrations in the plastic particles than in the water or soil. Then, if an organism is exposed to those particles, more severe health impacts are possible. We need to understand what plastics are present and at what levels, but we also need to understand the pollutants that are absorbed into the microplastics. GC and GC–MS will be important parts of gaining those understandings.
What was the biggest accomplishment or news in 2023–2024 for GC or GC–MS?
Gearing: The field of GC and GC–MS witnessed significant advances in the past year. These developments focus on enhancing system intelligence and operational efficiency. Two practical examples are:
Innovations like these contribute to a more intelligent and sustainable GC and GC–MS landscape, benefiting both operators and scientific research.
Santoro: Green chemistry, with solvent-less sample preparation facilitated by vacuum-assisted, high-capacity sorbent extraction; wider adoption of hydrogen as a carrier gas, reducing analysis time to increase laboratory productivity while at the same time saving money and conserving the limited helium resource; and, finally, higher acceptance and implementation of GC×GC, with or without MS detection, as a way of increasing the amount of information obtained from every sample. I think we will see GC×GC developing into a routine technique, just like one-dimensional GC (1D-GC), with newer, easier hardware and software solutions now available.
Connor: Improved compatibility of both MS and front-end applications for use with H2 carrier gas has alleviated a lot of pressure on laboratories, which have been able to save their precious helium supplies for applications where no alternative is viable.
Richter: I have been in analytical chemistry and chromatography development for many years, and it is still amazing that we continue to get incremental improvements in our instrumentation as time goes on: faster temperature programming rates that allow analyses to be done more quickly, and higher sensitivity detectors. The ability to quantify and identify compounds at lower and lower levels is impressive.
It seems that several vendors are providing possible ways to address the increased cost of helium, the most commonly used carrier gas in GC. Some have developed MS systems that can be used with hydrogen gas with good sensitivity. Others are implementing switching-valve systems that allow changeover of carrier gases to minimize the use of helium between runs or during downtime.
LCGC International spoke with Shawn Anderson, Associate Vice President of Digital Lab Innovations at Agilent Technologies; Marco Kleine, Head of the Informatics Department at Shimadzu Europa GmbH; Trish Meek, Senior Director, Connected Science, Waters Corporation, and Todor Petrov, Senior Director, QA/QC, Waters Corporation; and Crystal Welch, Product Marketing Manager at Thermo Fisher Scientific, about the latest trends in data handling.
What is currently the biggest problem in data management for chromatographers?
Kleine: One of the biggest problems in data management for chromatographers is the huge volume of data generated during analyses. Chromatography techniques, such as liquid chromatography (LC), high performance liquid chromatography (HPLC), and gas chromatography (GC), produce large amounts of data that need to be organized, stored, analyzed, and in some cases transported through a network. This can be a time-consuming and error-prone task. Additionally, the lack of standardized data formats and the compatibility issues between different chromatography data systems (CDSs) can make data management even more complicated.
Anderson: Thoughtful inclusion of the chromatography results in a larger data set, to allow for insights into purity and yield improvements. After all, separation and detection combined are only one step (often the last) in what is usually a process to produce a molecule. There are many other steps as well, and correlating the purity and yield results with other factors in this process can drive true innovation. As a prerequisite for this, many chromatographers are yearning for more widespread adoption of vendor-neutral standards for data. The FAIR (Findability, Accessibility, Interoperability, and Reusability) data principles are useful for guiding this journey (1).
Welch: The largest problem continues to be reducing the time and effort spent to manage data. It is still common that files are spread across multiple storage locations, with the effort to compile information together being manual and time-consuming. People want their chromatography systems to work like their phone software, with a more stable platform and easy-to-use applications, with all data secured into a central location so they can use, view, and download it onto their next tablet or cell phone without having to transfer it from device to device.
Petrov: If we look at the chromatography data as something that has a lifecycle, there are different challenges in the different phases the data goes through. For example, once a multitude of chromatograms are acquired and quantitative results are calculated, the first challenge an analyst faces is with screening the data to determine which data sets are in line with the expectations and which are outliers. In today’s technology, machine learning can be utilized for anomaly detection to make the data review process more efficient by focusing on the exceptions.
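As a rough, hypothetical illustration of the review-by-exception idea Petrov describes (not a feature of any particular CDS), the sketch below applies scikit-learn's IsolationForest to simple per-injection summary features; the feature names, simulated values, and contamination setting are all assumptions made for this example.

```python
# Minimal sketch of review-by-exception using anomaly detection.
# Assumes per-injection summary features exported from a CDS; the
# features and values below are simulated, not a vendor format.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Simulated batch: 100 injections described by peak area, retention
# time (min), and tailing factor. Real data would come from the CDS.
normal = rng.normal([1.0e6, 4.20, 1.05], [2.0e4, 0.02, 0.03], size=(98, 3))
outliers = np.array([[7.5e5, 4.20, 1.05],   # low recovery
                     [1.0e6, 4.55, 1.40]])  # retention drift plus tailing
features = np.vstack([normal, outliers])

# Fit an Isolation Forest and flag the injections it considers anomalous.
model = IsolationForest(contamination=0.02, random_state=0)
flags = model.fit_predict(features)  # -1 = anomaly, 1 = normal

for idx in np.where(flags == -1)[0]:
    area, rt, tailing = features[idx]
    print(f"Injection {idx}: review needed "
          f"(area={area:.2e}, RT={rt:.2f} min, tailing={tailing:.2f})")
```

A flagging step like this only routes suspect injections to an analyst for closer inspection; it does not replace the review itself.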
Once the data passes the first review gate, the next challenge may often be with data sharing for collaboration purposes. Companies have large networks of partners that generate chromatography data that the sponsors need to review as well. The growth of contract services demands efficient solutions for data sharing with minimum delays. In today’s technology, cloud-based solutions offer the best mechanisms to achieve that.
Once the chromatography data has been reviewed and has served its primary purpose, it needs to be made available for extracting analytical insights across other processes the sample in question has been subjected to. The data format standardization is the main challenge in this phase.
The data eventually gets archived, and while the amounts accumulated over time can be challenging to manage, a major challenge is the expectation that data sets can be resurrected at any time in the software application that originally produced them. This implies data format compatibility that goes back decades, or having to maintain dated application instances.
Meek: Throughout each of the steps in the lifecycle that Todor described, laboratories need to be able to share laboratory data and methods with their internal and external colleagues, show auditors that they are following regulatory guidance and Good Laboratory Practice (GLP), and use their data to make decisions about whether water is safe to drink or if a product can be released.
While organizations often rely on systems like electronic lab notebooks (ELNs) and laboratory information management systems (LIMS) to aggregate and share final results, such as peak concentrations and amounts, across the enterprise, these systems do not include all of the chromatography data, so results are often evaluated without the context of how that data was acquired. As we work with laboratories, their biggest challenge is getting the complete picture of their data.
What is the future of data handling solutions for chromatographers?
Anderson: We believe that we are seeing the limits of the current LIMS-oriented model, and we are likely to see an advancement in insight generation that is distinct and separate from the LIMS wheelhouse of sample management and test execution. There are numerous innovations around this that are becoming popular. One is data format standardization in a vendor-neutral way, likely based on ASM, the Allotrope Simplified Model. This provides a common input language for organizations to develop and maintain their own data lakes. Another is cloud/on-premises hybrid storage, which balances redundancy and backup security with low-latency, real-time access. This hybrid model can also allow for more powerful (and cheaper) data processing operations in the cloud while keeping control and stepwise analyses on premises and close to the instrument and end user.
Kleine: The future of data handling solutions for chromatographers is likely to involve advances in automation, cloud-based storage, data analytics and standardization.
In terms of automation, the increasing volume of data generated during a measurement means that automation will play a key role in data handling. AI-driven algorithms can automate data processing and analysis, reducing the amount of work and minimizing (human) errors.
Cloud-based technologies will enable chromatographers to store and access their data remotely from everywhere. Cloud-based solutions also enable data sharing and collaboration with other researchers.
Advanced data analytics techniques, such as machine learning and artificial intelligence, will help to extract more detailed information from chromatographic data.
Additionally, standardization will become important. Efforts have already been undertaken to establish standardized data formats and protocols for chromatographic data to ensure integration and compatibility between different instruments and software platforms.
Welch: Solutions in this space are looking to take the hard work out of data analysis and management—whether that is by enabling software to process data holistically and offer things like consensus reporting for multi-omics, by reducing manual processes with automation, or by leveraging new AI tools with the goal of getting closer to the truth.
Petrov: Many organizations are moving or have moved their IT infrastructure to the cloud, including data handling solutions like CDS. There are multiple reasons for the increasing interest in software as a service (SaaS) solutions for chromatography data. The primary reasons are to simplify the management of the applications and to make the data accessible to the organization. SaaS solutions provide benefits such as secure worldwide access, up-to-date application and infrastructure security, scalable IT infrastructure, economies of scale, competitive operational costs, and lower initial costs compared to non-subscription deployments on premise.
Meek: In addition to the infrastructure changes, techniques such as machine learning will become critical to data acquisition, processing, and analytics. There are many opportunities to improve on traditional data processing algorithms and support review by exception by deploying artificial intelligence.
What one recent development in “Big Data” is most important for chromatographers from a practical perspective?
Anderson: It is difficult to not answer “Generative AI” for this question. An obvious use case might be to train a model on chromatographic methods for categories of molecules and then ask the AI to generate ideal yet broadly applicable separation methods. Another area that is intriguing (but not as fashionable as AI) is using Big Data for real-time decision-making. One example is using chromatographic data from bioreactor sampling to trigger changes in media composition or temperature settings. Another example is setting limits for hardware metrics such as pump cycles to automatically trigger preventative maintenance scheduling.
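For the pump-cycle example in particular, the logic can be as simple as comparing a usage counter against a service limit. The sketch below is a hypothetical illustration; the limit, the field names, and the create_ticket() stub are assumptions, not an actual instrument or maintenance-system API.

```python
# Sketch of threshold-based preventive maintenance triggering.
# The service limit and the create_ticket() stub are hypothetical.
from dataclasses import dataclass

@dataclass
class PumpUsage:
    instrument_id: str
    piston_cycles: int            # cumulative cycles since the last seal change
    service_limit: int = 5_000_000

def create_ticket(instrument_id: str, message: str) -> None:
    # Placeholder for a call into a real maintenance/CMMS system.
    print(f"[{instrument_id}] maintenance ticket: {message}")

def check_pump(usage: PumpUsage, warn_fraction: float = 0.9) -> None:
    """Raise a ticket when usage approaches or exceeds the service limit."""
    if usage.piston_cycles >= usage.service_limit:
        create_ticket(usage.instrument_id, "Seal service overdue; schedule immediately.")
    elif usage.piston_cycles >= warn_fraction * usage.service_limit:
        create_ticket(usage.instrument_id, "Approaching seal service limit; plan maintenance.")

check_pump(PumpUsage("LC-07", piston_cycles=4_650_000))
```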
Kleine: For a long time, chromatographers have relied on manual data analysis methods, which can be time-consuming and lead to errors. With the latest development in (big) data analytics, chromatographers now have access to powerful tools, like databases, that can support and automate data analysis. These data analytics tools utilize machine learning algorithms, pattern recognition techniques, and statistical analysis methods to analyse large volumes of chromatographic data quickly and accurately. They can help in identifying peaks, quantifying compounds, detecting outliers, and optimising experimental conditions.
Welch: Big Data can mean different things to different people, but one practical example would be utilizing trending over time to inform on when to perform maintenance, replace instrumentation, or just manage practical utilization of instrumentation better. Tools like schedulers, control charting, or predictive modeling can help plan for events and keep the whole lab moving forward.
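One way to picture the control-charting idea Welch mentions is a simple Shewhart-style check on a trended system-suitability metric, such as the retention time of a check standard. The values and the 3-sigma limits below are purely illustrative assumptions, not output from any particular product.

```python
# Sketch of a Shewhart-style control chart check on a trended metric.
# Historical values and the new observation are illustrative only.
import statistics

historical_rt = [4.21, 4.20, 4.22, 4.19, 4.21, 4.20, 4.22, 4.21]  # minutes
mean_rt = statistics.mean(historical_rt)
sigma = statistics.stdev(historical_rt)
upper, lower = mean_rt + 3 * sigma, mean_rt - 3 * sigma

new_rt = 4.29  # latest system-suitability injection
if lower <= new_rt <= upper:
    print(f"RT {new_rt:.2f} min is within control limits ({lower:.2f}-{upper:.2f} min).")
else:
    print(f"RT {new_rt:.2f} min is outside control limits ({lower:.2f}-{upper:.2f} min): "
          "investigate the column, mobile phase, or pump before continuing.")
```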
Petrov: The term “Big Data” is typically used to describe large, unstructured data—think random text, images, and videos—where searching for an item of interest is not trivial and pattern recognition and training models are utilized instead. The chromatography data is structured for the most part, except for the chromatograms themselves, and therein lies the opportunity for using machine learning algorithms originally developed for Big Data. Detecting anomalies using such algorithms can substantially increase the efficiency of traditional methods for comparing chromatograms.
If we extend the scope beyond chromatography and consider the data lakes storing data from the multiple phases a substance goes through during its development or manufacturing process, unstructured data is how that can be described. From that standpoint, anomaly detection algorithms can be beneficial, as can another type of machine learning algorithm, known as classifiers. Classifiers identify clusters of similar data, and once clusters are associated with outcomes, the algorithms can predict an outcome for a set of data exhibiting similarities to a known cluster.
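A hedged sketch of this cluster-then-predict workflow is shown below, using synthetic chromatographic features and hypothetical pass/fail outcomes rather than any real data lake; the feature definitions and labels are assumptions made for the example.

```python
# Sketch of the cluster-then-predict idea: group historical batches by
# chromatographic features, associate each cluster with a known outcome,
# and predict the outcome for a new batch. All data here is synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Historical batches described by [impurity area %, main-peak RT shift (min)].
passing = rng.normal([0.10, 0.00], [0.02, 0.01], size=(40, 2))
failing = rng.normal([0.45, 0.08], [0.05, 0.02], size=(10, 2))
X = np.vstack([passing, failing])
outcomes = np.array(["pass"] * 40 + ["fail"] * 10)

# Cluster the historical data, then label each cluster by majority outcome.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=1).fit(X)
cluster_outcome = {}
for c in range(2):
    members = outcomes[kmeans.labels_ == c]
    values, counts = np.unique(members, return_counts=True)
    cluster_outcome[c] = values[np.argmax(counts)]

# Predict the outcome for a new batch from its features.
new_batch = np.array([[0.42, 0.07]])
predicted = cluster_outcome[int(kmeans.predict(new_batch)[0])]
print(f"Predicted outcome for the new batch: {predicted}")
```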
What obstacles do you think stand in the way of chromatographers adopting new data solutions?
Anderson: Primarily the pain and time investment to change. Data will need to be transformed and migrated into these newer paradigms and this will often be a lower priority than the many day-to-day laboratory business demands. A large contributor to this daunting effort is (re)validation, which is required in regulated environments. In non-regulated environments it is also becoming more commonplace because these organizations also recognize the value of truly FAIR data.
Kleine: There are five main obstacles today:
Welch: There is always a lag seen between new technology and adoption due to it not fitting exactly into the prior solution footprint. For example, moving software to cloud-hosted took a change in everything from architecture to validation approaches. But the only way to move forward is to challenge whether we keep procedures for familiarity or functionality.
Petrov: I see two major obstacles standing in the way of adopting new chromatography solutions. One is the accessibility of such solutions in terms of the deployment difficulties associated with software upgrades and validation. Solutions delivered as SaaS will help lower that barrier. The other obstacle is the willingness to accept that automated decision-making can displace the human factor in industries with critical outcomes, such as life sciences. If you think about it, humans are trusted with certain decisions because they have been trained appropriately and have proven that they can make such decisions, such as discerning good from bad chromatograms. Algorithms can be trained too, and they can prove in subsequent tests that they can make such decisions. The real difference is that once properly trained, algorithms can do that day in and day out with much higher efficiency and a lower failure rate than humans.
Meek: There is an additional challenge: adopting new technologies can be difficult in a regulated environment. Regulators have shown, however, that they are supportive of using technology to eliminate manual processes, such as manual integration, to ensure consistent and reliable results. AI does pose a particular challenge given the natural drift that can occur in models, which is why, at least for the time being, a human-in-the-loop approach that leans on the expertise of chromatographers provides the best balance.
What was the biggest accomplishment or news in 2023–2024 for Data Handling?
Anderson: Some might mention the growth in popularity of ASM or the availability of generative AI tools; however, we don’t think this area has seen the biggest accomplishment yet. Perhaps the coming months of 2024 will surprise us all.
Kleine: It is becoming easier and easier to store and handle large amounts of data. Improved computing power and network connections make this possible. Measurement results no longer need to be stored locally, making the storage space for data scalable. The large amounts of data are therefore also available over a longer period. With the help of these large amounts of data, AI can support the chromatography user in the evaluation and interpretation of measurements.
Welch: The biggest thing in the last year must be AI. Who hasn’t read something about ChatGPT? But the foundation for AI is not really in the algorithms or the user interface, but in how AI uses large banks of data. So, data architecture, classification, cataloguing, and the design of data tagging and master lists are really where the fundamental shifts are coming. Without stable structures, AI cannot utilize the available information in a productive way.
Meek: While not “data handling” specifically, I think everyone would agree that, since its launch in November 2022, ChatGPT has dominated technology news. While generative AI may have been the focus of the media, any AI-based technology is only as good as the quality and volume of data that informs it. I think the biggest accomplishment over the past two years is the work companies are doing to build data lakes that enable them to use data science to look across research and development and from the lab to the production floor.
Petrov: Organizations in the pharmaceutical space have been able to use AI to develop novel therapeutics in drug discovery and development. Using AI to generate extremely complex molecules and then test their binding capabilities in the virtual space is a ground-breaking advancement to speed up drug discovery like never before. Over time, we expect to see this technology deployed across the product lifecycle through manufacturing.
(1) Wilkinson, M. D.; Dumontier, M.; Aalbersberg, I. J.; et al. The FAIR Guiding Principles for Scientific Data Management and Stewardship. Sci. Data 2016, 3 (1). DOI: 10.1038/sdata.2016.18
LCGC International spoke with Marco Wolff, Product Manager of Automated Sample Preparation at Gerstel; Lauryn Bailey, Vice President of Global Marketing Strategy & Product Management at Phenomenex; James Edwards, Chromatography Manager at Porvair Sciences Limited; and Arielle Cocozza (in collaboration with Emily Eng and Stephanie Haviland), Technical Specialist at UCT, Inc.
What trends do you see emerging in sample preparation?
Wolff: More laboratories are automating their traditional manual workflows. A lot of laboratories still rely to some degree on manual sample preparation, but tough competition is bringing about further consolidation in the contract analysis business. This results in critical mass being reached in the form of sufficient numbers of samples for each laboratory to warrant investment in automation. Cost is always a key driver, but new requirements for sustainability and environmental impact reporting mean that the pressure is on to improve in these areas. The pursuit of “green” analytical practices is characterized by a reduced reliance on potentially toxic reagents and solvents. This movement aligns with both environmental concerns and the pragmatic goal of lowering cost-per-sample, both of which can be realized by miniaturization, which goes together with automation.
Laboratories are actively seeking methods to streamline and optimize sample preparation workflows to achieve sustainability goals while enhancing precision, accuracy, and efficiency in laboratory processes.
The current shortage of qualified laboratory staff also provides a further boost to the automation trend. Automated systems increase efficiency and provide assured productivity, while freeing up scarce qualified staff for more important tasks. This shift towards automation aligns with the broader industry goal of maintaining high-quality analytical standards while adapting to the evolving landscape of workforce availability.
Of course, not all analytical work is performed by contract laboratories. For example, many companies require in-house capabilities for quicker turnaround to support their production. Such typically smaller laboratory organizations are also finding it hard to compete for qualified personnel, and a growing number are seeking comprehensive solutions for specific analytical questions, such as the determination of production contaminants like 3-monochloropropane-1,2-diol (3-MCPD) in food or of environmental pollutants like per- and polyfluoroalkyl substances (PFAS). There is a discernible shift toward complete packages that include not just the device, but also the application and qualified support, to ideally implement a new application out of the box.
Bailey: There are multiple key areas that will be trending in sample preparation. For one, there is a demand to achieve lower detection limits. This could be because of stringent regulations, for example, PFAS tolerance limits, or a reduction in sample sizes, such as in drug metabolism and pharmacokinetics (DMPK) assays in drug development. Either way, these types of applications will require sample concentration or alternative approaches to sample preparation to achieve the detection requirements with the available sample size. A solution to this is "fit-for-purpose" sample preparation products, designed with specific analytes in mind to enable laboratories to achieve the results they demand. Second, there is a demand for automation and improved laboratory efficiency. Sample preparation has been a notoriously manual process, so there continues to be a drive toward efficiencies, either through better processes (fewer steps), through integration into automation platforms to increase throughput and improve reproducibility and accuracy, or both. Finally, there are demands for sustainability and less waste. Sustainability and managing environmental impact are becoming increasingly important, and sample preparation techniques can often generate large amounts of plastic and solvent waste. As many companies focus on sustainability and reducing their environmental footprint, we will start to see sample preparation techniques evolving to result in less solvent usage and overall waste for laboratories.
Edwards: One of the trends I have observed is the growing emphasis toward greener sample preparation. Given that sample preparation often involves the use of organic solvents, there is a clear opportunity for scientists to adopt more environmentally friendly practices. Several ideas have been proposed to achieve this, including substituting harmful solvents, reducing sample and solvent volumes, and implementing more reproducible processes that require fewer replicates or retests.
Cocozza: Recent advances in liquid chromatography–mass spectrometry (LC–MS) technology have had a significant impact on sample preparation. The high resolving power and sensitivity of these instruments have greatly influenced trends in sample preparation, leading to a greater emphasis on automation, method consolidation, and sample size reduction. One notable trend is the miniaturization of sample preparation processes, which has been facilitated by using smaller sample sizes. This approach has not only enhanced laboratory efficiency, but it has also enabled the analysis of more complex samples that were previously challenging to analyze. By reducing the amount of sample required for analysis, researchers can optimize time and resources while obtaining more accurate and precise data.
In your opinion, what is the future of sample preparation?
Wolff: The future of sample preparation appears exceptionally promising, particularly considering the challenges faced by analytical laboratories. The key to addressing these challenges lies in automation, a transformative force that not only streamlines workflows but also ensures consistent and reliable results.
Laboratories are increasingly recognizing the indispensable role of automation in overcoming the complexities associated with sample handling. By leveraging advanced robotics and intelligent software, laboratories can achieve a level of precision and repeatability that is challenging to attain through manual methods.
One notable advantage of automated sample preparation is its ability to facilitate “good” analytical work, including sample preparation, without adding personnel. “Dilute and shoot” methods, while suitable for some straightforward analyses, face limitations as questions become more complex or sample matrices more intricate, even with the advent of increasingly sensitive mass spectrometers. In such cases, automated sample preparation becomes indispensable to ensure reliable and robust results.
Undoubtedly, a crucial part of the future of sample preparation is software. To be successful, laboratory solutions will need “user-friendly” rugged software for both system control and data analysis. Integrated software solutions will continue to streamline laboratory operations from sample logging to data reporting, increasing the overall efficiency and effectiveness of analytical workflows.
Laboratories are increasingly demanding intuitive software interfaces that allow for easy and user-friendly operation of analysis systems. This ensures that even users with varying levels of expertise can navigate and utilize these systems effectively, minimizing the learning curve.
Moreover, tailored data analysis software designed to address specific analytical questions will be in demand. Customized analysis software will streamline data interpretation and accelerate decision-making processes. Even non-skilled production operators will quickly be able to get go/no-go results from a laboratory instrument. An additional benefit is the reduction in the need for software training. Laboratories can streamline their onboarding processes as user-friendly interfaces and task-specific analysis tools become more widely used. This saves time and resources while allowing the laboratory staff to focus on their core responsibilities and get the job done.
Bailey: Sample preparation is moving away from a one-size-fits-most approach and back to a targeted approach. The future is workflow-specific solutions that are designed for targeted analytes and compound classes rather than a “catch-all” approach. This is because the demand for accurate results is paramount, and these targeted workflows will ensure the most accurate results for a targeted assay.
The future of sample preparation will also be a simplification through innovation. Future techniques will take added steps and added waste out of the sample preparation process to deliver the results the laboratory needs more sustainably and in less time.
Edwards: The future of sample preparation is likely to revolve around workflow automation. Automation aligns with the principles of green sample preparation by enabling more accurate work with smaller volumes and minimizing human error. Because of this increased accuracy at smaller volumes, we can also expect sample preparation methods to shift towards smaller bed weights, allowing analysts to reduce the volumes of solvent and sample they use.
Cocozza: Automated systems for solid-phase extraction (SPE) seem to be a trend for laboratories to prepare more samples simultaneously. There is a need for high throughput and streamlined processes while maintaining consistency and quality. With new regulations for testing, the volume of samples rises while laboratories are tasked with doing more with less. This means less labor, less bench space, fewer consumables, faster workflows, and a greater emphasis on automation without sacrificing quality.
To meet these expectations, laboratories are moving towards quick, easy, cheap, effective, rugged, and safe (QuEChERS) methods for food matrices, micro solid-phase extraction, and SPE. Using QuEChERS eliminates the need for large sampling volumes, large amounts of bench space, and bulky extraction equipment. Another benefit is that it uses less solvent and encourages environmentally friendly practices, leading to the development of greener sample preparation methods with reduced solvent waste generation.
What obstacles do you think stand in the way of sample preparation development?
Wolff: Obstacles hindering the development of automated sample preparation are notably linked to outdated standards and norms. These often fail to consider the potential of automation, for example requiring large sample volumes that are neither conducive to automation nor necessary with appropriate miniaturization. An example is the requirement for a 250-mL water sample to determine PFAS levels, when 1–2 mL would be sufficient.
Another obstacle is the lack of standardized communication interfaces between hardware and software, which poses a problem when creating integrated systems that incorporate solutions or devices from various manufacturers and complicates the seamless interoperability of different components within a comprehensive system. Standardizing these interfaces would greatly facilitate the development and integration of comprehensively automated sample preparation systems.
Furthermore, obstacles emerge in the form of customer expectations when looking for sample preparation solutions. On the one hand, there is a growing demand for complete, turnkey solutions that encompass every aspect of an analytical application, including sample preparation. On the other hand, solutions must be designed with an openness that allows for customization to accommodate specific customer requirements. This customization can vary widely, from changes in sample quantity and matrix to adapting to different analysis systems and chromatography data systems (CDS). Striking the right balance between providing a pre-packaged solution and maintaining the flexibility for customization remains a pivotal challenge for suppliers.
Bailey: The ever-evolving needs of the life science industry are the biggest obstacle we face in sample preparation solution development. First, there are emerging analytes of interest. From emerging contaminants to novel drug therapeutics, the scope of the molecules researchers need to analyze is continuously changing. Sample preparation needs to keep up with those needs. Second, project timelines are getting condensed. Now more than ever, we want to get test results immediately, or drugs to market faster. This means that researchers are working hard to accelerate time to results. Sample preparation techniques need to align with these industry timelines.
Edwards: Two main challenges come to mind. First, there is a time investment required for proper sample preparation. Some individuals may perceive this investment as not worthwhile, but inadequately prepared samples can lead to issues during analysis. These issues include increased instrument maintenance, more frequent sample retesting, and complex baseline interpretation. Second, there can be a disconnect between academic and industry partners, making collaboration on effective sample preparation solutions more challenging.
Cocozza: As instrumentation becomes more specialized and sensitive, analytical equipment such as quadrupole time-of-flight MS (QTOF-MS) and high-resolution MS (HRMS) instruments may hinder new development from a financial standpoint. The substantial upfront investment required may be out of reach for some laboratories, despite their ability to complete preparations sufficiently. Beyond initial costs, ongoing expenses for maintenance and servicing, along with the need for specialized training programs, contribute to the cost factor. The complexity of operation, potential compatibility issues, and the training required for personnel add to the analytical challenges. Regulatory compliance requirements become more stringent as analytical methodology advances, creating more hurdles for laboratories developing analytical methods.
The second challenge pertains to the ongoing expenses associated with the purchase and maintenance of robotic automation systems. Although they benefit from the efficiency gained by automation, laboratories can encounter many issues related to day-to-day use. As these systems are used consistently, unexpected failures are bound to occur. Laboratories tend to have solvent-laden air, which can contribute to circuitry failures. In high-throughput preparation environments, rigorous handling by technicians, sample matrix interference, or degradation of consumables can also contribute to automated systems not performing at their optimal capability. To mitigate the risks of downtime and ensure the uninterrupted operation of critical processes, laboratories find themselves compelled to purchase instruments in duplicate.
What was the biggest accomplishment or news in 2023–2024 for sample preparation?
Wolff: The most noteworthy accomplishments and developments in sample preparation in 2023 and 2024 are a mixture of technological advances and a revival of collaborative initiatives.
One outstanding development is the growing support for initiatives aimed at establishing open standards for manufacturers of laboratory equipment. Companies operating worldwide are demanding standardized communication interfaces and data standards. Notably, the laboratory automation data standard (LADS) has gained prominence, leading toward the development of standardized interfaces between laboratory instruments from different companies. Other communication and data standards, like SiLA 2, the Allotrope Data Format (ADF), and the Analytical Information Markup Language (AnIML), contribute to this development. These initiatives reflect a collective commitment to overcoming interoperability challenges and promoting seamless integration between different laboratory technologies.
In this context of standardization, the inauguration of the Future Lab at the IUTA Institute in Duisburg, Germany, stands out as a milestone this year. This visionary endeavor, embedded in a long-term project, showcases in a meaningful way how the laboratory of the future could look.
As the world has emerged from the challenges posed by the Covid-19 pandemic, the analytical community has returned to in-person meetings, conferences, and trade shows. The post-pandemic era has seen a revitalized enthusiasm for face-to-face interaction. Combined with the benefits of electronic exchanges through online meetings, a hybrid approach is establishing itself that maximizes the advantages of both. This adaptive approach reflects the resilience and forward-thinking spirit of the analytical community, ensuring continued knowledge exchange and collaboration in a dynamic and evolving landscape.
Bailey: There were multiple accomplishments in sample preparation in 2023. First, 2023 saw the introduction of more "micro" solutions. From pipette tips to micro-cartridges, solutions are coming to market to enable researchers with limited samples. Second, there was an increased focus on automated solutions. A number of automated or automation-enabled sample preparation workflows have been developed, allowing researchers to increase their sample volumes and productivity and enabling resource savings (time and personnel). Finally, the year saw novel solutions developed for specific workflows. Sample preparation innovations now keep the scientist's end goal in mind, so more solutions are surfacing to meet the specific needs of the laboratory and deliver the most confidence in results.
Edwards: I would say the increased discussion around green sample preparation. While attending conferences, I have noticed an increasing trend of conversations centered on this topic. Considering the significant environmental impact of laboratories, it is crucial to address this area and seek opportunities for improvement.
Cocozza: The Business Research Company recently released its Sample Preparation Global Market Report, which projects that the market will see significant growth over the next few years. According to the report, the market is expected to reach $12.79 billion by 2028, with a compound annual growth rate of 8.1%. Although this is not a groundbreaking innovation in sample preparation, this report emphasizes the ongoing importance of this process in various scientific disciplines and highlights its sustainability in the future. It’s worth noting that even as we try to streamline and automate the process, sample preparation remains a crucial aspect of many scientific experiments.