Special Issues
As the GC–MS market trends toward high throughput and fast GC analyses, data analysis software must keep pace.
The terms "high sample throughput" and "fast GC" have become increasingly common in the gas chromatography (GC) market. Whether it's a QC chemist looking for a way to handle samples whose numbers increase with the legislation that governs them or an environmental chemist looking for an edge in an extremely competitive market, increased throughput is a concern throughout the gas chromatography–mass spectrometry (GC–MS) world. As reduced sample run times become a higher priority for those purchasing new capital equipment, instrument manufacturers have been developing systems to meet new productivity requirements.
(Photo: Getty Images)
This can be seen in gas chromatographs capable of faster ramps or reduced cool-down times between runs, and in mass spectrometers capable of higher acquisition rates. As an example, a GC–time-of-flight (TOF) MS system (Pegasus, LECO Corporation, St. Joseph, Michigan) can acquire data at 500 spectra per second, fast enough to sample adequately across extremely narrow peaks (less than 50 ms). With hardware capable of these speeds, analysis times routinely can be halved, quartered, or better. However, this increased throughput does not come without challenges of its own. Two significant challenges arise as a direct result of this much-sought-after speed: coelution in the analyses and the review of the resulting data. To handle both properly, adequate software becomes an absolute necessity. The software must evolve from a tool that merely acquires and reports data into a dynamic, powerful tool capable of addressing both of these issues. This article shows how software can accomplish these tasks when used with a GC–TOFMS system.
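The relationship between acquisition rate and peak width can be checked with simple arithmetic. The sketch below uses the figures from the text (500 spectra per second, peaks under 50 ms); the rule of thumb of roughly 10 data points across a peak for reliable quantitation is a common chromatography guideline, not a value from this article.

```python
# Sketch: how many spectra land across a chromatographic peak at a given
# acquisition rate. Values (500 Hz, 50 ms peak) come from the text; the
# ~10-points-per-peak threshold is a common rule of thumb, not a LECO spec.

def points_per_peak(acq_rate_hz: float, peak_width_s: float) -> float:
    """Number of spectra acquired across the base width of one peak."""
    return acq_rate_hz * peak_width_s

# A 50 ms peak sampled at 500 spectra/s yields about 25 points,
# comfortably above the ~10 points typically needed to define a peak.
n = points_per_peak(500, 0.050)
print(n)
```

At a conventional quadrupole scan rate of a few spectra per second, the same 50 ms peak would be sampled less than once, which is why fast chromatography demands fast detectors.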
Figure 1: Chromatogram displaying nine pesticides present over a four-second window following the deconvolution.
Coelution
Coelution is familiar to most analysts, many of whom have spent much of their careers attempting to reduce its frequency or severity. When performing time-compressed chromatography this is, of course, no longer possible, as reduced run times leave less chromatographic space over which to spread the analytes. Instead, software must be capable of deconvoluting the newly coeluted peaks. Figure 1 illustrates the symbiotic relationship between hardware capable of high acquisition rates (in this case, 40 spectra per second) and software capable of properly handling those data by deconvoluting the resulting spectra. The deconvolution algorithm within ChromaTOF software (LECO) separates nine pesticides present within a 4-s window. One spectrum containing three of the pesticides at 158.529 s is shown in Figure 2, with the masses corresponding to fenthion, chlorpyrifos, and parathion identified. Following the algorithm's deconvolution of the spectrum, the resulting spectrum for chlorpyrifos can be extracted easily. An example of this is shown in Figure 3.
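ChromaTOF's deconvolution algorithm is proprietary, but the core idea can be illustrated: a mixed spectrum is modeled as a weighted sum of component spectra, and solving for the weights lets a clean spectrum for each component be extracted. The sketch below uses toy intensity values, not real pesticide spectra, and a plain least-squares solve in place of the commercial algorithm.

```python
import numpy as np

# Toy reference spectra: rows are m/z channels, columns are components.
# Column order (hypothetical): fenthion, chlorpyrifos, parathion.
ref = np.array([
    [10.0, 0.0, 2.0],
    [ 0.0, 8.0, 1.0],
    [ 4.0, 3.0, 0.0],
    [ 1.0, 0.0, 9.0],
])

# Simulated observed spectrum at one scan: a mixture of the three
# components with contributions 2.0, 1.0, and 0.5.
mixed = ref @ np.array([2.0, 1.0, 0.5])

# Recover the contribution of each component by least squares.
coeffs, *_ = np.linalg.lstsq(ref, mixed, rcond=None)
# coeffs ≈ [2.0, 1.0, 0.5]

# The deconvoluted ("extracted") spectrum for chlorpyrifos alone:
chlorpyrifos_spectrum = coeffs[1] * ref[:, 1]
```

Real deconvolution also exploits the time dimension (peak profiles across adjacent scans), which is why high acquisition rates matter: more scans across a coeluting cluster give the algorithm more information to separate the components.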
Figure 2: Resulting spectrum at 158.529 s identifying fenthion, chlorpyrifos, and parathion.
Data Review
The second challenge posed by increased throughput is less often considered but equally important. As sample run times are reduced significantly, the number of samples analyzed increases, as does the amount of data to review. Consider an analyst who has run the same analysis for years with a standard run time of 1 h. If, in the interest of increased throughput, a high-speed system is purchased and run times are reduced to 10 min, the laboratory benefits from dramatically increased data acquisition, but the analyst now has approximately six times as much data to review. Data review becomes the most likely bottleneck in the process. The software must therefore reduce the time taken to review each sample by about the same factor that the run time was reduced. Effective software meets this goal in two ways. First, it should take advantage of increased automation so that analysts have less to address between samples. Second, it should present information to the user in a manner that makes the same tasks quick and efficient to accomplish.
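The throughput arithmetic above can be made explicit. The run times (60 min and 10 min) come from the text; the per-sample review budget is a hypothetical value for illustration.

```python
# Sketch of the throughput arithmetic: cutting run time from 60 min to
# 10 min yields 6x the samples per instrument-day, so per-sample review
# time must shrink by the same factor to keep review off the critical path.

old_run_min, new_run_min = 60, 10
speedup = old_run_min / new_run_min            # 6x more samples per day

review_budget_min = 12.0                       # hypothetical review time per sample today
new_budget_min = review_budget_min / speedup   # budget needed to avoid a bottleneck
print(speedup, new_budget_min)
```

If review time per sample stays fixed while acquisition speeds up sixfold, the queue of unreviewed data grows without bound, which is the "bottleneck" the text describes.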
Figure 3: Deconvoluted spectrum (top) and NIST library spectrum (bottom) for chlorpyrifos.
Many functions in the software used here (ChromaTOF, LECO) are capable of increased automation to accomplish this goal. For example, it allows for a sample to have more than 15 processes automatically applied through one mouse click, including Automated Peak Find, library searching, quantifying of the analytes, applying a retention index, printing a series of reports, uploading data to a laboratory information management system (LIMS), and exporting files to a backup drive. By automating these functions, the software frees analysts from having to perform these time-consuming processes and enables them to focus on reviewing the results instead.
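The "one mouse click" idea amounts to chaining every post-acquisition step into a single pipeline. The sketch below mirrors the steps named in the text, but every function here is a hypothetical placeholder, not the ChromaTOF API.

```python
# Sketch: chaining automated post-acquisition steps so one action processes
# a sample end to end. Step names mirror those listed in the text; all
# functions are hypothetical stand-ins, not real ChromaTOF calls.

def find_peaks(sample):      sample["peaks"] = ["peak_1", "peak_2"]; return sample
def library_search(sample):  sample["ids"] = ["fenthion"]; return sample
def quantify(sample):        sample["amounts"] = {"fenthion": 0.12}; return sample
def report(sample):          sample["reported"] = True; return sample

PIPELINE = [find_peaks, library_search, quantify, report]

def process(sample: dict) -> dict:
    """Apply every automated step in order -- the 'one click' workflow."""
    for step in PIPELINE:
        sample = step(sample)
    return sample

result = process({"name": "sample_001"})
```

The design point is that adding a step (a LIMS upload, a file export) means appending one function to the pipeline, not adding another manual task for the analyst between samples.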
While automation is extremely important in allowing for increased throughput with regard to data review, the layout and design of the software are equally important. Users must be able to work seamlessly within a region of the software without needing to move back and forth between sections to accomplish a single goal. It also is essential that different information be displayed in a manner that facilitates speed, ease, and flexibility. The capability to have information displayed while reviewing samples or calibrations, the flexibility in how it is displayed, and its location on the screen all increase operator productivity. Figure 4 shows a screen shot of this increased functionality for the calibration of several analytes. From this single view, analysts can see a list of the analytes in the calibration, the standards in which they appeared, and the chromatogram and spectrum for every analyte in each of the standards.
Figure 4: Screen shot generated from software displaying calibrations for several analytes (ChromaTOF, LECO).
Conclusion
As today's laboratories look for ways to increase throughput and productivity, faster techniques such as GC–TOFMS are gaining interest. These techniques bring instrumentation capable of collecting large amounts of data in a shorter time, enabling more samples to be run. However, this increased volume of data presents a new set of challenges in the data-analysis process. With so much data present, it is often difficult to sort through it in a time-efficient manner, potentially reversing the gains of increased throughput.
With the proper software, however, TOFMS instruments have the potential to become one of the most sought-after tools for GC analysis. To be beneficial to the user, this software must overcome two challenges: the coelution of analytes and the review of the resulting data. This article provided examples of how an integrated software package can address both.
Lucas Smith
LECO Corporation
Please direct correspondence to lucas_smith@leco.com