LCGC North America
The best approach is to document everything for each hardware configuration as a separate method and then qualify and validate them for each sample or category of sample.
John V. Hinshaw
A reader recently wrote:
What constitutes a complete description of a chromatography method? For example, the method of integration is never described for a transfer to a different lab. The same is true of the inlet liner used, including the configuration, volume, use of glass wool packing, and so forth, nor is the time for the purge indicated very often.
This question came by e-mail while I was writing the preceding two-part series of columns on pressure, temperature, and column issues that arise when labs run the same gas chromatography (GC) method on multiple instruments. This month's "GC Connections" addresses some of the issues that method transfer raises for method documentation and validation.
In the context of the chromatography lab environment, a method could be defined as a collection of instructions that completely describe the sample analysis process such that trained technicians can prepare and analyze the same type of sample and produce equivalent results. I didn't write "the same sample" or "the same results" for good reasons. Each single sample is a unique, more or less representative member of a sample collection that, taken as a whole, characterizes the entire population under scrutiny with varying degrees of accuracy. No two samples — or snowflakes — are exactly the same. Even if the instrumentation were capable of analyzing samples with vanishingly small run-to-run deviations, at some level, the results will all be different solely due to sample-to-sample variations in collection, preparation, and handling. Instrumental variability only increases the results scatter.
Figure 1: Solvent-column incompatibility: (a) methanol and (b) methylene chloride. Column: 30 m x 250 μm x 0.25-μm film DB-5ms (J&W Scientific, Agilent Technologies, Wilmington, Delaware); 1 mL/min helium carrier, constant flow, vacuum compensation on (7 psig initial pressure); 40 °C, hold 2 min, 1.5 °C/min to 175 °C, hold 20 min; inlet: splitless, 250 °C, 2-min splitless time, programmed-pneumatic control; detector: QMass 9000 (Perkin-Elmer, Shelton, Connecticut), m/z 40–300 total ion chromatogram. Sample: 1 μL of 2 ppm each of 1-octanol, n-undecane, 2,6-dimethylphenol, 2,6-dimethylaniline, n-dodecane, and n-tridecane. From reference 1.
A chromatography method could be said to consist of three major parts: sample preparation, component separation, and data handling. These parts correspond with three categories of lab equipment: wet chemistry and associated automation equipment, chromatography systems, and data-handling systems. The three method parts are interrelated closely; any one part is incomplete without the other two, and characteristics of each part affect the others. Sample preparation is very broad in scope and lies somewhere outside of the usual range of topics addressed in this column. But we should examine a simple example in order to better understand some of the broader issues.
Suppose that we want to use a chromatographic assay to determine the amount of active ingredient in each of 20 bottles of 1000 tablets that were manufactured from the same batch of starting material, as well as to measure its variability. We could, for example, sample 20 tablets from each bottle, dissolve them together, extract, dry, and filter the solution, then take it up in a suitable solvent and pipette it into three autosampler vials for triplicate chromatography analyses. Do we have 60 of the same sample? Not at all. Neither the three vials in each group nor the 20 groups themselves can be considered the "same" sample. Each is subject to various influences in the sampling and preparation process, such as the selection of tablets from each bottle; the precision and accuracy of the pipettes, balances, and volumetric flasks used; the technicians' ability to obtain the specified measurement performance; and the timing, temperature, humidity, and storage of the samples during preparation and before analysis. Many opportunities for uncertainty and outright mistakes arise in this chain of events, all of which occur before the sample gets to the syringe or sample loop for injection.
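To make the distinction between injection-to-injection and bottle-to-bottle scatter concrete, here is a minimal sketch of how the 20 bottles x 3 replicate injections described above might be summarized. The numbers are simulated, and the magnitudes of the bottle and injection effects are assumptions chosen purely for illustration, not real assay data.

```python
# Illustrative sketch only: simulated results standing in for the
# 20 bottles x 3 replicate injections discussed in the text.
import numpy as np

rng = np.random.default_rng(1)

n_bottles, n_reps = 20, 3
true_content = 100.0                                  # nominal assay value, arbitrary units
bottle_effect = rng.normal(0, 1.5, n_bottles)         # assumed bottle-to-bottle spread
results = (true_content
           + bottle_effect[:, None]
           + rng.normal(0, 0.5, (n_bottles, n_reps)))  # assumed injection-to-injection scatter

bottle_means = results.mean(axis=1)
within_sd = results.std(axis=1, ddof=1).mean()        # average repeatability within a bottle
between_sd = bottle_means.std(ddof=1)                 # spread among bottle means

print(f"mean assay:        {results.mean():.2f}")
print(f"within-bottle SD:  {within_sd:.2f}")
print(f"between-bottle SD: {between_sd:.2f}")
```

Even this toy calculation shows why "60 of the same sample" is the wrong mental model: the triplicate injections characterize only part of the total variability, and the bottle-to-bottle term has to be assessed separately.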
Is this preparation methodology suitable for these samples? Perhaps not. The prepared sample should be compatible with the subsequent chromatography. In GC, the solvent (if there is one) plays a crucial role in injection and also influences the separation process. Figures 1a and 1b show the difference that solvent choice can make for trace-level analysis. In Figure 1a, with methanol solvent, the peaks are broadened and distorted. Uneven solvent wetting of the column surfaces caused the collection of droplets that were pushed along by carrier gas flow. Methylene chloride solvent, shown in Figure 1b, was more compatible with the column inner surfaces and formed an even film that engendered greatly improved peak shapes. Solvent identity is just one example of the many ways in which sample preparation interacts with the chromatography.
There is little use in proceeding with a complex or large-scale analytical project without carefully defining the goals for the analysis first, followed by in-depth investigation, evaluation, and subsequent validation of suitably coordinated analytical processes that are capable of meeting that goal, including sample acquisition and preparation, separation, and data manipulation. Experimental design is a complex and extensive field. I have a 650-page book on the subject, and I don't purport to understand most of it well at all. The reader's question is better understood by placing it in the context of an overall experimental design and realizing that there is critical information not captured in the chromatographic portion of the methodology.
A chromatography method should capture a snapshot of all the information that would be necessary to recreate the analysis at a later date. This information includes both a configuration-and-setpoint portion and an as-run, actual-values portion. Equipment information, device setup, and operational setpoints are nominally the same from one analysis to the next. Real-time readouts of temperatures, flows, and other controlled or monitored parameters change from moment to moment. For the most part, those readouts are tracked by the instrument system, and it will indicate if the operational conditions have strayed outside acceptable ranges. Presumably, an analysis will not occur unless the conditions are met. This type of ready/not-ready indication is a fundamental contributor to data quality and integrity. If operational parameters dynamically fall outside their nominal ranges during an analysis, however, the occurrence should be captured in a run-specific record that, in effect, forms part of the methodology for the individual sample under scrutiny. Of course, this type of event should trigger an immediate review of the results' validity. In any case, such dynamic run records form an important part of the data record without which the actual analysis could not be recreated accurately.
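As an illustration of what such a run-specific record might capture, the sketch below pairs each setpoint with its actual reading and flags excursions for review. The parameter names, tolerance limits, and values are hypothetical; a real chromatography data system records this information automatically, so this is only a conceptual model.

```python
# Minimal sketch of a run-specific record, assuming invented parameter
# names and tolerances; real data systems capture this automatically.
from dataclasses import dataclass, field

@dataclass
class ParameterReading:
    name: str
    setpoint: float
    actual: float
    tolerance: float                     # allowed deviation from setpoint

    @property
    def in_range(self) -> bool:
        return abs(self.actual - self.setpoint) <= self.tolerance

@dataclass
class RunRecord:
    sample_id: str
    readings: list = field(default_factory=list)

    def excursions(self):
        # Parameters that strayed outside their acceptable ranges during the run
        return [r for r in self.readings if not r.in_range]

record = RunRecord("batch-07-vial-02", [
    ParameterReading("oven initial temp (C)", 40.0, 40.1, 0.5),
    ParameterReading("carrier flow (mL/min)", 1.00, 1.12, 0.05),
])

for r in record.excursions():
    print(f"REVIEW: {r.name} read {r.actual} vs setpoint {r.setpoint}")
```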
Often, laboratories will employ different instruments for a particular analysis depending upon the workload, scheduling, and maintenance. This usually isn't a problem if the instruments are the same model with the same options, if they all have been maintained and validated appropriately, and if they are shown to be capable of delivering equivalent results for the method and sample in question. Additional concerns arise should different options or instrument models be used. In such cases, validation becomes the only path to demonstrating equivalency. Furthermore, unless each instrument is tied into the same data-handling and information management systems, it might be difficult or impossible to demonstrate equivalency between disparate manufacturers' chromatography systems.
There always will be instrument-dependent disparities in the contents of a method. Different instruments are controlled in different ways, and there is imperfect cross-model uniformity in the name, number, range, and function of many instrument parameters. The best approach is to document everything for each hardware configuration as a separate method and then qualify and validate them for each sample or category of sample. That way, if instrument model A and instrument model B are to be used for the same analysis, each has an appropriate method with the right settings and configurations. The same statement can be made about data-handling systems.
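One way to picture the "separate method per hardware configuration" approach is sketched below. The instrument labels, parameter names, and values are invented for illustration and are not drawn from any particular vendor's software; the point is simply that each instrument model carries its own fully documented, separately qualified parameter set for the same analysis.

```python
# Hypothetical per-instrument method documentation; all names and values
# are assumptions made for this sketch.
methods = {
    "model_A": {
        "inlet":   {"mode": "splitless", "temp_C": 250, "splitless_time_min": 2.0},
        "carrier": {"gas": "He", "flow_mL_min": 1.0, "control": "constant flow"},
        "liner":   {"type": "single taper", "glass_wool": True, "volume_uL": 870},
    },
    "model_B": {
        # Same analysis, documented with model B's own options and settings
        "inlet":   {"mode": "splitless", "temp_C": 250, "splitless_time_min": 1.8},
        "carrier": {"gas": "He", "flow_mL_min": 1.0, "control": "constant pressure"},
        "liner":   {"type": "straight", "glass_wool": False, "volume_uL": 900},
    },
}

def method_for(instrument_model: str) -> dict:
    # Each instrument model gets its own qualified, validated parameter set
    return methods[instrument_model]

print(method_for("model_A")["liner"])
```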
The categories of instrument parameters presented here include items that are common to or have equivalent settings in most instrument systems. It is not practical to enumerate all of the parameters on an instrument-specific basis. The list would be far too long for publication and would be obsolete before it could be printed due to the constantly evolving landscape of the instrument business. And of course, gathering all that information could be very time consuming. For real-world situations, analysts might choose to start with the list printed here and then add or remove items as appropriate to their individual instruments, options, and configurations.
I have omitted common information that should be recorded in all cases, such as model and serial numbers, service and maintenance records, installation history for configurable options, and so forth.
The following items are associated with the modern electromechanical liquid autosamplers used in most labs today. Some autosamplers expose additional control parameters that permit special sampling modes such as large-volume, solid-phase microextraction (SPME), or headspace, which are not listed here. The absence, presence, and type of each configurable hardware item, each functional parameter's nominal setpoint value, and the actual values reported during the analysis should be recorded. For example, if the method calls for an injection volume of 1.0 μL from a 10.0-μL syringe, then the installation of the correct syringe and the successful injection of the correct volume as reported by the autosampler controller should be noted in the run record for each analysis.
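The 1.0-μL injection from a 10.0-μL syringe mentioned above could be recorded along these lines. The field names and the 0.05-μL tolerance are assumptions made for the sketch, not values or identifiers from any actual autosampler controller.

```python
# Hedged illustration of a per-run autosampler check: confirm the configured
# syringe and the reported injection volume against the method's setpoints.
method = {"syringe_volume_uL": 10.0, "injection_volume_uL": 1.0}
run = {"installed_syringe_uL": 10.0, "reported_injection_uL": 0.98}

syringe_ok = run["installed_syringe_uL"] == method["syringe_volume_uL"]
volume_ok = abs(run["reported_injection_uL"]
                - method["injection_volume_uL"]) <= 0.05   # assumed tolerance

run_record_entry = {
    "syringe_confirmed": syringe_ok,
    "injection_confirmed": volume_ok,
    **run,
}
print(run_record_entry)
```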
The inlet strongly influences GC separation and quantitation. With the wide variety of available inlet systems and options, chromatographers should record the exact injection type and mode employed. It's not sufficient just to note whether a packed or capillary inlet was used. Carrier gas pneumatics are included in the inlet category because the inlet and the pneumatics depend upon each other so strongly. Whether mechanical or electronic, the pneumatic configuration and operational mode should be documented along with the pressure, flow, or velocity setpoints. Some items in the list will apply only to specific pneumatic systems or modes of operation, and there are other, less common categories that have been omitted.
Carrier Source
Inlet Pneumatics
Inlet Configuration
Injection
The column category includes the GC oven controls and configuration. Some items are specific to packed or capillary columns. Columns age with time and use, and so it is good practice to regenerate them periodically, if possible, and replace them at regular intervals.
GC Oven
Column Identity
This list covers most common GC detectors, but it doesn't encompass mass spectrometers or similar detectors. Because those detectors are computer controlled, it's fairly straightforward to have them produce and store the kind of method documentation under discussion here.
Detector Configuration
Detector Gas
Detection Conditions
Data processing parameters vary widely across the different products. However, this should not be a problem because, of all the chromatography method categories, data processing is the easiest to document. Essentially all of the major products support full encapsulation of the data-handling portions in the associated run records, and most go beyond that to provide audit trail documentation and efficient interfaces to laboratory information management systems (LIMS).
The data system, if possible, should act as the primary driver for electronic method documentation. For data and instrumentation products from the same manufacturer, this isn't much of a question unless obsolescence is an issue, but cross-model deployments can miss significant portions of the full set of analysis-specific information. Some manufacturers' data systems support other companies' products in this sense, but in such cases, it is a very good idea to validate the level of documentation and compatibility with the specific instrumentation configuration and interfaces that are to be used for routine production.
Data Acquisition-Signal Processing
Peak Detection
Peak Measurement
Timed Events
Peak Identification
Calibration
Quantitation
A chromatographic method consists of several categories of information, both about the instrumentation configuration and setup as well as about the actual analyses as they occurred. By documenting as much of this knowledge as possible, analysts can produce better quality results as well as rely on having a rich source of information should it be necessary to reproduce the analyses or reexamine the results at a later date. Modern data systems have the capability to provide much of the information and store it in appropriate places.
(1) J.V. Hinshaw, LCGC 14(7), 568-575 (1996).
John V. Hinshaw "GC Connections" editor is senior staff engineer at Serveron Corp., Hillsboro, Oregon, and a member of LCGC's editorial advisory board. Direct correspondence about this column to "GC Connections," LCGC, Woodbridge Corporate Plaza, 485 Route 1 South, Building F, First Floor, Iselin, NJ 08830, e-mail lcgcedit@lcgcmag.com.
For an ongoing discussion of GC issues with John Hinshaw and other chromatographers, visit the Chromatography Forum discussion group at www.chromforum.com.