Over the last two years, our consulting firm has had the opportunity to review and assess more than 150 litigation discovery packets from a multitude of forensic testing laboratories. We have written previously about the overall lack of sufficient method validation and quality control in the cases that we have reviewed, the majority of which have been for blood alcohol determinations (1–3).
We have argued these and other deficiencies in the courtroom on several occasions. It is disheartening to see forensic analysts from crime laboratories cling to outdated standard operating procedures that do not conform to consensus standards promulgated by nationally recognized organizations, such as the American Academy of Forensic Sciences (4,5). As analytical chemists who are regularly involved in the development of new methods, be it for environmental, pharmaceutical, or forensic applications, we rely on consensus standards to define the steps and procedures needed to prove that a method, and the measurements made with it, are reliable. When these steps are not followed, the method and measurements may be subject to uncertainties and inaccuracies that have not been properly assessed.
In the scientific publication process, studies lacking appropriate validation and quality control are regularly rejected during peer review. Similarly, in forensics, measurements that have not been supported by widely accepted criteria for validation and quality control should not be relied upon in litigation, especially considering that someone’s civil liberties may be at stake.
The uncertainty (or error) associated with a reported value is an important criterion for assessing the reliability of a measurement. Accuracy can quickly be called into question when uncertainty becomes elevated. Uncertainty should also be assessed regularly, during the course of routine measurement of samples, because instrument performance does not remain constant over time. Instruments have to be regularly maintained and repaired, because their performance will eventually deteriorate with use.
Uncertainty should also be comprehensively assessed on the instrument in question. Performance results obtained from one instrument should not be used to indicate the performance of a different instrument. This statement is obvious to the readership of LCGC, yet such assessments, in which results from one instrument are used to validate the performance of another, have been commonly encountered in our review of forensic laboratory documentation.
When a blood alcohol concentration is reported, it is usually accompanied by a value for uncertainty at the 99.7% confidence level. In a large collection of cases we have reviewed, this level of uncertainty has been declared to be 4.3% (for example, 0.188 ± 0.008 g/dL). The assertion is that the “true” result for this blood alcohol determination has a 99.7% chance of lying between 0.180 and 0.196 g/dL, and only a 0.3% chance of falling outside that range.
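To make the arithmetic behind such a statement explicit, here is a minimal sketch in Python. It treats the 99.7% figure as an expanded uncertainty with a coverage factor of roughly k = 3 for a normal distribution; the numbers are simply those from the example above.

```python
# Minimal sketch of the arithmetic behind the reported interval. The 99.7 %
# level is treated as an expanded uncertainty with a coverage factor of
# roughly k = 3 (normal distribution); values mirror the example above.

reported = 0.188          # reported blood alcohol concentration, g/dL
rel_expanded_u = 0.043    # claimed expanded uncertainty at the 99.7 % level (4.3 %)

abs_expanded_u = reported * rel_expanded_u                 # absolute uncertainty, g/dL
low, high = reported - abs_expanded_u, reported + abs_expanded_u

print(f"{reported:.3f} ± {abs_expanded_u:.3f} g/dL "
      f"-> interval {low:.3f} to {high:.3f} g/dL")
# prints: 0.188 ± 0.008 g/dL -> interval 0.180 to 0.196 g/dL
```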
When you look in these cases to see where the 4.3% uncertainty value comes from, you find that it has been assessed based solely on a) the repeated analysis of calibration and control standards in neat aqueous solution with internal standardization, and b) the manufacturer’s stated uncertainty in the certified reference ethanol standards that they provided. In the end, more than 70% of the assessed total variability in a reported blood alcohol determination is ascribed to the variability from the repeated analysis of pure standards, with the remainder attributable to the variability in the concentration of the certified reference materials, as assessed by the manufacturer.
In our opinion, this is a gross underassessment of uncertainty, especially for a method that is intended to measure a chemical substance in a biological fluid. Additionally, this uncertainty evaluation is only performed semi-annually, and the resulting value (4.3% at the 99.7% confidence level) is applied to every instrument and every blood alcohol result across the forensic laboratory system. Such a level of uncertainty can hardly be expected to be consistent for every instrument and operator in a large system of forensic laboratories, nor does it include an assessment of uncertainty arising from biological matrices.
In the documentation for uncertainty evaluation for this collection of cases, the laboratories claim that blood matrix effects are negligible and do not need to be assessed, because they were evaluated on a couple of instruments in one of the crime laboratories in 2016. To be clear, they contend that blood matrix interferences are absent on all the instruments across the forensic laboratory system because a set of tests was performed on one set of instruments at a single crime laboratory, seven years ago. Additionally, not all of the headspace gas chromatography instruments across the system are from the same manufacturer. The instruments used to perform the blood matrix interference studies in 2016 were from PerkinElmer, whereas many of the other laboratories in the system use Shimadzu gas chromatographs. Some laboratories use pressure-loop headspace systems and some use rail-based syringe autosamplers. They assume that all the instruments behave identically, which cannot be true.
Total error in an analytical method can be determined by assessing error propagation. Total error propagates as the square root of the sum of the squares of the errors from the different error sources. Detector noise is one source of error, but it is usually very minor compared to other sources. Gas chromatographs are high-precision instruments, and with internal standardization, they can provide very precise data, especially for pure standards. When the samples become more complex, such as moving from analysis of ethanol in water to analysis of ethanol in whole blood, greater variability will be imparted and must be assessed. Most analytical chemists will agree that the primary source of error in an overall method is sample preparation. Though sample preparation for blood alcohol determination is straightforward and generally involves a series of pipetting steps, it is not unreasonable to point out that pipettes can perform differently when transferring water versus whole blood, based on viscosity alone. This variability can also depend heavily on the pipetting technique used by the analyst.
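To make that propagation concrete, the sketch below combines relative standard uncertainties as u_c = sqrt(u_1^2 + u_2^2 + ...) and expands the result with k = 3. All component values are hypothetical and chosen only for illustration; the first two are picked so that the narrow budget lands near the 4.3% figure discussed above, and the added components stand in for the sample preparation, matrix, and handling contributions described here.

```python
from math import sqrt

def combined_relative_uncertainty(components):
    """Root-sum-of-squares combination of relative standard uncertainties."""
    return sqrt(sum(u ** 2 for u in components.values()))

# Hypothetical relative standard uncertainties (fractions, not percent).
# The first two are chosen so that this "narrow" budget lands near the 4.3 %
# expanded figure discussed above; none of these are laboratory data.
narrow_budget = {
    "repeat analysis of aqueous standards": 0.012,
    "certified reference material": 0.007,
}

# A fuller (still hypothetical) budget adding the kinds of contributions that
# are typically left out of the assessments we have reviewed.
fuller_budget = dict(narrow_budget)
fuller_budget.update({
    "pipetting whole blood (viscosity, technique)": 0.010,
    "matrix effects developing over time": 0.008,
    "sample handling and storage": 0.010,
})

# Share of total variance attributed to the repeat analysis of pure standards
share = narrow_budget["repeat analysis of aqueous standards"] ** 2 / sum(
    u ** 2 for u in narrow_budget.values()
)
print(f"variance share from repeat standards (narrow budget): {share:.0%}")

for name, budget in (("narrow", narrow_budget), ("fuller", fuller_budget)):
    u_c = combined_relative_uncertainty(budget)
    print(f"{name} budget: combined u = {u_c:.1%}, expanded (k = 3) = {3 * u_c:.1%}")

# variance share from repeat standards (narrow budget): 75%
# narrow budget: combined u = 1.4%, expanded (k = 3) = 4.2%
# fuller budget: combined u = 2.1%, expanded (k = 3) = 6.4%
```

The point is not the particular numbers, which are assumed, but that every neglected component inflates the combined uncertainty once it is actually propagated.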
There are other sources of uncertainty that often go unaccounted for. As mentioned, matrix effects can develop over time as instruments are used. If blank and ethanol-fortified whole blood controls are not regularly analyzed as part of quality control in a batch sequence, to verify the absence of matrix effects and the maintenance of accuracy, respectively, the forensic laboratory has no way to know whether its data are subject to additional uncertainties. The magnitude of the effects these can exert on results is also difficult to estimate. Beyond matrix effects, enormous variability can be introduced through improper sample handling and storage. That particular issue deserves its own subsequent blog post.
Overall, the level of uncertainty provided by most forensic laboratories for reported blood alcohol results has been woefully underassessed. The methodology used to estimate uncertainty does not capture changing variability among different instruments and instrument types as they are used over time. It does not capture variability associated with the preparation and measurement of complex biological samples, and it certainly does not capture variability in sample handling and storage. When these sources of error are not adequately assessed, they can only be accounted for by assuming reasonable levels of the variability possible for each. When those errors are propagated together with the limited assessment of variability from the forensic laboratory, the window of “true” values represented by a reported measurement becomes much wider, such that the accuracy of the measurement, especially relative to some threshold (for example, 0.08 g/dL), becomes very debatable. Without proper assessment of the uncertainty of a method, the accuracy of the result it provides cannot be reliably established. In many of the cases we have reviewed, forensic laboratories need to revise their procedures for uncertainty assessment to be more realistic.
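As a closing illustration, and assuming (hypothetically) normally distributed measurement error and a result reported just above the legal threshold, the sketch below shows how the chance that the “true” concentration actually lies below 0.08 g/dL grows as the expanded uncertainty widens from the claimed 4.3% to the 6.4% of the fuller, hypothetical budget sketched above.

```python
from statistics import NormalDist

# Purely illustrative: assuming normally distributed measurement error, how
# does the chance that the "true" concentration sits below a 0.08 g/dL
# threshold change as the expanded (k = 3) uncertainty widens? The 4.3 %
# value is the laboratories' claim; 6.4 % is the hypothetical fuller budget
# from the previous sketch. The reported value is also hypothetical.

reported = 0.082   # g/dL, a result reported just above the legal threshold
threshold = 0.080  # g/dL

for expanded in (0.043, 0.064):
    sigma = reported * expanded / 3  # 1-sigma from the k = 3 expanded value
    p_below = NormalDist(mu=reported, sigma=sigma).cdf(threshold)
    print(f"expanded uncertainty {expanded:.1%}: "
          f"P(true value < {threshold} g/dL) = {p_below:.0%}")

# expanded uncertainty 4.3%: P(true value < 0.08 g/dL) = 4%
# expanded uncertainty 6.4%: P(true value < 0.08 g/dL) = 13%
```

The exact percentages depend entirely on the assumed inputs; the direction of the effect is the point.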
(1) Schug, K. A.; Hildenbrand, Z. L. Accredited Forensics Laboratories Are Not Properly Validating and Controlling Their Blood Alcohol Determination Methods. LCGC N. Am. 2022, 40 (8), 370-371. DOI: 10.56530/lcgc.na.hz5482n7 (accessed 2023-04-26)
(2) Schug, K. A. Fundamentals: Full Method Validation is Still a Glaring Deficiency in Many Forensics Laboratories. LCGC N. Am. 2021, 39 (11), 200. https://www.chromatographyonline.com/view/full-method-validation-is-still-a-glaring-deficiency-in-many-forensics-laboratories (accessed 2023-04-26)
(3) Schug, K. A. Forensics, Lawyers, and Method Validation—Surprising Knowledge Gaps. The LCGC Blog. June 8, 2015. http://www.chromatographyonline.com/lcgc-blog-forensics-lawyers-and-method-validation-surprising-knowledge-gaps (accessed 2023-04-26)
(4) AAFS Standards Board, ANSI/ASB Standard 036, First Edition 2019. Standard Practices for Method Validation in Forensic Toxicology. https://www.aafs.org/sites/default/files/media/documents/036_Std_e1.pdf (accessed 2023-04-26)
(5) AAFS Standards Board, ANSI/ASB Standard 054, First Edition 2021. Standard for a Quality Control Program in Forensic Toxicology Laboratories. https://www.aafs.org/sites/default/files/media/documents/054_Std_e1.pdf (accessed 2023-04-26)