The Column interviewed Maiken Ueland, an ARC Discovery Early Career Research Fellow at the Centre for Forensic Science and deputy director of The Australian Facility for Taphonomic Experimental Research (AFTER) at the University of Technology Sydney in Australia, on her work in forensic taphonomy, where she uses analytical, biochemical, and spectroscopic techniques to conduct human post-mortem investigations.
Q. What are your main research interests, and what led you to the field of forensic taphonomy?
A: My main research interest lies at the intersection of analytical chemistry, technology, and forensic science. One of the main areas I am involved in is creating new technology and methods to find victims in mass disasters.
I loved solving puzzles as a kid and I was always very curious; when I got older I found science and realized how much it could help people. Forensic taphonomy—which is what my area is called—allows me to try to understand how the human body works, and I can use that knowledge to help law enforcement agencies. I want to be a voice for victims, and also to help prevent future crimes.
Q. What exactly is forensic taphonomy?
A: Forensic taphonomy is the study of everything that happens to an organism from the moment it dies until it is found. We use the information gained as the body breaks down to find missing persons and victims of crime. We also use the information to determine how long ago and how a person died, and to identify individuals, so that we can assist law enforcement.
Q. How do you develop chromatographic strategies for this research? What chromatographic techniques do you use in your research?
A: In my area we deal with very complex samples. For example, the volatilomes (volatile organic compound [VOC] profiles) of humans—which is something I analyze for both living and deceased individuals—are incredibly complex and can be made up of thousands of compounds. The biggest challenge is to have sufficient tools to both capture and analyze these complex profiles (odours).
For the volatile work this is mainly untargeted, which means we do not know the identity of everything we are looking for. We use built-in libraries and external standards to try to identify these compounds, but because there are so many this is a major challenge. We also do not know which compound(s) are going to be important for our research until we have done a lot of statistical analysis.
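That last step—statistically screening an untargeted peak table to see which compounds matter—can be sketched with a simple principal component analysis. The example below is illustrative only: the peak-area table, the two hypothetical sample groups, and the number of compounds are all invented, not data from this research.

```python
import numpy as np

# Illustrative synthetic peak-area table: rows = samples, columns = VOCs.
# Two hypothetical groups (e.g. early vs. late decomposition) are made to
# differ deliberately in compound 0; the other compounds are just noise.
rng = np.random.default_rng(0)
group_a = rng.normal(loc=[10.0, 5.0, 5.0, 5.0], scale=1.0, size=(6, 4))
group_b = rng.normal(loc=[2.0, 5.0, 5.0, 5.0], scale=1.0, size=(6, 4))
X = np.vstack([group_a, group_b])

# PCA via SVD on the mean-centred matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s      # sample coordinates on the principal components
loadings = Vt       # how strongly each VOC drives each component

# The VOC with the largest absolute loading on PC1 is the best
# candidate discriminating compound for follow-up identification.
candidate = int(np.argmax(np.abs(loadings[0])))
print(candidate)
```

In practice such screening would run over hundreds or thousands of aligned GC×GC peaks, and candidate markers would still need confirmation against spectral libraries and external standards, as described above.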
After years of research across the globe, we do know some of the biomarkers in the scent of decomposing human remains, which allows us to use a more targeted approach for those. However, because a lot of research has been done using pigs or with different analytical instrumentation, we are still not fully aware of all of the compounds that will be important. The goal is to use these biomarkers to develop field-based technologies.
I use more sophisticated analytical instruments such as comprehensive two-dimensional gas chromatography time-of-flight mass spectrometry (GC×GC–TOF-MS) for VOC/volatilome analysis, which offers advantages over GC–MS for complex profiles.
For a more targeted approach we have used GC–inductively coupled plasma (ICP)‑MS, as mentioned below.
I also work on targeted approaches when analyzing lipids from textiles or tissue samples. This work is focused on attempting to create better ways to determine time since death (how long someone has been deceased), and is performed mostly using GC–MS/MS.
Q. You use a range of analytical, biochemical, and spectroscopic techniques to conduct human post-mortem investigations, and in a recent paper you analyzed the decomposition of human remains in a simulated building collapse using GC×GC–TOF-MS (1). Can you talk about your findings?
A: One of my main areas involves the use of VOCs to find missing persons in mass disasters. In this paper we had two large simulated mass disaster events using six donated human bodies. One of the major findings was the degree of differential decomposition we saw upon recovery of the victims. The donors were purposely placed in single or commingled configurations, as this is something that can occur when disasters happen in densely populated areas. This was suspected to affect the decomposition, but it had not been studied until now.
The VOC profiles in the area were also examined, and we could use them to determine which stage of decomposition the majority of the remains were in. This is particularly important in the training of scent detection dogs.
The VOC profiles also add to the database and to the ongoing search for biomarkers for victim localization. The ultimate aim is to develop portable technology for field use that uses these biomarkers.
Q. What are the advantages of GC×GC–TOF-MS over existing techniques? Are there any disadvantages?
A: The primary difference between GC–MS (a common method for VOC analysis) and GC×GC–TOF-MS is the addition of a secondary column in GC×GC, which enables further separation of compounds with similar sizes and chemical properties. This additional column is generally of a different polarity than the first, which increases the separation power and reduces coelution of chemical compounds. GC×GC–TOF-MS is preferred when analyzing biological volatilomes because its enhanced separation and increased peak capacity allow more compounds to be accurately detected in complex samples. A disadvantage of GC×GC–TOF-MS is that it does not offer high-resolution (HR) MS. It is possible to get a GC×GC–HR-TOF-MS system to help improve the identification of unknowns, but this is a very costly instrument.
Q. Where do you source the human cadavers?
A: We have a body donation programme at the university. People donate their body to science and the donors we get are individuals who have specifically requested for their body to be used for taphonomic experiments. Without these absolutely amazing donations we would not be able to do any of the work we do. We are incredibly grateful for their gift.
Q. What are the challenges that you face when planning and performing these studies?
A: One of the biggest challenges is the number of variables that we are dealing with when analyzing living and deceased individuals. These variables include differences amongst individuals and sampling during different seasons and temperatures. We are also reliant on receiving donors. We need a good breadth of donors over all seasons in our specific scenarios, which means that the work may span years.
Q. In another recent study, you used GC–ICP-MS for the first time to investigate VOCs from human remains (2). What led you to consider this technique in this research?
A: This work came about through a collaboration with David Clases. We were looking for a way to accurately quantify the levels of mainly sulfur-containing components from decomposing human remains over time. Sulfur components are known to be important biomarkers present in the odour emitted from human remains. A lot of work has gone into determining the constituents of the human volatilome, but this work does not really address their concentrations. We therefore used ICP-MS after separation by GC to obtain concentrations as an improved alternative to GC–MS. The concentrations were found to be time-dependent and showed potential as forensic markers for determining post-mortem intervals.
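To make the quantification step concrete: once an element-specific detector response has been calibrated against standards of known concentration, an unknown peak area can be converted to a concentration by inverting the calibration line. The sketch below is a minimal illustration of external calibration in general; all concentrations and peak areas are invented numbers, not values from the study.

```python
import numpy as np

# Hypothetical external calibration for one sulfur compound.
# Standard concentrations (ng on-column) and instrument responses (peak areas);
# both series are invented for illustration.
conc_std = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
area_std = np.array([12.0, 1030.0, 2015.0, 4060.0, 8090.0])

# Fit a straight calibration line: area = slope * conc + intercept.
slope, intercept = np.polyfit(conc_std, area_std, 1)

# Quantify an unknown sample by inverting the calibration.
area_sample = 3000.0
conc_sample = (area_sample - intercept) / slope
print(round(conc_sample, 1))  # concentration in ng on-column
```

Repeating such a calculation on samples collected at successive time points is what allows concentrations, rather than just compound identities, to be tracked over the post-mortem interval.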
Q. The development of an electronic nose to detect human cadavers came about after the technology was initially developed to detect illegally traded wildlife. Please can you discuss how this evolved.
A: The development of the electronic nose was a collaboration between Steven Su, Shari Forbes, and Wentian Zhang. It started when colleagues noted a smell while preparing illegally traded horn samples for DNA sampling. We were therefore curious to see whether scent could be used to detect and potentially give a preliminary identification of wildlife species. This work was initially performed using GC×GC–TOF-MS and showed promising results (3). We therefore got together and decided to develop electronic nose technology to allow for fast and on-site detection of illegally traded wildlife. This work has evolved over the last six years and we are now also developing technology to detect human remains using the same principles.
Q. You have also been involved in another interesting forensic analysis project involving “reptile volatilome profiling optimization” (4). Can you elaborate on what this project involves and what your main findings were?
A: This project is part of the work on the illegal wildlife trade. We have previously analyzed various commonly encountered traded items such as ivory and rhino horns using GC×GC–TOF-MS. Because we are based in Australia, our collaborator, the Australian Museum, was also very interested in testing whether VOC analysis could be used to locate commonly traded lizard species and stop items before they are illegally traded out of Australia. Amber Brown, Greta Frankham, Barbara Stuart, and I therefore initiated a project where, instead of looking at wildlife parts, we analyzed living lizards. In the published work, the VOC collection and analysis were optimized. This was the first time that VOCs of lizards had been collected, and we therefore needed to create a new method. This work demonstrated the complex profiles of reptiles and also showed that the VOC profiles could be used to distinguish the shingleback lizard, the eastern blue tongue lizard, and the Children’s python from each other.
The aim of this work was to also standardize a volatilome collection method with limited influences from secondary or tertiary factors that may reflect the biological status of the animal (metabolism, health status). This was to ensure that in the future we select biomarkers that will encompass all of the lizards within the test species rather than the biological factors that may separate young from old or the different sexes.
Q. What analytical challenges did you encounter and how did you solve them?
A: There was no other work profiling the volatilome of lizards on this scale. We therefore had to take a broad screening approach, as we did not know which compounds to expect. To create the best possible method, we also had to optimize a number of parameters. The biggest challenge in any optimization is to optimize parameters that affect each other. This usually results in many differing combinations having to be tested. The work also demonstrated the need for improved statistical analysis when dealing with biological specimens.
References
Maiken Ueland is currently an ARC Discovery Early Career Research Fellow at the Centre for Forensic Science and deputy director of The Australian Facility for Taphonomic Experimental Research (AFTER) at the University of Technology Sydney in Australia. She is an emerging leader in the field of forensic taphonomy, where she uses analytical, biochemical, and spectroscopic techniques to conduct human post-mortem investigations. Her main research areas are human decomposition chemistry, with a special focus on markers in tissue and odour and their use in criminal investigations, including locating missing persons and estimating time since death. Her interest lies in the interface between forensic science and analytical chemistry. Her current work focuses on developing methods for the successful location and recovery of victims in mass disaster scenarios using electronic nose technology and sophisticated analytical instrumentation.