I grew up professionally in a laboratory with two HPLC systems, along with capillary and packed-column gas chromatography (GC) instruments. I moved on to be a technical support chemist for J&W Scientific, a pioneering company in capillary GC columns (now part of the Agilent family). My graduate studies were centered on capillary zone electrophoresis, and I have since contributed to microfluidic separations. I have a deep history and a real appreciation for separations science, and I have taught it at the graduate level for twenty-five years. I am currently the Chair of the ACS Subcommittee on Chromatography and Separations Chemistry (SCSC).
That being said, I have struggled to write this article. Why? I wasn’t sure. I basically write for a living, so 900 words should not be that hard. But I think I have figured it out, and this isn’t going to be a popular opinion: I do not believe in classic linear separations anymore; further, I think separations science is wildly underdeveloped and underappreciated.
I do not think it will be a popular opinion because many people have spent their lives learning and developing these techniques, and many people (myself included) have made a good living working within these industries. Suggesting they are not as useful as they could be and may need to be replaced is discomforting.
Okay, Mark, now explain yourself.
So where is this coming from? Well, I guess it gets back to seeing and thinking about the stunningly small injection volumes and fast total analysis times from microfluidics and high-speed GC. Couple that with multidimensional separations in general, then add in hybrid 3D fabrication techniques for microscale systems and gradient-style techniques (increasing concentration while separating), and one can quickly surmise that the days of long linear runs of univariate data are numbered.
Yeah, well, not so quick.
I was around for the introduction of capillary GC and capillary electrophoresis, which were expected to replace packed-column GC immediately (it took twenty years or more) and HPLC (that never happened). Things change slowly. Acceptance of new techniques is limited. If engineers and scientists have existing working solutions, they do not really need to change.
That being said, where do I think separations science will be in 10 years? 50 years?
Extending observations about injection bandwidth, separation speed, available resolution, developing gradient techniques, orthogonality of techniques, and hybrid 3D fab capabilities, well...at a pretty amazing place.
There will be “on demand” capability, where the desired target from a complex sample is isolated, concentrated, purified to homogeneity, and delivered to a detector or to a next step. The internal workings will be a highly efficient, multidimensional, programmable separations scheme built on foundational physical interactions and hard-won empirical results. All this will be held in a volumetrically small system engineered and created using hybrid 3D fabrication based on the fundamentals of the separations sciences we all know and love. It will probably look like the Borg ship from “Star Trek,” but our beloved linear separations schemes will be gone. We have witnessed similar evolutions and revolutions in various spectroscopies (nuclear magnetic resonance [NMR] is an exemplar) and especially mass spectrometry (MS). Although, admittedly, they do not resemble the Borg ship. We can get there, but it will take several axes of separation and uber-efficient sample transfer.
If we can do separations on demand, we can do what I call “whole sample information extraction”—that is, analyze everything (total information theory) in a sample with a gigantic and complex scanning mode. Imagine starting with a blood sample: first separate by size so that similar cells are gathered together, then separate on some physical principle such that all cells in a fraction are identical. Then, for each of these fractions, disrupt the cells and separate the organelles and biochemicals into pure complex structures; disrupt these in turn and separate the resulting complexes or pure materials for delivery to the detector (or multiple detectors). The now-pure, homogeneous fractions can be fed into information-rich detection systems (MS, cryo-EM, NMR, whole-genome sequencing, lipidomics, and so on).
Because this system can be operated in a dynamic mode, the various fractions can be queried against a variable input (diseased or healthy, polluted or pure, immune response or not), and AI or machine learning strategies can be employed to determine the most important elements.
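As a toy illustration only, and not any real instrument workflow: the "query fractions against a variable input" idea amounts to collecting per-fraction signals from two sample groups and ranking which fraction best discriminates them. The sketch below simulates intensities for five hypothetical fractions in healthy versus diseased specimens (the fraction names, shift, and effect-size ranking are all invented for the example) and ranks fractions by a crude standardized mean difference.

```python
import math
import random
import statistics

random.seed(0)

# Five hypothetical fractions from an imagined separation cascade.
# "F2" is constructed to differ between the two groups; the rest are noise.
fractions = ["F0", "F1", "F2", "F3", "F4"]

def specimen(shift_f2):
    """One synthetic specimen: baseline intensity ~10 per fraction,
    with fraction F2 shifted by `shift_f2` units."""
    return {f: random.gauss(10.0 + (shift_f2 if f == "F2" else 0.0), 1.0)
            for f in fractions}

healthy = [specimen(0.0) for _ in range(30)]
diseased = [specimen(4.0) for _ in range(30)]

def effect_size(f):
    """Crude standardized mean difference (Cohen's d style) for fraction f."""
    h = [s[f] for s in healthy]
    d = [s[f] for s in diseased]
    pooled = math.sqrt((statistics.pstdev(h) ** 2 + statistics.pstdev(d) ** 2) / 2)
    return abs(statistics.mean(d) - statistics.mean(h)) / pooled

# Rank fractions by how strongly they separate the two groups.
ranked = sorted(fractions, key=effect_size, reverse=True)
print(ranked[0])  # the fraction that best discriminates diseased from healthy
```

In a real system the "features" would be thousands of detector channels rather than five simulated fractions, and the ranking would come from a trained model's feature importances rather than a univariate effect size, but the logic is the same: vary the input condition, measure every fraction, and let the statistics point to what matters.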
Well, I suspect I have made myself a pariah in my own scientific field—a lot of my friends are making a very good living using and improving our current techniques. However, if my students and younger colleagues’ enthusiasm is any indicator, we will be 3D printing our way into a very interesting future based on our beloved and respected principles of separations. I look forward to a much greater impact on our society as these capabilities are realized.