This blog is a collaboration between LCGC and the American Chemical Society Analytical Division Subdivision on Chromatography and Separations Chemistry.
Separations science within microfluidics has already begun to make a significant impact across any number of fields. But these separations are often embedded within a larger system, their importance hidden or minimized.
The tiny length scales of microfluidics enable things to happen that have no equivalent on the benchtop. For instance, extremely large electric fields and field gradients can be induced that are impossible to reach on the macroscale. When resolution is coupled to the magnitude of these fields and gradients, it can skyrocket. And if the details of the technique naturally lend themselves to multi-step separations or enhance detector function, the impact of that added resolution is multiplied.
I began to wonder “by how much?” Can I estimate how much better the separations can be, and how much better the attached mass spec, cryo-EM, spectrophotometry, or NMR might work? What is the true “impact”? If I want to answer such a question, what are my metrics? What do I put on the x- and y-axes when I go to plot it? “Impact” didn’t seem to work well as an analytical unit of measure.
I can do a quick estimate of peak capacity (the number of separate pieces of information that can be gathered), coupled to the resolution and the maximum and minimum elution times or spatial extent. Microfluidics can provide ten to a thousand times better peak capacity on a single axis. Seems pretty good, right? So what? What does that actually mean? How can you understand the impact of such a thing?
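For the curious, here is a minimal sketch of that back-of-the-envelope calculation; the 4-sigma peak-width convention, the function name, and every number in it are my own illustrative assumptions, not values from any particular system:

```python
# Rough peak capacity estimate for a single separation axis.
# Assumes roughly constant peak width along the axis and unit resolution
# between adjacent peaks; all numbers below are illustrative.

def peak_capacity(t_min, t_max, sigma, resolution=1.0):
    """Approximate peak capacity: how many peaks fit between t_min and t_max
    if each peak occupies about 4*sigma*resolution of the axis."""
    peak_width = 4.0 * sigma * resolution
    return 1.0 + (t_max - t_min) / peak_width

# Benchtop-like separation: 30-min window, ~5 s peak standard deviation
print(peak_capacity(t_min=60.0, t_max=1860.0, sigma=5.0))    # ~91

# Microfluidic separation: same window, far sharper peaks (~0.05 s sigma)
print(peak_capacity(t_min=60.0, t_max=1860.0, sigma=0.05))   # ~9001
```

With those made-up numbers the microfluidic axis comes out about a hundred times better, squarely in the ten-to-a-thousand-fold range mentioned above.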
Cue my ignorance of entire fields of science.
Like most things in my career, someone else has been thinking about this type of stuff for a long, long time, and I just didn’t know about it. And of course, they did not use any words I would search for, or understand, initially. Some lucky wandering through old gas chromatography and mass spectrometry literature from the sixties and seventies,* along with a friendship with a mathematician, was the key.
A new world for me: Information Theory. Its practitioners have been at this for a long time. The original driver was (to my understanding) to estimate the amount of information that could be pushed through the trans-Atlantic cable. To understand the impact of an analytical technique, I needed to be looking to boats, cables, oceans, and electronics? Obvious, right?
Buried within their analysis was a formula relating the smallest “size” (a bit) of information to the total width of that information, which in our world is peak capacity. Further, the amplitude range can be similarly quantified for “information content” in bits (these parts are pretty easy to understand, though of course only in hindsight). The core of this is reflected in Shannon’s equation:‡

C = B log₂(1 + S/N)
Adapted to separations science, this can give the number of available “bits” of information from that system.
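Here is one way to read that adaptation, as a hedged sketch: treating each resolvable peak position as an independent channel slot, and the numbers themselves, are assumptions for illustration only.

```python
import math

# Shannon-Hartley channel capacity: C = B * log2(1 + S/N), in bits per second.
def channel_capacity(bandwidth_hz, snr):
    return bandwidth_hz * math.log2(1.0 + snr)

# Adapted to a separation: treat each resolvable peak position as a "slot"
# and the detector's signal-to-noise (dynamic range) as the amplitude axis.
def separation_bits(peak_capacity, snr):
    return peak_capacity * math.log2(1.0 + snr)

print(channel_capacity(3000, 1000))   # ~3 x 10^4 bits/s for a 3 kHz channel with S/N = 1000
print(separation_bits(9000, 1000))    # ~9 x 10^4 bits for one high-capacity separation axis
```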
Cue more ignorance.
Turns out spectroscopists, when challenged with similar problems, attacked them with concepts consistent with Shannon’s equation. Kaiser wrote two “A-page” articles in 1970 (Anal. Chem. 42 (1970) 24A) that explain these concepts as applied to identifying and quantifying all of the elements in lunar rock samples. A truly fascinating read. In those articles, he defines the “Informing Power” of an overall technique (in available bits) and the “Informing Volume” of that technique as applied to a particular problem. Importantly, he also defines the volume of information needed from the sample or the problem posed.
Now I have the tools: a way to estimate and compare the informing power of a separations science technique in terms of the available information in bits, and a construct to estimate the volume of information needed to attack a particular problem.
On this second point there is an interesting question we are playing with: how much information is contained in blood? It sounds ludicrous to ask such an open-ended question. But with the strategies enumerated by Information Theory, it can be considered. Think of cells, small bioparticles (exosomes, for example), proteins, peptides, small molecules, and organic and inorganic ions; set maximum and minimum concentrations and the smallest clinically relevant or biologically important difference in concentration, and ta-da, you have it. It’s a huge number that we are still refining, but for argument’s sake let’s set it at 10²² bits. There are similarly large numbers for any number of other systems (environmental samples, the search for life in the solar system, and so on).
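A sketch of the kind of bookkeeping involved might look like the following; the analyte classes, species counts, and concentration ranges are placeholders chosen only to show the arithmetic, not our actual working numbers, and a careful tally comes out vastly larger.

```python
import math

# Toy bookkeeping for the information volume of a complex sample.
# For each analyte class: (number of species) x log2(number of
# distinguishable concentration levels), where the level count comes from
# the max/min concentration range and the smallest meaningful difference.

def class_bits(n_species, c_max, c_min, delta_c):
    levels = (c_max - c_min) / delta_c   # distinguishable concentration steps
    return n_species * math.log2(levels)

# Placeholder classes: (species count, c_max, c_min, smallest relevant step), arbitrary units
sample = {
    "proteins and peptides":  (10_000, 1e-3, 1e-12, 1e-12),
    "small molecules/ions":   (5_000,  1e-2, 1e-9,  1e-9),
    "cells and bioparticles": (1_000,  1e6,  1.0,   1.0),
}

total = sum(class_bits(*vals) for vals in sample.values())
print(f"~{total:,.0f} bits")   # a large number already; the real tally is far larger
```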
Is it feasible to create analytical techniques to match this information volume? Well, when you start running through the numbers and the math, the simple answer is yes, but the keys are microfluidics, the multiplicative factor of having multiple axes, and the amplification from information-rich detection schemes. When we are multiplying exponents, they grow real fast. To back up this outrageous claim (yes, outrageous: I just said you can get 10²² bits of information from blood, and you didn’t even notice, did you?), the various axes of separation and detection must be compatible in time, space, configuration, and concentration.
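To make the “multiplying exponents” point concrete, here is a sketch with made-up but plausible numbers:

```python
import math

# When orthogonal separation axes and an information-rich detector are
# combined, the resolvable positions multiply and the detector bits apply
# to every one of them. All numbers are illustrative.
axis_capacities = [9_000, 500, 100]   # three coupled separation axes
detector_bits_per_spot = 20           # a detector resolving ~10^6 distinguishable states

total_spots = math.prod(axis_capacities)           # 4.5 x 10^8 resolvable positions
total_bits = total_spots * detector_bits_per_spot  # ~9 x 10^9 bits from a single run
print(f"{total_spots:.2e} spots, {total_bits:.2e} bits")
```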
We are in the process of writing an academic paper in this space. I am sure we are off a bit in detail here and there, but the numbers are so large that it won’t change the conclusion. When separations science is fully exploited and coupled with information-rich detectors, the amount of information available will challenge data storage and computational systems: the bottleneck won’t be the analytics anymore.
On to my next ignorance and thanks for reading.
* Z. Anal. Chem. 245, 84 (1969); J. Chromatogr. 172, 15 (1979); J. Chromatogr. A 79, 157 (1973); Anal. Chem. 46, 283 (1974).
‡ C: channel capacity (bits/s), or information rate; B: bandwidth (Hz), or passband bandwidth; S: received signal power (watts or V²); N: power of the noise and interference over the bandwidth (watts or V²).
About the Author
Mark A. Hayes holds an associate professorship in the School of Molecular Sciences at Arizona State University, where he serves as an active researcher, mentor, teacher, and colleague. His academic career has produced significant results across several disciplines within the analytical, clinical, biological, and physical chemistry communities that include aspects of engineering, physics, biology, and medicine. He has contributed to several different research areas, ranging from creating bionanotubules from liposomes with electric fields to establishing a framework for vastly improved microscale array-based separations, in more than 80 publications and book chapters. He has served as Program Chair, Governing Board Chair, Long Range Planning Chair, and Marketing Chair for the Federation of Analytical Chemistry and Spectroscopy Societies (FACSS) over a several-year period and was instrumental in altering the management structure and changing the name of the North American meeting to the SciX Conference. He served as president (ending in 2015) of the AES Electrophoresis Society. He has mentored 60 undergraduate and graduate students, producing 16 doctorates, while supporting them with research funds and prestigious fellowships (NSF, Kirkbright, ACS, Fulbright, FLAS, and local awards).
For more information about the subdivision, or to get involved, please visit https://acsanalytical.org/subdivisions/separations/.