This month, E-Separations Solutions' Technology Forum looks at the topic of Software and the trends and issues surrounding it. Joining us for this discussion are Jan Hruby of DataApex, Albert Barckhoff of PerkinElmer, Bob McDowall, Ph.D., of McDowall Consulting, and Ed Long, Barry Coope, and Seamus Mac Conaonaigh of Thermo Fisher Scientific, Inc.
With chromatographic and mass spectrometric instrumentation becoming faster and more sensitive than ever, a new and somewhat unforeseen problem has emerged in the field of separation science: How do we deal with the mountains of data that are being generated? To answer this question, new and better software solutions are being created every day in the marketplace, leading to some exciting breakthroughs that promise to make labs, and businesses in general, more productive.
What trends do you see emerging in Software?
Hruby: The chromatography market is subject to many regulations, so in many respects it is quite conservative. A general trend is to make instrument control simple and intuitive by applying the same user interface patterns throughout the software. A newer trend is cooperation among software companies, e.g., in sharing their instrument control modules.
Barckhoff: The biggest trend that I see in laboratory software is a movement toward more automated and application "intelligent" systems. No longer is it sufficient for analytical software to provide only generic functionality, such as chromatographic peak detection and integration. The technological advances of laboratory instrumentation over the past five to ten years have been significant. Couple that with the proliferation of more sophisticated solutions to problems such as air and water monitoring, food testing, nutraceutical evaluation, and bioterrorism surveillance, and the consequence is that instrument technology has largely outpaced the ability of traditional software applications, or even the advanced user, to keep up. This problem is exacerbated by laboratories' continuing need to reduce cost and increase efficiency, which often leads to the paradoxical situation of less experienced operators being asked to perform more and more sophisticated tasks. One of the solutions to this situation is to transfer the application knowledge and decision-making capabilities, in addition to routine instrument control and basic functionality, into the laboratory software designed for these sophisticated systems.
McDowall: A split between companies that will write software for the regulated world as well as the non-regulated one, and those that will not.
Thermo Group: This is an exciting time for analytical software as it is evolving in many interesting ways. There is a greater need for more powerful and robust analytical software -- especially programs associated with highly complex instrumentation such as liquid chromatographs, gas chromatographs, or mass spectrometers -- to provide intimate and comprehensive instrument control along with data processing. Essentially, the computer software is now the single point of access and operation for such instrumentation. Software specific to chromatographic instrumentation, the CDS (Chromatography Data System), is trending toward intimate control of all instrument components (autosampler, pump, detectors), not only to process the detector results but also to tightly and unambiguously maintain the instrument running conditions with each injection. Automating these complex instrument components within a laboratory workflow is one of the key drivers for CDS software.
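As a rough illustration of that per-injection bookkeeping, the sketch below (in Python, with hypothetical field names and the actual hardware calls elided) shows the idea of storing the running conditions alongside each injection so results and conditions stay unambiguously tied together. It is a simplified sketch, not any vendor's CDS implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

# Hypothetical sketch: keep the actual running conditions with each injection.
# Autosampler, pump, and detector driver calls are deliberately elided.
@dataclass
class InjectionRecord:
    vial: str
    flow_ml_min: float
    gradient: str
    started: str
    detector_signals: List[str] = field(default_factory=list)

def run_sequence(vials: List[str], method: dict) -> List[InjectionRecord]:
    records = []
    for vial in vials:
        # In a real CDS, pump and autosampler conditions would be set and
        # verified here before acquisition starts for this injection.
        records.append(InjectionRecord(
            vial=vial,
            flow_ml_min=method["flow_ml_min"],
            gradient=method["gradient"],
            started=datetime.now().isoformat(timespec="seconds"),
            detector_signals=list(method["detectors"]),
        ))
    return records

if __name__ == "__main__":
    method = {"flow_ml_min": 0.3,
              "gradient": "5-95% B in 10 min",
              "detectors": ["UV 254 nm", "MS TIC"]}
    for rec in run_sequence(["A1", "A2"], method):
        print(rec)
```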
Furthermore, we see broader interest among laboratories in better integrating their software operations, so that users need not struggle with mastering separate but frequently used applications and can operate them together more efficiently and directly. This trend is seen most strongly with laboratory information management systems, or LIMS.
Laboratories are demanding that a LIMS not only safely store and manage results, but also operate within lab workflows to seamlessly manage results, communicate with instruments, and provide deep-level views and presentations of analytical results. Integrating sample results from highly complex mass spectrometry instrumentation with a LIMS was once considered impractical, but we see new tools from software vendors that make this possible, and they are drawing considerable interest from laboratories.
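As a sketch of the kind of result hand-off to a LIMS described here, the snippet below builds a result payload and prepares an HTTP POST with Python's standard library. The endpoint URL and field names are illustrative assumptions, not any particular LIMS vendor's API.

```python
import json
from urllib import request

# Hypothetical LIMS endpoint and payload layout -- illustrative only.
LIMS_URL = "https://lims.example.com/api/results"

def post_ms_result(sample_id, analyte, amount, units):
    payload = {
        "sample_id": sample_id,
        "analyte": analyte,
        "amount": amount,
        "units": units,
        "instrument": "LC-MS/MS",
    }
    req = request.Request(
        LIMS_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # In a real deployment the next line would send the record:
    # request.urlopen(req)
    return payload

print(post_ms_result("S-0042", "caffeine", 12.3, "ng/mL"))
```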
What is the future of Software?
Hruby: Software of the future is not a mere tool anymore; it is a guide. For many years software was a tool that could boost the efficiency of an experienced user. Software of today and the near future is able to guide an inexperienced user through the task. The software is provided more as a service than as a product; for example, web technologies facilitate the use of online, continually developed, "self-improving" wizards and walkthroughs for typical tasks. Another key word for future software is accessibility: across device types (PC, PDA, the GC instrument itself, etc.), platforms (Windows, Mac, Linux), and users (experts and non-experts).
Barckhoff: The future of software must be to automate as much of the analytical interpretation and decision making of routine analyses as possible, and to simplify the integration of diverse systems and shared information. Because the complexity of laboratory applications has grown so rapidly, it has become increasingly difficult for the user to deal efficiently with all aspects of sample preparation, instrument control, data acquisition, analysis and interpretation, and reporting. Compounding this problem is the growing need for cross-technique and confirmational analyses for more difficult sample matrices. This then requires a user either to be cross-trained on various complicated systems, or to combine data and information from different, and often incompatible, ones. The ability of software to remove such burdens from the analyst, and to incorporate more sophisticated algorithms capable of automating difficult operations, will go a long way toward increasing laboratory efficiency and data reliability. In addition, increased commonality and uniformity between systems and their data will significantly improve the ability to combine information between those systems. The current effort by ASTM to assist in the development of the AnIML (Analytical Information Markup Language) standard for laboratory instrument data exchange is one of the major industry steps in this direction.
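AnIML is XML-based; as a loose illustration of the idea of an instrument-neutral data record, the snippet below assembles a small XML document with Python's standard library. The element names are simplified stand-ins and do not follow the actual AnIML schema.

```python
import xml.etree.ElementTree as ET

# A simplified, schema-agnostic illustration of an XML analytical record.
# Element names are stand-ins and do NOT reproduce the real AnIML schema.
doc = ET.Element("AnalyticalDocument")
ET.SubElement(doc, "Sample", name="QC-Std-1")
series = ET.SubElement(doc, "Series", name="UV 254 nm", unit="mAU")
for t, y in [(0.0, 0.1), (0.5, 3.2), (1.0, 0.4)]:
    point = ET.SubElement(series, "Point")
    point.set("time_min", str(t))
    point.set("value", str(y))

print(ET.tostring(doc, encoding="unicode"))
```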
McDowall: It must become the linking force for all instruments and processes within the laboratory. However, we only have point solutions that do not integrate well. We need laboratory middleware that provides a simple interface and transport medium for laboratory data.
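A minimal sketch of that "laboratory middleware" idea follows, assuming a simple publish/subscribe interface that any instrument, CDS, ELN, or LIMS connector could implement; the method names are illustrative, not an existing standard.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict

# One thin interface that every laboratory system connector would implement.
class LabConnector(ABC):
    @abstractmethod
    def publish(self, topic: str, record: Dict[str, Any]) -> None:
        """Send a result or status record to the middleware."""

    @abstractmethod
    def subscribe(self, topic: str, handler) -> None:
        """Register a callback for records arriving on a topic."""

class InMemoryBus(LabConnector):
    """Toy transport for demonstration; real middleware would persist and route."""
    def __init__(self):
        self._handlers = {}

    def publish(self, topic, record):
        for handler in self._handlers.get(topic, []):
            handler(record)

    def subscribe(self, topic, handler):
        self._handlers.setdefault(topic, []).append(handler)

bus = InMemoryBus()
bus.subscribe("results", lambda rec: print("LIMS received:", rec))
bus.publish("results", {"sample": "S-0042", "analyte": "caffeine", "amount": 12.3})
```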
Thermo Group: Analytical software will continue to develop and increase in value among instrument vendors and users. It will continue to make it easier to operate and automate some of the most sensitive and sophisticated technologies available, and it will continue to drive product differentiation among instrument manufacturers.
One example is in the use of browser-based applications. AJAX and the newer rich HTML technologies go a long way towards improving the user experience for web applications. New technologies such as Windows Presentation Foundation and Silverlight from Microsoft offer the promise of delivering identical user experiences within the browser and on the desktop, even extending that promise to operating systems and web browsers other than Microsoft's own. Early previews of the software appear to bear this out, and though much of the functionality is centered on media delivery, there is plenty to support business applications as well. It is, however, early days, and only time will tell if this promised shift in the way we write and deliver applications will live up to all expectations.
What is the Software application that you see growing the fastest?
Hruby: As far as we can see, the fastest-growing applications are LIMS, validation tools, and software for newer technologies such as GC-MS and LC-MS.
Barckhoff: To address the issues of dealing with multiple, diverse systems, I see the growth of intelligent laboratory integration software as an area of major focus for instrument and system providers over the coming years. Such systems will need to support the direct interfacing of hardware and software for interdisciplinary techniques and multiple vendors, as well as provide the "business logic" to make intelligent decisions about the operation of those systems and the interpretation of data from them. The unification of instrument control and laboratory information in a single system has been a strong desire of laboratory managers and directors for many years.
McDowall: Probably Electronic Lab Notebooks - moving from the research arena into manufacturing and development. However, there is a clash between CDS, ELN, and LIMS - what does each do, and who is the master and who are the slaves?
Thermo Group: One of the most profound changes occurring in liquid chromatography has been the explosive growth in "High Speed" chromatography based on sub-2-micron particle LC columns. Not only does this provide faster separations, but studies have also shown that high-speed LC instruments can produce better separations and improved resolution. Software has had to keep pace with these changes, not only by sampling data from these devices fast enough to suitably characterize them, but also by synchronizing the intricate pumps, injections, and gradients so that, under automated conditions, high-speed chromatography can generate reliable and reproducible injections.
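To make the data-rate point concrete: a common rule of thumb (an assumption here, not a figure given by the panel) is that a detector should record roughly 20 points across a peak for reliable integration, so as peaks narrow from seconds-wide to sub-second, the required sampling rate climbs accordingly.

```python
# Rough data-rate estimate for high-speed LC, assuming ~20 points per peak.
def required_sampling_rate_hz(peak_width_s: float, points_per_peak: int = 20) -> float:
    return points_per_peak / peak_width_s

for width in (10.0, 2.0, 0.5):  # conventional vs. sub-2-micron "high speed" peaks
    print(f"{width:>4} s wide peak -> >= {required_sampling_rate_hz(width):.0f} Hz")
```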
There are so many other applications that continue to grow and spread throughout the world, but one of the more impressive growth areas is the LC-MS and GC-MS markets. This growth is fueled by developing MS technology, which is making these tremendously sensitive and sophisticated devices more affordable and practical in many "routine" laboratory environments, and by widespread growth in areas such as proteomics and metabolomics, food safety, and so on.
What obstacles stand in the way of Software development?
Hruby: One of the obstacles is that there are many data formats and communication protocols used to control chromatography instruments. With the growing complexity of the instruments, together with the complexity of the software, developers often face a dilemma: the user interface must be simple, with as few options as possible, yet at the same time it must provide the tools to handle complex problems.
Barckhoff: The difficulty in the development of such a system is the sheer scope of effort. Transferring all the necessary application logic into an intelligent software system capable of autonomous control and making valid, informed decisions is a daunting task. Sophisticated "rules-based" logic sub-systems must anticipate a myriad of potential deviations from normal operation and direct the workflow of the system accordingly. Additionally, developing a universal approach to interfacing radically diverse instrument types and software systems is enormously more difficult than providing such an interface for a single class of instrument and data.
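A toy sketch of such a rules-based sub-system follows, assuming a few illustrative checks and actions; a real system would carry far more rules and tie the actions back into instrument control.

```python
# Illustrative rules that inspect an injection result and decide how the
# sequence should proceed. Thresholds and actions are assumptions, not a
# description of any vendor's system.
RULES = [
    (lambda r: r["pressure_bar"] > r["pressure_limit_bar"], "abort_sequence"),
    (lambda r: r["internal_std_area"] < 0.5 * r["expected_is_area"], "reinject_sample"),
    (lambda r: r["amount"] > r["calibration_high"], "dilute_and_reinject"),
]

def evaluate(result: dict) -> str:
    for condition, action in RULES:
        if condition(result):
            return action
    return "continue"

result = {"pressure_bar": 310, "pressure_limit_bar": 400,
          "internal_std_area": 900, "expected_is_area": 1000,
          "amount": 58.0, "calibration_high": 50.0}
print(evaluate(result))  # -> "dilute_and_reinject"
```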
McDowall: A lack of vision among users in saying what they really want: we don't have true electronic workflows for the majority of applications. If users don't ask for functions, they won't get built.
We don't have a standard for data files that allows interoperability: acquire on one vendor's system or software and interpret on another's. Users can't take the raw file and all the metadata and process them on another system. This locks you to a specific vendor, and may also lock you to their equipment - for good and bad.
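As a sketch of what such a vendor-neutral export might carry, the snippet below writes raw signal points together with the metadata needed to reprocess them elsewhere; the field names are assumptions for illustration, not an agreed standard.

```python
import json

# Minimal vendor-neutral export: raw points plus reprocessing metadata.
def export_run(path, metadata, time_min, intensity):
    record = {
        "metadata": metadata,          # column, method, acquisition settings
        "time_min": list(time_min),
        "intensity": list(intensity),
    }
    with open(path, "w", encoding="utf-8") as fh:
        json.dump(record, fh, indent=2)

export_run(
    "run_0001.json",
    {"instrument": "LC-UV", "column": "C18, 50 x 2.1 mm", "sampling_hz": 40},
    [0.00, 0.025, 0.05],
    [0.1, 3.2, 0.4],
)
```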
Thermo Group: Developing software that users find easy, intuitive, and flexible is an ongoing challenge. While software vendors are all earnestly striving to provide software that suits the needs of the end-user community, they must often also deliver that software quickly and in step with hardware releases for the targeted instrumentation. While these are not "obstacles," they are key challenges for any software vendor.
Developing software that is tested, validated, and based on extensive user input -- but at the same time developed in intimate synchronization with instrumentation and specific delivery dates -- is a constant challenge.
What was the biggest accomplishment or news during the past year for Software?
Hruby: It's hard to single out something specific from the last year. Over a longer perspective, we can see how most software can now share data. The design is more convenient, and tools like automatic updates enable us to benefit from continuous fine-tuning of the applications.
Barckhoff: One of the most significant advancements from the software development point of view is the Microsoft .NET Framework and other development tools, such as Windows Workflow Foundation and Windows Presentation Foundation. These technologies significantly enhance the developer's ability to design and deploy sophisticated applications without the burden of developing all of the low-level data and communication structures that have been required in traditional programming environments. By taking full advantage of existing operating system functionality for low-level tasks, such as distributed networking architecture or data and application sharing across an enterprise, applications designed using these techniques can easily reduce coding and development effort by 25 to 40 percent or more.
Thermo Group: It's now imperative for organizations to implement enterprise-wide document management and collaboration tools to stay competitive, increase productivity, and comply with standards. There has been a need in the industry, largely unanswered until now, for a secure, cost-effective, validated content management solution that incorporates structured data from LIMS, CDS, and other enterprise systems along with the unstructured information that supports laboratory processes, all in a completely validated solution. In the past, solutions have been incomplete, cumbersome, and expensive.