Technology Forum: Software

Article

LCGC North America

LCGC Supplements: Special Issues, 08-02-2009
Volume 27
Issue 8
Pages: 734

New and better software solutions are being created every day, leading to breakthroughs that promise to make laboratories, and businesses in general, more productive.

What trends do you see emerging in software?

McLeod: I think one of the key trends is simplicity. In the past, a lot of software, especially in the scientific industry, has been focused on delivering more and more functionality to users, to the point that using the software becomes extremely complex. This means that users are faced with two options: either use only the most basic features that the software offers, or spend a long time learning the software so that they can fully exploit the benefits offered by the more advanced functionality. This pattern can be broken by making advanced features more accessible and easier to use, which is why simplicity is a key trend.

Barckhoff: The biggest trend that I see in laboratory software is a movement toward more automated and application-"intelligent" systems. No longer is it sufficient for analytical software to provide only generic functionality, such as chromatographic peak detection and integration. The technological advances in laboratory instrumentation over the past five to ten years have been significant. Couple that with the proliferation of more sophisticated solutions to problems such as air and water monitoring, food testing, nutraceutical evaluation, and bioterrorism surveillance, and the consequence is that instrument technology has largely outpaced the ability of traditional software applications, or even the advanced user, to keep up. This problem is exacerbated by laboratories' continuing need to reduce cost and increase efficiency, which often leads to the paradoxical situation of less experienced operators being asked to perform more and more sophisticated tasks. One solution is to transfer application knowledge and decision-making capabilities, in addition to routine instrument control and basic functionality, into the laboratory software designed for these sophisticated systems.
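To make the distinction concrete, the "generic functionality" Barckhoff refers to, peak detection and integration applied without any application knowledge, can be sketched in a few lines. The following minimal Python illustration uses scipy.signal on a synthetic chromatogram; the thresholds, the two-peak test signal, and the simple trapezoidal integration are assumptions for this example, not the behavior of any particular vendor's CDS.

# Minimal sketch of generic chromatographic peak detection and integration.
# The synthetic signal and thresholds are illustrative assumptions only;
# a commercial CDS applies far more sophisticated, application-aware logic.
import numpy as np
from scipy.signal import find_peaks, peak_widths

# Synthetic chromatogram: two Gaussian peaks on a flat, noisy baseline.
time = np.linspace(0, 10, 2000)                       # retention time, minutes
signal = (50 * np.exp(-((time - 3.0) / 0.10) ** 2)
          + 20 * np.exp(-((time - 6.5) / 0.15) ** 2)
          + np.random.normal(0, 0.2, time.size))      # detector noise

# Generic peak detection: height and prominence thresholds only.
peaks, _ = find_peaks(signal, height=5, prominence=3)
widths, heights, left_ips, right_ips = peak_widths(signal, peaks, rel_height=0.99)

# Generic integration: trapezoidal area between the detected peak limits.
for peak, left, right in zip(peaks, left_ips, right_ips):
    lo, hi = int(left), int(right) + 1
    area = np.trapz(signal[lo:hi], time[lo:hi])
    print(f"peak at {time[peak]:.2f} min, area = {area:.1f}")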

Thermo Group: This is an exciting time for analytical software, as it is evolving in many interesting ways. There is a greater need for more powerful and robust analytical software, especially the programs associated with highly complex instrumentation such as liquid chromatographs, gas chromatographs, or mass spectrometers, to provide intimate and comprehensive instrument control along with data processing. Essentially, the computer software is now the single point of access and operation for such instrumentation. Software specific to chromatographic instrumentation, the CDS (chromatography data system), is trending toward intimate control of all instrument components (autosampler, pump, detectors), not only to process the detector results but also to tightly and unambiguously maintain the instrument running conditions with each injection. Automating these complex instrument components within a laboratory workflow is one of the key drivers for CDS software.
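As an illustration of what "tightly and unambiguously maintaining the instrument running conditions with each injection" might look like in software, here is a hypothetical Python sketch of a CDS-style injection record. All class and field names are invented for this example and do not represent any vendor's API.

# Hypothetical sketch of a CDS-style record that binds the instrument state
# to each injection. All names here are illustrative, not a real vendor API.
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class InstrumentConditions:
    """Snapshot of the instrument state at the moment of injection."""
    pump_flow_ml_min: float
    gradient_program: tuple          # e.g., ((0.0, 5), (10.0, 95)) as (min, %B)
    column_temp_c: float
    detector_wavelength_nm: float
    injection_volume_ul: float

@dataclass(frozen=True)
class InjectionRecord:
    """One injection, stored together with the exact conditions it ran under."""
    sample_id: str
    injected_at: datetime
    conditions: InstrumentConditions
    raw_data_path: str

# The CDS would capture these values from the instrument drivers and store them
# alongside the detector data, so every run is reproducible and auditable.
record = InjectionRecord(
    sample_id="QC-042",
    injected_at=datetime.now(),
    conditions=InstrumentConditions(1.0, ((0.0, 5), (10.0, 95)), 35.0, 254.0, 10.0),
    raw_data_path="runs/QC-042.raw",
)
print(record.sample_id, record.conditions.pump_flow_ml_min)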

What do you see in the future for software technology?

McLeod: Fortunately, I think a lot of the future will be based around open standards for data storage and for instrument control. A group called the Open Chromatography Association is already pursuing an open standard for controlling chromatography instruments, and more and more people are starting to adopt the AnIML data format. These standardization efforts will allow companies to spend more time developing the high-value features that users are looking for.

Barckhoff: The future of software must be to automate as much of the analytical interpretation and decision making of routine analyses as possible, and to simplify the integration of diverse systems and shared information. Because the complexity of laboratory applications has grown so rapidly, it has become increasingly difficult for the user to deal efficiently with all aspects of sample preparation, instrument control, data acquisition, analysis and interpretation, and reporting. Compounding this problem is the growing need for cross-technique and confirmatory analyses of more difficult sample matrices. This requires a user either to be cross-trained on various complicated systems, or to combine data and information from different, and often incompatible, ones. The ability of software to remove such burdens from the analyst, and to incorporate more sophisticated algorithms capable of automating difficult operations, will go a long way toward increasing laboratory efficiency and data reliability. In addition, increased commonality and uniformity between systems and their data will significantly improve the ability to combine information between those systems. The current effort by ASTM to assist in the development of the AnIML (Analytical Information Markup Language) standard for laboratory instrument data exchange is one of the major industry steps in this direction.
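Because AnIML is an XML-based format, combining data from otherwise incompatible systems can reduce to routine XML processing. The following Python sketch, using only the standard xml.etree library, summarizes the samples and experiment steps in an AnIML document; the element names and namespace URI are based on the draft core schema and should be treated as assumptions for illustration rather than a definitive reference.

# Minimal sketch of reading an AnIML document with Python's standard library.
# Element names and the namespace URI follow the AnIML draft core schema and
# are assumptions for illustration; consult the current ASTM schema for details.
import xml.etree.ElementTree as ET

ANIML_NS = {"animl": "urn:org:astm:animl:schema:core:draft:0.90"}

def summarize_animl(path: str) -> None:
    """Print the samples and experiment steps found in an AnIML file."""
    root = ET.parse(path).getroot()

    # Samples described in the document.
    for sample in root.findall(".//animl:SampleSet/animl:Sample", ANIML_NS):
        print("Sample:", sample.get("name"), "ID:", sample.get("sampleID"))

    # Experiment steps (for example, one chromatographic run each) and their data series.
    for step in root.findall(".//animl:ExperimentStepSet/animl:ExperimentStep", ANIML_NS):
        print("Experiment step:", step.get("name"))
        for series in step.findall(".//animl:Series", ANIML_NS):
            print("  Series:", series.get("name"), "ID:", series.get("seriesID"))

if __name__ == "__main__":
    summarize_animl("example.animl")  # hypothetical file name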

Participants

Fraser McLeod

Dionex Corporation

Albert Barckhoff

PerkinElmer

Ed Long, Barry Coope, and Seamus Mac Conaonaigh

Thermo Fisher Scientific
