This article discusses the business benefits of interfacing laboratory applications together to eliminate paper and streamline working practices.
LIMS, CDS, ELN, and so forth: these acronyms conjure up an impression of a laboratory that is well equipped and superbly efficient. What is the reality? Close at best in a minority of laboratories, and miles away in the majority. A laboratory may have purchased a number of scientific applications, but the real question is: can it implement each one effectively, get it working efficiently, and save time for its users? The problem is that many applications, once implemented, actually create more work for the users, not less.
Let us ask some (awkward) questions:
Are any analytical instruments connected to your laboratory applications?
Are applications interfaced together to avoid retyping of data?
If the answer (please don't ask to phone a friend or go 50-50) to either of these questions is no, then your laboratory has islands of automation floating serenely in an ocean of paper. Your interface is paper. This is inefficient, error prone, and slow.
The problem is that we hardly ever consider connecting these applications, because the topic is usually deemed outside the scope of each project. Lack of interfacing compounds the inefficiency of the laboratory, because all manual input into any computer system must be checked by a second person to reduce typographical errors. Interfacing an instrument to an application, or two applications to each other, is therefore a crucial factor in improving laboratory efficiency and in eliminating one of the real non-value-added jobs in the laboratory: transcription error checking. Paper interfacing is easy to achieve but it is not cheap, since it requires continual human labor for input and for checking the entries, and it is error prone because there is no automated data-extraction routine.
Let us look at the question "why interface?" in more detail. Here, we will consider the major area for the laboratory: its analytical instruments interfaced to either an electronic laboratory notebook (ELN) or a laboratory information management system (LIMS). The best way to shame you into considering interfacing more proactively is to look at a very common application in the laboratory, the chromatography data system (CDS). The majority of CDS implementations will control the gas and liquid chromatographs interfaced to it, determine the method that an instrument will run, inject the samples according to a sequence file, acquire and interpret the data, and generate the results, all in a single system. This is a great example of interfacing laboratory instrumentation to an application.
Going into more detail, what is the advantage of interfacing to the users? In a single system, you can set up and control instruments and acquire chromatographic data. Working electronically, the analyst can view the chromatograms on the screen, reintegrate where appropriate, and calculate the final results. The majority of networked CDS applications can incorporate custom calculations, which means that the analyst does not have to print out data and retype them into Excel for final reporting. When the analysts are finished, they will ask their supervisor to check their work. Instead of wading through piles of paper, the supervisor can review the results, including retention times, peak shape and resolution, standards, and quality control samples, to show that the method was under control. Furthermore, the supervisor reviews the integration, determines whether the decisions made by the tester were appropriate, and changes anything if required. All this work is audit-trailed so that any quality assurance check can easily determine whether procedures were followed. Note that this discussion makes no mention of transcription error checking: there is no transcription to check, because everything is electronic.
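To make the point about custom calculations concrete, here is a minimal sketch, in Python rather than a spreadsheet, of the kind of reportable-result calculation that could live inside the data system or LIMS. The function name, the single-point external-standard approach, and all the numbers are illustrative assumptions, not any particular product's validated calculation.

```python
from statistics import mean

def assay_percent(sample_areas, standard_areas, standard_purity,
                  sample_weight_mg, standard_weight_mg):
    """Reportable assay result (%) from replicate peak areas.

    A single-point external-standard calculation: the mean sample response
    is compared with the mean standard response and corrected for the
    weights taken and the purity of the reference standard.
    """
    response_ratio = mean(sample_areas) / mean(standard_areas)
    weight_ratio = standard_weight_mg / sample_weight_mg
    return response_ratio * weight_ratio * standard_purity

# Hypothetical replicate injections (peak areas in arbitrary units)
result = assay_percent(
    sample_areas=[105234, 104987, 105410],
    standard_areas=[104500, 104720],
    standard_purity=99.8,        # % purity of the reference standard
    sample_weight_mg=25.1,
    standard_weight_mg=25.0,
)
print(f"Reportable result: {result:.1f}%")
```

Once such a calculation is implemented and validated inside the application, the spreadsheet, the manual data entry, and the second-person transcription check all disappear.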
So, if we can do it for a CDS, why not for a LIMS? Is it too much work, too complex, or can it wait until a later phase of the project? The reasons might be all of the above; however, that misses the point. Implementing a LIMS requires immediate benefit for the users and payback for the organization that is paying for your incompetence in not interfacing instruments into the process.
Now, let us look at how we interface laboratory applications together. Similar to the CDS discussion earlier, electronic transfer of data or results between applications should be fast, involve minimal human input, and be error-free. Ideally, a user should push a button and the data file goes to the required system. Again, the same benefits are seen as before, with the elimination of transcription error checking. But here is where the CDS argument can break down: many of these systems are not interfaced to other applications in the laboratory, notably LIMS. In this instance, results are ferried from the CDS to the LIMS by, you guessed it, paper and manual input.
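As an illustration of what "push a button and the data file goes to the required system" might look like in practice, here is a minimal sketch that reads a results export from the CDS and posts each record to a LIMS import endpoint. The URL, field names, and token are hypothetical assumptions for illustration, not any vendor's actual API.

```python
import csv
import json
from urllib import request

LIMS_IMPORT_URL = "https://lims.example.com/api/results"  # hypothetical endpoint

def load_cds_results(csv_path):
    """Read a results export from the CDS (sample_id, analyte, result, units)."""
    with open(csv_path, newline="") as fh:
        return list(csv.DictReader(fh))

def post_results_to_lims(results, api_token):
    """Send each result record to the LIMS so that nothing is retyped by hand."""
    for record in results:
        payload = json.dumps(record).encode("utf-8")
        req = request.Request(
            LIMS_IMPORT_URL,
            data=payload,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {api_token}",
            },
        )
        with request.urlopen(req) as resp:
            if resp.status not in (200, 201):
                raise RuntimeError(f"LIMS rejected result for {record.get('sample_id')}")

if __name__ == "__main__":
    post_results_to_lims(load_cds_results("run_results.csv"), api_token="...")
```

Whether the exchange is a web service call, a validated file drop, or a vendor interface matters less than the principle: the transfer is tested once, validated once, and then repeated without human retyping or transcription checking.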
So, if your laboratory is going to invest in nice shiny new informatics applications, you had better make sure that they are interfaced not only to the instruments that generate the data but also to other applications inside and outside the laboratory. However, integration of instruments and applications also raises a number of questions. First, which is the master application and which are the slaves?
Imagine the following situation. You have logged all the samples into the LIMS, put an analytical run on your CDS, and some bozo wanders into the laboratory waving a sample about and asking for a rapid analysis. You can prepare and add the sample to the CDS sequence immediately. However, what is the impact of putting an extra sample into a sequence on the other laboratory applications? There is no record in the LIMS of this sample having entered the laboratory, so what happens if a result is transferred to the LIMS for a sample that was never registered? You had better find out during the implementation rather than when it happens for real. Some LIMS can enter the sample information retrospectively; others cannot. It could be that laboratory processes need to be formalized so that the sample information must always be entered into the LIMS first and then downloaded into the CDS, with no other way of working permitted. This is sometimes a trade-off between flexible working and the constraints of an electronic process, but the latter reinforces the hierarchy of the informatics applications: the infinite flexibility, and occasional noncompliance, of a paper process versus the constraints of a defined electronic process.
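To illustrate the kind of decision that has to be settled during implementation, here is a minimal sketch of a guard that decides what happens when a result arrives for a sample the LIMS has never seen. The registry structure, function names, and sample identifiers are hypothetical, and whether retrospective registration is allowed depends entirely on the LIMS in question.

```python
class UnregisteredSampleError(Exception):
    """Raised when a CDS result arrives for a sample the LIMS has not logged."""

def accept_result(lims_registry, sample_id, result, allow_retrospective=False):
    """Decide what happens when a result arrives for a sample.

    If the sample was logged into the LIMS first (the formalized process),
    the result is attached to it. Otherwise the outcome depends on whether
    this LIMS permits retrospective registration.
    """
    if sample_id in lims_registry:
        lims_registry[sample_id]["result"] = result
    elif allow_retrospective:
        # Some LIMS can log the sample after the fact; record it, then attach.
        lims_registry[sample_id] = {"registered": "retrospective", "result": result}
    else:
        raise UnregisteredSampleError(
            f"Sample {sample_id} has no LIMS record; log it before transferring results"
        )

# The walk-in "rapid analysis" sample from the example above:
registry = {"S-1001": {"registered": "prospective"}}
accept_result(registry, "S-1001", 99.2)                              # pre-logged sample
accept_result(registry, "S-9999", 98.7, allow_retrospective=True)    # walk-in sample
```

Working out this logic on paper during implementation is far cheaper than discovering it when the first unregistered result bounces between systems.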
Figure 1 illustrates the points I have been making in the previous discussion. The upper flow depicts a current paper-based process and the lower one an optimized electronic process for a laboratory. No new instruments or applications have been implemented in the second flow — the only difference between the two flows is that the process has been redesigned and optimized for electronic working.
Figure 1: Current paper-based and optimized electronic laboratory processes
Take 1: starting at the top, the current process begins at the analytical instrument used to acquire and process the sample data. The analyst has to enter the information about the samples for assay manually, typically from a paper output from the LIMS. After the data have been acquired and processed, a report is printed from the instrument, because the validated calculations have not been implemented in either the LIMS or the instrument's data system. This is error number one: it begins, or more probably perpetuates, the laboratory's dependence upon paper. The paper output from the instrument is reviewed, the analyst identifies the appropriate data, and these are entered manually into a spreadsheet for calculation of the reportable results; the entries then need to be checked for transcription errors by a second person. Why use a spreadsheet? Well, it is widely available, easy to use, and seemingly quicker than implementing the calculations in the instrument data system or the LIMS. The spreadsheet calculations are printed out, the reportable results are typed into the LIMS and checked again by a second person, and at last the final report is available in the LIMS. Yes, it is on paper. This is error number two: there is no interface between the instrument data system and the LIMS; the connection is manual, on paper. Note that in this discussion I have not mentioned the initialing or signing of printouts and laboratory notebooks or their equivalents, which slows the process down further.
Take 2: look at the optimized process at the bottom of Figure 1. Visually, it is simpler, as there are fewer process steps (boxes) and fewer arrows to consider. Should it be faster? Yes, it should, and let us look at the details. The process starts from the LIMS with a download of the sample information for analysis to the instrument data system. This is a tested and validated step and therefore needs no transcription checking by a second individual. After the analysis, the required calculations, which have been implemented either in the instrument data system or in the LIMS and tested to show that they work as required, are applied. The spreadsheet has therefore been eliminated along with the manual input of data and the associated transcription checking. In its place, the data system transfers either the reportable results to the LIMS or the data for the LIMS to calculate the final results. Altogether, the optimized process is quicker and simpler.
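A minimal sketch of the first step of that optimized flow, the download of sample information from the LIMS to the data system, is shown below, assuming a simple file-based exchange of worklists and sequences. The folder names, column headings, and method identifier are illustrative assumptions, not a specific product's interface.

```python
import csv
from pathlib import Path

LIMS_EXPORT = Path("lims_outbox/worklist.csv")    # samples already logged in the LIMS
CDS_SEQUENCE = Path("cds_inbox/sequence.csv")     # sequence file the CDS will execute

def build_sequence_from_worklist(worklist_path, sequence_path, method="ASSAY_HPLC_01"):
    """Turn the LIMS worklist into a CDS sequence so that nothing is retyped.

    Each logged sample becomes one injection line carrying the sample id,
    vial position, and the chromatographic method to be run.
    """
    sequence_path.parent.mkdir(parents=True, exist_ok=True)
    with open(worklist_path, newline="") as src, open(sequence_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(["line", "sample_id", "vial", "method"])
        for line_no, row in enumerate(csv.DictReader(src), start=1):
            writer.writerow([line_no, row["sample_id"], row["vial"], method])

if LIMS_EXPORT.exists():
    build_sequence_from_worklist(LIMS_EXPORT, CDS_SEQUENCE)
```

Because the step is automated and validated once, every subsequent run inherits the benefit: no retyping, no second-person check, and a sequence that always matches what the LIMS expects back.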
Therefore, I suggest that instead of implementing point informatics solutions in your laboratory, you design an electronic environment. This should eliminate transcription error checks and make transfer between instruments and software applications efficient and rapid. Developing an electronic environment is not a one-off exercise but a continuous project with different phases in which single applications are implemented, typically one at a time. The aim should be that each successive implementation brings its own advantages while also leveraging the benefits of the applications already in operation.
R.D.McDowall
McDowall Consulting
Please direct correspondence to rdmcdowall@btconnect.com