This month's Technology Forum looks at the topic of Software/LIMS and the trends and issues surrounding it. Joining us for this discussion are Kim Shah, Director of Marketing and Business Development for Informatics at Thermo Fisher Scientific; John Helfrich, Director of GMP Lab Automation Programs at VelQuest Corporation; and Donna Lococo, Enterprise Software Solutions Product Leader at PerkinElmer Life and Analytical Sciences.
What trends do you see emerging in Software/LIMS?
Shah: The strategic and business importance of the laboratory has evolved tremendously over the past 25+ years. Since the advent of LIMS, laboratory staff have shifted their expertise from manual, time-consuming activities such as maintaining paper records, instrument attenuation, and cutting/weighing chromatograms to more sophisticated data analysis that drives critical decisions. In addition, the ability to impact strategic corporate objectives such as decreasing product time to market and ensuring regulatory compliance has become a crucial function of the lab. As LIMS (and labs) have evolved, the importance of system integration and automation has drastically increased, enabling labs to be at the forefront of process harmonization, data centralization, and minimizing human error.
Helfrich: With the advent of "LIMS," many labs now have an electronic filing cabinet to process and store data; however, much of the expected automation still rests at the administrative level and not at the experimental level.
Lococo: Early commercial LIMS were much simpler, and were built around the understanding that laboratories were businesses whose products involved analytical data. The focus at that time was on making sure that samples didn’t get lost, that they were processed correctly, that no data were lost, and that final reports (and invoices) went to customers. The LIMS platform was often proprietary, and the need for LIMS to communicate directly with systems outside of the laboratory was fairly limited.
By the late 1980s, LIMS had become more flexible, and had evolved to include audit trails, calculations, fairly sophisticated limit checking, and management of quality control data. Around this time, another very significant development was LIMS vendors' migration toward commercial relational database platforms. Not only were more tools available for use within the laboratory, but for the first time data could be queried and reported based on criteria or relationships that the LIMS vendor never dreamed of. The relational database products could also handle more active records than the earlier systems. As a result, LIMS took on a new role as a data repository that could be used to examine the time dimension of laboratory processes. It also became more feasible at this point to integrate LIMS with other systems in the company, especially if the same database platform was in use.
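The ad-hoc querying that relational platforms enabled can be illustrated with a minimal sketch. This uses an in-memory SQLite database and an entirely hypothetical sample table; real LIMS schemas are far richer, but the principle is the same: once the data sit in a relational store, questions the vendor never anticipated can be answered with a query.

```python
import sqlite3

# Hypothetical LIMS-style schema: samples with a status and received date.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE samples (
        sample_id TEXT PRIMARY KEY,
        product   TEXT,
        status    TEXT,
        received  TEXT
    )
""")
conn.executemany(
    "INSERT INTO samples VALUES (?, ?, ?, ?)",
    [
        ("S-001", "Aspirin",   "complete",    "2007-01-10"),
        ("S-002", "Aspirin",   "in_progress", "2007-01-12"),
        ("S-003", "Ibuprofen", "complete",    "2007-01-15"),
    ],
)

# An ad-hoc question a flat-file system could not easily answer:
# how many completed samples per product?
rows = conn.execute(
    "SELECT product, COUNT(*) FROM samples "
    "WHERE status = 'complete' GROUP BY product ORDER BY product"
).fetchall()
print(rows)  # [('Aspirin', 1), ('Ibuprofen', 1)]
```

The same store supports reporting across the time dimension (e.g. grouping by received date), which is what turned LIMS into a historical repository rather than just a sample tracker.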
The next wave of change was driven by a tension between the dramatic increase in data content available from analytical instrumentation and the need to comply with the regulatory environment within a given industry. LIMS have been very useful in helping achieve a balance, and LIMS providers have continued to improve their products to ensure that the regulatory functions enhance, rather than detract from, their overall solution.
What is the future of Software/LIMS?
Shah: Many companies are focused on better collaboration between scientists to minimize rework and to take advantage of a global workforce. Informatics as a whole can help scientists to collaborate and make better decisions. Instrumentation has improved drastically over the past twenty years and it will continue to do so moving forward. The question is how to turn that data into information. That is why Informatics is key to increasing the productivity of the laboratory.
Laboratories are generating more and more information as analytical techniques become more sophisticated. This means that there is increasing pressure on the laboratory to automate and integrate systems in order to make use of the additional data.
Helfrich: The concept of LIMS in many lab managers' minds is to totally automate the electronic capture and movement of data directly from the experimental lab bench. In practice, this process is highly customized in LIMS and requires significant investment in on-staff programmers. In the QC lab environment, many innovative companies (particularly pharmaceutical companies) are coupling the administrative LIMS functions with an electronic notebook system that is purpose-built for cGMP method execution. The integration brings all instruments into the system to capture data directly to the database and eliminate all paperwork for the analyst. The review process is also streamlined, with the ability to view a data "dashboard" with automated compliance checks on all the instruments, supplies, and expected data thresholds for QC.
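The automated compliance checks described above amount to comparing each captured result against its QC specification and surfacing exceptions to the reviewer. A minimal sketch, with hypothetical specification limits and result values (a real system would pull both from the LIMS database and record the outcome in an audit trail):

```python
# Hypothetical QC specifications: test name -> (low limit, high limit).
SPECS = {
    "assay_pct":    (98.0, 102.0),
    "moisture_pct": (0.0, 0.5),
}

def check_result(test: str, value: float) -> str:
    """Flag a single instrument result against its specification."""
    low, high = SPECS[test]
    return "PASS" if low <= value <= high else "OUT OF SPEC"

# Results as they might arrive from instrument capture (hypothetical values).
results = {"assay_pct": 99.4, "moisture_pct": 0.8}

# The "dashboard" view: every test with its compliance status at a glance.
dashboard = {test: check_result(test, value) for test, value in results.items()}
print(dashboard)  # {'assay_pct': 'PASS', 'moisture_pct': 'OUT OF SPEC'}
```

The value for the reviewer is that only the out-of-spec entries need close attention; the passing results have already been screened by rule.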
Lococo: LIMS are increasingly finding ways to integrate automation capabilities into their core, and that becomes a means of using LIMS in a more proactive manner. Workflow control has been an important factor in expanding LIMS beyond its traditional role as a reports generator and historical data repository. The notifications sent by LIMS contain context-sensitive links that allow a more efficient overall process.
I’d like to see more attention given to using LIMS to actually identify and track the most critical indicators of laboratory performance. LIMS do collect a great deal of data about many laboratory processes, and most of them use database platforms that support a wide variety of analysis tools. What seems to be missing is an application that ties these together with some domain expertise. I would like to see not only more use of dashboards and other visual presentations (in real time, of course!) but also some application of the principles of experimental design to determine which indicators are truly meaningful. The pretty graphics aren’t really that impressive if they present the wrong parameters.
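One simple form the suggested performance tracking could take is a control-chart-style check on an indicator the LIMS already records, such as sample turnaround time. This is a hedged sketch, not a prescription; the turnaround values (in hours) are hypothetical, and choosing which indicator to chart is exactly the domain-expertise question raised above.

```python
import statistics

# Hypothetical turnaround times (hours) pulled from LIMS records.
turnaround_hours = [24.0, 26.5, 23.0, 25.5, 24.5, 27.0, 23.5]

# Classic control limits: mean +/- 3 standard deviations.
mean = statistics.mean(turnaround_hours)
sigma = statistics.stdev(turnaround_hours)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma

# Values outside the limits would warrant investigation.
flagged = [t for t in turnaround_hours if not lcl <= t <= ucl]
print(round(mean, 2), flagged)  # 24.86 []
```

A dashboard built on the wrong indicator still looks impressive, which is why the selection step matters more than the plotting step.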
What is the Software/LIMS application area that you see growing the fastest?
Shah: Because of the burden of validation and implementation, many companies are averse to adopting new technology. Software vendors have to demonstrate that we are working toward more purpose-built solutions with rich functionality already baked into the base product, so customers don’t need to go through costly, time-consuming, and risky customizations. Vendors must build solutions on open standards and Service-Oriented Architecture (SOA) that are designed to be quickly and easily implemented. Staying abreast of the latest technology and creating a long-term vision to future-proof a customer’s investment are burdens LIMS vendors must incur, but it is what customers demand and need.
By providing purpose-built LIMS solutions for specific applications that are married to our horizontal integration technologies, we can provide a holistic view across the entire drug development lifecycle, while still allowing fast implementation and adoption times for each of our solutions.
Helfrich: The key obstacle for traditional LIMS implementation is the custom coding needed when attempting to couple the LIMS with actual lab workflows. The idea is to eliminate the paper lab notebook, log books, and data binders. This process is still highly custom in LIMS implementations today.
Lococo: Too many systems still rely on manual results entry! We’ve done a good job in understanding how business rules need to be reflected in LIMS, and the automation of workflow and data publication has made great progress. The next great challenge for us is to get closer to the instrument, and really look at how to not only streamline, but truly automate the communication between instrument data systems and LIMS. This is admittedly an idealistic stance, and obviously not all instruments or data systems can support this type of approach. However, it does seem to me to be the most challenging obstacle for LIMS, and our primary bottleneck.
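The instrument-to-LIMS automation described above typically starts with parsing the instrument data system's export and mapping its fields onto LIMS result records. A minimal sketch, assuming a hypothetical CSV export format; real integrations must handle each vendor's actual file layout (or API), which is precisely why this remains the bottleneck:

```python
import csv
import io

# Hypothetical instrument export: one CSV row per measured result.
raw_export = """sample_id,analyte,response
S-001,caffeine,1523.4
S-002,caffeine,1498.7
"""

def parse_export(text: str) -> list[dict]:
    """Map instrument CSV rows onto LIMS-style result records."""
    reader = csv.DictReader(io.StringIO(text))
    return [
        {
            "sample": row["sample_id"],
            "test": row["analyte"],
            "result": float(row["response"]),  # numeric, not retyped by hand
        }
        for row in reader
    ]

records = parse_export(raw_export)
print(records[0])  # {'sample': 'S-001', 'test': 'caffeine', 'result': 1523.4}
```

Even this much eliminates manual transcription for one file format; the hard part is doing it reliably across every instrument and data system in the lab.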
What obstacles stand in the way of Software/LIMS development?
Shah: In the past, information was siloed to a particular laboratory and very little collaboration and sharing was possible. By deploying LIMS, laboratories began to centralize information and work globally. Today, the growing need to partner and outsource work both domestically and abroad has never been more important. LIMS can not only make information available to the organization as a whole, it can also ensure that good laboratory practices are followed.
Forward-thinking LIMS vendors must form partnerships with vendors such as Microsoft, Oracle, and others to ensure that information sharing through LIMS will be enabled on a global scale across both multiple locations and disciplines. LIMS data will, in the future, be part of the informatics landscape of every business, and used to enable business decisions across the enterprise.
Helfrich: Most LIMS are developed for specific application areas, including R&D, process development, and quality control. For QC, the ability to couple a LIMS with an ELN (electronic lab notebook) provides a uniform informatics platform for quality operations on a global scale. Not only are data captured and stored, but the QC analyst's procedure (method execution) is performed in a consistent and reliable fashion on a lab-to-lab and country-to-country basis.
Lococo: LIMS enforce a consistent data model within an organization, and that is the first step whether you are planning to share data with a lab across town, or halfway around the world. But more importantly, LIMS can participate in any number of information sharing models.