LCGC North America
This article describes a software solution for automating the chromatographic method validation process starting from experimental planning, data acquisition and processing, through final report generation in a seamless manner. All experimental planning and calculations are accomplished within the chromatography data software and, thus, are structurally validated, secure, and audit trailed. Highlights of the software are provided, including benefits to the analyst. The analysis of important method validation characteristics such as linearity, accuracy, and precision is automated. These characteristics and their acceptance criteria can be captured in a method template, which adheres to the company's standard operating procedure. This template method can then be used repeatedly by other scientists in the organization, hence, eliminating the need to create a new experimental plan each time a new validation is conducted.
Guest Authors
Patrick Lukulay and James Morgado
Analytical method validation is the process of establishing through experimentation that a method is suitable for its intended use (1). It is an important regulatory requirement for pharmaceutical analysis because it provides documented evidence and assurance that methods are suitable for the determination of identity, quality, strength, purity, and potency of drug substances and drug products. In October 1994, the International Conference on Harmonization (ICH) established the characteristics to be assessed during validation. These guidelines are also reflected in both the United States Pharmacopeia (USP) and U.S. Food and Drug Administration (FDA) guidance documents on method validation. ICH was meant to bridge differences between the compendia and regulators of the European Community, Japan, and the United States (1). The initial guidance was not meant to provide direction on how to conduct method validation. In 1996, a more detailed guidance document was drafted that provided direction on how to validate each method characteristic: the minimum number of samples and injection replicates, the data to collect, and the statistics to include in the validation report (2). Additional direction was provided on what conditions would warrant revalidation.
The validation process can be a time-consuming and repetitive task, consisting of several sequential steps. These steps include planning (protocol generation), sample preparation and experiment setup, data acquisition, calculation of results, and report generation. Such a process lends itself well to automation, benefiting the analytical scientist by minimizing the tedium of validation. Standardizing the validation work process and eliminating errors are added benefits. Because all regulatory stipulations on method validation are general guidance, they provide several options on how to establish validation; there is no standard way of conducting validations. To standardize the validation process, many companies have adopted standard operating procedures (SOPs) that stipulate how validation should be conducted in their work environment. These attempts to simplify and standardize the process commonly include the development of spreadsheets to help in the planning of validation experiments and to allow the calculation of descriptive statistics. This type of approach is problematic because the spreadsheet environment is not fully compliant: extra work is required to verify that the data transfer from the chromatography data software to the spreadsheets is free of transcription errors and that the calculations are accurate. Additionally, in producing the final report, data must be compiled tediously from various software applications, including MS Word, MS Excel, and the chromatography data software. These shortcomings are among the chief limitations of employing in-house spreadsheet solutions.
This article describes a software solution for automating the method validation process starting from experimental planning through final report generation in a seamless manner. It highlights important considerations and challenges in automating method validation and provides solutions to those challenges, resulting in a software product that is flexible and simple to use in different work environments. The following validation characteristics can be assessed within the software: specificity, linearity, accuracy, precision (repeatability, intermediate precision, and reproducibility), limit of quantitation, limit of detection, method robustness, solution stability, and filter validation.
Current Solutions
There are currently a number of commercially available software products that assist in the automation of the analytical method validation process. Many of these solutions address the limitations of earlier approaches in which spreadsheets were used. A few of these solutions include Fusion AE (S-Matrix Corp., Eureka, California), Validation Manager (VWR International, West Chester, Pennsylvania), and ChemStation Plus Method Validation Pack (Agilent Technologies, Inc., Palo Alto, California). The Fusion AE product (3) is a modular informatics platform for automated experimentation. The product includes application modules that allow for high performance liquid chromatography (HPLC) method development as well as HPLC and gas chromatography method validation. The product also supports plug-in data exchange modules that provide file-less exchange of designs and results in a fully audited and compliant environment for the following instrument data systems: TotalChrom (PerkinElmer Instruments, Shelton, Connecticut), Galaxie (Varian, Palo Alto, California), and Millennium32 and Empower (Waters Corp., Milford, Massachusetts). The Fusion product also supports manual data entry through secure means. The ChemStation Plus Method Validation Pack product is an add-on module for the base ChemStation software package. It works in conjunction with the ChemStore add-on module. It allows for the automated validation of methods, including planning, test execution, and final reporting. It also allows for the use of imported support data as well as manual data entry. Validation Manager is a stand-alone product that allows for the execution, assembly, analysis, and reporting of validation characteristics for analytical methodology. It supports direct data import from the LaChrom HPLC System Manager Software (Hitachi High Technologies of North America, Schaumburg, Illinois) as well as from Excel and Word based table structures. 
Each of these software solutions meets the requirements for 21 CFR Part 11 compliance as well as the ICH, EU, USP, and FDA guidelines. Other solutions, for example, ELSA32 (Waters Corp.), are on the market and might or might not be equivalent to the previously mentioned products. Exactly how any of these solutions meets a corporate laboratory's requirements would have to be determined on an individual basis. An article comparing and contrasting the functionality and ease of use of ELSA32 and Validation Manager has been published previously by Flarakos (4). The version of Validation Manager is not mentioned in that article; as such, the currently available version of Validation Manager might differ from the one presented therein.
Desirable Features of Automated Method Validation Software
Because of the disparity in the ways various R&D business lines conduct method validation using different tools with different capabilities, Pfizer Global R&D set out to harmonize method validation across its R&D units and manufacturing sites. To this end, a team of scientists developed a set of user requirements that defined the ideal automated method validation solution, from protocol preparation to result generation and reporting. A vendor selection was undertaken to identify a company suitable to develop a global validation solution. Beyond Pfizer's own requirements, any vendor developing such a product would need to ensure that the software also meets the needs of other companies in the pharmaceutical industry and other relevant industries. These requirements define the desirable features of an ideal automated method validation software solution.
Implementing these features presented several challenges, as discussed in the following sections.
Challenges and Considerations for Automated Method Validation Software
Flexibility: Because method validation is conducted using various approaches in the pharmaceutical industry, commercial software has to be capable of accommodating different users' needs. A solution is needed that is flexible and simple, yet comprehensive enough to allow various SOPs to be implemented seamlessly. This challenge was addressed by designing the user interface such that there is an association between the experimental design (sample injections, number of replicate injections, number of levels) and each validation element. In this way, it is clear to the program which injections are used for which validation elements. Additionally, injections can be used for multiple validation elements. For example, injections used to validate linearity can also be used to assess precision and accuracy. The scientist establishes these associations by checking boxes in the interface, which eliminates the need for layers of queries to determine how to set up the injection sequence in the sample set. Another benefit of this approach is that one sample set (as opposed to separate sample sets) can be used to set up injection sequences for multiple validation elements. The experimental design set up in this way can represent each company's SOP and is saved as the recommended protocol to follow. This inherent flexibility allows various SOPs to be implemented and saved as templates for the general use of scientists within a given business line.
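The association concept described above can be illustrated with a minimal sketch. The data structures and names here are hypothetical, not the actual product's design: each injection in a single sample set simply carries a set of flags indicating which validation elements it contributes to, so one injection sequence can serve linearity, accuracy, and precision at once.

```python
# Hypothetical illustration of injection-to-element association:
# each injection is tagged with the validation elements it serves,
# so one sample set covers multiple validation elements.

injections = [
    {"vial": "STD-50",  "level": 50,  "elements": {"linearity", "accuracy"}},
    {"vial": "STD-100", "level": 100, "elements": {"linearity", "accuracy", "precision"}},
    {"vial": "STD-100", "level": 100, "elements": {"precision"}},
    {"vial": "STD-150", "level": 150, "elements": {"linearity", "accuracy"}},
]

def injections_for(element):
    """Return the injections associated with a given validation element."""
    return [inj for inj in injections if element in inj["elements"]]

# The 100% level injections are reused for precision without a second sample set.
print(len(injections_for("linearity")))   # 3
print(len(injections_for("precision")))   # 2
```

In this scheme, checking an association box in the interface corresponds to adding an element name to an injection's tag set; no separate query layer is needed to build the injection sequence.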
Validation SOP management: Because of the enormous amount of information that is gathered during a validation exercise (including acceptance criteria, number of replicates that vary by the stage of validation, type of compound–drug product or drug substance, and the method type), it is necessary to manage the information effectively for proper reporting to be accomplished. Moreover, because it is this type of information that constitutes a company's SOP, it is desirable to save it as part of a template so that scientists would not have to input the same information every time a validation is conducted. With a template, future validation experiments for a drug compound at a given stage of development, for a given compound type and method type, can be uploaded with all the accompanying acceptance criteria, replicate and level information, and so forth. This undoubtedly leads to enhanced business efficiency.
This consideration was addressed by implementing a configurable, template-based system that captures the validation requirements found in an organization's SOP. The system is hierarchical, organized by "Compound Type," "Method Type," and "Development Phase." A senior analyst preconfigures the system to implement an organization's SOPs: standard experiment designs are saved as templates that include the sample sequences, process parameters, and acceptance criteria for the respective validation design. An analyst simply executes a template and then processes the raw data for chromatographic results and validation results. Because the system is preconfigured, all validation results are evaluated against the correct acceptance criteria for the respective validation. Implementing a system in this manner allows for the rational management of analytical data and validation results.
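The hierarchical template lookup described above can be sketched as follows. The field names, values, and schema are assumptions for illustration only, not the product's actual configuration format: a template is keyed by compound type, method type, and development phase, and bundles the experiment design with its acceptance criteria.

```python
# Illustrative sketch (hypothetical schema, not the product's actual format):
# validation templates keyed hierarchically by compound type, method type,
# and development phase, bundling design parameters and acceptance criteria.

templates = {
    ("drug product", "assay", "phase 3"): {
        "levels": [50, 75, 100, 125, 150],   # % of nominal concentration
        "replicates_per_level": 3,
        "acceptance": {"r_squared_min": 0.999, "recovery_pct": (98.0, 102.0)},
    },
}

def get_template(compound_type, method_type, phase):
    """Fetch the SOP-derived template for this validation context."""
    return templates[(compound_type, method_type, phase)]

tpl = get_template("drug product", "assay", "phase 3")
print(tpl["replicates_per_level"])  # 3
```

A single lookup hands the analyst the full preconfigured design, so the same acceptance criteria are applied every time that compound type, method type, and phase combination is validated.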
Development challenges: Designing such a solution is not a trivial endeavor. The software flexibility required by the varied approaches to method validation creates many challenges. The design must accommodate the different approaches while adhering to the primary goals of analytical software: secure storage of metadata and results, granular control of user access in all areas of the software through a full set of user privileges, calculation of results in a structurally validated environment, 21 CFR Part 11 compliance, and a full-featured yet intuitive user interface. Areas in which different approaches to method validation cause particular development challenges include the following.
Data checking: Data checks should occur to provide assurance that both raw and processed data adhere to the correct number of levels and the correct number of sample replicates. The validation SOP can define the number of levels either as an absolute number or as a minimum; intricate software logic is required to accommodate both cases. Furthermore, replicates can be defined as replicate injections from the same sample vial, as replicate sample preparations, or sometimes as both. A software solution must logically account for the nuances caused by these differences and document such details accurately. It is of utmost importance that software handle these differences properly during the statistical analysis of such data.
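A minimal sketch of such a data check follows. The helper function and its message format are hypothetical, not the product's implementation: it verifies that acquired data contain the required number of concentration levels and replicates per level, where the SOP may state the level count either as an exact requirement or as a minimum.

```python
# Hypothetical data-check sketch: verify level count (exact or minimum)
# and replicate count per level against the SOP's requirements.

from collections import Counter

def check_design(observed_levels, required_levels, replicates, exact=True):
    """Return a list of problems found; an empty list means the check passed."""
    problems = []
    counts = Counter(observed_levels)
    n_levels = len(counts)
    if exact and n_levels != required_levels:
        problems.append(f"expected exactly {required_levels} levels, found {n_levels}")
    elif not exact and n_levels < required_levels:
        problems.append(f"expected at least {required_levels} levels, found {n_levels}")
    for level, n in counts.items():
        if n < replicates:
            problems.append(f"level {level}: {n} replicates, need {replicates}")
    return problems

# Five levels, triplicate injections -- but one replicate is missing at 125%.
data = [50]*3 + [75]*3 + [100]*3 + [125]*2 + [150]*3
print(check_design(data, required_levels=5, replicates=3))
# ['level 125: 2 replicates, need 3']
```

A real system would distinguish replicate injections from replicate preparations (for instance, by keying on both vial and preparation identifiers), but the exact-versus-minimum branching above shows why the SOP wording alone forces extra logic into the check.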
Approvals: At different points during the progression of a validation study, many users require some form of data review and approval before the next step can be undertaken. However, there is no agreed-upon convention as to when these approvals should occur; reviews and approvals could be required at any of several steps. The software should therefore allow the user to configure where approvals take place.
Robustness and intermediate precision: There are many variables to consider for robustness and intermediate precision experiments, and there is no consistency as to which factors should be varied for these two validation characteristics. One user might assess different column lots during robustness testing; another might assess this factor during intermediate precision testing. Factors such as flow rate, gradient start time, and detector wavelength require changes to acquisition conditions; others require changes during sample preparation or processing. Factors can be categorical (for example, column lot, analyst, system, or day) or numerical (for example, pH, solvent composition, or wavelength). In some cases, dummy factors are included in the design. In all cases, factors must be documented properly.
Different experimental designs can be used, each of which provides specific benefits as well as trade-offs. Some users might not use experimental designs at all and vary each factor independently from the others. The standards must be acquired with respect to the factor variations — that is, they must be acquired using the same factor variations (when the factors relate to instrument conditions) as the unknowns. The software should check the data to ensure that the correct factors have been varied. Data analysis varies as well, ranging from a simple % RSD calculation to full statistical evaluation of the factor effects and factor interactions.
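At the simple end of the analysis spectrum mentioned above sits the % RSD calculation. The sketch below is a generic illustration (the data values are invented for the example), not the product's implementation; a full statistical evaluation of factor effects and interactions would require considerably more machinery.

```python
# Simple %RSD evaluation of robustness results, the simplest of the
# analysis options mentioned in the text (illustrative sketch only).

import statistics

def percent_rsd(values):
    """Relative standard deviation as a percentage of the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Invented assay results (% label claim) obtained while varying a method factor.
results = [99.8, 100.2, 99.5, 100.4, 99.9, 100.1]
print(round(percent_rsd(results), 2))  # 0.32
```

A % RSD summarizes overall variability but says nothing about which factor caused it; that is the trade-off against a full factor-effect analysis of a designed experiment.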
Managing data in the user interface: Validating a method generates a vast amount of both mathematical and statistical data. To satisfy a variety of users, software should provide all calculations and results, only a subset of which is required for any given consumer. The user should be able to display results of interest while hiding the display of fields of noninterest. The user interface must present the data in a logical and intuitive manner so the user can easily navigate and effectively mine the data.
Final product: The proposed solution is a workflow-based, preconfigurable, hierarchical delivery system that allows for the implementation of an organization's SOP-based validation practices and validation acceptance criteria. Figure 1 presents the configurable environment that allows for the hierarchical workflow implementation. Validation methods can be approved and locked down so as to act as a validation protocol.
Figure 1: Corporate validation protocol parameters are translated directly into the software product.
The software also utilizes a Validation Management System (Figure 2) that allows users to monitor which sample sets have been acquired and which still need to be acquired, processed, or approved. Users can easily see what work has been completed and what has yet to be done. The software tracks which sample sets and related injections pertain to which validation parameter (for example, linearity, accuracy, or precision). A complete set of relevant results and statistics is processed for each component in the samples.
Figure 2: Validation management window allows you to completely perform and manage your validation workflow from acquisition through data processing and reporting.
Validation result data are categorized logically and displayed in appropriate plot and tabular formats. These plots and tables are interactive, thus enhancing a user's ability to navigate and interpret the data. The user chooses which results are to be shown both in the software interface and on reports. Furthermore, results that fail to meet acceptance criteria are indicated and flagged in the multicomponent results and analysis window (Figure 3).
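The flagging behavior described above can be sketched in a few lines. The result names and criteria below are invented for illustration, not the product's actual fields: each computed result is compared against its acceptance range, and failures are marked for display.

```python
# Hedged sketch of acceptance-criteria flagging (illustrative names only):
# each result is compared against its (low, high) acceptance range.

def flag_results(results, criteria):
    """Return {name: (value, passed)} for each result with a criterion."""
    flagged = {}
    for name, value in results.items():
        lo, hi = criteria[name]
        flagged[name] = (value, lo <= value <= hi)
    return flagged

results = {"r_squared": 0.9995, "recovery_pct": 97.2}
criteria = {"r_squared": (0.999, 1.0), "recovery_pct": (98.0, 102.0)}
for name, (value, passed) in flag_results(results, criteria).items():
    print(name, value, "PASS" if passed else "FAIL")
```

Because the criteria come from the preconfigured template rather than from the analyst at run time, every result is automatically judged against the SOP's acceptance limits.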
Figure 3: Software provides a full set of validation results and statistical analysis for each validation parameter. Results are displayed in interactive tables and plots, allowing users to evaluate their data easily.
Report templates can be used to standardize the report format for ease of review. In addition, a submission-ready report template can be created for direct incorporation into a regulatory IND/NDA/EMEA/J-NDA submission, and the software allows for multicomponent validation, processing, and reporting.
Fully integrated software of this type allows for significant time savings by providing even inexperienced analysts with the ability to execute an analytical validation based upon the current practices of their respective organization and division. This software platform satisfies the requirements of a fully flexible and configurable system.
Conclusion
Automating analytical method validation greatly simplifies the validation process and minimizes its tedium. With a flexible and configurable validation platform, various company SOPs can be captured in a standard template for easy and seamless execution. The generation of report methods that are ready for importation into target regulatory submission documents eliminates the need to reformat validation reports for submission and, hence, increases business efficiency.
Additionally, because the validation software is part of a validated chromatography data system, all calculations are done within the chromatography data system environment to generate results that are verified statistically and audit trailed. There is no need for data exchange with a third party or transfer to external software, thus eliminating transcription errors.
Guest author Patrick Lukulay is Senior Principal Scientist in the Research Analytical group at Pfizer's Michigan Laboratories.
Guest author James Morgado is a Scientist and Validation Specialist in the Analytical Research & Development group at Pfizer's PGRD facility in Groton, Connecticut; e-mail: james.e.morgado@pfizer.com
References
(1) ICH Harmonized Tripartite Guideline, Text on Validation of Analytical Procedures, 27 October 1994.
(2) ICH Harmonized Tripartite Guideline, Validation of Analytical Procedures: Methodology, 6 November 1996.
(3) P. Lukulay and R. Verseput, Pharm. Tech. (2005).
(4) J. Flarakos, LCGC 19(3), 304–310 (2001).
Michael E. Swartz "Validation Viewpoint" Co-Editor Michael E. Swartz is a Principal Scientist at Waters Corp., Milford, Massachusetts, and a member of LCGC's editorial advisory board.
Ira S. Krull "Validation Viewpoint" Co-Editor Ira S. Krull is an Associate Professor of chemistry at Northeastern University, Boston, Massachusetts, and a member of LCGC's editorial advisory board.
The columnists regret that time constraints prevent them from responding to individual reader queries. However, readers are welcome to submit specific questions and problems, which the columnists may address in future columns. Direct correspondence about this column to "Validation Viewpoint," LCGC, Woodbridge Corporate Plaza, 485 Route 1 South, Building F, First Floor, Iselin, NJ 08830, e-mail lcgcedit@lcgcmag.com