LCGC North America
The author examines the process of validating a CDS and discusses the challenges encountered along the way, such as the speed of validation and risk management.
Chromatography data systems (CDSs) are used throughout regulated laboratories in the pharmaceutical and allied industries. In research and development and in production (good manufacturing practice [GMP]), a CDS can be used to determine the impurities of raw materials and finished products, for in-process control, and for stability testing. In good laboratory practice (GLP) development laboratories, a system can be used to measure a drug and its metabolites in biological fluids from nonclinical and clinical studies to determine the absorption, distribution, metabolism, and excretion (ADME) of the compound. Regardless of the role of the regulated laboratory, the system must be validated to show that the CDS is fit for its intended purpose, as required by the GLP or GMP regulations as well as 21 CFR 11 (the electronic records; electronic signatures rule).
Although validation of computerized systems is the topic of many books and regulatory guidance documents, only a few of which are referenced here (1–3), there is only one book that deals solely with CDS validation (4). However, that book does not discuss the timelines of a validation project in any great depth.
In this article, I would like to focus on what is needed to ensure a rapid validation of a CDS while maintaining overall quality and managing regulatory risk. I'll look at how CDS validation is typically conducted today, what we should aim to achieve in a rapid validation project, and then list my 10 critical success factors for such a project. Before we discuss these areas, I am assuming two things that I will not discuss further: first, that the CDS is a multiuser networked system, and second, that the users will be working with electronic signatures and keeping paper output from the system to a minimum.
CDS Validation: The Way It Is
Typically, CDS validation projects take between six months (if you are lucky) and 18 months (if you are not) to validate the core system and then roll the system out to the rest of the user base in the laboratory. This means that, in the worst case, by the time you have finished the current validation, the CDS vendor has probably released another version of the software for you to implement and validate. In the current economic environment, it could be a way of preserving your job. Because of the time and resources required to validate each release, at best a laboratory will only implement every other release of the CDS software, which makes it difficult for a vendor to offer effective support when there could be up to five supported releases in the customer base at any one time.
CDS Validation: The Way It Should Be
In an ideal world, a CDS validation should be fast and efficient, something like this: a laboratory would have written down its user requirements and have a good reason for the purchase of a particular system. The project team would be trained and have access to the system from the first days of the project so that they could understand the nuances of the way the software works and how it will be configured (including custom calculations and custom reports). In this way, the user requirements would be refined to fit the purchased system rather than a generic CDS. Risk management would build on what the vendor has already done, focusing testing on showing that the system works in your environment. Work on configuring the system with the input of calculations and report templates will continue after the system has completed its validation; therefore, this aspect is, in my opinion, best controlled by procedure. This allows the process to be decoupled from the validation of the application software.
The Core System
In the introduction, I mentioned that we should aim for the validation of a core CDS so that there is an achievable target for the validation. Implicit in this is that the validation will be staged in a number of phases. So what constitutes the core system? To give the classic consultant's response: it depends. The factors influencing this are:
Therefore, the core system could vary in size from about five to 20 chromatographs. This gives the project team a phase 1 target to aim for, but they must also consider the whole system. Reducing the size of the initial target means that the user acceptance testing (UAT) or performance qualification (PQ) also can be phased; this will be discussed in more detail in the next section.
The 10 Critical Success Factors for Fast CDS Validation
My 10 critical success factors (CSFs) for a rapid CDS validation are listed in Table I and are discussed in more detail under each heading below. The CSFs appear in no particular order, either in the table or in the text of this article.
Table I: 10 critical success factors for rapid CDS validation
1. Management Involvement and Backing: Laboratory management can make or break any validation project, either by refusing to resource it adequately (expecting the users to carry out their normal duties as well as work on the project) or by making unrealistic demands (I want it tomorrow, sorry, yesterday). In either case, the project will suffer and the laboratory will end up with a poor and rushed validation.
Therefore, this CSF has two aspects to it. First, management must set realistic and achievable goals, deadlines, and expectations for the project team to keep to. Second, management must get out of their offices and talk publicly to the chromatographers who will be the user base, to ensure positive support for the system and to aid the CDS validation project team. Ideally, management should change individuals' objectives to ensure that their role working on the project team or supporting the implementation is a part of each analyst's overall performance goals.
2. Dedicated Project Team Members: It is important to understand that one of the reasons CDS validation projects take more time than anticipated is that the people working on the project are trying to do two jobs: their normal one and the CDS project. Something has to give, and it is usually the CDS project, especially if there are high-priority samples to analyze. Therefore, it is the role of management to ensure that the project is resourced adequately. The selection of the people to work on the project is a critical success factor; it requires experienced chromatographers who, in my view, should work on the project full time if an aggressive three-month schedule is to be met. If these chromatographers have experience using the selected CDS, so much the better, as they can help other project team members who have little or no experience with the system. Depending upon the size of the system to be implemented and the number of users, the team will typically vary between two and seven members.
Always remember that a CDS project requires a multidisciplinary approach: information technology (IT) and quality assurance (QA) also are involved, and the laboratory on its own cannot deliver a successful CDS validation project. The QA and IT individuals on the project will not be full time, but they need to be kept informed of the progress of the project, asked for input to key activities, and told when documents will be available for comment and approval.
3. Use an Appropriate Lifecycle Model: Computerized system validation requires a lifecycle model, and typically the pharmaceutical industry uses a classical V-model approach (5). However, the classical V model does not reflect what you will be doing on a CDS validation project, because there is no extensive configuration or customization of the system. Therefore, for a typical CDS validation project you will not need to write a functional specification or a design specification, as the system does not require any custom code to perform its function: most of what you need comes out of the box (for example, instrument control, data acquisition, and data processing). What you do need to document is the configuration of the software: how you want the application to work with electronic signatures, user account management (user types and access privileges), custom calculations, and reports.
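As an illustration only (no particular vendor's format is implied), the agreed configuration could be captured as a structured record that is reviewed and approved with the validation documents and later compared against the settings exported from the live system. The role names, privileges, and setting keys below are hypothetical.

```python
# Minimal sketch: record the approved CDS configuration as data so that it can
# be reviewed, approved, and later verified against the installed system.
# All role names, privileges, and setting keys are illustrative assumptions.
approved_configuration = {
    "electronic_signatures": {
        "enabled": True,
        "signature_meanings": ["author", "reviewer", "approver"],
        "reason_for_signing_required": True,
    },
    "user_account_management": {
        "roles": {
            "analyst": ["acquire_data", "process_data"],
            "reviewer": ["review_results", "sign_results"],
            "administrator": ["manage_users", "configure_system"],
        },
        "password_expiry_days": 90,
        "lockout_after_failed_logins": 5,
    },
    "audit_trail": {"enabled": True, "reason_for_change_prompt": True},
}


def configuration_differences(actual, expected, path=""):
    """Compare a configuration exported from the system with the approved
    specification and list any differences (a simple verification check)."""
    differences = []
    for key, expected_value in expected.items():
        full_path = f"{path}/{key}"
        if key not in actual:
            differences.append(f"missing setting: {full_path}")
        elif isinstance(expected_value, dict):
            differences.extend(
                configuration_differences(actual[key], expected_value, full_path))
        elif actual[key] != expected_value:
            differences.append(
                f"{full_path}: expected {expected_value!r}, found {actual[key]!r}")
    return differences
```

Recording the configuration in this way turns the configuration specification into something that can be checked objectively during qualification rather than by visual inspection alone.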
Therefore, a simplified lifecycle model is essential to reduce the number of documents needed in the validation. Figure 1 shows a simplified lifecycle model for a CDS; three of the documents shown may not be familiar to you:
Figure 1: A lifecycle model for a chromatography data system.
Further easing the way a system is validated, some tasks can be adequately controlled through procedures. Custom reports and custom calculations are a case in point: these can be procedurally controlled because they will be developed long after the whole system has been validated. However, the SOP needs to include the specification and testing of each report or calculation in enough detail that it can be tested and then integrated into the CDS so that the system retains its validated status.
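To illustrate the level of detail intended, a single custom calculation could be specified as a small, self-contained routine together with a test case whose expected result has been verified independently (for example, in a spreadsheet). This is only a sketch: the function name, the replicate data, and the acceptance tolerance are all hypothetical.

```python
import math


def percent_rsd(values):
    """Percent relative standard deviation of replicate results:
    sample standard deviation divided by the mean, multiplied by 100."""
    n = len(values)
    if n < 2:
        raise ValueError("at least two replicate values are required")
    mean = sum(values) / n
    stdev = math.sqrt(sum((x - mean) ** 2 for x in values) / (n - 1))
    return 100.0 * stdev / mean


def test_percent_rsd_against_independent_result():
    # Peak areas from six replicate injections; the expected value of 0.217%
    # was calculated independently outside the CDS (illustrative numbers).
    areas = [10012, 10045, 9987, 10021, 9998, 10034]
    assert abs(percent_rsd(areas) - 0.217) < 0.001


if __name__ == "__main__":
    test_percent_rsd_against_independent_result()
    print("custom calculation agrees with the independently calculated result")
```

The same pattern (specification, independently verified test data, and a documented acceptance criterion) can be applied to each custom report or calculation added after the core validation is complete.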
4. Knowledge of the CDS Application: As mentioned previously, this is a critical success factor that will determine the progress (or lack of it) of the whole project. We have looked at the general user base already, but I want to focus on a specific group. Project team members from the laboratory will likely become local application administrators for the laboratory and will be the first line for resolving problems with the system (power users or super users in some terminology). Therefore, they need a good technical understanding of the application to help specify, configure, and test the CDS. Even if they are experienced users at the chromatographic level, further training will be needed to understand the CDS at a technical level so that they can contribute fully to the project and beyond. If time is a problem, outsourcing custom calculations or laboratory reports to the CDS vendor could be one option to keep the project on schedule.
5. Active and Flexible QA Involvement: The role of QA is often maligned and misunderstood. They are the guardians of quality but are often seen as ultraconservative. However, you must have QA on your side if you want to get the CDS validation project through in tight timescales. Key project documents must be reviewed and approved by QA to confirm that they meet company and regulatory requirements. If you want to modify the validation approach and omit one or two stages of work that would normally be done, how would you approach QA if there were an antagonistic atmosphere?
You must get QA on side with the project: communicate and explain why you want to do tasks in a way that is different from a typical validation project. This is especially important at the start of the project, when the initial planning takes place to define both the overall timelines and when key documents are expected to be available for review. If you don't inform QA that a document is to be reviewed around a certain date, don't be surprised if you are told the document goes to the back of the queue. If the project wants to do something different from the normal computerized system validation procedure, you need to keep QA informed. Working with QA is infinitely better than working against them.
6. Effective and Compliant IT Participation: The words IT and compliant in the same sentence can seem strange and can be met with skepticism, but this is an essential part of the CDS validation project, as the software has to run on the network operated by IT. Note also the word effective in this CSF: it's no good being compliant if the network is slow or the hardware is delivered late or is unreliable. Therefore, the project needs technical input from IT on the design phases of the project, to carry out the installation and qualification of the servers, the operating system, and the database, and also to operate the system when the CDS is released for operational use.
If the CDS has more than 25 or so users, then the technical architecture could benefit from the use of terminal emulation. Instead of installing the CDS application software on each PC in the laboratory and qualifying it, the software is installed once on a terminal emulation server (typically running Citrix) and is qualified once. Users in the laboratory then log onto the system via an applet running on their PCs, but apart from that, there is little difference in the way they interact with the CDS.
The benefit of using terminal emulation from the project's perspective is that there is a single software installation followed by a single installation qualification and operational qualification. For a core system this might not have much of an impact, but if the overall installation is greater than 50 users, the use of terminal emulation will save much time and effort. The other side of the coin is that the IT department needs to have the expertise to install and operate a terminal emulation system. The alternative is the classical client–server installation and qualification of the CDS application on all laboratory PCs, described previously.
7. Use the Vendor Effectively: The vendor of the CDS software has a role to play in the project, ranging from the provision of technical advice (for example, server and disk sizing for the laboratory), training (of both regular users and administrators), and quality system development and support of the CDS application, to the supply of qualification documentation and professional services. Training has been discussed under earlier CSFs, and technical advice is an area where the CDS vendor can provide very useful input on the technical architecture needed to run the product efficiently. However, I would like to focus here on two aspects: first, the quality of development and support of the CDS application, and second, the supply of qualification documentation for the project.
During the development of the software, the vendor will test the software many, many times. If functions are used as supplied and are not configured or customized when the software is installed, why do you need to retest them extensively? The problem is how to justify a reduced testing approach, and this is where a vendor can help the validation project. If a summary of the in-house testing of the CDS can be made available to the project, it provides the documented evidence to justify a reduced testing approach in some areas. Note that this is not carte blanche to test nothing, as the project still will have to demonstrate fitness for purpose in the laboratory environment.
The other specific area in which a vendor can be useful is the provision of qualification documentation for the CDS. This is a double-edged sword and it really is an issue of caveat emptor (buyer beware). There are very specific regulatory requirements in U.S. and EU GMP (6,7) for the review and approval of documents used in any validation, and the use of vendor-supplied documentation could mean that the laboratory generates noncompliances. Why, you may ask? The reason is that the laboratory must review and authorize qualification documentation before it is executed, but often there is no provision for you to do this. The vendor's engineer pops into the laboratory, executes the protocols, and leaves them for one of the project team to review and approve. This is wrong, and vendors must be more responsible and professional in this area. There will be a longer and more detailed discussion of this subject in a forthcoming "Focus on Quality" column in Spectroscopy.
8. Planning, Planning, Planning: To cement all the different disciplines and tasks together, it is essential to have effective project planning. The project needs to be broken down into the component tasks that, when completed, will deliver the validated core system. One problem in project planning is breaking the project down into the right level of detail: neither too much nor too little. For each task, allocate the individual who will be responsible for it; for example, for writing the user requirements specification (URS) there would be a responsible individual who would coordinate the writing of the document and liaise with IT and QA to ensure that the inputs from these groups are elicited. Also, if a document is required to support the validation, the review personnel need to be known, along with the approximate time when the draft will be available, so that individuals' work can be synchronized.
Often, a project's plans can be compared with major works of fiction. To avoid this label, project plans need to be refined and updated to allow for changes in timing or the addition or elimination of tasks. The timetable for the validation of the core system will be aggressive, but the time allocated to each task must be realistic.
9. Focus on the Core System: Once the overall system has been specified in the URS, the project team needs to concentrate on the validation of the core system to ensure that the timescales of the project can be achieved. Otherwise, trying to validate the whole system as well as the core will be a major cause of delay in delivering the core system.
10. Get More from Less Testing: Finesse and subtlety are characteristics not often associated with computerized system validation. However, it is in the design of the overall test suite for the PQ or user acceptance testing that they need to come to the fore. Shown in Figure 2 is an outline design of a test suite for a CDS. This is for the core system validation and does not represent all of the testing that will take place, but it will serve to illustrate the points I wish to make. The TS numbers in the figure refer to individual test scripts.
Figure 2: Outline design of a phase 1 performance qualification test suite for a CDS.
There are two levels of testing to note: first, system-level testing, and second, analytical-level testing based on the project-oriented organization of work.
System-level testing focuses on three areas: security and access control (TS01), data server reboot (TS02), and creation of new projects (TS03). All three of these test scripts are designed with two criteria in mind: first, to test the function (which will cover some of the URS requirements), and second, to trigger entries in the system audit trail. These entries in the system audit trail will be picked up and verified in TS04.
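As an illustration of the second criterion, the audit trail entries expected from each scripted action could be listed in advance and then checked against an export of the system audit trail. The event names and the CSV export format below are assumptions made for the sketch, not any particular vendor's interface.

```python
import csv

# Hypothetical events that each system-level test script is expected to write
# to the system audit trail (event names are illustrative assumptions).
EXPECTED_EVENTS = {
    "TS01": ["user_account_created", "failed_login_lockout", "user_account_disabled"],
    "TS02": ["data_server_shutdown", "data_server_restart"],
    "TS03": ["project_created"],
}


def missing_audit_entries(export_path, expected=EXPECTED_EVENTS):
    """Read a CSV export of the system audit trail and report any expected
    event that was not recorded (the column name 'event' is an assumption)."""
    with open(export_path, newline="") as handle:
        recorded = {row["event"] for row in csv.DictReader(handle)}
    return {script: [event for event in events if event not in recorded]
            for script, events in expected.items()
            if any(event not in recorded for event in events)}
```

An empty result from such a check, attached to the executed test script, provides objective evidence that the expected audit trail entries were created.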
Similarly, at the project level, there are five analytical test scripts (TS06, TS07, TS08, TS09, and TS13) checking the different calibration methods used by the laboratory. From the arrows linking many of these test scripts, you can see that there are interrelationships between many of them. For example, TS06 tests the calibration model of average by amount with an external standard and includes capacity testing with the largest number of samples expected during an analytical run. This test script will be run concurrently with TS05, which looks at buffering of chromatographic data by the data server. After TS06 is started, the network connection to the data server will be disconnected; the data server will buffer the data to the end of the run and, after the network connection is restored, the data will be transferred to the server. After these test scripts are completed, the system suitability calculations will be verified to comply with 21 CFR 211.68(b) (8), European Pharmacopoeia chapter 2.2.46 (9), and United States Pharmacopoeia <621> (10). Furthermore, the project and data generated in TS06 are used in the test script for project archive and restore (TS14). To save time and effort, the data generated for TS09 (multicomponent analysis) will be used for the area percent calculations (TS13), avoiding the need to undertake a second analytical run. Similar to the system-level testing of the system audit trail in TS04, the project audit trail will be tested using audit trail entries from a number of test scripts (TS07, TS11, TS12, and TS13).
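For the verification of the system suitability calculations mentioned above, the values reported by the CDS could be cross-checked against independent implementations of the pharmacopoeial formulas. The sketch below uses the half-height plate count formula from European Pharmacopoeia 2.2.46 and the USP tailing factor; the numerical values are hypothetical.

```python
def plate_count_half_height(retention_time, width_at_half_height):
    """Column efficiency N by the half-height method (EP 2.2.46):
    N = 5.54 * (tR / wh)**2."""
    return 5.54 * (retention_time / width_at_half_height) ** 2


def tailing_factor(width_at_5_percent, front_half_width_at_5_percent):
    """USP tailing factor T = W0.05 / (2 * f), where f is the distance from the
    leading edge of the peak to the peak apex, measured at 5% of peak height."""
    return width_at_5_percent / (2.0 * front_half_width_at_5_percent)


if __name__ == "__main__":
    # Illustrative cross-check of CDS-reported values (times and widths in minutes).
    n_calculated = plate_count_half_height(retention_time=6.2, width_at_half_height=0.12)
    t_calculated = tailing_factor(width_at_5_percent=0.25, front_half_width_at_5_percent=0.11)
    print(f"N = {n_calculated:.0f}, T = {t_calculated:.2f}")  # compare with the CDS report
```

Recording the independent calculation alongside the CDS output in the executed test script documents that the system suitability calculations comply with the cited requirements.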
Summary
In this article, I have asked the question: Why does it take so long to validate a CDS? In answering this, I have provided 10 critical success factors that should be considered when establishing a CDS validation project. Ideally, the core system should be validated within three months.
R.D. McDowall is principal of McDowall Consulting, director of R.D. McDowall Limited, and "Focus on Quality" column editor for Spectroscopy, LCGC's sister magazine. Address correspondence to him at 73 Murray Avenue, Bromley, Kent, BR1 3DJ, UK, or e-mail: rdmcdowall@btconnect.com.
References
(1) FDA Guidance for Industry, General Principles of Software Validation, 2002.
(2) Good Automated Manufacturing Practice (GAMP) version 5, International Society for Pharmaceutical Engineering, Tampa, Florida, 2008.
(3) United States Pharmacopoeia XXIII, <1058> Analytical Instrument Qualification, United States Pharmacopoeia Inc., Rockville, Maryland, 2010.
(4) R.D. McDowall, Validation of Chromatography Data Systems: Meeting Business Needs and Regulatory Requirements (Royal Society of Chemistry, Cambridge, 2005).
(5) Good Automated Manufacturing Practice (GAMP) version 4, International Society for Pharmaceutical Engineering, Tampa, Florida, 2001.
(6) 21 CFR 211.160(a).
(7) EU GMP Annex 15.
(8) 21 CFR 211.68(b).
(9) European Pharmacopoeia 2.2.46.
(10) United States Pharmacopoeia <621>.