Data integrity is paramount when working in a regulated environment. Data process mapping is an excellent way to identify and mitigate data gaps and record vulnerabilities in a chromatographic process. This approach is simple and practical.
Understanding and mitigating risks to regulatory records is an important part of a data integrity program. We discuss data process mapping as a technique to identify data gaps and record vulnerabilities in a chromatographic process and look at ways to mitigate or eliminate them.
Welcome to the second installment of "Data Integrity Focus." In the last part, we looked at the overall scope of a data integrity and data governance program via a four-layer model (1–3). In this part, we look at a simple and practical methodology that can be applied to identify the risks associated with any process in a regulated "good practice" (GXP) laboratory. Once identified, the risks can be mitigated or eliminated to ensure the integrity of data and records. The methodology is called data process mapping, and it is a variant of process mapping, which some of you may be familiar with if you have been involved in the implementation of a computerized system or a Six Sigma improvement project. Once the process is mapped, the data and records created, modified, or calculated are identified and assessed to see if there are any data vulnerabilities in a paper process or computerized system.
It is very important to understand that data integrity is not just a computer or information technology (IT) problem. Many manual processes that generate paper records occur in the laboratory, such as sampling, sample preparation, calculation, and review (3–5). Many observational tests, such as appearance, color, and odor, are typically recorded on paper. Even with a computerized system, there are additional and essential paper records, such as the instrument and column log books.
What do the regulatory guidance documents say about the assessment of processes? There are three documents that I would like to focus on. The first is from the World Health Organization (WHO), whose guidance document (6) notes:
The second is from the UK's Medicines and Healthcare products Regulatory Agency (MHRA), whose 2018 GXP guidance makes the following statements about the assessment of processes and systems (7):
The third and final guidance is from the Pharmaceutical Inspection Co-operation Scheme (PIC/S) (8):
From this information, risk-proportionate control measures can be implemented.
Summarizing the guidance documents:
Typically, assessment of computerized systems involves a checklist where questions are posed for a chromatograph and the associated computerized system, such as:
The checklist questions can go on, and on, and on, and, if you are (un)lucky, it can go into such excruciating detail that it becomes much cheaper and safer than a sleeping pill. There are three main problems with a checklist approach to system assessment:
If a checklist is not the best tool, what tool should be used to identify data and records and then understand the risks posed?
Instead of starting with a fixed checklist, start with a blank whiteboard or sheet of paper together with some Post-it notes, pencils, and an eraser. Why the eraser? You are not going to get this right the first time, and you'll be rubbing out lines and entries on the notes until you do. You'll need a facilitator who will run the meeting and two to three experts (perhaps laboratory administrators) who know the process, and, if software is involved, how the application works at a technical level.
The first stage is to visualize the process. Define the start and end of an analytical process (for example, from sampling to reportable result). The process experts should write the stages of the process down on the notes and place them on the whiteboard or paper in order. The first attempt will be rough and will need revising, as the experts can miss activities, put some activities in the wrong order, or document them in uneven detail. The facilitator should encourage and challenge the experts to revise and refine the process flow, which may take two or three attempts. Although you can use a program like Visio to document the process, this slows the interaction between the participants during the initial mapping; I would suggest that paper and pencil or a whiteboard is an easier and more flexible option at this stage. When the process is agreed, commit the final map to software.
The second stage is to document the data inputs, outputs, processing, verification steps, and storage for each of the process activities. This can involve manual data recording in log books, laboratory notebooks, and blank forms, as well as inputs to and outputs from any computerized systems involved in the process. Typically, such a process has not been designed but has evolved over time, and, with the number of arrows involved, it can often look like a still from Custer's Last Stand. This is the data process map, or what we can call the current way of working.
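If you later commit the agreed map to software, a simple structured representation keeps the inputs, outputs, records, and vulnerabilities of each step explicit. The following Python sketch is purely illustrative; the step names, fields, and entries are my own invention, not a prescribed schema:

```python
from dataclasses import dataclass

@dataclass
class ProcessStep:
    """One activity in a data process map."""
    name: str
    inputs: list           # data consumed by the step
    outputs: list          # data or records produced
    records: list          # where those data end up being stored
    vulnerabilities: list  # gaps identified during the mapping exercise

# A fragment of a hybrid chromatographic process (illustrative entries only)
process_map = [
    ProcessStep(
        name="Sample preparation",
        inputs=["sample", "balance weighings"],
        outputs=["prepared solutions"],
        records=["paper laboratory notebook"],
        vulnerabilities=["weights written down rather than captured electronically"],
    ),
    ProcessStep(
        name="Chromatographic analysis",
        inputs=["prepared solutions", "CDS acquisition method"],
        outputs=["data files", "integrated peaks"],
        records=["standalone CDS workstation", "printed chromatograms"],
        vulnerabilities=["shared user account", "electronic records not backed up"],
    ),
]

for step in process_map:
    print(f"{step.name}: {len(step.vulnerabilities)} vulnerability(ies) identified")
```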
Once the process is complete and agreed, look at each step and document:
Any vulnerabilities need to be risk assessed, and remediation plans need to be developed. These plans will fall into two areas: quick-fix remediation and long-term solutions. We will look at these two areas now, using an example involving a chromatography data system.
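Neither the guidance documents nor this column prescribes a particular scoring scheme for this assessment. As one illustrative possibility only, a failure mode and effects analysis (FMEA)-style risk priority number can be used to rank vulnerabilities for remediation; the scales and the example scores below are assumptions, not regulatory requirements:

```python
def risk_priority(severity: int, occurrence: int, detectability: int) -> int:
    """FMEA-style risk priority number; each factor is scored 1 (low) to 5 (high).

    Detectability is scored high when a vulnerability is hard to detect,
    so a larger product always means a higher-priority remediation.
    """
    for score in (severity, occurrence, detectability):
        if not 1 <= score <= 5:
            raise ValueError("each score must be between 1 and 5")
    return severity * occurrence * detectability

# Example: a shared CDS login -- severe, frequent, and hard to detect
print(risk_priority(severity=5, occurrence=4, detectability=4))  # 80 of a possible 125
```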
Moving from theory to practice, let us look at how data process mapping could work with a chromatograph linked to a chromatography data system (CDS). Welcome to a chromatography laboratory near you, operating the world's most expensive electronic ruler, as shown in Figure 1.
Figure 1: Current hybrid process for chromatographic analysis.
Let me describe the main features of the simplified process:
Some of you may be reading the process with abject horror, and may think that this would never occur in a 21st century chromatography laboratory. Based on my experience, and this is also seen in numerous U.S. Food and Drug Administration (FDA) warning letters, this process is more common than you may think. Remember that the pharmaceutical industry is ultraconservative, and if it worked for the previous inspection, all is well. However, to quote that world-famous chromatographer, Robert Zimmerman, the times, they are a-changin'. Hybrid systems (discussed in the next part of this series) are not encouraged by at least one regulator (6), and now some inspectors are unwilling to accept procedural controls to mitigate record vulnerabilities.
Once the process is mapped, reviewed, and finalized, the data vulnerabilities can be identified for each process step. The main data vulnerabilities identified in the current chromatographic process steps are listed in Table I. To put it mildly, there are enough regulatory risks to generate a cohort of warning letters. There are many data integrity red flags in this table, including the fact that work cannot be attributed to an individual, defining raw data as paper, and failing to back up, or even save, electronic records. There is also the shambles of the business process, due to the use of the spreadsheet to calculate all the values from system suitability test (SST) parameters to reportable results. Overall, the process is slow and inefficient. These risks need to be mitigated as an absolute minimum or, even better, eliminated entirely.
Table I: Main data vulnerabilities identified in a chromatography process
Enter stage left that intrepid group: senior management. These are the individuals who are responsible and accountable for the overall pharmaceutical quality system, including data integrity. The approaches that a laboratory will take are now dependent on them.
Figure 2: Remediate or solve data integrity vulnerabilities?
Figure 2 shows the overall approach to resolving data integrity issues. There are two outcomes:
1. Short-term remediation to resolve some issues quickly. Ideally, this should involve technical controls where available (for example, giving each user a unique user identity, or creating and allocating user roles for the system and segregation of duties; see the sketch after this list). However, remediation often involves procedural controls, such as the use of standard operating procedures (SOPs) or log books to document work. This slows the process down even further, and will result in longer second-person review times (11).
2. Long-term solutions to implement and validate technical controls that ensure work is performed correctly and consistently. This should involve replacement of hybrid systems with electronic working, ensuring business benefit from the investment in time and resources.
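To make the technical controls in the first option concrete, here is a minimal sketch of unique user identities, user roles, and segregation of duties, under which an analyst cannot perform the second-person review of their own work. The role names, user names, and permissions are illustrative assumptions, not taken from any specific CDS:

```python
# Each named individual holds exactly one role (unique user identities).
ROLE_PERMISSIONS = {
    "analyst": {"acquire", "integrate", "calculate"},
    "reviewer": {"review"},
    "administrator": {"configure", "manage_users"},  # admins do not touch data
}

users = {"aaliyah": "analyst", "bruno": "reviewer", "chen": "administrator"}

def can_perform(user: str, action: str) -> bool:
    """A user may only perform actions granted to their single role."""
    return action in ROLE_PERMISSIONS.get(users.get(user, ""), set())

def can_review(reviewer: str, analyst: str) -> bool:
    """Segregation of duties: no one reviews their own analysis."""
    return reviewer != analyst and can_perform(reviewer, "review")

print(can_perform("chen", "integrate"))   # False: administrators cannot alter data
print(can_review("bruno", "aaliyah"))     # True: independent second-person review
print(can_review("aaliyah", "aaliyah"))   # False: cannot review one's own work
```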
The problem is management. In many organizations, they want to focus only on the first option (fix and forget) and not consider the second, as it would detract from other work or cost money. While this may be thought an option in the very short term, it is not viable as regulatory authorities become more focused on hybrid systems with procedural controls.
In organizations that claim there is no money for long-term solutions, however, the financial taps are quickly turned on following an adverse regulatory inspection. It is better, more efficient, and cheaper to implement the long-term solution yourself, because then the company, not the regulator, is providing the solution.
From the data process map in Figure 1, some short-term solutions can be implemented as shown in Figure 3. Rather than attempt to fix a broken and inefficient process, use the CDS software that the laboratory has paid for to calculate the SST and final results. This would eliminate the spreadsheet, as well as manual entry into the spreadsheet and subsequent transcription checks.
Figure 3: Short-term remediation of the chromatography process.
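As a simple illustration of the kind of calculation that moves from the spreadsheet into the CDS, consider the percent relative standard deviation (%RSD) of peak areas across replicate standard injections, a common SST parameter. In practice, the validated CDS functions perform this calculation; the sketch and the peak areas below are invented for illustration only:

```python
from statistics import mean, stdev

def percent_rsd(peak_areas):
    """%RSD of replicate injections: 100 * (sample standard deviation / mean)."""
    if len(peak_areas) < 2:
        raise ValueError("at least two replicate injections are required")
    return 100.0 * stdev(peak_areas) / mean(peak_areas)

# Invented peak areas for six replicate standard injections
areas = [10215.0, 10198.0, 10242.0, 10187.0, 10223.0, 10206.0]
rsd = percent_rsd(areas)
print(f"%RSD = {rsd:.2f}%")  # a typical acceptance criterion is <= 2.0%
print("SST pass" if rsd <= 2.0 else "SST fail")
```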
Attention must also be focused on the CDS application, and some of the main changes for immediate implementation are:
This should result in an improved business process, as shown in Figure 3. The CDS is still a hybrid system, but the spreadsheet has been eliminated, along with manual entry into a second system, and the process is now under a degree of control. Left like this (the fix-and-forget option from Figure 2), substantial risk remains in the process, such as the backup of the standalone systems, and plans for a long-term solution are still needed.
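One of the remaining risks named above is the backup of the standalone systems. A validated, vendor-supported backup is the real answer; purely to illustrate the principle of a verifiable backup, the sketch below (with hypothetical paths) makes a dated copy of a data directory together with a checksum manifest:

```python
import hashlib
import shutil
from datetime import datetime
from pathlib import Path

def backup_with_manifest(source: Path, dest_root: Path) -> Path:
    """Copy a data directory to a dated folder and write a SHA-256
    manifest so that any later restore can be verified file by file."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = dest_root / f"cds-backup-{stamp}"
    shutil.copytree(source, dest)
    lines = []
    for path in sorted(dest.rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            lines.append(f"{digest}  {path.relative_to(dest)}")
    (dest / "MANIFEST.sha256").write_text("\n".join(lines))
    return dest

# Hypothetical locations: standalone CDS data directory and a network share
# backup_with_manifest(Path("C:/CDS/Data"), Path("//server/lab-backups"))
```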
Long-term solutions require planning, time, and money. However, given the potential business and regulatory benefits that can be obtained, management should be queuing up to hand over the money. Let us look at some of the remaining issues to solve in this process:
Figure 4: Long-term solution for the chromatographic process.
The regulatory risks of the original process have been greatly reduced or eliminated at the end of the long-term solution. The laboratory can face regulatory inspections with confidence.
I would like to thank Christine Mladek for helpful review comments during preparation of this column.
(1) R.D. McDowall, LCGC North Amer. 37(1), 44–51 (2019).
(2) R.D. McDowall, Validation of Chromatography Data Systems: Ensuring Data Integrity, Meeting Business and Regulatory Requirements (Royal Society of Chemistry, Cambridge, UK, 2nd Ed., 2017).
(3) R.D. McDowall, Data Integrity and Data Governance: Practical Implementation in Regulated Laboratories (Royal Society of Chemistry, Cambridge, UK, 2019).
(4) M.E. Newton and R.D. McDowall, LCGC North Amer. 36(1), 46–51 (2018).
(5) M.E. Newton and R.D. McDowall, LCGC North Amer. 36(4), 270–274 (2018).
(6) WHO Technical Report Series No. 996 Annex 5 Guidance on Good Data and Records Management Practices. 2016, World Health Organization: Geneva.
(7) MHRA GXP Data Integrity Guidance and Definitions. 2018, Medicines and Healthcare products Regulatory Agency: London.
(8) PIC/S PI-041 Draft Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments. 2016, Pharmaceutical Inspection Convention/Pharmaceutical Inspection Co-operation Scheme: Geneva.
(9) R.D. McDowall, Spectroscopy 32(11), 24–27 (2017).
(10) Technical Report 80: Data Integrity Management System for Pharmaceutical Laboratories. 2018, Parenteral Drug Association (PDA): Bethesda, MD.
(11) M.E. Newton and R.D. McDowall, LCGC North Amer. 36(8), 527–529 (2018).
(12) M.E. Newton and R.D. McDowall, LCGC North Amer. 36(7), 458–462 (2018).
R.D. McDowall is the director of RD McDowall Limited in the UK. Direct correspondence to: rdmcdowall@btconnect.com