LCGC North America
The current final FDA guidance document discussing the revised scope and applicability of 21 CFR Part 11 clearly adopts a risk-based approach to Part 11 compliance. This is a pragmatic approach, and one that many regulated firms had adopted even before the recent release of this document. This article will discuss the particulars of a risk-based approach to Part 11 compliance. Favored risk-assessment protocols such as HACCP, FMEA, and FMECA will also be addressed, along with how one can apply them to 21 CFR Part 11 compliance.
By creating and enforcing 21 CFR Part 11 (Part 11), the U.S. Food and Drug Administration (FDA) has taken a bold step in establishing one of the first regulations to define the conditions under which the agency will accept information in electronic form. With this move, the agency has not only propelled itself into the electronic age, but has created a need for industry to adapt its existing computing infrastructure to the new ways of electronic government and commerce. Since this step was taken, other governmental bodies have produced a number of directives and laws that affect electronic records and electronic signatures. The rules for authentic and trustworthy electronic information have played an increasingly important role in the creation and use of computer technologies as dependency upon them increases. Equally important is the ability to implement the most effective systems and policies for compliance. This task presently is being worked out among industry professionals, but just as with the good manufacturing practices (GMPs) of 25 years ago, developing trends likely will transition into the best practices of the future.
Victoria Lander
As a follow-on to Part 11 interpretation, the FDA issued the 2003 "Guidance for Industry: Part 11, Electronic Records; Electronic Signatures — Scope and Application," partly due to concerns expressed by the industry that the breadth of applicability and the cost of Part 11 compliance have hindered the use of new technology (1). The latest guidance states that records must still be maintained in compliance with the underlying predicate rules, but that the FDA will take a "risk-based" approach to enforcing compliance with some of the technical controls of Part 11, such as validation, audit trails, record retention, and record copying (2). The FDA also will include Part 11 in its formal review of current good manufacturing practice (cGMP) regulations and follow a more subjective course in taking regulatory action for compliance. The FDA's intent is to get back to its GxP, or predicate rule, fundamentals for the interpretation and enforcement of Part 11. These fundamentals involve systems for generating electronic records required in support of the agency's regulations for "best practices" (together referred to as GxP), which encompass good clinical practice (GCP), good laboratory practice (GLP), and cGMP.
What is Risk?
A firm's Part 11 remediation plans should include a risk analysis that accounts for how various systems that generate regulated electronic records potentially could affect the safety of the consumer. Although there are many definitions of "risk," depending upon your industry and perspective, a useful one is the definition given in ISO/IEC Guide 51:1999: a combination of the probability of occurrence of harm and the severity of that harm (3). Whether applied to Part 11 or to other safety-related aspects of FDA-regulated products, the regulatory perspective for risk should focus on risk to product quality and public safety. Such products obviously would include foods and cosmetics, blood products and drugs, medical devices, and any other regulated products that are ingested or consumed by or applied to a living creature (human or animal). When a system generates electronic records that can greatly impact product safety and quality or the integrity of regulated records, it is considered a "high-risk" system, and the technical controls of Part 11 that protect electronic record integrity would still apply. Otherwise, the system is considered to be "low-risk" and the agency simply will enforce the GxP requirements for protecting record integrity instead of the more stringent Part 11 controls.
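The ISO/IEC Guide 51 notion of risk as a combination of probability and severity is often operationalized as a simple scoring scheme. The sketch below illustrates one such scheme; the 1–5 scales, the multiplication rule, and the high-risk threshold are illustrative assumptions, not values taken from the guidance or from any FDA document.

```python
# Illustrative risk scoring: combine probability of harm and severity
# of harm into a single score, then classify the system. The scales
# and threshold are hypothetical examples.

def risk_score(probability: int, severity: int) -> int:
    """Combine probability (1-5) and severity (1-5) into a risk score."""
    if not (1 <= probability <= 5 and 1 <= severity <= 5):
        raise ValueError("probability and severity must be on a 1-5 scale")
    return probability * severity

def classify(score: int, high_threshold: int = 15) -> str:
    """Label a system high- or low-risk against an assumed threshold."""
    return "high-risk" if score >= high_threshold else "low-risk"

# A system whose records strongly affect product safety:
print(classify(risk_score(4, 5)))   # high-risk
# A low-impact administrative system:
print(classify(risk_score(2, 2)))   # low-risk
```

A firm would, of course, calibrate the scales and threshold to its own products and quality system rather than use these placeholder values.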
For manufacturers of drugs and medical devices, a risk-based approach to protecting product quality and public safety stems logically from the fact that Part 11 was predicated on the GxPs. For example, the FDA expects a firm subject to GxP to develop a risk evaluation of its product and then to mitigate the identified risks. Identified risks can be addressed by technical fixes that effectively eliminate the risks or reduce their likelihood of occurrence or severity of consequences to acceptable levels. There also can be risks for which there are no technical fixes. These latter risks can be addressed by including warnings in the accompanying product labeling. Other residual risks remaining after mitigation can be so minimal that they are intrinsically at acceptable levels.
Figure 1 lists some of the high versus low risk systems that generate GxP records according to the recent ISPE White Paper submitted to the FDA, which addresses the risk-based approach to 21 CFR Part 11 compliance (4).
Figure 1
In fact, applying a risk-based approach to Part 11 compliance should be nothing new for regulated firms. A similar approach outlined in the Quality System Regulation (QSR) requires a firm to perform a risk analysis of the various record-generating and record-keeping systems that maintain electronic records and implement electronic signatures (5). Such an analysis also would address a system's interactions with other interconnected systems. The result of this analysis would allow the company to determine which records pose high-impact consumer-safety issues. The firm would then evaluate the effects of the identified risks and rank them according to their criticality.
Examples of Applying Risk to Part 11–Triggering Systems
A practical example of applying risk analysis to Part 11 remediation would be the use of quality data from a Part 11-compliant database for inclusion in a Corrective and Preventive Action (CAPA) report. A CAPA is a critical business tool that will boost efficiency, improve product quality, and help assure FDA inspectors that you are running a quality operation. A spreadsheet program typically generates this report, and while the spreadsheet formulas still should be validated according to GxP, the overall relative risk to public safety is low. Therefore, the typical Part 11 technical controls (for example, audit trails) would not be required to protect the integrity of the spreadsheet. The spreadsheet itself, however, must be maintained and utilized in a current validated state and be GxP compliant.
On the other hand, adverse event reporting and clinical trial data that fall under GCP regulation can have a potentially high impact on public safety and the quality of a regulated product. Programs that analyze and visualize clinical data consequently have an impact on record integrity. Such systems would be considered high risk, and therefore should continue to incorporate the technical controls for Part 11 compliance as well as maintain predicate rule compliance.
In summary, Part 11 remediation has not changed for high-risk GCP-related systems such as adverse event and CRF data management systems, SAS analysis software, web trial systems, electronic patient diaries, patient randomization, and trial supply labeling systems. Both GCP and Part 11 definitely apply to these high-risk systems. In addition, the FDA's "Guidance to Industry for Computerized Systems Used in Clinical Trials" remains in effect and applies to these systems as well (6).
Keep in mind that while Part 11 is an enforceable law, an FDA guidance document is not a law. Guidance documents present the FDA's current thinking on a subject and are only a recommendation on how to proceed in addressing a law's requirements. Guidance documents are not binding on either the industry or the agency.
Risk Assessment Methodologies
There are many risk-assessment protocols or methodologies available originating from various industries (automotive, aerospace, defense, food, and so forth). It behooves regulated manufacturers to build risk analysis into their quality processes from the start. FDA-regulated firms commonly have utilized several of these methodologies over the years. What follows is a discussion of the most common risk-analysis methodologies, and it is by no means all-inclusive. For further reading on risk-management methodologies and their application, see reference 7.
Fault tree analysis: A fault tree analysis (FTA) is a deductive, top-down method of analyzing system design and performance. It involves specifying a "top event" to analyze, followed by identifying all of the elements in the system that could cause that top event to occur. Fault trees provide a convenient symbolic representation of the combinations of events that result in the occurrence of the top event. Events and gates in fault tree analysis are represented by graphic symbols such as AND and OR gates. Sometimes certain elements or basic events must occur together for the top event to occur; in this case, these events are arranged under an AND gate, meaning that all of the basic events would need to occur to trigger the top event. If any one of the basic events alone would trigger the top event, then they are grouped together under an OR gate. The entire system, as well as human interactions, would be analyzed when performing a fault tree analysis.
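The AND/OR gate logic described above can be captured in a few lines of code. The following sketch evaluates whether a top event occurs given a set of basic events; the tree itself (a hypothetical "record corrupted" top event) is an invented illustration, not an example from the article.

```python
# Minimal fault-tree evaluator: an AND gate requires all child events,
# an OR gate fires if any child event occurs. The example tree is a
# hypothetical illustration.

def evaluate(node, occurred):
    """Return True if the event represented by `node` occurs.

    `node` is either a basic-event name (string) or a tuple
    ("AND" | "OR", [children]); `occurred` is the set of basic
    events that have happened.
    """
    if isinstance(node, str):
        return node in occurred
    gate, children = node
    results = [evaluate(child, occurred) for child in children]
    return all(results) if gate == "AND" else any(results)

# Hypothetical top event: a record is corrupted if the audit trail is
# disabled AND either a disk fault or an operator error occurs.
top = ("AND", ["audit trail disabled",
               ("OR", ["disk fault", "operator error"])])

print(evaluate(top, {"audit trail disabled", "operator error"}))  # True
print(evaluate(top, {"disk fault"}))                              # False
```

Real fault trees also attach probabilities to basic events, but the Boolean structure is the core of the method.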
Failure mode effects and criticality analysis (FMECA): FMECA originated in the 1950s in the military and aerospace industries. The basic idea is to categorize and rank potential process failures, or critical issues, and then to target the prevention of those critical issues. It is important to prioritize the potential failures according to their risks and then implement actions to eliminate or reduce the likelihood of their occurrence.
Failure modes and effects analysis (FMEA): FMEA originated in the 1960s and 1970s and was first used by reliability engineers. FMEA involves the evaluation and documentation of potential failures of a product or process; actions are then identified that could eliminate or reduce those potential failures. It is a structured group activity built on the documentation of potential failure modes of products and processes and their effects on product performance. FMEA is a tool that should identify product and process failures before they occur, identify appropriate risk-mitigation measures to prevent or otherwise control the failure, and ultimately improve product and process design. An assumption is made that all product and process failures (and the actions required to control these failures) are predictable and preventable. Surprisingly, organizations still frequently experience predictable and preventable failures with costly consequences. These failures can lead to product recalls, death or injury, poor quality, and unanticipated cost. Although the aerospace and defense industries have used FMEA for decades, FMEA recently has been making significant inroads into the biomedical device industry. Figure 2 illustrates the "bottom up" approach of FMEA and the focus of this methodology on mitigating hazards from the component, or granular, perspective of a system or process.
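A common way to rank failure modes in FMEA-style analyses is the risk priority number (RPN), the product of severity, occurrence, and detection ratings. The sketch below shows the ranking mechanics; the failure modes and all ratings are hypothetical examples, not data from any regulated system.

```python
# FMEA-style ranking using a risk priority number (RPN): the common
# severity x occurrence x detection product. Failure modes and all
# ratings below are hypothetical illustrations.

from typing import NamedTuple

class FailureMode(NamedTuple):
    name: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare) .. 10 (frequent)
    detection: int   # 1 (always detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("unlogged record edit", severity=9, occurrence=3, detection=7),
    FailureMode("report formula error", severity=5, occurrence=4, detection=3),
    FailureMode("label misprint", severity=8, occurrence=2, detection=2),
]

# Address the highest-RPN failure modes first.
for mode in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{mode.name}: RPN={mode.rpn}")
```

Ranking by RPN directs mitigation effort toward failures that are severe, likely, and hard to detect, which is exactly the prioritization FMECA and FMEA call for.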
Figure 2
Hazard analysis and critical control points (HACCP): HACCP is a common methodology embraced by the FDA. HACCP received its start in the food arena and was developed initially in the 1960s by the Pillsbury Co., NASA, and Natick Labs for the space program to reduce the need to test the finished packaged product. Pillsbury made a commitment to improve on existing "good quality programs" by using techniques developed to supply food to NASA's astronauts. In 1996, the U.S. Food Safety and Inspection Service (FSIS) developed a HACCP-based regulatory proposal that became the Pathogen Reduction/Hazard Analysis and Critical Control Point Systems (HACCP) Rule. In this rule, FSIS determined that its food safety goal was to reduce the risk of food-borne illnesses associated with the consumption of meat and poultry products to the maximum extent possible. This goal can be met only by ensuring that appropriate and feasible measures are taken at each step in the food-production process where hazards can enter and where procedures and technologies exist or can be developed to prevent the hazards or reduce the likelihood they will occur.
HACCP is made up of seven basic principles that enable the production of safe products, as illustrated in Figure 3. The principles proceed from an analysis of the production processes; to the identification of all hazards that are likely to occur; to the identification of critical points in the process at which these hazards can be introduced into the product and therefore should be controlled; to the establishment of critical limits for control at those points; to the verification of these prescribed steps; and finally to the establishment of the methods by which the firm and the regulatory authority can monitor how well process control through the HACCP plan is working. Overall, risks are minimized by proper implementation of HACCP. It is understood that implementation of HACCP does not mean the absolute elimination of risks, but rather that one can prevent and reduce hazards to a degree that substantially reduces the risk to an acceptable level.
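The "critical limits at critical control points" step lends itself to a simple monitoring sketch: each control point carries a limit, and each measurement is checked against it. The control points, limits, and readings below are invented examples, not values from any HACCP plan.

```python
# Sketch of HACCP-style monitoring: each critical control point (CCP)
# has a critical limit, and measurements are checked against it.
# CCPs, limits, and readings are hypothetical examples.

critical_limits = {
    "cook temperature (deg C)": (72.0, None),  # (min, max); None = unbounded
    "cold storage (deg C)":     (None, 4.0),
}

def within_limits(ccp: str, value: float) -> bool:
    """Return True if the measured value satisfies the CCP's limits."""
    lo, hi = critical_limits[ccp]
    if lo is not None and value < lo:
        return False
    if hi is not None and value > hi:
        return False
    return True

print(within_limits("cook temperature (deg C)", 75.0))  # True
print(within_limits("cold storage (deg C)", 6.5))       # False -> corrective action
```

A real HACCP plan would also log each measurement and trigger a documented corrective action on any out-of-limit reading, since record keeping and verification are among the seven principles.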
Figure 3
Where to Start
Risk-based compliance can analyze computer systems and information-handling processes to assess not only risk, but also the cost of converting paper-based information to an electronic format. A good place to start is to perform a system assessment and then plot the various systems and processes on a simple X-Y matrix that measures, from low to high, the risk to security of the data (x axis) and the cost of remediation (y axis). Systems and processes needing upgrades or replacement are prioritized according to where they fall in the matrix. Computer systems, for example, that fall in the "high data security risk, low conversion cost" area of the matrix could be targeted first for compliance validation.
To address Y2K issues, many organizations already have generated an inventory of all their computer systems and evaluated them to determine the potential risk in the event of a computer error or failure. Companies with cost considerations and many noncompliant computer systems must, of course, prioritize which ones to remediate first. To tackle this problem sensibly, organizations must estimate the data-security risk for each system and the cost of Part 11 validation, and then plot that system on the matrix. Systems and processes falling into the high-risk category should get top remediation priority.
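The X-Y prioritization described above amounts to scoring each system on two axes and sorting the remediation queue so that high-risk, low-cost systems come first. The sketch below shows one way to do this; the system names and scores are hypothetical.

```python
# Sketch of the X-Y prioritization matrix: classify each system by
# data-security risk (x) and remediation cost (y), then remediate
# "high risk, low cost" systems first. Systems and scores (on a
# 0.0-1.0 scale) are hypothetical examples.

systems = {
    "chromatography data system": {"risk": 0.9, "cost": 0.3},
    "CAPA spreadsheet":           {"risk": 0.2, "cost": 0.1},
    "clinical data manager":      {"risk": 0.8, "cost": 0.8},
}

def quadrant(risk: float, cost: float, cut: float = 0.5) -> str:
    """Place a system in one of the four matrix quadrants."""
    r = "high risk" if risk >= cut else "low risk"
    c = "high cost" if cost >= cut else "low cost"
    return f"{r}, {c}"

# Sort: highest risk first, then lowest cost, so high-risk/low-cost
# systems land at the top of the remediation queue.
queue = sorted(systems, key=lambda s: (-systems[s]["risk"], systems[s]["cost"]))
for name in queue:
    print(name, "->", quadrant(**systems[name]))
```

The two-key sort encodes the article's rule of thumb directly: risk dominates, and cost breaks ties among systems of comparable risk.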
The following is a typical criticality assessment (high to low) for nonclinical laboratory systems (8):
Conclusions
21 CFR Part 11 is not going away, and the FDA intends to enforce it. What has changed recently is the adoption of a narrower scope for the rule, a new understanding of agency enforcement discretion, and the application of a risk-based approach to compliance. The important thing to remember about choosing a risk-assessment protocol or methodology for Part 11 remediation is to use basic common sense. All of the methodologies mentioned in this article share a premise founded on common sense: analyze your processes, identify where the greatest potential for risk lies (to product quality and ultimately to public safety), put in place ways to mitigate those risks, and document the entire endeavor. Whether adopting a standard risk-assessment methodology like FMEA or HACCP or using some other methodology, the FDA will exercise enforcement discretion if there is a well-documented plan in place and true progress is being made in carrying it out.
Victoria Lander is the Corporate Compliance Manager at Waters Corp., Milford, Massachusetts. She is responsible for marketing and market development for scientific corporate compliance initiatives and products focused on the pharmaceutical industry. Victoria received the B.S. degree in Biology and the B.A. degree in Psychology from Purdue University.
Michael E. Swartz "Validation Viewpoint" Co-Editor Michael E. Swartz is a Principal Scientist at Waters Corp., Milford, Massachusetts, and a member of LCGC's editorial advisory board.
Michael E. Swartz
Ira S. Krull "Validation Viewpoint" Co-Editor Ira S. Krull is an Associate Professor of chemistry at Northeastern University, Boston, Massachusetts, and a member of LCGC's editorial advisory board.
Ira S. Krull
The columnists regret that time constraints prevent them from responding to individual reader queries. However, readers are welcome to submit specific questions and problems, which the columnists may address in future columns. Direct correspondence about this column to "Validation Viewpoint," LCGC, Woodbridge Corporate Plaza, 485 Route 1 South, Building F, First Floor, Iselin, NJ 08830, e-mail lcgcedit@lcgcmag.com.
References
(1) Guidance for Industry: Part 11, Electronic Records; Electronic Signatures — Scope and Application (FDA, 2003) (http://www.fda.gov/cder/guidance/5505dft.pdf).
(2) Guidance for Industry: Part 11, Electronic Records; Electronic Signatures — Scope and Application (FDA, 2003) (http://www.fda.gov/cder/guidance/5505dft.pdf).
(3) ISO/IEC Guide 51:1999: Safety aspects — Guidelines for their inclusion in standards. International Organization for Standardization (http://www.iso.org/).
(4) ISPE White Paper: International Society of Pharmaceutical Engineers: Risk-Based Approach to 21 CFR Part 11 (2003) (http://www.21cfrpart11.com/pages/library/index.htm).
(5) FDA Quality System Regulation (QSR): 21 CFR Part 820 (Revised as of April 1, 2006) (http://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/CFRSearch.cfm?CFRPart=820&showFR=1).
(6) General Principles of Software Validation; Final Guidance for Industry and FDA Staff (FDA, Center for Devices and Radiological Health, Center for Biologics Evaluation and Research, 2002) (http://www.fda.gov/cdrh/comp/guidance/938.html).
(7) E.H. Conrow, Effective Risk Management: Some Keys to Success (AIAA Press, Reston, Virginia, 2003).
(8) B. DePompa Reimers, "Easing the Pain of Part 11," Bio-IT World, April 2003 (http://www.bioitworld.com/archive/041503/strategic_pain.html).