There’s a lot of interest in moving data to the cloud for security, centralization, and collaboration. Doing so, however, has implications for data integrity and audits. To understand more, we spoke to Monica Cahilly, the president and founder of Green Mountain Quality Assurance, in a podcast interview. Cahilly has more than 25 years of consulting experience, with a specialized interest in and enthusiasm for data integrity assurance and data governance. A short excerpt of that interview is presented here.
Monica Cahilly: The cloud is an exciting area. It helps lower cost, for example, for smaller organizations. Although it is an exciting area, I think what we have to be mindful of is that the cloud, in some instances, still poses some significant risks. I think we have to really understand the implications of moving into that environment, and then make sure that an audit team knows what those implications are, so that they can correctly qualify these environments, and then continuously monitor them at an appropriate frequency.
Cloud providers are, in fact, performing GXP-regulated work. What’s important for us to recognize is that these vendors may be inspected by authorities under certain circumstances, such as when a significant problem occurs in their GXP work that may impact patient safety, product quality, or the integrity of critical data. If there’s a problem that occurs because of something that is done by a vendor or one of their employees, then we obviously would be accountable for that as the contract giver. So we really have to understand that this is a contract acceptor–contract giver relationship and requires that the partnership be an active one. And we have to ensure that we are, in fact, doing adequate data monitoring.
When we think about 21 CFR Part 11, we have to remember that it was written back in the 1990s, well before cloud environments became what they are now. There was already some early virtualization in that timeframe, but it has grown so much since then. Now we’ve got many cloud opportunities.
21 CFR Part 11 says that when you place your data in the hands of a third party, you take on additional risk. So, in the context of Part 11, that system would be referred to as an open system, as opposed to a closed system (one where you’re managing your data under your own governance structure within your company). So, if the third party is a cloud provider, that is, in fact, an open system.
The text of 21 CFR Part 11 is not explicit about what to do with open systems in a cloud environment. There are some suggestions; for example, we can encrypt the data in transit. We might also consider encrypting the data itself if we have confidentiality, security, data privacy, or intellectual property concerns to be mindful of. Either way, we have to remember that we carry additional risks in these environments, and the audit team needs to understand what those risks are and how to manage them. It is not just about quality systems and verifying that the cloud provider has a documented quality system framework that they follow; it also requires including a cybersecurity specialist on our audits, so that a designated person can assess how the cloud provider is managing the more challenging current risks and the cybersecurity. It also requires that we institute data monitoring and data analytics.
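The encryption point above can be sketched in code. This is an illustrative example only, not any cloud provider's API: it assumes the widely used third-party Python `cryptography` package, whose Fernet recipe provides authenticated symmetric encryption, to show client-side encryption of a record before it leaves your own governance boundary.

```python
# Illustrative sketch: encrypt a record client-side before uploading it to
# an open (third-party) system, using the "cryptography" package's Fernet
# recipe. The record content and field names here are invented examples.
from cryptography.fernet import Fernet

# In practice the key would be held in a key-management service under your
# own control, never hard-coded or stored alongside the ciphertext.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"batch=42; result=PASS; analyst=jdoe"
token = cipher.encrypt(record)          # ciphertext is safe to transmit/store
assert cipher.decrypt(token) == record  # round-trip check on retrieval
```

Because only the key holder can decrypt, the cloud provider stores ciphertext it cannot read, which addresses the confidentiality and intellectual-property concerns mentioned above.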
One thing I think is important to remember is that in a cloud environment, on the user side, you are typically only going to see the audit trails that are visible through the front end, if this is software as a service. But even though you can only see the audit trails visible to that front-end, or application, environment, you are also responsible for any activities taking place in the back-end environments. For example, activity in the infrastructure environment may have an impact on the integrity of the data within the system, yet we may not be able to see those logs through the front end. How, then, do we get at that information so that we can really govern the risks that are potentially coming to bear on the data in these platforms? There are a lot of risks to be mindful of, and we should not move into these environments until we have a proper governance program in place for them. However, if we do have a proper governance program in place, and a really good data analytics, data monitoring, and data governance team, the opportunities available through the cloud are really exciting.
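One way the data-monitoring idea above can be made concrete is with a tamper-evidence check on exported audit trails. This is a minimal sketch under stated assumptions, not any vendor's feature: the entry strings are invented, and it simply links each exported audit-trail entry to its predecessor with a running SHA-256 hash, so that a later edit to any earlier entry changes the chain value and is detectable at the next monitoring interval.

```python
# Illustrative sketch: a hash chain over exported audit-trail entries.
# Re-exporting the trail later and recomputing the chain detects whether
# any past entry was altered or deleted.
import hashlib

def chain(entries):
    """Fold each entry into a running SHA-256 digest; return it as hex."""
    digest = b""
    for entry in entries:
        digest = hashlib.sha256(digest + entry.encode()).digest()
    return digest.hex()

trail = [
    "2024-10-01T09:00 jdoe CREATE sample-17",
    "2024-10-01T09:05 jdoe UPDATE sample-17 result=PASS",
]
baseline = chain(trail)  # recorded at export time, kept under your control

# A later export where a past entry was edited no longer matches:
tampered = [trail[0], "2024-10-01T09:05 jdoe UPDATE sample-17 result=FAIL"]
assert chain(tampered) != baseline
```

A check like this only covers what the front end exports; as noted above, back-end infrastructure logs still need to be obtained from the provider through the contractual relationship.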
This interview has been lightly edited for style and space.