There’s a lot of interest in moving data to the cloud for security, centralization, and collaboration. Doing so, however, has implications for data integrity and audits. To understand more, we spoke to Monica Cahilly, the president and founder of Green Mountain Quality Assurance, in a podcast interview. Cahilly has more than 25 years of consulting experience, with specialized interest and enthusiasm for data integrity assurance and data governance. A short excerpt of that interview is presented here.
Monica Cahilly: The cloud is an exciting area. It helps lower costs, for example, for smaller organizations. At the same time, I think we have to be mindful that the cloud, in some instances, still poses significant risks. We have to really understand the implications of moving into that environment, and then make sure that an audit team knows what those implications are, so that they can correctly qualify these environments and then continuously monitor them at an appropriate frequency.
Cloud providers are, in fact, performing GXP-regulated work. What’s important for us to recognize is that these vendors may be inspected by authorities under certain circumstances, such as when a significant problem occurs in their GXP work that may impact patient safety, product quality, or the integrity of critical data. If a problem occurs because of something done by a vendor or one of their employees, then we, as the contract giver, are accountable for it. So we really have to understand that this is a contract acceptor–contract giver relationship and that the partnership must be an active one. We also have to ensure that we are, in fact, doing adequate data monitoring.
When we think about 21 CFR Part 11, we have to remember that it was written back in the 1990s, well before cloud environments became what they are now. There was already some early virtualization in that timeframe, but the technology has grown so much since then, and now we have many cloud opportunities.
21 CFR Part 11 says that when you place your data in the hands of a third party, you have additional risk. In the context of Part 11, that system would be referred to as an open system, as opposed to a closed system (one where you are managing your data under your own governance structure within your company). So, if the third party is a cloud provider, that is, in fact, an open system.
The text of 21 CFR Part 11 is not explicit on what to do with regard to open systems in a cloud environment. There are some suggestions; for example, we can encrypt the data in transit. We might also consider encrypting the data itself if we have confidentiality, security, data privacy, or intellectual property concerns to be mindful of. The key point is that we carry additional risks in these environments, and the audit team needs to understand what those risks are and how to manage them. It is not just about quality systems and verifying that the cloud provider has a documented quality system framework that they follow; we also need to include a cybersecurity specialist on our audits, so that a designated person can assess how the cloud provider is managing the more challenging cybersecurity risks. And it requires that we institute data monitoring and data analytics.
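To make the encryption suggestion concrete, here is a minimal sketch in Python of encrypting a record before it leaves a closed environment, using the cryptography library’s Fernet recipe (authenticated symmetric encryption). The record fields and key handling shown are illustrative assumptions, not part of Part 11 or the interview; in practice, key management would fall under your own governance controls.

```python
from cryptography.fernet import Fernet

# Generate a symmetric key once and keep it under your own key-management
# controls; never store it alongside the encrypted data in the cloud.
key = Fernet.generate_key()
cipher = Fernet(key)

# Illustrative GXP record (hypothetical fields) serialized as bytes.
record = b'{"sample_id": "S-1042", "result": 98.7, "analyst": "jdoe"}'

# Encrypt before transmission or cloud storage; Fernet also authenticates,
# so tampering with the ciphertext is detected on decryption.
token = cipher.encrypt(record)

# On retrieval, decrypt and confirm the record round-trips intact.
assert cipher.decrypt(token) == record
```

Transport encryption (TLS) would still apply for data in transit; this sketch addresses only the payload itself.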
One thing that is important to remember is that in a cloud environment, on the user side, you are typically only going to see the audit trails that are visible through the front end, if this is software as a service (SaaS). But even though you can only see the audit trails visible in that front-end or application environment, you are also responsible for any activities taking place in the back-end environments. Activities in the infrastructure environment, for example, may have an impact on the integrity of the data within that system, and we may not be able to see those logs through the front end. How, then, do we get at that information so that we can really govern the risks that are potentially coming to bear on the data in these platforms? There are a lot of risks to be mindful of, and we want to make sure that we do not move into these environments until we have a proper governance program in place for those risks. However, if we have a proper governance program in place, and a really good data analytics, data monitoring, and data governance team, the opportunities available through the cloud are really exciting.
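As one illustration of the data monitoring Cahilly describes, a review team can routinely screen exported audit-trail entries for signs that records are missing or out of order. The sketch below is hypothetical, assuming the provider exposes an export with sequence numbers and timestamps; the field names and entries are invented for illustration.

```python
from datetime import datetime

# Hypothetical audit-trail export; in practice this would come from the
# provider's front end or a negotiated back-end log feed.
entries = [
    {"seq": 1, "ts": "2024-01-10T09:00:00", "user": "jdoe", "action": "create"},
    {"seq": 2, "ts": "2024-01-10T09:05:00", "user": "jdoe", "action": "modify"},
    {"seq": 4, "ts": "2024-01-10T09:02:00", "user": "asmith", "action": "delete"},
]

def flag_anomalies(entries):
    """Flag sequence gaps and out-of-order timestamps for human review."""
    flags = []
    prev_seq, prev_ts = None, None
    for e in sorted(entries, key=lambda e: e["seq"]):
        ts = datetime.fromisoformat(e["ts"])
        if prev_seq is not None and e["seq"] != prev_seq + 1:
            flags.append(f"gap before seq {e['seq']}: possible missing entries")
        if prev_ts is not None and ts < prev_ts:
            flags.append(f"seq {e['seq']} timestamped earlier than seq {prev_seq}")
        prev_seq, prev_ts = e["seq"], ts
    return flags

for flag in flag_anomalies(entries):
    print(flag)
```

Flags like these do not prove a data integrity problem; they identify entries that warrant human review against the provider’s back-end logs.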
This interview has been lightly edited for style and space.