LCGC Europe
Where the wizard stands behind the curtain working levers.
Drug discovery is generally thought of as the creation of a compound that possesses the potential to become a useful therapeutic. Successful discovery rests on a number of elements, one of which is early metabolite characterization of a novel chemical entity (NCE), which we addressed in the last instalment of this series. Another element that supports discovery is analytical instrumentation and its ability to identify the NCE and purify it for storage and distribution downstream. This month's instalment of "MS in Practice" focuses on this latter aspect of discovery.
Today, pharmaceutical companies such as Pfizer (Groton, Connecticut, USA) and biotechnology companies such as Neurogen (Branford, Connecticut, USA) work towards a common goal: developing and maintaining proprietary libraries of registered compounds in standardized format for distribution and use. As Michele Kelly, Associate Director of analytical chemistry, Pfizer, told me, "We see ourselves as the stewards of Pfizer's compound library. We are involved at the beginning, when the compound is first registered, and we are responsible for 10 years or more — whenever the compound is distributed for use — for its purity, identity, concentration and suitability."
The instrumental need imposed by pharmaceutical invention and development is much the same for any company, regardless of its size. How the need is met, however, varies. It depends upon available resources, the evolution of instrument technologies and the abilities of those who operate the instruments. Over little more than a decade, a number of instrumental practices have changed significantly. Particularly in liquid chromatography–mass spectrometry (LC–MS), dramatic improvements in design and software have induced large shifts in pharmaceutical practices.
The proliferation of MS as a tool for characterizing the earlier steps in drug discovery is, perhaps, the largest motivator of change in the pharmaceutical industry. The growth of automated synthesis, characterization and archiving led to a demand for increasingly reliable instrumentation. Once available, this instrumentation produced greater confidence in results, which in turn became the basis for setting expectations and creating standard protocols.
The advantages of automated synthesis are not restricted to analytical chemists. They also extend to medicinal chemists engaged in invention and to biologists, who rely on sample integrity as the basis for proper determinations of activity. Yet challenges remain for the stewards of library quality. These are illustrated in Figure 1, which shows the variety of contributions that a company must manage, regardless of source, to maintain a library effectively over the long term.
Figure 1: Contributions to the compound file: quality and a standard format in the compound library are difficult to realize unless the diverse contributing parties can meet common goals.
Some far-sighted pharmaceutical companies such as Pfizer engaged instrument manufacturers, notably those with more diverse capabilities, to create an interface by which non-specialists could submit samples. Behind the scenes (a situation that reminds me of the movie The Wizard of Oz, in which the wizard stands behind a curtain working levers), highly trained mass spectrometrists monitored the systems' performance and calibration and diagnosed malfunctions as they arose. Frank Pullen1,2 (Pfizer, Sandwich, UK) is credited with the earliest published work on automated, open-access configurations. In it, he relates his experience using MS software, which he adapted for use with diverse LC and GC inlets.
The early non-specialist systems were simple flow-injection, isocratic systems: essentially an LC inlet connected to a single quadrupole mass spectrometer employing, predominantly, electrospray ionization. A submitter would log a sample identification and then leave the sample vial in a tray. Later, he or she would return for a spectrum, which appeared in the printer outbox. In short order, submitters were offered a few limited, general-purpose gradients. Automated e-mailing of results followed as the software became increasingly attuned to catching likely errors. Empty bottles or incorrectly concentrated samples would skew a result wildly, and although little could be done to correct such problems, the software could flag the affected result as an outlier. As LC–MS usage increased, sample backlogs reminiscent of traffic jams became notorious disrupters. Resolving the backlogs, however, was straightforward: one needed only to arrange a more suitable combination of instruments, location, capacity and uptime.
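To give a flavour of the error-catching involved, here is a minimal sketch (written in Python purely for illustration) of the kind of sanity check such open-access software might apply to a flow-injection result. The thresholds, field names and function are my own assumptions, not any vendor's actual code.

```python
# A minimal sketch (not any vendor's actual software) of the kind of
# automated sanity check an open-access system might apply to a result.
# Thresholds and field names here are illustrative assumptions.

def flag_result(total_ion_current, expected_mz, observed_mzs,
                tolerance=0.5, tic_floor=1e4):
    """Return a list of warning flags for one flow-injection result."""
    flags = []
    # An essentially empty vial gives almost no signal.
    if total_ion_current < tic_floor:
        flags.append("LOW_SIGNAL: possible empty or dilute sample")
    # The expected [M+H]+ ion should appear within a mass tolerance.
    if not any(abs(mz - expected_mz) <= tolerance for mz in observed_mzs):
        flags.append("TARGET_NOT_FOUND: expected m/z absent")
    return flags

# Example: the submitter expects m/z 432.2 but the vial was nearly empty.
print(flag_result(total_ion_current=3.2e3,
                  expected_mz=432.2,
                  observed_mzs=[391.3, 432.2, 455.1]))
```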
Thanks to the now-obvious practicality of walk-up analysis, not to mention the cost savings inherent in instrument sharing, the open-access model for synthetic chemistry analysis is now de rigueur for all pharmaceutical companies. Today, dedicated automated systems routinely purify and analyse samples and manage archives. Unattended submission stations stand ready to accommodate ever more esoteric needs. For example, a direct probe for electron ionization is used for compounds that do not respond to electrospray ionization (ESI) or atmospheric-pressure chemical ionization (APCI).3
At Pfizer, after a decade of use, the open-access system remains the basis for simple, immediate confirmation of compound identity. The sole criterion for its use, says Kelly, is "the necessary turnaround time that allows the chemists to continue their work." At its Groton, Connecticut, USA research site, Pfizer maintains 20 open-access systems. And though the company periodically reviews whether the systems are satisfying chemists' needs, few changes have been indicated, given the systems' reliability and throughput at peak usage.
The practical aspects of open-access design and use are indeed important. Chemists who need answers are not apt to be impressed by a system's added capability unless using it is as easy as performing thin-layer chromatography. I highly recommend two publications on this topic. One is by Arthur Coddington,4 and the other is by Lawrence Mallis.5 The authors expand on column choices and the benefits to be gained from proper site selection. They also illustrate details such as how to cut PEEK tubing correctly, and they thoroughly explain the advantages of doing so.
Increasing sophistication of automated capabilities has removed the problem of establishing the identity of newly synthesized compounds from the forefront. In doing so, however, it has spawned new imperatives: standardizing data input and output and managing data for subsequent interpretation and distribution. Hence, the recent and visible rise of a new branch of data science: informatics. Entire companies such as NuGenesis (now part of Waters Corporation, Milford, Massachusetts, USA) were formed to respond to the new need, and software such as AutoLynx, a MassLynx application manager (Waters), was invented.
At Neurogen, Mark Kershaw is charged with devising analytical support for traditional medicinal chemists and for the High Speed Synthesis (HSS) team. The HSS team uses parallel solution-phase syntheses to support discovery projects through rapid creation of chemical libraries. Key to achieving objectives is integrating novel workstations with the informatics platform to reduce resource demands and permit efficient movement of samples and data. Thus, the ability to integrate with the established platform is a criterion for any methodology or technology that Kershaw's team adopts. In its characterization process, Neurogen uses MassLynx software (Micromass), which gives third-party programmers access to various input–output data and control "hooks."
Jeffrey Noonan, associate director of the HSS group, describes his company as technologically adept, a "mobile entrepreneurial epicentre of technology development." Neurogen's success in devising innovative ways to support drug discovery backs that claim: just seven chemists can synthesize, purify, quantify and characterize more than 100000 compounds a year.
As with the original open-access systems, LC–MS data are viewed at the scientist's desktop via intranet web pages. Neurogen's platform supports QC functions: batch testing of incoming raw materials, tracking of library samples through the HSS workflow (including MS-directed purification), fraction management, and re-analysis (when required). Having met QC standards, established library compounds are distributed for use in biology screening and pharmaceutics profiling.
Table 1: Featured Scientists
At Neurogen, purity is achieved primarily via automated parallel solid-phase extraction (SPE), which can process 352 library compounds in a 3 h unattended run. Combining liquid–liquid extraction with appropriate SPE media (silica or SCX) provides an effective means of removing common library impurities and reagents. Postpurification, LC–MS purity assessments are based upon automated evaluation of chromatographic peak and spectral data. Rules for purity have been developed, including such aspects as isomer detection and identification of starting materials and reagents. Suspect samples are flagged for MS-directed purification when the more cost-effective SPE methods fail, or about 10% of the time.6
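The sketch below illustrates, in simplified form, how such a rules-based purity check might be encoded. The purity threshold, mass tolerance and data layout are assumptions made for illustration and are not Neurogen's actual criteria.

```python
# A simplified sketch of a rules-based purity assessment of the sort
# described above; the threshold, data structures and rule set are
# illustrative assumptions, not Neurogen's actual criteria.

def assess_purity(peaks, target_mass, purity_threshold=85.0, mz_tol=0.5):
    """peaks: list of dicts with 'area' (UV peak area) and 'mz' values."""
    total_area = sum(p["area"] for p in peaks)
    target_peaks = [p for p in peaks
                    if any(abs(mz - target_mass) <= mz_tol for mz in p["mz"])]
    if not target_peaks or total_area == 0:
        return {"purity": 0.0, "action": "flag for MS-directed purification"}
    purity = 100.0 * sum(p["area"] for p in target_peaks) / total_area
    action = ("accept" if purity >= purity_threshold
              else "flag for MS-directed purification")
    return {"purity": round(purity, 1), "action": action}

# Example: the target (m/z 301.1) accounts for most of the UV area.
peaks = [{"area": 920.0, "mz": [301.1]},
         {"area": 60.0,  "mz": [279.0]},   # residual starting material
         {"area": 20.0,  "mz": [155.2]}]   # reagent by-product
print(assess_purity(peaks, target_mass=301.1))
```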
Few companies of any size would jettison gainfully employed equipment in favour of the next new thing on the market. Faced a few years ago with the need to characterize an increasingly large number of library samples, Kershaw could have selected either a multiplexed LC–MS inlet system or several redundant LC–MS systems. He opted for the latter, installing two readily available time-of-flight (TOF) instruments (Waters LCTs) coupled with monolithic columns (50 mm × 4.6 mm Merck Chromolith), which could in relatively short order process the 100000-plus samples in the library. High flow-rates and short reverse gradients deliver 2 min run times that result in peaks 1–2 s wide. Unlike older quadrupoles, which could not scan fast enough for such narrow peaks, TOF instruments acquire full spectra rapidly and are effective because the target mass is well characterized. Very clean samples at low back pressure also gave column lifetimes of 20000 injections.
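A rough calculation makes the point about peak widths and throughput. The scan times below are illustrative assumptions rather than instrument specifications.

```python
# A back-of-envelope sketch of why scan speed matters for 1-2 s wide peaks
# and what 2 min run times imply for a 100000-compound library. The scan
# times below are illustrative assumptions, not instrument specifications.

peak_width_s = 1.5        # typical peak width from the fast gradient
quad_scan_s = 1.0         # assumed full-range scan time, older quadrupole
tof_scan_s = 0.1          # assumed spectrum acquisition time, TOF

print("points across peak (quad):", peak_width_s / quad_scan_s)  # ~1-2
print("points across peak (TOF): ", peak_width_s / tof_scan_s)   # ~15

run_time_min = 2.0
samples = 100_000
instrument_days = samples * run_time_min / 60 / 24
print("instrument-days for the library:", round(instrument_days, 1))  # ~139
```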
High-throughput profiling capabilities are currently being developed. Although pharmaceutics characterizations will continue to be performed in the future as they are today, broad early indications of parameters such as solubility are a useful feature of web-based informatics systems. When the ACQUITY ultrahigh-pressure chromatography system (Waters) was introduced last year, Kershaw obtained one coupled with a Waters ZQ mass spectrometer, a quadrupole instrument that can scan at up to 5000 amu/s. Adding the new system brought gains in specificity, spectral clarity and sensitivity while not deviating from the established workflow design.
Pfizer Central Research has developed an enviable track record for automation and production. One obvious and extensive change in its practice is the harmonization of methods over a number of worldwide research sites, including the company's contract contributors. Not long ago, dry samples were standard in many libraries, and they required unique stability testing and manipulation for distribution. So much handling invited human error, which manifested itself in subsequent screens and elsewhere. Today, all samples are maintained in dimethyl sulphoxide stock solutions, and they satisfy criteria for identity, purity (as determined by UV and evaporative light-scattering detection [ELSD]), and concentration.
Still, the concern that we must learn more about concentration is legitimate. The measurement associated with a compound's registration (that is, when it enters the library at "time 0") is critical to making accurate subsequent assessments. After "time 0" determinations and storage in dimethyl sulphoxide, changes in concentration affect calculations and the results of the screens used to determine the value of the compound as a potential therapeutic. Drug discovery has adopted dimethyl sulphoxide as its reagent of choice: although it is hygroscopic, its viscosity and other properties make it well suited to storing and dispensing NCEs. And yet, despite being almost a universal solvent, dimethyl sulphoxide does not work well for some products. For instance, drying some less soluble compounds can leave a difficult-to-redissolve pellet in the vessel. Storage, usage, freeze-and-thaw cycles (with their water-absorption issues) and various computational implications are all subjects of an ongoing debate reported by the Laboratory Robotics Interest Group (www.lab-robotics.org).
Because standards for both open access and purification are now in place, Kelly's primary interest today is improving concentration assessments and better understanding end-use effects. Of particular concern is how samples are affected by their handling. Does the person accepting the sample store it at room temperature during use, or refrigerate it? What are the benefits of single-use containers versus multiple-use containers? At Pfizer, initial (first-registration) measurements of purity and suitability are made using UV and ELSD. But as in all areas of practice where only limited information on detector response is available (for example, mass-balance estimations for assessing degradation of a compound), no truly universal method exists. However, the industry is eyeing two contenders for a universal method: ELSD and chemiluminescent nitrogen detection (CLND).
Gary Schulte (Pfizer), whom I encountered at last August's CoSMoS conference (www.cosmoscience.org), has monitored the ELSD–CLND debate for years. Here is what he had to say about it:
Quantification of unknowns without specific standards and their calibration curves is a hot topic on the discovery side of most pharmaceutical companies at the moment. It pits ELSD against CLND methods in the LC–SFC world, and these methods against NMR methods. The reason is that many folks are trying to read the concentration of their screening samples in a high-throughput mode.
At their current state of development, both ELSD and CLND instruments, although integrated in automated, unattended systems, are in need of improvement.
Pfizer currently favours the ELSD method, which, though more broadly useful, can incur a 20–30% quantitative error. Other means of adjusting for concentration errors have been reported. Popa-Burke uses the scatter of results (likely caused by concentration effects) from initial returns on IC50 screens to back-correct CLND results.7 Published work on ELSD using standards shows an average error of 8.3%.8 Popa-Burke's application uses no standards but adds the step of back-correcting the initial results. Anecdotal evidence suggests that either detector, in the hands of an experienced practitioner, is a capable device. But neither has established a clear reputation for ruggedness in use, nor can either claim an extensive comparative base with more common detection methods such as UV and MS.
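For readers unused to thinking in these terms, a short worked example shows what a 20–30% quantitative error looks like in practice. The concentration values are invented for illustration only.

```python
# Relative error of detector-based concentration estimates against the
# nominal (intended) concentrations. All numbers are made up to illustrate
# the magnitude of error discussed above; they are not measured data.

nominal = [10.0, 10.0, 10.0, 10.0]      # mM, intended concentrations
estimated = [7.5, 12.8, 9.1, 11.9]      # mM, detector-based estimates

errors = [abs(e - n) / n * 100 for n, e in zip(nominal, estimated)]
print("per-sample % error:", [round(x, 1) for x in errors])  # 25, 28, 9, 19
print("mean % error:", round(sum(errors) / len(errors), 1))  # ~20
```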
The drug discovery concept has changed significantly in less than a decade. In a hybridization of the traditional practice, project-oriented medicinal chemists who rely on automated processes can still purify their products via flash chromatography, although automated support processes play a much larger role today. Automation itself has matured into a more homogeneous, extensively used tool, whereas its successful use a decade ago was restricted to but a few isolated roles. The open-access concept, alive and thriving, has become a highly useful means of characterizing NCEs early in the discovery process. Subsequent tiers of quality control analysis offer a view of sample integrity at the baseline of initial dissolution, a baseline that will be monitored over the sample's lifetime.
"MS in Practice" editor Michael P. Balogh is principal scientist, LC–MS technology development at Waters Corp. (Milford, Massachusetts, USA); an adjunct professor and visiting scientist at Roger Williams University (Bristol, Rhode Island, USA); and a member of LCGC Europe's Editorial Advisory Board. Direct correspondence about this column to "MS in Practice", LCGC Europe, Advanstar House, Park West, Sealand Road, Chester CH1 4RN, UK, or e-mail: dhills@advanstar.com
1. D.V. Bowen et al., Rapid Commun. Mass Spectrom., 8(8), 632–636 (1994).
2. F.S. Pullen et al., J. Am. Soc. Mass Spectrom., 6(5), 394–399 (1995).
3. L.O. Hargiss et al., http://www.sisweb.com/referenc/applnote/app-90.htm, Novartis Pharmaceuticals, Summit, New Jersey, USA, Sept. 2000.
4. A. Coddington, J. Van Antwerp and H. Ramjit, J. Liq. Chromatogr. Rel. Tech., 26(17), 2839–2859 (2003).
5. L.M. Mallis et al., J. Mass Spectrom., 23(9), 889–896 (2002).
6. J.W. Noonan et al., J. Assoc. Lab. Autom., 8, 65–71 (2003).
7. I.G. Popa-Burke et al., Anal. Chem., 76(24), 7278–7287 (2004).
8. B.T. Mathews et al., Chromatographia, 60 (11/12), 625–633 (2004).