An abbreviated version of this article appeared in the May 2021 issue of BioPharm International.
Cynthia A. Challener, PhD, is a contributing editor to BioPharm International.
Mass spectrometry and automation are growing in importance for protein characterization, but further improvements are still needed.
While monoclonal antibodies (mAbs) still dominate the biologics drug market, many novel therapeutic modalities are in the biopharma pipeline, with some already commercialized. Many of these next-generation therapeutics present new requirements with respect to analytics. Accelerated approval designations create further challenges with respect to shortened development timelines.
Protein characterization is an evolving and critical application for the biopharma industry that enables the evaluation of critical quality attributes (CQAs) such as identity, potency, purity, and stability, according to Anis H. Khimani, senior strategy and product portfolio leader for life sciences at PerkinElmer. “Analytical technology platforms that can help biopharma companies deliver on CQAs and create reproducible and robust methods of development and validation are important,” he states.
Proteins are large and complex molecules that require methods with high sensitivity, accuracy, and specificity, adds Patrick Tishmack, general manager of AMRI’s West Lafayette, Ind., facility. The characteristics of such large molecules challenge the capabilities of most analytical methods, which then push the limits of available technology. “Primary, secondary, and tertiary structure, as well as purity and potency, are critical performance characteristics to be determined for protein biologics. Aggregation and immunogenicity are also critical issues to address. Protein impurities that are similar to the molecule of interest require techniques with sufficient selectivity to differentiate and separate them. Small-molecule impurities require techniques that maintain reliability over a wide molecular weight range,” he says.
As a result, the biopharma industry is pushed to innovate and be at the cutting edge of analytical technology development. Drug manufacturers, contract manufacturing and research organizations, instrument suppliers, and regulators must therefore work together to advance capabilities for protein analytics to ensure the rapid development of safe and effective biotherapeutics.
Biologics continue to rise in use as therapeutics, especially in cases where small molecules have not fully addressed diseases, such as oncology. This rise has, according to Trevor Jones, head of marketing for analytical chemistry at MilliporeSigma, highlighted the need for characterization and quality control methods that can be seamlessly integrated and transferred among sites.
“Speed and accuracy are the drivers,” Jones asserts. “Increasing the speed of assays to approach real-time release and increasing precision or repeatability of experiments provide greater data quality assurance. Structure-function relationship is always key to understanding performance and setting specifications.”
In addition to increasing the speed of workflows, Ulrik Lytt Rahbek, vice-president of assay, analysis, and characterisation with Novo Nordisk, notes that automating sample workup and workflows and increasing the resolution and sensitivity of analytical techniques are additional drivers.
For mAbs, platform characterization methods have been established. More complex biologics such as multi-specific antibodies introduce significant complexity, with the number of different charge variants—many of which can be undesirable from a safety and efficacy standpoint—increasing exponentially, according to Susan Darling, senior director for capillary electrophoresis (CE) and biopharma product marketing with SCIEX. “Many of the assumptions made about mAbs don’t apply to multi-specifics, and defined analytical tests are needed that provide as much detail for as many parameters as possible,” she says.
Other increasingly complex modalities include fusion proteins, bioconjugates, recombinants, nanobodies, and oligomeric structures, as well as newer approaches such as lipid nanoparticles, mRNA, and exosomes. Outside of traditional biotherapeutics, there are the gene therapies that use viral vectors comprising 70 proteins plus their encapsidated transgene, all of which require analytical characterization and development of their product quality attributes, according to Scott Berger, senior manager of biopharmaceutical markets with Waters Corporation.
“As these modalities enter the market, the challenge to assess CQAs and characterize them becomes more difficult. The challenge is to develop methods that are suited to the assessment of very high concentration drug substances, highly-glycosylated species, complex formulations/excipients, and even alternative approaches to traditional host-cell protein (HCP) coverage,” observes Todd Stone, senior manager of analytical development at Catalent.
Not only are new analytical methods required for the more diverse format of protein molecules, the chemistry, manufacturing, and controls (CMC) development cycle has decreased from approximately two years to less than six months, according to Gang Huang, senior vice-president of analytical sciences and clinical quality control for WuXi Biologics. To support CMC development within this compressed timeline, faster and higher throughput analytical methods are needed to meet such challenges. He adds that development of process analytical technologies (PAT) is essential to support continuous processing with real-time analysis solutions.
In addition, comments Berangere Tissot, director of biochem method establishment and biologics characterization with Eurofins BioPharma Product Testing, material availability is becoming an issue. “Protein characterization takes place at all steps within the drug product cycle and for many different reasons. Whether it is the first [investigational new drug] (IND)-enabling characterization or a reference-standard qualification post-commercialization, characterization data are embedded in any regulatory submission but are generally the last items to get checked off the list. The amount of material left to perform these characterization assays is therefore often limited, and the time allocated to get the results is inversely proportional to the depth of the results required,” she explains.
Novel, rapid methods are also needed in the research/discovery phase, where thousands of highly complex molecules must be quickly analyzed for various properties and assessed for developability, according to Maria Wendt, head of US large-molecule research and global head of digital biologics platform at Sanofi. “More functionality is demanded from single molecules to afford multi-targeting products. Even so, protein characterization assays for these complex molecules need a certain throughput while still providing important individual molecular information.” Melanie Fischer, group leader for assays and analytics at Sanofi, adds that “a comprehensive analytical toolbox composed of high-throughput compatible assets complemented with lower throughput high quality assays is thus key to success to react to the growing zoo of multi-specific drugs.”
The complexity of new modalities also makes their production, purification, and characterization more difficult than traditional mAbs, according to Wendt. These products, according to John Rogers, director of chemistry and biochemistry, protein and cellular analysis, Thermo Fisher Scientific, require extensive characterization for structural and chemical modifications, assessment of host cell and other contaminants, and identification of reference standards and methods. “These efforts are often limited by an inability or incompatibility of existing systems to meet the characterization needs,” he observes.
Conventional purity methods such as CE and high-performance liquid chromatography (HPLC), for instance, which are used to quantify product-related variants and impurities, have been powerful tools but suffer from relatively low resolution and insufficient sensitivity for the detection of low-abundant protein product variants, particularly in the analysis of multi-specific antibodies containing new variants such as chain-mispairing and other closely related post-translational modifications (PTMs), notes Huang.
Existing methods may also not allow characterization of proteins within complex mixtures, such as in the case of gene therapy products comprising both proteins and oligonucleotides, according to Tissot. “For this new class of products, the data sets are more complicated and there are no platform methods, controls, or reference materials in place yet. Therefore, characterization of these new products requires a different perspective on the existing toolkit, with more controls built into methods and more optimization required,” she remarks.
Drug formulations, meanwhile, are a complex mix of APIs, excipients, buffers, and, depending on the mode of delivery, adjuvants, lipid carriers, emulsifiers, and surfactants. “With each novel formulation component, new analytical methods are needed to detect and evaluate their effects on the stability, immunogenicity, efficacy, and safety of the drug,” comments Bettina Kehoe, analytical development scientist with Catalent.
In addition, for many new analytical and characterization techniques, the maturity is often at prototype level, and additional development is required to implement new methods that have demonstrated robustness for higher throughput of sample analysis, according to Rahbek. “Technology is either moving too fast or too slow,” adds Tissot. “Some assays are still using old technology that needs to be improved, such as Edman sequencing, while complex instrumentation is being introduced that only provides meaningful information in an academic research environment,” she explains.
Darling comments that the most common challenge customers are having in the development of new protein characterization methods is ensuring sufficient reproducibility and robustness. This can often mean that exponentially more effort must be invested in achieving the reliability necessary for the analysis of drugs that are injected into humans. “With the wide range of newer, highly complex and diverse modalities, that goal is becoming more critical to reach,” she says.
Furthermore, new methods may not be adequately scalable and may require unique skills, instrumentation, and software, Rogers adds. Adopting new methods requires dedication of significant time to understanding any new application theoretically as well as specific issues around successful day-to-day operation, agrees Andrew Hanneman, a scientific advisor at Charles River. The appropriate setting (quality control, process development, characterization, etc.) for new methods may also not be initially obvious. As diverse biologics and formulations move from academic laboratories toward commercialization, there is also continuous need for updated analytical methods and instrumentation, according to Kehoe.
Specific issues outlined by Jones include reagents for sample preparation, isolation/purification, and derivatization, which can cause poor reproducibility. Development of reference standards can also be difficult due to limited availability of material. Biologics with newer mechanisms of action will require novel approaches to cell-based assays. Antibody-drug conjugates and protein capsids in viral vectors present their own unique sets of requirements with respect to understanding how specific properties affect functionality.
Regardless of whether protein characterization is needed for a conventional mAb or a new therapeutic modality, choosing the appropriate set of orthogonal analytical techniques is crucial and challenging, because one or two methods are rarely sufficient to adequately characterize a biologic, according to Tishmack.
Consolidating assays carries risk too, though, says Tissot, because each individual technique for characterizing a protein—or a more complex biologic—has inherent bias that could lead to a mis-estimation or misleading assignment of quality attributes when combined with other methods. “Some consolidation can happen, but the only way to overcome the need for speed is to develop better interpretation software, because interpretation is often the most complex and lengthy part of a characterization study,” she asserts.
The need to conduct numerous orthogonal techniques to ensure full protein characterization is a challenge not only due to time and resource demand, but also from compliance and data management perspectives. “Addressing software challenges from a compliance perspective is critical to meet regulatory guidelines,” Khimani observes. He points to electronic records and signatures, as well as strong security and audit trails, as hallmarks for 21 Code of Federal Regulations (CFR) Part 11 compliance. For characterization methods that may end up as quality control assays in the absence of equally specific or equally performing quality control-friendly assays, Tissot notes that more may need to be done from a method-readiness standpoint than is usually performed on purified proteins, particularly with respect to 21 CFR Part 11 considerations.
The large amounts of data generated by the array of analytical technologies used for protein characterization, while welcome, are also presenting another layer of challenges. “Many organizations are struggling with both the volume of data and the capacity to leverage it fully in service of their drug-development goals,” Bill McDowell, group leader and head of analytics at Abzena, observes.
“We have found that while our clients ask for their data in raw form, some of them just do not have the infrastructure in place to use, manage, and store it properly. It may be that the industry needs to lend even more scrutiny to data handling and sharing practices with labs and services providers while adopting a more common base of standards and technologies to suit the current and emerging needs of biopharma’s developers,” McDowell concludes.
Innovation and its adoption are linked to confidence in the measurement, asserts Colette Quinn, marketing manager for biopharmaceuticals at Waters Corporation. One way to build that confidence, she says, is to introduce new methods at various places within the workflow. There tends, Tissot agrees, to be less opposition to new characterization techniques than to new quality control methods. “Because characterization relies on scientifically sound principles, it is up to the applicant (and its providers) to demonstrate that the method used or that the technology used is suitable for use and scientifically sound,” she notes.
Fortunately, regulators realize that protein characterization as quality control is an emerging field and, therefore, are flexible when it comes to the development of new tools and methods, according to Jones. “Drug manufacturers are often conservative in their approach to new methods because of the uncertainty it might cause in approval delays. We have found, however, that regulators are open to novel approaches so long as they are scientifically sound and meet validation and/or qualification criteria,” he comments.
The key to successful adoption of new analytical methods by regulators, according to Huang, is demonstrating that they are not only scientifically sound and robust, but also capable of generating reliable results and useful for defining control strategies. New methods should, he adds, be developed to analyze certain CQAs that are not covered by the existing panel of methods or otherwise be superior with respect to performance (accuracy, sensitivity, reliability, etc.). Ideally, says Khimani, they should also offer a broader applicability from discovery through development, including downstream process monitoring and quality assurance/quality control.
For more novel approaches such as multi-attribute monitoring (MAM), also known as quantitative peptide mapping, the emerging technologies teams within regulatory bodies often act as bridges, enabling companies to collaborate with regulators to build confidence and best practices for employing those methods, according to Berger. This approach has, he says, allowed many companies using MAM approaches to work toward deploying them beyond development and into clinical lot release.
There is also, however, Darling adds, the need to build consensus within the industry as a whole before any new method is widely adopted. First the consensus must be established between scientific experts at various biopharma companies; wider consensus can then only be built through back-and-forth with different groups within the various regulatory authorities. “Achieving critical mass is a slow process. Most companies vet a new technique or method for at least a year or two before moving it into quality control,” she says.
For regulators to accept new technologies for CMC-based characterization analysis, they need to see demonstration of equivalency or superiority of the new method for its intended purpose, Berger says. “Significant crossover studies may be required to demonstrate these outcomes, as assays in characterization not only document normal product variation, but are often employed for stability, forced degradation, and process development studies that give rise to a wider variety of product variants, often at low levels,” he comments.
Contributing factors leading to slower adoption of new analysis and characterization techniques include the need to invest in new equipment and build up necessary competencies, and the time delay for projects compared to what is possible using conventional methods, most notably due to the need to demonstrate comparability to current methods, according to Rahbek.
In addition, Darling points out that if an existing method is part of a regulatory filing for a global product, then both the old and new methods will need to be performed until all of the regulatory agencies approve the new method, which can take years. “Not surprisingly, there have to be compelling reasons and significant benefits for a drug maker to switch to a new method for a commercial product,” she says.
The resistance to adopting new analytical techniques for development candidates is often tied to the need to adhere to strict program timelines, according to Robert Vaughan, associate principal scientist in Catalent’s analytical development group. He also notes that there is additional risk associated with adopting less-established technologies given the added cost and potential for unanticipated method-specific challenges that might be encountered during development.
“Furthermore,” Vaughan adds, “adoption of any new method that requires specialized equipment would require training analysts in both process development and quality control, acquiring systems for both groups, integration into a regulatory-compliant data management system and subsequent validation. Redundancy of equipment is also important to maintaining timelines should one system go down, which can significantly increase the cost. When benchmarked against established platform methods that have a history of success within a regulatory environment (albeit lower resolution), the added time and cost can be a significant barrier.”
Rogers agrees. “Many companies already have investments in proprietary and/or established methods that would require significant commitment and investment to replace. There also may be questions regarding the validation requirements, and often there is a fundamental resistance to change that must be overcome as well,” he says.
Adding to the challenge is the fact that the benefits of new and improved methods may not initially be obvious prior to testing in one’s own laboratory, according to Hanneman. “In a fast-paced industry where experience is highly-valued, experienced developers may need to choose familiar approaches for expediency, presenting a barrier to adoption,” he observes.
It should be noted, though, that because the increasing number and diversity of new modalities is creating the need for new analytical methods, there is a growing level of incentive to adopt new techniques for these new biologics in particular. “All of these biotherapeutics would benefit from the use of orthogonal approaches that can help to reduce bias toward one method or one result,” Quinn asserts.
“Scientists tend to gravitate towards solutions with which they are most comfortable, and although data might be collected along the way, inserting a number into a cell on a spreadsheet can be a lost opportunity,” says Quinn. “Instead, the goal should be to increase confidence and look for trends that could save time when future studies are conducted. In addition to asking what has changed, it is important to consider whether that change matters. To answer both questions truly requires an orthogonal approach,” she asserts.
In addition, Quinn notes that by linking outcomes, the values from these systems can possibly become predictors of the next set of tests. As an example, she points to linking chemical outcomes like those determined via LC separation with native ion mobility or biophysical data such as differential scanning calorimetry and ultimately extending this information into interpretation of binding assays.
Berger adds that with the recent progress in International Council for Harmonisation guidelines Q12 and Q14 there is greater potential for more flexibility in analytical methods used for product analysis, which is aligning well with the growing interest in pursuing more formal analytical quality-by-design (AQbD) method development approaches. “The power of platform methods to accelerate COVID-19 molecule development and the benefits of systematically optimizing these methods is becoming very clear,” he says.
All stakeholders, such as biologics manufacturers, regulators, technology providers, and contract research organizations (CROs), need to consider the properties and stability of protein drugs. A concerted collaboration and partnership between biologics developers, technology/service providers, and regulatory agencies greatly facilitates bringing a safe and efficacious protein drug to market, says Khimani.
“Collaborations between industry, regulatory agencies, instrument companies, and standards organizations during the development of new methods [are] critical,” agrees Rogers. It is key to developing technologies and methods that are robust and fit-for-purpose, adds Berger. “Organized industry meetings bring these groups together to explore technology, methodology, and its impact on regulatory science. Other more focused collaborative organizations such as the National Institute for Innovation in Manufacturing Biopharmaceuticals and the MAM Consortium dive deeper and look to establish consensus performance expectations and best practices to bring new methods into routine use,” he explains.
The R&D/analytical development departments of biologics manufacturing companies are in the unique position to bridge established, legacy-testing methods with newer technologies emerging from academic research programs and innovative instrument companies, Kehoe asserts. “Whether it is an improved reagent, a new assay, or a more efficient instrument, these departments can show equivalency to current CQA assays, demonstrate improved test performance, or provide additional orthogonal testing options,” she says.
“Instrument companies may be looking for new applications for their existing technology that have not yet migrated to the protein biologics field, or they may partner with a biologics manufacturing company or CRO who tests biologics to develop something completely new to solve an existing problem,” Tishmack comments.
Instrument vendors, meanwhile, will often introduce brand new technologies to the entire community including regulators, building a sense of urgency, according to Rahbek. In addition, Kehoe observes that as instrument companies continue to include installation qualification and operational qualification in their analysis programs, methods developed on these instruments can be transferred readily to quality control laboratories for the release of manufactured drug products.
Most CROs, biosimilar developers, and contract manufacturers often have low inducement to develop new analytical and characterization methods because they are mostly required to follow already established methods, according to Rahbek. However, Jones notes that some CROs and contract development and manufacturing organizations (CDMOs) do pursue the development of cutting-edge methods for protein characterization as a means of differentiation and to attract pharmaceutical companies seeking outsourcing partners that specialize in specific assays.
CROs with biologics expertise will know the existing challenges and continuously look for appropriate and efficient solutions, adds Tishmack. “Many times, a CRO will advance a unique application of a common technology because of the wide variety of problems they encounter and solve for their clients,” he says.
Overall, Huang observes, instrument companies manufacture analytical instruments and periodically incorporate new technologies into these analytical tools for biologics manufacturers and CDMOs/CROs. Drug developers are willing to explore new technologies and turn them into new analytical methods. Regulators ensure these new analytical techniques can serve the intended purpose while ensuring patient safety is not compromised. If a regulatory agency finds a new method for biologics characterization acceptable in a filing for one developer, this increases the likelihood of propagation of the method for use with other biologics, Tishmack observes.
While the fact that large biopharmaceutical manufacturers set the tone for biophysical characterization helps get cutting-edge techniques publicized, recognized, and exposed to regulators, there is a downside, according to Tissot: smaller companies are sometimes left in a difficult position because not all can afford highly expensive, specialty instruments nor the time to develop an optimized method internally or at an external provider. In addition, she notes that when instrument vendors collaborate with large biopharmaceutical manufacturers, they often tailor their software to the needs of those companies. Once the software is released, smaller drug developers are sometimes forced to follow the process that the software was optimized for.
Quinn argues, though, that most of the time the molecules drive innovation in the analytical instrumentation sector. “Strong collaborations between manufacturers, CROs, and analytical instrumentation companies leads to hyper-personalization of equipment, chemistry, and methods and enables faster progression of the science,” she asserts.
“What is exciting,” claims Fabio Rossi, scientific lead at Abzena, “is the increasingly collaborative role the industry is taking collectively to shape and refine protein characterization methods and study protocols. Abzena is able to improve on its clients’ methods to achieve more efficient and effective outcomes. Instrument companies are also in a cycle of continuous improvement, and over the past five years or more we have seen a tremendous improvement in study cost efficiency, speed, and the efficacy of the data analytics,” he says.
“Protein characterization techniques,” Rossi continues, “are diverse, and so are the data each yields. But the information that can be derived from a well-integrated line of analytical enquiry linking the best of each technique will likely be more robust and valuable relative to the overall economics of drug development.”
The complexity of biologic products and the need for comprehensive characterization creates some unique challenges for analytical scientists. Despite all of the advances in mass spectrometry (MS) and other methods, there is still opportunity for further improvement.
For instance, there is a significant need for alternative approaches to HCP characterization and clearance during purification, according to Vaughan. “HCPs are a unique challenge given their complexity and diversity, and they pose a significant risk to any protein therapeutic given the potential of certain HCPs to impact product stability and immunogenicity in patients,” he says.
Currently used enzyme-linked immunosorbent assay (ELISA)-based approaches have high sensitivity against a single target, but there are thousands of HCPs present in the initial harvest, which have the potential to vary between lots. Mass spectrometry, Vaughan believes, is well-poised to fill this gap given its ability to both identify and directly quantify individual HCPs with both high sensitivity and accuracy.
Quantification of polysorbates (PS20 or PS80) in protein therapeutic formulations is another area in which advances in analytical techniques are still needed, according to Kehoe. “Regulators have recently started to increase the detail required of drug formulations, and the analytics for characterization of polysorbates is lagging behind,” she observes.
Glycosylation analysis remains a challenge as well. “Analysis of the N- and O-glycan profiles of any molecule other than mAbs is an arduous task,” Tissot remarks. “Most of the methods are either highly quantitative or highly qualitative, and there is no real happy medium where you can safely assign structures and report relative percentages in one go without major bias,” she explains. The ability to use LC-MS with electron capture dissociation/electron transfer dissociation (ECD/ETD) is an improvement, but most of the workflows and techniques remain in the academic realm to date.
The bi/multi-valent nature of multi-specifics bears the risk of self-association that can result in poor high-concentration solution behavior, according to Fischer. She believes high-throughput predictive assays to assess colloidal stability of biotherapeutics early on are a key advantage in the discovery of multispecific drugs. Similarly, the analytics of mispairing, in particular when compound numbers are high—in discovery but also in cell line development—is a challenge. Here again, Fischer notes that compatible MS approaches are important for effective analyses of samples with increasing compound numbers. “We have made some advances on these fronts at Sanofi,” she says.
Rogers also points to the need for tools and workflows for efficient protein variant screening and structure determination across a broad molecular weight range, amino acid misincorporation, characterization of intact proteins and protein complexes, characterization of low-abundance PTMs (natural or introduced by the manufacturing process), and the characterization and quantitation of specific proteoforms. “Workflows utilizing CryoEM, [ultra-high mass range] (UHMR)-MS, and complementary tools and software are of great interest to structural biology researchers in academia and industry because they offer great potential for improved screening, characterization, and understanding of protein structure and proteoforms,” he states.
The rapid emergence of viral and nanoparticle-based delivery has created the need for methods that address questions of composition (e.g., lipid profiling), as well as measures for proper assembly of these macrostructures (e.g., empty/partial/full analysis for the biotherapeutic molecule), according to Berger. The acceleration of gene and cell therapies reinforces and extends these needs.
In addition to developing capabilities in advanced MS techniques to address such challenges, Waters is working with emerging technology companies such as MegaDalton Solutions to bring new technologies like charge detection-MS (CD-MS) into the commercial realm. The company is also, Berger notes, investing significantly in new generations of HPLC and ultra-performance liquid chromatography (UPLC)/ultra-high performance liquid chromatography (UHPLC) chemistries and consumables for the next generation of size- and charge-based separations for application to new modalities, because these methods will also play a key role in routine analysis.
The possibility of automating and increasing the robustness of current manual workflows has led to expanded use of existing technologies, which has in turn led to measurable advances in protein characterization capabilities, says Rahbek, pointing to the increased use of CE techniques and MS-based analysis. The multi-attribute method (MAM), Rahbek says, is of particular note because it has the potential to replace several traditional methods with a single LC-MS-based method.
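The core logic of a MAM-style workflow can be reduced to a simple comparison: relative attribute abundances measured from an LC-MS peptide map are checked against acceptance criteria defined during method development. The sketch below illustrates only that idea; the attribute names and limits are invented for the example and do not come from any real control strategy.

```python
# Sketch of a MAM-style attribute check: measured relative abundances
# (percent) from an LC-MS peptide map are compared against acceptance
# limits set during method development. Attribute names and limits
# here are hypothetical, for illustration only.
ACCEPTANCE_LIMITS = {          # attribute -> maximum allowed % abundance
    "deamidation_N55": 3.0,
    "oxidation_M252": 5.0,
    "C-term_lysine": 10.0,
}

def check_attributes(measured):
    """Return the attributes whose measured abundance exceeds its limit."""
    return {name: value
            for name, value in measured.items()
            if value > ACCEPTANCE_LIMITS[name]}

# Hypothetical result for one lot: oxidation exceeds its limit
lot_result = {"deamidation_N55": 1.8, "oxidation_M252": 6.2,
              "C-term_lysine": 4.0}
out_of_spec = check_attributes(lot_result)
```

In a real deployment, the attribute list, limits, and new-peak detection logic would all be part of a validated method rather than a hand-written dictionary, but the single-method replacement of multiple traditional assays rests on exactly this kind of per-attribute evaluation.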
The main overall trend, says Jones, is the movement of LC-MS out of the research laboratory and into the quality control lab because of its value in biotherapeutic characterization. “We can start to contemplate having fewer assays moved into quality control that are faster and more accurate,” he states. The ability to monitor product and process variation at the intact, subunit, or peptide levels with greater specificity and sensitivity while conducting fewer assays is an objective many labs are now investigating for both development and manufacturing/quality control applications, Berger agrees.
“It comes back to the value statement of what can you discover if you can see more,” observes Quinn. “Not only is it the high resolution of new mass spectrometers but also the incorporation of ion mobility into quadrupole, time-of-flight mass spectrometers that is enabling scientists to deconvolute complex spectra and better identify peaks that were overlapping,” she explains.
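One small piece of the spectral-interpretation problem Quinn describes, charge-state deconvolution, can be shown with a minimal calculation: two adjacent multiply charged peaks from the same intact protein are enough to recover its charge state and neutral mass. This is a toy sketch under idealized conditions; real deconvolution software fits entire charge-state envelopes and resolves overlapping species.

```python
PROTON = 1.00728  # proton mass, Da

def deconvolute_pair(mz_low_charge, mz_high_charge):
    """Infer charge state and neutral mass from two adjacent
    charge-state peaks of the same protein (charges z and z+1).

    For a neutral mass M, the peak at charge z appears at
    (M + z*PROTON)/z, so setting z*(mz_low - PROTON) equal to
    (z+1)*(mz_high - PROTON) and solving gives z directly.
    """
    z = round((mz_high_charge - PROTON) / (mz_low_charge - mz_high_charge))
    mass = z * (mz_low_charge - PROTON)
    return z, mass

# Simulated peaks for a hypothetical 14,305 Da protein at z=10 and z=11
mz_10 = (14305.0 + 10 * PROTON) / 10
mz_11 = (14305.0 + 11 * PROTON) / 11
z, mass = deconvolute_pair(mz_10, mz_11)
```

The 14,305 Da mass and the charge states are invented for the example; the point is only that the spacing between adjacent peaks encodes the charge, which is why higher resolution and ion mobility help so much when envelopes from multiple species overlap.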
Important recent advances in protein characterization include intact protein analysis with ultra-high mass range (UHMR)-MS, as well as simpler and less costly instruments for functional assessment of binding activity and protein secondary structure, says Rogers.
Sensitive detection and quantitation of sub-visible viral particles and protein aggregates are improving in several areas as well, according to Hanneman. In addition to instruments capable of measuring several aspects of protein structure within a single run and MAM solutions, specific examples include improved in-vitro bioassays; analytical methods runnable under good manufacturing practices (GMPs) using fully 21 CFR Part 11-compliant instruments and software; native LC-MS of large intact proteins and glycoproteins by various HPLC methods, including ion-exchange (IEX) and size-exclusion chromatography; and alternate fragmentation approaches for MS/MS of modified peptides and branched structures, including ECD/ETD-MS and multi-stage MS.
Ion mobility-mass spectrometry (IM-MS) and hydrogen-deuterium exchange-mass spectrometry (HDX-MS) are two other MS-based techniques that should find many applications with protein biologics, according to Tishmack. He notes that IM-MS adds a unique dimension to MS because the shape of the molecule can differentiate multiple conformations, aggregation states, or small mass differences that cause heterogeneity of proteins. HDX-MS, meanwhile, is useful for studying protein-protein and protein-ligand interactions, conformational changes related to protein activity, and protein stability among other phenomena. “The detailed understanding of protein behavior that HDX-MS affords therefore has direct application in biologics drug discovery and development,” Tishmack observes.
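The primary readout of an HDX-MS experiment described above comes down to a simple calculation: the fractional deuterium uptake of a peptide, computed from centroid masses of the undeuterated control, the labeled sample, and a fully deuterated control. The sketch below uses invented masses purely to illustrate the arithmetic.

```python
def fractional_uptake(m_t, m_undeut, m_maxdeut):
    """Fractional deuterium uptake of a peptide at exchange time t,
    from centroid masses measured by HDX-MS:
      m_undeut  -- undeuterated control
      m_t       -- labeled sample at time t
      m_maxdeut -- fully deuterated control
    """
    return (m_t - m_undeut) / (m_maxdeut - m_undeut)

# Illustrative centroid masses (Da) for one peptide
uptake = fractional_uptake(m_t=1250.4, m_undeut=1248.6, m_maxdeut=1254.6)
# A buried, hydrogen-bonded region exchanges slowly (uptake near 0),
# while a solvent-exposed loop exchanges quickly (uptake near 1),
# which is how uptake differences map onto conformation and binding.
```

Comparing such uptake values between, say, free and ligand-bound states of a protein is what lets HDX-MS localize binding interfaces and conformational changes at peptide resolution.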
In Sanofi’s research lab, there is a real need to access high-throughput methods that perform at a high level even for highly complex biologics. One recent advance, noted by Wendt, is a method coupling size-exclusion chromatography with MS for the rapid characterization of tri-specific antibodies. “With this beautiful characterization capability for such complex molecules, Sanofi is placed in a much better position to deliver more complex drugs with a higher likelihood of success for addressing more challenging diseases,” she states.
“It has been difficult to date to couple really high-resolution techniques with mass spectrometry, but real progress has been made, which is giving scientists the ability to achieve deep characterization quickly,” agrees Darling. She highlights capillary isoelectric focusing (cIEF)-MS as a method providing rapid separation and identification of charge variants with resolution high enough to identify isoforms on intact proteins.
The possibility of identifying both charge and size variants at high resolution down to the PTM level for peptides is a game changer, according to Stone. “Efficiency gains from the elimination of bench-scale fractionation and sample preparation are enormous, while the increased depth of data provides a much more comprehensive view of potential molecule degradation pathways and subsequent adjustments to CQA control strategies,” he says.
In addition to such techniques with better resolution and sensitivity, Huang emphasizes the importance of the development of methods for in-vitro bioactivity analysis that better reflect the clinical effect of biologics products in humans. “Through the study of 70+ specific cellular models of pathologies, WuXi Biologics has developed proprietary technologies enabling the best linking of in-vitro cellular response and ex-vivo biomarkers to in-vivo responses for biologics. These technologies allow effective reduction of complicated clinical effects into simple, clear CQAs at the cellular or molecular levels, and are therefore becoming an integral quality tool for us,” he comments.
Considerable advances have also been made in biophysical instruments, according to Tissot. “Manufacturers are striving to produce new models that are more versatile, offer higher resolution, and are more efficient from a sample-volume-requirement standpoint,” he says. The latest analytical ultracentrifugation (AUC) instruments, for example, allow analysis using up to 20 discrete wavelengths. “This advancement has the potential to revolutionize the field of AUC by enabling more precise analysis of complex systems,” Tissot asserts.
Cryoelectron microscopy (CryoEM), adds Tishmack, is a powerful method for molecular structure determination that has certain advantages over crystallography, but the instruments have been very expensive and not readily available until recently. Today, however, a somewhat less powerful but relatively affordable CryoEM instrument is available that, he says, should result in more applications of the technique in biologics development, particularly in the early stages where structure-activity relationships must be determined.
Some of the biggest developments, Darling asserts, have not occurred with particular assays, but rather with data processing capabilities, which have allowed more rapid evaluations. McDowell agrees. “Although advanced MS techniques are attracting researchers’ attention, what is advancing the refinement of analytical techniques more recently is the introduction of increasingly smart information and data analysis and sharing systems and automation solutions that speed up things like sample prep and analysis,” he asserts.
“The refinement of both the specific techniques and their supporting technologies, as well as the refined and focused application of the methodologies themselves, has evolved to a point where the industry’s confidence in these studies is extremely high because the data adds such tremendous value,” concludes McDowell.
Although many advances have been made in new technologies such as CE and MS, there is still, Rahbek underscores, a need to develop these methods to a sufficient robustness level for a regulated setting such as a quality control lab.
Some of the biggest needs, according to Jones, are around the tools that support LC-MS, including reagents for derivatization, which is necessary for several tests such as peptide mapping and glycan analysis. Chromatography columns for specialty applications continue to be developed, as do mass spectrometry standards, including isotopically labeled standards that meet quality control and regulatory requirements.
As an example of a dramatically improved protocol, Wendt points to a new method Sanofi uses during cell-line development that allows for protein analysis following a simple clarification step and no need for Protein A purification. “This method truly enables high-throughput work,” she notes.
Analytical data management solutions are also needed that enable comparison across drug proteins as well as constant evaluation and update of metrics that can enable standardization across various types of protein drugs, according to Khimani.
There is also a need, according to Rahbek, for automating data analysis as the automation of instrumental workflows becomes a reality. Indeed, Tissot would like to see improvement in the automation of both sample preparation and data interpretation so more characterization methods can be introduced in process-development studies. “Such a move would require that characterization assays be turned into semi-quality control assays where some elements of precision or linearity would have to be demonstrated. The MAM workflow is a good example of this area of improvement, but being able to extend this workflow to other complex quality attributes would be valuable,” he observes.
Wendt adds that the highly sophisticated technologies in use today provide very rich, multidimensional data that are not being optimally used. “Making use of big data, artificial intelligence, and machine learning will certainly bring us to the next level,” she states. “Data science is commonly recognized as an area with high potential, and investments are needed to ensure appropriate data management and programming of in-silico tools and algorithms to leverage the data that are already being generated,” Wendt adds.
For instance, these approaches could be used to provide information about proteins much earlier in the development process, and in particular in the form of predictive models that can be used during biologic drug discovery. “If we are able to quickly screen thousands of molecules and accurately assess the potential of each based on a whole host of properties, candidates with much higher likelihoods for success in development could be identified more rapidly, or we could intervene earlier, for example via engineering, saving tremendous time, money, and resources,” asserts Wendt.
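As a minimal illustration of the kind of in-silico triage Wendt describes, candidates can be ranked by a developability score computed from simple molecular descriptors. Everything in this sketch is invented for the example: the candidate names, the descriptors, and the weights; a real predictive model would be trained on experimental developability data rather than hand-assigned.

```python
# Hypothetical early-stage developability screen: rank candidates by a
# weighted score over simple descriptors. Names, descriptor values, and
# weights are all invented for illustration.
CANDIDATES = {
    "mAb-001": {"hydrophobicity": 0.62, "net_charge": 2.0, "agg_motifs": 3},
    "mAb-002": {"hydrophobicity": 0.41, "net_charge": 5.0, "agg_motifs": 1},
    "mAb-003": {"hydrophobicity": 0.55, "net_charge": 1.0, "agg_motifs": 4},
}

# Higher hydrophobicity and more aggregation-prone motifs penalize the
# score; moderate positive net charge (colloidal repulsion) rewards it.
WEIGHTS = {"hydrophobicity": -1.0, "net_charge": 0.3, "agg_motifs": -0.5}

def developability_score(descriptors):
    """Weighted linear score; higher means a more promising candidate."""
    return sum(WEIGHTS[name] * value for name, value in descriptors.items())

# Rank candidates from most to least promising
ranked = sorted(CANDIDATES,
                key=lambda name: developability_score(CANDIDATES[name]),
                reverse=True)
```

Scaled from three candidates to thousands, and with the hand-picked weights replaced by a model fitted to measured solution behavior, this is the screening-and-triage pattern that lets weak candidates be flagged, or re-engineered, before expensive development work begins.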
Protein characterization, believes Wendt, is one of the most important parts of the drug discovery business. “If we do not have good characterization of proteins, then we can’t know the potential products we are working on. We can’t achieve the best interpretation of functional assays nor develop the best, most reliable and most consistent products,” she states.
For the most part, according to Rossi, the industry is working hard to implement advanced protein analytics to ensure therapeutic goals are met and to demonstrate the required biologic activities of the key components of biologics of all kinds.
There is, Wendt says, always the need for analytical methods with higher throughput that provide more relevant and detailed data at lower cost. This demand has increased as the complexity of candidate biologics has increased. “We need advances in protocols, methods, and data processing capabilities that will allow rapid and in-depth characterization without time-consuming purification or derivatization steps,” she says.
“What pharmaceutical researchers are calling for now,” agrees Rossi, “is for access to more user-friendly studies that yield more reliable, validatable data faster and with fewer resources. Equipped with sharper tools and refined techniques, researchers will continue to be able to chip away at the things that drive development costs and ultimately the price of drugs higher, while working to speed development and see their breakthroughs to patients faster,” he concludes.
Vol. 34, No. 5
When referring to this article, please cite it as C. Challener, “Evolving Analytical Technology Unravels Protein Characteristics,” BioPharm International 34 (5) 2021.