Determining Process Quality Metrics for CMOs

Published in: BioPharm International, Volume 25, Issue 6 (June 1, 2012)

Implementing quality by design makes the determination of quality metrics across CMOs and sponsors essential.

The bio/pharmaceutical industry is witnessing two important manufacturing trends. First is the increasing adoption of quality-by-design (QbD) approaches to enhance process understanding, which results in a long list of consumer, regulatory, and business benefits. The second trend is the increased use of contract manufacturing organizations (CMOs) to better manage plant capacity and control production volumes.

According to the 2012 BioPlan Associates' 9th Annual Report and Survey of Biomanufacturing, "Relatively few companies have outsourced all of their manufacturing, but nearly one-half of surveyed manufacturers expect to increase their budgets for biopharmaceutical CMO outsourcing (in 2012)" (1). As CMO relationships increase and expand, and QbD becomes more essential to business strategy, it is important to examine best practices for determining quality metrics across CMO and sponsor networks.

A CHALLENGING INTERSECTION

At the intersection of QbD and CMOs lies a challenge for sponsor organizations that have invested time, technology, and resources into QbD programs at their own manufacturing sites: ensuring that products manufactured by CMOs meet similar quality expectations for safety and efficacy. With QbD in mind, many biopharmaceutical manufacturing teams have integrated process intelligence technology tools and practices into their working approaches to collaborate across company-owned (or captive) global sites. Such tools can help teams access large amounts of process data stored in disparate systems (e.g., laboratory information management systems, enterprise resource planning systems, manufacturing execution systems, and records stored in paper formats) for proactive process monitoring and investigative analysis.

As part of the QbD process, many sponsors have developed sets of critical process parameters (CPPs) and key performance indicators (KPIs) to routinely monitor processes and increase in-house process understanding. When it comes to outsourcing, sponsor companies desire a similar level of monitoring and understanding from their CMO partners.

The complexities of data sharing across sponsors and third party CMOs—many of which may be geographically dispersed—include the following challenges:

  • Sponsors and CMOs have different operational models and, therefore, different interests when it comes to data. Sponsors are looking for quality assurance of the product, while CMOs most often use process data to compare one batch with another at a local level. The manufacturing processes of CMOs can also vary greatly between plants, which means that direct site-to-site data comparisons may not be appropriate.

  • Most existing sponsor–CMO contracts do not fully specify which data must be shared, or what mechanism should be used for sharing it. In addition, contracts often do not incorporate enough flexibility to take advantage of improvements in process or data-sharing technologies that may become available during the life of the agreement. Data sharing is usually limited to release data for regulatory requirements, and does not include data comprehensive enough for the proactive monitoring and investigational analysis required for QbD and continuous process verification. Moreover, CMOs tend to produce products for several customers and can be cautious when it comes to allowing sponsors to tap into existing IT systems to retrieve data.

  • CMO records are frequently stored and communicated to sponsors as paper-based documents, making access and analysis difficult.

Sponsors need to collaborate with CMOs to develop monitoring programs that assist in process understanding and verification at CMO sites, similar to the programs sponsors have developed for use across their own global manufacturing networks.

CASE STUDY

A global biopharmaceutical company needed to establish parameters to monitor processes at a CMO site to facilitate data analysis for science-based decision making and greater process understanding.

The sponsor required data from the CMO to fuel the same automated observational software tools used at its own sites to monitor large amounts of disparate data. The sponsor had limited knowledge of the CMO's processes and restricted access to CMO process experts, because collaboration was not required by the sponsor–CMO contractual agreement, which allowed only limited data sharing.

Without a more informed starting point, preliminary parameters to monitor processes were based on educated guesses. The sponsor—with help from its analytical consultant team—set up nearly 400 "suspect process and outcome parameters" to start the investigation. For example, the team monitored incoming material temperatures at a certain point in the process, under the suspicion that this variable would affect outcomes at the next process stage. It also predicted other parameters that had the potential to impact product quality, such as measuring the presence of synthesis-related impurities after ingredients were combined in one process step.

If the sponsor had been working at its own manufacturing site, it might have used a typical risk-assessment approach, as guided by ICH Q9 Quality Risk Management (2). Authors from Pfizer have described this approach in detail:

"Risk assessment is the process used to prioritize parameters and attributes most likely to impact the product quality... The focal point of a QbD risk assessment is to be able to link quality measures and process controls to the product quality of the drug delivery system, i.e., safety, efficacy, and performance. A Quality Target Product Profile (QTPP) is an effective tool to help identify the Critical Quality Attributes (CQA) of the manufacturing process that link to product quality" (2).

Pfizer's approach requires access to experts and upfront risk assessment prior to setting up trending. Because the case study involved a CMO site, the team had to set up automated trend monitoring systems and then watch the data patterns to identify what was important.

To do so, the sponsor team compiled electronic data supplied by the CMO for analysis at the sponsor's headquarters using the sponsor's process intelligence software system. Through automated trend monitoring of the suspect parameters, the team could eventually narrow down the 400 initial parameters to those that were critical to quality outcomes. The software was configured to send alerts when data were trending out of specified limits, and the team also had access to the data required for investigating potential cause-and-effect linkages. This approach was much more interactive and real-time than the up-front method used in Pfizer's risk-assessment approach, and it involved continuous process improvement using the ongoing analysis of data to increase process understanding.
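
To make the mechanics concrete, the following is a minimal sketch in Python of this kind of limit-based exception alerting. It is not the sponsor's actual process intelligence system; the parameter names, limits, and data layout are all hypothetical.

```python
# Minimal sketch of exception-based trend monitoring (hypothetical data layout).
# Each record is one batch measurement of one suspect parameter.

from dataclasses import dataclass

@dataclass
class Measurement:
    batch_id: str
    parameter: str   # e.g., "incoming_material_temp_C" (hypothetical name)
    value: float

# Hypothetical specification limits per suspect parameter: (low, high).
SPEC_LIMITS = {
    "incoming_material_temp_C": (18.0, 25.0),
    "impurity_pct_step3": (0.0, 0.5),
}

def alerts(measurements):
    """Yield an alert for every measurement outside its specification limits."""
    for m in measurements:
        low, high = SPEC_LIMITS[m.parameter]
        if not (low <= m.value <= high):
            yield f"ALERT batch {m.batch_id}: {m.parameter} = {m.value} outside [{low}, {high}]"

data = [
    Measurement("B001", "incoming_material_temp_C", 21.4),
    Measurement("B002", "incoming_material_temp_C", 27.1),  # out of limits
    Measurement("B002", "impurity_pct_step3", 0.62),        # out of limits
]

for a in alerts(data):
    print(a)
```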

The sponsor found this monitoring-by-exception method to be a helpful way of compensating for the limited access to process experts at the CMO site. It allowed the sponsor to see and learn from the data directly, and it facilitated collaboration by using data as a communication vehicle between the CMO and the sponsor.

The above trend-monitoring approach was successful because the process monitoring systems were designed to monitor by exception (see sidebar), using an automated system to alert users when they should pay more attention to a specific parameter of concern and further analyze the data. However, the sponsor had to select suspect parameters for monitoring without knowing their criticality. Standard monitoring procedures were conducted, such as baselining data, removing special-cause variation, and estimating process capability. Control charts were developed to monitor process parameters by exception, with alerts sent when control and specification limits were violated. The true value of the control charts was not observed until a few months later, when a deviation occurred at the CMO site. The sponsor had quick access to process monitoring information and could also use the charts to better communicate with the CMO site.
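
The baselining and control-charting steps described above can be illustrated with a short sketch. This is a generic individuals-chart calculation (mean ± 3 sigma control limits plus a Cpk estimate) rather than the sponsor's proprietary tooling, and all values are illustrative.

```python
# Sketch of the baselining procedure described above: estimate control limits
# from baseline batches, estimate process capability against specification
# limits, then monitor new values by exception. Values are illustrative.

import statistics

def baseline(values):
    """Return (mean, sigma, lcl, ucl) using mean +/- 3 sigma control limits."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return mu, sigma, mu - 3 * sigma, mu + 3 * sigma

def cpk(mu, sigma, lsl, usl):
    """Process capability index relative to specification limits."""
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Baseline data for one parameter, after special-cause batches were removed.
baseline_values = [21.1, 21.4, 20.9, 21.6, 21.2, 21.3, 20.8, 21.5]
mu, sigma, lcl, ucl = baseline(baseline_values)
lsl, usl = 18.0, 25.0  # hypothetical specification limits

print(f"control limits: [{lcl:.2f}, {ucl:.2f}], Cpk = {cpk(mu, sigma, lsl, usl):.2f}")

# Monitoring by exception: alert only when a new batch violates a limit.
for value in [21.0, 22.4, 17.6]:
    if not (lcl <= value <= ucl) or not (lsl <= value <= usl):
        print(f"exception: {value} violates control or specification limits")
```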

Sidebar: Monitoring by exception

Following the selection of suspect parameters, the next phase for the sponsor team was narrowing these to determine the true CPPs. The sponsor team received alerts automatically when exceptions occurred, and if one parameter revealed a deviation, the team knew it needed to closely monitor three to five additional parameters related to the measurement area. As a result, these real-world deviations provided a mechanism for filtering the suspect parameters over time, helping to determine CPPs.
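
A rough sketch of this filtering logic follows. The parameter names and the relatedness map are hypothetical; the point is only to show how deviation alerts can both widen the watch list and, over time, rank suspect parameters as CPP candidates.

```python
# Sketch of deviation-driven parameter filtering: a deviation in one parameter
# pulls a few closely related parameters onto a watch list, and accumulated
# deviation counts suggest which suspect parameters are the true CPPs.

from collections import Counter

# Hypothetical map from each parameter to three to five related parameters.
RELATED = {
    "incoming_material_temp_C": ["mix_time_min", "reaction_pH", "impurity_pct_step3"],
    "impurity_pct_step3": ["reaction_pH", "filtration_rate", "yield_pct"],
}

watch_list = set()
deviation_counts = Counter()

def on_deviation(parameter):
    """Record a deviation alert and widen monitoring to related parameters."""
    deviation_counts[parameter] += 1
    watch_list.update(RELATED.get(parameter, []))

# Deviations arriving over several months of monitoring by exception.
for p in ["incoming_material_temp_C", "impurity_pct_step3",
          "incoming_material_temp_C", "incoming_material_temp_C"]:
    on_deviation(p)

# Parameters that deviate repeatedly are the strongest CPP candidates.
print("watch list:", sorted(watch_list))
print("CPP candidates:", [p for p, n in deviation_counts.most_common() if n >= 2])
```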

When working with CMOs, sponsors often receive information regarding process indicator variables, but when the sponsor's analysts realize that a process is not functioning properly, they still have to pinpoint the root cause. Unless comprehensive contractual agreements exist that detail data access and analysis, it can be difficult for the sponsor to access the right data for root-cause analysis and statistical modeling. This case study expanded the sponsor's process understanding and allowed for an open conversation with the CMO site regarding which additional indicator, process, and outcome parameters would help both parties monitor processes in the future. In the end, the approach described above offered numerous benefits for both the sponsor and the CMO. Figure 1 shows the time saved through monitoring by exception.

Figure 1: Potential time saved through monitoring by exception compared with manual monitoring using spreadsheets.

BEST PRACTICES FOR CONTINUOUS IMPROVEMENT IN CMO PROCESS UNDERSTANDING

Stage 3 of FDA's new process validation guidance reiterates that the process of developing and monitoring CPPs and KPIs is continuous and dynamic. The document states, "In addition to being a fundamental tenet of following the scientific method, information transparency and accessibility are essential so that organizational units responsible and accountable for the process can make informed, science-based decisions that ultimately support the release of a product to commerce" (3). With sponsor–CMO relationships, this approach also requires ongoing analysis and refinement, which can be accomplished through monitoring by exception.

To carry out a QbD approach, Stage 3 of the FDA's validation guidance must be applied across the entire manufacturing network, including at CMO sites. Incorporating automation tools is important for achieving continuous and dynamic monitoring with CMOs because it allows complex processes to be monitored with minimal human error and resources. The following summarizes best practices and considerations for implementing a continuous improvement plan for process understanding based on the author's experience.

  • Establish a baseline for how often the organizations should reevaluate parameters and processes. The sponsor in the case study described in this article reevaluated plans and parameters every six months. Process capability and quality monitoring tools should be used to help develop and revise baseline metrics. This approach is consistent with the notion that monitoring should be continuous and dynamic at CMO sites, as well as at owned manufacturing sites.

  • Use information from baselining exercises to increase process understanding at a CMO site. Develop, monitor, and revise statistical plans and sampling plans for continuous quality improvement. Important questions include: Who are the key people to review monitoring charts and to involve in deviation investigations? What are the parameters? How often should these parameters be evaluated?

  • Engage in proactive process monitoring that makes analytics easier to use, but with a system that ensures you are not simply validating what you want to hear. Check assumptions and insist on doing things "the right way" on an ongoing basis, in spite of human nature. Data-driven monitoring and decision making is an ongoing part of QbD; it is not just about clicking and seeing results that validate assumptions.

  • Conduct high-quality statistical analysis and use data-based decision making to interpret what has occurred in a process, and then revise appropriately. A trained statistician, for example, can determine whether a credible sample size was used and make a solid recommendation for action (a simple sample-size check is sketched after this list). Outsource the analysis function if the skill set—or required time—is not available within your organization.

  • The sponsor and CMO must communicate, collaborate and be willing to adapt to new working approaches based on sound statistical analysis. Data exchange can be a vehicle to increase communication between sponsors and CMOs, which is valuable for both parties.

  • Build flexibility into contracts. Sponsors and CMOs may need to renegotiate contracts to specify how, when, and which data will be shared, or to accommodate technological improvements that have become available since the contract was signed.
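
To illustrate the kind of sample-size sanity check mentioned in the analysis bullet above, the sketch below computes the approximate 95% margin of error for estimating a process mean at several sample sizes. Every number in it is hypothetical.

```python
# Illustrative check of whether a sample size is credible for estimating a
# process mean: the ~95% margin of error is roughly 1.96 * s / sqrt(n).
# Numbers are hypothetical.

import math

def margin_of_error(sample_sd, n, z=1.96):
    """Approximate 95% margin of error for the mean from n observations."""
    return z * sample_sd / math.sqrt(n)

sample_sd = 0.28   # standard deviation estimated from baseline batches
for n in (5, 30, 100):
    print(f"n = {n:3d}: mean known to within +/- {margin_of_error(sample_sd, n):.3f}")
```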

Kate DeRoche Lusczakoski, Ph.D., is a senior analytics specialist at Aegis Analytical Corp., 1380 Forest Park Circle, Suite 200, Lafayette, CO 80026, klusczakoski@aegiscorp.com.

Editor's Note: Aegis Analytical produces monitoring-by-exception software.

REFERENCES

1. E.S. Langer, BioPharm International 25 (2), 15–16 (2012).

2. V. McCurdy et al., Pharm. Eng. 30 (4), 12–32 (2010).

3. FDA, Guidance for Industry, Process Validation: General Principles and Practices (Rockville, MD, Jan. 2011).