Operational Excellence: More Data or Smarter Approach?

Published in BioPharm International, Volume 24, Issue 6 (June 1, 2011)

The authors focus on operational excellence in manufacturing of biotechnology therapeutic products in the QbD paradigm.

Achieving operational excellence is a growing goal among biopharmaceutical enterprises, and companies are investing heavily in software, data marts, and data integration toward that end. The authors explore approaches for meeting operational-excellence goals, and discuss how to manage collected data with a risk-based approach for improved process understanding.

Anurag Rathore, PhD

Operational excellence is achieved when each and every employee can see the flow of value to the customer, and fix that flow when it breaks down (1). To reach this goal, and thereby to achieve continuous improvement at all levels of the organization, one needs to make the best possible use of available resources (e.g., data, information). Only when there is complete visibility of existing manufacturing processes and systems within an organization can this goal be achieved. In the biomanufacturing industry, complete visibility means effective communication of process data and product information.

Quality by design (QbD) is defined in the International Conference on Harmonization's Q8 Guideline as "a systematic approach to development that begins with predefined objectives and emphasizes product and process understanding and process control, based on sound science and quality risk management" (2). In a typical QbD approach for development of biological products, definition of the Target Product Profile (TPP) is followed by a risk- and science-based assessment of product attributes with respect to their effect on the product's safety and/or efficacy (3, 4). This assessment results in identification of the so-called critical quality attributes (CQAs). Next, the process is designed to consistently deliver these CQAs within the acceptable limits.

A robust control strategy that can ensure consistent process performance in view of the various sources of variability that exist in biotechnology manufacturing is central to successful implementation of QbD. Once the manufacturing process and the control strategy are in place, validation is performed to demonstrate the effectiveness of the control strategy, and the product is filed for approval. After approval, ongoing monitoring is performed to ensure robust process performance over the life cycle of the product (2, 3). The approaches needed to make these decisions require sufficient access to process and product information that is otherwise distributed within a myriad of sources. Thus, the need for efficient information-sharing has become more crucial to industry in the QbD paradigm.

This article is the 23rd in a series in BioPharm International about the "Elements of Biopharmaceutical Production" (see full list at BioPharm-International.com). This article focuses on operational excellence in manufacturing of biotechnology therapeutic products in the QbD paradigm, including the use of efficient and effective data and knowledge management. The authors discuss key concepts and present a related case study.

DATA AND KNOWLEDGE MANAGEMENT: THE TRADITIONAL APPROACH

Data hoarding

In an attempt to achieve operational excellence, biopharmaceutical companies often spend significant amounts of resources on building data-collection and data-capture systems, leaving fewer resources for the more important task of analyzing the collected data. Collecting more data does not by itself enable doing more with it. Rather, it adds to the complexity of finding a needle in a haystack. Although data collection is an important step toward continuous improvement, the inability to visualize data and share its analysis across the organization is a key deficiency in current industry practice.


Complex analysis tools

Complex statistical tools for performing advanced analysis of manufacturing process data have become the norm in biotechnology manufacturing (5). As the complexity of process data and analysis increases, however, training an entire organization's staff on how to best use these tools is not feasible. Many organizations, therefore, opt to create a specialized team that is trained to perform such analysis. As a result, organizations end up segregated into two groups: on one side, staff who have hands-on scientific experience with the product or process but no knowledge of the associated analytical tools, and on the other side, staff who know how to use the statistical tools but have no hands-on scientific experience with the product or process. This situation is not desirable. Appropriate analysis requires both types of expertise.

DATA AND KNOWLEDGE MANAGEMENT: THE QBD APPROACH

To make the most out of collected data and information, one must monitor, explore, and analyze it in a way that the majority of staff can easily access and understand. This goal can be achieved in the following manner:

  • Apply risk-management principles to data analysis. For all parameters, a company may continue to record data on paper (e.g., batch records) or in electronic format. Data archiving is a minimum expectation of cGMP manufacturing. However, more sophisticated forms of data archiving, visualization, and analysis may be reserved for parameters that have been prioritized. This point is further illustrated in the case study presented later in the article.

  • Create domain-specific visualizations that can help to monitor processes and serve as a standard means of communicating process data across the organization. Visualizations can also provide a quick glance into the status of the process. The effectiveness of visualizations depends on how well they represent the relationship between different types of data (e.g., batch performance data, time profiles, batch events). Visualizations should be accessible to all staff members associated with the process and the product (a minimal plotting sketch of such a batch overlay follows this list).

  • Perform advanced data-capture and statistical analysis to gain deeper insights into the manufacturing process. These tools should be used (or piloted) by a smaller subset of the organization but should be introduced to the broader organization only after thoughtful consideration.
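
As a minimal sketch of such a visualization (in Python with pandas and matplotlib; the file name and column names such as "batch", "time_h", and "viable_cell_density" are hypothetical), one monitored parameter can be overlaid across batches so that staff can compare a running batch against historical performance:

    # Minimal sketch: overlay one monitored parameter across batches.
    # Assumes a tidy table with hypothetical columns 'batch', 'time_h',
    # and 'viable_cell_density'; adapt names to the actual historian export.
    import pandas as pd
    import matplotlib.pyplot as plt

    data = pd.read_csv("cell_culture_profiles.csv")  # hypothetical export

    fig, ax = plt.subplots()
    for batch_id, batch_data in data.groupby("batch"):
        ax.plot(batch_data["time_h"], batch_data["viable_cell_density"],
                label=str(batch_id), alpha=0.7)

    ax.set_xlabel("Culture time (h)")
    ax.set_ylabel("Viable cell density")
    ax.set_title("Batch-to-batch overlay of a monitored parameter")
    ax.legend(title="Batch", fontsize="small")
    plt.show()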

The primary goal of the proposed approach is to provide the majority of the company's staff with access to simple, easy-to-learn tools and approaches. As discussed in the previous section, this approach alleviates a significant portion of the inefficiencies that are otherwise present in the traditional approaches to data and knowledge management.

IMPLEMENTATION ROADMAP

Figure 1 illustrates an implementation roadmap for a typical biomanufacturing process in which data are created at multiple sources. Several hundred process parameters are collected for each batch during process execution. Because monitoring each parameter in such a large dataset would require significant resources, a more practical approach is to use risk-assessment and management principles to prioritize the process parameters. Risk-assessment tools such as the Failure Modes and Effects Analysis (FMEA) can be used effectively for this purpose.

In this approach, each parameter is assessed for its importance to the process. Questions that need to be asked may include: What is the severity of the impact of the parameter on the performance of the given process step and the quality of the product? Is the parameter part of the process control scheme? How often does the parameter go outside the allotted operating range? Is the parameter alarmed?

Based on this analysis, a risk priority number (RPN) can be calculated and used to prioritize the parameters. Drawing on process-execution experience and the current level of process understanding, a cut-off RPN can then be determined so that only parameters above the cut-off are screened in for routine process monitoring (see Figure 1). Data for parameters below the cut-off can be archived for later use in compliance and advanced analysis. The parameters identified for routine process monitoring can be represented in simple visualizations that slice and dice data in various combinations of batches, parameters, and process time (e.g., parameter versus time, parameter versus parameter, parameter versus batch). These visuals can help the organization spot process trends and communicate and resolve issues easily. Unresolved issues that need a deeper understanding of the process can be addressed using advanced statistical analysis tools. The ideal outcomes of advanced analysis are process-improvement initiatives aimed at improving product yield and quality (6), as well as process-analytical-technology (PAT) based control schemes aimed at ensuring consistency in product yield and quality.
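
As a minimal sketch of this prioritization step (in Python, with hypothetical parameter names, hypothetical FMEA scores on a 1–10 scale, and a hypothetical cut-off), the RPN for each parameter is computed as the product of its severity, occurrence, and detection scores and compared against the cut-off:

    # Minimal sketch of RPN-based prioritization. The scores and cut-off are
    # hypothetical; actual FMEA scales and values come from the risk assessment.
    parameters = {
        # name: (severity, occurrence, detection), each scored 1-10
        "pH":             (9, 4, 2),
        "temperature":    (8, 3, 3),
        "agitation_rate": (5, 4, 3),
        "antifoam_added": (2, 2, 5),
    }

    RPN_CUTOFF = 60  # hypothetical threshold agreed during risk assessment

    def rpn(severity, occurrence, detection):
        """Risk priority number as the product of the three FMEA scores."""
        return severity * occurrence * detection

    monitored = {name: rpn(*scores) for name, scores in parameters.items()
                 if rpn(*scores) >= RPN_CUTOFF}
    archived_only = {name: rpn(*scores) for name, scores in parameters.items()
                     if rpn(*scores) < RPN_CUTOFF}

    print("Routinely monitored:", monitored)
    print("Archived for later use:", archived_only)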

Figure 1: Illustration of a quality-by-design approach to data and knowledge management. (ALL FIGURES ARE COURTESY OF THE AUTHORS)

CASE STUDY: A QUALITY-BY-DESIGN APPROACH FOR A MAMMALIAN CELL-CULTURE STEP

Figure 2 lists typical input and output parameters that are encountered in routine mammalian cell-culture applications. These data may come from online sensors that are collecting the data in real time, from batch records, from analysis in the Quality Control laboratory, or from the various benchtop instrumentation that accompanies the bioreactor. According to cGMP expectations, these data are collected and archived. Performing a statistical analysis of each parameter would be cumbersome and unlikely to add much value with respect to process understanding. An FMEA, therefore, was performed to segregate the parameters into multiple categories (see Figure 3).
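
Before any prioritization or analysis, data from these disparate sources typically need to be brought together per batch. The following minimal sketch (Python with pandas; the file names and the "batch_id" key column are hypothetical) consolidates online-sensor summaries, batch-record entries, and QC results into a single table:

    # Minimal sketch: consolidate per-batch data from the sources named above
    # into one table keyed on batch ID. File names and columns are hypothetical.
    import pandas as pd

    online = pd.read_csv("online_sensor_summary.csv")   # e.g., mean DO, pCO2 per batch
    batch_records = pd.read_csv("batch_records.csv")    # e.g., feed volumes, media lot
    qc_results = pd.read_csv("qc_lab_results.csv")      # e.g., titer, purity

    combined = (online
                .merge(batch_records, on="batch_id", how="outer")
                .merge(qc_results, on="batch_id", how="outer"))

    combined.to_csv("cell_culture_batch_summary.csv", index=False)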

Figure 2: The various input and output parameters typically encountered when performing mammalian cell-culture processes. DO is dissolved oxygen. RPM is the agitation rate. RVLP is retrovirus-like particles. MMV is mouse minute virus.

Category A lists parameters that are collected and archived for future potential use. Category B lists the parameters that have been identified as important for the process step. The data are routinely monitored, which involves performing statistical analysis of the data and trending. Analysis via Nelson rules is a popular approach for performing this activity (6).
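
As an illustration of such trending, the following minimal sketch implements two of the eight Nelson rules (rule 1: a point more than three sigma from the centre line; rule 2: nine or more consecutive points on the same side of the centre line). The centre line, sigma, and titer values are hypothetical; in practice the limits would be established from a historical reference set.

    # Minimal sketch of two of the Nelson rules applied to a batch-wise trend of
    # one Category B parameter. The centre line and sigma are hypothetical values
    # that, in practice, would be established from a historical reference set.
    MEAN, SIGMA = 2.3, 0.15   # hypothetical centre line and standard deviation

    def nelson_rule_1(values, mean=MEAN, sigma=SIGMA):
        """Rule 1: any single point more than 3 sigma from the centre line."""
        return [i for i, v in enumerate(values) if abs(v - mean) > 3 * sigma]

    def nelson_rule_2(values, mean=MEAN, run_length=9):
        """Rule 2: nine or more consecutive points on the same side of the centre line."""
        flagged, run_start = [], 0
        for i in range(1, len(values) + 1):
            if i == len(values) or (values[i] > mean) != (values[run_start] > mean):
                if i - run_start >= run_length:
                    flagged.extend(range(run_start, i))
                run_start = i
        return flagged

    titers = [2.2, 2.4, 2.3, 2.9, 2.2, 2.3, 2.4, 2.2, 2.1, 2.3]  # hypothetical titers, g/L
    print("Rule 1 violations at batch indices:", nelson_rule_1(titers))
    print("Rule 2 violations at batch indices:", nelson_rule_2(titers))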

Figure 3: Outcome of risk analysis for prioritization of various input and output parameters typically encountered when performing mammalian cell-culture processes. Data for parameters are collected and archived. Only a subset of parameters are routinely monitored and a further smaller subset is involved in process-analytical-technology (PAT) based control of the process step.

Finally, a subset of Category B parameters may be used for designing a PAT-based control scheme. For example, a PAT control scheme for gas transfer would need measurements of pCO2, pO2, agitation rate (RPM), and air-flow rate (Category D). Hence, these parameters may be collected using automatic sensors and the information relayed to the controller. Similarly, a PAT control scheme for cell count will require efficient capture of information on pH, temperature, cell count, agitation rate, air-flow rate, and feed-flow rate.
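
A minimal sketch of the kind of feedback loop such a PAT scheme might use is shown below, in which the measured dissolved oxygen is compared with a setpoint and the air-flow rate is adjusted proportionally. The setpoint, controller gain, and flow limits are hypothetical illustration values, not a validated control strategy.

    # Minimal sketch of a proportional feedback loop of the kind a PAT-based
    # gas-transfer control scheme might use: the measured dissolved oxygen (pO2)
    # is compared with a setpoint and the air-flow rate is adjusted accordingly.
    # Setpoint, gain, and flow limits are hypothetical illustration values.
    DO_SETPOINT = 40.0            # % air saturation (hypothetical)
    GAIN = 0.05                   # L/min change per % DO error (hypothetical)
    FLOW_MIN, FLOW_MAX = 0.2, 2.0  # allowable air-flow range, L/min (hypothetical)

    def update_air_flow(current_flow, measured_do):
        """Return the next air-flow setpoint from the current DO measurement."""
        error = DO_SETPOINT - measured_do
        new_flow = current_flow + GAIN * error
        return min(max(new_flow, FLOW_MIN), FLOW_MAX)  # clamp to equipment limits

    # Example: DO has drifted below the setpoint, so air flow is increased.
    flow = 1.0
    for measured_do in (38.5, 37.9, 39.2, 40.4):
        flow = update_air_flow(flow, measured_do)
        print(f"measured DO = {measured_do:4.1f}%  ->  air flow = {flow:.2f} L/min")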

CONCLUSION

This article aims to illustrate the need to achieve operational excellence with respect to data- and knowledge-management systems to successfully implement QbD for manufacturing of biotherapeutics. As discussed, there is a great need within the biomanufacturing industry to apply risk-assessment and management principles to achieve this objective. When done correctly, this approach is more economical and effective than the traditional approaches otherwise used in industry.

Anshuman Bansal, PhD, and Jaspinder Hans are cofounders of Simplyfeye Softwares in Delhi, India. Anurag Rathore, PhD, is a partner at Simplyfeye and a faculty member at the Indian Institute of Technology, Delhi, India, +91-9650770650, asrathore@biotechcmz.com. Rathore is also a member of BioPharm International's editorial advisory board.

REFERENCES

1. "Operational excellence" as defined by Industry Week, www.industryweek.com/articles/operational_excellence_defined_14011.aspx

2. ICH, Q8(R1): Pharmaceutical Development, November 2008.

3. A.S. Rathore and H. Winkle, Nature Biotechnology, 27 (1) 26–34 (2009).

4. A.S. Rathore, Trends in Biotechnol. 27 (9) 546–553 (2009).

5. R. Johnson et al., BioPharm. Intl., 20 (10) 130–144 (2007).

6. P. Konold, R. Woolfenden II, C. Undey, and A.S. Rathore, BioPharm. Intl. 22 (5) 26–39 (2009).

See the full list of articles within this series.