Operational Excellence: More Data or Smarter Approach?
The authors focus on operational excellence in manufacturing of biotechnology therapeutic products in the QbD paradigm.


BioPharm International
Volume 24, Issue 6, pp. 36-41

DATA AND KNOWLEDGE MANAGEMENT: THE QBD APPROACH

To make the most of collected data and information, one must monitor, explore, and analyze it in a way that is easy for the majority of the staff to access and understand. This goal can be achieved in the following manner:

  • Apply risk-management principles to data analysis. For all parameters, a company may continue to record data on paper (e.g., batch records) or in electronic format. Data archiving is a minimum expectation of cGMP manufacturing. However, more sophisticated forms of data archiving, visualization, and analysis may be reserved for parameters that have been prioritized. This point is further illustrated in the case study presented later in the article.
  • Create domain-specific visualizations that help monitor processes and serve as a standard means of communicating process data across the organization. Visualizations can also provide a quick glance at the status of the process. Their effectiveness depends on how well they represent the relationships between different types of data (e.g., batch performance data, time profiles, batch events). Visualizations should be accessible to all staff members associated with the process and the product (a minimal plotting sketch follows this list).
  • Perform advanced data-capture and statistical analysis to gain deeper insights into the manufacturing process. These tools should be used (or piloted) by a smaller subset of the organization but should be introduced to the broader organization only after thoughtful consideration.
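As a rough illustration of the parameter-versus-time view mentioned in the list above, the following Python sketch overlays a single monitored parameter across batches. The file name and column names (batch_history.csv, batch_id, elapsed_hours, viable_cell_density) are placeholders invented for this example and are not part of the article's case study.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical batch data: one row per time point, with columns
# "batch_id", "elapsed_hours", and a monitored parameter such as "viable_cell_density".
data = pd.read_csv("batch_history.csv")

fig, ax = plt.subplots(figsize=(8, 4))
for batch_id, batch in data.groupby("batch_id"):
    # Overlay each batch's time profile so deviations stand out at a glance.
    ax.plot(batch["elapsed_hours"], batch["viable_cell_density"],
            label=str(batch_id), alpha=0.7)

ax.set_xlabel("Elapsed culture time (h)")
ax.set_ylabel("Viable cell density")
ax.set_title("Parameter versus time, overlaid by batch")
ax.legend(title="Batch", fontsize="small")
plt.tight_layout()
plt.show()
```

The same data frame can be regrouped to produce the other views described above (parameter versus parameter, parameter versus batch) without changing how the data are stored.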

The primary goal of the proposed approach is to give the majority of the company's staff access to simple, easy-to-learn tools and approaches. As discussed in the previous section, this approach alleviates a significant portion of the inefficiencies that are otherwise present in traditional approaches to data and knowledge management.

IMPLEMENTATION ROADMAP

Figure 1 illustrates an implementation roadmap for a typical biomanufacturing process in which data are created at multiple sources. Several hundred process parameters are collected for each batch during process execution. Because monitoring every parameter in such a large dataset would require significant resources, a more practical approach is to use risk-assessment and management principles to prioritize the process parameters. Risk-assessment tools such as Failure Modes and Effects Analysis (FMEA) can be used effectively for this purpose.

In this approach, each parameter is assessed for its importance to the process. Questions that need to be asked may include: What is the severity of the impact of the parameter on the performance of the given process step and the quality of the product? Is the parameter part of the process control scheme? How often does the parameter go outside the allotted operating range? Is the parameter alarmed?


Figure 1: Illustration of a quality-by-design approach to data and knowledge management. (ALL FIGURES ARE COURTESY OF THE AUTHORS)
Based on this analysis, a risk priority number (RPN) can be calculated and used to prioritize the parameters. Based on process-execution experience and the current level of process understanding, a cut-off RPN can be determined so that only parameters above the cut-off are selected for routine process monitoring (see Figure 1). Data for parameters below the cut-off can be archived for later use in compliance activities and advanced analysis. The parameters identified for routine process monitoring can be represented in simple visualizations that slice and dice the data in various combinations of batches, parameters, and process time (e.g., parameter versus time, parameter versus parameter, parameter versus batch). These visuals help the organization spot process trends and communicate and resolve issues easily. Issues that require a deeper understanding of the process can be addressed with advanced statistical analysis tools. The ideal outcome of advanced analysis is process-improvement initiatives aimed at improving product yield and quality (6), as well as process-analytical-technology (PAT) based control schemes aimed at ensuring consistency in product yield and quality.
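To make the screening step concrete, the following Python sketch computes a classical RPN (severity × occurrence × detection) and splits parameters into those selected for routine monitoring and those archived only. The parameter names, FMEA scores, and cut-off value are invented for illustration and are not the values used in the case study.

```python
import pandas as pd

# Hypothetical FMEA scores; in practice, severity/occurrence/detection ratings
# come from a cross-functional risk assessment, not from code.
fmea = pd.DataFrame({
    "parameter":  ["pH", "temperature", "pCO2", "vessel pressure", "antifoam added"],
    "severity":   [9, 8, 6, 3, 2],
    "occurrence": [4, 3, 5, 2, 2],
    "detection":  [2, 2, 3, 4, 5],
})

# Classical RPN: severity x occurrence x detection.
fmea["rpn"] = fmea["severity"] * fmea["occurrence"] * fmea["detection"]

RPN_CUTOFF = 40  # illustrative value; a real cut-off reflects process experience

monitored = fmea[fmea["rpn"] >= RPN_CUTOFF].sort_values("rpn", ascending=False)
archived_only = fmea[fmea["rpn"] < RPN_CUTOFF]

print("Routinely monitored:\n", monitored[["parameter", "rpn"]])
print("Archived only:\n", archived_only[["parameter", "rpn"]])
```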

CASE STUDY: A QUALITY-BY-DESIGN APPROACH FOR A MAMMALIAN CELL-CULTURE STEP


Figure 2: The various input and output parameters typically encountered when performing mammalian cell-culture processes. DO is dissolved oxygen. RPM is the agitation rate. RVLP is retrovirus-like particles. MMV is mouse minute virus.
Figure 2 lists typical input and output parameters that are encountered in routine mammalian cell-culture applications. These data may come from online sensors collecting data in real time, from batch records, from analysis in the Quality Control laboratory, or from the various benchtop instruments that accompany the bioreactor. According to cGMP expectations, these data are collected and archived. Performing a statistical analysis of each parameter would be cumbersome and unlikely to add much value with respect to process understanding. An FMEA, therefore, was performed to segregate the parameters into multiple categories (see Figure 3).


Figure 3: Outcome of risk analysis for prioritization of various input and output parameters typically encountered when performing mammalian cell-culture processes. Data for parameters are collected and archived. Only a subset of parameters are routinely monitored and a further smaller subset is involved in process-analytical-technology (PAT) based control of the process step.
Category A lists parameters that are collected and archived for potential future use. Category B lists the parameters that have been identified as important for the process step. These data are routinely monitored, which involves statistical analysis and trending. Analysis via the Nelson rules is a popular approach for performing this activity (6).
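As an illustration of this kind of trending, the sketch below implements two of the Nelson rules (rule 1, a point more than three standard deviations from the centerline; rule 2, nine consecutive points on the same side of the centerline) in Python. The titer series, centerline, and standard deviation are hypothetical values chosen for the example.

```python
import numpy as np

def nelson_rule_1(values, mean, sigma):
    """Rule 1: any point more than 3 sigma from the centerline."""
    values = np.asarray(values, dtype=float)
    return np.abs(values - mean) > 3 * sigma

def nelson_rule_2(values, mean, run_length=9):
    """Rule 2: nine (or more) consecutive points on the same side of the centerline."""
    values = np.asarray(values, dtype=float)
    flags = np.zeros(len(values), dtype=bool)
    side = np.sign(values - mean)
    run, prev = 0, 0
    for i, s in enumerate(side):
        run = run + 1 if (s == prev and s != 0) else 1
        prev = s
        if run >= run_length:
            flags[i - run_length + 1 : i + 1] = True
    return flags

# Example: trend a hypothetical batch-to-batch titer series against historical limits.
titer = [2.1, 2.3, 2.0, 2.2, 2.9, 2.2, 2.1, 2.3, 2.2, 2.4, 3.6]
mean, sigma = 2.2, 0.3   # illustrative historical centerline and standard deviation
print(nelson_rule_1(titer, mean, sigma))   # flags the 3.6 excursion
print(nelson_rule_2(titer, mean))          # no nine-point run in this series
```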

Finally, a subset of Category B parameters may be used for designing a PAT-based control scheme. For example, a PAT control scheme for gas transfer would need measurements of pCO2, pO2, agitation rate (RPM), and air-flow rate (Category D). Hence, these parameters may be collected using automatic sensors and the information relayed to the controller. Similarly, a PAT control scheme for cell count will require efficient capture of information on pH, temperature, cell count, agitation rate, air-flow rate, and feed-flow rate.
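The article does not prescribe a particular control algorithm. As one hypothetical sketch of how such a loop might be realized, the following Python code uses a simple proportional-integral (PI) controller to adjust the air-flow rate based on the measured pO2; the gains, limits, setpoint, and sampling interval are illustrative assumptions, not validated values.

```python
from dataclasses import dataclass

@dataclass
class PIController:
    """Minimal PI controller; gains and output limits are illustrative only."""
    kp: float
    ki: float
    out_min: float
    out_max: float
    integral: float = 0.0

    def update(self, setpoint: float, measured: float, dt: float) -> float:
        error = setpoint - measured
        self.integral += error * dt
        output = self.kp * error + self.ki * self.integral
        # Clamp to the actuator's feasible range (e.g., air-flow in L/min).
        return min(max(output, self.out_min), self.out_max)

# Hypothetical use: adjust air-flow rate once per minute to hold pO2 at 40% saturation.
controller = PIController(kp=0.05, ki=0.001, out_min=0.0, out_max=2.0)
po2_setpoint = 40.0
po2_measured = 33.5          # would come from the online pO2 sensor
air_flow = controller.update(po2_setpoint, po2_measured, dt=60.0)
print(f"New air-flow rate: {air_flow:.2f} L/min")
```

In a real PAT scheme, the same loop would be embedded in the plant control system, with pCO2, agitation rate, and air-flow rate logged alongside the controlled variable for trending.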

