

Monitoring of Biopharmaceutical Processes: Present and Future Approaches
Enhance your control strategy with robust monitoring methods.

BioPharm International
Volume 22, Issue 5


In the QbD paradigm, the concept of design space plays a central role around which the various activities revolve.12,13 Design space has been defined as: "The multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide quality assurance. Working in the design space is not considered a change. Movement out of the design space is considered to be a change and would normally initiate a regulatory postapproval change process. Design space is proposed by the applicant and is subject to regulatory assessment and approval."2 After the design space has been defined from process characterization studies, process validation is performed to demonstrate that the process will deliver a product of acceptable quality if operated within the design space. The regulatory filing would include the acceptable ranges for all key and critical operating parameters (i.e., the design space) in addition to a more restricted operating space, typically described in the manufacturing procedures for biotech products as the set of operating ranges for the operating parameters.

After approval of a biotech drug has been obtained, process monitoring of the product quality and process performance attributes is performed to ensure that the process is performing within the defined acceptable variability that served as the basis for the filed design space. In the QbD paradigm, process changes within the design space will not require regulatory review or approval, which may facilitate process improvements during the product lifecycle. Excursions outside the operating space would indicate unexpected process drift and may trigger an investigation into the cause of the deviation and a subsequent corrective action. Excursions outside the design space require a more thorough investigation of the root cause and of the impact on product quality. As manufacturing experience grows and opportunities for process improvement are identified, the operating space can be revised within the design space without the need for a postapproval submission. Process knowledge and design space can be updated as understanding is gained over the lifecycle of a product. Changes to the design space would require an evaluation of the need for further characterization or revalidation.


The future of process monitoring lies in the combined use of powerful analytical tools capable of supporting real-time decision making and sophisticated statistical tools that can analyze complex data sets in an efficient and effective manner.13 Becker et al. recently reviewed new approaches to sensor technology and control strategies for a variety of bioprocesses, along with modern aspects of data evaluation for improved monitoring and control.14 Combinations of principal component analysis (PCA) with exponentially weighted moving average (EWMA) charts, as well as partial least squares (PLS)-based methods, have been shown to be useful monitoring tools capable of detecting small shifts in biological processes.15–17 The use of multivariate control charts for a cell culture step has been suggested for real-time process monitoring and for identification of atypical process performance.18
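As a rough illustration of the PCA-plus-EWMA idea cited above, the sketch below applies an EWMA filter to a simulated first-principal-component score series containing a small sustained shift. All data, the smoothing constant, and the control limit are hypothetical, not taken from the cited studies.

```python
import numpy as np

def ewma(x, lam=0.2):
    """Exponentially weighted moving average of a 1-D series."""
    z = np.empty(len(x))
    z[0] = x[0]
    for t in range(1, len(x)):
        z[t] = lam * x[t] + (1 - lam) * z[t - 1]
    return z

# Simulated first-principal-component scores: 50 in-control batches
# followed by 20 batches with a small sustained upward shift.
rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0.0, 1.0, 50),
                         rng.normal(0.8, 1.0, 20)])

z = ewma(scores, lam=0.2)

# Asymptotic 3-sigma control limit for the EWMA statistic.
lam, sigma = 0.2, 1.0
limit = 3 * sigma * np.sqrt(lam / (2 - lam))
flagged = np.where(np.abs(z) > limit)[0]  # batches flagged as shifted
```

Because the EWMA statistic accumulates evidence over consecutive batches, it flags small sustained shifts that a raw 3-sigma chart on the individual scores would tend to miss.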

As mentioned above, PCA and PLS are projection models commonly used in multivariate process monitoring. PCA is often chosen when the objective is simply to monitor many variables that are collected continuously, for example during batch cultivation at a sampling frequency determined by the sampling and measurement systems in place, which may range from daily offline measurements (e.g., offline pH) to real-time analyzer or probe readings (e.g., an in-line pH probe). PLS is more effective when the objective is predictive monitoring; that is, while monitoring many variables, we also would like to know how changes in them may affect a process end point such as yield.

Figure 2
PCA and PLS are data-driven and require a representative batch history of known good process performance to establish a baseline (or reference model) against which new batches can be compared. Both techniques are also very powerful in explaining the overall variability and correlation structure of all the variables, accounting for missing values, detecting outliers, handling collinearity, and, most importantly, reducing the dimensionality of the problem (while removing measurement noise). Because many variables are measured, and they are often correlated with each other (hence the collinearity), it is important to be able to reduce them to a select few derived variables (principal components in PCA, latent variables in PLS) that actually drive the overall variability of the process. As shown in Figure 2, PCA uses a mathematical algorithm to fit a model plane to the data cloud: it finds the direction of maximum variability in the process, then the next direction of variability orthogonal to the first component, and so on, until no significant variation is left to be explained beyond noise. Each principal component therefore carries a weighted contribution from the original variables, with weights that depend on the correlation structure. Real-time multivariate statistical process monitoring uses these derived variables (principal components) to monitor overall process performance, together with multivariate statistics and charts for fault detection and diagnosis.
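The fitting step described above can be sketched with a singular value decomposition on simulated batch data; the batch count, variable count, and two underlying driving directions are illustrative assumptions, not data from the article.

```python
import numpy as np

# Hypothetical batch history: 50 in-control batches x 8 correlated
# process variables, driven by two underlying latent directions.
rng = np.random.default_rng(2)
latent = rng.normal(size=(50, 2))
loadings = rng.normal(size=(2, 8))
X = latent @ loadings + 0.05 * rng.normal(size=(50, 8))

# Center and scale each variable, then fit PCA via SVD.  The rows of
# Vt are the orthogonal directions of decreasing variability.
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = s**2 / np.sum(s**2)  # fraction of variance per component
scores = Xc @ Vt.T               # batch scores on each component
```

Because the simulated data have only two true driving directions plus noise, the first two components account for nearly all of the variance, so monitoring the 8 original variables reduces to monitoring 2 scores.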

After the process model based on historical data is developed by either PCA or PLS, a number of multivariate statistics and charts can be constructed for monitoring new batches. Commonly used multivariate monitoring and diagnosis charts include:19

  • Squared prediction error (SPE, also known as Q-residuals or DModX) chart: SPE is used to detect process deviations. It is particularly useful for detecting events that are not captured by the model. In other words, when an SPE limit violation is observed (and there is no T² violation), it is likely that a new event has occurred that is not captured by the reference model; the trigger may be either a normal event reflecting inherent process variability that the model does not capture, or a genuine process upset.
  • Score time series and Hotelling's T² charts: As mentioned earlier, these charts are also used to detect process deviations; in this case, deviations that are explained by the process model and lie within its overall variability, but are unusually large compared with the average. Score time series allow one to monitor performance in each model dimension separately, while T² allows all of the model dimensions to be monitored over the course of a batch run with a single statistic.
  • Contribution plots: When either or both of the detection charts identify a deviation (a violation of the multivariate statistical limits) from historical behavior, the next step is to find out which variables are deviating. Contribution plots drill down to the original variable level to inspect which variable or variables are contributing to the inflated statistic.

SPE, T², and score time series charts should be used together to better understand what type of deviation (known or unknown) has been detected.
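The three charts above can be sketched on top of a PCA reference model. Everything here is simulated and illustrative (the data, the injected fault in one variable, and the choice of two retained components are assumptions); real applications would also add statistically derived control limits for T² and SPE.

```python
import numpy as np

# Reference model: PCA on a simulated in-control batch history.
rng = np.random.default_rng(3)
latent = rng.normal(size=(100, 2))
true_load = rng.normal(size=(2, 6))
X = latent @ true_load + 0.1 * rng.normal(size=(100, 6))

mean, std = X.mean(axis=0), X.std(axis=0)
Xc = (X - mean) / std
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2                              # retained components
P = Vt[:k].T                       # loadings (6 x k)
T = Xc @ P                         # historical score time series
score_var = T.var(axis=0, ddof=1)  # per-component score variance

def monitor(x_new):
    """Return T^2, SPE, and per-variable SPE contributions for one batch."""
    xc = (x_new - mean) / std
    t = xc @ P                      # scores: position on the model plane
    t2 = np.sum(t**2 / score_var)   # Hotelling's T^2 (within-model deviation)
    residual = xc - t @ P.T         # part the model cannot explain
    spe = np.sum(residual**2)       # squared prediction error
    return t2, spe, residual**2     # last term: contributions to SPE

# A new batch with a fault in variable 3 that the model cannot explain:
# SPE inflates, and the contribution plot points toward the culprit.
x_fault = X[0].copy()
x_fault[3] += 5 * std[3]
t2, spe, contrib = monitor(x_fault)
```

The split matches the interpretation in the bullets: T² flags unusual movement within the model plane (a known pattern, unusually large), while SPE flags movement off the plane (an event the reference model has never seen), and the contributions identify the offending variables.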
