Quality by Design: Industrial Case Studies on Defining and Implementing Design Space for Pharmaceutical Processes (Part 2)

Published in BioPharm International, Volume 22, Issue 1 (January 2009), Pages 40–45

Understanding the relationship between the process and CQAs.

ABSTRACT

A key challenge in successfully implementing Quality by Design (QbD) is achieving a thorough understanding of the product and the process. This knowledge base must include understanding the variability in raw materials, the relationship between the process and the critical quality attributes (CQAs) of the product, and finally the relationship between the CQAs and the clinical safety and efficacy of the product. Part 1 of this article presented a stepwise approach to defining a design space. This Part 2 explains how to validate, file, and monitor the design space. It also discusses how to implement QbD for existing products and how to integrate QbD with process analytical technology (PAT).

Quality by Design (QbD) has received significant attention in both the traditional pharmaceutical and the biopharmaceutical industries since the FDA's publication of the International Conference on Harmonization (ICH) Q8 guidance, Pharmaceutical Development, in May 2006.1 In the traditional approach to process validation, companies use process understanding from process characterization studies to define critical, key, and non-key process parameters. The critical process parameters are then controlled within a narrow range, and the process is validated to ensure that it operates within the defined ranges for those parameters. In the QbD paradigm, identifying critical process parameters and critical quality attributes (CQAs) is still required to define the design space for a process. However, companies must go further in developing their understanding of both how the CQAs relate to clinical performance and how the process ensures that these attributes are controlled.2 Online monitoring has been demonstrated successfully in some aspects of biopharmaceutical processes, such as fermentation,3–5 and possibilities for future applications to raw materials6 and downstream processing7,8 are currently under investigation. Once a design space has been defined, it will be continually reassessed and, pending regulatory acceptability, revised as process understanding grows.9 This is part of continuous improvement within the quality systems approach.10,11

This article is the fourteenth in the "Elements of Biopharmaceutical Production" series. In Part 2, we present a stepwise approach to validating, filing, and monitoring a design space. Case studies from industry, covering both biotech and traditional small-molecule pharmaceutical manufacturing, illustrate the key aspects. We also discuss how to implement QbD for existing products and how to integrate QbD with process analytical technology (PAT).

Anurag S. Rathore, PhD

QbD IMPLEMENTATION FOR PHARMACEUTICALS

Validating and Filing a Design Space

Since a design space "assures quality" of the drug product, the limits defined in the design space could also provide the basis of the validation acceptance criteria.9 After the design space has been created, process validation becomes an exercise to demonstrate 1) that the process will deliver a product of acceptable quality if operated within the design space; and 2) that the small- or pilot-scale systems, scale-up parameters, or models used to establish the design space accurately model the performance of the manufacturing scale process. Thus, unanticipated manufacturing excursions that remain within the design space should not jeopardize the success of the validation exercise. After the design space has been established and validated, the regulatory filing would include all critical process parameters including material attributes (i.e., those included in the design space) and could also include key process parameters along with the corresponding acceptable and operating ranges.
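To make the idea of operating within filed ranges concrete, the following minimal sketch checks a batch record against a set of acceptable ranges and reports any excursions. The parameter names and limits are entirely hypothetical and are not drawn from any filing or case study.

```python
# Minimal sketch: verify that recorded batch parameters fall inside filed
# design space ranges. Parameter names and limits are hypothetical examples.

DESIGN_SPACE = {
    # parameter: (lower acceptable limit, upper acceptable limit)
    "load_ph": (6.8, 7.4),
    "column_temperature_c": (18.0, 26.0),
    "load_conductivity_ms_cm": (4.0, 8.0),
}

def check_batch(record: dict) -> list:
    """Return (parameter, value, limits) tuples for any missing value or excursion."""
    excursions = []
    for name, (low, high) in DESIGN_SPACE.items():
        value = record.get(name)
        if value is None or not (low <= value <= high):
            excursions.append((name, value, (low, high)))
    return excursions

batch = {"load_ph": 7.1, "column_temperature_c": 27.5, "load_conductivity_ms_cm": 5.2}
for name, value, (low, high) in check_batch(batch):
    print(f"{name} = {value} outside acceptable range [{low}, {high}]")
```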

Figure 1

Monitoring a Design Space and Making Post-Approval Changes to Design Space

During commercial-scale manufacturing, proactive trending of product performance on a continuous basis affords significant benefits through early detection of emerging issues and further optimization of the control strategy to ensure operation within the design space. By using tools such as design of experiments (DOE) and systematic problem-solving approaches such as Six Sigma, manufacturing operations can resolve problems and continue to optimize processes. These tools allow for and encourage a systematic approach to resolving unknown sources of variability and improving manufacturing robustness.

Case Study 1: Tablet Dissolution

The case study presented in Figure 1A describes the impact of subtle changes in raw material variability on product performance and the importance of continuous monitoring throughout the product lifecycle to ensure product quality. During routine monitoring of product performance for an extended-release tablet, an incidence of high variability in dissolution results was observed. Although all of the lots produced during this period met specifications, the trend in variability raised concerns about the potential for product quality problems in the future. A data analysis evaluating process capability with respect to dissolution at 12 h suggested that supplemental tier 2 or tier 3 testing would be required to ensure product quality. Because the root cause of the upward trend in dissolution was not understood, a project was initiated using Six Sigma methodology to identify the root cause, design an improvement plan, and verify the impact of the corrective action. Six Sigma is a well-known, structured approach to solving technical problems that have no known solution but have a measurable defect and identifiable causes. The steps in a Six Sigma approach are to 1) define the problem, 2) evaluate the ability to measure the problem, 3) analyze the problem using the appropriate method, 4) improve the process, and 5) implement the derived controls.

In this instance, the project team used production data and analytical methods to identify the root cause and to develop and implement the corrective action within a few months. Evaluation of various parameters through multivariate analysis showed the root cause of variability to be directly related to raw material properties. The raw material properties affecting the dissolution rate were identified, and a new, tighter specification was defined and implemented to control the quality attributes of the incoming raw material. The data and trend shown to the left of the vertical red line in Figure 1A are from before the implementation of controls, while the data to the right of the line are from after implementation. Following completion of the Six Sigma "Improve" phase, the process was found to be significantly more robust, as seen in Figure 1B. A statistical analysis showed that the process capability was 0.86 before the Six Sigma project and 1.93 afterward. The histogram in Figure 1B of mean 12-h dissolution before and after the project illustrates the improved robustness and a shift of the dissolution mean toward the center of the allowable specification range.

Further, a predictive model was developed using a JMP software-based analysis of historical production data. A multivariate dissolution model was created to predict 12-h dissolution on an ongoing, proactive basis. A Pareto analysis of the data versus the CQA ranked the variables by correlation; five factors, all quality attributes of the formulation excipient and drug substance, were found to be statistically significant. Figure 1C shows the prediction profiler from the model: the steeper the slope, whether positive or negative, the more that factor contributes to variability in dissolution. The statistical model was validated using four different excipient lots, each converted into 10–13 lots of tablets. The data are presented in Table 1. The predicted dissolution from the model versus the average 12-h dissolution from the multiple tablet lots gave a prediction error of about 1.0%.


Table 1
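For readers who want to reproduce this type of analysis in a general-purpose tool, the sketch below computes a process capability index before and after an improvement and fits a simple multivariate least-squares model of 12-h dissolution against raw material attributes. The specification limits, attribute data, and coefficients are simulated for illustration only; the original work was performed in JMP.

```python
# Sketch of the two analyses described above: process capability (Cpk) and a
# multivariate model of 12-h dissolution. All numbers are illustrative only.
import numpy as np

def cpk(data, lsl, usl):
    """Capability index: distance from the mean to the nearer spec limit, in 3-sigma units."""
    mu, sigma = np.mean(data), np.std(data, ddof=1)
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

rng = np.random.default_rng(0)
lsl, usl = 70.0, 90.0                       # hypothetical 12-h dissolution specs (%)
before = rng.normal(76.0, 3.0, size=40)     # off-center and variable
after = rng.normal(80.0, 1.6, size=40)      # centered and tighter after improvement
print(f"Cpk before: {cpk(before, lsl, usl):.2f}, after: {cpk(after, lsl, usl):.2f}")

# Least-squares model: dissolution ~ raw material attributes. The five columns
# of X stand in for hypothetical excipient and drug substance attributes.
X = rng.normal(size=(40, 5))
true_coeffs = np.array([2.0, -1.5, 0.8, 0.0, 0.3])
y = 80.0 + X @ true_coeffs + rng.normal(0, 0.5, size=40)
coeffs, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(X)), X]), y, rcond=None)
print("Intercept and fitted attribute coefficients:", np.round(coeffs, 2))
```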

Case Study 2: Cell Culture Scale-Up

Another case study illustrating the role of process monitoring in demonstrating process performance within a design space was recently published.5 Multivariate data analysis (MVDA) and modeling were performed using representative data from small-scale (2-L) and large-scale (2,000-L) batches of a mammalian cell culture process. Several input parameters (pCO2, pO2, glucose, pH, lactate, and ammonium ions) and output parameters (purity, viable cell density, viability, and osmolality) were evaluated in this analysis. An MVDA model was created using the 14 representative small-scale (2-L) batches and then used to predict performance at the 2,000-L scale. The results presented in Figure 2 show that the process at 2,000 L is indeed very representative of the 2-L process. All 11 large-scale batches lie within the control limits calculated from the 2-L data, except for two batches that deviated slightly between time values 12 and 13. Further investigation indicated that one of these two lots had the highest viable cell density (VCD) and the other the highest lactate levels of all the runs. These two batches were therefore somewhat "atypical," illustrating the usefulness of batch control charts in fault diagnosis during manufacturing, for example in diagnosing equipment failures and raw material issues.

Figure 2
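A simplified version of this kind of MVDA batch monitoring can be sketched with principal component analysis and a Hotelling's T² control limit. The example below uses simulated data and scikit-learn; it is not the model from the published study, and the dimensions, component count, and 95% limit are illustrative assumptions.

```python
# Simplified MVDA monitoring sketch: fit a PCA model on small-scale batch data
# and score a large-scale observation with Hotelling's T^2 against a control
# limit derived from the training set. Data here are simulated placeholders.
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_train, n_vars, n_comp = 14, 6, 2          # e.g., pCO2, pO2, glucose, pH, lactate, NH4+
train = rng.normal(size=(n_train, n_vars))  # stand-in for 2-L batch observations

pca = PCA(n_components=n_comp).fit(train)
score_var = pca.transform(train).var(axis=0, ddof=1)

def hotelling_t2(x):
    """T^2 statistic of one observation relative to the training-set score variances."""
    t = pca.transform(x.reshape(1, -1))[0]
    return float(np.sum(t**2 / score_var))

# 95% control limit for a new observation, based on the F-distribution
a, n = n_comp, n_train
t2_limit = a * (n - 1) * (n + 1) / (n * (n - a)) * stats.f.ppf(0.95, a, n - a)

large_scale_obs = rng.normal(size=n_vars)   # stand-in for one 2,000-L observation
t2 = hotelling_t2(large_scale_obs)
status = "within" if t2 <= t2_limit else "outside"
print(f"T^2 = {t2:.2f}, limit = {t2_limit:.2f}, {status} control limits")
```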

As the case studies above highlight, if the manufacturing process drifts outside the design space, process changes may be needed, and these may in turn require process characterization, validation, and filing of changes to the approved design space.

PAT and QbD

Process analytical technology (PAT) has been defined as "a system for designing, analyzing, and controlling manufacturing through timely measurements (i.e., during processing) of critical quality and performance attributes of raw and in-process materials and processes, with the goal of ensuring final product quality." As explained in the FDA's PAT guidance document,2 the goal of PAT is to "enhance understanding and control the manufacturing process," which is consistent with the idea that quality cannot be tested into products but should be built in by design. PAT is thus an enabler of the concept of Quality by Design. The design space is defined by the critical process parameters, possibly including material attributes, identified from process characterization studies, together with their acceptable ranges; these parameters and attributes are the primary focus of online, inline, or at-line PAT applications. Depending on the situation, real-time PAT assessments could provide the basis for continuous feedback and result in improved process robustness. Yu et al. have reviewed the PAT concept and discussed its application to a crystallization process.13 They presented a variety of in situ analytical methods that could be combined with chemometric tools to analyze multivariate process information and provide a basis for future improvements in the modeling, simulation, and control of crystallization processes. The following case studies illustrate the integration between PAT and QbD.
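As a generic illustration of combining multivariate in-process measurements with chemometric tools, the sketch below builds a partial least squares (PLS) calibration relating simulated spectra to a quality attribute. The data, dimensions, and model settings are invented and do not correspond to any of the cited applications.

```python
# Sketch of a chemometric calibration: PLS regression relating simulated
# "spectra" to a quality attribute. Dimensions, coefficients, and noise levels
# are arbitrary illustrations, not values from the cited studies.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_samples, n_wavelengths = 200, 50
spectra = rng.normal(size=(n_samples, n_wavelengths))
# The attribute depends (by construction) on two spectral regions plus noise.
attribute = 3.0 * spectra[:, 10] - 1.5 * spectra[:, 35] + rng.normal(0, 0.2, n_samples)

X_train, X_test, y_train, y_test = train_test_split(spectra, attribute, random_state=0)
model = PLSRegression(n_components=4).fit(X_train, y_train)
pred = model.predict(X_test).ravel()
r2 = 1 - np.sum((y_test - pred) ** 2) / np.sum((y_test - y_test.mean()) ** 2)
print(f"Held-out R^2 of the PLS calibration: {r2:.2f}")
```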

Case Study 3: IEX Chromatography

Besides its obvious usefulness in ensuring that the process operates within the approved design space, PAT can also be useful in broadening the design space based on enhanced knowledge and understanding. This case study involved an ion exchange (IEX) chromatography step that removes an impurity (Impurity 1) from the product to levels below the drug substance specifications.8 A typical chromatographic profile is illustrated in Figure 3, with the earliest fractions (A) showing pure product, the middle fractions (F) a mix of product and impurity, and the later fractions (Y) primarily impurities. In the traditional approach, pooling of such a column is performed by UV absorbance at 280 nm. The key advantage of such pooling is the simplicity of implementing UV-based pooling criteria in a manufacturing environment. This approach works well for bind-and-elute applications in which the entire peak is collected from baseline to baseline and the protein concentration is linearly correlated to the absorbance signal over the range under consideration.

However, in a high-resolution separation where only part of the peak is collected in order to pool the product and reduce one or more impurities, this approach is not optimal, because absorbance-based methods cannot differentiate between the product and other proteins or species with a similar absorbance profile. Thus, if the impurity levels in the feed material vary from lot to lot, pool purity will also vary. This could result in lot rejection if the load material had a higher level of Impurity 1 than the IEX column is capable of clearing. To prevent such an event, the process parameters for the column would need to be maintained within tight ranges, resulting in a narrow design space for the IEX column. In a PAT-based approach, by contrast, an online high performance liquid chromatography (HPLC) system can be used to design a dynamic control scheme that allows the IEX column eluate to be tested in real time as it elutes. Thus, irrespective of the concentration of Impurity 1 in the feed material, the IEX column pooling procedure can be adjusted so that the pool quality is consistent and meets the drug substance specifications.8

Figure 3
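The pooling decision can be pictured as a running calculation: fractions are added to the pool only while the cumulative impurity level remains below the specification. The following schematic sketch illustrates that logic with invented fraction data and an assumed 2.0% specification; it is not the published online HPLC implementation.

```python
# Schematic of a real-time pooling decision for an IEX column: keep adding
# eluate fractions while the cumulative pool still meets the impurity spec.
# Fraction data and the 2.0% specification below are illustrative only.

IMPURITY_SPEC_PCT = 2.0

# (fraction id, product mass in g, Impurity 1 mass in g) as reported by online HPLC
fractions = [
    ("A", 5.0, 0.01), ("B", 8.0, 0.03), ("C", 9.0, 0.10),
    ("D", 7.0, 0.25), ("E", 4.0, 0.40), ("F", 2.0, 0.50),
]

pooled_product = pooled_impurity = 0.0
pooled_ids = []
for frac_id, product, impurity in fractions:
    new_pct = 100 * (pooled_impurity + impurity) / (pooled_product + product)
    if new_pct > IMPURITY_SPEC_PCT:
        break                      # stop pooling before the spec would be exceeded
    pooled_product += product
    pooled_impurity += impurity
    pooled_ids.append(frac_id)

print(f"Pooled fractions {pooled_ids}: "
      f"{100 * pooled_impurity / pooled_product:.2f}% Impurity 1 (spec <= {IMPURITY_SPEC_PCT}%)")
```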

Case Study 4: Monitoring the Dryer Bowl

A second case study, involving the use of PAT sensor technology to monitor a drying end point, is shown in Figure 4. Traditional approaches to drying may rely on little more than time and temperature to control the drying process. At the end of a defined drying period, the operator is generally required to sample the powder bed, obtain loss-on-drying values, and compare the results against approved specifications. If the results are satisfactory, drying is complete and the material is transferred to the next manufacturing step. If instead the results are above the in-process specification limit, the material can be re-dried.

Figure 4

If near-infrared (NIR) sensors are used instead, the fluid bed drying process can be monitored continuously and the moisture content end point determined in real time for each drying cycle. Risks from sampling errors and from raw material and environmental variability that contribute to process variability are reduced, if not eliminated. Process development studies based on time, temperature, airflow, and dew point were carried out to evaluate their effect on the dried material's physical properties, including particle size and residual solvent content. The commercial process was simulated at laboratory, pilot, and commercial scale. Because of equipment design differences (e.g., bowl, agitation, and bag design) at different scales of manufacturing, specific trials, some of them redundant, were deemed necessary at each step. Figure 4A shows a schematic of an NIR detector attached through the wall of the dryer bowl, and Figure 4B shows several examples of real-time data generated throughout the drying cycle. The NIR sensors monitored the drying cycle and enabled its control through a control algorithm. As results were obtained during each run, the process parameters could be adjusted to produce dried material with consistent quality attributes. Applying PAT in this manner led to a process with lower variability.
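The end-point logic itself can be summarized simply: drying stops once the NIR-predicted moisture has remained at or below the target for several consecutive readings. The sketch below illustrates that logic with a simulated moisture signal and hypothetical target and reading counts; a real system would use a validated NIR calibration model and the equipment's control layer, not the placeholder generator shown here.

```python
# Schematic drying end-point logic: stop when NIR-predicted moisture stays at or
# below the target for a set number of consecutive readings. Values are invented.
import time

MOISTURE_TARGET_PCT = 1.5
CONSECUTIVE_READINGS = 3

def simulated_nir_readings(start=5.0, step=0.4, floor=0.8):
    """Stand-in for successive moisture predictions from an NIR calibration model."""
    value = start
    while True:
        yield value
        value = max(floor, value - step)

below = 0
for moisture in simulated_nir_readings():
    below = below + 1 if moisture <= MOISTURE_TARGET_PCT else 0
    print(f"moisture = {moisture:.1f}% (readings at/below target: {below})")
    if below >= CONSECUTIVE_READINGS:
        break
    time.sleep(0.1)   # stand-in for the real sampling interval
print("Drying end point reached; transfer material to the next step.")
```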

Figure 5

Case Study 5: Detecting Raw Material Variability

Figure 5 illustrates another case study, involving a potential PAT application that is currently under study. In this application, raw materials are identified by NIR to gain the tangible benefits of speed, accuracy, and cost savings that NIR offers over traditional wet chemistry. NIR analysis allows the quality of raw material lots to be trended in real time and any shifts in quality to be detected early. A subsequent manufacturing step involves an extrusion unit operation, which can be monitored continuously inline for temperature and active ingredient concentration. An ultra-performance liquid chromatography test is performed offline to test the material for the presence of a degradation product. Particle size distribution is monitored continuously during milling for process consistency and controlled, through feedback or feed-forward loops, for manufacturability or compression performance as a function of particle size. Finally, the tablet's weight, thickness, potency, and hardness are tested at-line at the tablet press for continuous quality verification and feedback control of compression. These additional data sources reduce quality risk and variability while increasing process understanding.
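One common way to perform NIR raw-material identification is to correlate the incoming lot's spectrum against a reference library and accept the best match above a threshold. The sketch below shows that idea with simulated spectra, hypothetical material names, and an assumed 0.95 correlation threshold; actual identification methods and acceptance criteria vary by firm and library.

```python
# Sketch of NIR raw-material identification by spectral library matching: the
# incoming lot's spectrum is correlated against reference spectra and assigned
# to the best-matching material if the correlation exceeds a threshold.
# Spectra here are simulated; real identification uses validated libraries.
import numpy as np

rng = np.random.default_rng(3)
wavelengths = 200
library = {
    "lactose_monohydrate": rng.normal(size=wavelengths),
    "microcrystalline_cellulose": rng.normal(size=wavelengths),
    "magnesium_stearate": rng.normal(size=wavelengths),
}

# Simulated spectrum of an incoming lot: a noisy copy of one reference material
sample = library["lactose_monohydrate"] + rng.normal(0, 0.1, wavelengths)

def identify(spectrum, threshold=0.95):
    """Return (material, correlation) for the best library match, or (None, best r)."""
    best_name, best_r = None, -1.0
    for name, reference in library.items():
        r = np.corrcoef(spectrum, reference)[0, 1]
        if r > best_r:
            best_name, best_r = name, r
    return (best_name, best_r) if best_r >= threshold else (None, best_r)

name, r = identify(sample)
print(f"Identified as {name} (correlation {r:.3f})" if name else f"No match (best r = {r:.3f})")
```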

Case Study 6: Monitoring Multiple Unit Operations

It is possible to combine and coordinate process knowledge from multiple unit operations to achieve a holistic picture of the entire manufacturing process for a given product. Consider the process train for manufacturing a solid oral dosage form illustrated in Figure 5. Several unit operations are required to combine the right quantities of active pharmaceutical ingredient (API) and excipients under appropriate processing conditions and in a controlled environment to produce a drug product that consistently meets its quality attributes. Through the use of appropriate sensor technology, a real-time profile of the manufacturing process at each step or unit operation could be generated. During commercial manufacturing, material is moved through each manufacturing step only if the real-time profile is consistent with expected historical data. At the end of a manufacturing cycle, a review of the real-time profiles for all unit operations throughout the process could determine conformance and verify that the product meets its quality attributes. If the reported profile is consistent with historical data, based on population analysis, real-time release of the product can be considered. Fundamentally, only those lots that fall outside the known population of data would require additional off-line testing or be rejected.
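A simple way to picture the conformance check described above is to compare a batch's real-time profile against a mean ± 3σ envelope built from historical batches at each time point. The sketch below does this with simulated profiles; the envelope width, data, and decision rule are illustrative assumptions rather than a prescribed real-time release criterion.

```python
# Sketch of profile conformance for real-time release consideration: a new
# batch's in-process profile is compared against a mean +/- 3 sigma envelope
# built from historical batches. Profiles below are simulated placeholders.
import numpy as np

rng = np.random.default_rng(4)
n_batches, n_timepoints = 30, 20
historical = np.cumsum(rng.normal(1.0, 0.05, size=(n_batches, n_timepoints)), axis=1)

mean = historical.mean(axis=0)
sigma = historical.std(axis=0, ddof=1)
lower, upper = mean - 3 * sigma, mean + 3 * sigma

new_batch = np.cumsum(rng.normal(1.0, 0.05, size=n_timepoints))
out_of_envelope = np.flatnonzero((new_batch < lower) | (new_batch > upper))

if out_of_envelope.size == 0:
    print("Profile conforms to the historical population; candidate for real-time release.")
else:
    print(f"Excursions at time points {out_of_envelope.tolist()}; route lot to off-line testing.")
```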

SUMMARY

Successful implementation of QbD in the pharmaceutical industry requires a concerted effort between regulators and industry. Case studies such as those presented here can serve as useful tools in finding common ground and defining best practices across the various functions: Research and Development, Quality, Manufacturing, and Regulatory. Ultimately, the implementation of QbD is likely to result in increased efficiencies both for regulatory reviews and for pharmaceutical manufacturing.

ACKNOWLEDGMENTS

This article summarizes the presentations and discussions that occurred in the plenary session titled "How do you sell Quality by Design (QbD)?" at the PDA–FDA Joint Regulatory Conference held on September 24–28, 2007, in Washington, DC. The objective of the session was to discuss the challenges that are encountered when implementing the QbD paradigm.

Anurag S. Rathore, PhD, is a director of process development at Amgen, 805.447.4491, asrathore@yahoo.com. He is also a member of BioPharm International's editorial advisory board. Stephen H. Montgomery, PhD, is a law clerk at McDonnell Boehnen Hulbert and Berghoff, LLP. Azita Saleki-Gerhardt, PhD, is division vice-president for quality at Abbott, North Chicago, IL, and Stephen M. Tyler is director of strategic quality and technical operations at Abbott.

REFERENCES

1. US Food and Drug Administration. Guidance for industry. Q8 pharmaceutical development. Rockville, MD: 2006 May.

2. US FDA. PAT Guidance for industry. A framework for innovative pharmaceutical development, manufacturing and quality assurance. Rockville, MD: 2004 Sept.

3. Gnoth S, Jenzsch M, Simutis R, Lübbert A. Process analytical technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control. J Biotechnol. 2007;132:180–6.

4. Kirdar AO, Conner JS, Baclaski J, Rathore AS. Application of multivariate analysis towards biotech processes: Case study of a cell-culture unit operation. Biotechnol Prog. 2007;23:61–7.

5. Johnson R, Yu O, Kirdar AO, Annamalai A, Ahuja S, Ram K, Rathore AS. Applications of multivariate data analysis (MVDA) for biotech processing. BioPharm Int. 2007 Oct;20(10):130–44.

6. Kirdar AO, Green KD, Rathore AS. Application of multivariate analysis for identification and resolution of a root cause for a bioprocessing application. Biotechnol Prog. 2008;24:720–6.

7. Rathore AS, Sharma A, Chillin D. Applying process analytical technology to biotech unit operations. BioPharm Int. 2006 Aug; 19(8):48–57.

8. Rathore AS, Yu M, Yeboah S, Sharma A. Case study and application of process analytical technology (PAT) towards bioprocessing: use of on-line high performance liquid chromatography (HPLC) for making real time pooling decisions for process chromatography. Biotechnol Bioeng. 2008;100:306–16.

9. Rathore AS, Branning R, Cecchini D. Design space for biotech products. BioPharm Int. 2007 April;20(4):36–40.

10. US FDA. Guidance for Industry: Q10 quality systems approach to pharmaceutical CGMP regulations. Rockville, MD: 2006 Sept.

11. Kozlowski S, Swann P. Current and future issues in the manufacturing and development of monoclonal antibodies. Adv Drug Deliv Rev. 2006;58:707–22.

12. Harms J, Wang X, Kim T, Yang J, Rathore AS. Defining design space for biotech products: case study of Pichia pastoris fermentation. Biotechnol Prog. 2008;24:655–62.

13. Yu LX, Lionberger RA, Raw AS, D'Costa R, Wu H, Hussain AS. Applications of process analytical technology to crystallization process. Adv Drug Deliv Rev. 2004;56:349–69.