How to Maintain Suitable Analytical Test Methods: Tools for Ensuring a Validation Continuum

Published in BioPharm International, Volume 18, Issue 10 (October 1, 2005), pp. 40–45.

Many industry professionals know that analytical testing of biopharmaceuticals, covering all raw materials, production in-process stages, and final containers, must be validated, and they generally understand how this can be achieved. Many of us even understand the basic concepts of laboratory compliance and production process quality. However, how exactly are analytical test method performance and process robustness related, and how do they depend on each other? Furthermore, how do we monitor and maintain the accuracy and reliability of analytical methods long after validation is complete, to ensure these methods remain suitable for measuring process quality?

Stephan O. Krause, Ph.D.

Most monitoring and maintenance activities should be controlled within a validation master plan (VMP) program. A good VMP in quality control (QC) should include not only detailed strategies and timelines for initial validation activities but also tools for maintaining compliance for years to come. The lifecycle of an analytical method should be captured from beginning to end. The beginning can be defined to lie somewhere between early-stage and finished method development, whereas the end is usually the replacement of one method with a better one. Currently, relatively little guidance exists for maintaining compliance and validity for analytical testing. This article focuses on how to run a practical program for analytical method maintenance (AMM) and how to ultimately integrate it into the production process. Detailed suggestions for analytical method development (AMD), analytical method validation (AMV), and analytical method transfer (AMT) as well as regulatory submissions are discussed elsewhere.1-7

Figure 1 illustrates a simplified ideal sequence of QC qualification and validation activities to bring an analytical method initially into compliance. If executed as written, a compendial method does not require AMD, but it will require test method verification, which demonstrates method suitability under the actual conditions of use. The concept and data requirements for test method verification are described in the new draft chapter <1226> Verification of Compendial Procedures for the United States Pharmacopoeia (USP).8 The data requirements for compendial method verifications are similar to those required to demonstrate method comparability for each test method category (for example, quantitative limit test).7

VMP FOR ANALYTICAL METHODS

All compendial and non-compendial test methods should be listed in the QC VMP, together with the supporting instrumentation and equipment, computer software and hardware, critical reagents, assay standards and controls, and operator training. Although listing these details is time-consuming and constitutes a front-heavy planning activity, it is essential later for maintaining a compliant QC validation and testing program. All supporting test system components that are prerequisites for AMV should be identified and linked to each test method in the VMP. To make prioritization, and therefore the assignment of target completion dates, practical and effective, each AMV prerequisite should carry the same priority as the AMV it supports. When preparing the timelines in the overall QC VMP and in the individual project plans, keep in mind that individual components of a validation project can fail and therefore pose a risk to the timely completion of the overall project. Figure 1 identifies some of the prerequisites (for example, instrument qualification) that should be completed before initiating an AMV. Gantt charts or similar project management tools assist in laying out efficient sequences of validation activities and help ensure sufficient resources and time for completion.
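As a rough illustration of how such a listing might be organized, the sketch below (Python) ties each test method to its prerequisites and a priority so that target completion dates can be ordered consistently. The method names, fields, and priorities are hypothetical and are not taken from the article.

```python
from dataclasses import dataclass, field

@dataclass
class VMPEntry:
    """One test method tracked in the QC validation master plan (illustrative fields)."""
    method: str
    priority: int                                        # 1 = highest (e.g., safety/purity/potency tests)
    prerequisites: list = field(default_factory=list)    # instruments, standards, training, ...
    target_date: str = ""                                # planned AMV completion date

# Hypothetical entries for illustration only
vmp = [
    VMPEntry("Potency bioassay", 1,
             ["Plate reader IQ/OQ", "Reference standard qualification", "Analyst training"]),
    VMPEntry("Residual host-cell protein ELISA", 1,
             ["ELISA reader qualification", "Critical reagent qualification"]),
    VMPEntry("Appearance (compendial)", 3, ["Analyst training"]),
]

# Prerequisites inherit the priority of the AMV they support, so supporting
# qualifications are never scheduled behind the validation they enable.
for entry in sorted(vmp, key=lambda e: e.priority):
    print(f"Priority {entry.priority}: {entry.method}")
    for prereq in entry.prerequisites:
        print(f"  requires -> {prereq} (same priority: {entry.priority})")
```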

Figure 1. Process Map for Analytical Method Validation

Ideally, formal AMV only confirms suitable test method performance that is usually already known before validation execution. If there is a low risk of failing a particular AMV, its completion can be planned accordingly with a lower priority. Critical production process and product quality attributes must be understood and measurable before a production process can be truly optimized. The earlier that process variability can be reliably and accurately measured, understood, and ultimately controlled, the higher the probability of gaining license approvals and of operating and maintaining a robust production process. A brief list of high-priority AMVs to support process validations and the submission of a license application follows:

  • All tests for drug safety, purity, efficacy, and stability

  • Advanced, automated, or new analytical technologies not yet commonly used in the biopharmaceutical industry

  • Final container and late-stage, in-process testing

COMPLIANCE THROUGH AMM

Once regulatory approval is obtained and production is running smoothly, all quality systems must be monitored and properly maintained. Figure 2 illustrates how analytical methods can be readily monitored and maintained within the QC VMP. The gray boxes identify all new analytical methods that need validation and all compendial procedures that need verification. If a method is later replaced, a method comparability study must provide evidence that the new method is comparable to the current and licensed method.7 Successfully completing these steps brings you to initial compliance. The overall AMM for validated test methods is split into method modifications (in yellow) and method reviews (in green/red).

Figure 2. An Analytical Method Maintenance Program

METHOD MODIFICATIONS

Critical test system elements such as reference standards or analytical instruments are deliberately listed in yellow (Figure 2) under this maintenance category. Depending on the depth of the assay robustness studies conducted during AMD, some critical test system elements may already have been identified and understood. By definition, a truly robust assay remains unaffected by small changes in operational limits and critical method elements. In reality, however, AMD studies may lack a meaningful design of experiments to test for robustness. In addition, many test methods are inherently sensitive to changes and are therefore not very robust. This is why the suggested AMM program is important: it helps keep test results within acceptable accuracy and precision limits. If assay robustness was not thoroughly evaluated during AMD (and AMV), the AMM program should monitor all test system components, because it is not known which one, when altered, could trigger a change in test results. Equivalency studies that compare before-and-after method performance criteria (generally accuracy and precision) can be used to document that changing a critical test system element leaves test method performance unchanged. Alternatively, the assay control chart associated with the test method can be monitored for drift (after the change) or an undesirable spread of control data points.
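One minimal way to document such a before-and-after equivalency study is sketched below (Python). The recovery data, the 0.05 significance threshold, and the choice of a two-sample t-test for accuracy and an F-ratio for precision are illustrative assumptions, not a procedure prescribed by the article.

```python
import statistics
from scipy import stats

# Hypothetical recovery results (%) on the same QC sample,
# before and after replacing a critical test system element.
before = [98.7, 99.4, 100.2, 99.0, 98.5, 99.8, 100.1, 99.3]
after  = [99.1, 99.9, 100.4, 98.8, 99.5, 100.0, 99.7, 99.2]

# Accuracy: compare means with Welch's two-sample t-test.
t_stat, t_p = stats.ttest_ind(before, after, equal_var=False)

# Precision: compare variances with an F-ratio (larger variance in the numerator).
v_before, v_after = statistics.variance(before), statistics.variance(after)
if v_before >= v_after:
    f_ratio, dfn, dfd = v_before / v_after, len(before) - 1, len(after) - 1
else:
    f_ratio, dfn, dfd = v_after / v_before, len(after) - 1, len(before) - 1
f_p = 2 * stats.f.sf(f_ratio, dfn, dfd)   # two-sided p-value

print(f"Mean shift: {statistics.mean(after) - statistics.mean(before):+.2f}% (p = {t_p:.3f})")
print(f"Variance ratio: {f_ratio:.2f} (p = {f_p:.3f})")
print("No significant change detected" if t_p > 0.05 and f_p > 0.05
      else "Investigate before/after difference")
```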

Minor method modifications such as replacing an assay control with an equivalent control are usually summarized together with the overall test method performance evaluation (assay control charts, number of invalids, etc.) in annual reports to regulatory authorities. Major changes may not require revalidation, but they will require the demonstration of comparability of results before and after.7 For example, if a reference standard for a potency test method is changed, the equivalence of the new standard should be demonstrated by comparing accuracy and intermediate precision results using actual production process material.7 Major modifications may also require regulatory approval before their implementation.7

METHOD REVIEW

Periodic method performance and compliance reviews (Figure 2, in green), together with an assessment of the impact of method modifications, help ensure an overall validation continuum. A recently published Food and Drug Administration (FDA) conference presentation, given by Mary Malarkey of FDA's Center for Biologics Evaluation and Research, summarized FDA's current thinking about production operations and a process validation continuum.9 Although that presentation predominantly addressed the process validation continuum, the concept of constantly monitoring and, if needed, adjusting the manufacturing process is certainly similar to the analytical method maintenance tools proposed here. Acceptable method performance is continuously monitored with assay control charts, indicating that test systems are in overall control; this evidence is usually submitted in annual reports to FDA. When a licensed biopharmaceutical drug has been produced over several years and test methods have in the interim gone through inevitable changes, a more thorough review may be needed to ensure that the production process is properly measured and under control. Many seemingly minor changes can accumulate and compound into major changes in test method performance over the years. In addition, regulatory expectations are changing and generally becoming more stringent in their demands for accuracy, reliability, and validity of test results. Test methods validated years ago may therefore have their performance and current validation status questioned in an upcoming inspection.

A thorough periodic review is a good preventive tool: rather than reacting to unacceptable method performance or regulatory inspection observations, it can prevent them. Performance reviews should be triggered through the VMP, because all AMV approval dates should be listed in that plan. To ensure these reviews are done meaningfully and consistently among the various test methods and laboratories, a review checklist is an excellent tool for periodically assessing the validated state of each test. This ultimately ensures that each test method delivers accurate and reliable test results to support the quality of the production process or product. It also helps keep a test method compliant, efficient, and economical, because the rate of invalid test results then neither delays production flow or product release nor increases the demand for expensive instrumentation, reagents, or personnel. A simplified checklist is shown in Figure 3.

Figure 3. Assessment Form for Test Method Suitability and Validation Status
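A checklist along the lines of Figure 3 can be captured quite simply; the sketch below (Python) is illustrative only, and the specific checklist items and pass/fail logic are assumptions rather than the article's actual form.

```python
# Hypothetical periodic-review checklist for one validated test method.
checklist = {
    "Assay control chart within established limits (no trends or shifts)": True,
    "Invalid-run rate at or below historical baseline":                    True,
    "All method modifications assessed for cumulative impact":             True,
    "Reference standards and critical reagents currently qualified":       False,
    "Analyst training and instrument qualification current":               True,
    "Validation parameters still meet current regulatory expectations":    True,
}

failed = [item for item, ok in checklist.items() if not ok]
if failed:
    print("Review outcome: corrective action required")
    for item in failed:
        print("  open item:", item)
else:
    print("Review outcome: method remains in a validated state")
```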

Emergency method performance reviews are shown in red in Figure 2 to indicate the urgency and criticality of the associated investigations. Emergency reviews are usually triggered by out-of-specification (OOS) results, above-action-level results, or frequent invalid assay runs that may postpone the release of material. A good AMM program supports the generation of accurate and reliable test results, meaning that an observed OOS result may, unfortunately, indicate a "bad" production batch, or it may simply be one of the statistically expected outliers. One more element of an AMM program can be used to assess process and method performance simultaneously and thereby provide more confidence in all process results: by overlaying the test result for each manufacturing process test point with the corresponding assay control result, one saves investigation resources, because there is often an immediate answer on the validity of the assay result in question. Provided that test system suitability was appropriately developed and maintained in the AMM program and the analytical test system is under control, there is no need to investigate further. More detail on combined statistical process control and assay control charts is provided in Figure 4.
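A minimal sketch of this triage logic is shown below (Python; the limits and results are hypothetical). If the assay control ran within its control range, the OOS result more likely reflects the batch itself; if the control is out of range, the assay run is suspect and should be investigated first.

```python
def triage_oos(sample_result, spec_limit, control_result, control_mean, control_sd):
    """Hypothetical first-pass triage of an out-of-specification (OOS) result
    using the assay control run alongside the sample (limits are illustrative)."""
    oos = sample_result > spec_limit
    control_ok = abs(control_result - control_mean) <= 3 * control_sd
    if not oos:
        return "Result within specification; no investigation needed"
    if control_ok:
        return "OOS with assay control in range: focus the investigation on the batch"
    return "OOS with assay control out of range: investigate the assay run first"

# Example values loosely consistent with the impurity test discussed later (illustrative only)
print(triage_oos(sample_result=2.3, spec_limit=2.0,
                 control_result=1.2, control_mean=1.1, control_sd=0.25))
```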

Figure 4. Overlay of Control Charts for Process and Analytical Performance

SUPPORTING PRODUCTION PROCESS CONTROL AND ROBUSTNESS

When laboratory test results are reported into a data management system used for process monitoring, the corresponding assay control results can be reported simultaneously to permit an overlay of production process data and test system control data. This readily facilitates the monitoring of statistical process control data and analytical test system control data. Data from these overlay charts provide several important process and analytical performance criteria. In brief, process and test method performance can be monitored individually and compared with each other immediately. Analytical inaccuracy and imprecision can be a major contributor to the observed process variability; this is particularly true for many bioassays used to test biopharmaceutical drug efficacy, purity, and stability. It is therefore extremely useful to relate the observed process variability to the assay control results generated simultaneously by the same method. As stated earlier, many outlier-result investigations can be shortened and simplified, because the outcome often hinges on whether the test result was valid and within the expected test system control range.

Figure 4 illustrates the overlay of sample results and the corresponding assay control results.10 The control and each sample were manufactured by the same process and are routinely tested using this quantitative limit test. In Figure 4, a characterized and quantitated process impurity is directly monitored against the test system assay control. One immediately observes that neither the process nor the analytical testing is out of statistical control. However, the measured process performance is not ideal: the distance from the process mean (1.10%) to the specification limit of not more than 2.0% is slightly less than three standard deviations (SD). Individual assay control and process data are reported in this chart to one decimal place (0.1%), consistent with the significant digits established in the specification limit. One also readily recognizes that the observed process variability (SD = 0.36%) includes a significant contribution from the day-to-day assay variability (SD = 0.25%) reflected in the assay control data.
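The distance from the process mean to the specification limit, expressed in standard deviations, can be verified with a few lines (Python) using the figures read from Figure 4.

```python
process_mean = 1.10   # % impurity, observed process mean
spec_limit   = 2.0    # % impurity, specification: not more than 2.0%
observed_sd  = 0.36   # % impurity, observed (overall) process standard deviation

distance_in_sd = (spec_limit - process_mean) / observed_sd
print(f"Mean-to-limit distance: {distance_in_sd:.1f} SD")   # 2.5 SD, i.e., slightly less than 3 SD
```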

We took this a step further by using the data from the overlay chart together with our mixing/blending study results from the process development and validation data to estimate the actual process variability. A simplified relationship of the main factors contributing to the overall observed process variability is shown here (V = variability):

[V observed for process]² = [V assay]² + [V sampling/batch uniformity]² + [V actual for process]²

The results are illustrated in Figure 5. One immediately recognizes that the actual or true process variability (SD = 0.18%) is relatively small compared with the observed process variability. Given this, the easiest way to decrease the observed process variability is to increase both the number of samples collected and the number of assays performed before averaging all results. This significantly lowers the assay variability and sampling/batch-uniformity contributions to a more tolerable level. We could probably bring our less-than-3-SD process much closer to a desirable (but currently not measurable) 5-SD process. Although this would likely increase resource demands, mostly for the analytical testing, it would mean fewer OOS results, leading to a more robust process and ultimately more product to market. In addition, an improved day-to-day precision (intermediate precision) in analytical results translates into more reliable test results for measuring deliberate process modifications.

Figure 5. Contributing Factors to Process Variability
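The decomposition shown in Figure 5 can be reproduced from the reported figures, as in the sketch below (Python). The sampling/batch-uniformity term is obtained here by difference, and the replicate counts in the averaging example are illustrative assumptions.

```python
import math

observed_sd = 0.36   # % - observed process SD from the overlay chart
assay_sd    = 0.25   # % - day-to-day assay SD from the assay control chart
actual_sd   = 0.18   # % - estimated true process SD (Figure 5)

# Variances add, so solve the relationship for the sampling/batch-uniformity term.
sampling_var = observed_sd**2 - assay_sd**2 - actual_sd**2
sampling_sd = math.sqrt(sampling_var)
print(f"Sampling/batch-uniformity SD: about {sampling_sd:.2f}%")

# Averaging n independent samples and k assay replicates shrinks those
# variance contributions by factors of 1/n and 1/k, respectively.
n_samples, k_assays = 3, 3   # illustrative replicate counts
reduced_observed_sd = math.sqrt(actual_sd**2
                                + sampling_var / n_samples
                                + assay_sd**2 / k_assays)
print(f"Observed SD with averaging: about {reduced_observed_sd:.2f}%")
```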

CONCLUSIONS

A good AMM program can indicate weak points within the overall process quality when control charts are combined. With AMM, one can estimate the time, money, and effort needed to correct imprecise results and achieve more accurate data. Remember that testing and production process adjustments are costly and can be risky when test results are not reliable.

Stephan O. Krause, Ph.D., is validation manager of QC assay support for Bayer HealthCare LLC, 800 Dwight Way, Berkeley, CA. 94701-1986, 510.705.4191, Fax: 510.705.5143, stephan.krause.b@bayer.com

REFERENCES:

1. International Conference on Harmonisation; Guideline on Validation of Analytical Procedures (Q2A). 60 Federal Register 11259-11262 (1995).

2. International Conference on Harmonisation; Guideline on Validation of Analytical Procedures: Methodology (Q2B). Federal Register 27463-27467 (1997).

3. Krause SO. Qualifying Release Laboratories in Europe and the United States. BioPharm International. 2004; 17 (3):28-36.

4. Krause SO. Development and validation of analytical methods for biopharmaceuticals, part I: development and optimization. BioPharm International. 2004; 17(10):52-61.

5. Krause SO. Development and validation of analytical methods for biopharmaceuticals, part II: formal validation. BioPharm International. 2004; 17(11):46-52.

6. Krause SO. Analytical Method Validation for Biopharmaceuticals, BioPharm International Guide to Validation. 2005; 24-32.

7. Krause SO. Submitting Advanced Bioanalytical Test Methods for Regulatory Approval. BioPharm International Guide to Bioanalytical Advances. 2005; 21-27.

8. United States Pharmacopoeia (USP). General Chapter in Draft, <1226> Verification of Compendial Procedures, Pharmacopeial Forum, 2005.

9. Fuege J. The Validation Continuum: An FDA Perspective, Presented by Mary Malarkey, Journal of Validation Technology. 2005; 11(3):253-6.

10. Krause SO. Analytical Method Validation for Biopharmaceuticals. Invited General Session Presentation, IBC International Conference: Quality Systems and Regulatory Compliance, Reston, VA, April 05, 2005.