Introduction to Validation of Biopharmaceuticals

Published on: 03-10-2005
BioPharm International, Volume 2005 Supplement, Issue 1
Pages: 40–45

Synthetic drugs can be well characterized by established analytical methods. Biologics, on the other hand, are complex, high-molecular-weight products, and analytical methods can only partially characterize them and their impurity profiles. Regulation of biologics therefore covers not only final product characterization but also characterization and controls on raw materials and the manufacturing process. FDA has defined process validation as "establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality attributes." This involves supporting product and manufacturing process claims with documented scientific studies. Protocols, results with statistical analysis, authorizations, and approvals must be available to regulatory inspectors. Process validation is part of current good manufacturing practices (cGMP) and is required in the US and EU for a manufacturing license.

In addition to process validation, biopharmaceutical firms must conduct analytical method validation, expression system characterization, facility and equipment validation, software validation, and cleaning validation. Final product quality is assured when these elements are combined with other elements of cGMP, including lot release testing, raw material testing, vendor quality certifications, and vendor audits.

Expression system characterization is performed before Phase I studies in humans to ensure safety. Concerns include the presence of contaminating organisms, tumorigenic cells, proteins, nucleic acids, retroviruses, or other pathogens. Taking tissue culture as an example, characterization includes the source, raw materials used, selection methods, number of generations, transfection or fusion methods used, procedures for establishing working cell banks, facilities, identity, homogeneity, absence of contaminating pathogens, tumorigenicity, and stability.

Analytical methods measure product characteristics important for therapeutic safety and efficacy during preclinical and early Phase I studies. Additional tests are developed for final product release and in-process sampling of the final manufacturing process. These measure characteristics such as molecular identity, purity, potency, and safety. The number of tests should be sufficient to show manufacturing consistency and the impact of manufacturing changes. Once a test is made a formal part of the manufacturing process, it is almost impossible to remove. Test methods are evaluated for different attributes such as accuracy, precision, range, selectivity, recovery, calibration (detection and quantitation limits), assay sampling, robustness, and stability.
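
As one illustration of how the calibration-related attributes are quantified, the sketch below estimates detection and quantitation limits from a linear calibration curve using the common ICH Q2(R1) relationships (LOD ≈ 3.3σ/S and LOQ ≈ 10σ/S, where σ is the residual standard deviation and S the slope). The data and function name are hypothetical, not taken from this article.

```python
# Illustrative sketch: estimating LOD/LOQ from a linear calibration curve
# using the ICH Q2(R1)-style relationships LOD ~ 3.3*sigma/S, LOQ ~ 10*sigma/S.
# Concentrations and responses below are made-up example data.
import numpy as np

def detection_limits(conc, response):
    """Fit response = slope*conc + intercept and derive LOD/LOQ."""
    conc = np.asarray(conc, dtype=float)
    response = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(conc, response, 1)   # least-squares line
    residuals = response - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)                      # residual std dev (n - 2 dof)
    lod = 3.3 * sigma / slope                          # limit of detection
    loq = 10.0 * sigma / slope                         # limit of quantitation
    return slope, lod, loq

# Hypothetical calibration standards (e.g., in µg/mL) and instrument responses
conc = [1, 2, 5, 10, 20, 50]
resp = [2.1, 4.0, 10.2, 19.8, 40.5, 99.7]
slope, lod, loq = detection_limits(conc, resp)
print(f"slope={slope:.2f}, LOD={lod:.2f}, LOQ={loq:.2f}")
```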

Test method validation is required to conduct clinical trials. Specifications should start wide for Phase I and narrow to tighter values by the time of the license application, because relaxing established specifications is very difficult.

Process validation involves the identification, monitoring, and control of sources of variation that can contribute to changes in the product. It starts with process characterization studies using scale-down models for optimization, operating range specification, extractables and leachables characterization, and clearance studies. Such work depends on validated assays and representative scale-down models.
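
For context on clearance studies, each scale-down step is typically assigned a log reduction value (LRV): the log10 of the total load entering the step divided by the total load leaving it, with independent steps summed for overall clearance. The sketch below, using made-up titers and volumes, shows only that arithmetic and is not drawn from the article.

```python
# Illustrative sketch: log reduction value (LRV) for one clearance study step,
# LRV = log10(total load in feed / total load in product pool).
# Titers and volumes are hypothetical example values.
import math

def lrv(feed_titer, feed_volume, pool_titer, pool_volume):
    """Log10 reduction of total load (titer x volume) across one step."""
    load_in = feed_titer * feed_volume
    load_out = pool_titer * pool_volume
    return math.log10(load_in / load_out)

# Hypothetical spiked scale-down runs: titer in units/mL, volume in mL
step_lrvs = [
    lrv(1e7, 100, 1e3, 90),   # e.g., a chromatography step
    lrv(1e7, 100, 1e2, 95),   # e.g., a filtration step
]
print([round(x, 1) for x in step_lrvs])
print("total LRV (independent steps):", round(sum(step_lrvs), 1))
```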

Process development normally involves identifying critical variables, defining setpoints for each unit operation, and establishing operating ranges (deviations from the setpoint). Maximum operating range (MOR) limits are typically set during Phase II or III. If they are exceeded, an investigation is necessary to determine if product quality remains acceptable.

Normal operating range (NOR) limits are determined by run-to-run reproducibility with scale-down models and trending with control charts at production scale. NOR limits lie within MOR limits, which must allow for normal variability while maintaining acceptable operation.
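
As a simple illustration of the trending described above, the sketch below computes Shewhart-style control limits (mean ± 3 standard deviations) for a monitored parameter at production scale and flags runs that fall outside hypothetical NOR or MOR limits. All numbers and limits are invented for the example.

```python
# Illustrative sketch: trending a process parameter (e.g., step yield, %)
# with mean +/- 3*sigma control limits, then checking each run against
# hypothetical NOR and MOR limits. All values are made up.
import statistics

yields = [92.1, 93.4, 91.8, 92.9, 93.1, 92.5, 94.0, 91.5, 92.7, 93.3]

mean = statistics.mean(yields)
sigma = statistics.stdev(yields)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma   # control (trend) limits

NOR = (90.0, 95.0)   # hypothetical normal operating range
MOR = (85.0, 98.0)   # hypothetical maximum operating range

print(f"mean={mean:.1f}, control limits=({lcl:.1f}, {ucl:.1f})")
for run, y in enumerate(yields, start=1):
    if not (MOR[0] <= y <= MOR[1]):
        print(f"run {run}: outside MOR -> investigation required")
    elif not (NOR[0] <= y <= NOR[1]):
        print(f"run {run}: outside NOR -> review trend")
```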

Facility and equipment validation is normally divided into design qualification (DQ), installation qualification (IQ), operational qualification (OQ), and performance qualification (PQ). Equipment validation begins with pilot production of clinical materials for Phase II.

DQ provides documented evidence that the proposed design of the facilities, equipment, and systems is suitable for the intended purpose. DQ must compare the design to a set of well-defined user requirements relating to product safety, identity, strength, purity, and quality.

IQ provides documented evidence that the system is assembled, installed, plumbed, and wired according to the user's design specifications, vendor recommendations, and appropriate codes and standards. Vendors typically provide much of the hardware documentation.

OQ provides documented evidence that the system performs as expected throughout its intended operating ranges, including all the system's different functions and all its components (hardware, monitoring instruments, controls, alarms, and recorders). Elements of OQ testing and documentation may be part of the factory acceptance test at the vendor's site, but integration with plant utilities and component installation must be verified at the manufacturing facility. Hardware cleanliness must also be assessed after cleaning.

PQ provides documented evidence that trained operators can process actual feedstock using the facility's buffers and utilities. Full-scale process validation includes testing the consistency of batch production.

Software validation operates under the principle that quality should not be diminished when a manual process is replaced with an automated one. Software must be developed and tested under a quality system with defined user requirements, change-control procedures, provisions for authorizing operators for data entry and data checking, data archiving, software backup, provisions for recovery from system crashes, and procedures for monitoring and correcting software problems. 21 CFR Part 11 defines requirements for maintaining the integrity of data and software and for handling electronic signatures for traceability.
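
To make the data-integrity idea concrete, here is a minimal, hypothetical sketch of an audit-trail entry recording who changed a record, when, and a hash of the new content so later tampering can be detected. It is only an illustration of the traceability principle, not an implementation of 21 CFR Part 11.

```python
# Minimal, hypothetical illustration of an audit-trail record for data integrity:
# who made a change, when, and a hash of the new content so later edits are detectable.
# This is NOT a 21 CFR Part 11 implementation, only a sketch of the principle.
import hashlib
import json
from datetime import datetime, timezone

def audit_entry(user_id, record_id, new_content):
    """Return an append-only audit-trail entry for a data change."""
    return {
        "user": user_id,                                      # authorized operator
        "record": record_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),  # traceable time stamp
        "content_sha256": hashlib.sha256(new_content.encode()).hexdigest(),
    }

entry = audit_entry("analyst_01", "batch-123/pH", "7.2")
print(json.dumps(entry, indent=2))
```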

Cleaning validation demonstrates the ability of cleaning procedures to permit reuse of processing components and equipment without a concomitant deterioration of product quality. Batch-to-batch carryover is of particular concern in multi-use plants making more than one product.
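
One widely used way to set acceptance limits for such carryover is a maximum allowable carryover (MAC) calculation based on the previous product's minimum therapeutic dose, the next product's batch size and maximum dose, and a safety factor. The sketch below shows that arithmetic with hypothetical numbers; the values and the swab-limit conversion are illustrative, not taken from the article.

```python
# Illustrative sketch: maximum allowable carryover (MAC) for cleaning validation,
# MAC = (min daily dose of previous product * batch size of next product)
#       / (safety factor * max daily dose of next product).
# All values below are hypothetical.
def max_allowable_carryover(min_dose_prev_mg, batch_size_next_mg,
                            max_dose_next_mg, safety_factor=1000):
    """Total amount (mg) of previous product allowed to carry into the next batch."""
    return (min_dose_prev_mg * batch_size_next_mg) / (safety_factor * max_dose_next_mg)

mac_mg = max_allowable_carryover(
    min_dose_prev_mg=10.0,         # smallest therapeutic daily dose, product A
    batch_size_next_mg=5_000_000,  # batch size of product B (5 kg expressed in mg)
    max_dose_next_mg=500.0,        # largest daily dose of product B
)
shared_surface_cm2 = 100_000       # hypothetical shared equipment surface area
print(f"MAC = {mac_mg:.0f} mg total, or {mac_mg / shared_surface_cm2:.3f} mg/cm2 swab limit")
```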

Cleaning effectiveness and consistent product quality are demonstrated by showing operating consistency and product quality from batch to batch, by processing with buffer only (blank runs) and assaying for contaminants, by examining cleaned surfaces and materials, and by extended scale-down clearance studies on reused materials. Disposable processing components that eliminate the need for cleaning validation are increasingly used at small scale.

Herb Lutz is strategic marketing manager at Millipore Corporation, 80 Ashby Road, Bedford, MA 01730, 781.533.2366, herb_lutz@millipore.com