Quality by Design and Compliance Readiness

Published in BioPharm International, January 2010, Volume 23, Issue 1
Pages: 40–45

How will implementing Quality by Design strategies affect your compliance status?

Biopharmaceutical companies undertaking Quality by Design (QbD) initiatives find themselves in the quandary that accompanies any significant advance in life sciences manufacturing: If we implement a far-reaching change in the way we operate, what will it mean to our compliance status? In the case of QbD, the question is even more confounding because QbD represents a profound change in the nature of compliance itself.


The increased scientific understanding of products and processes that lies at the heart of QbD makes it possible for manufacturers and regulatory agencies to focus on the critical control points and parameters necessary to ensure that products meet their quality attributes. They can then base compliance on risk: the assurance that as long as the multidimensional combination of input variables and process parameters remains within the validated design space, or is otherwise controlled to achieve specified product attributes, product quality is ensured.

For manufacturers, that will mean establishing an integrated quality system designed to accommodate the far more holistic and flexible approach to compliance that QbD requires. For regulators, it will mean understanding the decisions manufacturers make in producing their products rather than treating current good manufacturing practices (cGMPs) as a set of rigid and highly complex requirements.
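To make the design-space idea concrete, the short sketch below checks whether a set of in-process readings falls inside a validated design space. It is purely illustrative: the parameter names and ranges are invented, and a real design space is typically defined by multivariate relationships among parameters rather than by independent univariate ranges.

```python
# Hypothetical sketch: checking process parameters against a validated design space.
# Parameter names and ranges are illustrative only, not drawn from any real process;
# a real design space would usually be defined multivariately, not as independent ranges.

DESIGN_SPACE = {
    "temperature_C": (30.0, 37.0),
    "pH": (6.8, 7.4),
    "feed_rate_mL_per_min": (0.5, 2.0),
}

def within_design_space(readings: dict) -> tuple[bool, list[str]]:
    """Return (True, []) if every reading is inside its validated range,
    otherwise (False, list of out-of-range parameters)."""
    excursions = [
        name for name, value in readings.items()
        if not (DESIGN_SPACE[name][0] <= value <= DESIGN_SPACE[name][1])
    ]
    return (len(excursions) == 0, excursions)

ok, excursions = within_design_space(
    {"temperature_C": 36.2, "pH": 7.6, "feed_rate_mL_per_min": 1.1}
)
print(ok, excursions)   # False ['pH'] -- pH is outside its validated range
```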


QUESTIONS OF COMPLIANCE

Unfortunately, there is no template that biopharmaceutical companies can follow to establish their approach to compliance under QbD. The Step 2 draft of the ICH Q11 guideline, Development and Manufacture of Drug Substances, which is intended to bring together and clarify quality-system issues raised in previous ICH guidelines, will not be released until late 2010. Even then, it is likely to provide only a framework for compliance, not a step-by-step guide to creating the organization, policies, and processes best positioned to take advantage of the opportunities promised by QbD: greater assurance of quality, reduced costs through less rework and fewer rejected batches, and a lighter regulatory burden.


Furthermore, QbD tools will continue to evolve. Today, documentation of a design space is the preferred approach to risk-based quality assurance (QA) because we have the tools and technology to determine it. However, process analytical technology (PAT) will continue to improve and eventually could become capable of controlling a manufacturing process in real time. When that occurs, there may no longer be a need to document a wide, multidimensional design space. As long as we know how the process relates to the product output, PAT would be able to ensure product quality at all times.
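A minimal sketch of what such real-time control might look like appears below. It assumes a hypothetical in-line quality signal and a single adjustable parameter with an invented gain and operating range; actual PAT implementations rely on validated analyzers and multivariate control strategies.

```python
# Hypothetical sketch of PAT-style real-time control: read an in-line quality
# signal, compare it with its target, and nudge a single process parameter.
# The sensor, target, gain, and limits are illustrative assumptions only.

TARGET_SIGNAL = 100.0        # desired value of the in-line quality measurement
GAIN = 0.05                  # proportional gain for the parameter adjustment
FEED_RATE_LIMITS = (0.5, 2.0)

def adjust_feed_rate(current_feed_rate: float, measured_signal: float) -> float:
    """Proportionally adjust the feed rate toward the quality target,
    clamped to its validated operating range."""
    error = TARGET_SIGNAL - measured_signal
    proposed = current_feed_rate + GAIN * error
    low, high = FEED_RATE_LIMITS
    return min(max(proposed, low), high)

feed_rate = 1.0
for signal in [97.0, 98.5, 99.6, 100.2]:   # simulated in-line measurements
    feed_rate = adjust_feed_rate(feed_rate, signal)
    print(f"signal={signal:6.1f}  new feed rate={feed_rate:.2f}")
```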

With no compliance template and the continued evolution of QbD tools, biopharmaceutical companies face a host of questions about the specifics of compliance, including:

  • How will implementing QbD and quality risk management (QRM) strategies affect my current cGMP compliance status? How will I know about a change in that status?

  • Will agency audits still look at traditional GMP elements (e.g., deviation reports) once we've incorporated risk-assessment decisions and processes? If so, for what purpose?

  • If results are trending within the design space toward its limits, when do we implement a change? Who decides, and how?

  • What are the regulators' expectations about how we "regulate" our operating parameters or design space? Are full-scale batches required for that purpose, or will qualified scale-down models be sufficient? (An FDA draft guidance document from November 2008 indicates the intent to reduce the need for re-establishing validated parameters at scale, but there is still uncertainty surrounding the specific compliance requirements for a validated design space.)

  • When we make changes in our defined design space, do we still conduct investigations and write deviation reports?

  • Which trends and changes have to be part of batch records? Which changes are for information only, and which must be part of the required batch documentation?

Although there are no definitive answers to these questions, newer perspectives on risk management, knowledge management, continuous verification, and real-time feedback, analytics, and control (including PAT) can be understood as extensions of general GMP principles: State what will be done, by whom, and when. Define acceptable performance and outcomes. Document the actual execution and outcome and make the whole process traceable.

Basic GMPs also provide a framework for lifecycle process management, but under QbD they need to accommodate important new scientific justifications and data. For example, the addition of design qualification to basic equipment qualification (IQ/OQ/PQ) requires the organization to justify that the equipment design is appropriate for its intended use. Similar justifications must be considered for process and product profile decisions.

With these principles in mind, a biopharmaceutical company can bring together cross-functional teams to consider three key activities central to implementing QbD: QRM, knowledge management, and change control. These activities encompass the four elements that define products and processes as validated and compliant: critical process parameters, critical quality attributes (CQAs), critical material attributes, and relevant target product profile attributes.

By working through these considerations, the cross-functional teams can better frame and address the questions arising from these activities. After they develop and prioritize the answers, they can begin to integrate the new practices into the existing quality system and find a way forward.

QUALITY RISK MANAGEMENT

Traditional GMPs are a means of establishing traceability: the assurance that processes have been carried out and that the end product conforms to specifications. But they offer no mechanism for incorporating scientific insight and accumulated experience or knowledge, as quality risk management does.

ICH Q9 defines QRM as "...a systematic process for the assessment, control, communication and review of risks to the quality of the drug (medicinal) product across the product lifecycle." The advantage of lifecycle management of risk is obvious: continuous improvement of product and process based on continually increasing knowledge. The industry has long been comfortable with the idea of phased-in compliance during development but has been less adept at integrating development, clinical, and commercial data, and traditional GMPs provide little guidance on how to do so.

Biopharmaceutical manufacturers cannot eliminate risk altogether. Risk-based quality management aims to reduce risk as much as possible through continuous science-based and data-driven evaluations. In this way, QRM provides the means to achieve the continuous improvement and integration of data that is expected from QbD implementation.

Consider, for example, the statistical evaluation of the design space or process capability. Multivariate statistical analysis tools are required for this exercise. Effective execution requires personnel with expertise in both the process and statistical analysis (but not necessarily in GMP compliance). The tools and data that are used may vary with the specific process under analysis, and also with the risks associated with that process. Further, acceptance criteria may not always be prescribed with this approach, as they are for such items as a Certificate of Analysis or validation criteria. Rather, acceptance criteria may result from the risk analysis and the evaluation exercise.
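As a rough illustration of the kind of analysis involved, the sketch below computes a univariate capability index (Cpk) for one CQA and a simple multivariate distance check across two process parameters. The batch data, specification limits, and parameters are invented for illustration; a real evaluation would use validated multivariate models, such as Hotelling's T-squared charts, and far larger data sets.

```python
import numpy as np

# Illustrative only: invented batch results for one CQA and two process parameters.
cqa = np.array([98.2, 99.1, 97.8, 98.9, 99.4, 98.5, 98.0, 99.0])   # e.g., % purity
lsl, usl = 95.0, 102.0                                              # specification limits

# Univariate process capability (Cpk) for the CQA.
mean, sd = cqa.mean(), cqa.std(ddof=1)
cpk = min(usl - mean, mean - lsl) / (3 * sd)
print(f"Cpk = {cpk:.2f}")

# Simple multivariate check: Mahalanobis distance of a new batch's parameter
# vector from the historical mean, a crude stand-in for multivariate
# statistical process control tools.
history = np.array([
    [36.5, 7.1], [36.8, 7.0], [36.2, 7.2], [36.6, 7.1],
    [36.4, 7.0], [36.7, 7.2], [36.3, 7.1], [36.5, 7.0],
])                                  # columns: temperature (deg C), pH
new_batch = np.array([36.9, 7.3])

center = history.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(history, rowvar=False))
diff = new_batch - center
mahalanobis = float(np.sqrt(diff @ cov_inv @ diff))
print(f"Mahalanobis distance of new batch = {mahalanobis:.2f}")
```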

Because these analyses do not follow a traditional GMP/SOP format, compliance in a QbD environment is better suited to evaluating how the exercise was carried out, for example, by asking:

  • Were the right individuals involved in the risk analysis and statistical evaluation?

  • Were all appropriate sources of data considered?

  • Were the analyses conducted and documented appropriately?

  • Were the outcomes and mitigations consistent with the risk profile?

  • Do the decisions (changes) better ensure that the target product profile is achieved?

  • Were the decisions communicated effectively?

Given these characteristics of QbD, QA scrutiny should be concentrated at the level of the process, not its outputs. Statistical analyses of process and analytical data now become justifications for further process controls and changes; they also become a platform of knowledge for other products.

Furthermore, traditional QA metrics revolve around the number of days taken to review records and to close deviations and corrective and preventive actions (CAPAs). In many companies, the sheer volume of records has forced reviewers to rely on spot checks. Better metrics might gauge whether problems were actually solved or whether proposed mitigation plans actually worked. In other words, compliance needs to be measured in terms of the effectiveness of quality systems. Further, because variation within the design space is acceptable, risks must be calculated in real time for the company to decide whether to continue processing when an excursion is encountered. Therefore, those doing real-time assessment must have access to the knowledge and tools they need to do the job, elevating knowledge management to more than a mechanism of record-keeping for purposes of traceability.
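The real-time element of that assessment can be as simple as projecting a short trend against the validated limits, as in the hypothetical sketch below. The parameter, its limit, and the decision horizon are assumptions; in practice such a check would feed into a documented risk assessment rather than decide the outcome on its own.

```python
import numpy as np

# Hypothetical sketch: a parameter is still inside its validated range but is
# trending toward the limit. Fit a short-term slope and estimate when the
# limit would be crossed, so the team can decide whether to act now.

UPPER_LIMIT = 7.4          # validated upper limit for the parameter (e.g., pH)
HORIZON_MIN = 60           # act if the limit is projected to be crossed within an hour

times = np.array([0, 10, 20, 30, 40, 50], dtype=float)       # minutes
readings = np.array([7.05, 7.10, 7.14, 7.19, 7.23, 7.28])    # recent in-process values

slope, intercept = np.polyfit(times, readings, 1)
if slope > 0:
    minutes_to_limit = (UPPER_LIMIT - readings[-1]) / slope
else:
    minutes_to_limit = float("inf")

action_needed = minutes_to_limit <= HORIZON_MIN
print(f"slope={slope:.4f} per min, projected crossing in {minutes_to_limit:.0f} min, "
      f"act now: {action_needed}")
```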

KNOWLEDGE MANAGEMENT AND DOCUMENTATION

Knowledge management in the quality system should help produce higher value information, better decision-making, and more organized, easily retrieved documents. The type, frequency, and contents of documents and the documentation process present some of the most vexing questions for companies. Although there are no clear-cut answers, documentation is tightly tied to QRM. As ICH Q9 says, "the level of effort, formality, and documentation of the quality risk management process should be commensurate with the level of risk."

What this suggests is that documents and documentation processes should be flexible, yet provide guidance on decision-making based on risk assessments, process inputs, laboratory data, and other information. That means adapting cGMP records to capture scenarios and data, risk assessments, decisions, and justifications, rather than simply the data that are commonly recorded. For example, the practice of reviewing a single batch record as it was executed does not allow for the monitoring of trends or of overall process execution. Conventional documentation audits are restricted to checking documentation practices and verifying individual calculations; they do not provide an adequate picture of process capability or of the overall capability of the manufacturing team and equipment.
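A cross-batch view, by contrast, can surface drift that no single batch record reveals. The sketch below applies two simple control-chart-style rules to a hypothetical series of batch results; the data, baseline, and rules are illustrative assumptions, not a prescribed method.

```python
import numpy as np

# Illustrative only: one CQA result from each of the last 12 batches.
batch_results = np.array([98.9, 99.0, 98.8, 99.1, 98.7, 98.6,
                          98.5, 98.4, 98.3, 98.2, 98.1, 98.0])

baseline = batch_results[:6]               # earlier batches used as a reference
center = baseline.mean()
sigma = baseline.std(ddof=1)

recent = batch_results[-6:]
below_center = recent < center

# Two crude signals of drift that single-batch review would not catch:
# a sustained run on one side of the baseline mean, or any point
# beyond three sigma of the baseline.
run_signal = bool(below_center.all() or (~below_center).all())
three_sigma_signal = bool((np.abs(recent - center) > 3 * sigma).any())

print(f"baseline mean={center:.2f}, sigma={sigma:.3f}")
print(f"sustained run on one side of baseline: {run_signal}")
print(f"point beyond three sigma: {three_sigma_signal}")
```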

Thus, the addition of knowledge management and QRM to quality systems elevates compliance considerations to a higher level, moving away from document execution to document content, effectively shifting the focus of compliance from the management of paper to the management of knowledge.

As with the implementation of QRM, knowledge management should be incorporated into current systems to minimize operational and cultural disruption. Comprehensive knowledge management may require some new systems, but the goal is to bring data together so that they can be converted into knowledge, not to create yet another place to put data.

CHANGE CONTROL

The change control process should be designed to enable the staff to:

  • Establish what level of review is required at each stage of development to determine whether the process and product are within the design space.

  • Decide when trends toward the limits of design space require a change in input variables or process parameters.

  • Evaluate, justify, and record changes.

  • Distinguish changes that require regulatory approval from those that do not.

  • Capture the data outputs that result from changes.

  • Prospectively assess the long-term effects of changes on process capability and product quality.

An effective change control system that achieves those objectives will also work with the knowledge management system to make a significant contribution to the overarching goal of continuous improvement in the process and product.
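One way to make those objectives concrete is a structured change record that captures the affected parameters, the risk assessment, the justification, and whether regulatory approval is required. The fields and values below are a hypothetical sketch, not a prescribed or ICH-defined format.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# Hypothetical sketch of a structured change record; the field names, risk
# levels, and example values are illustrative assumptions only.

@dataclass
class ChangeRecord:
    change_id: str
    description: str
    affected_parameters: list[str]
    risk_level: str                      # e.g., "low", "medium", "high"
    within_design_space: bool
    justification: str
    requires_regulatory_approval: bool
    supporting_data_refs: list[str] = field(default_factory=list)
    decided_by: str = ""
    decision_date: Optional[date] = None

record = ChangeRecord(
    change_id="CC-0042",
    description="Adjust feed-rate operating target within the validated range",
    affected_parameters=["feed_rate"],
    risk_level="low",
    within_design_space=True,
    justification="Trend analysis across recent batches; no impact on CQAs expected",
    requires_regulatory_approval=False,
    supporting_data_refs=["trend-report-2010-05"],
    decided_by="Process SME and QA",
    decision_date=date(2010, 6, 1),
)
print(record.change_id, record.risk_level, record.requires_regulatory_approval)
```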

Change is, after all, inherent in the development process and should be used to gain knowledge and improve product quality. Traditional quality oversight in development would check and audit notebook citations and technical reports, which is not very useful for gaining product and process insight. Instead, QA might issue guidance on how change information should be captured and communicated. The rigor of this process, and of the experimental justification behind each change, should be commensurate with the phase of development.

Chester A. Meyers, PhD, is a managing consultant, and Debra Weigl is a senior consultant, both at Tunnell Consulting, King of Prussia, PA, 610.337.0820, meyersc@tunnellconsulting.com