Building Data Quality in Generates Quality Data Out

Published in: BioPharm International, Volume 33, Issue 3 (March 2020), Pages 18–22, 38

Ensuring the quality of data in process monitoring and control systems starts in process development phases.

As biopharma manufacturers incorporate more data-driven monitoring and control systems in production processes, the quality of the data, along with its integration, interpretation, and protection, becomes more important. One solution is to build quality features into these systems during process development.

As a process scales, “there is a need to transfer or integrate process development (PD) data; therefore, building data quality into these activities right from the start is important. Management of data quality always facilitates integration with other systems irrespective of the scientific or business purpose,” observes Chris Andrews, a senior solution consultant with Dassault Systèmes BIOVIA.

“In the case of monitoring data during process development, it is critical,” Andrews says, “to identify and segregate valid data intended for technical transfer and to ensure that data are as clean as possible (i.e., have few invalid results or records). Higher quality data will minimize integration time, improve the ability to quickly interpret those data, and offer assurance of data integrity and provenance.”

Establishing a common format for data, units, and terminology will also simplify integration with other systems and prevent confusion during interpretation, adds Kevin Seaver, general manager for bioprocess automation and digital at GE Healthcare Life Sciences.
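
As a minimal illustration of Seaver’s point, the hypothetical record below fixes field names, controlled vocabulary, and units up front so that data from different instruments and teams arrive in the same shape. All names and values are placeholders, not any vendor’s actual schema.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class ProcessSample:
    """One monitoring record with agreed field names and units fixed up front."""
    batch_id: str
    unit_operation: str   # controlled vocabulary, e.g. "bioreactor", "depth_filtration"
    timestamp_utc: str    # ISO 8601, always UTC
    parameter: str        # controlled vocabulary, e.g. "pH", "dissolved_oxygen"
    value: float
    unit: str             # agreed unit per parameter, e.g. "%" for DO, "g/L" for titer
    instrument_id: str    # provenance: which instrument produced the value

# Illustrative record only; every value below is made up.
sample = ProcessSample(
    batch_id="PD-2020-014",
    unit_operation="bioreactor",
    timestamp_utc=datetime.now(timezone.utc).isoformat(),
    parameter="dissolved_oxygen",
    value=42.5,
    unit="%",
    instrument_id="DO-probe-03",
)
print(json.dumps(asdict(sample), indent=2))
```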

In addition, working with secure databases and historians, rather than in flat files such as comma-separated values (CSV) or extensible markup language (XML), allows easier data retrieval and integration along the way, according to Edita Botonjic-Sehic, global PAT manager at Pall Biotech. “Feeding process data into a database immediately enables quicker validation and builds in an element of safety and security that does not come with flat file formats,” she observes.

Unfortunately, many analytical instruments default to providing data in flat files, and the responsibility to build and connect to databases falls to the user. “Wherever possible,” Botonjic-Sehic says, “this issue should be addressed from the beginning to avoid creating more work or risking loss of critical data on the back end.”
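
A minimal sketch of the difference, using Python’s built-in sqlite3 as a stand-in for a production database or historian; the file name, table, and columns are illustrative assumptions, not a Pall Biotech design.

```python
import sqlite3

# Instead of appending rows to a CSV export, write each measurement straight
# into a database table so it can be queried, validated, and access-controlled.
conn = sqlite3.connect("pd_monitoring.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS process_data (
        batch_id      TEXT NOT NULL,
        timestamp_utc TEXT NOT NULL,
        parameter     TEXT NOT NULL,
        value         REAL NOT NULL,
        unit          TEXT NOT NULL,
        instrument_id TEXT NOT NULL
    )
""")
conn.execute(
    "INSERT INTO process_data VALUES (?, ?, ?, ?, ?, ?)",
    ("PD-2020-014", "2020-03-01T10:15:00+00:00", "pH", 7.02, "pH", "pH-probe-01"),
)
conn.commit()

# Retrieval becomes a query rather than re-parsing flat files.
for row in conn.execute(
    "SELECT timestamp_utc, value FROM process_data "
    "WHERE parameter = 'pH' ORDER BY timestamp_utc"
):
    print(row)
conn.close()
```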

Data quality approaches

There are many approaches to data quality management, running the gamut from minimal to strict. The importance of the approach is relative to the criticality of the process being developed, according to Andrews. For example, he notes that “monitoring data for low-risk, well-controlled unit operations or those with historic reliability would require less stringent control and quality management. For new or less well-controlled steps or techniques or otherwise high-risk operations, however, a more robust data quality regimen is called for.”

“Quality oversight is required for laboratory/pilot-scale studies where process monitoring data will be used to develop the strategy for in-process control and/or support key components of regulatory submissions,” Seaver adds. “The design space defined through these studies provides insight into the criticality and acceptable ranges for controlled parameters and performance or quality attributes. Therefore, interpretation of the results determines the materials, run conditions, and testing strategies for scaling-up and the final commercial process, to ensure the safety, efficacy, and potency of the final product,” he continues.

As a result, Andrews asserts that for scale-up and commercial processing, data quality is very important. “While not required for licensing, the integration of upstream development data with engineering runs, scale-up, and production data has exceptional business value. There is a wealth of information to be drawn from all phases of the product development lifecycle,” he says.

It is essential, though, according to Botonjic-Sehic, to not only understand the instrument being used and the data it produces, but also the deeper context behind each data point pulled and stored (metadata). “Beyond tracking a number, a user needs to be able to understand what part of the process gave that number and how it relates to the process performance. If there is inadequate metadata available from the PD scale, then the usefulness of that process data during scale up and commercial processing will be limited. Properly contextualized data is critical for troubleshooting and success as the process is transferred from process development to commercial processing scale,” she explains.

Layers of best practice

Quality is achieved through a multi-layered approach. A culture of quality, and a commitment to the ultimate endpoint of the product, the patient, should encompass all activities at all stages of the therapeutic product lifecycle. “Fostering such a culture of quality is the best way to ensure high-quality process monitoring and control during process development,” Andrews says.

The next layer involves leveraging standards to drive the way that data are stored, transferred, and applied, ensuring a coordinated approach. Botonjic-Sehic notes that within the pharmaceutical industry, additional guidance that governs data systems helps manufacturers meet regulatory requirements.

Standards are important because they help minimize the risk to quality, integration, interpretation, and protection, which arises primarily when data are being transferred and stored. “While interfaces within a piece of equipment are optimized by the supplier, when that piece of equipment must interface with databases, control systems, or other equipment, there are challenges,” Botonjic-Sehic observes.

Standards describe how best to integrate data across systems effectively and safely. “There is guidance on communication standards, data formatting, and even coding that helps to facilitate the process from PD to commercial scales,” she notes. In addition, the foundation should start with system design based on ISPE’s Good Automated Manufacturing Practice (1) and should integrate the data-integrity requirements outlined in regulations such as 21 CFR Part 11 and EU GMP Annex 11. Supplementary guidance and various industry communities of practice help build a strong solution on that foundation.

Comprehensive information technology solutions that provide the necessary quality control but also engender compelling and innovative user experiences (the third layer) can enable the right quality culture. “Implementation of appropriate technical solutions should reduce friction in accomplishing tasks in addition to providing an array of business benefits, including data quality. If a solution is cumbersome or does not fully meet the business need, users will find a way to work around it, which will assuredly lead to reduced data integrity. The result can include data silos, dirty (i.e., invalid or incomplete) data, and thus much more onerous data integration, interpretation, and security efforts,” Andrews observes.

The fourth layer involves validation. “Where process development data are intended to support key decisions for processing at scale, validation (of the systems used to collate information) or secondary verification (for manual collection) should be performed to ensure quality,” Seaver explains. Validation requires testing against pre-established criteria to confirm that the collection system performs as intended and ensures the quality of the data. Manual adjustments made to the data set should be visible through an audit trail to protect against fraud and misinterpretation, Seaver adds.
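
A hedged sketch of the kind of audit trail Seaver describes, again using sqlite3 purely for illustration: a manual correction is never applied silently; the old value, new value, user, timestamp, and reason are logged first. All table and column names are hypothetical.

```python
import sqlite3
from datetime import datetime, timezone

# In-memory database with one monitoring record and an audit table (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE process_data (parameter TEXT, value REAL)")
conn.execute("INSERT INTO process_data VALUES ('pH', 7.02)")
conn.execute("""
    CREATE TABLE audit_trail (
        record_id  INTEGER,
        old_value  REAL,
        new_value  REAL,
        changed_by TEXT,
        changed_at TEXT,
        reason     TEXT
    )
""")

def adjust_value(record_id: int, new_value: float, user: str, reason: str) -> None:
    """Log who changed what, when, and why before applying a manual correction."""
    (old_value,) = conn.execute(
        "SELECT value FROM process_data WHERE rowid = ?", (record_id,)
    ).fetchone()
    conn.execute(
        "INSERT INTO audit_trail VALUES (?, ?, ?, ?, ?, ?)",
        (record_id, old_value, new_value, user,
         datetime.now(timezone.utc).isoformat(), reason),
    )
    conn.execute(
        "UPDATE process_data SET value = ? WHERE rowid = ?", (new_value, record_id)
    )
    conn.commit()

adjust_value(1, 7.05, "analyst_jdoe", "probe recalibration offset applied")
print(conn.execute("SELECT * FROM audit_trail").fetchall())
```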

Some tactics to avoid

Because integration of data is a key component of how any piece of manufacturing equipment works, a strategy for integration must be built in from the start. Too often companies will get excited and build a system based on features, only to think about data integration last; this, however, makes it difficult to deliver a strong data solution, says Botonjic-Sehic. “From the launch of design, data tracking, storage, security, and transferability must be engineered into the system. Creating a ground-up strategy for implementation of standards and ease of integration is undoubtedly the best way.”

It is also important to avoid manipulation or analysis of the data in a separate program/system, according to Seaver. “Doing so invalidates the quality controls provided through the original system validation. Secondary validation or manual verification would be required to ensure that all information has been transcribed correctly and is accurately represented following analysis,” he explains.

Neglecting the quality of “unregulated” data, or those that are not necessarily required per an internal standard operating procedure or an external regulatory agency, is another common problem. The result, according to Andrews, can be unexpected downstream quality failures. He also notes that, in general, not ensuring process development monitoring data quality represents a lost opportunity to learn and benefit from that data.

The promise of PAT

During a bioprocess, many critical quality attributes (CQAs) and process parameters are monitored to ensure product quality. In traditional processes, samples are frequently taken manually and brought to analytical laboratories for testing to confirm that product quality requirements have been met; this practice continues throughout the process.

These methods are labor-intensive, introduce the potential for operator error, and do not deliver real-time data, which inhibits their use in continuous bioprocesses, according to Botonjic-Sehic. “Real-time data is extremely important for identifying and mitigating process failures immediately and is necessary to complement the advances in continuous processing,” she asserts.

To help overcome this industry challenge, Pall Biotech has been focusing on the evaluation and development of new process analytical technologies (PATs) that can measure CQAs in real time. “The goal is to remove any lag between identifying and mitigating a discrepancy in the process by bringing monitoring closer to the process. By automating analytics and enabling the automatic tracking and processing of data, integrity and security are built into the process,” observes Botonjic-Sehic.

The company, she adds, is focusing on the most demanding CQAs throughout the bioprocess. “We do not want to just bring new equipment to the process suite for the sake of it; we want to ensure that we are bringing measurements closer to the process with intelligent and intuitive designs. While there is still not a ‘one-for-all’ approach, different analytical techniques are being evaluated to meet application needs with as much functionality as possible,” Botonjic-Sehic says.

The value of automation

In addition to real-time data, automating both the bioprocess and the data collection and analysis, so that a computerized platform can provide real-time feedback control of the bioprocess, can provide an even deeper level of product quality, according to Botonjic-Sehic.

Pall Biotech has implemented this approach through a framework capable of controlling the process equipment and analytical instruments, automating the data collection and analysis, and feeding the data into developed chemometric models to enable continuous monitoring and control of bioprocesses.
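
The loop below is a schematic sketch of that idea, not Pall Biotech’s actual framework: an in-line measurement is converted into a CQA estimate by a pre-built chemometric model (here a stand-in for a partial least squares regression), and the deviation from a setpoint is turned into a proportional adjustment for the equipment controller. The model coefficients, setpoint, and gain are made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
pls_coefficients = rng.normal(size=10)   # stand-in for a trained PLS regression vector
pls_intercept = 2.0

def predict_cqa(spectrum: np.ndarray) -> float:
    """Apply the (hypothetical) chemometric model to one in-line spectrum."""
    return float(spectrum @ pls_coefficients + pls_intercept)

def control_step(spectrum: np.ndarray, setpoint: float, gain: float = 0.1) -> float:
    """Turn the deviation of the CQA estimate from its setpoint into an adjustment."""
    cqa_estimate = predict_cqa(spectrum)
    error = setpoint - cqa_estimate
    return gain * error   # e.g., a feed-rate change passed to the equipment controller

spectrum = rng.normal(size=10)           # stand-in for one acquired spectrum
print("suggested adjustment:", control_step(spectrum, setpoint=5.0))
```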

An integrated human-machine interface (HMI) enables the operator to monitor the process and intervene as needed, ensuring optimum process control and product quality, according to Botonjic-Sehic. She notes that the automated documentation of critical process parameters and CQAs also improves the efficiency of process maintenance, while the chemometric model of each unit operation improves productivity at each step. “Our objective is to have fully automated and integrated control capabilities from end-to-end of the bioprocess, capable of immediately detecting any deviation in either the process or the product, with an HMI giving the operator the tools needed to mitigate those deviations, saving time, reducing batch failures, and improving overall product quality and consistency at every scale,” concludes Botonjic-Sehic.

Advances in digital technology are also contributing to greater quality for data monitoring and control systems. For instance, cloud-based storage environments are making it easier to bring together data generated on process and analytical equipment, according to Seaver. “This information can be linked to the purpose of the study in electronic laboratory notebooks to provide the context. It has also made it possible to compare information across studies, multiple projects/molecules, and scales to better understand risk,” he says.

The growing use of coding languages and technologies such as the Industrial Internet of Things has also driven significant innovation in data processing across the therapeutic development lifecycle, adds Andrews. “The advance of communication and encryption technologies means data provenance and non-repudiation are more reliable,” he explains.

Reference

1. ISPE, GAMP 5 Guide: A Risk-Based Approach to Compliant GxP Computerized Systems (ISPE, February 2008).

Article Details

BioPharm International
Vol. 33, No. 3
March 2020
Pages: 18–22, 38

Citation

When referring to this article, please cite it as: C. Challener, “Building Data Quality in Generates Quality Data Out,” BioPharm International, 33 (3) 2020.