Collecting and analyzing data are crucial for creating efficient, automated bioprocesses.
Digitalization of biopharmaceutical manufacturing processes is moving forward, with new technologies for digital data collection and analysis playing an important role in automating processes. Industry groups and consortia, such as the BioPhorum Operations Group (BioPhorum) and the National Institute for Innovation in Manufacturing Biopharmaceuticals (NIIMBL), are bringing together experts to develop standards and roadmaps for these new technologies.
“We are heavily invested in the future of the bioprocess automation space by being part of and actively shaping the standards of the future as part of these groups,” says Darren Verlenden, head of Bioprocessing at MilliporeSigma, the Life Science business of Merck KGaA. “Our vision is of a facility of the future that will be digital, virtual, and a resource-efficient space.”
Verlenden says that subject matter expertise was key to the development of the company’s digital ecosystem introduced in April 2020, the Bio4C Software Suite, which employs the digital “4C strategy” (control, connect, collect, and collaborate). “Our process experts are the ones who functionally design our software and automation products. [Their expertise was key] in our approach to defining our data architecture, collection, analysis, and usage strategies.”
Verlenden says that the company plans to introduce a family of products that create an integrated digital twin. The first product, introduced in April, is the Bio4C ProcessPad, which is used to collect and analyze data from disparate sources including equipment, batch records, databases, and historians. BioPharm International spoke with Verlenden about best practices for data collection and analysis, maintaining data integrity, and predictive maintenance for biopharmaceutical processes.
BioPharm: What are some best practices for collecting and analyzing process data?
Verlenden (MilliporeSigma): In the biopharmaceutical industry, if you were to look across the value chain from drug development through scale-up to manufacturing, process data are generated and used throughout the process stages. Whether the stages are carried out under one umbrella or done in collaboration with partners across an ecosystem, this mechanism of capturing and using data is driven by two key objectives: to run a compliant and efficient operation and to deliver high-quality therapeutics to patients. These two objectives never change, and they are rooted in the methods of quality by design (QbD), design of experiments, and continued process verification (CPV) as you move through scale-up to the manufacturing processes.
The design principles of interoperability and modularity manifest in the way MilliporeSigma products are developed and how our teams approach standards. Industry 4.0 is big on an ecosystem approach; interoperability, data exchange, and decentralized decision making are key principles. These standards and protocols are what make these principles real. Whether we speak of ISA-95 for automated interface between enterprise and control systems (1), the ANSI/ISA-88 standards for batch process control (2), OPC UA for machine to machine communication, or ASTM standards for Raw Material eData Transfers (3), we ensure alignment on these by design.
BioPharm: To employ data analysis systems, is it better to first digitize and go paperless, or are there good ways to handle paper-based data? What are some best practices in this area?
Verlenden (MilliporeSigma): To make the best use of data analysis systems and tools, it is always recommended to have all datasets, whether historical or current, in electronic format. However, this again goes back to the key imperatives for our industry. Companies know that in order to remain competitive and deliver on their vision of bringing high-quality therapeutics to patients faster, they need to invest in efficiency. We know there are still many organizations out there, by virtue of their past investments, that are still heavily paper-based. But that’s changing rapidly. Products like the Bio4C ProcessPad facilitate transcribing paper-based historical records to an electronic format, seamlessly integrating with other digital platforms.
The Digital Plant Maturity Model (DPMM) from the BioPhorum Operations Group (4) is one good place to start. The DPMM defines a framework that cuts across the different functions in an organization and helps companies uncover the key gaps and focus areas they must consider as they move up in digital maturity. Once they start moving up this maturity ladder, the benefits are self-evident—faster root cause identification, better troubleshooting processes, and quicker lot releases, to name a few.
BioPharm: What are the top concerns with data integrity for biopharma processes? What are some specific concerns for processes with single-use systems?
Verlenden (MilliporeSigma): Data integrity plays an essential role for the final delivery of products in the biopharma industries, as the availability of complete, accurate, and reliable data is of paramount importance to ensure drug safety and quality. Data integrity refers to both manual (paper) and electronic data, and the availability of increasingly sophisticated digital systems is making issues relating to data integrity increasingly complex.
Product quality is determined through the right critical process parameters and critical quality attributes. These parameters translate into a complex set of data points that have to be identified, measured, monitored, and correlated to ensure they all come together across the process to finally deliver on the stated product quality goals.
During any manufacturing process, enormous amounts of data are collected across and within production process steps. There are a large number of standard operating procedures, documentation requirements, and reviews one must employ to maintain the integrity of the data that gets collected. All of these exist to mitigate issues such as transcription errors, missing records, limited audit trails, data corruption, and mismatched data sets across steps and systems, to name a few.
Thus, if you look at data integrity in the context of the digital plant maturity model, the lower one is in the maturity level of electronic records, the higher the investment required in personnel, manual procedures, reviews, and additional paper documentation. The higher one moves up the maturity ladder, the lower the cost of collecting such data, the greater the efficiency, and the greater the ability to access and review data in near-real time to make faster decisions. Coming back to ensuring data integrity—one has to look at the complete process, from data acquisition, aggregation, and analysis to delivering insights.
BioPharm: What do you see as the trends in using digital technologies in maintenance of biopharma manufacturing equipment?
Verlenden (MilliporeSigma): Our approach to digital is to look at data as the key integrating unit. In Bio4C, our third ‘C’, ‘Collect’, looks at reporting and analytics across the analytics continuum, from descriptive and diagnostic through predictive and, eventually, prescriptive analytics. Bio4C ProcessPad offers, through its machine integration module ProcessPad RT, the ability to consume real-time machine data from historians for monitoring batch parameters.
On the maintenance side—namely predictive and prescriptive maintenance—we are integrating the above-mentioned process monitoring use cases to focus on the condition monitoring of the equipment. We are investing in the space of sensor integration and designing our unit operations’ automation with the capability to capture the right data across the equipment, the pumps, the valves, and the operations.
There are fundamentally two scenarios here. In the first scenario, existing facilities built over time have invested in their equipment and facilities and need to maintain them for the future. The second scenario includes expansions and new facilities. In both scenarios, the approach to equipment maintenance will have a balance of digital and non-digitally enabled strategies.
We can learn from other industries that are already ahead in this space, including automotive, aerospace, and oil and gas. These industries monitor and maintain their equipment remotely with advanced digital and virtual tools. These learnings are also coming into our industry; we only have to make sure we balance them with industry-specific compliance and regulatory requirements when adopting these technologies.
These trends are manifesting themselves in two dimensions: first, equipment is becoming intelligent and low-footprint sensors are becoming a reality; second, service delivery will become advanced, proactive, and virtual. The convergence of these two dimensions will bring about the use of machine learning, augmented and virtual support, self-service capabilities, and predictive analytics and scheduling, to name a few. Thus, you can imagine a future where predictive maintenance models predict instrument failure well in advance and thus save manufacturing batches.
BioPharm: What are some best practices for using predictive maintenance?
Verlenden (MilliporeSigma): First and foremost, the approach to predictive maintenance should be more holistic, with an integrated operations perspective—consider the business imperatives, people, processes, and technology, which will drive better asset utilization and overall lower operational costs.
A holistic approach should start with a robust strategy and plan around production floor asset management. For example, one needs to identify the critical equipment and its impact on the overall drug development process and product quality, rather than aiming for predictive maintenance of all available equipment. This information should then be mapped down to the key components and their functions in the process. The strategy should ensure the key parameters regarding people and process elements are also identified and linked.
Investment in technology is key: the equipment of the future will be intelligent equipment. Advanced sensors and condition monitoring strategies are already being developed; control platforms and software will be able to collect and collate past and current data. Advanced algorithms and machine learning capabilities will assist the end user to analyze, correlate, and understand the machine performance to detect faults in the equipment in real time.
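As a purely hypothetical illustration of the kind of real-time fault detection described above (not MilliporeSigma's implementation), a minimal anomaly check flags sensor readings that drift far outside a trailing statistical baseline; more sophisticated machine-learning models build on the same idea:

```python
from collections import deque
from statistics import mean, stdev

def detect_faults(readings, window=20, threshold=3.0):
    """Flag readings that deviate more than `threshold` standard
    deviations from the trailing window's mean (a rolling z-score)."""
    history = deque(maxlen=window)
    faults = []
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                faults.append(i)  # candidate fault: flag for maintenance review
        history.append(value)
    return faults

# Illustrative pump-pressure signal: steady readings, then one anomalous spike
signal = [1.0, 1.01, 0.99, 1.02, 0.98] * 8 + [5.0]
print(detect_faults(signal))  # -> [40]: only the spike is flagged
```

In a production setting, an alert like this would feed a maintenance work order rather than simply print an index; the value of the approach is catching the drift before the component fails mid-batch.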
Finally, it is one thing to gather data, but quite another to analyze and use the data for its intended purpose. That’s where—based on Industry 4.0 concepts—the integration of the equipment with the facility level automation becomes important for considering predictive maintenance as a key strategy toward overall operations effectiveness.
BioPharm: Can predictive maintenance be used for continued process verification?
Verlenden (MilliporeSigma): We think it should be the other way around—once the industry initiates continued process verification (CPV) and captures all datasets in a near-real-time manner, the collected dataset can potentially be very helpful for training machine learning models and thus helping the industry move toward near-real-time predictive maintenance.
To elaborate, today manufacturers collect samples and employ traditional statistical methods in their CPV programs. At the heart of such programs is identifying potential process inconsistencies in order to execute corrective or preventive measures. Thus, with enhanced digital capabilities of data aggregation and analysis, predictive models, and machine learning algorithms, one can start to detect inconsistencies early and feed the results back into a CPV program in a more systematic and structured manner.
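The traditional statistical approach mentioned here can be sketched as a Shewhart-style control chart: limits are derived from qualified reference batches, and new batches falling outside those limits are flagged for investigation. The data and parameter names below are made up for illustration:

```python
from statistics import mean, stdev

def control_limits(reference):
    """Derive +/-3-sigma control limits from reference batch data."""
    mu, sigma = mean(reference), stdev(reference)
    return mu - 3 * sigma, mu + 3 * sigma

def out_of_control(batches, limits):
    """Return indices of batches whose parameter falls outside the limits."""
    low, high = limits
    return [i for i, x in enumerate(batches) if not (low <= x <= high)]

# Hypothetical titer values (g/L) from qualified reference batches
reference = [5.1, 5.0, 4.9, 5.2, 5.0, 4.8, 5.1, 5.0]
limits = control_limits(reference)

# New batches monitored under the CPV program
new_batches = [5.0, 5.1, 4.9, 6.2]
print(out_of_control(new_batches, limits))  # -> [3]: the 6.2 g/L batch
```

A predictive-maintenance model extends this logic: rather than waiting for a batch to breach the limits, it learns from the accumulated CPV dataset to anticipate the excursion and trigger intervention earlier.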
When referring to this article, please cite it as J. Markarian, “Digital Strategies for Biopharma Manufacturing,” BioPharm International 33 (11) 2020.