Mechanistic models provide process understanding for developing robust manufacturing processes and for scale up and tech transfer.
A digital twin is a model of a process that allows process development experiments to be run in silico—on a computer—rather than on actual lab or manufacturing equipment. Although models can be developed using experimental data, mechanistic models, based on first principles, can be developed with only minimal experimental input, explains Nick Whitelock, EngD, sales specialist at GoSilico, which was acquired by Cytiva in June 2021. The company provides software as a service, as well as consulting and training services to help users build expertise within their organizations. BioPharm International spoke with Whitelock about how digital twins are used in biopharmaceutical process development, scale up, tech transfer, and manufacturing, using chromatographic polishing as an example.
BioPharm: Are digital twins typically used for modeling individual unit operations in biopharmaceutical manufacturing? Can they be used for continuous operations? What are the benefits and challenges?
Whitelock (GoSilico): Modeling a single operation is the first goal of mechanistic digital twins, as this represents the low-hanging fruit in which little experimental effort provides profound process understanding. A typical first project would include calibrating and interrogating a model describing a single operation, often a chromatographic polishing step, as this is typically straightforward.
A powerful aspect of a mechanistic approach is the ability to extrapolate to unseen process conditions. [These conditions] may include a changing feedstream quality with respect to product or contaminant concentrations, variations in material quality, such as adsorber capacity and buffer composition, or indeed simulating a process comprising multiple chromatographic steps, continuous or not.
With a mechanistic approach, relatively sparse experimental data are used to describe the full dynamics of the system at hand, using deceptively simple fundamental natural laws from the fields of fluid mechanics, kinetics, and thermodynamics. With these base processes understood, the complex, emergent behavior can be simulated to high precision by linking these underlying phenomena, systems, and operations in silico just as they are linked in vitro. Tedious physical experimentation can therefore instead be performed in mere seconds within a computer.
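To make the idea concrete, the toy sketch below simulates isocratic chromatography with a Craig plate model, in which the column is treated as a series of equilibrium stages and the emergent elution peak arises purely from the local equilibrium law and the convective shift. This is only an illustration of the principle; it is not the transport-dispersive equations that commercial tools such as GoSilico's solve, and the parameter values (`k_eq`, plate count) are invented.

```python
import numpy as np

def simulate_elution(k_eq=2.0, n_plates=200, n_steps=2000, n_inj=40):
    """Craig plate model of isocratic chromatography.

    Each step: the mobile phase advances one plate, then every plate
    re-equilibrates locally so that stationary/mobile concentrations
    satisfy q = k_eq * c. Parameters are illustrative only.
    """
    mobile = np.zeros(n_plates)
    stationary = np.zeros(n_plates)
    outlet = []
    for step in range(n_steps):
        # record what leaves the column, then shift the mobile phase
        outlet.append(mobile[-1])
        mobile[1:] = mobile[:-1]
        # feed sample (concentration 1.0) during the injection window
        mobile[0] = 1.0 if step < n_inj else 0.0
        # local equilibrium in every plate: split total mass by k_eq
        total = mobile + stationary
        mobile = total / (1.0 + k_eq)
        stationary = total * k_eq / (1.0 + k_eq)
    return np.array(outlet)

profile = simulate_elution()
# mass balance: everything injected eventually elutes
print(profile.sum())          # ~40 (injected mass)
print(int(np.argmax(profile)))  # peak near n_plates * (1 + k_eq) steps
```

The emergent behavior shows up exactly as described: nothing in the loop encodes a peak shape, yet a retarded, broadened elution peak appears from the interplay of convection and equilibrium.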
The challenges of mechanistic modeling include the need to characterize the behavior of these systems, and the expertise required to use them effectively. Companies such as GoSilico (now part of Cytiva) provide intuitive tools, intensive training, and expert consultancy to help develop this expertise and lower the barrier of entry. Additionally, the field has reached a maturity where experimental workflows are often straightforward, and there is an ever-growing library of models and use cases applicable to a wide range of activities.
BioPharm: How are digital twins used in process development?
Whitelock (GoSilico): All mechanistic models describing preparative chromatography have the eventual aim to understand and improve manufacturing-scale processes. It is typically the development lab, however, that has access to the data needed to characterize system dynamics, considering the small operating window at manufacturing scale and prohibitive expense of testing scale-up conditions. Models are typically calibrated using lab-scale data for material and cost efficiency, with the model then scaled to predict manufacturing scale by accounting for and simulating changes in scale, operation, systems, and material quality.
The fundamental principle of in silico scale-up and scale-down of chromatography is that only the fluid dynamics outside the pore system change. Once a molecule enters the pore system, its behavior follows the same mechanisms whether the adsorber bead is packed in a filter plate or in a production column. To allow for a prediction of the complete process at a new scale, only the basic fluid-dynamic properties need to be known, and these can be derived from, for example, column qualification runs.
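One common consequence of this principle can be sketched numerically: if bed height and superficial linear velocity are held constant between scales (a standard chromatography scale-up convention), the volumetric flow rate simply scales with column cross-sectional area, i.e., with diameter squared. The function below is an illustrative sketch under that assumption, not a procedure taken from the interview.

```python
def scale_up_flow(lab_diameter_cm, mfg_diameter_cm, lab_flow_ml_min):
    """Scale volumetric flow while keeping superficial linear velocity
    and bed height constant: flow scales with cross-sectional area,
    i.e., with the square of the column diameter.

    Illustrative sketch; all numbers used with it are invented.
    """
    return lab_flow_ml_min * (mfg_diameter_cm / lab_diameter_cm) ** 2

# e.g., a 1 cm lab column run at 1 mL/min transferred to a 60 cm
# production column requires 3600 mL/min for the same linear velocity
print(scale_up_flow(1.0, 60.0, 1.0))  # 3600.0
```

The pore-side parameters calibrated at lab scale carry over unchanged; only such external fluid-dynamic quantities need to be recomputed for the new geometry.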
BioPharm: What types of models/data would be needed for scale up/tech transfer?
Whitelock (GoSilico): A digital twin is an invaluable tool for transferring a process, whether to different scale, equipment, or facility. By explicitly describing all components within a process, from raw material to method, one can account for any changes.
To predict the impact on process performance, a profound understanding of the differences with respect to material, system, and operation is required, though fortunately obtaining this required information is often easy. The impact of the chromatographic system used, whether a lab-scale system or a manufacturing skid, can be assessed with pulse experiments or knowledge of the machine’s geometry. Data on column packing quality [are] often available as part of column qualification, and conductivity transition data may provide insight into fluid mechanics that is often overlooked. Data on buffer compositions and adsorber quality should also be available.
A particularly powerful application of these tools to manufacturing, aside from predicting the influence of scale, is root cause analysis. Whilst a mechanistic model enables one to find optimal conditions for process robustness, if deviations do occur, such a model is also well suited to determining the root of the issue. In this case, if a model that describes the in-control process already exists, data for the out-of-spec chromatogram are fed back into the model. Process parameters are then varied in silico to search for potential differences, such as raw material quality, operational changes, or feedstream variations, which enables one to isolate the mechanisms responsible for the changes and, therefore, make informed decisions during root cause analysis based upon data.
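This inversion idea can be sketched in a few lines. The snippet below uses the stoichiometric-displacement relation for ion-exchange retention (retention factor falling as a power of salt concentration) and grid-searches for the buffer salt concentration that reproduces an observed out-of-spec retention time. The function names `retention_time` and `infer_salt` and all parameter values are hypothetical, invented for this sketch; they are not part of any commercial tool.

```python
import numpy as np

def retention_time(c_salt, t0=1.0, A=50.0, nu=4.0):
    """Isocratic ion-exchange retention from the stoichiometric-
    displacement model: retention factor k = A * c_salt**(-nu),
    so t_R = t0 * (1 + k). Parameter values are illustrative."""
    return t0 * (1.0 + A * c_salt ** (-nu))

def infer_salt(observed_tr, grid=np.linspace(1.0, 3.0, 2001)):
    """Root-cause sketch: scan candidate buffer salt concentrations
    and return the one whose simulated retention time best matches
    the observed deviation."""
    errors = np.abs(retention_time(grid) - observed_tr)
    return float(grid[np.argmin(errors)])

# validated process at c_salt = 2.0 gives t_R = 4.125; a batch elutes
# out of spec at t_R = 7.0 — which buffer concentration explains it?
print(retention_time(2.0))  # 4.125
print(infer_salt(7.0))      # ~1.70, pointing to an under-strength buffer
```

In practice the search would run over many candidate parameters at once (material quality, operation, feedstream), with the best-matching mechanism guiding the investigation.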
BioPharm: Can digital twins be used in manufacturing to control a process? If so, what are the benefits and challenges of doing this?
Whitelock (GoSilico): Mechanistic models can be, and have been, used as part of a sophisticated control strategy, though often we see models being used to better understand the process and, therefore, find more robust conditions. In a situation where thousands of experimental conditions can be evaluated in silico, including variation in upstream feed or material quality, one can mechanistically understand the dynamics and develop a design space that will function with simple controls even in the face of process variability.
The challenges in implementing model-based control strategies are the availability and quality of data, as well as required timelines. A mechanistic model can easily predict the influence of a process change; variations within the process need to be quantified for such a prediction to be valid, and often these data are not easily available, which is a limitation in all modeling approaches. If such data are available, a mechanistic model would allow not just a prediction of process performance, but also whether this variation can be mitigated by varying operational conditions, such as chromatographic method or pooling strategy.
BioPharm: What do you predict for how digital twins will be used in biopharma in the future?
Whitelock (GoSilico): Over the past few years, we have seen great enthusiasm and rapid adoption of this technology across the industry, largely by process development groups. In the near term, we expect this trend to continue, as occurred for statistical models; mechanistic-model-driven process development will continue to establish itself as the standard approach for improved process performance, intensified development, and quality-by-design compliance.
Additionally, mechanistic modeling has an ever-growing research and development pipeline itself; in recent times, we have seen adsorption models based upon colloidal theory, with a greater basis in physics, emerge as revolutionary tools. We expect this direction to continue, in which increasingly sophisticated mathematical descriptions of bioprocesses become available through strong collaboration with both industry and academia.
Into the longer term, we expect to see manufacturing groups further embrace this technology; continued process verification, improved control strategy, and root cause analysis are just three facets that greatly benefit from a fundamental understanding of the underlying mechanisms of the process. Accordingly, fully developed digital twins will foreseeably play an important role in handling raw material variability, batch control, and real-time release testing in the future.
One of the largest obstacles to the adoption of such a revolutionary tool is the perceived complexity; certainly, the equations can look abstruse, and that is often enough to scare away those who are not engineers or computer scientists. However, the art and tools have matured to the point where user-friendly software is available, there is a community of experts, and there is an ever-growing library of papers and case studies; one does not need to be a mathematical genius with advanced coding skills to use this technology effectively. Any chromatographer can pick up the practical skills rapidly, and the fundamental principles are very intuitive.
Vol. 34, No. 9
When referring to this article, please cite it as J. Markarian, “Using Digital Twins to Model Process Chromatography,” BioPharm International, 34 (9) 2021.