Discrete Event Modeling
As a project progresses towards commercialization, discrete event modeling (DEM) - or discrete event simulation - represents
a more powerful and useful modeling tool. With DEM, an entire manufacturing facility can be modeled on a computer. This allows
the user to virtually build, run, and optimize a facility before investing any capital. The heart and soul of a DEM is the
scheduler, which simulates the running of the plant as the process calls for resources. If adequate resources are not available
to run the process optimally, delays become readily apparent. This allows the user to identify and understand
not only the causes of delays but also their impact. As such, DEM is the most powerful tool available for the design (including
support systems and utilities), de-bottlenecking, scheduling, and optimization of a manufacturing facility.
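The scheduler-and-resources idea above can be sketched in a few lines of code. The following is a minimal, hypothetical illustration (not any vendor's implementation; the tank counts and step times are invented): batches request a shared buffer tank, and when none is free the resulting delay is recorded, exactly the kind of bottleneck a DEM makes visible.

```python
import heapq

class Simulation:
    """Toy discrete event scheduler: a time-ordered event queue plus one resource pool."""

    def __init__(self, n_buffer_tanks):
        self.now = 0.0
        self.events = []        # priority queue of (time, seq, callback)
        self.free_tanks = n_buffer_tanks
        self.waiting = []       # batches queued for a tank, with queue-entry time
        self.delays = {}        # batch -> hours spent waiting for a tank
        self._seq = 0

    def schedule(self, delay, callback):
        self._seq += 1
        heapq.heappush(self.events, (self.now + delay, self._seq, callback))

    def request_tank(self, batch):
        if self.free_tanks > 0:
            self.free_tanks -= 1
            self.start_step(batch)
        else:
            self.waiting.append((batch, self.now))   # delay starts accruing here

    def start_step(self, batch, step_hours=8.0):
        # Occupy the tank for the step duration, then release it.
        self.schedule(step_hours, lambda: self.release_tank(batch))

    def release_tank(self, batch):
        self.free_tanks += 1
        if self.waiting:
            nxt, queued_at = self.waiting.pop(0)
            self.delays[nxt] = self.now - queued_at  # record how long it waited
            self.free_tanks -= 1
            self.start_step(nxt)

    def run(self):
        while self.events:
            self.now, _, cb = heapq.heappop(self.events)
            cb()

sim = Simulation(n_buffer_tanks=1)
for b in ("batch-1", "batch-2"):
    sim.request_tank(b)
sim.run()
print(sim.delays)   # batch-2 waited 8.0 h for the single tank
```

Running this with two tanks instead of one drives the recorded delay to zero, which is the de-bottlenecking question DEM is built to answer at full plant scale.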
When used in conjunction with a CB model, DEMs can provide a complete picture of manufacturing costs, plant throughput, and
resource utilization. In addition, DEM is the only tool that can help management understand the full impact of - and schedule
- multi-product facility changes.
As with CB models, a DEM is only as good as the process information supplied to the model. However, this model calculates
a greater number of parameters, and the implications of potential problems are more readily apparent than with CB models.
Sometimes, the most powerful tool is not the right tool for the job. This is true early in process development when the assumptions
made and the accuracy required do not warrant the time and cost of more sophisticated modeling. Nevertheless, once the basic
process is defined, DEM becomes a valuable tool for designing and optimizing a facility. For this reason, many companies begin
DEM as early as the preliminary design phase.
Past uses of DEM in the biopharmaceutical industry include optimization of a facility that was achieving only 50% of its design
capacity. By modeling the plant and process on a computer, and verifying the model's output against the actual operating facility,
we were able to identify the causes of the relevant delays and explore ways to alleviate them. Through an iterative process of
testing multiple changes to the facility and process, we were able to identify the ideal approach to optimize the facility's
output. Subsequently, these changes were implemented and the plant's productivity dramatically improved - as predicted by
the model. The running facility matched the revised model, thereby validating the model.
DEM has been used successfully to calculate the cost advantages of disposable technologies and to examine closely the cost
benefits of specific efficiencies in plant support areas; CB and MC models are not as precise. DEM has also helped in scheduling
resources in a manufacturing plant, including matching labor utilization to rotating shifts. Whatever the application, if the objective
is to address scheduling and the use of limited resources (including equipment, labor, support systems, and utilities), then
DEM is the right tool for the job.
John Curling: Early Scale-Up Helps - or Hurts
Our industry is in the process of maturing. It has done a good job of responding to the new opportunities provided by recombinant
DNA and hybridoma (a hybrid cell resulting from the fusion of a lymphocyte and a tumor cell) technologies. Industry is now
actively pursuing cost-effective pharmaceutical processes for defined therapeutic products, with an intense focus on the cost
of goods.
Table 1: Discovery and commercial production differences
Management may focus on COGS, cost of goods manufactured (COGM), or cost of sales, but the ultimate cost depends on the expression
system used, the expression level in that system, the integration of unit operations of the process, and the process efficiency
or yield. According to Myers, costs for monoclonal antibody production are split with one-third attributable to cell culture,
one-third to purification, and one-third to support. The overriding cost driver is the bioreactor titer, even though chromatography
may account for two-thirds of downstream processing costs.9 The best time to start addressing process costs is in the R&D phase.
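The leverage of titer as a cost driver can be made concrete with a back-of-the-envelope model. The numbers below are purely illustrative (they are not Myers' data): per-batch costs are split into the three roughly equal parts described above, and cost per gram then scales inversely with how many grams a batch yields.

```python
# Hypothetical cost-per-gram model: per-batch costs in arbitrary units,
# split one-third cell culture, one-third purification, one-third support.

def cost_per_gram(titer_g_per_l, volume_l=10_000,
                  cell_culture=1.0, purification=1.0, support=1.0):
    """Total per-batch cost divided by grams of product recovered per batch."""
    batch_cost = cell_culture + purification + support
    grams_per_batch = titer_g_per_l * volume_l
    return batch_cost / grams_per_batch

low_titer = cost_per_gram(1.0)    # 1 g/L bioreactor titer
high_titer = cost_per_gram(5.0)   # 5 g/L bioreactor titer
print(low_titer / high_titer)     # cost per gram falls fivefold with titer alone
```

Even this crude sketch shows why titer dominates: if the per-batch cost structure is held fixed, every improvement in titer translates directly into a proportional drop in cost per gram, regardless of how the thirds are allocated.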
The dialogue between research and process development is a potential gold mine. If product development is integrated further
back into discovery and drug development, more savings may be possible - although there is a risk of stifling research and
constricting the drug pipeline. Frequently, technology choices made during development in both upstream and downstream processing
present problems when the product is in the clinic or on the market.
A real challenge is addressing the difference in characterization standards between drug substances and biologics. Because
of their origin and complexity, biopharmaceutical entities are characterized by the process used to manufacture them - at
least in the eyes of regulators. This has immense cost impact because processes are frozen at the very early stage of clinical
trial lot production. Incorporating early-stage, poorly developed methodologies into production will have major impact later
on. Process development is therefore pivotal in transforming the R&D process into a manufacturing process.
According to Karri et al., "There is as yet no framework to link together predictive process models as part of an overall business
planning strategy. This is probably a reflection of the historical focus on discovery, rather than manufacture, and of the very
significant margins, and hence returns, that have characterized the industry during past decades."10
This is also a statement of the hidden cost of biologics development. Most companies have learned that if process development
releases an incomplete process to manufacturing, then multiple failures and consequent delays and major costs will ensue.
Gary Pisano examines the differences between discovery and full commercial-scale production in his book, The Development Factory.11 Table 1, adapted from that book, shows the major differences between the two.
The driving forces in research are the identification of the potential drug substance and demonstration of initial therapeutic
effect. At this stage, little attention is paid to optimizing purity and cost outputs. The initial microbiology and biochemistry
lay the foundation for the process - the process development group must scale up the batch, improve the purity, and cut costs
while meeting a schedule.
At this stage, scale-up is nonlinear, and new equipment configurations will be used - sometimes with advantages, sometimes
without. It is questionable whether we yet have the models and structures to describe and help effect the major changes that
need to occur to, for example, reduce the number of steps from 25 to 7, increase the purity to >99.9% (and identify and quantify
impurities and product variants), and achieve regulatory compliance (see Table 1).
Over the past decade, highly sophisticated software tools and models have been developed. These analyze and describe the cost and
operational aspects of biological processes ranging from fermentation to formulation. Programs for DEM and COGS, such as SuperPro Designer
and the BPS Simulation Process Models, provide excellent support. They force the user to identify every detail that impacts
the process. That alone is a critical task of process development.
In microbiological development, R&D may choose the cell line for speed of development rather than for genetic stability, expression
level, or adaptability to animal serum-free or protein-free media. The purification burden therefore gets pushed onto downstream processing.
Temporarily, we get an apparent gain by the early integration of biochemical engineering with biochemistry. However, trusted
separation media and operations are integrated before optimization and before consideration of newer and more cost-effective
technologies that could cut steps in the process.