Process Development: Maximizing Process Data from Development to Manufacturing

Published in BioPharm International, July 1, 2007, Volume 20, Issue 7

Process development and manufacturing for biopharmaceuticals are often disjointed activities. Disconnects among groups are aggravated by a lack of common terminology and poor data management practices. A UK biotech consortium has initiated a collaborative development effort to address data management issues. The proposed outcome is a data model, based on the ISA-88 Standard for Batch Control, to capture process and facility data throughout the product lifecycle. A data framework that follows the ISA-88 model can simplify process scale up and enable early views of project costs and facility fit.



The biopharmaceutical industry shows signs of maturity, and increasing competitive pressure is driving the need for faster, cheaper development and production processes. Companies are actively looking for solutions that speed progress from lab to pilot-scale manufacture and that promote cost-effective production processes. Progress toward these goals is hindered by the complexity and variability of biological product manufacturing, however, and current initiatives, such as quality by design, are a reflection of how much the industry has yet to learn about bioprocesses.


Product development lifecycles typically range from five to eight years for biopharmaceuticals;1 the long lag times between development and manufacture make it difficult for companies to learn from past experience. Furthermore, investigational process-development activities usually take second place to the main objective of producing product as quickly as possible. The organizational divides that frequently separate development and manufacturing activities only aggravate the situation, particularly when companies lack effective data management.

A survey of pharmaceutical companies conducted by Ken Morris, PhD, of Purdue University, and Sam Venugopal and Michael Eckstut of Conformia Software, Inc., revealed that few companies can track data and decision-making processes during the development lifecycle, and that most are dissatisfied with the ability of their existing IT systems to capture and manage drug development information.2 Survey results indicate that drug development specialists spend five hours a week, on average, looking for data, and about two-thirds of respondents reported they couldn't find 10–20% of the data they needed. It is virtually impossible to achieve continuous improvement in such an environment, and the cost of lost time and rework during development can be significant. The average cost to develop a new pharmaceutical is more than $800 million,3 and development costs continue to rise.

To manufacturers in other industries, the solution seems obvious: implement a new, comprehensive IT solution, or make better use of existing systems. Yet even though many sophisticated IT solutions are available, ranging from enterprise resource planning (ERP) systems to electronic lab notebooks (ELNs), adoption in the biopharmaceutical industry has been slow and problematic. There are several reasons for this. First, the entrenched use of paper lab notebooks and uncertainty about electronic records and signatures have made many organizations reluctant to abandon paper-based systems. Companies that do make the switch often go through an awkward transition period in which both paper and electronic systems are in use; this doubles the workload of end users and can increase resistance to change. Second, software vendors tend to adopt a "one size fits all" approach, a flawed practice because users work in different ways and the biopharmaceutical industry spans a wide range of business models: small companies may focus on discovery and outsource their manufacturing, large companies may buy in new processes, and so on. Third, in some cases the issues go beyond data management and reflect more fundamental disconnects in communication; adding automation to ineffective business processes often makes the problems worse.



To address some of these issues, a research consortium was formed in 2006 to collaboratively develop a knowledge-management model and supporting applications. End users are from research, process development, technology transfer, and manufacturing. The consortium received a grant from the UK Department of Trade and Industry (DTI) for a three-year project. The consortium is led by BioPharm Services Ltd., a technical consultancy with experience in bioprocess simulation and design, in collaboration with Avecia and Cambridge Antibody Technology (CAT). BioPharm Services has created process simulation models for Avecia and CAT in the past, and each company offers a unique perspective on the biopharmaceutical industry. In addition, the collaborative agreement enables the model to be tested with sample process data and scenarios early in development. The following sections outline the approach taken and the progress to date.


It was apparent early in the project that consistent, standard terminology for representing bioprocesses is essential in order to create a data model with broad application. The general model for batch manufacturing defined by the Instrumentation, Systems, and Automation Society—the ISA-88 Standard for Batch Control—seemed a good starting point, given its widespread use for control systems and automation.4,5 Two aspects of ISA-88 are especially useful for defining a bioprocess model: a separation of process requirements from equipment capability, and a modular design approach. The concept of defining a process in dimensionless terms is inherent to scale up and facility fit assessments, and the practice of defining bioprocesses as a sequence of unit operations is essentially a modular design. Because ISA-88 is a standard for batch control, some simplifications were made to apply the standard to a knowledge-management model.

Figure 1

The focus of the knowledge-management model is the general recipe—the processing sequence required to produce a given product defined in dimensionless terms. There is one general recipe per product, regardless of scale. When the general recipe is carried out for a specific batch size, the resulting process sequence is the master recipe, and the equipment required to produce the product is defined in the physical model. The process of mapping a general recipe to a particular set of equipment to create a master recipe takes place through the use of transform components—a combination of equipment-sizing calculations and descriptions of process requirements and constraints. From the perspective of a biopharmaceutical company, the key activities are the creation of a general recipe during process development, and the transformation of a general recipe to a master recipe during technology transfer. This is summarized in Figure 1; Figure 2 illustrates how the ISA-88 model maps to the typical biopharmaceutical product development cycle.

Figure 2
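The recipe hierarchy described above can be sketched in code. The following is a minimal illustration, not the consortium's actual data model: the class names, the use of a single `relative_volume` parameter as the "dimensionless" quantity, and the smallest-fitting-vessel rule inside the transform are all simplifying assumptions for the sake of example.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessStep:
    """One scale-free step of a general recipe."""
    name: str
    # Volume expressed per unit of batch volume, so the recipe stays dimensionless.
    relative_volume: float

@dataclass
class GeneralRecipe:
    """Scale-independent processing sequence: one per product (ISA-88 general recipe)."""
    product: str
    steps: list[ProcessStep] = field(default_factory=list)

@dataclass
class Vessel:
    """Physical-model entry: a piece of equipment with a working volume in litres."""
    tag: str
    working_volume_l: float

def to_master_recipe(recipe, batch_volume_l, vessels):
    """Illustrative 'transform component': size each step for a given batch
    volume and map it to the smallest vessel that can hold it."""
    master = []
    for step in recipe.steps:
        required_l = step.relative_volume * batch_volume_l
        candidates = [v for v in vessels if v.working_volume_l >= required_l]
        if not candidates:
            raise ValueError(f"No vessel large enough for step {step.name!r} ({required_l} L)")
        chosen = min(candidates, key=lambda v: v.working_volume_l)
        master.append((step.name, required_l, chosen.tag))
    return master
```

A real transform component would also carry process constraints (hold times, temperature limits) and equipment capabilities beyond volume, but the separation is the same: the general recipe knows nothing about equipment, and only the transform binds the two.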


A general recipe can serve several purposes: as the end product of a series of development studies; as a checklist for ensuring that enough information has been specified for an independent organization to manufacture the product; as a process model for evaluating the manufacturing requirements of a proposed new product; or any combination of these. Ideally, the information contained within the general recipe provides sufficient guidance for operations staff to generate manufacturing procedures and operate the process at any scale. In reality, process scheduling and performance are constrained by facility, resource, and equipment availability. Furthermore, the difference in requirements between non-GMP (good manufacturing practices) and cGMP (current good manufacturing practices) manufacture can have a significant impact on timing and resource usage. For this reason, supporting information about the criticality of parameter ranges, particularly hold time and temperature for process intermediates, is vitally important for troubleshooting during the technology transfer phase and must be incorporated into the data model.

One challenge of defining bioprocesses in dimensionless terms is the wide range of unit operations and their requirements for in-process monitoring and control. The approach taken for the knowledge-management model is to define a library of standard process actions representing basic processing steps, such as "heat," "agitate," and "pressurize," together with their associated parameters. These defined process actions then serve as building blocks that can be assembled in different sequences to generate a general recipe, with control points clearly identified.
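A process-action library of this kind can be sketched as a simple registry that maps each action name to its required parameters. The action names below follow the article's examples ("heat," "agitate," "pressurize"); the parameter names and the dictionary-based representation are invented for illustration.

```python
# Hypothetical registry: action name -> the parameters that action requires.
ACTION_LIBRARY = {
    "heat":       {"target_temp_c", "ramp_time_min"},
    "agitate":    {"speed_rpm", "duration_min"},
    "pressurize": {"target_pressure_bar"},
}

def make_action(name, **params):
    """Build one recipe step, validating parameters against the library."""
    required = ACTION_LIBRARY[name]
    missing = required - params.keys()
    if missing:
        raise ValueError(f"{name}: missing parameters {sorted(missing)}")
    return {"action": name, "params": params}

# Standard actions serve as building blocks assembled into a recipe sequence.
buffer_prep = [
    make_action("agitate", speed_rpm=200, duration_min=30),
    make_action("heat", target_temp_c=25, ramp_time_min=15),
]
```

Because every "heat" step carries the same parameter set regardless of product or scale, recipes built from the library stay comparable and searchable across projects.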


The initial focus of the project has been to define a data structure for the general recipe and physical model based on the ISA-88 model, and to populate a test database with data from sample processes and equipment lists. The next stage of the project will involve testing the functions required to transform a general recipe into a master recipe with different scenarios. The consortium has agreed to follow Good Automated Manufacturing Practice (GAMP) guidelines during the development process, to allow the resulting software to be validated where necessary. The benefits of having a validated knowledge-management model include the potential to extend the data structure and applications into manufacturing and to support scale-up models for regulatory submissions. The validation approach for the project is to consider each component of the knowledge-management model separately, starting with a core database, and then to extend the functionality with additional software applications.


An important outcome of the project's requirements-gathering phase was a better understanding of the potential benefits of the knowledge-management model. By allowing key process information to be systematically captured and retained throughout development, the model can enable scientists and engineers to learn more effectively from their collective experiences. A better view of historical performance also enables better predictions of future process performance and costs, and these benefits will be the focus of the next development stage for the project. The following sections describe some benefits identified during the planning phase.


A simple and intuitive approach to managing data involves separating key process information, such as data about process steps and raw materials, from routine experimental methods and general report information. Storing data and methods separately is a useful design output in itself, and it supports rapid generation of developmental reports while making it easy to search for, access, and interpret previous work.

This approach can reduce the time scientists and engineers spend creating documentation; it also simplifies and accelerates acquisition of data from past studies. For example, a simple data-entry interface can allow process-development scientists to define process parameters, to reference supporting experiments, and to select buffer recipes and operating methods from a predefined standards list. This saves them time when creating process documentation. Once a history of experiments and process definitions is built up over time, the database can also be used as a troubleshooting tool. Engineers can search for information about an unexpected outcome for a particular unit operation to see if the problem has been previously encountered and if any workaround solutions have been suggested. Similarly, operations managers can reference information about maximum allowable hold times for process intermediates during troubleshooting to evaluate whether to proceed with a batch.
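The troubleshooting use described above amounts to a search over accumulated development records. The sketch below assumes a flat list of records with invented field names (`unit_operation`, `observation`, `workaround`); a real system would query a structured database instead.

```python
# Hypothetical store of prior development observations; contents are invented.
records = [
    {"unit_operation": "protein A capture", "observation": "low yield",
     "workaround": "extend wash step"},
    {"unit_operation": "depth filtration", "observation": "filter blinding",
     "workaround": "increase filter area"},
]

def find_precedents(unit_operation, keyword):
    """Return prior records for a unit operation whose observation mentions a keyword."""
    return [r for r in records
            if r["unit_operation"] == unit_operation and keyword in r["observation"]]
```

An engineer facing an unexpected outcome can then check in seconds whether the same unit operation has produced the same symptom before, and what was done about it.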


The increasing popularity of platform processes indicates the value of standardization for bioprocesses. A common data repository that provides access to preferred materials and operating methods can promote the development of robust process designs and enable process-development scientists to focus on refining those process sections most likely to impact product quality or process performance. In addition, ensuring that standard representations and terminology are used throughout the process can reduce confusion during technology transfer and improve communication among groups.


Ultimately, the goal of the knowledge-management model is to enable feedback from manufacturing performance to guide process development and create a model that supports continuous improvement and better process understanding. Various modeling tools are commonly used in the industry to gain a better view of the effect of process-development decisions on manufacturability, and the data model should support more effective use of these modeling tools. By enabling assumptions and relationships defined in the model to be compared with historical process data, modeling tools can be continually refined and improved. For example, cost models can assess the impact of process-development decisions on manufacturing costs, and simulation models can assess facility fit and overall manufacturing performance and resource utilization.
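The feedback loop described here, comparing model assumptions against historical process data, can be illustrated with a deliberately simple example. The yield figures and the mean-based update rule below are invented; a real cost or simulation model would refine many parameters with more sophisticated statistics.

```python
# Observed overall process yield from four historical batches (invented numbers).
historical_yields = [0.62, 0.58, 0.65, 0.60]

def refine_yield_assumption(observed):
    """Update the model's yield assumption to the mean of observed batch yields."""
    return sum(observed) / len(observed)

assumed_yield = 0.70                      # original cost-model assumption
refined_yield = refine_yield_assumption(historical_yields)
gap = assumed_yield - refined_yield       # how far the assumption was off
```

Run against manufacturing data on a regular schedule, even a crude loop like this keeps cost and facility-fit models honest instead of letting early-development assumptions persist unchecked.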


The planning phase for developing the knowledge-management model indicates that a central repository for process data in a structured format will offer benefits for process development and technology transfer. In addition, the model will support progress toward improved process understanding. The next development phase will focus on creating applications for defining and scaling up bioprocesses based on the general recipe concept.

(Companies interested in participating in this development effort are encouraged to contact Andrew Sinclair at BioPharm Services for further information.)

Claire Hill is a bioprocess engineer at BioPharm Services UK, Lancer House, East Street, Chesham, Bucks, HP5 1DG, United Kingdom, +44 (0) 1494.793243,

Andrew Sinclair is cofounder and managing director at BioPharm Services UK, Lancer House, East Street, Chesham, Bucks, HP5 1DG, United Kingdom, +44 (0) 1494.793243,


1. Reichert JM. Trends in development and approval times for new therapeutics in the United States. Nature Reviews Drug Discovery. 2003;2(9):695–703.

2. Morris K, Venugopal S, Eckstut M. Making the most of drug development data. PharmaManufacturing. 2005 Nov 30;4(10):16–23.

3. DiMasi JA, Hansen RW, Grabowski HG. The price of innovation: New estimates of drug development costs. J Health Econ. 2003;22(2):151–85.

4. American National Standards Institute/Instrumentation, Systems, and Automation Society (ANSI/ISA). ANSI/ISA-88.01–1995 Batch Control Part 1: Models and terminology. Research Triangle Park, NC; 1995.

5. American National Standards Institute/Instrumentation, Systems, and Automation Society (ANSI/ISA). ANSI/ISA-88.00.03–2003 Batch Control Part 3: General and site recipe models and representation. Research Triangle Park, NC; 2003.