Accelerating Bioprocess Optimization
A series of advancements has changed the way bioprocesses are developed and optimized.


BioPharm International
Volume 24, Issue 4, pp. 38-44

THE PRACTICAL IMPLEMENTATION OF NGG FOR ACCELERATING BIOPROCESS QBD

One such project that the authors have been involved with is a collaborative partnership between a major pharmaceutical company and ArrayXpress, a contract genomics services company. While the specific organism and target compound are confidential, the tools, techniques, and processes used provide a good example of the benefits of the NGG systems biology approach. The primary objective of the project under discussion was to increase production titers of an essential target compound used in the manufacturing process of a current, large-revenue commercial product. The secondary objective was to build knowledge that would allow faster and more efficient manufacturing of other products using the same organism/expression platform. As with all bioprocesses, the organism itself is only one variable influencing productivity; many environmentally tunable variables make up the remainder. Critical process parameters (CPPs) were known to influence production, but because of prior technology limitations, the manufacturing engineers did not know their full impact on the metabolic processes and production efficiency of the cells. These parameters had therefore simply been lumped together as an unknown called "process variability" or "biological variability," and manufacturing was completely at the mercy of the process itself, with limited process stability and dramatic product-titer variability.


Figure 1: Fishbone diagram showing all the confirmed and putative CPPs associated with the overall target compound production process.
By bringing together the cells and the CPPs in a systems model, we can now see the entire equation. The cells are the primary production machinery; therefore, our approach was to evaluate the physiological condition and state of the cells during the various media and fermentation development stages. We first generated a working hypothesis by developing a fishbone diagram that showed all the confirmed and putative CPPs associated with the overall target compound production process (see Figure 1). This allowed critical areas to be identified for more detailed characterization, which were subsequently tested experimentally.

Our approach was to design highly focused and statistically sound microarray experiments with complementary standard analytical chemistry tests. We wish to emphasize the importance of having a well-thought-out experimental design and analysis strategy prior to project initiation. This approach made it possible to identify key genes, and their associated molecular pathways, that were differentially affected by changes in various CPPs in the overall production process. The use of DNA microarrays provides a detailed qualitative snapshot of the state of the transcriptome at the time of sampling, somewhat like a molecular fingerprint, that can reveal subtle process variations in great detail. This approach is especially useful in time-course experiments such as ours, in which whole-transcriptome changes associated with different CPPs are monitored across different growth phases (i.e., time points) of the cells during the media and fermentation optimization stages.
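
For readers who want a concrete picture of such a design, the sketch below enumerates a hypothetical balanced layout: trial media crossed with growth-phase time points and biological replicates. The media names, phase labels, and replicate counts are placeholders for illustration, not the project's actual design.

```python
# Illustrative sketch of a balanced time-course microarray design.
# Media names, time points, and replicate counts are hypothetical placeholders.
from itertools import product

import pandas as pd

media = ["defined_A", "defined_B", "complex_control"]           # trial media (assumed)
phases = ["early_log", "late_log", "transition", "stationary"]  # sampled growth phases
replicates = 3                                                  # biological replicates per condition (assumed)

design = pd.DataFrame(
    [(m, p, r) for m, p, r in product(media, phases, range(1, replicates + 1))],
    columns=["medium", "growth_phase", "replicate"],
)
design["array_id"] = [f"array_{i:03d}" for i in range(1, len(design) + 1)]

print(design.head())
print(f"{len(design)} arrays total")  # 3 media x 4 phases x 3 replicates = 36
```

A balanced, fully crossed layout of this kind is what allows the factorial effects described below to be estimated cleanly and compared across growth phases.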

Strong bioinformatics, spanning statistical design as well as data analysis and mining, is the next key to success. A particularly important aspect of statistical inference in high-throughput problems, such as microarray experiments, is the assessment of the statistical significance exhibited by the data in the presence of a tremendous multiplicity of hypotheses; a single experiment can involve tens of thousands of hypothesis tests. This assessment requires efficient estimation of experimental error and careful control of false discovery rates. We applied two interconnected analysis-of-variance models: a normalization model that accounts for experiment-wide systematic effects that could bias inferences made from the data on individual genes, and a gene model that is fit to the normalized data from each gene, allowing inferences to be made using separate estimates of variability. Expression differences are then parameterized as factorial effects in linear mixed-effects models appropriate to the experimental design. These effects can be estimated efficiently using statistical software such as JMP Genomics or SAS PROC MIXED. The resulting least-squares estimates are then mapped onto their associated metabolic pathways using KEGG metabolic pathway maps (www.genome.jp/kegg/pathway.html) in combination with proprietary software mapping tools (3).
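
As a rough illustration of this two-stage approach (not the project's actual code; the column names, factors, and grouping structure are assumptions for the example), the sketch below first fits a normalization model that removes experiment-wide array effects, then fits a gene-level mixed model to the residuals and applies Benjamini-Hochberg false-discovery-rate control across genes. In practice, tools such as JMP Genomics or SAS PROC MIXED automate this workflow.

```python
# Two-stage ANOVA sketch for microarray data (column names are assumptions):
# long_df has columns: gene, array, culture, medium, growth_phase, log_intensity,
# where "culture" identifies one fermentation run sampled at several time points.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

def two_stage_analysis(long_df: pd.DataFrame) -> pd.DataFrame:
    # Stage 1: normalization model -- remove experiment-wide systematic
    # effects (here, per-array shifts) that would otherwise bias gene-level tests.
    norm_fit = smf.ols("log_intensity ~ C(array)", data=long_df).fit()
    long_df = long_df.assign(resid=norm_fit.resid)

    # Stage 2: gene model -- fit each gene's normalized values separately, with a
    # random intercept per culture (repeated measures across growth phases), so
    # each gene gets its own estimate of variability.
    rows = []
    for gene, sub in long_df.groupby("gene"):
        fit = smf.mixedlm(
            "resid ~ C(medium) * C(growth_phase)", data=sub, groups=sub["culture"]
        ).fit()
        # Smallest fixed-effect p-value as a simple "any treatment effect" summary.
        pval = fit.pvalues.drop("Group Var", errors="ignore").min()
        rows.append({"gene": gene, "p_value": pval})

    results = pd.DataFrame(rows)
    # Control the false discovery rate across the thousands of gene-level tests.
    results["fdr_bh"] = multipletests(results["p_value"], method="fdr_bh")[1]
    return results.sort_values("fdr_bh")
```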

The ability to map differentially expressed genes onto their associated biochemical pathways provides the opportunity to "zoom in" on each of the metabolic pathways associated with protein production. Key metabolites that are either depleted or produced are relatively easy to identify, but true process understanding comes from identifying how those compounds are used in the metabolic machinery. Amino acids, for example, could be depleted by translation, interconversion to other amino acids, or detoxification by the cell, and each of these routes has a dramatically different impact on cell health and productivity. With the application of NGG techniques, you do not have to wait until the end of the project to begin seeing results. Each individual experiment contributes to the "systems" knowledge but, in the short run, provides specific information on variables that can be tuned for performance. Over the past three years, we have completed numerous microarray experiments as part of our primary media and fermentation optimization objectives. A few examples are highlighted here to demonstrate the power of microarray technology to improve bioprocess stability and production yields as part of a larger NGG initiative.
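
As a hedged illustration of the mapping step, the short script below queries the public KEGG REST interface for gene-to-pathway links and tallies how many differentially expressed genes land in each pathway. The organism code and gene identifiers are placeholders (the actual organism is confidential), and the project additionally relied on proprietary mapping tools not shown here.

```python
# Sketch: map differentially expressed genes onto KEGG pathways via the
# public KEGG REST interface. Organism code and gene IDs are placeholders.
from collections import Counter

import requests

ORG = "eco"  # hypothetical organism code; the real organism is confidential

def kegg_gene_to_pathways(org: str) -> dict[str, set[str]]:
    """Fetch gene -> pathway links for an organism from rest.kegg.jp."""
    resp = requests.get(f"https://rest.kegg.jp/link/pathway/{org}", timeout=30)
    resp.raise_for_status()
    mapping: dict[str, set[str]] = {}
    for line in resp.text.strip().splitlines():
        gene_id, pathway_id = line.split("\t")
        mapping.setdefault(gene_id, set()).add(pathway_id)
    return mapping

def pathway_hit_counts(de_genes: list[str], mapping: dict[str, set[str]]) -> Counter:
    """Count how many differentially expressed genes fall in each pathway."""
    counts: Counter = Counter()
    for gene in de_genes:
        for pathway in mapping.get(gene, ()):
            counts[pathway] += 1
    return counts

if __name__ == "__main__":
    mapping = kegg_gene_to_pathways(ORG)
    de_genes = ["eco:b0002", "eco:b3829"]  # placeholder differentially expressed genes
    for pathway, n in pathway_hit_counts(de_genes, mapping).most_common(10):
        print(pathway, n)
```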

In the manufacturing process of the target compound of interest, the original growth-medium components were not well defined. As a result, different medium lots varied dramatically in protein yield and product titers. One of the primary objectives was therefore to develop a chemically defined medium that would yield consistent titers. In our experiments, we evaluated whether stress-response mechanisms of the production cells caused a reduction in titer during phase transition, and how media and fermentation conditions affected these stress responses. We carefully designed time-course experiments to cover the transition through growth phases with trial versions of different defined media. Complementary to this, we performed analytical chemistry tests to assign putative roles to transcription regulators that might be involved in the stress response. By ultimately correlating differentially expressed sigma-factor genes with their associated biochemical pathways, we were able to optimize and change certain media components, which led to improved protein production.
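
As a simple, hypothetical illustration of cross-referencing the transcriptome data with the analytical chemistry measurements, the sketch below correlates the time-course expression of candidate stress regulators with measured product titer and ranks them by strength of association. All gene names, titer values, and time points are invented for the example.

```python
# Sketch: rank candidate stress-response regulators by how strongly their
# expression tracks product titer across a time course. All values are invented.
import pandas as pd
from scipy.stats import pearsonr

# Rows = time points; columns = measured titer plus log expression of candidates.
timecourse = pd.DataFrame(
    {
        "titer_g_per_L": [0.4, 0.9, 1.6, 2.1, 2.0],
        "sigma_factor_1": [5.1, 5.8, 6.9, 7.4, 7.3],   # hypothetical regulator
        "sigma_factor_2": [8.2, 7.6, 6.5, 6.1, 6.2],   # hypothetical regulator
    },
    index=["6h", "12h", "24h", "36h", "48h"],
)

correlations = {
    gene: pearsonr(timecourse[gene], timecourse["titer_g_per_L"])
    for gene in timecourse.columns.drop("titer_g_per_L")
}
for gene, (r, p) in sorted(correlations.items(), key=lambda kv: -abs(kv[1][0])):
    print(f"{gene}: r = {r:+.2f}, p = {p:.3f}")
```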

