Best Practices for Data Integrity

Published in: BioPharm International, July 1, 2017
Volume 30, Issue 7
Pages: 34–36, 41

Optimize practices and meet requirements using electronic data integrity systems.  

Maintaining data integrity is essential for all biopharmaceutical manufacturing operations. BioPharm International interviewed Dan Nelson, Customer Experience Team member at DataNinja, which provides real-time manufacturing and quality software for the life-science industry, for insight on best practices in maintaining manufacturing data integrity. 

Importance of data integrity

BioPharm: What do you see as the most important considerations for manufacturing data integrity?

Nelson (DataNinja): There are really two main reasons why biopharmaceutical manufacturers care about data integrity: protecting the business and enabling the business. If manufacturing data are missing or inaccurate, the firm is in violation of GMP regulations; it may mean that the quality of the product cannot be trusted. In serious cases, poor data integrity could lead to an FDA-mandated shutdown of production. I’d say just about every biopharmaceutical manufacturer understands this concern. So, out of fear, they invest heavily in business-protecting activities to ensure data integrity. These investments usually take the form of additional manual document reviews, more rigorous quality documentation, hiring more personnel, implementing electronic systems, and so on.

Many companies thus view data integrity as a law-mandated, paper-intensive cost center. That stereotype is changing, however, as innovative companies leverage technology to turn data integrity into a business-enabling profit center.

There are massive amounts of data collected during production, including machine usage; historian data; timestamped, in-process measurements; and materials usage. Forward-thinking manufacturers use these data for business-optimization efforts. 

Using data to optimize processes

BioPharm: What are some examples of business-enabling or optimization activities? 

Nelson (DataNinja): Once the business feels protected, it can progress to the next level, where the concern starts to become more about data accessibility. A thought process, for example, may be ‘Now we have production yield and quality data that are reliably collected, and we can trust their integrity. How might we leverage that data to minimize variability?’

At a large contract manufacturer in Salt Lake City, Utah, for example, one quality manager tasked her team with identifying the root causes of batch-yield variance. They meticulously went through batch production records, one filing cabinet at a time, to aggregate datasets of the suspected yield-variability drivers. When they compared yield data against all the other datasets, they discovered that one of their main suppliers was under-filling, and that this was the strongest correlate of batch-yield fluctuation. 

Dataset correlation analysis is common practice. It’s how hard-won, often extremely valuable insights are earned. In every industry, the innovators compete with data. 
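Under the hood, that kind of analysis is straightforward. A minimal sketch, using entirely hypothetical per-batch numbers (supplier fill weight versus batch yield), of the Pearson correlation such a team would effectively compute:

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-batch data: supplier fill weight (kg) vs. batch yield (%)
fill_weight = [10.0, 9.6, 10.1, 9.4, 9.9, 9.5]
batch_yield = [92.0, 88.5, 93.1, 86.9, 91.2, 87.8]

r = pearson_r(fill_weight, batch_yield)
# A strong positive r would flag under-filled lots as a yield-variance driver.
```

The paper-based version of this took a filing cabinet at a time; with the data already in a database, it is a few lines of code.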

Biopharmaceutical manufacturers are leveraging data integrity to answer their toughest operational questions, such as the following:

  • Which piece of equipment is causing the most packaging deviations?

  • When is scrap occurring?

  • Where are my bottlenecks?

  • How can I get more timely and accurate financials from manufacturing?

More life-science manufacturers are stepping away from paper batch records. A paperless system lowers the cost of collecting data and increases the ability to analyze and profit from it. It enables you to find correlations that would otherwise have been impossible, or not cost-effective, to capture with a paper-based system. 

Ideally, what you are after is a system that meets all your business-protecting requirements while simultaneously leveraging that data integrity for business-enabling activities. For example, with such a system in place, you could connect an initial entry on an electronic batch record to historian sensors and then process a financial transaction in your ERP [enterprise resource planning system], all in real time. 



Electronic data systems

BioPharm: What are the benefits of electronic data systems for improving data integrity? 

Nelson (DataNinja): First, electronic systems improve accuracy. With an electronic system, you get real-time verification of operator data entry, instead of catching mistakes or missed entries in the final quality-control review. Operators receive immediate feedback if they do anything that could cause a deviation. 
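As a sketch of what that immediate feedback might look like (the function name and spec limits here are illustrative, not from any particular system):

```python
def check_entry(value: float, low: float, high: float) -> str:
    """Validate an operator entry against spec limits at the moment of entry,
    rather than waiting for the final quality-control review."""
    if low <= value <= high:
        return "accepted"
    return "flagged: out of specification"

# Hypothetical in-process pH spec of 6.8-7.4
status_ok = check_entry(7.1, 6.8, 7.4)   # "accepted"
status_bad = check_entry(8.2, 6.8, 7.4)  # "flagged: out of specification"
```

The point is the timing: the check runs while the operator is still at the station, when a deviation can still be prevented.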

Second, electronic systems make data searchable. Instead of burying loads of useful manufacturing data in a filing cabinet, an electronic system stores it in a database ready for value-adding insights to be extracted from data trends. No need to type in critical data points. Your analysts can get straight to discovering insights. 

BioPharm: What are the data integrity risks associated with computerized systems and how can these be mitigated? 

Nelson (DataNinja): The data integrity risks associated with computerized systems fall into two main buckets: data loss and data manipulation. Loss of any manufacturing quality data is potentially catastrophic. Thanks to redundant database architectures and modern cloud applications, computerized systems are far less vulnerable than paper to total data loss. They do have a unique weakness, however: there are more variables that can result in partial data loss. With paper, all you have to worry about is completeness and keeping track of the physical document. With computerized systems, you might have to deal with Internet disruptions, hardware failures, or human error (e.g., pressing log-out instead of save). These risks are best mitigated with redundancy, audit-trail robustness, and auto-saving. 

Cloud redundancy can mitigate risk. If lightning struck your server room and instantly destroyed everything in it, how would that affect your data integrity? Modern computerized applications combine government-grade security with cloud agility, so data are stored securely with multiple redundancies and protected by multiple failover locations.

Audit-trail robustness is also crucial. With any system, it’s essential to know what was done and who did it. We have all reviewed paper records with an initial missing; it’s a major pain. Given the hundreds of initials required on each batch, there is a calculable probability that one out of ‘x’ batch records will be missing an initial. Computerized systems combat this threat by tracking user actions based on user credentials: when a user completes documentation, the system automatically records who did it, along with a timestamp.
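The mechanism is simple to sketch. A minimal, append-only audit trail (the class and field names are hypothetical, not any vendor's API) that records the user and a UTC timestamp for every action:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    user: str
    action: str
    timestamp: str  # ISO-8601 UTC, captured automatically

class AuditTrail:
    """Append-only audit trail: every action is attributed and timestamped."""
    def __init__(self):
        self._entries = []

    def record(self, user: str, action: str) -> AuditEntry:
        entry = AuditEntry(user=user, action=action,
                           timestamp=datetime.now(timezone.utc).isoformat())
        self._entries.append(entry)
        return entry

    def entries(self):
        # Read-only view: entries can be inspected but never edited or deleted.
        return tuple(self._entries)

trail = AuditTrail()
trail.record("operator_jdoe", "completed step 4.2: pH measurement")
```

Because attribution comes from the login credentials, a "missing initial" simply cannot occur: no action is recorded without an attached identity.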

Auto-saving is important. A computerized system should not put your work at risk based on whether or not you press a save button; that risk should be mitigated with auto-saving functionality. If you don’t have auto-saving, you’d better focus your efforts on training. 
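A common way to make auto-saving safe is to write atomically, so that a crash or power loss mid-save can never leave a half-written record on disk. A minimal sketch under that assumption (the file path and record fields are hypothetical):

```python
import json
import os
import tempfile

def autosave(state: dict, path: str) -> None:
    """Persist state on every change. Writing to a temp file and renaming
    means readers never see a partially written record, even after a crash."""
    tmp = path + ".tmp"
    with open(tmp, "w") as f:
        json.dump(state, f)
    os.replace(tmp, path)  # atomic rename on POSIX and Windows

# Hypothetical in-progress batch-record entry, saved as it is typed
record = {"step": "4.2", "entry": "pH 7.1", "operator": "jdoe"}
path = os.path.join(tempfile.gettempdir(), "batch_record_autosave.json")
autosave(record, path)
```

With a pattern like this wired to every field change, pressing log-out instead of save loses nothing but the in-flight keystroke.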

BioPharm: How can manipulation of electronic records be detected and prevented?

Nelson (DataNinja): Detection of data manipulation must be built into the computerized system. Advanced computerized systems use cryptographic hashing, such as MD5 (newer systems favor stronger algorithms such as SHA-256), to verify that the original data remain intact. The hash is like a digital seal, uniquely generated from the contents of the electronic record. If data in the record were manipulated, the hash would change and reveal the manipulation. Computerized systems for biopharmaceutical manufacturers must comply with FDA’s 21 CFR Part 11 regulation for electronic records and electronic signatures.
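A minimal sketch of that digital-seal check using Python's standard hashlib (the record contents are hypothetical; MD5 is shown to match the mechanism described above, though SHA-256 is the better choice for new systems):

```python
import hashlib

def seal(record: bytes) -> str:
    """Compute a digest 'seal' over the record's exact byte contents."""
    return hashlib.md5(record).hexdigest()

# Hypothetical signed batch-record contents
original = b"batch=1234;yield=92.3;signed_by=jdoe"
stored_seal = seal(original)  # stored alongside the record at signing time

# Later verification: any manipulation of the record changes the digest.
tampered = b"batch=1234;yield=95.0;signed_by=jdoe"
assert seal(original) == stored_seal      # untouched record verifies
assert seal(tampered) != stored_seal      # manipulation is detected
```

The seal only detects manipulation; preventing it is a matter of access control and separation of duties, as discussed next.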

The risk of data manipulation is mitigated through validation and access. It’s hard to manipulate pen on paper; it looks obvious when something appears to be overwritten. How can you trust that computerized systems contain no data manipulation? First, limit access. Make it impossible for people entering the data to manipulate it, whether in person or through a co-worker. Put in place total separation of powers so that there is no way someone could get someone else to manipulate data post-electronic signature. In regulatory jargon, this means you want your computerized system to be ‘open’.  

In addition, validate. Auditors won’t believe that the system behaves as expected unless you can prove it. You must validate the computerized system following a thorough, risk-based approach. The validation effort for Part 11-compliant systems varies depending on whether the people using the system have any capability to manipulate data on the back end.

Commercial off-the-shelf (COTS) software is the easiest to validate. All electronic record systems require in-depth validation, but with COTS you need only validate for your intended use. 

BioPharm: What is involved in an audit of electronic data?

Nelson (DataNinja): Excelling in GMP audits is about complying with the law and being able to prove that you did what you said you would; the same principles apply to an audit of electronic data. Before cutting over to a paperless manufacturing solution, you must send FDA a paper certification bearing a wet signature, stating that you consider electronic signatures to be the equivalent of traditional handwritten signatures. You also need to validate the solution and make sure that your supporting documentation aligns with the new way you record information. The actual audit procedure is fairly similar to audits of paper-based data.



When referring to this article, please cite it as J. Markarian, “Best Practices for Data Integrity,” BioPharm International 30 (7) 2017.