Closing Gaps in Quality Control of Electronic Data
A company must treat its entire virtual GxP-related environment as a process stream. Build a quality system around it to address the intricacies of the law.


BioPharm International
Volume 18, Issue 12


Table 4. Electronic Quality Systems Guidance Document
We have encountered a number of organizations that did not consider the virtual process stream in their daily practices, and a few cases are instructive. Many biotechnology and pharmaceutical firms have implemented Enterprise Resource Planning (ERP) systems across their local area networks (LANs) and wide area networks (WANs) that may fully meet the expectations of a high-quality electronic system. However, the generated data may then be shared with an external partner organization. The technology (e.g., an ETL tool, XML, or disk) and business practices used to transfer that information, and the use of the information within the partner organization, are often not fully taken into account by the originating organization. Frequently, we have found that once the information leaves the originator's environment, the originator seems to believe that its responsibility for the data has ended.
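One way an originating organization could retain a measure of responsibility after a handoff is to accompany each export with a checksum manifest that the receiving partner verifies on arrival. The sketch below is purely illustrative (the function names and file layout are our assumptions, not any cited standard); it uses SHA-256 digests computed over a directory of export files:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(export_dir: Path) -> dict:
    """Map each exported file name to its checksum, so the
    receiving partner can confirm integrity after transfer."""
    return {p.name: sha256_of(p)
            for p in sorted(export_dir.iterdir()) if p.is_file()}

def verify_manifest(received_dir: Path, manifest: dict) -> list:
    """Return the names of files that are missing or altered
    relative to the originator's manifest."""
    problems = []
    for name, expected in manifest.items():
        p = received_dir / name
        if not p.is_file() or sha256_of(p) != expected:
            problems.append(name)
    return problems
```

A record that the partner ran such a verification and found no discrepancies could itself be retained as evidence that the transfer did not alter the data.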

Another example focuses on the collection and use of data from clinical trials.4 Data are collected at multiple sites and sent to the analyzing or end-use organization. On many occasions we have witnessed these data being transcribed from paper forms into electronic format (by hand or via fax), saved to file, and physically or electronically mailed to a receiving organization. The data may undergo several more conversions (paper to electronic, electronic to paper, or electronic to electronic) and cuts for statistical analysis before being sent to the primary end user. The data are then prepared for use in important decision-making or a submission.

Although an analysis plan is generated, it is common for the plan not to specify the handling of the data in its entirety, or to specify incorrect or poor-quality procedures for handling the data outside of the statistical processing. The sponsoring agency may have no idea of the detailed handling (storage and conversion) and transmissions that actually occur to support the generation of the final report. Many organizations feel that storing the paper case report forms relieves them of concerns about the quality attributes of the downstream electronic, or combined physical and electronic, processes exercised.

As our last example, we have observed clients assuming that all processing occurs on a single server that may be validated, when in fact some handling moves the data off that server to another, non-validated location. Moreover, no procedures may exist to govern what happens to the data now stored in that other location. As a result, a "new" copy of the dataset is created and may end up being used in a study or a filing without any thought to its origin. An inspector could immediately call the integrity of a decision, submission, or investigation into question.
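The problem of an unexplained "new" copy could be mitigated by never copying a dataset without simultaneously appending a provenance record. The following is a minimal sketch, assuming a simple CSV audit log and local file copies; the record fields and helper names are illustrative assumptions only:

```python
import csv
import hashlib
from datetime import datetime, timezone
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def copy_with_provenance(source: Path, dest: Path,
                         log: Path, operator: str) -> None:
    """Copy a dataset and append a provenance record, so any
    later copy can be traced back to its validated origin."""
    dest.write_bytes(source.read_bytes())
    new_file = not log.exists()
    with log.open("a", newline="") as f:
        w = csv.writer(f)
        if new_file:
            w.writerow(["timestamp_utc", "operator",
                        "source", "destination", "sha256"])
        w.writerow([
            datetime.now(timezone.utc).isoformat(),
            operator,
            str(source),
            str(dest),
            file_digest(dest),
        ])
```

Any copy later found on a non-validated server could then be traced, via its SHA-256 digest, back to the log entry that created it.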

THE METHODOLOGY

Companies need to implement a good-quality electronic system around their virtual GxP-related process stream. The following is a succinct, well-organized approach for assessing current status and recommending ways to close quality and compliance gaps.

During the first phase, organize the environment into logical processes. This can be done in many ways to suit the organization; usually, processes are organized by functional area, for example, supply chain distribution. If the firm has multiple product pipelines with different technologies and business practices, both internally and at different partner sites, the process flows can be differentiated as Product X and Product Y.

Thinking of the electronic environment as a stream of logical processes is instrumental to building and maintaining a sound, cost-effective quality system. Drawing boundaries around individual computer systems is ineffective and often causes an organization to lose sight of its larger environment, which usually includes interspersed business practices (i.e., handling) that may not be fully recognized.

Develop flowcharts to document and view the logical processes. Multiple charts for the same process flow should interconnect, and each should reference supporting documents such as SOPs and validation packages. We recommend that the generated flowcharts be approved and managed under a document control program. We have found Microsoft Visio to be a good tool for this purpose.

