Qualifying Network Infrastructure - A Risk-Based Approach

Published in BioPharm International, Volume 2004 Supplement, Issue 1 (15 February 2004), pages 40–45.

Networks are part of the compliance picture. Recent FDA warning letters show the agency considers network monitoring and qualification a necessary part of maintaining the security and integrity of electronic records.

A risk-based approach to the validation and qualification of computerized systems used in regulated environments must focus on systems and components that can have a high impact on product quality and consumer safety (in other words, that represent a high risk). This is generally the case for the systems and components that form the computer network infrastructure. Hence, the role of network monitoring hardware and software in qualifying networks and in maintaining the qualification status of the network infrastructure will become increasingly important.

Risk and Pharmaceutical cGMPs

In August 2002, FDA announced an initiative to merge scientific risk management and an integrated quality systems approach.1

This risk-based approach will help industry, suppliers, and regulatory agencies focus resources on critical issues for public health and consumer safety, while adopting innovations made in pharmaceutical engineering.

Practical guidelines for conducting risk assessments have been published in appendix M3 of the GAMP 4 guide.2 However, it is important to note that risk assessment is very different from risk management. The goal of risk assessment is the analysis of risks viewed from a specific angle, such as risk for the consumer or commercial risk to a business. Risk assessments result in a risk register and the classification of particular risks. The classification typically assigns each risk a severity based on its impact and its probability.
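
To make the classification step concrete, the short Python sketch below scores each entry in a hypothetical risk register by multiplying impact and probability ratings; the 1-to-3 scales, the thresholds, and the example risks are illustrative assumptions, not values prescribed by GAMP.

    # Minimal risk-register sketch: classify each risk by impact x probability.
    # The 1-3 scales, thresholds, and example entries are illustrative assumptions.
    def classify(impact: int, probability: int) -> str:
        """Map impact (1-3) and probability (1-3) to a severity class."""
        score = impact * probability
        if score >= 6:
            return "high"
        if score >= 3:
            return "medium"
        return "low"

    risk_register = [
        {"risk": "Network outage during data acquisition", "impact": 3, "probability": 2},
        {"risk": "Unauthorized access to chromatography data", "impact": 3, "probability": 1},
        {"risk": "Office broadcast traffic on the lab segment", "impact": 2, "probability": 3},
    ]

    for entry in risk_register:
        entry["severity"] = classify(entry["impact"], entry["probability"])
        print(f"{entry['risk']}: {entry['severity']}")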

The task of risk management is to define how identified risks can be controlled, minimized, or compensated. Risk management typically asks the following questions:

1. What risks exist, how do they affect us, and how can we manage them?

  • Risk Triggers: What is the trigger for us to change the risk severity classification? What is the trigger for us to address the risk as a real problem?
  • Risk Mitigation: What are we doing now to avoid or reduce the risk?

2. What will we do if the risk (the potential problem) becomes a real problem?

  • Risk Contingency Plan: What actions will we take if the risk is triggered?

3. The risky situation occurred! How do we deal with it?

  • What actions do (did) we take? What is the impact of the risky situation so far?

For more information, refer to the recently published guidebook on the development of risk management master plans.3

Risk and Electronic Records

In December 2002, the ISPE submitted a whitepaper to FDA on a risk-based approach to computer system validation.4

The paper was based on the concepts emphasized by FDA's new cGMP initiative. This whitepaper appears to have contributed to the new guidance on 21 CFR Part 11.5,6

The paper concluded that internal system information not identified by any predicate rule was likely to be of low impact. Therefore, it is acceptable not to have additional Part 11 controls for these records provided that adequate procedures are in place and the required paper records are kept.

The authors of the whitepaper opposed the existing interpretation that software be considered GxP electronic records subject to Part 11, mainly because industry had already developed, in collaboration with FDA, "approaches for dealing with hardware and software in the GxP environment based on validation of systems, configuration management, change control, and adequate procedures and plans for maintaining the validated state. These approaches have been widely adopted and very successful in meeting GxP requirements. Considering software as GxP electronic records has little practical benefit, as well as discouraging firms from adopting innovative technological solutions."

The publication of the Part 11 guidance and FDA's statements about enforcement discretion for certain requirements led some to conclude that it was acceptable to revert to paper records (for instance, by defining the printed analysis report with the analysis chromatogram as the raw data and subsequently deleting the electronic record from the computer's hard disk). However, whether an electronic record is subject to Part 11 requirements depends on the predicate rules and on whether the established business practices of the firm rely on the electronic version of the record to perform FDA-regulated activities.

In other words, it is not acceptable to delete the electronic record and just keep the paper record for the FDA auditor. FDA clearly states that it may take business practices into account to determine whether an electronic record is used instead of the paper record. It is therefore recommended to determine and document in advance whether the electronic record or the paper record will be used to perform regulated activities. Networked chromatography data systems (CDS), laboratory information management systems (LIMS), and enterprise resource planning (ERP) systems manage critical decision-support data and continue to be a focus of GxP enforcement. The trustworthiness and reliability of the data managed by these systems are highly dependent on effective technical controls that ensure access security, data integrity, and traceability.

Network Qualification

Client/server data systems are proliferating in regulated laboratories and manage large amounts of critical data. It is obvious that the operation and qualification of the network infrastructure must be an integral part of a company's validation strategy. By their nature, networks are heterogeneous and comprise a variety of hardware components employing diverse communication protocols. A change to a network component can affect other components and applications. Also, FDA is taking a closer look at networks and has been citing companies for violations (see www.fdawarningletter.com).

Figure 1: Network topology of a distributed, networked chromatography data system

It has become clear that FDA is aware of the issues that arise when a CDS operates within a network. One FDA warning letter deals with a network program that performs the functions of a laboratory management system. The letter states:

"The network program lacked adequate validation and/or documentation controls. For example:

  • System design documentation has not been maintained or updated throughout the software dating back to 1985 despite significant changes and modifications that have taken place. These include program code, functional/structural design, diagrams, specifications, and text descriptions of other programs that interface with [this program].
  • Validation documentation failed to include complete and updated design documentation, and complete wiring/network diagrams to identify all computers and devices connected to the system.
  • The Quality Control Unit failed to ensure that adequate procedures were put in place to define and control computerized production operations, equipment qualifications, document review and laboratory operations.
  • The software validation documentation failed to adequately define, update, and control significant elements customized to configure the system for specific needs of operation."7

It is clear from letters like this that compliance issues extend not only to the CDS itself but also to the network infrastructure within which it operates. Many IT departments have applications to monitor the configuration, health, and status of the network from a network operations center (NOC). These applications may extend from local workgroups throughout the entire enterprise. Many of these applications provide graphical presentations of the network and are able to store configurations from a point in time so that changes to the network may be monitored and documented as part of an overall network validation plan.
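
A minimal sketch of that snapshot-and-compare idea, in Python, is shown below. It assumes the monitoring application can export the node inventory as JSON files (network_baseline.json and network_current.json are hypothetical names and formats) and simply reports nodes that were added, removed, or changed since the baseline was captured.

    import json

    # Compare a stored network configuration snapshot against the current one and
    # report added, removed, and changed nodes for the change record.
    # The JSON layout {hostname: {"ip": ..., "firmware": ...}} is illustrative only.
    def load_snapshot(path: str) -> dict:
        with open(path) as fh:
            return json.load(fh)

    def diff_snapshots(baseline: dict, current: dict) -> dict:
        added = sorted(set(current) - set(baseline))
        removed = sorted(set(baseline) - set(current))
        changed = sorted(h for h in set(baseline) & set(current)
                         if baseline[h] != current[h])
        return {"added": added, "removed": removed, "changed": changed}

    baseline = load_snapshot("network_baseline.json")  # captured at qualification
    current = load_snapshot("network_current.json")    # captured after a change
    print(json.dumps(diff_snapshots(baseline, current), indent=2))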

Additionally, personnel who are not GxP trained may have access to the network as part of their normal business responsibilities. It is a paradox that the network infrastructure must be compliant, but many components (cabling, utilities, and other devices) do not have validation plans. Networks require frequent changes, additions, and repairs, but can never be taken out of service. A risk assessment, in combination with a sound risk management plan, can help address these problems.

A striking example of a computer network infrastructure failure made headlines in April 2003. A recently installed laboratory computer system in a medical center became overloaded, resulting in a severe backlog of blood-testing samples.8

In such situations, several questions have to be asked:

  • Did formal requirements include specifications for the anticipated load of the system?
  • Was the system installed according to the supplier's specifications?
  • Did it pass the test suite defined for the installation qualification (IQ) and the operational qualification (OQ)?
  • Did performance qualification tests simulate the anticipated load of the networked system in terms of number of samples and number of concurrent users in the context of the hospital's office and laboratory network? (A minimal load-simulation sketch follows this list.)
  • What measures were in place for the prevention and early detection of severe failures and performance bottlenecks?
  • Could the bottleneck have been prevented through the use of a network monitoring system?
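
As a rough illustration of the load question above, the following sketch opens a number of simultaneous TCP connections to a server and records the connection latency. The host name, port, and user count are hypothetical placeholders, and a real performance qualification would exercise actual application transactions under load rather than bare connections.

    import socket
    import time
    from concurrent.futures import ThreadPoolExecutor

    # Crude concurrent-user simulation: open N simultaneous TCP connections and
    # record connect latency. Host, port, and user count are hypothetical.
    HOST, PORT, USERS = "lims.example.local", 8080, 25

    def one_user(_: int) -> float:
        start = time.perf_counter()
        with socket.create_connection((HOST, PORT), timeout=5):
            pass
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=USERS) as pool:
        latencies = list(pool.map(one_user, range(USERS)))

    print(f"mean connect latency: {sum(latencies) / len(latencies):.3f} s, "
          f"max: {max(latencies):.3f} s")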

But where to start and, more importantly, where to stop? How much qualification, documentation, and testing is enough? The following points provide a good guide:

"It is not possible to test every possible branch point in operating, network, and application software used in a typical business system.

However, we can determine the level of quality of a subset of the software very accurately, by thoroughly testing the subset.

If rigorous and consistent development standards and methods are used, it has been observed that the quality level of the subset is representative of the quality level of the entire software system."9

However, unexpected side effects frequently occur as components of a complex environment are changed:

"Validating networked systems not only requires qualifying individual networked components (for example, applications running on each computer), but it also means qualifying authorized access to the networked system and qualifying the data transfer between related computers, as in qualifying the interfaces of components at both sites. The whole system (i.e., including the network) is validated by running typical day-by-day applications under normal and high load conditions and verifying correct functions and performance with a previously specified criteria."10

Proper network administration and operation is an area that is subject to questioning by regulatory organizations. With this in mind, it is important to capture a snapshot of the network during validation. Whenever a change is made, this snapshot can be compared to the current configuration to ensure proper communication among the various nodes on the network. In addition, a retrospective document can be maintained that tracks these changes over time. Network qualification is the next frontier in computer systems validation. The Part 11 guidance helps focus qualification activities by basing them on documented risk assessment. Qualification of network infrastructure should focus on the following tasks:

  • Design qualification (DQ): evidence that the network is suitable for the applications (the design is fit for the intended purpose)
  • Installation qualification (IQ): verification and documentation of the static network topology and inventory (evidence that the implementation matches the design); a minimal verification sketch follows this list
  • Operation qualification (OQ): dynamic topology verification and capacity testing (evidence that the system operates properly according to the vendor specifications)
  • Performance qualification (PQ): maintain the qualification status, ensure continuous performance through ongoing monitoring during use and measurement of performance over time, and minimize the risk of failure during operation.
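
One simple piece of IQ evidence could be produced with a script like the sketch below, which reads a documented device inventory and checks that each listed device is reachable on the network. The inventory file name, its columns, and the ping flags (Linux conventions) are assumptions for illustration.

    import csv
    import subprocess

    # Check each device in the documented IQ inventory for basic reachability.
    # inventory.csv (hostname,ip,location) is a hypothetical file reflecting
    # the approved network design; ping flags follow Linux conventions.
    def reachable(ip: str) -> bool:
        result = subprocess.run(["ping", "-c", "1", "-W", "2", ip],
                                capture_output=True)
        return result.returncode == 0

    with open("inventory.csv", newline="") as fh:
        for row in csv.DictReader(fh):
            status = "PASS" if reachable(row["ip"]) else "FAIL"
            print(f'{status}  {row["hostname"]} ({row["ip"]}), {row["location"]}')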

Tools for Network Qualification

As clients, servers, and instruments are connected to a network, the available bandwidth may become dramatically reduced. This is especially true if the network is not well segregated, leading to unnecessary broadcast network traffic (for example, between an analytical laboratory network and the office network). This decrease in bandwidth may result in decreased performance affecting real-time processes (and could even result in data loss, as the hospital network example cited above shows).
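
Whether a segment is approaching saturation can be estimated from two readings of an interface byte counter, for example values obtained with a network analyzer or via SNMP. The short calculation below shows the arithmetic with made-up numbers for an assumed 100 Mbit/s laboratory segment.

    # Estimate average link utilization from two samples of an interface octet
    # counter. All values below are illustrative, not real measurements.
    LINK_SPEED_BPS = 100_000_000                         # assumed 100 Mbit/s segment
    octets_t0, octets_t1 = 1_250_000_000, 1_925_000_000  # byte counter at t0 and t1
    interval_s = 60                                      # seconds between readings

    bits_transferred = (octets_t1 - octets_t0) * 8
    utilization = bits_transferred / (interval_s * LINK_SPEED_BPS)
    print(f"average utilization over {interval_s}s: {utilization:.1%}")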

Modern analytical equipment and the networks within which they operate may be monitored by network analyzer software along with the clients and servers that control them. This software not only helps operators monitor the health of their networks but also aids in the qualification of the networks through which the instrument data flows. Network monitoring applications are commercially available from companies including Agilent Technologies, Computer Associates, Hewlett-Packard, IBM, and others. While software applications provide excellent monitoring capabilities, network qualification may also require powerful network measurement and test hardware to capture and document network connections, communication activities, available and consumed capacity, and control data. For regulated lab operations, the challenge is to make network measurements meaningful from a systems validation perspective. In the meantime, the first examples of metrology-based network assessment and qualification services designed specifically for laboratory networks have been developed.11

Validation and qualification activities need to consider the network infrastructure. The role of network monitoring hardware and software in qualifying networks and in maintaining the qualification status of the network infrastructure will continue to increase.

The authors thank Bob Giuffre, a senior network data system consultant with Agilent Technologies based in New Jersey, for input to this manuscript and sharing analysis data derived from network monitoring measurements using the Agilent Advisor/Distributed Network Analyzer and Agilent FrameScope 350 in combination with an Agilent Cerity for Pharmaceutical QA/QC networked data system and Agilent 1100 HPLC instruments directly connected to the local area network.

References

1. FDA. Pharmaceutical cGMPs for the 21st century: a risk-based approach. Available at URL: www.fda.gov/oc/guidance/gmp.html.

2. ISPE. The good automated manufacturing practices (GAMP) guide for validation of automated systems in pharmaceutical manufacture, GAMP 4. Tampa: ISPE, 2001.

3. Huber L. Risk management master plan and best practices series. Available at URL: www.labcompliance.com.

4. ISPE. Risk-based approach to 21 CFR Part 11. Available at URL: www.21cfrpart11.com/pages/library/index.htm.

5. Code of Federal Regulations, Title 21, Part 11; electronic records; electronic signatures; final rule. Federal Register 1997; 62(54):13429-13466.

6. FDA. Guidance for industry: Part 11, electronic records; electronic signatures and scope and application. (draft February 2003, final version August 2003). Available at URL: www.fda.gov/cder/guidance/5667fnl.pdf.

7. FDA. Warning letter. File No. 320-01-08. Available at URL: www.fda.gov.

8. Associated Press. L.A. hospital computer system breaks down. 22 Apr 2003.

9. Fiorito T, Quinn T. Qualifying peer-served network infrastructures. Presented at the 37th Drug Information Association Annual Meeting; 2001 Jul 8-12; Denver, CO. Available at URL: www.hollisgroup.com/downloads/DIA%20-%20C3Q%2001.pdf.

10. Huber L. Validation of computerized analytical and networked systems. Boca Raton (FL): Interpharm Press; 2002.

11. Agilent Technologies. Qualification services for laboratory networks. Agilent publication 5988-9656EN. Palo Alto (CA): Agilent, 2003.