
Automated systems and regulated environments

A LabAutopedia invited article


Authored by: Robert D. McDowall, McDowall Consulting


Laboratories in the pharmaceutical and other healthcare industries that operate in a regulated environment need to demonstrate that their equipment, instruments and computerised systems are fit for their intended purpose. This is a stated requirement of all GMP, GLP and GCP regulations (collectively known as GXP). However, regulations are very good at saying what is required but less forthcoming as to how to achieve it. Therefore there is a degree of interpretation by individuals, regulated organisations, regulatory inspectors, industry professional bodies and consultants. The purpose of this section is to provide a practical approach to the qualification of equipment and instruments and the validation of computerised systems used in a regulated environment.

In this section we will look at:
• Qualification, validation and method validation
• Terms and definitions
• Analytical Instrument Qualification (AIQ)
• Computerised System Validation (CSV)
• Integration of AIQ and CSV


Qualification, Validation and Method Validation

It is important to recognise that qualification and validation are separate activities from method validation. Qualification and validation need to be carried out before methods that run on a system can be developed and validated, as we shall now discuss.

AIQ, CSV and AMV Interrelationships

The relationships between analytical instrument qualification, computerised system validation and analytical method validation are shown in Figure 1. The first three of the six blocks are the responsibility of the equipment and software vendors; the remaining three are the responsibility of the laboratory or end user. The best way to describe the relationships is to use an analogy of building a house.

Figure 1: Relationship Between Instrument Qualification, Computer System Validation and Analytical Method Validation
When building a house you must have a firm foundation, otherwise the structure above will collapse. The foundations in our analogy ensure that the analytical instrument or system has been designed, built and maintained correctly; this is where the instrument vendor comes in first. The instrument and any associated software must be designed correctly, as shown in the first building block, and the vendor is responsible for constructing and testing the instrument, including the software elements, as shown in the second block of the foundation. Maintaining the equipment and the base application software is also the responsibility of the vendor.
Implied, but not shown, is the role of the user, who must select the right instrument and software for the right job. However, the third part of the foundation is often missing: the laboratory must have selected the correct instrument. This means that they have a specification for the functions that the instrument will perform, along with the software requirements, including any issues of compliance with health authority regulations such as GMP and 21 CFR 11.
Once the foundations are successfully completed, the ground floor is now built. Here the responsibility turns over to the laboratory for instrument qualification and computerised system validation.
The instrument is qualified against the range of operating parameters defined in the user requirements specification (URS). This can mean, for example, verifying that the instrument's pipetting works correctly over the specified operating range. This could be performed using an analytical balance calibrated to national standards to confirm that the volumes dispensed are correct.
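To make the gravimetric check concrete, the short sketch below converts balance readings to dispensed volumes and compares them with the nominal values. The volumes, water density and 2% tolerance are illustrative assumptions for this article, not limits taken from any standard:

    # Minimal sketch of a gravimetric check of a liquid handler's pipetting
    # accuracy. All numbers are illustrative assumptions, not official limits.

    WATER_DENSITY_G_PER_ML = 0.9982  # approximate density of water at 20 degrees C

    def dispensed_volume_ml(mass_g, density=WATER_DENSITY_G_PER_ML):
        """Convert the mass read from a calibrated balance to a volume."""
        return mass_g / density

    def check_volume(nominal_ml, measured_mass_g, tolerance_pct=2.0):
        """Return (measured volume, pass/fail) against a percentage tolerance."""
        volume = dispensed_volume_ml(measured_mass_g)
        error_pct = abs(volume - nominal_ml) / nominal_ml * 100
        return volume, error_pct <= tolerance_pct

    # Example: verify three points across the specified operating range.
    for nominal, mass in [(0.010, 0.0101), (0.100, 0.0996), (1.000, 0.9985)]:
        volume, ok = check_volume(nominal, mass)
        print(f"nominal {nominal:.3f} mL, measured {volume:.4f} mL, "
              f"{'PASS' if ok else 'FAIL'}")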
Once the instrument is qualified and the software is validated, each analytical method that uses the system needs to be validated.

Apply Validated Methods using Qualified Instrumentation

Once the system has been qualified and you have the right system able to perform as expected, it can be released for use in a regulated environment and used to develop, validate and apply methods. The analyst can then operate on a firm foundation, knowing that the right system performs as expected and that major variables are removed from an analysis.

Distinguishing between AIQ and Method Validation

Qualification of analytical instruments cannot be achieved through method validation. Instrument qualification provides the foundation to develop, validate and run analytical methods within the operating range measured. If a method is developed using parameters outside of this range, then the analytical instrument needs to be requalified before method validation can proceed.
It is important to recognise the difference between equipment qualification and method validation. In some analytical scientists' minds these are the same, and therefore by validating a method the equipment is considered qualified. This is wrong.

Figure 2: Relationship between Instrument Qualification and Analytical Method Validation


It should be realised that instrument qualification assesses the performance of modules or the system over the complete operating range of the instrument that the laboratory anticipates using. For instance, if an HPLC pump is qualified over the range of 1.0 to 2.0 mL/min, then any method working within this range can be run without any issues. This is shown in Figure 2, where methods 1 and 2 operate within the qualified range and are acceptable from a regulatory perspective.
However, if the analytical scientist now wants to use a flow rate of 2.5 mL/min that is outside of the qualified range, they will be using an unqualified instrument and the analysis is jeopardised from a regulatory perspective. This is shown by method 3 in Figure 2 where the upper end of the analysis is outside of the qualified range of the instrument.
The key to deciding which parameters to qualify, and over what operating range, is to specify your requirements before purchasing a system. Where a method exceeds the qualified operating range for a parameter, the instrument needs to be requalified and, where necessary, the user requirements specification updated.
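The comparison between a method's parameters and the qualified operating ranges is simple enough to automate; the following sketch, using the flow-rate example above (the method names and ranges are illustrative), flags any method that would require requalification:

    # Sketch: flag method parameters that fall outside the qualified ranges.
    # Ranges and methods are illustrative, echoing the flow-rate example above.

    qualified_ranges = {"flow_rate_ml_min": (1.0, 2.0)}  # from the URS / OQ

    methods = {
        "method_1": {"flow_rate_ml_min": 1.2},
        "method_2": {"flow_rate_ml_min": 1.8},
        "method_3": {"flow_rate_ml_min": 2.5},  # outside the qualified range
    }

    def out_of_range(method_params, ranges):
        """Return the parameters of a method that exceed their qualified range."""
        problems = []
        for name, value in method_params.items():
            low, high = ranges[name]
            if not low <= value <= high:
                problems.append((name, value, (low, high)))
        return problems

    for method, params in methods.items():
        issues = out_of_range(params, qualified_ranges)
        status = "requalification needed" if issues else "within qualified range"
        print(method, "->", status, issues)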

Impact of AIQ on Method Transfer

Qualification of instrumentation is important for transferring methods between laboratories. Even with different models, different vendors' instruments or even different software, if the systems are qualified over the same operating range there is a higher probability that a given method will transfer with fewer problems than if the systems were not qualified.

Risk Based Approaches to Qualification and Validation

Since the FDA published their GMPs for the 21st Century in 2002 [1], risk assessment and risk management have been at the core of many regulated activities in the pharmaceutical industry, and qualification and validation are no exception, as we shall see in this section.

Abbreviations and Terminology used in AIQ and CSV

In this section the abbreviations and terminology used will be presented and the key ones discussed in more detail. 

Abbreviations

Abbreviation   Definition
AIQ            Analytical Instrument Qualification
CSV            Computerised System Validation
DQ             Design Qualification
GAMP           Good Automated Manufacturing Practice guidelines
GXP            Good X Practice, where X can be Laboratory, Manufacturing or Clinical
IQ             Installation Qualification
OQ             Operational Qualification
PQ             Performance Qualification

Terminology for AIQ and CSV: Same Terms – Different Meanings

Regulatory Agencies, beginning with the FDA, have adopted an approach to qualification of process equipment used in pharmaceutical manufacturing [2] which has become known as the 4Q’s model. This comprises four stages which are:
DQ: Design Qualification,
IQ: Installation Qualification,
OQ: Operational Qualification and
PQ: Performance Qualification.

The Pharmaceutical Analytical Science Group (PASG) in the UK produced a position paper on laboratory equipment qualification in 1995 [3], in which they proposed the following definitions of the 4Qs, with the author's annotations in parentheses:
Design Qualification: defining the quality parameters required of the equipment and manufacturer. (This is a laboratory responsibility.)
Installation Qualification: assurance that the intended equipment is received as designed and specified. (This includes the installation of all items and their integration.)
Operational Qualification: confirmation that the equipment functions as specified and operates correctly. (Instrument function versus laboratory specification.)
Performance Qualification: confirmation that the equipment consistently continues to perform as required.
These definitions are consistent with those recently published by the Eurachem-UK Instrumentation Working Group and Burgess et al [4][5] but are inconsistent with the AAPS white paper on AIQ [6] and the United States Pharmacopoeia (USP) general chapter 1058 [7] that we will discuss shortly.

Figure 3: AIQ and CSV use the same Terms but with Different Meanings

There is, however, one difficulty with this nomenclature. A modified form of the 4Qs model is used for validation of computerised systems. Here the same terms are used as in instrument qualification; unfortunately, they have different meanings [8][9].
• User Requirements Specification (URS): the equivalent of the design qualification.
• Installation Qualification: documented verification that all key aspects of hardware installation adhere to appropriate codes and approved design intentions, and that the recommendations of the manufacturer have been suitably considered. (In practice this means ensuring that the system is installed as specified and that sufficient documented evidence exists to demonstrate this.)
• Operational Qualification: documented verification that the equipment or system operates as intended throughout required or anticipated operating ranges. (In practice this means that it works as specified and that sufficient documented evidence exists to demonstrate this.)
• Performance Qualification: documented verification that the system performs as intended throughout all anticipated operating ranges. (In practice, ensuring that the system in its normal operating environment produces an acceptable quality product and that sufficient documented evidence exists to demonstrate this.)

Thus in computerised system validation there is an additional stage before the system can be released for operational use, and there is no ongoing assessment of system performance such as is required in equipment qualification; however, this is the function of the periodic review carried out on computerised systems.

These differences in terminology can be very confusing for those scientists involved in both instrument qualification and computerised system validation. They can also lead to problems when using vendor material: in which context is a specific term being used? This confusion was identified by the FDA in the General Principles of Software Validation [10], where section 3.1.3 states:
For many years, both FDA and regulated industry have attempted to understand and define software validation within the context of process validation terminology. For example, industry documents and other FDA validation guidance sometimes describe user site software validation in terms of installation qualification (IQ), operational qualification (OQ) and performance qualification (PQ). …..

While IQ/OQ/PQ terminology has served its purpose well and is one of many legitimate ways to organize software validation tasks at the user site, this terminology may not be well understood among many software professionals, and it is not used elsewhere in this document. However, both FDA personnel and device manufacturers need to be aware of these differences in terminology as they ask for and provide information regarding software validation.

It is interesting to note the use of the past tense in describing IQ / OQ / PQ terminology at the beginning of the second paragraph. Accordingly, throughout the rest of the guidance document there is no mention of these terms; they are replaced by terms such as user or site testing so that there is no misunderstanding of the context.

Components of a Computerised System

The key components of a computerised system are shown in Figure 4. It is important to realise early in your project that if you are validating a computerised system, you don’t just concentrate on the computer hardware and software. Validation encompasses more, as we’ll discuss now.

Figure 4: Components of a Computerised System

The elements comprising a computerised system are a computer system and a controlled function, working within an operating environment.
The computer system consists of:
• Hardware: the computer platform that the computerised system application software runs on, such as a workstation or a server plus clients, together with any network components such as hubs, routers, cables, switches and bridges. The system may run on a specific segment or on the whole of a network, and may have peripheral devices such as printers and plotters with the associated connecting cables.
• Software: This comprises several layers such as:
Operating systems of the workstation clients and networked server
Network operating system in the switches and routers of the network.
General business applications such as Word and Excel
Computerised system application software and the associated utility software such as a database or reporting language
The controlled function comprises:
• Equipment or Instrumentation: the apparatus linked to the computerised system, e.g. a robotic arm, and controlled by it. Ideally the equipment connected to the data system should be qualified as part of the overall validation of the software; otherwise how do you know that you are generating quality results? This is where the bulk of an automated system will be found, e.g. the robotic arm and its associated peripherals.
• Written Procedures: Trained staff should follow written SOPs as well as the system manuals to operate the equipment and the data system software correctly.
Validation is not just a matter of testing and calibrating the computerised system; there is a greater range of items to consider within the scope of validation. You could be subject to regulatory action if you only qualify your instrument and do not validate the computerised system software.
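One practical way of keeping this wider scope visible during a project is to hold every element of Figure 4 in a single system inventory that the validation plan must cover. The sketch below illustrates this with hypothetical entries:

    # Sketch of a system inventory covering all elements of a computerised
    # system, not just the application software. Entries are hypothetical.

    computerised_system = {
        "computer_hardware": ["workstation", "server", "printer",
                              "network switch segment"],
        "software_layers": ["workstation operating system",
                            "network operating system",
                            "general business applications",
                            "application software and database"],
        "controlled_function": ["robotic arm", "liquid handler",
                                "associated peripherals"],
        "written_procedures": ["operating SOP", "backup and recovery SOP",
                               "change control SOP"],
        "trained_staff": ["analysts", "system administrator"],
    }

    # A validation plan should touch every key, not only the software layers.
    for element, items in computerised_system.items():
        print(element, "->", ", ".join(items))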

Analytical Instrument Qualification

AIQ and Equipment Qualification (EQ), the term used previously, have been discussed since the early 1990s. We will look at the key issues that need to be considered when qualifying instruments. Note that most of the emphasis is placed on the instrument and little, if any, on the computerised system. If an instrument has software installed on a read-only memory chip, then the approach typically is to qualify the software as an integral part of the overall operation of the system.

The terminology used is:
• Equipment is used for apparatus that has a basic function but no measurement or calibration ability, e.g. a nitrogen evaporator or vortex mixer.
• Instrument is used for apparatus that can be used for measurement or calibration, or that has a separate data system for its operation.
We will focus on instrument qualification as it is the most complex and relevant to automated systems.

Modular and Holistic Qualification

Furman et al, all FDA employees, when discussing the validation of computerised liquid chromatographic systems, presented the concept of modular and holistic qualification [11]. The modular approach is the qualification of the individual components of a system such as pump, autosampler, column heater and detector of an HPLC. The authors make the point that:

“calibration of each module may be useful for trouble shooting purposes, such tests alone cannot guarantee the accuracy and precision of analytical results”.

Therefore the concept of holistic validation was introduced, where the whole chromatographic system was also qualified to evaluate its performance. The reader may question why such an approach is needed. A simple answer is that it is necessary: Burgess et al [5] cite an HPLC system where the individual modules were just within acceptable limits; however, when a holistic test of the whole system was carried out, the system failed to meet the pre-defined acceptance criteria.

The concept of holistic qualification is important as some laboratories operate with a policy of modular equipment purchase. Here they select components with the best or optimum performance from any manufacturer. Furthermore, to ensure optimum throughput, some of these laboratories may swap components when they malfunction. Thus over time the composition of a system may change. Therefore to assure themselves and any regulatory bodies that the system continues to function correctly, holistic qualification of the system is vital [11].

Therefore for any automated system, the individual modules will need to be qualified individually to show that they are within acceptable limits, and then the whole system qualified against pre-defined acceptance criteria.

AAPS Guide on Instrument Qualification and USP <1058>

The American Association of Pharmaceutical Scientists (AAPS) wrote a white paper entitled “Qualification of analytical instruments for use in the pharmaceutical industry: a scientific approach” [6]. This was the outcome of a joint FDA-AAPS conference in 2003 entitled “instrument validation”; however, the conference soon realised that the correct term to use was instrument qualification. The AAPS white paper classified instruments into three groups:
• Group A Instruments
• Group B Instruments
• Group C Instruments
Equipment comprises Group A, and risk and complexity rise as one moves up to Group C instruments.

USP <1058> Analytical Instrument Qualification

The AAPS white paper formed the basis for the new general chapter <1058> of the United States Pharmacopoeia (USP) on Analytical Instrument Qualification (AIQ), which became effective in August 2008 [7]. It is worth remembering at this point that USP general chapters numbered between <1> and <999> are requirements and those between <1000> and <1999> are informational in nature. However, <1058> will implicitly refer to USP requirements pertinent to specific techniques, e.g. <21> for thermometers, <41> for weights and balances or <621> for chromatography.

Qualification Process

The AAPS and USP process for instrument qualification follows the 4Qs model for the specification, installation checking and monitoring of ongoing instrument performance:
• Design Qualification (DQ)
• Installation Qualification (IQ)
• Operational Qualification (OQ)
• Performance Qualification (PQ)

One of the problems with <1058> is the reliance on the manufacturer for the specification for the instrument in the DQ phase. While this is fine for the design, build and test phases within the factory, it ignores the responsibility of the user to define their requirements and compare them with the various instruments on the market.

The message in the DQ phase is for the user to ask what they want the instrument or system to do and to document it. Ignoring YOUR specification is a high business risk; if one looks outside the pharmaceutical industry to other standards, ISO 17025 notes that laboratory and manufacturer's specifications may be different. Hence the need to define your instrument and system requirements; otherwise the purchased system may not meet your needs.

Classification of Instruments

All instruments and systems are classified into one of three groups: A, B or C. Table 1 shows the criteria for classification and how each group is qualified, together with some typical examples in each group. There is a built-in risk assessment: Group A represents the lowest risk, requiring the smallest amount of work, and Group C the greatest risk and hence the most work to control.

Table 1: USP <1058> Instrument Groups and Qualification Approach

Group A
Classification criteria: standard equipment with no measurement capability or requirement for calibration.
Qualification approach:
• Specification: manufacturer's
• Conformance with requirements verified and documented by observation of operation
Examples: magnetic stirrers, vortex mixers, centrifuges, sonic baths.

Group B
Classification criteria: standard instruments providing measured values or controlling physical parameters.
Qualification approach:
• User requirements typically within unit functions
• Require calibration
• Conformance to requirements via SOPs and IQ / OQ
Examples: balances, pH meters, thermometers, pumps, ovens, water baths.

Group C
Classification criteria: complex instruments and computerised systems.
Qualification approach:
• Full qualification process required
• Specific function and performance tests
Examples: atomic absorption (AA), dissolution apparatus, HPLC, spectrometers.
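The grouping decision in Table 1 can be reduced, as a first approximation, to two questions: does the item measure or need calibration, and is it complex or computerised? The sketch below captures that simplified logic; the classification of any real instrument should still be documented in a risk assessment:

    # Sketch of the USP <1058> grouping logic from Table 1, reduced to two
    # questions. Real classification needs a documented risk assessment.

    def usp1058_group(measures_or_calibrated, complex_or_computerised):
        """Classify an item as Group A, B or C per the Table 1 criteria."""
        if complex_or_computerised:
            return "C"          # complex instruments and computerised systems
        if measures_or_calibrated:
            return "B"          # standard instruments needing calibration
        return "A"              # standard equipment, observation only

    examples = [
        ("vortex mixer", False, False),         # expect A
        ("pH meter", True, False),              # expect B
        ("HPLC with data system", True, True),  # expect C
    ]
    for name, measures, complex_ in examples:
        print(name, "-> Group", usp1058_group(measures, complex_))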

USP <1058> has an Inadequate Approach to Software Validation

Virtually all instruments have software, varying from firmware for basic instrument operation to a separate data system for instrument control, data acquisition, analysis and reporting. For Group B instruments the firmware is implicitly tested as part of the IQ and OQ phases of the qualification; this is in contrast with the latest GAMP guide [12], which has removed Category 2 software, the category that most closely equates to Group B instruments.

Where there is a separate data system, this needs to be validated at the same time as the instrument is qualified. However, the approach taken by <1058> for CSV is too simplistic and naïve: everything is dumped on the manufacturer, who writes the software, validates it and provides a summary to the user, who then validates it holistically in their system. Whilst this may be acceptable for small firmware instruments, it is totally inadequate and deficient for complex systems as it omits the following:
• Configuration of security and access control
• Configuration of software to the business process, if appropriate
• Definition and testing 21 CFR 11 functions as used in your laboratory
• Custom reports
• Macros defined and written by the users
For Group C instruments with data systems you will leave yourself exposed if you follow the computer validation approach in USP <1058>.

Problems with USP <1058>

Consider the following issues that are not fully covered by the AAPS guide and that have now been incorporated into a formal regulatory text:
• The general chapter uses the term ‘analytical instrument qualification’ (AIQ) to describe the process of ensuring that an instrument is suitable for its intended application, but the instrument is only a part of the whole computerised system.
• The three instrument groups are described along with suggested testing approaches to be conducted for each. However, in the author's view, there is not sufficient definition of the criteria for placing instruments in particular groups. For example, a sonic bath may be classified in Group A if it is only used for dissolving a compound in a volumetric flask (visual inspection determines whether the task has been completed or not). However, if a specific level of energy is required for a task, then visual inspection is inadequate, and only after measuring the energy levels throughout the bath can a laboratory locate the specific area where a flask should be placed, which raises the unit into Group B. Moreover, the sonic bath may be part of an automated system, which raises the whole system into Group C.
• The guide covers the initial qualification activities for analytical instruments but there is very little on the validation of the software that controls the instrument. There is little guidance on operational, maintenance and control activities following implementation such as access control, change control, configuration management, and data backup.
• Group C instruments cover a wide spectrum of complexity and risk, and may have very diverse requirements. However, no specific allowance is made within the approach for custom-developed applications, such as macros written as shortcuts when using some systems, or for software configured for laboratory use.

However, the major problem with the approach outlined in <1058> begins in the DQ section where there is the statement [7]:

Design qualification (DQ) is the documented collection of activities that define the functional and operational specifications of the instrument, based on the intended purpose. Design qualification (DQ) is most suitably performed by the instrument developer or manufacturer.

This might appear to be a good idea, pushing work back onto the instrument vendor, but it is very naïve: if the laboratory does not define its own specification, how do you know that the system you buy is capable of performing the tasks required? This is poor regulatory advice as it leaves the laboratory without predefined specifications [5] to qualify against. It is poor business advice: how can you protect your investment? It also contradicts the statement in the OQ section:

Operational qualification (OQ) is the documented collection of activities necessary to demonstrate that an instrument will function according to its operational specification in the selected environment.

If there is no advice to write an operational specification in the DQ section, how can a user test against one? In contrast, ISO 17025 has a much better and more pragmatic approach to instrument qualification in section 5.5.2, where it states [13]:

Equipment and its software used for testing, calibrating and sampling shall be capable of achieving the accuracy required and shall comply with specifications relevant to the tests and/or calibrations concerned.
Before being placed into service, equipment shall be calibrated or checked to establish that it meets the laboratory's ‡ specification requirements and complies with relevant standard specifications.
‡ This is usually not the same as the manufacturer’s specification.

Note the important footnote to this section, which is not part of the standard per se but should be remembered by everyone who specifies instrumentation in laboratories. Therein lies the major deficiency of USP <1058>: the lack of any explicit guidance that the laboratory must specify its own requirements and not use a manufacturer's specification. This appears to be in contravention of the GMP requirements in 211.160(b) – an interesting situation of a pharmacopoeia taking one approach and the regulator a different one [14].

Validation of Computerised Systems

When the original GMP and GLP regulations were written in the mid-1970s, there were relatively few computers used in laboratories. This changed from the 1980s onwards with the availability of Apple and IBM PCs, and the regulators initially struggled to keep up with events. In 1983, the FDA issued a Compliance Policy Guide (CPG) to clarify the interpretation for their inspectors; this classified computer hardware as “equipment” and the software applications as “records” [15].

To aid interpretation of the regulations, professional bodies associated with the pharmaceutical industry have developed their own guides for computerised system validation. These include the Parenteral Drug Association [8] and the GAMP Forum. The latter group produces the Good Automated Manufacturing Practice (GAMP) guide, currently in its fifth version, published in 2008 [12]. From its inception the GAMP guide was a general framework for the validation of process systems used in manufacturing, and computerised systems have been added in later versions of the guide. However, the guidance has focussed on manufacturing in general and there is no specific guidance for the laboratory. Since 2001 the GAMP Forum has published a series of Good Practice Guides on specific subjects including, in 2005, one for the Validation of Laboratory Computerised Systems [16]. The problem is that the approaches of the GAMP Guide and the laboratory GPG are not the same, which results in confusion over which one to take.

GAMP Good Practice Guide for Laboratory Computerised Systems

Published in 2005, the stated aim of the Good Practice Guide (GPG) is to develop a rational approach to computerised system validation in the laboratory and to provide guidance on strategic and tactical issues in the area [17]. However, this is an over-complex approach to the validation of all laboratory systems that does not consider qualification of equipment, and it will not be considered further in this section. For a further critique of this document please read reference [17].

GAMP 5 Software Categories

Appendix M4 of GAMP 5 contains a classification of software categories [12] that is presented in Table 2. The majority of software used in laboratories is either Category 3 or 4; the vendor is therefore responsible for support and the users licence the application. However, it is also important to understand that many laboratory systems can contain more than one category of software. For an instrument controlled by a workstation there will be a minimum of two categories, and possibly three, present:
• Category 1: operating system software, such as Windows or Unix, for the workstation
• Category 2: firmware for operation of the instrument and user-defined procedures (discontinued in GAMP 5 [12])
• Category 3 or 4: software used to control the instrument, acquire and process data and report the final results.
• Category 5: software in this category may exist on the same workstation, as some applications allow users to write custom macros to perform specific functions or manipulate data; these macros need to be specified, tested and controlled as part of an overall validation of the software.
This breakdown can help focus the validation effort where it is most needed.
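For a single workstation-controlled instrument, this layered view can be written down explicitly so that the validation effort can be apportioned by category. The component names in the sketch below are illustrative:

    # Sketch: mapping the software layers of one hypothetical instrument
    # workstation to GAMP 5 categories, so the validation effort can be
    # apportioned. The component names are illustrative.

    gamp_categories = {
        "Windows operating system":              1,  # infrastructure software
        "instrument control firmware":           3,  # was Category 2 in GAMP 4
        "data system application (configured)":  4,  # configured product
        "user-written processing macros":        5,  # custom applications
    }

    # Higher categories carry higher risk and need more validation effort.
    for component, category in sorted(gamp_categories.items(),
                                      key=lambda item: item[1]):
        print(f"Category {category}: {component}")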


Table 2: Comparison of Software Categories in GAMP 4 and GAMP 5 [18][12]

Category 1
GAMP 4 – Operating Systems: only operating systems included.
GAMP 5 – Infrastructure Software: expanded greatly to cover Established or Commercially Available Layered Software and Infrastructure Software Tools.

Category 2
GAMP 4 – Firmware: configurable and non-configurable firmware only; custom firmware is Category 5.
GAMP 5 – Discontinued: firmware is now treated as software in one of Categories 3, 4 or 5. See text for discussion of the clash with USP <1058>.

Category 3
GAMP 4 – Standard Software Packages: commercially available standard software packages; configuration is limited to establishing the run-time environment.
GAMP 5 – Non-Configured Products: off-the-shelf products used for business purposes, including systems that cannot be configured to conform to business processes and configurable systems for which only the default configuration is used.

Category 4
GAMP 4 – Configurable Software Packages: provide standard interfaces and functions that enable configuration of user-specific business or manufacturing processes.
GAMP 5 – Configured Products: provide standard interfaces and functions that enable configuration of the application to meet user-specific business processes. Configuration using a vendor-supplied language should be handled as custom components (Category 5).

Category 5
GAMP 4 – Custom (Bespoke) Software: systems developed to meet the specific needs of the user company.
GAMP 5 – Custom Applications: systems or subsystems developed to meet the specific needs of the regulated company. The risk inherent in custom software is high.

Multiple Life Cycle Models Available

At first sight the new GAMP version [12] appears to contain a far more flexible approach to computerised system validation than earlier versions of the guide. For example, Category 3, 4 and 5 software each now has its own life cycle model and anticipated documentation – a triumph for reason over the traditional GAMP V model. This latter model, used from the start of GAMP for production and process systems, was typically used as THE life cycle model for any system validation. In many organisations, all process and computerised systems were shoehorned into it regardless of logic or reason that said otherwise. I was always a critic of this model for computer applications where a vendor was responsible for the majority of the life cycle and the company was responsible for configuration and implementation. However, reading GAMP 5, the traditional V model appears to have gone missing in action, which is a shame as it is very pertinent for production equipment and systems, as I have said above. It has been replaced by a number of different V models, depending on whether the software being validated is a Category 3, Category 4, Category 4 with Category 5 modules, or a pure Category 5 system. This provides a degree of flexibility and focus that has been lacking in earlier versions of this document.


Changes to Software Categories and the Laboratory Impact

In the laboratory there is one section of the new GAMP guide that impacts us specifically: Appendix M4, covering the classification of software [12]. The major changes in software classification are shown in Table 2 and I'll summarise them below.
• Terminology has changed to help define the categories; the guide has moved from defining software as a “package” in version 4 to a “product” in version 5. I believe this is to emphasise the commercial nature of these two classes, which constitute the bulk of the software used in laboratories today.
• These categories are intended as a continuum rather than discrete silos, so some interpretation may be necessary; the category into which a system is placed will need to be documented in your system risk assessments or validation plans.
• As you can see, in the old version of the guide there were five categories of software; this has been reduced to four in the latest version. The category that has gone missing is Category 2 (firmware). Laboratory systems containing firmware, such as balances, pH meters and dispenser-diluters, could be classified under GAMP 4 as equipment and qualified to demonstrate their intended purpose rather than validated: a simpler and quicker process. The argument for discontinuation was that firmware can vary from simple to complex and can therefore be dealt with under the other software categories.

However, we now run into a slight problem, as GAMP 5 conflicts with USP <1058> on Analytical Instrument Qualification (AIQ). Many of the old firmware instruments listed above are Group B instruments under USP <1058>, which are qualified, not validated. Which takes precedence – a USP general chapter or the GAMP guide? In the laboratory you'll want to ignore GAMP 5, retain this category of software for simple laboratory equipment and instruments, and be congruent with USP <1058>.

• Category 1 has undergone a radical change from operating systems to infrastructure software which is now broken down into two main areas:

Established or Commercially Available Layered Software: software in this category still includes the old Category 1 operating systems, but the category has been expanded to encompass databases, programming languages, middleware, ladder logic interpreters, statistical programming tools and spreadsheet packages. The key issue is that these are the base products: applications are developed to run under the control of this kind of software. Before you run off thinking that Excel templates and macros do not need to be validated, think again: the Guide notes that “applications developed using these packages” are excluded from Category 1 and are Category 4 or 5 respectively.

Infrastructure Software Tools: this includes tools such as network monitoring software, anti-virus and configuration management tools, and other network software. In essence, all these applications are qualified on installation: i.e. what has been installed and does it work?

• Category 3 software has been renamed from Standard Software to Non-Configured Product to sharpen the difference between it and Category 4. This helps practitioners in the field interpret software: in fact, you could now have the same software in Category 3 or Category 4, depending on whether the default settings are used or the application is configured.

Integrated Approach to System Qualification and Validation

As the majority of instruments and automated systems used in any laboratory are computerised, this is where the problems begin. Qualification of analytical instrumentation is seen as a separate activity from validation of the computerised system that controls the instrument, acquires data, manipulates the data and reports the results. It is the author's contention that these two tasks are, in fact, one, and that an integrated approach should be taken to the combined problem of qualifying the instrument and validating the controlling computer software.

AIQ and CSV – Where are we Now?

From the laboratory perspective, there is no single source of definitive guidance for a combined approach to AIQ and CSV. There are consequently a number of problems that arise from this:
• We have no consistent terminology and definitions across the two disciplines, especially where the same terms can mean different activities.
• There is contradictory advice for some aspects of computer validation depending on whether GAMP version 5 [12] or the laboratory GPG [16] is consulted. Neither source discusses qualification of analytical instruments used in the laboratory.
• USP <1058> focuses mainly on the qualification of instruments but omits the specification by the laboratory [7]. Computer validation of the data system controlling the instrument is poorly considered.

Therefore an integrated approach is urgently required, especially as the two major publications covering these areas cannot provide one; the outline presented here is therefore based on two articles by the author [19][20].

Integrated Qualification and Validation Terminology

The integrated approach begins with common terminology, and the key word to define is “qualification”. Below is a definition modified by the author from that in ICH Q7A [21] and discussed earlier in this section.

Qualification: The action of installing and demonstrating that a laboratory instrument or system is properly installed, works correctly and actually leads to the expected results.

Implicit in this definition is that there will be appropriate documentation for the initial installation, component integration and system acceptance during the work. Additionally, if you are to check that the system produces the expected results, then you need to document those results in an equipment or system specification.

An Integrated Approach to AIQ and CSV

Rather than take separate approaches to AIQ and CSV, the natural thing is to combine the two, because you cannot qualify the instrument without the computer and you cannot validate the computer without the instrument. An integrated approach is therefore both logical and practical; the overall approach is shown in Figure 5.

Combined System Specification for Instrument and Software

Defining the intended use of both the instrument and the software is vital in any AIQ and CSV approach; as the two elements are combined in computerised laboratory systems, we need to ensure that an appropriate system specification is available. If purchasing a new system, you may need one specification to select the system and then, after selection and familiarisation with the instrument and the software, update it for the specific system purchased. It is against this system specification that the user acceptance testing will be carried out; therefore it needs to be as specific as possible.

Defining the Qualification Activities

If qualification is the overall process, it can be further broken down into a number of phases. Depending on the nature of the system and the equipment, not all need be performed, and some work can be undertaken by the vendor and some by the laboratory. To avoid confusion we need new terms that are more easily understandable than the 4Qs model of USP <1058> [7].

• Specify Requirements: The action of defining the laboratory requirements for the instrument, overall system, computer application and IT support as needed.
• Component Installation and Component Integration: The action of installing the items and components of instrumentation and computer hardware and software followed by their integration into a system.
• Vendor Commissioning: The action of demonstrating that the integrated system works according to vendor specifications.
• Defining User Ways of Working: the action of ensuring that the instrument and system reflect the laboratory's ways of working. This consists of four activities:
Instrument Qualification
Instrument Calibration
Software / Firmware Configuration
Software Customisation
• User Acceptance: the action of demonstrating that the overall system works according to the user specification documents (i.e. intended purpose) and meets current regulatory requirements.
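These phases and their typical division of responsibility, as shown in Figure 5, can be summarised as a simple ordered plan. The responsibility assignments below follow the discussion in this section and may differ for a particular purchase:

    # Sketch of the integrated qualification/validation phases of Figure 5
    # with their typical owners. Responsibilities may shift per contract.

    phases = [
        ("Specify requirements",                   "laboratory"),
        ("Component installation and integration", "vendor"),
        ("Vendor commissioning",                   "vendor"),
        ("Defining user ways of working",          "laboratory"),
        ("User acceptance",                        "laboratory"),
        ("Release for operational use",            "laboratory"),
    ]

    for step, (phase, owner) in enumerate(phases, start=1):
        print(f"{step}. {phase} ({owner})")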

Figure 5: Integrated Approach to AIQ and CSV

Applying an Integrated Approach to AIQ and CSV

This section discusses the activities shown in Figure 5 in more detail.

Specify the System

The first part of the integrated AIQ / CSV journey is writing the user requirements for both the instrument and the software to document the system's intended use. Therefore define the following items and document them in the specification document for the system; note that I am deliberately not using the word “instrument” for many of the items below:
• Intended Use of the Application Software: data acquisition and control, calculations performed and how results will be reported.
• 21 CFR 11 Requirements: such as backup and recovery of the library, sample spectra, application software and the operating system. Will electronic signatures be used by the operator and the reviewer of the results?
• Security and Access Control: who will be using the system? The various users will need appropriate access controls; the main users of the system will need more basic user privileges compared with the system manager or a user who can import spectra into the library.
• Instrument Operating Parameters: these will depend on how you will use the system.
• Interfacing Requirements: will the system be standalone or connected to the company network? Will data and/or results be transferred to and from other applications?
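A fragment of such a system specification, expressed as structured data so that each requirement can later be traced to a test, might look like the sketch below. The requirement identifiers, wording and priorities are hypothetical:

    # Sketch of a machine-readable URS fragment. Identifiers, wording and
    # priorities are hypothetical; each requirement should be testable.

    urs = [
        {"id": "URS-001", "area": "application",
         "text": "Acquire data from the detector and report results",
         "priority": "mandatory"},
        {"id": "URS-002", "area": "21 CFR 11",
         "text": "Electronic signatures for operator and reviewer",
         "priority": "mandatory"},
        {"id": "URS-003", "area": "security",
         "text": "Role-based access: administrator, reviewer, analyst",
         "priority": "mandatory"},
        {"id": "URS-004", "area": "instrument",
         "text": "Pump flow rate 1.0 to 2.0 mL/min",
         "priority": "mandatory"},
        {"id": "URS-005", "area": "interfacing",
         "text": "Store results on a named network share",
         "priority": "desirable"},
    ]

    mandatory = [r["id"] for r in urs if r["priority"] == "mandatory"]
    print("Mandatory requirements:", ", ".join(mandatory))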

If a laboratory has not adequately defined its initial requirements, money and time will be wasted. For the purposes of discussion, assume that the vendor and system have been selected correctly and the purchased system arrives on site.

Component Installation and Integration

These are shown as two stages in Figure 5 but will be discussed as one to save repetition. They are typically the responsibility of the vendor and may need to be carried out by them if your warranty and service contract is to remain valid; this also means you are not wasting your time trying to get the system to work.

The vendor's service engineer or representative will unpack the instrument components and check that each one is undamaged. The components will be installed following a written plan provided by the vendor, and the engineer will confirm that each portion has been adequately completed by completing the document. In the event of errors, the engineer will resolve these before completing the installation of a component.

After the instrument is installed, the computer is installed and the application software loaded onto it; again, the installation must be adequately documented by the vendor. If the workstation is supplied by the vendor, it may come pre-installed with the operating system, typically a version of Windows. Alternatively, if the system is to be connected to the company network, the company IT department may supply a workstation with an appropriately configured operating system pre-installed. Regardless of the source of the workstation, it may be connected to the network and given an Internet Protocol (IP) address and other information needed for it to connect to and store data on a network drive. Regardless of who does the work, the fact that it was done needs to be recorded in a compliant manner. The application software will then be installed by the engineer; again, this will be performed and documented following a procedure that will be completed on site, together with problem resolution.

When this is completed, the instrument and the computer system will be connected and checked to see that the two communicate as the vendor expects. In outline, this completes what should be done in the first two stages of the work. Most, if not all, will be performed by the vendor's engineer and should be adequately documented. It is the laboratory's responsibility to ensure that the work is recorded in a compliant way and is of suitable scientific soundness to comply with current GMP regulations. The system owner will review this work and approve it.

This work equates to an installation qualification performed by a vendor.

Vendor Commissioning

This stage can either be a separate activity after component installation and integration, or it can be combined with them in a continuous session to make it simpler for the engineer to complete. If the latter approach is taken, the process will run from unpacking the individual components to showing that the complete system is acceptable from the vendor's perspective and works. As mentioned above, the instrument and software should work, and there should be a holistic test of the overall system.

Here you should be vigilant and work smarter rather than harder. What is the vendor doing here? Are there tests carried out that can be used instead of you doing the work in later stages of the process, or are there holes in the package? In the latter case, you need to do more work later in the process. Therefore review critically what the vendor proposes to do here and cross-check it against the system requirements in your URS. Where the vendor's material matches your requirements, accept it and do nothing further. The problem is that you may be reviewing a standard document produced by the vendor to meet most uses of the system.

An example of what the vendor could undertake here to confirm instrument operation is to measure parameters using calibrated equipment or reference standards. Examples of some of the qualification work for an HPLC system could be:
• Pump flow rates: measurement of the maximum and minimum
• Detector: wavelength accuracy and photometric linearity
If these tests are not performed here, or do not reflect the user specification, then the users should undertake them during the next phase of the integrated qualification and validation. For example, if a wavelength calibration standard used by a vendor does not offer sufficient resolution or wavelength range, then alternative standards should be used in the next phase by the laboratory. Alternatively, the laboratory can request that the vendor use standards covering the operating range specified in the first phase of this work.
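This critical review of the vendor's package against your URS is essentially a gap analysis. The sketch below shows the comparison, using the hypothetical requirement identifiers from the earlier URS fragment:

    # Sketch: gap analysis between the vendor's commissioning tests and the
    # laboratory's URS. All identifiers are hypothetical.

    urs_requirements = {"URS-001", "URS-002", "URS-003", "URS-004", "URS-005"}

    # Requirements each vendor test claims to cover, taken from a hypothetical
    # review of the vendor's standard commissioning protocol.
    vendor_test_coverage = {
        "OQ-flow-rate":  {"URS-004"},
        "OQ-wavelength": set(),       # vendor test exists but maps to no URS item
        "OQ-reporting":  {"URS-001"},
    }

    covered = set().union(*vendor_test_coverage.values())
    gaps = urs_requirements - covered
    print("Covered by vendor commissioning:", sorted(covered))
    print("To be tested by the laboratory later:", sorted(gaps))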

Review as well the tests that the vendor proposes to execute for the software: what areas of the software do they propose to check, and do the tests they will run match your working practices? Will the tests be performed on the unconfigured application, or will the vendor configure it to meet your needs? Look at these carefully; extensive testing of the basic installation may have little value if you subsequently configure or customise the software in the next phase of work. On the other hand, if these tests match your requirements then you'll have less to do later, as some of your requirements will already have been tested. In my experience, however, the tests performed by the vendor will typically be on the unconfigured software, and you have to make a judgement on how useful they will be against your requirements.

As well as testing the instrument and the software, a vendor could provide a standard holistic test to show that the whole system is working as expected. This should exercise both the instrument and the software, but need not be exhaustive; it merely demonstrates that everything works together as expected.

This phase equates to an operational qualification (OQ) performed by a vendor; however, the users need to review the documentation before this work is carried out.

Defining User Ways of Working

In this portion of the work, define and configure how you will use the whole system. This may consist of one or more of the following items:
• Qualify the instrument for the operating parameters specified in your URS. Reiterating the earlier point: if the vendor has done this work already AND it covers your operating parameters adequately, why repeat it? However, if your ways of working differ from the standard approach, this is where you plug the gap and qualify the instrument for your laboratory's specific operating parameters. This is risk management in practice – albeit in an ad-hoc form.
• Calibrate the instrument. The same point applies as above: if calibration has been done to an acceptable standard over the operating ranges that will be used by the system, do no more work, as you'll be wasting time and effort.
• Configure the software and firmware. We'll focus here on the application software; this may include any configurable reasons for change in the audit trail (to save typing), policies to enable the use of electronic signatures, reports for the results and any custom calculations. As a minimum you'll need to define the user types and the access privileges associated with each type; this will naturally be documented, as you'll be testing it in the next phase of the work (a sketch of such a role definition follows this list). Therefore not all users should be system administrators – unless the user base is very small, e.g. two. In addition, each user will need a user identity and password to access the system; every user needs a unique identity so that they can be identified within the system, including in the audit trails. Most importantly, any default user identities should be disabled and default passwords changed.
• Write user programs or macros: these define the way some systems will be used and will be unique to the laboratory; this is typically how an automated system is tailored to a specific laboratory. They may need further specification, checking and testing as part of the overall qualification and validation process.
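A minimal sketch of the user-type definition mentioned in the third item above follows; the role names and privileges are hypothetical and must mirror the documented configuration that will be tested in the next phase:

    # Sketch of a role/privilege definition for the application software.
    # Role names and privileges are hypothetical; the real definitions must
    # match the documented configuration that will be tested later.

    roles = {
        "administrator": {"manage_users", "configure_system", "sign_results",
                          "acquire_data", "review_results"},
        "reviewer":      {"review_results", "sign_results"},
        "analyst":       {"acquire_data", "process_data"},
    }

    def can(role, privilege):
        """Check whether a role carries a given privilege."""
        return privilege in roles.get(role, set())

    assert can("analyst", "acquire_data")
    assert not can("analyst", "manage_users")  # analysts are not administrators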

This phase of work was not considered by USP <1058> [7] but is a critical component of the set-up of any major system.

User Acceptance

The focus of this phase of work is at the system level; it brings together all the work done at earlier stages of the life cycle. Here you will demonstrate that the system as a whole meets its intended use requirements. Testing here will be traced to the requirements defined in the URS; there will be functional tests of the system as well as non-functional tests such as security and access control.
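Tracing acceptance tests back to the URS can be recorded as a simple traceability matrix; the sketch below continues the hypothetical identifiers used earlier and flags any requirement without a test:

    # Sketch of a requirements traceability check for user acceptance testing.
    # Identifiers continue the hypothetical URS fragment used earlier.

    urs_requirements = {"URS-001", "URS-002", "URS-003", "URS-004", "URS-005"}

    acceptance_tests = {
        "UAT-01": {"URS-001", "URS-004"},  # functional: run a complete analysis
        "UAT-02": {"URS-002"},             # electronic signatures
        "UAT-03": {"URS-003"},             # non-functional: security and access
    }

    tested = set().union(*acceptance_tests.values())
    untested = urs_requirements - tested
    if untested:
        print("Requirements with no acceptance test:", sorted(untested))
    else:
        print("All requirements traced to at least one test")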

This phase can be equated to the performance qualification (PQ) of the system.

Release for Operational Use

When all of these tasks are completed and any issues have been resolved, the whole system can be formally released for operational use.

Examples of Qualification and Validation

To be added later

Case Study 1: Robotic Sample Preparation System

Case Study 2: Automated Dissolution Testing and Chromatographic Analysis

References

  1. FDA, GMPs for the 21st Century, August 2002.
  2. FDA Guidance for Industry, Process Validation, 1987.
  3. M. Freeman, M. Leng, D. Morrison & R.P. Munden, Pharmaceutical Technology Europe, 10 (11) 45-48 (1995).
  4. Eurachem-UK Instrumentation Working Group, Guidance on Best Practice for the Equipment Qualification of Analytical Instruments, HPLC, 1999.
  5. C. Burgess, D.G. Jones and R.D. McDowall, Analyst, 123 (1998) 1879-1886.
  6. S.K. Bansal, T. Layloff, E.D. Bush, M. Hamilton, E.A. Hankinson, J.S. Landy, S. Lowes, M.M. Nasr, P.A. St. Jean and V.P. Shah, Qualification of Analytical Instruments for Use in the Pharmaceutical Industry: A Scientific Approach, American Association of Pharmaceutical Scientists, 2004.
  7. United States Pharmacopoeia, General Chapter <1058>, Analytical Instrument Qualification, Rockville, MD, 2008.
  8. Validation of Computer-Related Systems, Parenteral Drug Association Technical Report 18, 1995 (Journal of the PDA, 49, S1-S17).
  9. R.D. McDowall, Validation of Chromatography Data Systems: Meeting Business and Regulatory Requirements, Royal Society of Chemistry, Cambridge, 2005.
  10. FDA Guidance for Industry, General Principles of Software Validation, 2002.
  11. W. Furman, R. Tetzlaff and T. Layloff, Journal of AOAC International, 77 (1994) 1314-1317.
  12. Good Automated Manufacturing Practice (GAMP) Guidelines, version 5, International Society for Pharmaceutical Engineering, Tampa, FL, 2008.
  13. ISO/IEC 17025:2005, General Requirements for the Competence of Testing and Calibration Laboratories.
  14. FDA, Current Good Manufacturing Practice for Finished Pharmaceuticals (21 CFR 211).
  15. FDA Compliance Policy Guide 7132a.11 (1983, reaffirmed 1996).
  16. GAMP Forum Good Practice Guide: Validation of Laboratory Computerised Systems, International Society for Pharmaceutical Engineering, Tampa, FL, 2005.
  17. R.D. McDowall, Spectroscopy, 21 (4) 14-30 (2006).
  18. Good Automated Manufacturing Practice (GAMP) Guidelines, version 4, International Society for Pharmaceutical Engineering, Tampa, FL, 2001.
  19. R.D. McDowall, Spectroscopy, 21 (11) xx-yy (2006).
  20. R.D. McDowall, Spectroscopy, 21 (12) xx-yy (2006).
  21. ICH Q7A, Good Manufacturing Practice Guide for Active Pharmaceutical Ingredients, International Conference on Harmonisation, 2000.