Validation of software used in production and QMS - Part 2: Validation Master Plan
We continue this series on validation of software used in production and QMS with the Validation Master Plan (VMP).
Rather than endless explanations, I have added a Validation Master Plan template to my templates repository page.
The Validation Master Plan (VMP) is here: Validation Master Plan template. It contains general provisions for software validation.
It comes with other documents that we'll see in the next post:
- The Validation Protocol template, which contains the application of the VMP to a given system,
- The Validation Report template, which contains the results of the validation protocol for a system,
- The Final Validation Report, which contains the conclusion of the validation of a system.
I share these templates under the terms of the CC-BY-NC-ND license.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 France License.
Content of the VMP
The Validation Master Plan contains the provisions for:
- Identifying systems that require validation,
- Defining the level of scrutiny of the validation.
Not all systems used by a company need to be validated. As we saw in the previous article, only those within the scope of requirements found in applicable regulations and standards shall be validated.
The VMP template gives hints to define the selection criteria and to present the results of the selection.
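As an illustration, the selection criteria can be expressed as a simple decision rule. Here is a minimal sketch in Python; the criteria names are hypothetical and are not taken from the VMP template:

```python
# Hypothetical selection rule: a system is in the validation scope when it
# supports a regulated process or produces records kept for regulatory
# purposes. The criteria names below are illustrative only.
def requires_validation(supports_regulated_process: bool,
                        produces_quality_records: bool) -> bool:
    """Return True when the system falls within the validation scope."""
    return supports_regulated_process or produces_quality_records

# Example: an office printer driver vs. a complaint-handling database.
print(requires_validation(False, False))  # printer driver: out of scope
print(requires_validation(True, True))    # complaint database: in scope
```

A real selection grid usually has more criteria; the point is that each criterion and the combination rule are written down in the VMP, so the selection is reproducible.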
Level of concern
The VMP template introduces the concept of "level of concern" to help the validation team define the steps required by the validation.
The level of concern is borrowed from the concept found in FDA guidances on medical device software. It is adapted here to the context of computerized system validation.
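To make the idea concrete, here is a minimal sketch of how a level of concern could be derived from impact ratings. The rating names and the mapping are hypothetical; they come neither from the FDA guidances nor from the VMP template:

```python
# Hypothetical mapping from impact ratings to a level of concern.
# Ratings are 'none', 'minor' or 'major'; the mapping is illustrative.
def level_of_concern(process_impact: str, data_integrity_impact: str) -> str:
    """Return 'low', 'medium' or 'high' from two illustrative ratings."""
    ratings = (process_impact, data_integrity_impact)
    if "major" in ratings:
        return "high"    # worst rating drives the level
    if "minor" in ratings:
        return "medium"
    return "low"
```

The level then drives the depth of the validation: a "high" system gets the full set of steps, a "low" one a lighter treatment.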
The validation steps are the very classical ones found in every validation protocol:
- Design Qualification (DQ),
- Installation Qualification (IQ),
- Operational Qualification (OQ),
- Performance Qualification (PQ).
These concepts don't match well with those found in software validation, but some links can be drawn between them.
Design qualification is applicable only to a subset of selected systems. DQ is applicable when software is internally developed or when its configuration is complex, with scripting and the like. See VMP template for hints on DQ applicability.
DQ is essentially a software development project. The most obvious model of software development is the waterfall model, but any other model is possible.
The DQ should contain the classical documents and records found in a software development project:
- Development plan,
- Software Requirements Specifications,
- Design review,
- Software Test Plan,
- Software Test Report,
- Final review.
You may use the "all-in-one template" in the templates repository page to document the development project of a software tool.
DQ is not IQ / OQ / PQ
Don't miss the point about DQ. It's a phase which is different from IQ / OQ / PQ.
To make things simple, DQ is performed on a test or pilot platform, while IQ/OQ/PQ are performed on the target platform.
There may be cases where the test platform is also the target platform. But, to keep things clear and grasp the concepts, remember: DQ equals test platform, IQ / OQ / PQ equals target platform. Using the language of software development teams, the version output of DQ is like a Release Candidate version, ready to be tested by people other than the software development team.
Installation qualification is the verification of the installation of software on its target platform. The IQ can be made either during the installation or after the installation.
When it is done during the installation, the tester runs the installation and verifies at the same time that it is proceeding correctly. The IQ is then a mix of installation tests (e.g. running the installer) and inspections (e.g. checking the hardware version, the OS version, the documentation...).
When it is done after the installation, the verification is an inspection of the installation records. The tester goes through all installation records and checks that the installation was correct.
Note that the IQ happens on the target platform. It shouldn't be confused with the installation of software on a test platform during DQ. Verifying that software can be installed and run on the test platform is a part of Design Qualification or of preliminary tests before the IQ.
Using the language of software development teams, the version installed in IQ is the Release Candidate.
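As an illustration, part of an IQ can be automated as a script that inspects the target platform and compares it against the installation specification. This is only a sketch: the expected values and the checksum comparison are hypothetical, and a real IQ would take them from the approved installation specification and release manifest.

```python
import hashlib
import platform
import sys
from pathlib import Path

# Illustrative expected configuration; a real IQ takes these values from
# the approved installation specification.
EXPECTED = {
    "os": platform.system(),   # e.g. "Linux" or "Windows"
    "python_major": 3,
}

def iq_checks():
    """Run the platform inspections and return (check_name, passed) pairs
    to be attached to the IQ record."""
    return [
        ("os_matches", platform.system() == EXPECTED["os"]),
        ("interpreter_major_version",
         sys.version_info.major == EXPECTED["python_major"]),
    ]

def file_checksum(path):
    """SHA-256 of an installed file, to compare against the release manifest."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()
```

The output of such a script is evidence attached to the IQ record, not a replacement for the record itself.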
Operational Qualification is the verification of software functions on the target platform. The OQ comes after the IQ (you can't verify software that wasn't properly installed).
OQ is a set of tests verifying the functional requirements of the software. The functional requirements can be either user requirements or technical requirements. These requirements are inputs to the validation process.
When OQ is preceded by DQ, DQ tests and OQ tests may overlap. The simplest solution is to redo all the tests passed during DQ. OQ tests can also be a reduced set of DQ tests - like typical user scenarios - or completely different tests if DQ was oriented towards the verification of technical requirements.
When OQ is not preceded by DQ, a test protocol verifying the requirements shall be written.
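As an illustration, one OQ test case can be written as an automated check against a functional requirement. Both the requirement identifier (REQ-012) and the function under test are hypothetical stand-ins:

```python
import re

def is_valid_batch_number(batch: str) -> bool:
    """Stand-in for the real system function under test."""
    return bool(re.fullmatch(r"\d{8}", batch))

def test_req_012_batch_number_format():
    """Hypothetical REQ-012: batch numbers are exactly 8 digits.
    The pass/fail result is recorded in the validation report."""
    assert is_valid_batch_number("20240101")
    assert not is_valid_batch_number("2024-001")
    assert not is_valid_batch_number("A2040101")

test_req_012_batch_number_format()
```

Each OQ test should trace back to the requirement it verifies, so the protocol shows that all requirements are covered.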
Using the language of software development teams, the version output of OQ is like RC2 or RC3, where most of the bugs found by users have been removed.
Performance Qualification is the verification of the software in routine use. The PQ comes after the OQ (you can't verify routine use if the software functions haven't been properly tested).
PQ can be a set of structured tests verifying user scenarios. It can also consist of free-form tests by end-users. The PQ should include a predefined period of surveillance of the software in routine use by the end-users.
Using the language of software development teams, the version output of PQ is the Final Release of software.
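The surveillance period described above can end with a simple acceptance decision. Here is a sketch with hypothetical thresholds; a real PQ defines its acceptance criteria in the validation protocol:

```python
from datetime import date

def pq_passed(incidents, period_start, period_end,
              max_major=0, max_minor=3):
    """Decide the PQ outcome from end-user incident reports.
    incidents: list of (date, severity) pairs, severity 'major' or 'minor'.
    The thresholds are illustrative, not prescribed by the VMP template."""
    in_period = [sev for day, sev in incidents
                 if period_start <= day <= period_end]
    majors = sum(1 for sev in in_period if sev == "major")
    minors = sum(1 for sev in in_period if sev == "minor")
    return majors <= max_major and minors <= max_minor
```

Whatever the criteria, they must be written down before the surveillance period starts, not derived from its results.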
Latitude in DQ / IQ / OQ / PQ content
These four steps are heavy to implement, but we have escape plans.
The VMP gives the validation team latitude in excluding validation steps and in defining their content. Provided that a rationale and evidence are given, it is possible to make the validation simpler than these four steps.
DQ is obviously not required for purchased software with minimal configuration settings. It's possible to simplify the IQ, OQ and PQ steps when the context allows it. Likewise, it's possible to exclude IQ or OQ with justification. It looks difficult to exclude PQ, but it may be possible to merge OQ and PQ into a single step.
With legacy systems, it's possible to do a retrospective validation. This is another kind of escape plan.
It is based on the analysis of the historical data of a system already used in routine. The retrospective validation consists in assessing the conformity of the system to regulations by analyzing:
- Records output by the system,
- Non-conformities linked to the system or to processes involving the system,
- Customer complaints linked to the system or to processes involving the system,
- Any other relevant data (argh, can't be more precise ...).
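As an illustration, the historical data listed above can be summarized before the conformity assessment. A minimal sketch, assuming hypothetical record categories:

```python
from collections import defaultdict

def summarize_history(events):
    """Tally historical events per year for the retrospective validation
    report. events: (year, kind) pairs; the kinds 'nc' and 'complaint'
    are illustrative stand-ins for the real record categories."""
    summary = defaultdict(lambda: {"nc": 0, "complaint": 0})
    for year, kind in events:
        summary[year][kind] += 1
    return dict(summary)
```

A flat or decreasing trend in non-conformities and complaints supports the argument that the system behaves correctly in routine use.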
Be careful with retrospective validation, since it is not "appreciated" by auditors and inspectors. They're going to look for the pitfalls in this kind of validation.
The easiest pitfall to find is a modification of the system after it has been validated retrospectively. How do you convince the auditor that a complete revalidation is not necessary? A tiny software change can have dire side effects.
Anyway, retrospective validation is sometimes the only way to validate a legacy system that has been used for a long time without any bugs, and without any intention from the users to modify it.
Next time, we'll see the Validation Protocol and Validation Reports.