How to Overcome Validation Costs


“It is thrifty to prepare today for the wants of tomorrow” — Aesop, Greek author (620 BC – 560 BC) 

Validation costs are high by nature: a project with validation requirements consumes more time and resources than one without them. There have, however, always been ways to reduce the impact of these costs.

Below are three typical causes of inflated system validation costs, together with hints on how to bring those costs down. Also listed is a more future-oriented approach: selecting pre-configured standard systems that come with ready-made interfaces to all connecting systems, are cloud-based, and are fail-safe. But let’s start with the classic failures and costly overdoing:

1. Starting Computer Systems Validation at the very end of the project 

Sometimes validation is only authorised once a system is nearly fully developed and about to be tested. This forces the repetition of project activities that consume many man-hours and resources, such as rewriting documents and reworking specifications.

The obvious solution is to start validation at the beginning of the project. Validation activities then add value to the project itself, because they are integrated with design, development, build, and testing, which makes validation far more cost-effective. If the organization follows a quality management framework, the project activities it carries out will require less additional validation.

2. Focusing validation efforts on less important issues and over-validating others 

Typically, a large number of test specifications are written for validating new systems, based on extensive User Requirements Specifications (URS) and Functional Specifications (FS). If everything must be tested manually, without an understanding of the business processes and without a risk-based approach to testing, the testing teams face many more test rounds.

Risk-assessing the business processes, and the related compliance and quality aspects, throughout the project from beginning to end reduces this burden of detailed formal testing. The extent and scope of validation are then defined and justified, as are any exclusions. Time and resources are saved when unnecessary validation test steps are removed this way.
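To make this concrete, a risk-based test scoping exercise can be sketched as a simple FMEA-style scoring. The function names, scores, and threshold below are illustrative assumptions, not real project data; the point is only that formal test effort is focused where the risk score is highest.

```python
# Minimal sketch of risk-based test scoping (hypothetical data and
# thresholds): each business function is scored 1-3 on severity,
# probability, and detectability of failure, as in a simple FMEA.
# Only functions at or above a threshold get formal scripted testing.

def risk_priority(severity: int, probability: int, detectability: int) -> int:
    """Risk Priority Number: higher means more validation attention."""
    return severity * probability * detectability

# Illustrative functions of a hypothetical system under validation.
functions = {
    "Electronic signature capture": (3, 2, 3),
    "Audit trail review report":    (3, 1, 2),
    "UI colour theme selection":    (1, 1, 1),
}

FORMAL_TEST_THRESHOLD = 6  # assumption: below this, vendor testing suffices

for name, scores in sorted(functions.items(),
                           key=lambda kv: -risk_priority(*kv[1])):
    rpn = risk_priority(*scores)
    action = ("formal scripted test" if rpn >= FORMAL_TEST_THRESHOLD
              else "leverage vendor testing")
    print(f"{name}: RPN={rpn} -> {action}")
```

The justification for each exclusion then falls out of the scoring itself: a low-risk function is documented as relying on vendor testing rather than being formally re-tested in-house.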

3. Mandatory validation according to regulations and predicate rules 

Rules are rules, and Title 21 CFR Part 11, established by the FDA, is no exception. Validation is mandatory across processes in the pharmaceutical industry, and it is costly and resource-intensive.

However, validation is an extension of good system development practices, and its documentation is a valuable resource for the duration of the system lifecycle. Working around validation can prove financially disastrous, whereas finding problems early can save big money in the long run.

One way of keeping the level up to standard while still accelerating the validation process is to use an Application Lifecycle Management (ALM) system and to run all documentation through it. This lowers lifecycle costs over the system’s lifetime and speeds up change management dramatically. It also provides a more manageable view of operational features.

4. The other way around: move business processes towards Best Practice Solution Suites 

What can be recommended for companies looking for a lower Total Cost of Ownership when replacing systems that are going out of service? The three solutions proposed above do not really take companies into the new landscapes that are becoming available.

In recent years there has been a tremendous shift from older stovepiped specialist solutions towards integrated suites, and these suites have now also become available over the internet as Software as a Service (SaaS). You can move virtually everything to a service provider, including the service operations of your previously high-cost, in-house validated solutions.

But you can of course also choose to run just your secured Infrastructure as a Service (IaaS), with secured failover across sites on any of your operational continents, or choose to have your Platform as a Service (PaaS) operated by one or more service providers on their infrastructure.

All in all, every level of virtualization has now been adopted at scale, and close to 50% of the IT industry’s workload is executed on the key platforms such as Amazon, Microsoft, and so on.

What, then, can be done about the validation effort for, say, a new integrated Regulatory Affairs (RA) suite? The company needing the functionality can write a thin User Requirements Specification (URS) describing just the processes it needs to run, such as Electronic Document Management, Submission Management, Data Management, and Regulatory Correspondence. This URS is then mapped against the candidate suites, and adapting the business processes to the integrated suite streamlines business activities by avoiding additional configuration and, especially, custom-built interfaces.
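The mapping of a thin URS against candidate suites can be sketched as a simple fit-gap analysis. The process names and candidate suites below are illustrative assumptions, not real vendor data; the point is that the candidate with the fewest gaps needs the least configuration and interfacing, and therefore the least validation.

```python
# Hypothetical fit-gap sketch: the thin URS lists the required processes,
# and each candidate suite is scored by how many it covers out of the box.
# Process and suite names are illustrative, not real products.

required = {"Electronic Document Management", "Submission Management",
            "Data Management", "Regulatory Correspondence"}

candidates = {
    "Suite A": {"Electronic Document Management", "Submission Management",
                "Data Management", "Regulatory Correspondence"},
    "Suite B": {"Electronic Document Management", "Data Management"},
}

def fit_gap(required: set, offered: set) -> tuple:
    """Return (fraction of required processes covered, set of gaps)."""
    covered = required & offered
    gaps = required - offered
    return len(covered) / len(required), gaps

for name, offered in candidates.items():
    fit, gaps = fit_gap(required, offered)
    print(f"{name}: {fit:.0%} fit, gaps: {sorted(gaps) or 'none'}")
```

Each gap found this way represents either a business process that must be adapted to the suite or an extra configuration or interface that would add validation effort.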

Once this high-level URS is approved, a high-level Validation Plan should likewise be written and approved, targeting the least demanding GAMP software category achievable: from custom-built, via configurable solutions, all the way to a non-configured software product. The reasoning for aiming at the non-configured category is that the customer is using a pre-configured solution from the service provider’s catalogue of services.

What the customer needs to make sure of is that the URS is tested with an Acceptance Test (PQ), while the remaining testing is traced through the vendor’s documentation. All operational aspects must also be covered in the Service Level Agreement with the vendor, including how they will manage the business continuity, backup & restore, and return-of-service KPIs that your business needs. Provided you select the best candidate suite for your business processes, moving towards Best Practice Solution Suites will give you faster processes, easily enhanced performance anywhere in the world, and a much shorter validation effort than the until recently normal approach of writing full in-house specifications and executing all the testing yourself.
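The split between in-house PQ testing and vendor documentation can be sketched as a small traceability check. All identifiers below are hypothetical; the point is that every URS item must trace either to an in-house acceptance (PQ) test or to a referenced vendor test document, and anything left untraced is flagged before the validation report is approved.

```python
# Minimal traceability sketch (illustrative identifiers): each URS item
# must be covered by an in-house PQ test or by vendor test documentation;
# untraced requirements are flagged for follow-up.

urs_items = ["URS-001", "URS-002", "URS-003", "URS-004"]

# Traceability matrices: test/document ID -> list of URS items it covers.
pq_tests    = {"PQ-01": ["URS-001"], "PQ-02": ["URS-002"]}
vendor_docs = {"VND-OQ-7": ["URS-003"]}

def traced_ids(*matrices: dict) -> set:
    """Collect every URS item covered by any of the given matrices."""
    return {req for tests in matrices
                for reqs in tests.values()
                for req in reqs}

covered = traced_ids(pq_tests, vendor_docs)
untraced = [r for r in urs_items if r not in covered]
print("Untraced requirements:", untraced)
```

In this sketch URS-004 would surface as untraced, prompting either a new PQ test or a reference to additional vendor documentation.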

5. Conclusion 

Are you about to replace any larger applications in GxP-relevant business processes? If you would like to know more about moving your processes towards Best Practice Solution Suites: KVALITO has supported pharmaceutical companies in moving their systems to the newer infrastructures and platforms, and we would be happy to show you the SMART Validation Plans and Strategies used for these endeavours.

KVALITO is a strategic partner and global quality and compliance service provider and network for regulated industries. To learn more about our services, please visit us at If you would like to benefit from KVALITO’s expert services, feel free to send us an email to

Author: Lars-Eric Winqvist, Senior Life Science Consultant, KVALITO 

