Guest Column | July 17, 2024

5 Recommendations To Maximize CSV/CSA Outcomes

By Ulrich Lein and Stefani Godoy Herrera, MAIN5


Any GxP-relevant system in life sciences (i.e., one governed by “good practice” regulations) is required to undergo a formal process of computer system validation (CSV) or computer software assurance (CSA) to confirm that it fulfills its intended purpose. The common preoccupation with the resource intensity of this undertaking obscures its more strategic role for the organization. Today, that role is likely to include enabling more dynamic and seamless data sharing. As these strategic goals grow in scope and complexity, ensuring that every system performs as promised becomes an important determinant of a project’s wider impact.

Because the validation process involves departments beyond IT and quality, it encourages all stakeholders to think ahead and define what they need. This puts everyone on the same page, working toward the same agreed outcomes while taking each department’s needs into account. Without this shared intention, projects can soon go off course and require rework at great expense (and delay), which in turn can affect patients’ prompt access to safe, high-quality products.

Maximizing The Outcomes Of System Validation

Treating system validation as something to be streamlined, simply to reduce the duration and cost of project delivery, is a risky approach. If important new systems fail to deliver as intended (true of the vast majority of projects that have not followed validation standards), or if they ingest or return bad data (something a health authority inspection may uncover), the reputational damage could be painful and lasting. It will also almost certainly mean time-consuming re-engineering to put problems right, which always costs more after the fact.

Here are some recommendations on how to avoid that:

1. Consider Validation Requirements At The Start

First, there needs to be a recognition that validation is not a stand-alone undertaking that takes place after the main project. Optimizing the beneficial impact of CSV or CSA starts with early action, ideally as soon as an organization starts thinking about a project or about introducing or changing a system. Certainly, it should be an integral part of a technology or data project, not an afterthought.

By determining from the outset the factors that will allow later validation, teams are more likely to stay focused on the developments that matter, ensuring they are delivered on time and to a high standard. Validation should be fully factored into the project plan and budget.

2. Collate All Needs Up Front

Validation should be directed by someone with a high-level overview of, and a holistic interest in, the new project’s success, whose aim is to proactively anticipate likely issues with input from subject matter experts across all affected departments.

Too often, though, issues do not come to light until an inspection, at which point costly retrospective action is needed. EMA and FDA findings are made public, too, so there can be reputational damage if systems are found not to comply with regulatory expectations. Quality management systems (QMSs) have been the focus of recent waves of inspections, as evidenced by recent warning letters published by the U.S. FDA, which increases the urgency of maintaining a continuously validated state.

3. Set The Right Intentions, Regardless Of Your Validation Approach

The CSV vs. CSA debate is a distraction. While GAMP (good automated manufacturing practice) is geared toward defining computer system validation guidelines for systems governed by regulations, it already embraces a risk-based approach, and the FDA’s CSA guidance likewise focuses on containing risk rather than on compliance with rigid rules. That is, it places more emphasis on assurance that systems can safely be depended on for their intended use. Some consider this a less burdensome approach than traditional comprehensive validation.

As liberating as the CSA approach might seem, the ideas it sets out are nothing new. The real debate is about minimizing over-engineering and the potential for inefficiency in favor of defendable compliance, which feels more “agile” in today’s dynamic environment. Certainly, validation approaches need to move with the times.

It can be more helpful to think of any guidelines as recommended best practice, focus on the essence of their provisions, and be pragmatic and flexible where necessary. Early collaboration between validation and IT teams helps here, preempting issues and determining the associated level of risk and the appropriate provisions.

4. Focus On The Total Cost Of System Ownership

If project costings (and timelines) have served as a barrier to a more strategic, proactive approach to system validation, it may be because the focus has been on the immediate cost of project delivery rather than on the total cost of system ownership. The latter measures whether an operational system does what it promised once live, and keeps doing so over time.

The GAMP community estimates that if companies approach validation preemptively and with the right intent, it should account for 10 percent of the overall project budget. If validation is neglected, under-resourced, or left too late, on the other hand, it is likely to cost considerably more, even before accounting for the impact on indirect co-dependencies or the cost of missed benefits. Ultimately, companies that commit to doing validation well will be taking better care of patients through more efficient delivery of better products.

5. Consider Validation Alongside The Choice Of Supplier

Finally, selecting the right supplier, along with defining the validation criteria up front, will further boost the likelihood of successful project delivery by raising the rigor of the whole exercise and ensuring that nothing has been overlooked in the specification of the new system.

For many companies, regulatory requirements are the driver of new technology investment. Yet the associated process transformations will only be delivered if new systems and data processes have been optimized and validated to take full advantage of new data standardization. Even AI tools need agreed parameters to work within, e.g., in their use of data, so that they can be harnessed reliably and optimally.

The bigger and more globally dispersed the organization, the greater the need for agreed structure around how systems work and how data is defined, to prevent undesirable variance arising from diverse user practices.

About The Authors:

Ulrich Lein is a management consultant at MAIN5, a European life sciences digital transformation consultancy. He specializes in managing, implementing, and driving forward business change across systems, processes, procedures, and documentation in life sciences. Lein can be reached via email at ulrich.lein@main5.de.


Stefani Godoy Herrera is a computer systems validation expert at MAIN5. She is highly experienced in quality management, risk management, preparation of validation documents, and project management. Herrera can be reached via email at stefani.herrera@main5.de.