Computer Systems Validation As We Know It Is DEAD

Over the past 10 years, the software industry has experienced radical change.  Enterprise applications deployed in the cloud, the Internet of Things (IoT), mobile applications, robotics, artificial intelligence, X-as-a-Service, agile development, cybersecurity challenges and other technology trends are forcing us to rethink strategies for ensuring software quality.  Yet for over 40 years, validation practices have changed very little.  Surprisingly, many companies still conduct computer systems validation using paper-based processes.  The trends outlined above challenge some of the core assumptions behind current validation practice.  I sometimes hear people say, “… since I am in the cloud, I don’t have to conduct an IQ…” or, “… well, my cloud provider is handling that…”

Issues related to responsibility and testing are changing based on deployment models and development lifecycles.  Validation is designed to confirm that a system meets its intended use.  However, how can we certify that a system meets its intended use if it is left vulnerable to cyber threats?  How can we maintain the validated state over time in production if the cloud environment is constantly changing the validated state?  How can we adequately test computer systems if users can download an “app” from the App Store to integrate with a validated system?  How can we ensure that we are following proper controls for 21 CFR Part 11 if our cloud vendor is not adhering to CSA cloud controls?  How can we test IoT devices connected to validated systems to ensure that they work safely and in accordance with regulatory standards?

You will not find the answers to any of these questions in any regulatory guidance documents.  Technology is moving at the speed of thought yet our validation processes are struggling to keep up.

These questions have led me to conclude that validation as we know it is DEAD.  The challenges imposed by the latest technological advances in agile software development, enterprise cloud applications, IoT, mobility, data integrity, privacy and cybersecurity are forcing validation engineers to rethink current processes.

Gartner recently reported that the share of firms using IoT grew from 29% in 2015 to 43% in 2016, and projects that by the year 2020 there will be over 26 billion connected IoT devices.  It should be noted that Microsoft’s Azure platform already includes a suite of applications for remote monitoring, predictive maintenance and connected factory monitoring for industrial devices.  Current guidance has not kept pace with this ever-changing technology, yet the need for quality in software applications remains a constant imperative.

So how should validation engineers change processes to address these challenges?

First, consider how your systems are developed and deployed.  The V-model assumes a waterfall approach, yet most software today is developed using Agile methodologies.  Your validation methodology should take this into account.

Second, I strongly recommend adding two SOPs to your quality procedures – a Cybersecurity SOP for validated computer systems and a Cloud SOP for validated systems.  You will need these two procedures to provide governance for your cloud and cybersecurity processes.  (If you do not have a cloud or cybersecurity SOP, please contact me and I will send you both SOPs.)

Third, I believe you should incorporate cybersecurity qualification (CyQ) into your testing strategy.  In addition to IQ/OQ/PQ, you should be conducting a CyQ readiness assessment for all validated systems.  A CyQ is an assessment to confirm and document your readiness to protect validated systems against a cyber attack.  It also includes testing to validate current protections for your validated systems.  It is important to note that regulators will judge you on your PROACTIVE approach to compliance.  This is an important step in that direction.
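A CyQ readiness assessment can be as simple as a checklist of cybersecurity controls evaluated against each validated system.  The sketch below illustrates the idea in Python; the control names and scoring are illustrative assumptions, not a prescribed control set.

```python
# Minimal sketch of a CyQ readiness checklist, assuming a simple
# pass/fail control model.  The control names are illustrative only.
CYQ_CONTROLS = {
    "access_control": "Role-based access enforced for the validated system",
    "audit_trail": "Audit trails enabled and protected from modification",
    "patch_management": "Security patches assessed and applied on schedule",
    "backup_recovery": "Backups tested and recovery procedures documented",
    "penetration_test": "Current penetration test results reviewed",
}

def cyq_readiness(results: dict) -> tuple[float, list]:
    """Return the fraction of passing controls and the list of gaps."""
    gaps = [name for name in CYQ_CONTROLS if not results.get(name, False)]
    score = 1 - len(gaps) / len(CYQ_CONTROLS)
    return score, gaps

# Example assessment for one validated system.
score, gaps = cyq_readiness({
    "access_control": True,
    "audit_trail": True,
    "patch_management": False,
    "backup_recovery": True,
    "penetration_test": False,
})
print(f"CyQ readiness: {score:.0%}, gaps: {gaps}")
```

Each gap identified by the assessment becomes a documented, proactive remediation item – exactly the evidence of a proactive posture that regulators look for.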


Fourth, you should adopt lean validation methodologies.  Lean validation practices are designed to eliminate waste and inefficiency throughout the validation process while ensuring sustained compliance.

Finally, the time has come for automation.  To keep pace with the changes in current technology as discussed above, you MUST include automation for requirements management, validation testing, incident management and validation quality assurance (CAPA, NC, audit management, training, et al.).  I recommend consideration of an Enterprise Validation Management system such as ValidationMaster™ to support the full lifecycle of computer systems validation.  ValidationMaster™ allows you to build a reusable test script library and represents a “SINGLE SOURCE OF TRUTH” for all of your validation projects.  Automation of the validation process is no longer a luxury but a necessity.

Advanced technology is moving fast, and the time is now to rethink your validation strategies for the 21st century.  Validation as we know it is dead.  Lean, agile validation processes are needed to keep pace with rapidly changing technology.  As you embrace the latest cloud, mobile and IoT technologies, you will quickly find that the old ways of validation are no longer sufficient.  Cyber criminals are not going away, so you need to be ready.  Step into LEAN and embrace the future!


Validation Testing: Understanding The Why and How

For today’s on-premise and cloud-based systems, validation testing is required to ensure that systems are of sufficient quality and operate according to their intended use.  Validation testing is typically performed at the end of the development process, after all verification activities have been completed.  IEEE defines validation as the process of evaluating software to determine whether it satisfies specified requirements.  Therefore, validation testing must be traced to pre-defined requirements.
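Tracing every test back to a pre-defined requirement can be checked mechanically.  The sketch below shows one simple way to do this in Python; the requirement and test IDs are made up for illustration.

```python
# Minimal sketch of a requirements-traceability check, assuming each
# test case declares the requirement IDs it verifies.  IDs are hypothetical.
requirements = {"URS-001", "URS-002", "URS-003"}

test_cases = {
    "TC-01": {"URS-001"},
    "TC-02": {"URS-002"},
}

# Requirements with no test coverage are a validation gap.
covered = set().union(*test_cases.values())
untraced = requirements - covered

# Tests that trace to no known requirement need review.
orphan_tests = [tc for tc, reqs in test_cases.items() if not reqs & requirements]

print("Untraced requirements:", sorted(untraced))
print("Tests with no valid requirement link:", orphan_tests)
```

Running a check like this before approving a validation package turns the traceability matrix from an after-the-fact document into an enforced gate.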

The goals of validation are pretty clear:

  • Discover errors/anomalies in software prior to production
  • Confirm that systems meet their intended use
  • Confirm that regulatory requirements for the software are met
  • Provide due diligence (documented evidence) for regulators
  • Deliver justification for use of a system

I have had the privilege of working with many life sciences companies over the years, and I have seen it all – from ad hoc testing processes to those that are well-defined and mature in their optimization and effectiveness.  Most testing processes are at Level 1, where processes are chaotic and not well-defined.


Automated validation testing processes are essential in today’s life sciences companies, where we are all being asked to do more with less.  We must establish automated processes to accelerate productivity, eliminate waste and ensure software quality.

The less time spent on the mechanics of test script development, the more time can be dedicated to ensuring software quality.

The software testing capability maturity model should be on your radar.  Establishing automated testing should be a goal for every validation engineer.  It is important to understand how to achieve Level 5 and what it takes from a process perspective to achieve greater testing governance and sustained compliance.

ESTABLISHING A REUSABLE TEST SCRIPT LIBRARY

When conducting validation, the most laborious part of the process is testing.  Validating today’s COTS software applications involves testing the same “out-of-the-box” features over and over again, yet many validation engineers still draft new test scripts for each project.  What if you could establish a “reusable test script library” for your validation projects that would allow you to conduct regression testing quickly and easily, without major rewrites?  What if you could store this repository centrally for all of your applications, so you had a single source of truth for all of your validation projects?  What if you could ensure that your validation test library was “auditable” and could be shared with regulators during audits as part of your objective evidence?  What if each test script had its own audit trail and was traced to its respective requirements for automatic traceability?

The ability to effectively establish and manage a reusable test script library – a single source of truth for all of your validation projects – is made possible with the ValidationMaster™ Enterprise Validation Management system.

The system allows you to create, track and manage a reusable test script library quickly and easily.  All of your validation assets are in a single location for reference and reuse.  Intelligence can be quickly gleaned from the system to drive continuous improvement and compliance.  For fully automated scripts that require no human intervention to run, the system has the ability to automate test script execution and reporting of actual results.  This helps to facilitate continuous testing in the cloud and ensure that your systems are maintained in a validated state.
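The core idea of a reusable test script library can be sketched as a simple record that carries its own requirement links and audit trail.  The Python below is an illustrative data model only, not the actual ValidationMaster™ implementation; all field and function names are assumptions.

```python
# Minimal sketch of a reusable test-script record with requirement
# links and a per-script audit trail.  Field names are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TestScript:
    script_id: str
    title: str
    steps: list[str]
    requirement_ids: list[str]                  # traceability links
    audit_trail: list[dict] = field(default_factory=list)

    def record(self, user: str, action: str) -> None:
        """Append a timestamped audit entry for each change or reuse."""
        self.audit_trail.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "action": action,
        })

# The same script is reused across projects instead of being rewritten.
script = TestScript(
    script_id="TS-100",
    title="User login",
    steps=["Open application", "Enter credentials", "Verify access granted"],
    requirement_ids=["URS-001"],
)
script.record("jdoe", "created")
script.record("asmith", "reused for Project A regression run")
print(len(script.audit_trail), script.requirement_ids)
```

Because every script carries its requirement links and history, regression testing becomes a matter of re-executing existing records rather than rewriting them, and the audit trail provides the objective evidence regulators expect.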

Validation testing is here to stay.  AUTOMATION IS THE KEY!  Automating your validation processes is a necessity, not a luxury.  Join us for one of our Automated Testing online web briefings to learn more.