As a validation engineer with over 30 years of experience, I am well aware of the need for rigorous testing of software applications to ensure their readiness for production use. The new software assurance methodology promoted by the US Food and Drug Administration endorses the concept of ad hoc testing.
In today’s environment, where cloud and mobile technologies are adopted as validated systems, the agency recognizes the need for a different approach to validation. In times past, all validation testing had to be strictly documented and presented to the FDA as evidence that the process was executed and completed in a manner consistent with FDA guidelines. With the new software assurance methodology, the agency has flipped the old validation paradigm on its head. In the old paradigm, validation documentation was the essential deliverable; the FDA had a saying, “if it’s not documented, it didn’t happen.” Many organizations took this precept to heart and attempted to document everything. In today’s fast-paced environment, where technology is adopted seemingly at the speed of thought, that approach is recognized as not only impractical but often unnecessary.
What the agency wants the validation community to focus on is critical thinking about how an application impacts patient health and safety, as well as product safety, quality, and compliance. To do that, the FDA wants us to rigorously test applications. The perennial challenge is: are we testing as rigorously as we should? One major constraint at many life sciences companies has always been time. Validation engineers often ask how much testing is enough, and the answer is usually whatever amount of testing fits within the project schedule. Most software applications, especially enterprise applications, have established go-live dates and time crunches that are essential to the business. Validation teams typically don’t have the luxury of testing every single feature and function within a software application.

The ISPE GAMP 5 methodology strongly suggested that bespoke applications be tested differently than commercial off-the-shelf applications. For out-of-the-box applications, GAMP 5 recommended slightly less rigorous testing of each feature and function, with more focus on auditing the software vendor to ensure that the vendor follows current good software development practices and change control procedures. The thinking was that this reduces the need for the life sciences company to repeat the vendor’s testing: if you audit each vendor, and the vendor performs feature and function testing of its application, then the life sciences company is left with testing the application in accordance with its intended use rather than every single feature and function. This was the thinking of the ISPE GAMP 5 methodology.
The FDA has changed its thinking with respect to testing by endorsing ad hoc testing. Ad hoc testing eliminates the need to document each step, as was required in traditional validation functional testing. The goal of this endorsement is to ensure that applications are rigorously tested while taking away the excuse of not having adequate time to document each individual test. Under the new software assurance methodology, the FDA says you can summarize ad hoc testing rather than documenting every step and generating reams of paper-based documentation. This is a game changer for validation engineers, who can now go into regulated applications and thoroughly test them.
Now what is the practical approach to an ad hoc testing strategy? Can validation engineers simply go into a system, conduct ad hoc testing, and come out saying everything is okay? The short answer is no. The agency still wants you to summarize your discoveries during ad hoc testing. After ad hoc testing, you should know how many tests passed and how many failed. You should be able to document how many incidents occurred and, if this is not the first time the application has been tested, how many new issues have arisen. Please keep in mind that enterprise applications are very complex. If you’re implementing an ERP system, an enterprise content management system, an enterprise document management system, or an enterprise training management system that is integrated with other applications, a simple change in one application may spawn unintended errors elsewhere. So your ad hoc testing strategy should not only allow you to conduct ad hoc testing but also let you present summary results to the agency without the significant additional burden of developing documented test cases.
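The summary data described above (passes, failures, incidents, and new issues) can be captured in a small record structure. This is a minimal sketch under my own assumptions; the class and field names are hypothetical and do not come from any regulatory template or tool.

```python
# Illustrative sketch of an ad hoc test session summary.
# All names here are hypothetical, not from a regulatory template.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AdHocSession:
    passed: int = 0
    failed: int = 0
    incidents: list = field(default_factory=list)
    new_issues: list = field(default_factory=list)  # vs. prior test cycles

    def record(self, ok: bool, incident: Optional[str] = None,
               new: bool = False) -> None:
        """Log one ad hoc test outcome, with an optional incident ID."""
        if ok:
            self.passed += 1
        else:
            self.failed += 1
        if incident:
            self.incidents.append(incident)
            if new:
                self.new_issues.append(incident)

    def summary(self) -> str:
        """One-line summary suitable for a test report."""
        return (f"passed={self.passed} failed={self.failed} "
                f"incidents={len(self.incidents)} "
                f"new_issues={len(self.new_issues)}")
```

A session might record a pass with `session.record(True)` and a failure with `session.record(False, incident="INC-001", new=True)`, then emit `session.summary()` as the summarized evidence the agency expects in place of step-by-step documentation.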
The question for each validation engineer is: “Does your current validation program support ad hoc testing?” Do you have automated tools that allow you to quickly and efficiently conduct ad hoc testing and summarize the results? If not, you need to find a practical way to conduct this sort of testing with minimal burden on your validation team. That is really what the new software assurance methodology is all about.
The ValidationMaster™ system supports ad hoc testing in accordance with the software assurance methodology. In addition to fully documented test cases, its ad hoc testing module allows you to conduct ad hoc testing and summarize the results. This is revolutionary in terms of enterprise validation management and quality systems, yet a necessary addition to ensure compliance and efficiency. Further, the ValidationMaster™ system uses artificial intelligence to streamline and optimize all testing, including ad hoc testing. With self-healing test scripts, the system recognizes what has changed in a system and heals the affected test scripts accordingly.
As validation engineers, we need to look at how we can validate computer systems more efficiently and effectively. The time for ad hoc testing is here; it is strongly endorsed by the software assurance methodology. You need to be ready to embrace this methodology when it is formally adopted and adjust how you conduct ad hoc testing as part of your validation program. You will need Standard Operating Procedures that govern ad hoc testing as well as your formal validation testing strategies.
Software assurance is here. The question is: how will your organization adopt this new methodology to drive greater efficiency and compliance throughout the organization?