Automating Validation Testing: It’s Easier Than You Think

Automated validation testing has been elusive for many in the validation community.  There have been many “point solutions” on the market that address the creation, management, and execution of validation tests.  However, what most validation engineers want is TRULY AUTOMATED validation testing that interrogates an application rigorously and reports results in a way that not only provides objective evidence against pass/fail criteria but also highlights each point of failure.

In the 1980s, when I was conducting validation exercises for mini- and mainframe computers and drafting test scripts in a very manual way, I often envisioned a time when I would be able to conduct validation testing in a more automated way.  Most validation engineers work in an environment where they are asked to do more with less.  Thus, the need for such a tool is profound.

Cloud computing environments, mobility, cybersecurity, and data integrity imperatives make it essential that we test applications more thoroughly today.  Yet the burden of manual testing persists.  If I were to name five key features of an automated validation testing system, they would be the following:

  • Automated test script procedure capture and development
  • Automated Requirements Traceability
  • Fully Automated Validation Test Script Execution
  • Automated Incident Capture and Management
  • Ability to Support Continuous Testing in the Cloud

In most validation exercises I have participated in, validation testing was the most laborious part of the exercise.  Automated testing is easier than you think.

For example, ValidationMaster™ includes an automated test engine that captures each step of your qualification procedure and annotates the step with details of the action performed.

Test cases can be routed for review and pre-approval with the system quickly and easily through DocuSign.  Test case execution can be conducted online and a dynamic dashboard reports the status of how many test scripts have passed, how many have failed, or which ones may have passed with exception.  Once test scripts have been executed, the test scripts may be routed for post-approval and signed.

Within the ValidationMaster™ system, you can create a reusable test script library to support future regression testing efforts.  The system allows users to link requirements to test cases thus facilitating the easy generation of forward and reverse trace matrices.  Exporting documents in your UNIQUE format is a snap within the system.  Thus, you can consistently comply with your internal document procedures.
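
The mechanics behind forward and reverse trace matrices are simple enough to sketch.  The Python fragment below is illustrative only; the function names and requirement/test IDs are invented for the sketch and do not reflect ValidationMaster™’s internal data model.  It derives both matrices, plus a coverage-gap check, from the same requirement-to-test links:

```python
def forward_trace(links):
    """Map each requirement ID to the test cases that cover it."""
    matrix = {}
    for req_id, test_id in links:
        matrix.setdefault(req_id, []).append(test_id)
    return matrix

def reverse_trace(links):
    """Map each test case ID back to the requirements it verifies."""
    matrix = {}
    for req_id, test_id in links:
        matrix.setdefault(test_id, []).append(req_id)
    return matrix

def uncovered(requirements, links):
    """Requirements with no linked test case -- a coverage gap."""
    covered = {req_id for req_id, _ in links}
    return [r for r in requirements if r not in covered]

links = [("REQ-001", "TC-010"), ("REQ-001", "TC-011"), ("REQ-002", "TC-012")]
print(forward_trace(links))   # {'REQ-001': ['TC-010', 'TC-011'], 'REQ-002': ['TC-012']}
print(uncovered(["REQ-001", "REQ-002", "REQ-003"], links))  # ['REQ-003']
```

Maintaining the links once and generating both matrices on demand is what makes the trace matrix “a snap” rather than a document to hand-maintain.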

ValidationMaster™ supports fully automated validation testing, allowing users to set a date and time for test execution.  Test scripts are then run AUTOMATICALLY without human intervention, and the same scripts can be run multiple times if necessary.
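
Conceptually, scheduled hands-off execution is a timer wrapped around a test runner.  Here is a minimal Python sketch of the idea using only the standard library; the function names and the PASS/FAIL convention are assumptions for illustration, not the product’s actual API:

```python
import sched
import time

def run_suite(scripts):
    """Execute (name, test_fn) pairs; a raised AssertionError means FAIL."""
    results = {}
    for name, test_fn in scripts:
        try:
            test_fn()
            results[name] = "PASS"
        except AssertionError:
            results[name] = "FAIL"
    return results

def schedule_runs(scripts, start_delay_s, repeats=1, interval_s=0):
    """Queue one or more unattended runs of the same suite."""
    scheduler = sched.scheduler(time.time, time.sleep)
    for i in range(repeats):
        scheduler.enter(start_delay_s + i * interval_s, 1,
                        run_suite, argument=(scripts,))
    return scheduler  # caller invokes scheduler.run() to block until done
```

A production system would persist each run’s results and timestamps; the point here is only that “set a date/time and walk away” reduces to queuing the runner on a clock.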

Continuous testing in a cloud environment is ESSENTIAL.  You must be able to respond to rapid changes in the cloud that may impact the validated state of the system.  Continuous testing reduces risk and ensures sustained compliance.

The system automatically raises an incident report if a bug is encountered through automated testing.  The system keeps track of each test run and its results through automation.  ValidationMaster™ includes a dynamic dashboard that shows the pass/fail status of each test script as well as requirements coverage, open risks, incident trend analysis, and much more.
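
The pattern of raising an incident tied to the exact failing step can be sketched as follows.  The names and the incident record shape are hypothetical; a real system would also persist the incident, capture a screenshot, and route it for triage:

```python
def execute_with_incidents(script_name, steps):
    """Run each (step_name, action) pair; log an incident at the exact
    failing step instead of aborting the whole run."""
    incidents = []
    for number, (step_name, action) in enumerate(steps, start=1):
        try:
            action()
        except Exception as exc:
            incidents.append({
                "script": script_name,
                "step": number,
                "name": step_name,
                "error": str(exc),
            })
    status = "PASS" if not incidents else "FAIL"
    return status, incidents
```

Because each incident carries the script name and step number, the dashboard can point reviewers straight at the point of failure.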

The time is now for automated validation testing.  The good news is that there are enterprise level applications on the market that facilitate the full validation lifecycle process.  Why are you still generating manual test scripts?  Automated testing is easier than you think!

Saving Time and Money Through Lean Validation

The principles and best practices of lean manufacturing have served life sciences manufacturers well.  Lean is all about optimizing processes, while eliminating waste (“Muda”) and driving greater efficiencies.  As a 30-year validation practitioner, I have validated many computer systems, equipment and processes.  One of the key lessons learned is that there is much room for improvement across the validation process.

OnShore Technology Group is a pioneer in lean validation and has developed principles and best practices to support lean validation processes.  To power our lean processes, we leverage ValidationMaster™, an Enterprise Validation Management system exclusively designed to facilitate lean validation and automate the validation process.

So what is lean validation and how is it practically used?

Lean validation is the process of eliminating waste and inefficiencies while driving greater software quality across the validation process through automation.

Lean Validation

Lean validation cannot be achieved without automation.  Lean validation processes leverage advanced technology designed to fulfill the technical, business, and compliance requirements for software validation, eliminating manual, paper-based processes.  Optimized validation processes are deployed using the latest global best practices to ensure the right amount of validation rigor based on critical quality attributes, risks, cybersecurity, and other factors.

The lean validation process begins with a lean validation project charter, which defines the description of the process and its key performance indicators (KPIs), such as “reduce validation costs by $X per year” or “reduce software errors by X%.”  The charter should define the project scope, dependencies, project metrics, and resources.
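
As an illustration, such a charter can be captured as structured data so its KPIs are measurable rather than aspirational.  The field names and example values below are invented for this sketch, not a prescribed template:

```python
from dataclasses import dataclass, field

@dataclass
class LeanValidationCharter:
    """Minimal sketch of a lean validation project charter."""
    description: str
    scope: str
    kpis: dict                                   # measurable targets
    dependencies: list = field(default_factory=list)
    resources: list = field(default_factory=list)

charter = LeanValidationCharter(
    description="Automate test script development for the LIMS upgrade",
    scope="OQ/PQ test scripts for LIMS v7.2",
    kpis={"validation_cost_reduction_usd_per_year": 50_000,
          "software_error_reduction_pct": 25},
)
```

Keeping the KPIs as named numeric targets makes it straightforward to report progress against the charter at each project review.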

There are five principles of lean validation, derived from lean manufacturing principles.  Lean validation is powered by people, processes, and technology, and automation drives lean validation processes.  The LEAN VALIDATION value principles are illustrated in the figure below.

[Figure: The five lean validation value principles]

Principle 1 – VALUE – Lean thinking in manufacturing begins with a detailed understanding of what value the customer assigns to products and services.  Lean thinking from an independent verification and validation perspective begins with a detailed understanding of the goals and objectives of the validation process and its adherence to compliance objectives.  The principle of VALUE therefore requires the validation team to focus on the elimination of waste in order to deliver value to the end customer (your organization) in the most cost-effective manner.  The computer systems validation process is designed to assure that software applications meet their intended use.  The value derived from the validation process is greater software quality and an enhanced ability to identify software defects, as a result of greater focus and the elimination of inefficient and wasteful processes.  AUTOMATION IS THE FOUNDATION THAT FACILITATES THE ACHIEVEMENT OF THIS VALUE PRINCIPLE.

Principle 2 – VALUE STREAM – The value stream, from a lean perspective, is the comprehensive product life cycle from raw materials through the customer’s end use and the ultimate disposal of the product.  To effectively eliminate waste, the ultimate goal of lean validation, there must be an accurate and complete understanding of the value stream.  Validation processes must be examined end-to-end to determine what value each adds to the objective of establishing software quality and compliance.  Any process that does not add value to the validation process should be eliminated.  We recommend value stream mapping of the validation process to understand where value is added and where non-value-added processes can be eliminated.  Typical “Muda,” or wastes, commonly revealed by validation process mapping include:

  • Wasteful Legacy Processes (“we have always done it this way”)
  • Processes That Provide No Value To Software Quality At All
  • Manual Process Bottlenecks That Stifle Processes

Principle 3 – FLOW – The lean manufacturing principle of flow is about creating a value chain with no interruption in the production process, a state where each activity is fully in step with every other.  A comprehensive assessment and understanding of flow throughout the validation process is essential to the elimination of waste.  From a validation perspective, optimal flow is created when, for example, users can automatically link test scripts to requirements for automated traceability, eliminating the process of manually tracing each test script to a requirement.  Another example is when a user navigates through a software application and the test script is automatically generated.  Once generated, it is automatically published to a document portal where it is routed electronically for review and approval.  All of this requires AUTOMATION to achieve the principle of FLOW.  For process optimization and quality control throughout the validation lifecycle, information should flow through the validation process efficiently, minimizing process and document bottlenecks while maintaining traceability throughout.

Principle 4 – PULL – A pull system in lean manufacturing is used to reduce waste in the production process: components used in manufacturing are replaced only once they have been consumed, so companies make only enough product to meet customer demand.  There is much waste in the validation process.  The PULL strategy for validation may be used to reduce wastes such as duplication of effort, and to streamline test case development and execution, electronic signature routing and approval, and many other activities.  Check out our blog, “The Validation Post,” for more information.

Principle 5 – PERFECTION – Validation processes are in constant pursuit of continuous improvement, and automation is KEY.  Lean validation engineers and quality professionals relentlessly drive for perfection.  Step by step, validation engineers must identify the root causes of software issues, anomalies, and quality problems that affect the suitability of a system for production use.  As computing environments evolve and become more complex and integrated, validation engineers must seek new, innovative ways to verify software quality and compliance in today’s advanced systems.  Perfection cannot easily be achieved through manual processes.

AUTOMATION IS REQUIRED TO TRULY REALIZE THE VISION OF THIS PRINCIPLE.

You can save time and money through lean.  Consider the first two principles, Value and Value Stream.  Many validation engineers consider validation only a regulatory requirement or “necessary evil” rather than a regulatory best practice designed to save time and money.  I think we can all agree that, in the long run, quality software saves time and money through regulatory cost avoidance and greater efficiency throughout quality processes.

It is important for validation engineers to consider not only quality and compliance but also the cost savings that may be gained throughout the validation process.  Lean validation is a strategy whose time has come.  How much can lean save?  End users report saving about 40–60% on test script development and 60–70% on regression testing efforts.  Regulatory cost avoidance can be significant, depending on the level of compliance at each company.  Cost savings may also be realized by minimizing the amount of paper generated through the validation process.
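
A quick back-of-the-envelope calculation shows how those percentages translate into dollars.  The baseline hours and hourly rate below are made-up inputs for illustration, not reported figures:

```python
def projected_savings(baseline_hours, rate_per_hour, savings_fraction):
    """Annual savings from reducing a given testing effort by a fraction."""
    return baseline_hours * rate_per_hour * savings_fraction

# Assume 1,000 hours/year of test script development at $100/hour.
low = projected_savings(1000, 100, 0.40)   # low end of the reported 40-60% range
high = projected_savings(1000, 100, 0.60)  # high end
print(f"${low:,.0f} to ${high:,.0f} per year")  # $40,000 to $60,000 per year
```

Plugging in your own effort baseline and labor rate gives a defensible first estimate for a lean validation business case.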

Embrace LEAN VALIDATION and experience the benefits of saving time and money.

Why Are You Still Generating Validation Test Scripts Manually?

Drafting validation test scripts is one of the key activities in a validation exercise, designed to provide documented evidence that a system performs according to its intended use.  The FDA and other global agencies require objective evidence, usually in the form of screenshots that sequentially capture the target software process, to provide assurance that systems can consistently and repeatedly perform the various processes representing the intended use of the system.

Since the advent of the PC, validation engineers have been writing validation test scripts manually.  The manual process of computer systems validation test script development involves capturing screenshots and pasting them into Microsoft Word test script templates.  To generate screen captures, some companies use tools such as Microsoft Print Screen, TechSmith SnagIT, and other such tools.  A chief complaint of many validation engineers is that the test script development process is a slow, arduous one.  Some validation engineers are very reluctant to update/re-validate systems due to this manual process.  So, the question posed by this blog article is simply this: “Why are you still generating test scripts manually???”

I have been conducting validation exercises for engineering and life sciences systems since the early 1980’s.  I too have experienced first-hand the pain of writing test scripts manually.  We developed and practice “lean validation” so I sought ways to eliminate manual, wasteful validation processes.  One of the most wasteful processes in validation is the manual capture/cutting/pasting of screenshots into a Microsoft Word document.

The obvious follow up question is “how do we capture validation tests in a more automated manner to eliminate waste and create test scripts that are complete, accurate and provide the level of due diligence required for validation?”

In response to this common problem, we developed an Enterprise Validation Management system called ValidationMaster™.  This system includes TestMaster™, an automated testing system that allows validation engineers to capture and execute validation test scripts in a cost-effective manner.

TestMaster™ is designed to validate ANY enterprise or desktop application.  It is a browser-based system that allows test engineers to open any application on their desktop, launch TestMaster™, and capture their test scripts while sequentially executing the various commands in their applications.  As the validation engineer navigates through the application, TestMaster™ captures each screenshot and text entry entered in the application.

Once the test script is saved, TestMaster™ allows the script to be published in your UNIQUE test script template with the push of a button.  No more cutting/pasting screenshots from manual processes!  You can generate your test scripts in MINUTES as opposed to the hours it sometimes takes to compile documents based on a series of screenshots.  If you are one of those validation engineers that does not like screenshots in your scripts, you can easily create text-based processes both quickly and easily using TestMaster™.

So, what is the biggest benefit of using TestMaster™ versus manual processes?  There are four key benefits, summarized as follows:

  1.  Automated Test Script Execution – for years, validation engineers have wanted a more automated approach to executing validation test scripts.  ValidationMaster™ supports both hands-on and hands-off validation testing.  Hands-on validation testing is the process whereby a validation engineer reviews each step of a validation test script and executes the script step by step by clicking through the process.  Hands-off validation allows a validation engineer to execute a test script with no human intervention.  This type of hands-off regression testing is very useful for cloud-based systems or systems that require more frequent testing.  The validation engineer simply selects a test script and defines a date/time for its execution.  At the designated time, with no human intervention, the system executes the test script and reports the test results back to the system.  Automated testing is here!  Why are you still doing this manually?

  2.  Traceability – TestMaster™ allows validation engineers to link each test script to a requirement or set of requirements; the system thus delivers automatic traceability, which is a regulatory requirement.  With the click of a button, TestMaster™ allows validation engineers to create a test script directly from a requirement.  This is a powerful capability that allows you to see requirements coverage on demand through our validation dashboard.  The validation dashboard is viewable on a desktop or mobile device (Windows, Apple, Android).

  3.  Test Script Execution – one of the biggest problems with manual test scripts is that they must be printed and manually routed for review and approval.  Some companies that have implemented document management systems may have the ability to route the scripts electronically for review and approval.  The worst-case scenario is the company that has no electronic document management system and generates these documents manually.  TestMaster™ allows validation engineers to execute test scripts online and capture test script results easily.  The results can be captured in an automated way and published into executed test script templates quickly and easily.  If incidents (bugs/anomalies) occur during testing, users have the ability to automatically capture an incident report tied to the exact step where the anomaly/bug occurred.  ValidationMaster™ is also tightly integrated with a 21 CFR Part 11-compliant portal (ValidationMaster Portal™).  Once a test script is executed, it is automatically published to the ValidationMaster™ Portal, where it is routed for review and approval.  The ability to draft, route, review, approve, execute, and post-approve validation test scripts is an important time- and cost-saving feature that should be a part of any 21st-century validation program.

  4.  Reuse Test Scripts For Regression Testing – manual test scripts are not readily reusable: the Word documents must be edited or even re-written for validation regression testing.  Validation is not a one-time process; regression testing is a fact of life for validation engineers.  The question is, will you rewrite all of your test scripts or use automated tools to streamline the process?  ValidationMaster™ allows validation engineers to create a reusable test script library that includes all the test scripts making up your validation test script package.  During re-validation exercises, you have the ability to reuse the same test scripts for regression testing.
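
A reusable test script library is, at bottom, a tagged catalog that regression runs can select from.  The minimal sketch below uses a hypothetical structure, not the product’s actual schema:

```python
class TestScriptLibrary:
    """Sketch of a reusable test script library: approved scripts are
    stored once, tagged, and pulled back for later regression runs."""

    def __init__(self):
        self._scripts = {}

    def add(self, name, script, tags=()):
        """Register an approved script under one or more tags."""
        self._scripts[name] = {"script": script, "tags": set(tags)}

    def select(self, tag):
        """Return the script names relevant to one regression run."""
        return sorted(n for n, s in self._scripts.items() if tag in s["tags"])

library = TestScriptLibrary()
library.add("TS-001 login", "steps...", tags=("security", "smoke"))
library.add("TS-014 batch release", "steps...", tags=("manufacturing",))
print(library.select("smoke"))  # ['TS-001 login']
```

Tagging is what makes reuse practical: a re-validation exercise pulls only the subset of scripts relevant to what changed, instead of rewriting everything.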

Given the rapid adoption of cloud, mobile and enterprise technologies in life sciences, a new approach to validation is required.  Yes, you can still conduct validation exercises on paper but why would you?  In the early days of enterprise technology, we did not have tools available that would facilitate the rapid development of validation test scripts.  Today, that is not the case.  Systems like ValidationMaster™ are available in either a hosted or on-premise environment.  These game-changing systems are revolutionizing the way validation is conducted and offering time/cost-saving features that make this process easier.   So why are you still generating test scripts manually?

Automated Validation Best Practices

Automation is the key to lean validation practices.  Although many validation processes are still paper-based manual processes, there are best practices that support Independent Verification and Validation (IV&V) processes that drive efficiency and compliance.

BEST PRACTICE 1 – Establish Independence

The IEEE 1012 Standard For System, Software and Hardware Verification and Validation states that Independent Verification and Validation (IV&V) is defined by three parameters:

  1. Technical Independence – ensures independence from the development team.  Technical independence is intended to provide a fresh point of view in the examination of software applications to help better detect subtle errors that may be overlooked by those that are too close to the solution such as the development or system implementation team.
  2. Managerial Independence – helps to ensure that the validation organization is separate and distinct from the development or program management team.  Managerial independence ensures that the validation team has the autonomy to independently select the validation methodology, processes, schedule, tasks, and testing strategy used to confirm the suitability of applications for their intended use.  It also ensures that the IV&V team can objectively report all validation test results without restriction or approval from the development or system integration team.  This is a very important level of independence.
  3. Financial Independence – ensures that there are no financial ties between the IV&V team and the development team, to preserve objectivity.  This level of independence is designed to prevent situations where financial ties may adversely influence or pressure IV&V personnel into delivering less-than-objective test results.

The IEEE 1012 standard speaks of various forms of independence, but the bottom line is that the IV&V team should be as independent as possible from the development team.  It is not good practice for development teams to validate their own development projects; objectivity is sacrificed when this is done.  Following this best practice ensures an objective examination of your software projects, free of bias and undue influence from the development team.

BEST PRACTICE 2 – Continuous Testing In The Cloud

Cloud environments can be validated.  However, there are several issues and characteristics of cloud environments that challenge traditional assumptions regarding validation efforts.

  • Continuous changes in the cloud
  • Inability to conduct supplier audits for large cloud vendors (Microsoft, Oracle, et al)
  • Maintaining the Validated State

Cloud system environments continuously change.  Validation engineers are not used to uncontrolled changes in system environments.  We have been taught that all changes to a system environment once it has been validated must undergo change control.  Thus all changes are subject to a change request process.

In cloud environments, we don’t control when changes are made to systems.  Cloud vendors may swap disk drives, reconfigure virtual servers, apply patch updates, add memory, and make many other system changes that may affect your validated system environment.  So the question becomes: how do you maintain the validated state in the cloud?  There are several best practices designed to answer this question.  First, you need a way to determine what changes are made in the cloud.  Take, for example, Microsoft Office 365 or Microsoft Dynamics 365.  Microsoft has established what is known as the Trust Center.  The Microsoft Trust Center is an excellent resource that provides information about how Microsoft manages and examines its cloud environment.  The first consideration when selecting cloud technology is who your provider is.  All cloud providers are not created equal.  Some take compliance, security, data integrity, and governance seriously; others are more general or consumer-oriented in nature and do not prioritize these characteristics.

Microsoft, continuing the example, has achieved several key industry certifications for its cloud environment.  Most importantly, through the Trust Center it has provided visibility and clear communication about how it manages the cloud.  From a testing perspective, Microsoft solves one of the biggest problems you have in the cloud: knowing what changes were made and when the cloud provider made them.  Microsoft provides a list of updates by product and application and tells you exactly what changes were made, the date they were made, and whether the updates or patches were successfully applied in the environment.  With the Microsoft cloud, you know exactly what changes are made to the environment, which brings us to the best practice of continuous testing.

Since cloud environments change so often, you should employ a strategy known as continuous testing.  Continuous testing is essentially validation testing at predefined (user-defined) intervals to ensure that cloud environments are maintained in a validated state.  To successfully employ a continuous testing strategy, automation is essential.  This is not a process that you would want to carry out manually, although you can if you so desire.  Automation adds a dimension of efficiency and consistency to the process.

To employ continuous testing, you want to establish a reusable test script library.  This is essential.  Once you have validated your system using automated tools such as ValidationMaster™, you will have established a reusable test script library.  The test scripts developed can be used for subsequent regression testing and can be automated to save both time and money.  For continuous testing, you would conduct an impact analysis to determine the impact of changes made to the cloud environment.  After the impact analysis, you would conduct a risk assessment to ensure that you effectively monitor risk in accordance with ISPE GAMP 5®.  You would then select from your reusable test script library the regression tests suitable for continuous testing in your cloud environment.  Once these test scripts are executed, you can document the actual results and provide a level of due diligence for regulators showing that you are maintaining your cloud environment in a validated state.
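
The impact-analysis, risk-assessment, and regression-selection steps above can be sketched as one cycle.  The names and the deliberately simplified risk model below are illustrative assumptions, not a GAMP 5 method:

```python
def continuous_test_cycle(changes, risk_level, library, execute):
    """One continuous-testing cycle: impact analysis over reported cloud
    changes, a per-area risk assessment, then targeted regression runs."""
    # Impact analysis: which functional areas do the reported changes touch?
    impacted = {area for change in changes for area in change["affects"]}
    # Risk assessment: test only the areas the caller's model rates high.
    to_test = {area for area in impacted if risk_level(area) == "high"}
    # Regression: pull matching scripts from the reusable library and run them.
    results = {}
    for area in sorted(to_test):
        for script in library.get(area, []):
            results[script] = execute(script)
    return results

changes = [{"id": "PATCH-42", "affects": ["reporting", "auth"]}]
risk_level = lambda area: "high" if area == "auth" else "low"
library = {"auth": ["TS-001 login"], "reporting": ["TS-020 report"]}
print(continuous_test_cycle(changes, risk_level, library, lambda s: "PASS"))
```

The returned results dictionary is the documented actual-results record each cycle produces for the due-diligence trail.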

BEST PRACTICE 3 – Select The Right Automation Tools

Another key best practice is selecting the right automation tools for validation.  How do you know how to select the right tools?  There are two types of automated tools on the market: (1) point solutions and (2) enterprise solutions.  A point solution addresses a single element of the validation process.  For example, the management of requirements is a core component of any validation exercise, and there are requirements management point solutions on the market that can help you manage user, functional, and design requirements for any validation initiative.  Testing is another core element of the validation process, and many solutions allow you to capture and record test scripts; some even allow you to execute test scripts online.  The problem with point solutions is that they only cover one step in the process.  When validating systems, it is not uncommon to use upwards of 17 different systems (point solutions) to prepare validation documentation and due diligence.  This does not make much sense and is often fraught with duplication of effort and inefficiencies that cost time and money.

To drive lean validation processes and to achieve automation best practice, you need an enterprise validation management solution to fully automate the validation process – not just one part of it. An enterprise validation management system has the capability of managing validation planning documentation such as the validation master plan, risk assessment, validation project plan, and other related documentation.

As a matter of fact, an enterprise validation management system includes an enterprise content management system as a core component of the overall solution.  The key deliverables from the validation process are documents.  Lots of documents!  It stands to reason that an enterprise content management system would be a core part of the solution.  An enterprise validation management system should also include a requirements management system capable of managing any type of requirement.  An automated test engine should be at the core of such a solution, with the ability not only to record test scripts but also to execute them online and capture objective actual results.

The system should have a robust reporting engine that facilitates the efficient output of any type of report required as part of the due diligence for validation.  Quality management is at the core of the validation process.  Therefore, an enterprise validation management system should include capabilities for all aspects of quality, including change control, audit management, CAPA, nonconformances, training, periodic review, trend analysis, and validation key performance indicators.  The system should provide real-time statistics on the overall health and performance of validation processes.  At its foundation, the system should use standard technology adaptable to any systems environment.

It is best practice to select and deploy the RIGHT tools to support enterprise validation processes.  Point solutions will only get you so far.  Selecting the proper automation tools can save both time and money and deliver a single source of truth for your validation projects.

BEST PRACTICE 4 – Establish a Reusable Test Library

One of the most laborious tasks during software validation is TESTING.  The test script development, execution and documentation process takes considerable time if you do it correctly.  From the establishment of a test environment through the development of test scripts, the validation engineer must carefully document expected and actual results sufficient to prove that systems meet their intended use and have the requisite quality expected of such systems.

Developing test scripts takes time.  Traceability also takes time.  When systems are validated, you want the ability to retest a system as required without rewriting test scripts over and over again.  A reusable test script library has been one of the most effective practices I have implemented.

Reusable test scripts can save up to 60% of the time that would otherwise be required to rewrite test scripts.  Establishing a reusable test script library with FULLY AUTOMATED scripts can save even more time and money, in that fully automated scripts can be executed without human intervention.  You have the ability to set a date/time when test scripts are to be executed, and the system (ValidationMaster™) will automatically execute them and report the results back in a fraction of the time it takes to execute them manually.  It is therefore best practice to establish a reusable test script library for enterprise validated systems.

BEST PRACTICE 5 – Document Clear Objective Test Evidence

Documentation of clear, objective test evidence is essential for validation.  Many automated validation management systems do not have reporting engines robust enough to output documents in your unique format.  It is best practice to employ an enterprise validation management system that allows you to present clear, objective test evidence in your unique document formats as specified by your SOPs.  ValidationMaster™ has a comprehensive reporting engine that allows you to deliver validation reports in your unique format, as this best practice requires.


BEST PRACTICE 6 – Establish a Single Source of Truth For Validation Deliverables

For many organizations that conduct validation on paper, there is no single source of truth for validation.  Some validation assets are housed within the document management system.  Part of the validation package is kept within a code management system such as SourceSafe.  Part of the deliverables may be kept within a requirements management system.  Some incident reports may be kept in an incident management system.  Other validation deliverables may be paper-based.  In many cases, validation engineers attempt to keep signed copies of documentation in multiple three-ring binders.  This was the traditional practice in the ’80s.

For lean validation practices that support automation, it is best practice to establish a single source of truth for all validation deliverables.  This means that there is a single point where all validation deliverables, pre- and post-execution, are stored.  A single source of truth facilitates better auditing and eliminates the common problem of documentation going missing during an audit exercise.  This best practice is essential to achieving validation excellence.

One of the core benefits of ValidationMaster™ is that it delivers a single source of truth for all validation projects.  You can store all of your validation projects in a single, easy-to-use system and reference it for internal or external audits.  For lean validation, this is current best practice.

Following the six key best practices above can save both time and money.  Validation has not changed much over the last 40 years, but the way we manage it has changed significantly.  Cloud validation, mobility, cybersecurity, and a host of other factors change the way we look at computer systems validation and manage it.  I hope you find these best practices useful and effective in helping you deliver your next validation project on time and within budget.  We use ValidationMaster™ in our practice every day to support our lean validation processes, saving our clients considerable time and money.  What’s in your validation office?