Can Microsoft Azure Be Validated?

The textbook definition of computer systems validation (paraphrased) is “documented evidence that a system performs according to its intended use.”  The FDA uses the definition “…Validation is a process of demonstrating, through documented evidence, that <software applications> will consistently produce results that meet predetermined specifications and quality attributes.”

Using the FDA’s definition, one should understand that validation is a PROCESS.  It is not a one-time event.  One should also understand that the Agency expects to see “documented evidence” that a system will consistently produce results that meet predetermined specifications and quality attributes.  And finally, from the FDA’s definition we learn that to properly validate a system, one MUST have “predetermined specifications and quality attributes” defined.

Let’s start at the beginning with the question posed by this blog post: “Can Microsoft Azure® Be Validated?”  The short answer is YES.  In the remainder of this post, I will share six key reasons why this bold statement can be made.  But first, let’s establish some foundation as to what Microsoft Azure® (“Azure®”) is and why this question is relevant.

At its core, Azure is a cloud computing platform and infrastructure created by Microsoft for building, deploying, and managing applications and services through a global network of Microsoft-managed datacenters (Wikipedia).  In today’s global economy, life sciences companies, like many of their industry counterparts, are seeking to do more with less.  These companies are embracing enterprise technologies and leveraging innovation to drive global business and regulatory processes.  If you had asked me 10 years ago how many of the companies I work with were deploying enterprise technologies in the cloud, my answer would have been very few, if any.  If you had further asked how many cloud validation exercises in life sciences I was conducting, I would have said NONE.

Suffice it to say, cloud computing has been a game-changer for life sciences companies.  It should be noted that due to regulatory considerations, the life sciences industry has been reluctant to change validation practices.  But the cloud is changing the process of validation as we know it.  Sure, the basic principles of risk-based validation endure.  You STILL have to provide documented evidence that your software applications will consistently produce results that meet predetermined specifications and quality attributes.  HOW you do this is what is changing.  Validating cloud applications is a bit different from validating on-premise applications, mostly due to the changing roles and responsibilities of those responsible for the system and of the cloud providers themselves.  I will say at the outset that not all cloud providers are created equal, so “caveat emptor” (let the buyer beware).

Let’s talk about the six key reasons why Azure can be validated and how this is done.

SIX REASONS WHY AND HOW AZURE CAN BE VALIDATED.

REASON 1 – AZURE RISK ASSESSMENT

Risk management is not a new concept.  It has been applied since the 1940s in the military and aerospace industries, and was later broadened to other fields.  The first crucial step in any computer systems validation project is to conduct a validation risk assessment.  In addition to determining the impact on critical quality attributes, a risk assessment is used to determine the level of validation due diligence required for the system.  How much validation testing you should do for any validation exercise should be based on a risk assessment of the system.  When deploying systems such as Microsoft Dynamics AX® or BatchMaster/GP® on the Azure platform, the risk assessment should include an assessment of the Azure platform itself.  When outsourcing your platform to a vendor such as Microsoft, their risks become your risks; thus, you must conduct your due diligence to ensure that these risks are acceptable in your organization.

The ISPE GAMP 5® (“GAMP 5®”) risk assessment model defines the process for hazard identification, estimation of risk, evaluation of risk, control and monitoring of risks, and documentation of risk.  GAMP 5® describes a five-step process:

  1. Perform Initial Risk Assessment and Determine System Impact
  2. Identify Functions With Impact On Patient Safety, Product Quality and Data Integrity
  3. Perform Functional Risk Assessment and Identify Controls
  4. Implement and Verify Appropriate Controls
  5. Review Risks and Monitor Controls

GAMP 5® does not explicitly address cloud infrastructure.  One could extrapolate how this standard may be applied to cloud infrastructure, but the lack of a “clean” translation is one of its limitations.  Azure could be treated as a commercial-off-the-shelf (COTS) platform.  You will note that the risk process identified by GAMP 5® is more application-focused than platform-focused.  To conduct a proper risk assessment of Azure, a framework geared to the platform itself must be established.  In July 2011, ISACA, previously known as the Information Systems Audit and Control Association, released a document called “IT Control Objectives for Cloud Computing: Controls and Assurance in the Cloud” which highlighted specific control objectives and risks inherent in cloud computing.  From a validation perspective, you need to address availability/reliability risks, performance/capacity risks, as well as security and cybersecurity risks.  Using the publicly available independent service organization controls (SOC 1/2/3) reports, you should be able to effectively document and assess these risks.  Remember, if it’s not documented, it didn’t happen!
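To make this concrete, here is a minimal sketch, in Python, of how platform risks gleaned from the SOC reports might be scored and documented.  The risk factors, scales, and entries are illustrative assumptions of mine, not a GAMP 5® artifact; your SOP should define the actual scales and acceptance thresholds.

```python
from dataclasses import dataclass

# Illustrative scales only; your SOP defines the real ones
# (GAMP 5 risk assessments also consider detectability).
LIKELIHOOD = {"low": 1, "medium": 2, "high": 3}
SEVERITY = {"low": 1, "medium": 2, "high": 3}

@dataclass
class PlatformRisk:
    category: str     # e.g., availability/reliability, performance/capacity, security
    description: str  # hazard identified from the SOC 1/2/3 reports
    likelihood: str
    severity: str
    control: str      # mitigating control and its documented source

    @property
    def priority(self) -> int:
        # Simple likelihood x severity scoring.
        return LIKELIHOOD[self.likelihood] * SEVERITY[self.severity]

risks = [
    PlatformRisk("availability", "Regional datacenter outage", "low", "high",
                 "Geo-redundant deployment (SOC 2 availability criteria)"),
    PlatformRisk("security", "Unauthorized access to hosted data", "medium", "high",
                 "Access controls and encryption (SOC 2 security criteria)"),
]

# Document every identified risk -- if it's not documented, it didn't happen!
for r in sorted(risks, key=lambda r: r.priority, reverse=True):
    print(f"[priority {r.priority}] {r.category}: {r.description} -> {r.control}")
```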

REASON 2 – CHANGE CONTROL (“THE ELEPHANT IN THE ROOM”)

One of the biggest challenges companies face in adopting cloud computing is CHANGE CONTROL.  As you may well be aware, cloud environments are under constant change.  In the validation world, change control is an essential process to help ensure that validated systems remain in a validated state.  So how do you maintain the validated state when a platform such as Azure is in a constant state of change?  First, all validated systems are subject to change control.  Each time a system is changed, the change, the reason for the change, and who made the change must be documented.  When the system is on-premise, changes are documented (in general) by the IT department, which owns responsibility for the infrastructure in that scenario.  In a cloud environment, this responsibility belongs to the cloud provider.  SOC reports highlight and define change control processes and procedures for the cloud environment.
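As a minimal illustration (a sketch, not a prescribed format), a change record for a validated system could capture at least the fields below; the field names and values are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ChangeRecord:
    # Minimum content: what changed, why, and who made and approved the change.
    system: str
    description: str
    reason: str
    changed_by: str
    approved_by: str
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

record = ChangeRecord(
    system="Dynamics AX (Azure-hosted)",
    description="Applied platform update to the application tier",
    reason="Vendor-released security patch",
    changed_by="IT Operations",
    approved_by="Quality Assurance",
)
print(record)  # in practice this would be written to a controlled change log
```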

We recommend that our clients obtain a copy of these reports under non-disclosure to verify the compliance of the cloud provider.  Once in the cloud, we recommend developing a testing and regression testing strategy that tests systems in a cloud environment on a more frequent basis.  In the ’80s and ’90s, validation engineers were very reluctant to change validated systems due to the complexity and level of effort involved in testing validated systems using manual processes.  Today, companies are leveraging tools like OnShore’s ValidationMaster™ to facilitate automated testing, allowing validation test engineers to schedule the run of automated test scripts in a cloud environment.  ValidationMaster™ takes the complexity out of regression testing and frees up resources for other work.  Change control is essential in both on-premise and cloud environments.  To support the cloud, you will need to establish clear policies and procedures that cover both on-premise and cloud computing systems, and it is very important that those policies reflect procedures in a cloud environment.
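To show how little machinery scheduled regression runs actually require, here is a sketch using only the Python standard library plus pytest as an assumed test runner.  The script names are placeholders, and this is a generic illustration, not ValidationMaster™’s actual mechanism.

```python
import sched
import subprocess
import time

# Placeholder names for pre-approved automated regression scripts.
REGRESSION_SUITE = ["test_login.py", "test_batch_release.py", "test_audit_trail.py"]

def run_suite() -> None:
    """Execute each regression script and record pass/fail for the validation record."""
    for script in REGRESSION_SUITE:
        result = subprocess.run(["pytest", script], capture_output=True, text=True)
        status = "PASS" if result.returncode == 0 else "FAIL"
        print(f"{time.ctime()}  {script}: {status}")

scheduler = sched.scheduler(time.time, time.sleep)
# Schedule the run for an off-peak window, e.g., six hours from now.
scheduler.enter(6 * 60 * 60, 1, run_suite)
scheduler.run()
```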

Microsoft has established clear controls for Azure to manage change in its environment.  These procedures have been documented and tested per the SOC 1/SOC 2 control reports issued by independent auditors.  When validating systems in an Azure environment, you will have to obtain copies of these reports and include them with your validation package.  The supplier audit becomes crucial in a hosted environment.  The good news is that Microsoft is a reputable supplier and can be relied on as a stable vendor with respect to its products.

Microsoft Azure has undergone many independent audits, including a 2013 ISO/IEC 27001:2005 audit that confirms Microsoft has procedures in place to provide governance for change management processes.  The audit report highlights the methodology and confirms that changes to the Azure environment are appropriately tested and approved.

REASON 3 – INDEPENDENT AUDIT CONFIRMATION OF TECHNICAL/PROCEDURAL CONTROLS AND SECURITY

In cloud environments, the role of the cloud provider is paramount.  Not all cloud providers are created equal.  Microsoft Azure has passed many audits, and these reports, as previously noted, are available upon request.  Security and cybersecurity issues are very important in the Azure environment.  Microsoft has implemented controls and procedures to ensure compliance and has provided detailed information on security on its website (https://www.microsoft.com/en-us/TrustCenter/Security/default.aspx).

REASON 4 – INFRASTRUCTURE QUALIFICATION (IQ)

IQ testing changes in the Azure environment.  In Azure, Microsoft bears some of the responsibility for the installation of hardware and software.  As previously mentioned, it is important to qualify the environment once it is set up to confirm that what is supposed to be installed is actually present in your environment.  We qualify the virtual software layer as well as Microsoft Lifecycle Services.

REASON 5 – OPERATIONAL/PERFORMANCE QUALIFICATION (OQ/PQ)

The process of operational qualification of the application in a cloud environment is very similar to the process in an on-premise environment.  Microsoft Dynamics AX can now easily be deployed in an Azure environment, and validation testing of this type of implementation follows the same approach, with some variation.  The installation of Microsoft Dynamics AX uses what is called Lifecycle Services, which allows you to set up and install the system rapidly using automation.  Lifecycle Services may be subject to validation itself since it is used to establish a validated systems environment.  You will need to identify failure of Lifecycle Services as a risk and rate the risk of using this service as part of the risk assessment.

Lifecycle Services is promoted as bringing predictability and repeatability to the process of software installation.  These are principles upon which validation has stood for many years.  However, it is important to go beyond the marketing to verify that this software layer is performing according to its intended use.  We typically draft a qualification script to qualify Lifecycle Services prior to using it in a validated systems environment.  We conduct OQ/PQ testing in the same manner as in a non-hosted environment.

REASON 6 – ABILITY TO MAINTAIN THE VALIDATED STATE

With Microsoft Azure, the need to maintain the validated state does not go away.  It is entirely possible to maintain mission-critical applications hosted on the Azure platform in a validated state.  To do so, we recommend establishing policies and procedures and adopting the test early/test often premise to ensure that changes made to the infrastructure do not negatively impact the validated production systems environment.  Maintaining the validated state using Azure must be done with a combination of technology, processes, and procedures to ensure compliance.

As a validation engineer, I have validated many systems on the Azure platform.  I have concluded that the Azure platform can be validated and used within the life sciences environment.  How much validation due diligence you conduct should be based on risk.  Don’t overdo it!  Use lean methodologies to drive your validation processes.

Upon close inspection of this platform, it has built-in quality and security controls that provide a level of assurance of its suitability for deployment in regulated environments.  I like the fact that the system has been independently inspected by numerous third-party organizations to help ensure objectivity.  This platform is ready for prime time, and many of my clients are leveraging its benefits and freeing themselves from the burden of managing their infrastructure.

Cybersecurity Qualification (CyQ)

One topic that has been top of mind for many validation engineers, chief information officers, and executive management is cybersecurity. You may be asking yourself why we are talking about cybersecurity and validation. Recent headlines will tell you why this topic should be of great interest to every validation engineer. As validation engineers, we spend a lot of time stressing about risk assessments, system security, and qualification of system environments. Our job is to validate the system to ensure its readiness for production use. Let me ask a question… How can you ensure that a system is ready for production use if it is not cyber-ready?  This is why we are talking about cybersecurity in the context of validated systems.

When it comes to computer systems in today’s highly networked environment, cybersecurity is the elephant in the room. All networked systems may be vulnerable to cybersecurity threats. Businesses large and small may be subject to cyber-attacks, and the exploitation of these vulnerabilities may present a risk to public health and safety if not properly addressed. Although we know these truths all too well, many validation engineers are not even discussing cybersecurity as part of an overall validation strategy.

No company can prevent all cyber-attacks, but it is critically important that companies begin to think seriously about how to protect themselves from persistent cyber criminals determined to inflict as much damage as possible on computer systems in both highly regulated and non-regulated environments. One thing we know about cyber criminals is that they are equal opportunity offenders – everyone has a degree of vulnerability. To beat them at their game, you have to be one step ahead of them.

In the validation world, we often refer to validation testing as IQ/OQ/PQ testing.  I would like to submit for your review and consideration another type of enhanced validation testing that we should be doing: cybersecurity qualification, or as I like to refer to it, “CyQ”.  What is a CyQ?  It is confirmation of a system’s protection controls and readiness to prevent a cyber-attack.  In one of my recent blog posts, I declared that “computer systems validation as we know it is dead!”  Now of course I mean that tongue in cheek!  What I was referring to is that it is time to rethink our validation strategy, because we need to address the vulnerabilities of today’s cloud-based and on-premise systems with respect to the cybersecurity risks they face. We can no longer look at systems the way we did in the 1980s. Many life sciences companies are deploying cloud-based technologies, mobile systems, the Internet of Things (IoT), and many other advanced technologies in the pursuit of innovation, and these may drive greater risk profiles in validated systems.  Incorporating CyQ in your overall validation strategy is one way to address these challenges.

The National Institute of Standards and Technology (NIST) introduced a cybersecurity framework. The five elements of the framework are shown in the figure below.

[Figure: NIST Cybersecurity Framework – Identify, Protect, Detect, Respond, Recover]

As a validation engineer, I have studied this framework for its applicability to validated systems.  Each element of the strategy addresses a dimension of your cybersecurity profile.  To conduct a CyQ assessment, you need to examine each element of the cybersecurity framework to determine your readiness in each respective category.  I have developed a CyQ Excel spreadsheet which examines each element of the framework and allows you to summarize your readiness to prevent a cyber-attack.  (If you would like a copy of the CyQ Excel spreadsheet, please contact me using the contact form and I will happily send it to you.)
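The spreadsheet itself is what I distribute, but the underlying logic is simple enough to sketch in a few lines of Python.  The scores below are placeholders on an assumed 1-to-5 maturity scale; a real assessment derives them from documented evidence for each function.

```python
# The five NIST Cybersecurity Framework functions, scored for readiness.
# Scores are placeholders on an assumed 1 (ad hoc) to 5 (optimized) scale.
readiness = {
    "Identify": 4,
    "Protect": 3,
    "Detect": 2,
    "Respond": 3,
    "Recover": 4,
}

overall = sum(readiness.values()) / len(readiness)
print(f"Overall CyQ readiness: {overall:.1f} / 5")
for function, score in readiness.items():
    flag = "  <- remediation required" if score < 3 else ""
    print(f"  {function}: {score}{flag}")
```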

 

Remember, for validated systems, if it is not documented, it did not happen! Cybersecurity qualification analysis must be documented.  When it comes to data integrity and systems integrity, you must be ready to explain to regulators what controls you have in place to protect both the data and the systems under your management.

Another consideration in the management of cyber threats is EDUCATION.  The biggest cyber breach may come from the person in the cubicle next to you!  You must conduct (and document) cyber training, and do it on a frequent basis to keep pace.

For your next validation project, address the elephant in the room explicitly.   Cyber threats are not diminishing, they are increasing.  It is important to understand their origin and seriously consider how they can and will impact validated systems.  We can no longer think that IQ/OQ/PQ is sufficient.  While it has served its purpose in times past, we need a more effective strategy to address today’s clear and present danger to validated systems – the next cyber-attack.  It could be YOUR SYSTEM.  Deal with it!

Validating Microsoft Dynamics 365: What You Should Know

Microsoft Dynamics 365 and Azure are gaining popularity within the life sciences industry. I am often asked how to validate such a system given its complexity and cloud-based nature. The purpose of this blog post is to answer this question. The outline of this blog post is as follows.

  • Understanding the Changing State of Independent Validation and Verification
  • Strategies for Installation Qualification of Microsoft Azure
  • Practical Strategies for Operational and Performance Qualification Testing
  • Continuous Testing in A Cloud Environment
  • Maintaining the Validated State in Azure

To begin our discussion, it is helpful to consider what keeps regulators up at night. They are concerned primarily about four key aspects of validated computer systems:

  1. Vulnerability – How Vulnerable Are Our Cloud Computing System Environments?
  2. Data Integrity – What Is Your Strategy to Maintain Data Integrity Within Your Validated Computer Systems?
  3. System Security and Cybersecurity – How Do You Keep Sensitive Information Secure and How Do You Protect a Validated Computer System Against Cyber Threats?
  4. Quality Assurance – How Do You Minimize Risk to Patient Safety and Product Quality Impacted by The Validated System?

One of the first tasks validation engineers must be concerned with is supplier auditing. When using commercial off-the-shelf software such as Microsoft Dynamics 365 and Azure, a supplier audit is a mandatory part of the validation process. Twenty years ago, when we prepared validation documentation for computer systems, validation engineers often conducted a paper audit or an actual physical audit of a software provider. Supplier audits conducted on-site provided a rigorous overview of what software companies were doing and the quality state of their software development lifecycle process. It was possible to examine a supplier’s processes and decide whether the software vendor was a quality supplier.

Most software vendors today, including Microsoft, do not allow on-site vendor audits. Some validation engineers have reported to me that they view this as a problem. However, the Microsoft Trust Center is a direct response to the industry’s need for transparency.  Personally, I think the Microsoft Trust Center is the best thing that they have done for the life sciences industry. Not only do they highlight all of the Service Organization Control reports (SOC 1/SOC 2/SOC 3 and ISO/IEC 27001:2005), but they summarize their compliance with the Cloud Security Alliance controls as well as the NIST Cybersecurity Framework. I would strongly recommend that you visit the Microsoft Trust Center at https://www.microsoft.com/en-us/trustcenter.  The latest information posted to the site is a section on the General Data Protection Regulation (GDPR) and how their platform can help keep data safe and secure. I find myself as a validation engineer visiting this site often. You will see sections specifically for Dynamics 365 and Azure.

From a supplier auditing perspective, I use the information found on the Microsoft Trust Center to facilitate a “desk audit” of the vendor. Many of the questions that I would ask during an on-site audit are answered on this website. As part of my new validation strategy, I include the Service Organization Control reports as part of my audit due diligence.  The Trust Center includes in-depth information about security, privacy, and compliance offerings, policies, features, and practices across all of Microsoft’s cloud products.

If I were conducting an on-site audit of Microsoft, I would want to know how they are establishing trust in the cloud. Many of the questions that I would ask in person I have found addressed on this website. It should be noted that Service Organization Control reports are created not by Microsoft but by trusted third-party organizations certified to deliver such reports. These reports include an assessment of how well Microsoft is complying with the stated controls for cloud management and security. This is extremely valuable information.

From a validation perspective, I attach these reports to my validation package as part of the supplier audit due diligence. There may be instances where you conduct your own due diligence beyond the reports, but the reports provide an excellent start to understanding what Microsoft is doing.

Microsoft has adopted the Cloud Security Alliance (CSA) Cloud Controls Matrix to establish the controls for the Microsoft Azure platform. These controls include:

  • Security Policy and Procedures
  • Physical and Environmental Security
  • Logical Security
  • System Monitoring and Maintenance
  • Data Backup, Recovery and Retention
  • Confidentiality
  • Software Development/Change Management
  • Incident Management
  • Service Level Agreements
  • Risk Assessments
  • Documentation/Asset Management
  • Training Management
  • Disaster Recovery
  • Vendor Management

The Cloud Controls Matrix includes 136 different controls that cloud vendors such as Microsoft must comply with. Microsoft has mapped out on its Trust Center site specifically how it addresses each of the 136 controls in the Microsoft Azure/Dynamics 365 platform. This is excellent knowledge and due diligence for validation engineers and represents a good starting point for documenting the quality of the supplier.
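Even a simple register mapping each control domain to the evidence you collected makes this due diligence auditable.  The sketch below uses hypothetical entries; the point is the structure, not the content.

```python
# Hypothetical evidence register for a supplier desk audit.
# Each control domain is mapped to the evidence collected, or None if outstanding.
evidence_register = {
    "Software Development/Change Management": "SOC 2 Type II report, change control section",
    "Data Backup, Recovery and Retention": "SOC 2 Type II report, availability criteria",
    "Incident Management": "Trust Center incident response documentation",
    "Disaster Recovery": None,  # evidence still outstanding
}

for domain, evidence in evidence_register.items():
    print(f"{domain}: {evidence if evidence else 'OPEN -- follow up with supplier'}")
```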

Is Microsoft Azure/Dynamics 365 secure enough for life sciences? In my personal opinion, yes, it is. Companies still must conduct due diligence to ensure that Azure and Dynamics 365 meet their business continuity requirements and business process requirements for enterprise resource planning. One thing is certain: the cloud changes things. You must revise your validation strategy to accommodate the cloud. Changes in how supplier audits are conducted are just one example.

The next challenge in validating Microsoft Dynamics 365 and Azure is conducting validation testing in the cloud environment. It should be understood that the principles of validation still endure whether or not you are in the cloud. You still must conduct a rigorous amount of testing, including both positive and negative testing, to confirm that Microsoft Azure/Dynamics 365 meets its intended use. However, there are changes in the way we conduct installation qualification in the cloud environment. Some believe that installation qualification is no longer a valid testing process since cloud environments are provisioned. This is not correct. You still must ensure that cloud environments are provisioned in a consistent, repeatable manner that supports quality.

It is helpful to understand that when Microsoft Dynamics 365 is provisioned, this is done using Microsoft Lifecycle Services. The Microsoft Lifecycle Services application is designed for rapid implementation and deployment. However, it should be clearly understood that Lifecycle Services is itself an application and thus a potential point of failure in the process.  The use of Lifecycle Services must be documented, and the provisioning of the environment must be confirmed through the installation qualification process.

From an operational qualification perspective, validation testing remains pretty much the same. Test cases are traced to their respective user requirements and executed with the same rigor as in previous validation exercises.
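Because each test case is traced to requirements, forward and reverse trace matrices can be derived mechanically from those links.  Here is a small sketch with hypothetical requirement and test IDs that also surfaces coverage gaps.

```python
# Hypothetical requirement-to-test-case links from an OQ exercise.
links = [
    ("URS-001", "OQ-010"),
    ("URS-001", "OQ-011"),
    ("URS-002", "OQ-012"),
    ("URS-003", None),  # a requirement with no test case -- a coverage gap
]

forward = {}  # requirement -> test cases (forward trace)
reverse = {}  # test case -> requirements (reverse trace)
for req, test in links:
    forward.setdefault(req, [])
    if test is not None:
        forward[req].append(test)
        reverse.setdefault(test, []).append(req)

for req, tests in sorted(forward.items()):
    print(f"{req}: {', '.join(tests) if tests else 'NOT COVERED'}")
```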

Performance qualification is also conducted in the same manner as before. Since the environment is in the cloud and outside of your direct control, it is very important that network qualification as well as performance qualification be conducted to ensure that no performance anomalies occur in your environment. In the cloud you may have performance issues related to system resources, networks, storage arrays, and many other factors. Performance tools may be used to confirm that the system is performing within an optimal range as established by the validation team. Performance qualification can be conducted before the system goes live by placing a live load on the system, or it may occur after validation. This is at the discretion of the validation engineer.
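As one simple example of the kind of check a performance tool automates, you can sample response times and compare them against the range established by the validation team.  This sketch assumes the third-party requests library; the endpoint, sample size, and threshold are placeholders.

```python
import statistics
import requests  # third-party HTTP library (pip install requests)

URL = "https://erp.example.com/health"  # placeholder endpoint
THRESHOLD_S = 2.0                       # acceptance limit set by the validation team

samples = []
for _ in range(20):
    response = requests.get(URL, timeout=10)
    samples.append(response.elapsed.total_seconds())

p95 = statistics.quantiles(samples, n=20)[18]  # approximate 95th percentile
verdict = "PASS" if p95 <= THRESHOLD_S else "FAIL"
print(f"95th percentile response time: {p95:.2f}s ({verdict} against {THRESHOLD_S}s limit)")
```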

Maintaining the validated state within a cloud environment requires embracing the principle of continuous testing. It has often been said that the cloud is perpetually changing. This is one of the reasons why many people believe that you cannot validate a system in the cloud. However, you can validate cloud-based systems such as Microsoft Dynamics 365 and Azure. Continuous testing is the key. What do I mean by continuous testing? Does it mean that we perpetually test the system forever and ever, every single day? Of course not! Continuous testing is a new strategy that should be applied to all cloud-based validated systems whereby regression testing occurs at various predetermined intervals. Automated systems such as ValidationMaster™ can be the key to facilitating this new strategy.

Within ValidationMaster™ you can establish a reusable test script library. This is important because in manual, paper-based validation processes, the most laborious part of validation is the development and execution of test scripts. This is why many people cringe at the very notion of continuous testing. However, automated systems make this much easier. In ValidationMaster™, each test script is automatically traced to a user requirement. Thus, during regression testing, I can select a set of test scripts to be executed based on my impact analysis and run them during off-peak hours in my test environment to see if there has been any impact to my validated system. These test scripts can be run fully automated using Java-based test scripts, or they can be run using an online validation testing process. Revalidation of the system can happen in a matter of hours versus days or weeks using this process.
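The impact-based selection itself is straightforward; here is a sketch of the idea with hypothetical scripts and requirement IDs (a generic illustration, not the product’s API).

```python
# Hypothetical traceability data: each test script lists the requirements it covers.
test_library = {
    "TS-001 Login and access control": {"URS-001"},
    "TS-002 Batch record approval": {"URS-002", "URS-003"},
    "TS-003 Audit trail review": {"URS-004"},
}

# Requirements judged to be impacted by the platform change (from impact analysis).
impacted = {"URS-002", "URS-004"}

# Select only the scripts that touch an impacted requirement.
regression_set = [name for name, reqs in test_library.items() if reqs & impacted]
print("Scripts selected for the regression run:", regression_set)
```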

Through continuous testing, you can review the changes that Microsoft has made to both Azure and Dynamics 365 online. Yes, this information is posted online for your review. This answers the question of how you know what changes are made and when. You can determine how often you test a validated system; there is no regulation that codifies how often this should occur. It’s totally up to you. However, as a good validation engineer, you know that it should be based on risk. The riskier the system, the more often you should test. The less risky the system, the less due diligence is required. Nevertheless, cloud-based systems should be subject to continuous testing to ensure compliance and maintain the validated state.

There are many other aspects of supporting validation testing in the cloud, but suffice it to say Microsoft Dynamics and Azure can be validated and have been successfully validated for many clients. Microsoft has done a tremendous service to the life sciences industry by transparently providing information through the Microsoft Trust Center. As a validation engineer, I find this one of the most innovative things that they’ve done! It provides a lot of the information that I need and confirmation through third-party entities that Microsoft is complying with Cloud Security Alliance controls and the NIST Cybersecurity Framework. I would encourage each of you to review the information on Microsoft’s Trust Center. Of course, there will always be skeptics of anything Microsoft does, but let’s give them credit where credit is due.

The Microsoft Trust Center is a good thing. The company has done an excellent job of opening up and sharing how it views system security, quality, and compliance. This information was never made available before, and it is available now. I have validated many Microsoft Dynamics AX systems as well as Dynamics 365 systems. The software quality within these systems is good, and with the information that Microsoft has provided, you can have confidence that deploying a system in the Azure environment is not only a good business decision, if you have selected this technology for your enterprise, but a sound decision regarding quality and compliance.


Installation Qualification (IQ) in the Cloud

Installation Qualification (IQ) is designed to ensure that systems are installed in a consistent, repeatable manner.  For on-premise systems, validation engineers typically install enterprise software from pre-defined vendor media.  Formerly, media was inserted into corporate servers and the full installation process was documented.  The purpose of installation qualification sometimes gets lost in the weeds.  Given the complexity of today’s enterprise and desktop systems in virtualized cloud environments, the need for installation qualification has never been greater.

The question often gets asked, “… how do you conduct installation qualification in the cloud?”  Answering it requires thinking differently about how software is deployed in today’s validated systems environments.

Cloud deployment models include Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS).  The validation strategy changes depending on the deployment model.  In all models, the virtualization, servers, storage and networking are managed by the vendor.  In the PaaS model, the runtime, middleware and O/S are also managed by the vendor.  It should be noted that cloud services are typically provisioned using automated tools thus, the installation process is not the same as with on-premise environments.

Therefore, conducting IQ in the cloud requires four essential stages:

  • Observation or Documentation of the Provisioning Process
  • Draft Installation Qualification Test Scripts For Pre-Approval
  • Execute Pre-Approved IQ Test Scripts
  • Summarize Installation Qualification Phase

We recommend, for all cloud models, observing the provisioning process to the extent possible.  For some models, such as SaaS, this may not be possible.  You should observe the setup and configuration of all virtual servers and the parameters used to load software.  The sequence of events may be key.  We have observed with some applications that the provisioning process (which itself is automated) may fail for a variety of reasons.  Thus, do not think it strange if the provisioning process does not successfully complete on occasion.  That is why, whenever possible, we observe this process.  For example, we conduct many validation exercises for Microsoft Dynamics 365 using Microsoft Lifecycle Services to support provisioning, and we can observe this process to document the installation qualification of the system.
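Whatever the platform, the IQ evidence ultimately reduces to comparing an expected environment manifest against what was actually provisioned.  Here is a minimal sketch; the inventories are hard-coded placeholders, where in practice the actual inventory would be exported from the cloud platform or recorded while observing provisioning.

```python
# Expected environment manifest vs. what was actually provisioned (placeholders).
expected = {
    "app-server": {"os": "Windows Server 2016", "cpu_cores": 8, "ram_gb": 32},
    "sql-server": {"os": "Windows Server 2016", "cpu_cores": 16, "ram_gb": 64},
}
actual = {
    "app-server": {"os": "Windows Server 2016", "cpu_cores": 8, "ram_gb": 32},
    "sql-server": {"os": "Windows Server 2016", "cpu_cores": 8, "ram_gb": 64},
}

for server, spec in expected.items():
    observed = actual.get(server)
    if observed is None:
        print(f"{server}: FAIL -- not provisioned")
    elif observed != spec:
        print(f"{server}: FAIL -- expected {spec}, found {observed}")
    else:
        print(f"{server}: PASS")
```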

It is important to note that the principles of validation still endure.  You STILL must ensure that your cloud environment is established using a consistent, repeatable process.  It is NOT the case that, since you are in the cloud, you can eliminate IQ testing.  That is neither a good idea nor best practice.

Once the provisioning process is complete, you can create your IQ test scripts and conduct installation qualification testing in the same manner as before.  The IQ process is still relevant for validation testing today.  In the cloud environment, you must conduct your supplier audits, collecting the Service Organization Control reports discussed in one of my previous posts.  A thorough process will help ensure data integrity and quality as you deploy enterprise applications.  Are you ready for the cloud?

Automating Validation Testing: It’s Easier Than You Think

Automated validation testing has been elusive for many in the validation community.  There have been many “point solutions” on the market that addressed the creation, management, and execution of validation testing.  However, what most validation engineers want is TRULY AUTOMATED validation testing that will interrogate an application in a rigorous manner and report results in a way that not only provides objective evidence of pass/fail criteria but also highlights each point of failure.

In the 1980s, when I was conducting validation exercises for mini- and mainframe computers and drafting test scripts in a very manual way, I often envisioned a time when I would be able to conduct validation testing in a more automated way.  Most validation engineers work in an environment where they are asked to do more with less.  Thus, the need for such a tool is profound.

Cloud computing environments, mobility, cybersecurity, and data integrity imperatives make it essential that we test applications more thoroughly today.  Yet the burden of manual testing persists.  If I could share with you five key features of an automated testing system, they would include the following:

  • Automated test script procedure capture and development
  • Automated Requirements Traceability
  • Fully Automated Validation Test Script Execution
  • Automated Incident Capture and Management
  • Ability to Support Continuous Testing in the Cloud

In most validation exercises I have participated in, validation testing was the most laborious part of the exercise.  Automated testing is easier than you think.

For example, ValidationMaster™ includes an automated test engine that captures each step of your qualification procedure and annotates the step with details of the action performed.

Test cases can be routed for review and pre-approval with the system quickly and easily through DocuSign.  Test case execution can be conducted online and a dynamic dashboard reports the status of how many test scripts have passed, how many have failed, or which ones may have passed with exception.  Once test scripts have been executed, the test scripts may be routed for post-approval and signed.

Within the ValidationMaster™ system, you can create a reusable test script library to support future regression testing efforts.  The system allows users to link requirements to test cases thus facilitating the easy generation of forward and reverse trace matrices.  Exporting documents in your UNIQUE format is a snap within the system.  Thus, you can consistently comply with your internal document procedures.

Continuous testing in a cloud environment is essential.  ValidationMaster™ supports fully automated validation testing, allowing users to set a date/time for testing.  Test scripts are run AUTOMATICALLY without human intervention, allowing multiple runs of the same scripts if necessary.

You must have the ability to respond to rapid changes in a cloud environment that may impact the validated state of the system.  Continuous testing reduces risk and ensures sustained compliance in a cloud environment.

The system automatically raises an incident report if a bug is encountered through automated testing.  The system keeps track of each test run and its results through automation.  ValidationMaster™ includes a dynamic dashboard that shows the pass/fail status of each test script as well as requirements coverage, open risks, incident trend analysis, and much more.

The time is now for automated validation testing.  The good news is that there are enterprise level applications on the market that facilitate the full validation lifecycle process.  Why are you still generating manual test scripts?  Automated testing is easier than you think!

Why Are You Still Generating Validation Test Scripts Manually?

Drafting validation test scripts is one of the key activities in a validation exercise, designed to provide documented evidence that a system performs according to its intended use.  The FDA and other global agencies require objective evidence, usually in the form of screenshots that sequentially capture the target software process, to provide assurance that systems can consistently and repeatedly perform the various processes representing the intended use of the system.

Since the advent of the PC, validation engineers have been writing validation test scripts manually.  The manual process of computer systems validation test script development involves capturing screenshots and pasting them into Microsoft Word test script templates.  To generate screen captures, some companies use tools such as Print Screen, TechSmith Snagit, and other such utilities.  A chief complaint of many validation engineers is that the test script development process is slow and arduous.  Some validation engineers are very reluctant to update/re-validate systems due to this manual process.  So, the question posed by this blog article is simply this: “Why are you still generating test scripts manually???”
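To give a sense of how little code the mechanical part of this actually requires, here is a sketch that captures a screenshot per test step and appends it to a Word document.  It assumes the third-party pyautogui and python-docx packages; the step text and file names are placeholders.

```python
import pyautogui              # pip install pyautogui
from docx import Document     # pip install python-docx
from docx.shared import Inches

def capture_step(doc, step_no: int, action: str) -> None:
    """Capture the current screen and append it to the test script document."""
    image_path = f"step_{step_no}.png"
    pyautogui.screenshot(image_path)  # grab the full screen and save it
    doc.add_paragraph(f"Step {step_no}: {action}")
    doc.add_picture(image_path, width=Inches(6))

doc = Document()  # or Document("your_template.docx") to use a company template
doc.add_heading("OQ Test Script - Login Verification", level=1)
capture_step(doc, 1, "Launch the application and display the login screen")
capture_step(doc, 2, "Enter credentials and verify the home page is displayed")
doc.save("oq_test_script.docx")
```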

I have been conducting validation exercises for engineering and life sciences systems since the early 1980s.  I too have experienced first-hand the pain of writing test scripts manually.  We developed and practice “lean validation,” so I sought ways to eliminate manual, wasteful validation processes.  One of the most wasteful processes in validation is the manual capture/cutting/pasting of screenshots into a Microsoft Word document.

The obvious follow-up question is: “how do we capture validation tests in a more automated manner to eliminate waste and create test scripts that are complete, accurate, and provide the level of due diligence required for validation?”

In response to this common problem, we developed an Enterprise Validation Management system called ValidationMaster™.  This system includes TestMaster™, an automated testing system that allows validation engineers to capture and execute validation test scripts in a cost-effective manner.

TestMaster™ is designed to validate ANY enterprise or desktop application.  It is a browser-based system that allows test engineers to open any application on their desktop, launch TestMaster™, and capture their test scripts while sequentially executing the various commands in their applications.  As the validation engineer navigates through the application, TestMaster™ captures each screenshot and text entry entered in the application.

Once the test script is saved, TestMaster™ allows the script to be published in your UNIQUE test script template with the push of a button.  No more cutting/pasting screenshots from manual processes!  You can generate your test scripts in MINUTES as opposed to the hours it sometimes takes to compile documents based on a series of screenshots.  If you are one of those validation engineers who does not like screenshots in your scripts, you can create text-based scripts quickly and easily using TestMaster™.

So, what is the biggest benefit of using TestMaster™ versus manual processes?  There are four key benefits, summarized as follows:

  1.  Automated Test Script Execution – for years, validation engineers have wanted a more automated approach to the execution of validation test scripts.  ValidationMaster™ supports both hands-on and hands-off validation testing.  Hands-on validation testing is the process whereby a validation engineer looks at each step of a validation test script and executes the script step-by-step by clicking through the process.  Hands-off validation allows a validation engineer to execute a test script with no human intervention.  This type of regression testing (hands-off) is very useful for cloud-based systems or systems that require more frequent testing.  The validation engineer simply selects a test script and defines a date/time for its execution.  At the designated time, with no human intervention, the system executes the test script and reports the test results back to the system.  Automated testing is here!  Why are you still doing this manually?

  2.  Traceability – TestMaster™ allows validation engineers to link each test script to a requirement or set of requirements; thus the system delivers automatic traceability, which is a regulatory requirement.  With the click of a button, TestMaster™ allows validation engineers to create a test script directly from a requirement.  This is a powerful capability that allows you to see requirements coverage through our validation dashboard on demand.  This validation dashboard is viewable on a desktop or mobile device (Windows, Apple, Android).

  3.  Online Test Script Execution and Approval – One of the biggest problems with manual test scripts is that they must be printed and manually routed for review and approval.  Some companies that have implemented document management systems may have the ability to route the scripts electronically for review and approval.  The worst-case scenario is the company that has no electronic document management system and generates these documents manually.  TestMaster™ allows validation engineers to execute test scripts online and capture test script results in an easy manner.  The test script results can be captured in an automated way and published into executed test script templates quickly and easily.  If incidents (bugs/anomalies) occur during testing, users have the ability to automatically capture an incident report which is tied to the exact step where the anomaly/bug occurred.  ValidationMaster™ is tightly integrated with a 21 CFR Part 11-compliant portal (ValidationMaster Portal™): once the test script is executed, it is automatically published to the ValidationMaster™ Portal, where it is routed for review/approval in the system.  The ability to draft, route, review, approve, execute, and post-approve validation test scripts is an important time/cost-saving feature that should be a part of any 21st-century validation program.

  4.  Reuse Test Scripts For Regression Testing – manual test scripts are not ‘readily’ reusable.  What I mean by this is that the Word documents must be edited or even re-written for validation regression testing.  Validation is not a one-time process.  Regression testing is a fact of life for validation engineers.  The question is: will you rewrite all of your test scripts, or use automated tools to streamline the process?  ValidationMaster™ allows validation engineers to create a reusable test script library.  This library includes all the test scripts that make up your validation test script package.  During re-validation exercises, you have the ability to reuse the same test scripts for regression testing.

Given the rapid adoption of cloud, mobile, and enterprise technologies in life sciences, a new approach to validation is required.  Yes, you can still conduct validation exercises on paper, but why would you?  In the early days of enterprise technology, we did not have tools available that would facilitate the rapid development of validation test scripts.  Today, that is not the case.  Systems like ValidationMaster™ are available in either a hosted or on-premise environment.  These game-changing systems are revolutionizing the way validation is conducted and offering time/cost-saving features that make this process easier.  So why are you still generating test scripts manually?

Leveraging the NIST Cybersecurity Framework

As a validation engineer, why should you be concerned about cybersecurity?  Good question!  Today’s headlines are filled with instances of cyber attacks and data breaches impacting some of the largest corporate systems around.  As validation engineers, our job is to confirm software quality and that systems meet their intended use.  How can you realistically do this without paying any attention to the threat of potential cyber attacks on validated system environments?

As with every system environment, you must ensure your readiness to help prevent a cyber event from occurring.  Of course, you can never fully protect your systems to the extent that a cyber attack will never be successful, but you can certainly PREPARE and reduce the probability of this risk.   That’s what this article is all about – PREPAREDNESS.

The NIST Cybersecurity Framework was created through collaboration between industry and government and consists of standards, guidelines, and practices to promote the protection of critical infrastructure.

To get a copy of the NIST Cyber Security Framework publication, click here.  If you are not familiar with the NIST Cyber Security Framework, you can view an overview video and get a copy of the Excel spreadsheet.

Remember the old adage, “…if it’s not documented, it didn’t happen…”?  You must document controls, processes, and strategies to ensure that you are able to defend your readiness assessment for cybersecurity.  The NIST Cybersecurity Framework is designed to help organizations view cybersecurity in a systematic way as part of your overall risk management strategy for validated systems.  The Framework consists of three parts:

  1. Framework Core – a set of cybersecurity activities, outcomes, and informative references that are common across your validated systems environments.  The Framework Core consists of five concurrent and continuous Functions – Identify, Protect, Detect, Respond, and Recover – as shown in the figure below.
  2. Framework Profile – helps align your cybersecurity activities with business requirements, risk tolerances, and resources.
  3. Framework Implementation Tiers – a method to view, assess, document, and understand the characteristics of your approach to managing cybersecurity risks in validated systems environments.  This assessment is part of your Cybersecurity Qualification (CyQ).  Life sciences companies should characterize their level of readiness from Partial (Tier 1) to Adaptive (Tier 4).  You can use whatever scale you like in your assessment.

[Figure: NIST Cybersecurity Framework – Identify, Protect, Detect, Respond, Recover]

Most companies are adept at RESPONDING to cyber events rather than preventing them.  This Framework should be part of your overall integrated risk management strategy for validation.  We recommend that validation engineers DOCUMENT their strategy to confirm due diligence with respect to cybersecurity.  In my previous blog post, I recommended that in addition to conducting IQ, OQ, PQ, and UAT testing, you also conduct a CyQ readiness assessment.

Cyber threats are a clear and present danger to companies of all sizes and types.  As validation engineers, we need to rethink our validation strategies and adapt to changes which can have significant impact on our validated systems environments.  Whether you are in the cloud or on-premise, cyber threats are real and may impact you.  This problem is persistent and is not going away anytime soon.  Readiness and preparedness are the key.  Some think that issues concerning cybersecurity are only the purview of the IT team – THINK AGAIN!  Cybersecurity is not only an IT problem; it is an enterprise problem that requires an interdisciplinary approach and a comprehensive governance commitment to ensure that all aspects of your validation processes and business processes are aligned to support effective cybersecurity practices.

If you are responsible for software quality and ensuring the readiness of validated systems, you need to be concerned about this matter.  The threats are real.  The challenges are persistent.  The need for greater diligence is upon us.  Check out the NIST Cybersecurity Framework.  Get your cyber house in order.

 

Computer Systems Validation As We Know It Is DEAD

Over the past 10 years, the software industry has experienced radical changes.  Enterprise applications deployed in the cloud, the Internet of Things (IoT), mobile applications, robotics, artificial intelligence, X-as-a-Service, agile development, cybersecurity challenges, and other technology trends force us to rethink strategies for ensuring software quality.  For over 40 years, validation practices have not changed very much.  Surprisingly, many companies still conduct computer systems validation using paper-based processes.  However, the trends outlined above challenge some of the current assumptions about validation.  I sometimes hear people say “… since I am in the cloud, I don’t have to conduct an IQ…” or “… well, my cloud provider is handling that…”

Issues related to responsibility and testing are changing based on deployment models and development lifecycles.  Validation is designed to confirm that a system meets its intended use.  However, how can we certify that a system meets its intended use if it is left vulnerable to cyber threats?  How can we maintain the validated state over time in production if the cloud environment is constantly changing the validated state?  How can we adequately test computer systems if users can download an “app” from the App Store to integrate with a validated system?  How can we ensure that we are following proper controls for 21 CFR Part 11 if our cloud vendor is not adhering to CSA cloud controls?  How can we test IoT devices connected to validated systems to ensure that they work safely and in accordance with regulatory standards?

You will not find the answers to any of these questions in any regulatory guidance documents.  Technology is moving at the speed of thought yet our validation processes are struggling to keep up.

These questions have led me to conclude that validation as we know it is DEAD.  The challenges imposed by the latest technological advances in agile software development, enterprise cloud applications, IoT, mobility, data integrity, privacy and cybersecurity are forcing validation engineers to rethink current processes.

Gartner recently announced that the share of firms using IoT grew from 29% in 2015 to 43% in 2016, and projects that by the year 2020 there will be over 26 billion IoT devices.  It should be noted that Microsoft’s Azure platform includes a suite of applications for remote monitoring, predictive maintenance, and connected factory monitoring for industrial devices.  Current guidance has not kept pace with ever-changing technology, yet the need for quality in software applications remains a consistent imperative.

So how should validation engineers change processes to address these challenges?

First, consider how your systems are developed and deployed.  The V-model assumes a waterfall approach, yet most software today is developed using Agile methodologies.  It is important to take this into consideration in your validation methodology.

Secondly, I strongly recommend adding two SOPs to your quality procedures – a Cybersecurity SOP for validated computer systems and a Cloud SOP for validated systems.  You will need these two procedures to provide governance for your cloud processes.  (If you do not have a cloud or cybersecurity SOP please contact me and I will send you both SOPs.)

Third, I believe you should incorporate cybersecurity qualification (CyQ) into your testing strategy.  In addition to IQ/OQ/PQ, you should be conducting a CyQ readiness assessment for all validated systems.  A CyQ is an assessment to confirm and document your readiness to protect validated systems against a cyber attack.  It also includes testing to validate current protections for your validated systems.  It is important to note that regulators will judge you on your PROACTIVE approach to compliance.  This is an important step in that direction.

[Figure: Cybersecurity Qualification (CyQ)]

Fourth, you should adopt lean validation methodologies.  Lean validation practices are designed to eliminate waste and inefficiency throughout the validation process while ensuring sustained compliance.

Finally, the time has come for automation.  To keep pace with the changes in current technology discussed above, you MUST include automation for requirements management, validation testing, incident management, and validation quality assurance (CAPA, NC, audit management, training, et al.).  I recommend consideration of an Enterprise Validation Management system such as ValidationMaster™ to support the full lifecycle of computer systems validation.  ValidationMaster™ allows you to build a re-usable test script library and represents a “SINGLE SOURCE OF TRUTH” for all of your validation projects.  Automation of the validation process is no longer a luxury but a necessity.

Advanced technology is moving fast.  The time is now to rethink your validation strategies for the 21st century.  Validation as we know it is dead.  Lean, agile validation processes are needed to keep pace with rapidly changing technology.  As you embrace the latest cloud, mobile, and IoT technologies, you will quickly find that the old ways of validation are no longer sufficient.  Cyber criminals are not going away, so you need to be ready.  Step into LEAN and embrace the future!

 

Validation Testing: Understanding The Why and How

For today’s on-premise and cloud-based systems, validation testing is a required process to ensure that systems are of sufficient quality and operate according to their intended use.  Validation testing is typically done at the end of the development process, after all verification has been completed.  IEEE defines validation as the process of evaluating software to determine whether it satisfies specified requirements.  Therefore, validation testing must be traced to pre-defined requirements.

The goals of validation are pretty clear:

  • Discover errors/anomalies in software prior to production
  • Confirm that systems meet their intended use
  • Confirm that regulatory requirements in the software are met
  • Provide due diligence (documented evidence) for regulators
  • Deliver justification for use of a system

I have had the privilege of working with many life sciences companies over the years and I have seen it all – from ad hoc testing processes to those that are well-defined and mature in their optimization and effectiveness.  Most testing processes are at Level 1, where processes are chaotic and not well-defined.

[Figure: Software Testing Capability Maturity Model]

Automated validation testing processes are essential in today’s life sciences companies, where we are all being asked to do more with less.  It is essential that we establish automated processes to accelerate productivity, eliminate waste, and ensure greater software quality.

The less time spent on the mechanics of test script development, the more time can be dedicated to ensuring software quality.

The software testing capability maturity model should be on your radar.  Establishing automated testing should be a goal for every validation engineer.  It is important to understand how to achieve Level 5 and what it takes from a process perspective to achieve greater testing governance and sustained compliance.

ESTABLISHING A REUSABLE TEST SCRIPT LIBRARY

When conducting validation, the most laborious part of the process is testing.  Validating today’s COTS software applications involves testing the same “out-of-the-box” features over and over again.  Many validation engineers continue to draft test scripts again and again to support this process.  What if you could establish a “reusable test script library” for your validation projects that would allow you to conduct regression testing quickly and easily, without major rewrites, for your applications?  What if you could centrally store this repository for all of your applications so you had a single source of truth for all of your validation projects?  What if you could ensure that your validation test library was “auditable” and could be shared with regulators during audits as part of your objective evidence requirements?  What if each test script had its own audit trail and was traced to its respective requirements for automatic traceability?

The ability to effectively establish and manage a reusable test script library and a single source of truth for all of your validation projects is made possible with the ValidationMaster™ Enterprise Validation Management system.

The system allows you to create, track and manage a reusable test script library quickly and easily.  All of your validation assets are in a single location for reference and reuse.  Intelligence can be quickly gleaned from the system to drive continuous improvement and compliance.  For fully automated scripts that require no human intervention to run, the system has the ability to automate test script execution and reporting of actual results.  This helps to facilitate continuous testing in the cloud and ensure that your systems are maintained in a validated state.

Validation testing is here to stay.  AUTOMATION IS THE KEY!  It is a necessity, not a luxury, to automate your validation processes.  Join us for one of our Automated Testing online web briefings to learn more.
