Can Microsoft Azure Be Validated?

The textbook definition of computer systems validation (paraphrased) is “documented evidence that a system performs according to its intended use”.  The FDA uses the definition “…Validation is a process of demonstrating, through documented evidence, that <software applications> will consistently produce results that meet predetermined specifications and quality attributes.”

Using the FDA’s definition, one should understand that validation is a PROCESS.  It is not a one-time event.  Also, looking at this definition, one should understand that the Agency expects to see “documented evidence” that a system will consistently produce results that meet predetermined specifications and quality attributes.  And finally, from the FDA’s definition of validation, we learn that to properly validate a system, one MUST have “predetermined specifications and quality attributes” defined.

Let’s start at the beginning with the question posed by this blog post: “Can Microsoft Azure® Be Validated?”  The short answer is YES.  In the remainder of this post, I will share six key reasons why this bold statement can be made.  But first, let’s establish some foundation as to what Microsoft Azure® (“Azure®”) is and why this question is relevant.

At its core, Azure is a cloud computing platform and infrastructure created by Microsoft for building, deploying, and managing applications and services through a global network of Microsoft-managed datacenters (Wikipedia).  In today’s global economy, life sciences companies, like many of their industry counterparts, are seeking to do more with less.  These companies are embracing enterprise technologies and leveraging innovation to drive global business and regulatory processes.  If you had asked me 10 years ago how many of the companies I work with were deploying enterprise technologies in the cloud, my answer would have been very few, if any.  If you had further asked me how many cloud validation exercises in life sciences I was conducting, I would have said NONE.

Suffice it to say, cloud computing has been a game-changer for life sciences companies.  It should be noted that, due to regulatory considerations, the life sciences industry has been reluctant to change validation practices.  But the cloud is changing the process of validation as we know it.  Sure, the basic principles of risk-based validation endure.  You STILL have to provide documented evidence that your software applications will consistently produce results that meet predetermined specifications and quality attributes.  HOW you do this is what is changing.  Validating cloud applications is a bit different from validating on-premise applications, mostly due to the changing roles and responsibilities of those responsible for the system and of the cloud providers themselves.  I will say at the outset that not all cloud providers are created equal, so “caveat emptor” (let the buyer beware).

Let’s talk about the six key reasons why Azure can be validated and how this is done.

SIX REASONS WHY AND HOW AZURE CAN BE VALIDATED.

REASON 1 – AZURE RISK ASSESSMENT

Risk management is not a new concept.  It has been around since the 1940s, applied to the military and aerospace industries before being broadened to other fields.  The first crucial step in any computer systems validation project is to conduct a validation risk assessment.  In addition to determining the impact on critical quality attributes, a risk assessment is used to determine the level of validation due diligence required by the system.  How much validation testing you should do for any validation exercise should be based on a risk assessment of the system.  When deploying systems such as Microsoft Dynamics AX® or BatchMaster/GP® on the Azure platform, the risk assessment should include an assessment of the Azure platform itself.  When outsourcing your platform to a vendor such as Microsoft, their risks become your risks; thus, you must conduct your due diligence to ensure that these risks are acceptable to your organization.

The ISPE GAMP 5® (“GAMP 5®”) risk assessment model defines the process for hazard identification, estimation of risk, evaluation of risk, control and monitoring of risks, and documentation of risk.  GAMP 5® describes a 5-step process:

  1. Perform Initial Risk Assessment and Determine System Impact
  2. Identify Functions With Impact On Patient Safety, Product Quality and Data Integrity
  3. Perform Functional Risk Assessment and Identify Controls
  4. Implement and Verify Appropriate Controls
  5. Review Risks and Monitor Controls

GAMP 5® does not explicitly address cloud infrastructure.  One could extrapolate how the standard might be applied to cloud infrastructure, but the lack of a “clean” translation is one of its limitations.  Azure could be treated as a Commercial-off-the-Shelf (COTS) platform.  You will note that the risk process identified by GAMP 5® is more application-focused than platform-focused.  To conduct a proper risk assessment of Azure, a new framework for assessing the platform itself must be established.  In July 2011, ISACA, previously known as the Information Systems Audit and Control Association, released a document called “IT Control Objectives for Cloud Computing: Controls and Assurance in the Cloud,” which highlighted specific control objectives and risks inherent in cloud computing.  From a validation perspective, you need to address availability/reliability risks, performance/capacity risks, as well as security and cybersecurity risks.  Using the publicly available independent service organization controls (SOC 1/2/3) reports, you should be able to effectively document and assess these risks.  Remember, if it’s not documented, it didn’t happen!
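To make this concrete, here is a minimal sketch (in Python) of how a platform risk register of this kind might be scored.  The risk categories come straight from the discussion above; the 1–5 scales, the severity-times-likelihood scoring, and the example entries are illustrative assumptions of mine, not part of GAMP 5® or the ISACA guidance.

```python
from dataclasses import dataclass

@dataclass
class PlatformRisk:
    category: str      # e.g., availability, performance, security
    description: str
    severity: int      # 1 (negligible) .. 5 (critical impact on quality/data integrity)
    likelihood: int    # 1 (rare) .. 5 (frequent)

    @property
    def priority(self) -> int:
        """Simple risk priority number: severity x likelihood."""
        return self.severity * self.likelihood

    @property
    def rating(self) -> str:
        # Illustrative thresholds; your SOPs may define different ones.
        if self.priority >= 15:
            return "HIGH"
        if self.priority >= 8:
            return "MEDIUM"
        return "LOW"

# Hypothetical entries for an Azure platform assessment.
register = [
    PlatformRisk("availability", "Regional outage affects hosted ERP", 4, 2),
    PlatformRisk("performance", "Capacity constraints during peak batch processing", 3, 3),
    PlatformRisk("security", "Unauthorized access to validated environment", 5, 2),
]

for risk in sorted(register, key=lambda r: r.priority, reverse=True):
    print(f"{risk.rating:6} ({risk.priority:2}) {risk.category}: {risk.description}")
```

Scored and documented this way, the register becomes part of the “documented evidence” the Agency expects to see.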

REASON 2 – CHANGE CONTROL (“THE ELEPHANT IN THE ROOM”)

One of the biggest challenges with companies’ adoption of cloud computing is CHANGE CONTROL.  As you may well be aware, cloud environments are under constant change.  In the validation world, change control is an essential process that helps ensure validated systems remain in a validated state.  So how do you maintain the validated state when a platform such as Azure is in a constant state of change?  First, all validated systems are subject to change control.  Each time a system is changed, the change, the reason for the change, and who made the change must be documented.  When the system is on-premise, changes are documented (in general) by the IT department, which owns responsibility for the infrastructure in that scenario.  In a cloud environment, this responsibility belongs to the cloud provider.  SOC reports highlight and define change control processes and procedures for the cloud environment.
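As a simple illustration of the minimum a change record should capture — the change, the reason, and who made and approved it — here is a sketch of an append-only change log.  The field names and JSON-lines format are my own illustrative choices, not a prescribed or regulatory format.

```python
import json
from datetime import datetime, timezone

def log_change(logfile, system, description, reason, changed_by, approved_by):
    """Append one change-control entry to an append-only JSON-lines audit log."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "system": system,
        "description": description,   # what was changed
        "reason": reason,             # why it was changed
        "changed_by": changed_by,     # who made the change
        "approved_by": approved_by,   # who approved it
    }
    with open(logfile, "a") as f:
        f.write(json.dumps(entry) + "\n")

# Hypothetical example entry.
log_change(
    "change_log.jsonl",
    system="Dynamics AX (Azure)",
    description="Applied platform security update",
    reason="Vendor-released fix for reported vulnerability",
    changed_by="cloud provider",
    approved_by="QA manager",
)
```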

We recommend that our clients obtain a copy of these reports under non-disclosure to confirm the compliance of the cloud provider.  Once in the cloud, we recommend developing a testing and regression testing strategy that tests systems in the cloud environment on a more frequent basis.  In the ’80s and ’90s, validation engineers were very reluctant to change validated systems due to the complexity and level of effort involved in testing them using manual processes.  Today, companies are leveraging tools like OnShore’s ValidationMaster™ to facilitate automated testing, allowing validation test engineers to schedule the run of automated test scripts in a cloud environment.  ValidationMaster™ takes the complexity out of regression testing and frees up resources for other work.  Change control is essential in both on-premise and cloud environments.  To support the cloud, you will need to establish clear policies and procedures that address both on-premise and cloud computing systems, and it is very important that your change control policies reflect procedures for the cloud environment.
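ValidationMaster™’s scheduler is proprietary, but the underlying idea — unattended, scheduled execution of a regression suite — can be sketched with generic tools.  The following stdlib-only sketch assumes a pytest-based regression suite at a hypothetical path; the schedule, paths, and report file name are illustrative only.

```python
import subprocess
import time
from datetime import datetime, timedelta

def run_regression_suite():
    """Run the automated regression suite unattended and record the outcome."""
    result = subprocess.run(
        ["pytest", "tests/regression", "--junitxml=results.xml"],
        capture_output=True, text=True,
    )
    print(f"{datetime.now().isoformat()} regression run exit code: {result.returncode}")

# Fire the suite once per day at a fixed time, with no human intervention.
RUN_AT_HOUR = 2  # 02:00 local time
while True:
    now = datetime.now()
    next_run = now.replace(hour=RUN_AT_HOUR, minute=0, second=0, microsecond=0)
    if next_run <= now:
        next_run += timedelta(days=1)
    time.sleep((next_run - now).total_seconds())
    run_regression_suite()
```

The design point is simply that a scheduled, repeatable run replaces the ad-hoc manual re-test, which is what makes frequent regression testing in a changing cloud environment practical.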

Microsoft has established clear controls for Azure to manage change in its environment.  These procedures have been documented and tested, as per the SOC 1/SOC 2 control reports issued by independent auditors.  When validating systems in an Azure environment, you will have to obtain copies of these reports and include them with your validation package.  The supplier audit becomes crucial in a hosted environment.  The good news is that Microsoft is a reputable supplier and can be relied on as a stable vendor with respect to its products.

Microsoft Azure has undergone many independent audits, including a 2013 ISO/IEC 27001:2005 audit that confirms Microsoft has procedures in place to govern change management processes.  The audit report highlights the methodology and confirms that changes to the Azure environment are appropriately tested and approved.

REASON 3 – INDEPENDENT AUDIT CONFIRMATION OF TECHNICAL/PROCEDURAL CONTROLS AND SECURITY

In cloud environments, the role of the cloud provider is paramount, and not all cloud providers are created equal.  Microsoft Azure has passed many audits, and these reports, as previously noted, are available upon request.  Security and cybersecurity issues are very important in the Azure environment.  Microsoft has implemented controls and procedures to ensure compliance.  The company provides detailed information on security on its website (https://www.microsoft.com/en-us/TrustCenter/Security/default.aspx).

REASON 4 – INFRASTRUCTURE QUALIFICATION (IQ)

IQ testing changes in the Azure environment.  In the Azure environment, Microsoft bears some responsibility for the installation of hardware and software.  As previously mentioned, it is important to qualify the environment once it is set up to confirm that what is supposed to be installed is actually present in your environment.  We qualify the virtual software layer as well as Microsoft Lifecycle Services.
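At the application layer, part of this qualification can be expressed as a simple automated check of installed components against an approved baseline.  The sketch below is illustrative: the component names and versions are hypothetical, and a real Azure IQ would also cover vendor-managed infrastructure elements documented in the SOC reports.

```python
import importlib.metadata

# Approved baseline: component -> expected version (hypothetical entries).
APPROVED_BASELINE = {
    "requests": "2.31.0",
    "sqlalchemy": "2.0.25",
}

def qualify_installation(baseline):
    """Compare installed component versions against the approved IQ baseline."""
    results = []
    for component, expected in baseline.items():
        try:
            installed = importlib.metadata.version(component)
        except importlib.metadata.PackageNotFoundError:
            installed = None
        status = "PASS" if installed == expected else "FAIL"
        results.append((component, expected, installed, status))
    return results

for component, expected, installed, status in qualify_installation(APPROVED_BASELINE):
    print(f"{status}: {component} expected {expected}, found {installed}")
```

The printed PASS/FAIL output, attached to the IQ protocol, provides documented evidence that what is supposed to be installed is present.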

REASON 5 – OPERATIONAL/PERFORMANCE QUALIFICATION (OQ/PQ)

The process of operational qualification of the application in a cloud environment is very similar to the process in an on-premise environment.  Microsoft Dynamics AX can now easily be deployed in an Azure environment, and validation testing of this type of implementation follows the same approach, with some variation.  The installation of Microsoft Dynamics AX uses what is called Lifecycle Services, which allows you to set up and install the system rapidly using automation.  Lifecycle Services may be subject to validation itself, since it is used to establish a validated systems environment.  You will need to identify the failure of Lifecycle Services as a risk and rate it as part of the risk assessment.

Lifecycle Services is promoted as bringing predictability and repeatability to the process of software installation.  These are principles upon which validation has stood for many years.  However, it is important to go beyond the marketing to verify that this software layer is performing according to its intended use.  We typically draft a qualification script to qualify Lifecycle Services prior to using it in a validated systems environment.  We conduct OQ/PQ testing in the same manner as in a non-hosted environment.

REASON 6 – ABILITY TO MAINTAIN THE VALIDATED STATE

With Microsoft Azure, the need to maintain the validated state does not go away.  It is entirely possible to maintain mission-critical applications hosted on the Azure platform in a validated state.  To do so, we recommend establishing policies and procedures and adopting the test-early/test-often premise to ensure that changes made to the infrastructure do not negatively impact the validated production systems environment.  Maintaining the validated state on Azure must be done with a combination of technology, processes, and procedures to ensure compliance.

As a validation engineer, I have validated many systems on the Azure platform.  I have concluded that the Azure platform can be validated and used within the life sciences environment.  How much validation due diligence you conduct should be based on risk.  Don’t overdo it!  Use lean methodologies to drive your validation processes.

Upon close inspection of this platform, it has built-in quality and security controls that provide a level of assurance of its suitability for deployment in regulated environments.  I like the fact that the system has been independently inspected by numerous third-party organizations to help ensure objectivity.  This platform is ready for prime time, and many of my clients are leveraging its benefits and freeing themselves from the burden of managing their own infrastructure.

Automating Validation Testing: It’s Easier Than You Think

Automated validation testing has been elusive for many in the validation community.  There have been many “point solutions” on the market that address the creation, management, and execution of validation tests.  However, what most validation engineers want is TRULY AUTOMATED validation testing that will interrogate an application in a rigorous manner and report results in a way that not only provides objective evidence of pass/fail criteria but also highlights each point of failure.

In the 1980s, when I was conducting validation exercises for mini- and mainframe computers and drafting test scripts in a very manual way, I often envisioned a time when I would be able to conduct validation testing in a more automated fashion.  Most validation engineers work in an environment where they are asked to do more with less; thus, the need for such a tool is profound.

Cloud computing environments, mobility, cybersecurity, and data integrity imperatives make it essential that we test applications more thoroughly today.  Yet the burden of manual testing persists.  If I could share with you five key features of an ideal automated testing system, they would include the following:

  • Automated test script procedure capture and development
  • Automated Requirements Traceability
  • Fully Automated Validation Test Script Execution
  • Automated Incident Capture and Management
  • Ability to Support Continuous Testing in the Cloud

In most validation exercises I have participated in, validation testing was the most laborious part of the exercise.  Automated testing is easier than you think.

For example, ValidationMaster™ includes an automated test engine that captures each step of your qualification procedure and annotates the step with details of the action performed.

Test cases can be routed for review and pre-approval within the system quickly and easily through DocuSign.  Test case execution can be conducted online, and a dynamic dashboard reports how many test scripts have passed, how many have failed, and which ones passed with exception.  Once test scripts have been executed, they may be routed for post-approval and signed.

Within the ValidationMaster™ system, you can create a reusable test script library to support future regression testing efforts.  The system allows users to link requirements to test cases thus facilitating the easy generation of forward and reverse trace matrices.  Exporting documents in your UNIQUE format is a snap within the system.  Thus, you can consistently comply with your internal document procedures.

ValidationMaster™ supports fully automated validation testing, allowing users to set a date/time for testing.  Test scripts are run AUTOMATICALLY, without human intervention, and the same scripts can be run multiple times if necessary.

Continuous testing in a cloud environment is ESSENTIAL.  You must have the ability to respond to rapid changes in a cloud environment that may impact the validated state of the system.  Continuous testing reduces risk and ensures sustained compliance in a cloud environment.

The system automatically raises an incident report if a bug is encountered during automated testing.  The system keeps track of each test run and its results through automation.  ValidationMaster™ includes a dynamic dashboard that shows the pass/fail status of each test script as well as requirements coverage, open risks, incident trend analysis, and much more.

The time is now for automated validation testing.  The good news is that there are enterprise level applications on the market that facilitate the full validation lifecycle process.  Why are you still generating manual test scripts?  Automated testing is easier than you think!

Why Are You Still Generating Validation Test Scripts Manually?

Drafting validation scripts is one of the key activities in a validation exercise, designed to provide documented evidence that a system performs according to its intended use.  The FDA and other global agencies require objective evidence, usually in the form of screenshots that sequentially capture the target software process, to provide assurance that systems can consistently and repeatedly perform the various processes representing the intended use of the system.

Since the advent of the PC, validation engineers have been writing validation test scripts manually.  The manual process of computer systems validation test script development involves capturing screenshots and pasting them into Microsoft Word test script templates.  To generate screen captures, some companies use the Windows Print Screen function, TechSmith Snagit, and other such tools.  A chief complaint of many validation engineers is that the test script development process is a slow, arduous one.  Some validation engineers are very reluctant to update/re-validate systems due to this manual process.  So, the question posed by this blog article is simply this: “Why are you still generating test scripts manually?”

I have been conducting validation exercises for engineering and life sciences systems since the early 1980s.  I too have experienced first-hand the pain of writing test scripts manually.  We developed and practice “lean validation,” so I sought ways to eliminate manual, wasteful validation processes.  One of the most wasteful processes in validation is the manual capture/cutting/pasting of screenshots into a Microsoft Word document.

The obvious follow-up question is: “How do we capture validation tests in a more automated manner to eliminate waste and create test scripts that are complete, accurate, and provide the level of due diligence required for validation?”

In response to this common problem, we developed an Enterprise Validation Management system called ValidationMaster™.  This system includes TestMaster™, an automated testing system that allows validation engineers to capture and execute validation test scripts in a cost-effective manner.

TestMaster™ is designed to validate ANY enterprise or desktop application.  It is a browser-based system that allows test engineers to open any application on their desktop, launch TestMaster™, and capture their test scripts while sequentially executing the various commands in their applications.  As the validation engineer navigates through the application, TestMaster™ captures each screenshot and text entry made in the application.

Once the test script is saved, TestMaster™ allows the script to be published in your UNIQUE test script template with the push of a button.  No more cutting and pasting screenshots by hand!  You can generate your test scripts in MINUTES, as opposed to the hours it sometimes takes to compile documents from a series of screenshots.  If you are one of those validation engineers who does not like screenshots in scripts, you can create text-based scripts just as quickly and easily using TestMaster™.
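TestMaster™’s internals are proprietary, but the general technique — programmatic screen capture stitched directly into a test script document — can be illustrated with the open-source pyautogui and python-docx packages.  The step descriptions and file names below are hypothetical.

```python
# pip install pyautogui python-docx
import pyautogui
from docx import Document
from docx.shared import Inches

def capture_step(doc, step_number, action_description):
    """Capture the current screen and append it to the test script document."""
    image_path = f"step_{step_number}.png"
    pyautogui.screenshot(image_path)          # grab the full screen to a file
    doc.add_paragraph(f"Step {step_number}: {action_description}")
    doc.add_picture(image_path, width=Inches(6))

doc = Document()
doc.add_heading("OQ Test Script - Evidence Capture", level=1)
capture_step(doc, 1, "Log in to the application under test")
capture_step(doc, 2, "Navigate to the batch release screen")
doc.save("executed_test_script.docx")
```

Even this bare-bones sketch removes the cut/paste step entirely: the evidence lands in the document the moment the action is performed.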

So, what are the biggest benefits of using TestMaster™ versus manual processes?  There are four key benefits, summarized as follows:

  1.  Automated Test Script Execution – for years, validation engineers have wanted a more automated approach to the execution of validation test scripts.  ValidationMaster™ supports both hands-on and hands-off validation testing.  Hands-on validation testing is the process whereby a validation engineer looks at each step of a validation test script and executes the script step-by-step by clicking through the process.  Hands-off validation allows a validation engineer to execute a test script with no human intervention.  This type of regression testing (hands-off) is very useful for cloud-based systems or systems that require more frequent testing.  The validation engineer simply selects a test script and defines a date/time for its execution.  At the designated time, without human intervention, the system executes the test script and reports the results back to the system.  Automated testing is here!  Why are you still doing this manually?

  2.  Traceability – TestMaster™ allows validation engineers to link each test script to a requirement or set of requirements; thus the system delivers automatic traceability, which is a regulatory requirement.  With the click of a button, TestMaster™ allows validation engineers to create a test script directly from a requirement.  This is a powerful capability that allows you to see requirements coverage on demand through our validation dashboard, which is viewable on a desktop or mobile device (Windows, Apple, Android).  (See the trace matrix sketch after this list.)

  3.  Test Script Execution – one of the biggest problems with manual test scripts is that they must be printed and manually routed for review and approval.  Some companies that have implemented document management systems may be able to route the scripts electronically for review and approval; the worst-case scenario is the company that has no electronic document management system and generates these documents manually.  TestMaster™ allows validation engineers to execute test scripts online and capture test script results with ease.  The results can be captured in an automated way and published into executed test script templates quickly and easily.  If incidents (bugs/anomalies) occur during testing, users can automatically capture an incident report tied to the exact step where the anomaly/bug occurred.  ValidationMaster™ is tightly integrated with a 21 CFR Part 11-compliant portal (ValidationMaster Portal™).  Once the test script is executed, it is automatically published to the ValidationMaster™ Portal, where it is routed for review/approval.  The ability to draft, route, review, approve, execute, and post-approve validation test scripts is an important time- and cost-saving feature that should be part of any 21st-century validation program.

  4.  Reuse Test Scripts for Regression Testing – manual test scripts are not ‘readily’ reusable.  What I mean by this is that the Word documents must be edited or even re-written for validation regression testing.  Validation is not a one-time process, and regression testing is a fact of life for validation engineers.  The question is: will you rewrite all of your test scripts, or use automated tools to streamline the process?  ValidationMaster™ allows validation engineers to create a reusable test script library that includes all the test scripts making up your validation test script package.  During re-validation exercises, you can reuse the same test scripts for regression testing.
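Here is the trace matrix sketch promised in the traceability item above.  Generating forward and reverse trace matrices from requirement-to-test-case links is a simple transformation; the requirement and test case IDs below are invented for illustration.

```python
# Requirements-to-test-case links (illustrative data).
links = [
    ("REQ-001", "TC-101"),
    ("REQ-001", "TC-102"),
    ("REQ-002", "TC-103"),
    ("REQ-003", None),  # uncovered requirement
]

# Forward trace: requirement -> test cases.
forward = {}
for req, tc in links:
    forward.setdefault(req, [])
    if tc:
        forward[req].append(tc)

print("Forward trace matrix:")
for req, tcs in sorted(forward.items()):
    coverage = ", ".join(tcs) if tcs else "NOT COVERED"
    print(f"  {req}: {coverage}")

# Reverse trace: test case -> requirements.
reverse = {}
for req, tc in links:
    if tc:
        reverse.setdefault(tc, []).append(req)

print("Reverse trace matrix:")
for tc, reqs in sorted(reverse.items()):
    print(f"  {tc}: {', '.join(reqs)}")
```

The payoff is immediate: uncovered requirements (like REQ-003 above) surface automatically instead of being discovered during an audit.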

Given the rapid adoption of cloud, mobile, and enterprise technologies in life sciences, a new approach to validation is required.  Yes, you can still conduct validation exercises on paper, but why would you?  In the early days of enterprise technology, we did not have tools available to facilitate the rapid development of validation test scripts.  Today, that is not the case.  Systems like ValidationMaster™ are available in either a hosted or on-premise environment.  These game-changing systems are revolutionizing the way validation is conducted and offer time- and cost-saving features that make the process easier.  So why are you still generating test scripts manually?

Leveraging the NIST Cybersecurity Framework

As a validation engineer, why should you be concerned about cybersecurity?  Good question!  Today’s headlines are filled with instances of cyber attacks and data breaches impacting some of the largest corporate systems around.  As validation engineers, our job is to confirm software quality and verify that systems meet their intended use.  How can you realistically do this without paying any attention to the threat of potential cyber attacks on your validated systems environment?

As with every system environment, you must ensure your readiness to help prevent a cyber event from occurring.  Of course, you can never fully protect your systems to the extent that a cyber attack will never be successful, but you can certainly PREPARE and reduce the probability of this risk.   That’s what this article is all about – PREPAREDNESS.

The NIST Cybersecurity Framework was created through collaboration between industry and government and consists of standards, guidelines, and practices to promote the protection of critical infrastructure.

To get a copy of the NIST Cyber Security Framework publication, click here.  If you are not familiar with the NIST Cyber Security Framework, you can view an overview video and get a copy of the Excel spreadsheet.

Remember the old adage, “…if it’s not documented, it didn’t happen…”?  You must document controls, processes, and strategies so that you are able to defend your cybersecurity readiness assessment.  The NIST Cyber Security Framework is designed to help organizations view cybersecurity in a systematic way as part of an overall risk management strategy for validated systems.  The Framework consists of three parts:

  1. Framework Core – a set of cybersecurity activities, outcomes, and informative references that are common across your validated systems environments.  The Framework Core consists of five concurrent and continuous Functions: (1) Identify, (2) Protect, (3) Detect, (4) Respond, and (5) Recover, as shown in the figure below.
  2. Framework Profile – helps align your cybersecurity activities with business requirements, risk tolerances, and resources
  3. Framework Implementation Tiers – a method to view, assess, document, and understand the characteristics of your approach to managing cybersecurity risks in validated systems environments.  This assessment is part of your Cybersecurity Qualification (CyQ).  Life sciences companies should characterize their level of readiness from Partial (Tier 1) to Adaptive (Tier 4), although you can use whatever scale you like in your assessment.  (See the readiness sketch below the figure.)

[Figure: The NIST Cybersecurity Framework Core – Identify, Protect, Detect, Respond, Recover]
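As a rough illustration of the Implementation Tiers idea, here is a sketch of a CyQ-style readiness self-assessment that scores each Framework Core function on the Tier 1–4 scale.  The scores and the averaging are my own illustrative assumptions; only the Functions and Tier names come from the NIST publication.

```python
# The five Framework Core Functions and four Tiers come from the NIST publication;
# the example scores and averaging below are illustrative assumptions.
FUNCTIONS = ["Identify", "Protect", "Detect", "Respond", "Recover"]
TIERS = {1: "Partial", 2: "Risk Informed", 3: "Repeatable", 4: "Adaptive"}

# Hypothetical self-assessment scores for a validated systems environment.
assessment = {"Identify": 3, "Protect": 2, "Detect": 2, "Respond": 3, "Recover": 1}

for function in FUNCTIONS:
    tier = assessment[function]
    print(f"{function:9} Tier {tier} ({TIERS[tier]})")

overall = sum(assessment.values()) / len(assessment)
print(f"Overall readiness: {overall:.1f} (goal: trend toward Tier 4, Adaptive)")
```

Documenting even a simple scored assessment like this is what turns “we thought about cybersecurity” into defensible evidence of due diligence.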

Most companies are adept at RESPONDING to cyber events rather than preventing them.  The Framework should be part of your overall integrated risk management strategy for validation.  We recommend that validation engineers DOCUMENT their cybersecurity strategy to confirm due diligence.  In my previous blog post, I recommended that, in addition to conducting IQ, OQ, PQ, and UAT testing, you also conduct a CyQ readiness assessment.

Cyber threats are a clear and present danger to companies of all sizes and types.  As validation engineers, we need to rethink our validation strategies and adapt to changes that can have a significant impact on our validated systems environments.  Whether you are in the cloud or on-premise, cyber threats are real and may impact you.  This problem is persistent and is not going away anytime soon.  Readiness and preparedness are key.  Some think that issues concerning cybersecurity are only the purview of the IT team – THINK AGAIN!  Cybersecurity is not only an IT problem; it is an enterprise problem that requires an interdisciplinary approach and a comprehensive governance commitment to ensure that all aspects of your validation and business processes are aligned to support effective cybersecurity practices.

If you are responsible for software quality and ensuring the readiness of validated systems, you need to be concerned about this matter.  The threats are real.  The challenges are persistent.  The need for greater diligence is upon us.  Check out the NIST Cyber Security Framework.  Get your cyber house in order.

