The textbook definition of computer systems validation (paraphrased) is “documented evidence that a system performs according to its intended use.” The FDA uses the definition “…Validation is a process of demonstrating, through documented evidence, that <software applications> will consistently produce results that meet predetermined specifications and quality attributes.”
Using the FDA’s definition, one should understand that validation is a PROCESS, not a one-time event. The definition also makes clear that the Agency expects to see “documented evidence” that a system will consistently produce results that meet predetermined specifications and quality attributes. And finally, it tells us that to properly validate a system, one MUST have “predetermined specifications and quality attributes” defined.
Let’s start at the beginning with the question posed by this blog post: “Can Microsoft Azure® Be Validated?” The short answer is YES. In the remainder of this post, I will share six key reasons why this bold statement can be made. But first, let’s establish some foundation as to what Microsoft Azure® (“Azure®”) is and why this question is relevant.
At its core, Azure is a cloud computing platform and infrastructure created by Microsoft for building, deploying, and managing applications and services through a global network of Microsoft-managed datacenters (Wikipedia). In today’s global economy, life sciences companies, like many of their industry counterparts, are seeking to do more with less. These companies are embracing enterprise technologies and leveraging innovation to drive global business and regulatory processes. If you had asked me 10 years ago how many of the companies I work with were deploying enterprise technologies in the cloud, my answer would have been very few, if any. If you had further asked how many cloud validation exercises in life sciences I was conducting, I would have said NONE.
Suffice it to say, cloud computing has been a game-changer for life sciences companies. It should be noted that, due to regulatory considerations, the life sciences industry has been reluctant to change validation practices. But the cloud is changing the process of validation as we know it. Sure, the basic principles of risk-based validation endure: you STILL have to provide documented evidence that your software applications will consistently produce results that meet predetermined specifications and quality attributes. HOW you do this is what is changing. Validating cloud applications is a bit different from validating on-premises applications, mostly due to the changing roles and responsibilities of those responsible for the system and of the cloud providers themselves. I will say at the outset that not all cloud providers are created equal, so “caveat emptor” (let the buyer beware).
Let’s talk about the six key reasons why Azure can be validated and how this is done.
SIX REASONS WHY AND HOW AZURE CAN BE VALIDATED.
REASON 1 – AZURE RISK ASSESSMENT
Risk management is not a new concept. It has been around since the 1940s, when it was applied in the military and aerospace industries before being broadened to other fields. The first crucial step in any computer systems validation project is to conduct a validation risk assessment. In addition to determining the impact on critical quality attributes, a risk assessment is used to determine the level of validation due diligence required for the system. How much validation testing you should do for any validation exercise should be based on a risk assessment of the system. When deploying systems such as Microsoft Dynamics AX® or BatchMaster/GP® on the Azure platform, the risk assessment should include an assessment of the Azure platform itself. When outsourcing your platform to a vendor such as Microsoft, their risks become your risks; thus, you must conduct due diligence to ensure that these risks are acceptable to your organization.
The ISPE GAMP 5® (“GAMP 5®”) risk assessment model defines the process for hazard identification, estimation of risk, evaluation of risk, control and monitoring of risks, and documentation of risk. GAMP 5® describes a five-step process:
- Perform Initial Risk Assessment and Determine System Impact
- Identify Functions With Impact On Patient Safety, Product Quality and Data Integrity
- Perform Functional Risk Assessment and Identify Controls
- Implement and Verify Appropriate Controls
- Review Risks and Monitor Controls
GAMP 5® does not explicitly address cloud infrastructure. One could extrapolate how the standard may be applied to cloud infrastructure, but the lack of a “clean” translation is one of its limitations; Azure could be treated as a Commercial-off-the-Shelf (COTS) platform. You will note that the risk process identified by GAMP 5® is more application-focused than platform-focused, so to conduct a proper risk assessment of Azure, a new framework must be established for the platform. In July 2011, ISACA, previously known as the Information Systems Audit and Control Association, released a document called “IT Control Objectives for Cloud Computing: Controls and Assurance in the Cloud,” which highlighted specific control objectives and risks inherent in cloud computing. From a validation perspective, you need to address availability/reliability risks, performance/capacity risks, and security and cybersecurity risks. Using the publicly available independent service organization controls (SOC 1/2/3) reports, you should be able to effectively document and assess these risks. Remember, if it’s not documented, it didn’t happen!
REASON 2 – CHANGE CONTROL (“THE ELEPHANT IN THE ROOM”)
One of the biggest challenges with companies’ adoption of cloud computing is CHANGE CONTROL. As you may well be aware, cloud environments are under constant change. In the validation world, change control is an essential process for ensuring that validated systems remain in a validated state. So how do you maintain the validated state when a platform such as Azure is in a constant state of change? First, all validated systems are subject to change control: each time a system is changed, the change, the reason for it, and who made it must be documented. When the system is on-premises, changes are generally documented by the IT department, which owns responsibility for the infrastructure. In a cloud environment, this responsibility belongs to the cloud provider. SOC reports highlight and define the change control processes and procedures for the cloud environment.
We recommend that our clients obtain a copy of these reports under non-disclosure to confirm the cloud provider’s compliance. Once in the cloud, we recommend developing a testing and regression testing strategy that exercises systems on a more frequent basis. In the ’80s and ’90s, validation engineers were very reluctant to change validated systems due to the complexity and level of effort involved in testing them with manual processes. Today, companies are leveraging tools like OnShore’s ValidationMaster™ to facilitate automated testing, allowing validation test engineers to schedule runs of automated test scripts in a cloud environment. ValidationMaster™ takes the complexity out of regression testing and frees up resources for other work. Change control is essential in both on-premises and cloud environments. To support the cloud, you will need to establish clear policies and procedures that cover both on-premises and cloud computing systems; with respect to change control, it is very important that your policies reflect procedures in a cloud environment.
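The essence of such an automated regression run can be sketched in a few lines: execute predetermined test cases against the system and capture who/what/when for the validation record. The function under test, the test case names, and the runner identity below are hypothetical placeholders; a commercial tool such as ValidationMaster™ would handle scheduling, execution, and reporting at scale.

```python
# Minimal sketch of an automated regression run that produces a
# documented-evidence log. The system function, test cases, and runner
# identity are hypothetical examples, not part of any vendor's product.
import json
from datetime import datetime, timezone

def convert_lot_size(kg: float) -> float:
    """Hypothetical system function under test: convert kilograms to grams."""
    return kg * 1000

TEST_CASES = [
    ("TC-001 lot size conversion", lambda: convert_lot_size(2.5) == 2500.0),
    ("TC-002 zero lot size",       lambda: convert_lot_size(0.0) == 0.0),
]

def run_regression() -> list:
    """Execute each predetermined case and record traceable evidence."""
    results = []
    for name, check in TEST_CASES:
        results.append({
            "test": name,
            "result": "PASS" if check() else "FAIL",
            "executed_at": datetime.now(timezone.utc).isoformat(),
            "executed_by": "automated-runner",  # traceability: who ran it
        })
    return results

print(json.dumps(run_regression(), indent=2))
```

Scheduling a run like this after every platform change notice turns the provider’s constant change into routine, documented re-verification rather than a compliance emergency.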
Microsoft has established clear controls for Azure to manage change in its environment. These procedures have been documented and tested per the SOC 1/SOC 2 control reports issued by independent auditors. When validating systems in an Azure environment, you will have to obtain copies of these reports and include them with your validation package. The supplier audit becomes crucial in a hosted environment. The good news is that Microsoft is a reputable supplier and can be relied on as a stable vendor with respect to its products.
Microsoft Azure has undergone many independent audits, including a 2013 ISO/IEC 27001:2005 audit that confirms Microsoft has procedures in place providing governance for change management. The audit report highlights the methodology and confirms that changes to the Azure environment are appropriately tested and approved.
REASON 3 – INDEPENDENT AUDIT CONFIRMATION OF TECHNICAL/PROCEDURAL CONTROLS AND SECURITY
In cloud environments, the role of the cloud provider is paramount, and not all cloud providers are created equal. Microsoft Azure has passed many audits, and these reports, as previously noted, are available upon request. Security and cybersecurity issues are very important in the Azure environment, and Microsoft has implemented controls and procedures to ensure compliance. The company provides detailed security information on its website (https://www.microsoft.com/en-us/TrustCenter/Security/default.aspx).
REASON 4 – INFRASTRUCTURE QUALIFICATION (IQ)
IQ testing changes in the Azure environment. There, Microsoft bears some of the responsibility for the installation of hardware and software. As previously mentioned, it is important to qualify the environment once it is set up to confirm that everything that is supposed to be installed is present in your environment. We qualify the virtual software layer as well as Microsoft Lifecycle Services.
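At its core, an IQ check compares what is actually present in the environment against the predetermined specification, component by component. The sketch below illustrates that comparison; the component names and version strings are hypothetical placeholders, and in practice the “installed” inventory would be gathered from the environment itself rather than declared in code.

```python
# Sketch of an infrastructure qualification (IQ) check: verify the
# environment against a predetermined specification. Component names
# and versions are hypothetical placeholders for illustration.

EXPECTED = {
    "os": "Windows Server 2012 R2",
    "sql_server": "11.0.5058",
    "dynamics_ax_kernel": "6.3.164.0",
}

def run_iq(installed: dict) -> list:
    """Return one PASS/FAIL record per component in the specification."""
    records = []
    for component, expected_version in EXPECTED.items():
        actual = installed.get(component, "NOT INSTALLED")
        records.append({
            "component": component,
            "expected": expected_version,
            "actual": actual,
            "result": "PASS" if actual == expected_version else "FAIL",
        })
    return records

# Example run against a (simulated) inventory of the provisioned environment.
inventory = {
    "os": "Windows Server 2012 R2",
    "sql_server": "11.0.5058",
    "dynamics_ax_kernel": "6.3.164.0",
}
for rec in run_iq(inventory):
    print(rec["component"], rec["result"])
```

The executed record, with expected versus actual values for every component, becomes part of the IQ evidence in the validation package.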
REASON 5 – OPERATIONAL/PERFORMANCE QUALIFICATION (OQ/PQ)
The process of operational qualification of the application in a cloud environment is very similar to the process in an on-premises environment. Microsoft Dynamics AX can now easily be deployed in an Azure environment, and validation testing of this type of implementation follows the same approach, with some variation. The installation of Microsoft Dynamics AX uses Lifecycle Services, which allows you to set up and install the system rapidly using automation. Lifecycle Services may be subject to validation itself, since it is used to establish a validated systems environment. You will need to identify the failure of Lifecycle Services as a risk and rate the risk of using this service as part of the risk assessment.
Lifecycle Services is promoted as bringing predictability and repeatability to the process of software installation. These are principles upon which validation has stood for many years. However, it is important to go beyond the marketing to verify that this software layer is performing according to its intended use. We typically draft a qualification script to qualify Lifecycle Services prior to using it in a validated systems environment. We conduct OQ/PQ testing in the same manner as in a non-hosted environment.
REASON 6 – ABILITY TO MAINTAIN THE VALIDATED STATE
With Microsoft Azure, the need to maintain the validated state does not go away, but it is entirely possible to keep mission-critical applications hosted on the Azure platform in a validated state. To do so, we recommend establishing policies and procedures and adopting a test-early/test-often approach to ensure that changes made to the infrastructure do not negatively impact the validated production systems environment. Maintaining the validated state on Azure must be done with a combination of technology, processes, and procedures to ensure compliance.
As a validation engineer, I have validated many systems on the Azure platform and have concluded that Azure can be validated and used within the life sciences environment. How much validation due diligence you conduct should be based on risk. Don’t overdo it! Use lean methodologies to drive your validation processes.
Upon close inspection of this platform, it has built-in quality and security controls that provide a level of assurance of its suitability for deployment in regulated environments. I like that the system has been independently inspected by numerous third-party organizations to help ensure objectivity. This platform is ready for prime time, and many of my clients are leveraging its benefits and freeing themselves from the burden of managing their own infrastructure.