Can Microsoft Azure Be Validated?

The textbook definition of computer systems validation (paraphrased) is “documented evidence that a system performs according to its intended use”.  The FDA uses the definition “…Validation is a process of demonstrating, through documented evidence, that <software applications> will consistently produce results that meet predetermined specifications and quality attributes.”

Using the FDA’s definition, one should understand that validation is a PROCESS.  It is not a one-time event.  Also, looking at this definition, one should understand that the Agency expects to see “documented evidence” that a system will consistently produce results that meet predetermined specifications and quality attributes.  And finally, from the FDA’s definition of validation, we learn that to properly validate a system, one MUST have “predetermined specifications and quality attributes” defined.

Let’s start at the beginning with the question posed by this blog post: “Can Microsoft Azure® Be Validated?”  The short answer is YES.  In the remainder of this post, I will share six key reasons why this bold statement can be made.  But first, let’s establish some foundation as to what Microsoft Azure® (“Azure®”) is and why this question is relevant.

At its core, Azure is a cloud computing platform and infrastructure created by Microsoft for building, deploying, and managing applications and services through a global network of Microsoft-managed datacenters (Wikipedia).  In today’s global economy, life sciences companies, like many of their industry counterparts, are seeking to do more with less.  These companies are embracing enterprise technologies and leveraging innovation to drive global business and regulatory processes.  If you had asked me 10 years ago how many of the companies I work with were deploying enterprise technologies in the cloud, my answer would have been very few, if any.  If you had further asked me how many cloud validation exercises in life sciences I was conducting, I would have said NONE.

Suffice it to say, cloud computing has been a game-changer for life sciences companies.  It should be noted that, due to regulatory considerations, the life sciences industry has been reluctant to change validation practices.  But the cloud is changing the process of validation as we know it.  Sure, the basic principles of risk-based validation endure.  You STILL have to provide documented evidence that your software applications will consistently produce results that meet predetermined specifications and quality attributes.  HOW you do this is what is changing.  Validating cloud applications is a bit different from validating on-premise applications, mostly due to the changing roles and responsibilities of those responsible for the system and the cloud providers themselves.  I will say at the outset that all cloud providers are not created equal, so “caveat emptor” (let the buyer beware).

Let’s talk about the 6 key reasons why Azure can be validated and how this is done.

SIX REASONS WHY AND HOW AZURE CAN BE VALIDATED.

REASON 1 – AZURE RISK ASSESSMENT

Risk management is not a new concept.  It has been around since the 1940s, applied to military and aerospace industries before being broadened to other fields.  The first crucial step in any computer systems validation project is to conduct a validation risk assessment.  In addition to determining the impact on critical quality attributes, a risk assessment is used to determine the level of validation due diligence required for the system.  How much validation testing you should do for any validation exercise should be based on a risk assessment of the system.  When deploying systems such as Microsoft Dynamics AX® or BatchMaster/GP® using the Azure platform, the risk assessment should include an assessment of the Azure platform itself.  When outsourcing your platform to a vendor such as Microsoft, their risks become your risks; thus, you must conduct your due diligence to ensure that these risks are acceptable to your organization.

The ISPE GAMP 5® (“GAMP 5®”) risk assessment model defines the process for hazard identification, estimation of risk, evaluation of risk, control and monitoring of risks and the process for documenting risk.  GAMP 5® speaks of a 5-step process including:

  1. Perform Initial Risk Assessment and Determine System Impact
  2. Identify Functions With Impact On Patient Safety, Product Quality and Data Integrity
  3. Perform Functional Risk Assessment and Identify Controls
  4. Implement and Verify Appropriate Controls
  5. Review Risks and Monitor Controls

GAMP 5® does not explicitly address cloud infrastructure.  One could extrapolate how the standard may be applied to cloud infrastructure, but the lack of a “clean” translation is one of its limitations.  Azure could be treated as a Commercial-off-the-Shelf (COTS) platform.  You will note that the risk process identified by GAMP 5® is more application-focused than platform-focused.  To conduct a proper risk assessment of Azure, a framework must be established for assessing the platform itself.  In July 2011, ISACA, previously known as the Information Systems Audit and Control Association, released a document called “IT Control Objectives for Cloud Computing: Controls and Assurance in the Cloud” which highlighted specific control objectives and risks inherent in cloud computing.  From a validation perspective, you need to address availability/reliability risks, performance/capacity risks, as well as security and cybersecurity risks.  Using the publicly available independent service organization controls (SOC 1/2/3) reports, you should be able to effectively document and assess these risks.  Remember, if it’s not documented, it didn’t happen!
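To make this concrete, here is a minimal sketch of how such a platform risk assessment might be scored.  The risk categories follow the list above; the 1–3 severity/likelihood scale and the thresholds are illustrative assumptions, not values prescribed by GAMP 5® or ISACA.

```python
# Minimal cloud-platform risk assessment sketch. The risk categories
# follow the text above; the 1-3 severity/likelihood scale and the
# thresholds are illustrative assumptions, not GAMP 5 or ISACA values.

RISKS = [
    # (risk category, severity 1-3, likelihood 1-3)
    ("Availability/reliability of the Azure platform", 3, 2),
    ("Performance/capacity under expected load",       2, 2),
    ("Security/cybersecurity of hosted data",          3, 1),
]

def risk_priority(severity: int, likelihood: int) -> str:
    """Classify a risk so validation rigor can be scaled accordingly."""
    score = severity * likelihood
    if score >= 6:
        return "HIGH - mitigate and test explicitly"
    if score >= 3:
        return "MEDIUM - verify controls via SOC 1/2/3 evidence"
    return "LOW - document rationale and accept"

for category, severity, likelihood in RISKS:
    print(f"{category}: {risk_priority(severity, likelihood)}")
```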

REASON 2 – CHANGE CONTROL (“THE ELEPHANT IN THE ROOM”)

One of the biggest challenges with companies’ adoption of cloud computing is CHANGE CONTROL.   As you may well be aware, cloud environments are under constant change.  In the validation world, change control is an essential process to help ensure that validated systems remain in a validated state.  So how do you maintain the validated state when a platform such as Azure is in a constant state of change?  First, all validated systems are subject to change control.  Each time a system is changed, the change, the reason for the change, and who made the change must be documented.  When the system is on-premise, changes are (in general) documented by the IT department, which owns responsibility for the infrastructure in that scenario.  In a cloud environment, this responsibility belongs to the cloud provider.  SOC reports highlight and define change control processes and procedures for the cloud environment.
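As an illustration of the “what changed, why, and who” documentation requirement, here is a minimal sketch of a change-control record; the field names and example values are hypothetical, not a prescribed format.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ChangeRecord:
    """Minimal change-control entry: what changed, why, and who did it."""
    system: str
    description: str   # what was changed
    reason: str        # why it was changed
    changed_by: str    # who made the change
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

change_log: list[ChangeRecord] = []
change_log.append(ChangeRecord(
    system="Dynamics AX on Azure",
    description="Applied monthly platform patch bundle",
    reason="Vendor security update",
    changed_by="j.doe"))
print(change_log[0])
```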

We recommend that our clients obtain copies of the provider’s SOC reports under non-disclosure to verify the compliance of the cloud provider.  Once in the cloud, we recommend developing a testing and regression testing strategy that tests systems in a cloud environment on a more frequent basis.  In the ’80s and ’90s, validation engineers were very reluctant to change validated systems due to the complexity and level of effort involved in testing them using manual processes.  Today, companies are leveraging tools like OnShore’s ValidationMaster™ to facilitate automated testing, allowing validation test engineers to schedule the run of automated test scripts in a cloud environment.  ValidationMaster™ takes the complexity out of regression testing and frees up resources for other work.   Change control is essential in both on-premise and cloud environments.   To support the cloud, you will need to establish clear policies and procedures that cover on-premise as well as cloud computing systems, and it is very important that your change control policies reflect procedures in a cloud environment.

Microsoft has established clear controls for Azure to manage change control in their environment.  These procedures have been documented and tested as per their SOC 1/SOC 2 control reports issued by independent auditors.  When validating systems in an Azure environment, you will have to get copies of these reports and archive them with your validation package.    The supplier audit becomes crucial in a hosted environment.  The good news is that Microsoft is a reputable supplier and can be relied on as a stable vendor with respect to its products.

Microsoft Azure has undergone many independent audits, including a 2013 ISO/IEC 27001:2005 audit report that confirms that Microsoft has procedures in place that provide governance for change management processes.  The audit report highlights the methodology and confirms that changes to the Azure environment are appropriately tested and approved.

REASON 3 – INDEPENDENT AUDIT CONFIRMATION OF TECHNICAL/PROCEDURAL CONTROLS AND SECURITY

In cloud environments, the role of the cloud provider is paramount, and all cloud providers are not created equal.   Microsoft Azure has passed many audits, and these reports, as previously noted, are available upon request.  Security and cybersecurity issues are very important in the Azure environment.  Microsoft has implemented controls and procedures to ensure compliance, and the company has provided detailed information on security on its website (https://www.microsoft.com/en-us/TrustCenter/Security/default.aspx).

REASON 4 – INFRASTRUCTURE QUALIFICATION (IQ)

Installation qualification (IQ) testing changes in the Azure environment, where Microsoft bears some responsibility for the installation of hardware and software.  As previously mentioned, it is important to qualify the environment once it is set up to confirm that what is supposed to be installed is actually present in your environment.  We qualify the virtual software layer as well as Microsoft Lifecycle Services.

REASON 5 – OPERATIONAL/PERFORMANCE QUALIFICATION (OQ/PQ)

The process of operational qualification of the application in a cloud environment is very similar to the process in an on-premise environment.  Microsoft Dynamics AX can now easily be deployed in an Azure environment, and validation testing of this type of implementation proceeds as usual, with some variation.  The installation of Microsoft Dynamics AX uses what is called Lifecycle Services, which allows you to set up and install the system rapidly using automation.  Lifecycle Services may be subject to validation itself since it is used to establish a validated systems environment.  You will need to identify the failure of Lifecycle Services as a risk and rate the risk of using this service as part of the risk assessment.

Lifecycle Services is promoted as bringing predictability and repeatability to the process of software installation.  These are principles upon which validation has stood for many years.  However, it is important to go beyond the marketing to verify that this software layer is performing according to its intended use.  We typically draft a qualification script to qualify Lifecycle Services prior to using it in a validated systems environment.  We conduct OQ/PQ testing in the same manner as in a non-hosted environment.

REASON 6 – ABILITY TO MAINTAIN THE VALIDATED STATE

With Microsoft Azure, the need to maintain the validated state does not go away.  It is entirely possible to maintain mission-critical applications hosted on the Azure platform in a validated state.  To do so, we recommend establishing policies and procedures and adopting the test early/test often premise to ensure that changes made to the infrastructure do not negatively impact the validated production systems environment.  Maintaining the validated state on Azure must be done with a combination of technology, processes, and procedures to ensure compliance.

As a validation engineer, I have validated many systems on the Azure platform.  I have concluded that the Azure platform can be validated and used within the life sciences environment.   How much validation due diligence you conduct should be based on risk.  Don’t overdo it!  Use lean methodologies to drive your validation processes.

Close inspection of this platform shows built-in quality and security controls that provide a level of assurance of its suitability for deployment in regulated environments.   I like that the system has been independently inspected by numerous third-party organizations, which helps ensure objectivity.  This platform is ready for prime time, and many of my clients are leveraging its benefits and freeing themselves from the burden of managing their infrastructure.

Cybersecurity Qualification (CyQ)

One topic that has been top of mind for many validation engineers, chief information officers, and executive management is cybersecurity. You may be asking yourself: why are we talking about cybersecurity and validation? Recent headlines will tell you why this topic should be of great interest to every validation engineer. As validation engineers, we spend a lot of time stressing about risk assessments, system security, and qualification of system environments. Our job is to validate the system to ensure its readiness for production use. Let me ask a question… How can you ensure that a system is ready for production use if it is not cyber-ready?  This is why we are talking about cybersecurity in the context of validated systems.

When it comes to computer systems in today’s highly networked environment, cybersecurity is the elephant in the room. All networked systems may be vulnerable to cybersecurity threats. Businesses large and small may be subject to cyber-attacks, and the exploitation of these vulnerabilities may present a risk to public health and safety if not properly addressed. Although we know these truths all too well, many validation engineers are not even discussing cybersecurity as part of an overall validation strategy.

No company can prevent all cyber-attacks, but it is critically important that companies begin to think seriously about how to protect themselves from persistent cyber criminals determined to inflict as much damage as possible on computer systems, whether in highly regulated or nonregulated environments. One thing we know about cyber criminals is that they are equal opportunity offenders – everyone has a degree of vulnerability. To beat them at their game, you have to be one step ahead of them.

In the validation world, we often refer to validation testing as IQ/OQ/PQ testing.  I would like to submit for your review and consideration another type of enhanced validation testing that we should be doing: Cybersecurity Qualification, or as I like to refer to it, “CyQ”.  What is a CyQ?  It is confirmation of a system’s protection controls and readiness to prevent a cyber-attack.  In one of my recent blog posts, I declared that “computer systems validation as we know it is dead!” Now of course I mean that tongue in cheek!  What I was referring to is that it is time to rethink our validation strategy, because we need to address the vulnerabilities of today’s cloud-based and on-premise systems with respect to the cybersecurity risks they face. We can no longer look at systems the way we did in the 1980s. Many life sciences companies are deploying cloud-based technologies, mobile systems, the Internet of Things (IoT), and many other advanced technologies in the pursuit of innovation, and these may drive greater risk profiles in validated systems.  Incorporating CyQ in your overall validation strategy is one way to address these challenges.

The National Institute of Standards and Technology (NIST) introduced a cybersecurity framework. The five elements of the framework are shown in the figure below.

[Figure: NIST Cybersecurity Framework – Identify, Protect, Detect, Respond, Recover]

As a validation engineer, I have studied this framework for its applicability to validated systems.  Each element of the strategy addresses a dimension of your cybersecurity profile.  To conduct a CyQ assessment, you need to examine each element of the cybersecurity framework to determine your readiness in each respective category.  I have developed a CyQ Excel spreadsheet which examines each element of the framework and allows you to summarize your readiness to prevent a cyber-attack. (If you would like a copy of the CyQ Excel spreadsheet, please contact me using the contact form and I will happily send it to you.)
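As a rough illustration of what such an assessment captures, here is a minimal sketch that scores readiness across the five NIST framework functions; the scores and threshold are invented for illustration, and the spreadsheet itself is more detailed.

```python
# CyQ readiness sketch over the five NIST Cybersecurity Framework
# functions. The function names come from the framework; the scores
# and threshold are illustrative assumptions.

NIST_FUNCTIONS = ["Identify", "Protect", "Detect", "Respond", "Recover"]

# Self-assessed readiness per function, 0-100 (example values).
readiness = {
    "Identify": 85,  # e.g., asset inventory and risk assessment complete
    "Protect":  70,  # e.g., access controls and training in place
    "Detect":   60,  # e.g., monitoring partially implemented
    "Respond":  50,  # e.g., response plan drafted but not yet exercised
    "Recover":  65,  # e.g., backups in place, restore tested annually
}

THRESHOLD = 75  # illustrative pass mark for each function

for function in NIST_FUNCTIONS:
    score = readiness[function]
    status = "ready" if score >= THRESHOLD else "GAP - remediation required"
    print(f"{function:<9} {score:>3}  {status}")

overall = sum(readiness.values()) / len(readiness)
print(f"Overall CyQ readiness: {overall:.0f}/100")
```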

 

Remember, for validated systems, if it is not documented, it did not happen! Cybersecurity qualification analysis must be documented.  You must be ready to explain to regulators, when it comes to data integrity and systems integrity, what controls you have in place to protect both the data and the systems under your management.

Another consideration in the management of cyber threats is EDUCATION.  The biggest cyber breach may come from the person in the cubicle next to you! You must deliver (and document) cybersecurity training, and do it frequently enough to keep pace.

For your next validation project, address the elephant in the room explicitly.   Cyber threats are not diminishing, they are increasing.  It is important to understand their origin and seriously consider how they can and will impact validated systems.  We can no longer think that IQ/OQ/PQ is sufficient.  While it has served its purpose in times past, we need a more effective strategy to address today’s clear and present danger to validated systems – the next cyber-attack.  It could be YOUR SYSTEM.  Deal with it!

Automating Validation Testing: It’s Easier Than You Think

Automated validation testing has been elusive for many in the validation community.  There have been many “point solutions” on the market that addressed the creation, management and execution of validation testing.  However, what most validation engineers want is TRULY AUTOMATED validation testing that will interrogate an application in a rigorous manner and report results in a manner that not only provides objective evidence of pass/fail criteria but will highlight each point of failure.

In the 1980s, when I was conducting validation exercises for mini- and mainframe computers and drafting test scripts in a very manual way, I often envisioned a time when I would be able to conduct validation testing in a more automated way.  Most validation engineers work in an environment where they are asked to do more with less.  Thus, the need for such a tool is profound.

Cloud computing environments, mobility, cybersecurity, and data integrity imperatives make it essential that we test applications more thoroughly today.  Yet the burden of manual testing persists.  If I could share with you five key features of an automated testing system, they would include the following:

  • Automated test script procedure capture and development
  • Automated Requirements Traceability
  • Fully Automated Validation Test Script Execution
  • Automated Incident Capture and Management
  • Ability to Support Continuous Testing in the Cloud

In most validation exercises I have participated in, validation testing was the most laborious part of the exercise.  Automated testing is easier than you think.

For example, ValidationMaster™ includes an automated test engine that captures each step of your qualification procedure and annotates the step with details of the action performed.

Test cases can be routed for review and pre-approval with the system quickly and easily through DocuSign.  Test case execution can be conducted online and a dynamic dashboard reports the status of how many test scripts have passed, how many have failed, or which ones may have passed with exception.  Once test scripts have been executed, the test scripts may be routed for post-approval and signed.

Within the ValidationMaster™ system, you can create a reusable test script library to support future regression testing efforts.  The system allows users to link requirements to test cases thus facilitating the easy generation of forward and reverse trace matrices.  Exporting documents in your UNIQUE format is a snap within the system.  Thus, you can consistently comply with your internal document procedures.
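To illustrate what forward and reverse tracing means in practice, here is a minimal, tool-agnostic sketch; the requirement/test identifiers are invented, and a real system would pull these links from its own repository rather than a hard-coded list.

```python
# Forward/reverse traceability sketch. The requirement/test links are
# invented for illustration; a real tool would pull them from its own
# repository.

links = [
    ("REQ-001", "TS-010"),
    ("REQ-001", "TS-011"),
    ("REQ-002", "TS-012"),
    ("REQ-003", None),       # requirement with no test coverage yet
]

forward: dict[str, list[str]] = {}   # requirement -> test scripts
reverse: dict[str, list[str]] = {}   # test script -> requirements

for req, test in links:
    forward.setdefault(req, [])
    if test is not None:
        forward[req].append(test)
        reverse.setdefault(test, []).append(req)

uncovered = [req for req, tests in forward.items() if not tests]
print("Forward trace:", forward)
print("Reverse trace:", reverse)
print("Requirements lacking coverage:", uncovered)
```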

Continuous testing in a cloud environment is ESSENTIAL.  ValidationMaster™ supports fully automated validation testing, allowing users to set a date/time for testing.  Test scripts are run AUTOMATICALLY, without human intervention, and the same scripts can be run multiple times if necessary.

You must have the ability to respond to rapid changes in a cloud environment that may impact the validated state of the system.  Continuous testing reduces risk and ensures sustained compliance in a cloud environment.
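For readers who want a feel for what scheduled, unattended execution looks like, here is a minimal sketch using only the Python standard library; run_test_script() is a hypothetical placeholder for whatever actually drives a test, not ValidationMaster™’s API.

```python
# Unattended, scheduled test execution sketch using only the standard
# library. run_test_script() is a hypothetical placeholder for whatever
# actually drives a test script; the scheduling loop is the point here.

import time
from datetime import datetime

def run_test_script(script_id: str) -> str:
    """Placeholder: execute a test script and return PASS or FAIL."""
    print(f"{datetime.now().isoformat()} executing {script_id}")
    return "PASS"

# (script id, scheduled run time) pairs chosen by the validation engineer.
schedule = [("TS-010", datetime(2018, 6, 1, 2, 0)),
            ("TS-011", datetime(2018, 6, 1, 2, 30))]

pending = sorted(schedule, key=lambda item: item[1])
while pending:
    script_id, run_at = pending[0]
    if datetime.now() >= run_at:
        result = run_test_script(script_id)
        print(f"{script_id}: {result}")
        pending.pop(0)
    else:
        time.sleep(30)  # wait and poll until the next scheduled time
```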

The system automatically raises an incident report if a bug is encountered through automated testing, and it keeps track of each test run and its results through automation.  ValidationMaster™ includes a dynamic dashboard that shows the pass/fail status of each test script as well as requirements coverage, open risks, incident trend analysis, and much more.

The time is now for automated validation testing.  The good news is that there are enterprise level applications on the market that facilitate the full validation lifecycle process.  Why are you still generating manual test scripts?  Automated testing is easier than you think!

Why Are You Still Generating Validation Test Scripts Manually?

Drafting validation scripts is one of the key activities in a validation exercise, designed to provide documented evidence that a system performs according to its intended use.  The FDA and other global agencies require objective evidence, usually in the form of screenshots that sequentially capture the target software process, to provide assurance that systems can consistently and repeatedly perform the various processes representing the intended use of the system.

Since the advent of the PC, validation engineers have been writing validation test scripts manually.  The manual process of computer systems validation test script development involves capturing screenshots and pasting them into Microsoft Word test script templates.  To generate screen captures, some companies use tools such as the Windows Print Screen function, TechSmith Snagit, and other such utilities.  A chief complaint of many validation engineers is that the test script development process is a slow, arduous one.  Some validation engineers are very reluctant to update/re-validate systems due to this manual process.  So, the question posed by this blog article is simply this: “Why are you still generating test scripts manually?”

I have been conducting validation exercises for engineering and life sciences systems since the early 1980s.  I too have experienced first-hand the pain of writing test scripts manually.  We developed and practice “lean validation”, so I sought ways to eliminate manual, wasteful validation processes.  One of the most wasteful processes in validation is the manual capture/cutting/pasting of screenshots into a Microsoft Word document.

The obvious follow up question is “how do we capture validation tests in a more automated manner to eliminate waste and create test scripts that are complete, accurate and provide the level of due diligence required for validation?”

In response to this common problem, we developed an Enterprise Validation Management system called ValidationMaster™.  This system includes TestMaster™, an automated testing system that allows validation engineers to capture and execute validation test scripts in a cost-effective manner.

TestMaster™ is designed to validate ANY enterprise or desktop application.  It is a browser-based system and allows test engineers to open any application on their desktop, launch TestMaster™, and capture their test scripts while sequentially executing the various commands in their applications.    As the validation engineer navigates through the application, TestMaster™ captures each screenshot and text entry entered in the application.

Once the test script is saved, TestMaster™ allows the script to be published in your UNIQUE test script template with the push of a button.  No more cutting/pasting screenshots from manual processes!  You can generate your test scripts in MINUTES as opposed to the hours it sometimes takes to compile documents based on a series of screenshots.  If you are one of those validation engineers that does not like screenshots in your scripts, you can easily create text-based processes both quickly and easily using TestMaster™.

So, what is the biggest benefit of using TestMaster™ versus manual processes?  There are four key benefits, summarized as follows:

  1.  Automated Test Script Execution – for years, validation engineers have wanted a more automated approach for the execution of validation test scripts.  ValidationMaster™ supports both hands-on and hands-off validation testing.  Hands-on validation testing is the process whereby a validation engineer looks at each step of a validation test script and executes the script step-by-step by clicking through the process.  Hands-off validation allows a validation engineer to execute a test script with no human intervention.  This type of (hands-off) regression testing is very useful for cloud-based systems or systems that require more frequent testing.  The validation engineer simply selects a test script and defines a date/time for its execution.  At the designated time, with no human intervention, the system executes the test script and reports the test results back to the system.   Automated testing is here!  Why are you still doing this manually?

  2.  Traceability – TestMaster™ allows validation engineers to link each test script to a requirement or set of requirements; thus the system delivers automatic traceability, which is a regulatory requirement.  With the click of a button, TestMaster™ allows validation engineers to create a test script directly from a requirement.  This is a powerful capability that allows you to see requirements coverage through our validation dashboard on demand.  This validation dashboard is viewable on a desktop or mobile device (Windows, Apple, Android).

  3.  Test Script Execution – one of the biggest problems with manual test scripts is that they must be printed and manually routed for review and approval.  Some companies that have implemented document management systems may have the ability to route the scripts around electronically for review and approval.  The worst-case scenario is the company that has no electronic document management system and generates these documents manually.  TestMaster™ allows validation engineers to execute test scripts online and capture test script results in an easy manner.  The test script results can be captured in an automated way and published into executed test script templates quickly and easily.   If incidents (bugs/anomalies) occur during testing, users have the ability to automatically capture an incident report which is tied to the exact step where the anomaly/bug occurred.  ValidationMaster™ is tightly integrated with a 21 CFR Part 11-compliant portal (ValidationMaster Portal™): once the test script is executed, it is automatically published to the ValidationMaster™ Portal, where it is routed for review/approval.  The ability to draft, route, review, approve, execute, and post-approve validation test scripts is an important, time/cost-saving feature that should be a part of any 21st century validation program.

  4.  Reuse Test Scripts For Regression Testing – manual test scripts are not ‘readily’ reusable.  What I mean by this is that the Word documents must be edited or even re-written for validation regression testing.  Validation is not a one-time process.  Regression testing is a fact of life for validation engineers.  The question is, will you rewrite all of your test scripts or use automated tools to streamline the process?  ValidationMaster™ allows validation engineers to create a reusable test script library.  This library includes all the test scripts that make up your validation test script package.  During re-validation exercises, you have the ability to reuse the same test scripts for regression testing.

Given the rapid adoption of cloud, mobile and enterprise technologies in life sciences, a new approach to validation is required.  Yes, you can still conduct validation exercises on paper but why would you?  In the early days of enterprise technology, we did not have tools available that would facilitate the rapid development of validation test scripts.  Today, that is not the case.  Systems like ValidationMaster™ are available in either a hosted or on-premise environment.  These game-changing systems are revolutionizing the way validation is conducted and offering time/cost-saving features that make this process easier.   So why are you still generating test scripts manually?

Can Cloud Applications Be Validated?

Cloud applications are being deployed within life sciences enterprises at a rapid pace. Microsoft Office 365, Microsoft Dynamics 365, and the cost benefits of deploying other such applications are driving the adoption of cloud applications for regulated systems. The question that is always asked is: can cloud systems be validated? The question arises because of how clouds are deployed and maintained over time. To keep pace with system performance, security, and other related controls, cloud providers frequently update their environments. This constant updating makes the cloud environment challenging from a systems validation perspective, and maintaining the validated state in a cloud environment is often difficult as a result.

So can cloud applications be validated and what are the unique processes that must be changed to accommodate cloud validation?  The short answer is yes, cloud applications can be validated. However, there are changes required to the validation strategy to ensure that the system meets its intended use and the validated state is maintained over time.

ALL CLOUD PROVIDERS ARE NOT CREATED EQUAL

When choosing a cloud provider, it is helpful to understand that all cloud providers are not created equal. To my mind, they are divided into two distinct camps: (1) those who understand regulated environments and (2) those who do not. The first order of business for establishing a validated system in the cloud is to select your cloud providers carefully. I like to look at cloud providers who have experience in regulated environments. This goes for any application that you’re using in a highly regulated environment. It is critically important that your vendor understands the regulatory requirements you have to comply with and builds best practices into their service to help you comply. This will go a long way during the supplier audit process to confirm that you’ve done your proper due diligence on your cloud provider.

UNDERSTANDING RESPONSIBILITIES IN A CLOUD ENVIRONMENT

It is important to understand responsibilities in a cloud environment.  With packaged software applications deployed on-premise, you have complete control over the environment.  With the various cloud service models, responsibilities vary depending on the model used, as shown below.  For Infrastructure-as-a-Service (IaaS), you manage the applications, data, runtime, middleware, and operating system, while the vendor manages virtualization, servers, storage, and networking.  Please keep in mind that the principles of validation endure in the cloud.  You, not the vendor, are ultimately responsible for the environment and its management.  Therefore, you need to choose your cloud vendor wisely.

[Figure: Cloud service models – division of responsibilities under IaaS, PaaS, and SaaS]

You can see that with Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS), you have less responsibility for the management of the system.  In the SaaS model, you do not control the application or infrastructure; the vendor manages the application and its underlying architecture.
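The responsibility split can be summarized in a few lines of code.  In this minimal sketch, the IaaS split follows the description above, while the PaaS and SaaS splits follow the commonly used shared-responsibility model.

```python
# Responsibility split by cloud service model. The IaaS split follows
# the description above; the PaaS/SaaS splits follow the commonly used
# shared-responsibility model.

LAYERS = ["application", "data", "runtime", "middleware", "operating system",
          "virtualization", "servers", "storage", "networking"]

RESPONSIBILITY = {
    "IaaS": {"you": LAYERS[:5], "vendor": LAYERS[5:]},
    "PaaS": {"you": LAYERS[:2], "vendor": LAYERS[2:]},
    "SaaS": {"you": [],         "vendor": LAYERS},
}

for model, split in RESPONSIBILITY.items():
    yours = ", ".join(split["you"]) or "(configuration and use only)"
    print(f"{model}: you manage {yours}")
```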

The strategy for validation changes based on the cloud model deployed.

CONTINUOUS TESTING IN THE CLOUD

Previously, for on-premise systems, the validation engineer, in consultation with the IT team, determined how often validated systems environments were reviewed and updated. If the validation engineer did not want to apply a patch or an update, they simply left the system alone. Due to the labor involved in updating and documenting a validated system, many validation engineers would often leave the system in a state where no changes, patches, or updates were applied. Today, due to cybersecurity threats and frequent changes to browser applications, it is not possible to leave a system unpatched. To do so may leave your validated systems environment more exposed to security threats.

Therefore, when validating in the cloud you must employ a continuous testing strategy. When I say that to validation engineers, they often cringe, thinking of continuously having to test a validated systems environment. This is easier than you may think if you’re using automated tools for testing. Continuous testing is simply a validation testing strategy where, on a designated schedule defined by the validation engineer, a system is routinely tested to ensure that patches and routine updates do not impact the validated state. Continuous testing is facilitated best through automation. With a reusable test script library, you can easily conduct an impact analysis and determine which tests need to be rerun, as shown in the sketch below. Regression testing is made easy through validation test script automation. Therefore, it is strongly recommended that if you deploy systems in a cloud environment, you employ an enterprise validation management system that includes automated testing, such as ValidationMaster™.
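Here is a minimal sketch of that impact analysis step: given the components a cloud update touched, select the regression tests to rerun from a reusable library.  The component-to-test mapping is invented for illustration.

```python
# Impact-analysis sketch: select regression tests from a reusable
# library based on the components a cloud update touched. The
# component-to-test mapping is invented for illustration.

test_library = {
    "TS-010": {"login", "security"},
    "TS-011": {"reporting"},
    "TS-012": {"batch-processing", "database"},
    "TS-013": {"security", "database"},
}

def select_regression_tests(changed: set[str]) -> list[str]:
    """Return the test scripts whose covered components were touched."""
    return sorted(tid for tid, components in test_library.items()
                  if components & changed)

# Example: the provider's update notes say the patch touched the
# security and database layers.
print(select_regression_tests({"security", "database"}))
# -> ['TS-010', 'TS-012', 'TS-013']
```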

Cloud applications can be validated.  The strategy changes based on the cloud model selected.  Choose your partners carefully.  They assume an important role in the management of critical system assets.

It is not a question of whether a cloud application can be validated; it is a question of the deployment model you choose and the validation strategy that will govern deployment. Regulators are not averse to you using the cloud – it’s all about managing control and risk.

 

Accelerating Validation in the 21st Century

Each day in companies across the globe, employees are being asked to do more with less.  The mantra of the business community in the 21st century is “accelerating business”.  You see it in marketing and all types of corporate communication.  In the validation world accelerating system implementation, validation and deployment is the cry of every chief information officer and senior executive.

Accelerating validation activity means different things to different people.  I define it as driving validation processes efficiently and effectively without sacrificing quality or compliance.  Most manual processes are paper-based, inefficient, and cumbersome.  Creating paper-based test scripts requires a lot of cutting/pasting of screenshots and other objective evidence to meet regulatory demands.  Getting validation information into templates with proper headers and footers in compliance with corporate documentation guidelines is paramount.

The question for today is “HOW DO WE ACCELERATE VALIDATION WITHOUT SACRIFICING QUALITY AND COMPLIANCE?”

Earlier in my career, I developed “Validation Toolkits” for Documentum and Qumas.  A Validation Toolkit was an early version of a validation accelerator.  Many companies now have them.  These toolkits come with pre-defined document templates and test scripts designed to streamline the validation process.  They are called “toolkits” for a reason – they are not intended to be complete validation projects.  They are intended to be a starting point for validation exercises.

The FDA states in the General Principles of Software Validation; Final Guidance For Industry and FDA Staff issued on January 11, 2002, “…If the vendor can provide information about their system requirements, software requirements, validation process, and the results of their validation, the medical device manufacturer (or any company) can use that information as a beginning point for their required validation documentation…”

Notice that the FDA says to use vendor information “…as a beginning point” for validation – NOT to use it exclusively as the complete validation package.  This is because the FDA expects each company to conduct its own due diligence for validation.  Also note that the FDA allows the use of such documentation if available from a vendor.

Beware that some vendors believe their “toolkits” can substitute for rigorous review of their software applications.  They cannot.  In addition to leveraging the software vendor’s information, you should ALWAYS conduct your own validation due diligence to ensure that applications are thoroughly examined without bias by your team or designated validation professional.

It is an excellent idea to leverage vendor information if provided.  There is a learning curve with most enterprise applications and the use of vendor-supplied information can help streamline the process and add value.  The old validation toolkits were delivered with a set of document templates and pre-defined test scripts often resulting in hundreds of documents depending on the application.  In most cases, companies would edit the documents or put them in their pre-defined validation templates for consistency.  This resulted in A LOT OF EDITING!  In some cases, depending on the number of documents, time-savings could be eroded by just editing the existing documents.

THERE IS A BETTER WAY!  

What if YOUR unique validation templates were pre-loaded into an automated Enterprise Validation Lifecycle Management system?   What if the Commercial-off-the-Shelf (COTS) product requirements were pre-loaded into this system?  Further, what if a full set of OQ/PQ test scripts were traced to each of the pre-loaded requirements?  What if you were able to export all of these documents on day one in your unique validation templates?  What if you only had to add test scripts that represented the CHANGES you were making to a COTS application versus writing all of your test scripts from scratch?  Finally, what if you could execute your test scripts online and automatically generate a requirements trace matrix and validation summary report with PASS/FAIL status?

This is the vision of the CloudMaster 365™  Validation Accelerator.  CloudMaster 365™  is the next generation of the validation toolkit concept.    The ValidationMaster Enterprise Validation Management system is the core of the CloudMaster 365™ Validation Accelerator.  The application can be either hosted or on-premise.  This is a great way to jump start your Microsoft Dynamics 365® validation project.

[Figure: CloudMaster 365™ Validation Accelerator]

The CloudMaster 365™ Validation Accelerator includes three key components:

  • ValidationMaster™ Enterprise Validation Management System
  • Comprehensive set of user requirements for out-of-the-box features
  • Full set of IQ/OQ/PQ test scripts and validation document templates

The Validation Accelerator is not intended to be “validation out of the box”.  It delivers the foundation for your current and future validation projects, enabling “Level 5” validation processes.  If you consider the validation capability maturity model, most validation processes at Level 1 are ad hoc, chaotic, undefined, and reactive to project impulses and external events.

[Figure: Validation Capability Maturity Model]

Level 5 validation is an optimized, lean validation process facilitated with automation.  The test process is optimized, quality control is assured, and there is a specific measure for defect prevention and management.  Level 5 cannot be achieved without automation, and the automation is powered by ValidationMaster™.   With the CloudMaster 365™ strategy, instead of having “toolkit” documents suitable for only one project, you have a system that manages your COTS or bespoke development and the full lifecycle of validation over time, as is required by regulators.  This is revolutionary in terms of what you get.

CloudMaster 365™ TRULY ACCELERATES VALIDATION.  It can not only be used for Microsoft Dynamics 365®, but is also available for other applications including:

  • CloudMaster 365™ Validation Accelerator For Dynamics 365®
  • CloudMaster 365™ Validation Accelerator For Merit MAXLife®
  • CloudMaster 365™ Validation Accelerator For Yaveon ProBatch®
  • CloudMaster 365™ Validation Accelerator For Oracle e-Business® or Oracle Fusion®
  • CloudMaster 365™ Validation Accelerator For SAP®
  • CloudMaster 365™ Validation Accelerator For BatchMaster®
  • CloudMaster 365™ Validation Accelerator For Edgewater EDGE®
  • CloudMaster 365™ Validation Accelerator For VeevaVault®
  • CloudMaster 365™ Validation Accelerator For Microsoft SharePoint®

Whether you are validating an enterprise application or seeking to drive your company to higher levels of efficiency, you should consider how best to accelerate your validation processes not solution by solution but in a more holistic way that ensures sustained compliance across all applications.  If you are embarking on the validation of Microsoft Dynamics 365®, or any of the applications listed above, the CloudMaster 365™ Validation Accelerator is a no-brainer.  It is the most cost-effective way to streamline validation and ensure sustained compliance over time.

Cloud Validation Strategies

If you had asked me 10 years ago how many of my life sciences clients were deploying systems in a cloud environment, I would have said maybe one or two. If you ask me today how many of my clients are deploying cloud technologies, I would say almost all of them, in one way or another.  The adoption of cloud technologies within life sciences companies is expanding at a rapid pace.

From a validation perspective, this trend has profound consequences.  Here are some key concerns and questions to be answered for any cloud deployment.

  1. How do you validate systems in a cloud environment?
  2. What types of governance do you need to deploy applications in a cloud environment?
  3. How do you manage change in a cloud environment?
  4. How do you maintain the validated state in the cloud?
  5. How can you ensure data integrity in the cloud?
  6. How do you manage cybersecurity in a cloud environment?

The answers to these questions are obvious and routine to validation engineers managing systems in an on-premise environment, where control of the environment is managed by the internal IT team.  They have control over changes, patches, system updates, and other factors that may impact the overall validated state.  In a cloud environment, the software, platform, and infrastructure are delivered as a SERVICE.  By leveraging the cloud, life sciences companies are effectively outsourcing the management and operation of a portion of their IT infrastructure to the cloud provider.  However, compliance oversight and responsibility for your validated system cannot be delegated to the cloud provider.  Therefore, these services must have a level of control sufficient to support a validated systems environment.

For years, life sciences companies have been accustomed to governing their own systems environments.  They control how often systems are updated, when patches are applied, when system resources will be updated, etc.  In a cloud environment, control is in the hands of the cloud service provider.  Therefore, who you choose as your cloud provider matters.

So what should your strategy be to manage cloud-based systems?

  • Choose Your Cloud Provider Wisely – All cloud providers are not created equal.  The Cloud Security Alliance (https://cloudsecurityalliance.org/) is an excellent starting point for understanding cloud controls.  The Cloud Controls Matrix (CCM) is an Excel spreadsheet that allows you to assess a vendor’s readiness for the cloud.  You can download it free of charge from the CSA.
  • Establish Governance For The Cloud – You must have an SOP for the management and deployment of the cloud and ensure that this process is closely followed.  You also need an SOP for cybersecurity to provide a process for protecting validated systems against cyber threats.
  • Leverage Cloud Supplier Audit Reports For Validation – All cloud providers must adhere to standards for their environments.  Typically, they gain third-party certification and submit to Service Organization Control (SOC) independent audits.  It is recommended that you capture the SOC 1/2/3 and SSAE 16 reports.  You also want to understand any certifications that your cloud provider holds.  I would archive their certifications and SOC reports with the validation package as part of my due diligence for the supplier audit.
  • Embrace Lean Validation Principles and Best Practices – eliminating waste and improving efficiency is essential in any validated systems environment.  Lean validation is derived from the principles of lean manufacturing.  Automation is a MUST.  You need to embrace lean principles for greater efficiency and compliance.
  • Automate Your Validation Processes – Automation and Lean validation go hand in hand.  The testing process is the most laborious process.  We recommend using a system like ValidationMaster™ to automate requirements management, test management and execution, incident management, risk management, validation quality management, agile validation project management, and validation content management. ValidationMaster™ is designed to power lean validation processes and includes built-in best practices to support this process.
  • Use a Risk-Based Approach To Validation – all validation exercises are not created equal.  The level of validation due diligence required for your project should be based on risk – regulatory, technical and business risks.  Conduct a risk assessment for all cloud-based systems.
  • Adopt Continuous Testing Best Practices – the cloud is under continuous change, which seems in and of itself counter-intuitive to the validation process.  Continuous testing can be onerous if your testing process is MANUAL.  However, if you adopt lean, automated testing processes, regression testing is easy.  You can establish a routine schedule for testing, and if your cloud provider delivers a dashboard that tells you when patches/updates/features have been applied and the nature of them, you can select your regression testing plan based on a risk and impact assessment.

 

Cloud environments can be validated!  A clear, practical approach that embraces lean validation and continuous testing is key, as is cloud governance to ensure data integrity and sustained compliance.

Cloud technologies are here to stay.  Regulators don’t object to the use of the cloud; they want to know how you are managing it and ensuring the integrity of the data.  They also want you to confirm that you are maintaining the validated state in the cloud.  The principles of validation endure in the cloud.  Just because you are in a cloud environment does not mean validation principles no longer apply.  Consider the impact of cybersecurity in your cloud environment and adopt continuous testing strategies to ensure sustained compliance.

Accelerating Validation: 10 Best Practices You Should Know in 2018

As we open the new year with resolutions and new fresh thinking, I wanted to offer 10 best practices that should be adopted by every validation team.  The major challenges facing validation engineers every day include cyber threats against validated computer systems, data integrity issues, maintaining the validated state, cloud validation techniques and a myriad of other issues that affect validated systems.

I have been championing the concept of validation maturity and lean validation throughout the validation community.  In my recent blog post, I highlighted reasons why validation as we have known it over the past 40 years is dead.   Of course I mean this with a bit of tongue in cheek.

The strategies included in the FDA guidance, and those we used to validate enterprise systems in the past, do not adequately address IoT, cloud technologies, cybersecurity, enterprise apps, and mobility; these, along with deployment methodologies such as Agile, change the way we think about and manage validated systems.

The following paragraphs highlight the top 10 best practices that should be embraced for validated computer systems in 2018.

Practice 1 – Embrace Lean Validation Best Practices

Lean validation is the practice of eliminating waste and inefficiencies throughout the validation process while optimizing the process and ensuring compliance. Lean validation is derived from lean manufacturing practices, where non-value-added activity is eliminated. As a best practice, it is good to embrace lean validation within your processes. To fully embrace lean, it is necessary to automate your validation processes; thus, it is strongly recommended that automation be part of your consideration for updating your validation processes in 2018.  My recent blog post on lean validation discusses lean in detail and how it can help you not only optimize your validation process but also achieve compliance.

Practice 2 – Establish Cybersecurity Qualification and SOP

I have noted in previous blog posts that cyber threats are the elephant in the room. Although validation engineers pledge to confirm that the system meets its intended use through the validation process, many organizations do not treat cybersecurity as a clear and present danger to validated systems. I believe that this is a significant oversight in many validation processes. Thus, I have developed a fourth leg for validation called Cybersecurity Qualification, or CyQ. Cybersecurity qualification is an assessment of the readiness of your processes and controls to protect against a cyber event. It includes testing to confirm the effectiveness and the strength of your controls. In addition to IQ, OQ, and PQ, I have added CyQ as the fourth leg of my validation. It is strongly recommended that you consider this best practice in 2018.

Practice 3 – Automate Validation Testing Processes

In the year 2018, many validation processes are still paper-based. Companies are still generating test scripts using Microsoft Excel or Microsoft Word and manually tracing test scripts to requirements. This is not only time-consuming but very inefficient, and it causes validation engineers to focus more on formatting test cases than on finding bugs and other software anomalies that could affect the successful operation of the system. Lean validation requires an embrace of automated testing systems. While there are many point solutions on the market that address different aspects of validation testing, for 2018 it is strongly recommended that you embrace enterprise validation management as part of your overall strategy. An enterprise validation management system such as ValidationMaster™ manages the full lifecycle of computer systems validation. In addition to managing requirements (user, functional, design), the system also facilitates automated testing, incident management, release management, and agile software project methodologies. In 2018, the time has come to develop a reusable test script library to support future regression testing and maintaining the validated state.

An enterprise validation management system will also offer a single source of truth for validation and validation testing assets.  This is no longer a luxury but mandatory when you consider the requirement for continuous testing in the cloud. This is a best practice whose time has come. In 2018, establish automated validation processes to help maintain the validated state and ensure software quality.

Practice 4 – Develop a Cloud Management For Validated Systems SOP

Many of today’s applications are deployed in the cloud. Office 365, Dynamics 365, and other such applications have gained prominence and popularity within life sciences companies. In 2018, you must have a cloud management standard operating procedure that provides governance for all cloud activities. This SOP should define how you acquire cloud technologies and what your specific requirements are. Failure to have such an SOP is failure to understand and effectively manage cloud technologies for validated systems. I have often heard from validation engineers that since you are in the cloud, the principles of validation no longer apply. This is absolutely not true. In a cloud environment, the principles of validation endure. You still must qualify cloud environments, and the responsibilities between yourself and the cloud provider must be clearly defined. If you don’t have a cloud SOP or don’t know where to start in writing one, write me and I will send you a baseline procedure.

Practice 5 – Establish Validation Metrics and KPIs

Peter Drucker said you can’t improve what you can’t measure. It is very important that you establish validation metrics and key performance indicators for your validation process in 2018.  You need to define what success looks like. How will you know when you’ve achieved your objectives? Some key performance indicators may include the number of errors, the number of systems validated, the number of incidents (this should be trending downward), and other such measures. Our ValidationMaster™ portal allows you to establish KPIs for validation and track them over time. In 2018, it’s important to embrace this concept. I have found over the years that most validation activities are not looked at from a performance perspective. They are often inefficient and time-consuming, and no one actually measures the value of validation to the organization or any key performance indicators. The time has come to start looking at this and measuring it as appropriate.
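As a simple illustration of tracking one such KPI, here is a minimal sketch that checks whether the incident count is trending downward; the monthly figures are invented for illustration.

```python
# Incident-trend KPI sketch. Monthly counts are invented; the point
# is checking that the trend is moving in the right direction.

monthly_incidents = {"Jan": 14, "Feb": 11, "Mar": 9, "Apr": 10, "May": 7}

counts = list(monthly_incidents.values())
# Simple trend check: compare each month to the previous one.
improving = sum(b < a for a, b in zip(counts, counts[1:]))
worsening = sum(b > a for a, b in zip(counts, counts[1:]))

print(f"Months improving: {improving}, months worsening: {worsening}")
print("KPI status:", "on track" if improving > worsening else "investigate")
```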

Practice 6 – Use Validation Accelerators For Enterprise Systems

Across the globe, validation engineers are being asked to do more with less. Organizations are demanding greater productivity and compliance. As you look at enterprise technologies such as Microsoft Dynamics 365®, you need to consider the importance of this environment and how it should be validated. There is a learning curve with any enterprise technology. You should recognize that although you may know a lot about validating systems in general, you may not know the ins and outs of this specific technology. The FDA stipulates in its guidance that you may use vendor information as a starting point for validation. Understand that there is no such thing as validation out of the box; therefore, you must do your due diligence when it comes to validated systems.

In 2018, we recommend that for systems such as Microsoft Dynamics 365, BatchMaster™, Edgewater Fullscope EDGE®, Veeva® ECM systems, Oracle e-Business®, and many others, you use a Validation Accelerator to jumpstart your validation process. OnShore Technology Group offers Validation Accelerators for the aforementioned software applications that will help streamline the validation process. Most noteworthy, our validation accelerators come bundled with a full enterprise validation management system, ValidationMaster™.  This helps to deliver a single source of truth for validation.

Practice 7 – Fill Staffing Gaps With Experts on Demand

In 2018, you may find yourself in need of validation expertise for your upcoming projects but without sufficient internal resources to fulfill those needs. A burgeoning trend today is the outsourcing of validation staff to achieve your objectives – validation staffing on demand. OnShore offers an exclusive service known as ValidationOffice℠ which provides validation team members on demand. We recruit and retain top talent for your next validation project. We offer validation engineers, technical writers, validation project managers, validation test engineers, validation team leaders, and other such staff with deep domain experience and validation confidence. In 2018, you may want to forgo direct hires and use ValidationOffice℠ to fulfill your next staffing need.

Practice 8 – Establish Validation Test Script Library

One of the primary challenges with validation testing is regression testing. Once a system is validated and brought into production, any software changes, patches, updates, or other required changes drive the need for regression testing. This often results in a full rewrite of manual, paper-based test scripts. Today’s validation engineers have tools at their disposal that eliminate the need to rewrite test scripts and provide the ability to pull the test scripts required for regression testing from a reusable test script library. This is the agile, lean way to conduct validation testing. In 2018, you should consider purchasing an enterprise validation management system to manage a reusable test script library. This is a process that will save you both time and money in the development and online execution of validation test scripts.

Your test scripts can be either fully automated or semi-automated. Fully automated test scripts run with no human intervention. Semi-automated validation test scripts are used by the validation engineer to conduct validation testing online. It should be noted that ValidationMaster™ supports both. Check out a demo to learn more about how this works.
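
As a rough illustration of the idea (a sketch only – not the actual design of any product mentioned here), a reusable test script library can be modeled as tagged records, so the scripts affected by a change are pulled for regression testing instead of being rewritten. Every name below is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TestScript:
    script_id: str
    title: str
    modules: tuple      # system modules the script exercises
    mode: str           # "automated" (no human intervention) or "semi-automated"

# Hypothetical library of approved, reusable validation test scripts.
LIBRARY = [
    TestScript("TS-001", "Order entry happy path", ("orders",), "automated"),
    TestScript("TS-002", "Batch record e-signature", ("batch", "security"), "semi-automated"),
    TestScript("TS-003", "Inventory adjustment audit trail", ("inventory",), "automated"),
]

def regression_suite(changed_modules: set) -> list:
    """Pull only the scripts that touch the modules affected by a patch or update."""
    return [s for s in LIBRARY if changed_modules.intersection(s.modules)]

# A patch touching the 'orders' module selects only TS-001 for regression testing.
for script in regression_suite({"orders"}):
    print(script.script_id, script.title, f"({script.mode})")
```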

Practice 9 – Develop a Strategy For Data Integrity

The buzzword for 2018 is data integrity. Europe has just announced its GDPR regulations governing data protection.  The new regulations hold you accountable for the integrity of the data in your systems and the privacy of the information housed therein.  In 2018 it is imperative that you have a data integrity strategy and that you review the regulations surrounding data integrity to ensure that you comply. In the old days, it was left up to each validation engineer to decide when to apply system updates and how often to validate their systems. Data integrity and cybersecurity, as well as cloud validation, demand strategies that include continuous testing. I discussed this in a previous blog post, but it is very important that you test your systems often to achieve a high level of integrity not only of the application but of the data.
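
One simple building block for such a strategy – shown here only as an illustrative sketch, with hypothetical record data – is a periodic integrity check that fingerprints critical records so that unexpected changes are detected and documented between test runs.

```python
import hashlib
import json

def record_fingerprint(record: dict) -> str:
    """Compute a stable SHA-256 fingerprint of a record for integrity comparison."""
    canonical = json.dumps(record, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical baseline captured when the system was validated.
baseline = {"REC-100": record_fingerprint({"lot": "A42", "qty": 500})}

# Later, during continuous testing, re-read the record and compare fingerprints.
current = record_fingerprint({"lot": "A42", "qty": 500})
if current == baseline["REC-100"]:
    print("Integrity check passed for REC-100.")
else:
    print("Integrity check FAILED for REC-100 - investigate and document.")
```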

Practice 10 – Commit to Achieve Level 5 Validation Process Maturity

Finally, your validation processes must mature in 2018. For organizations attempting to improve efficiency and compliance, it is no longer acceptable or feasible to simply continue with antiquated, inefficient paper-based systems. In 2018 it will therefore be important for you to commit to Level 5 validation process maturity, where your processes are automated, you have greater control over your validated systems, and you sustain quality across the validation lifecycle. Check out my other post on achieving Level 5 process maturity.

I wish you all the best in 2018 as you endeavor to establish and maintain the validated state for your computer systems. From the development of Level 5 validation processes through automation, it's important to understand the changes that new technology brings and how well you and your organization can adapt to those changes. Your success depends on it! Good luck and keep in touch!

 

Leveraging the NIST Cybersecurity Framework

As a validation engineer, why should you be concerned about cybersecurity?  Good question!  Today's headlines are filled with instances of cyber attacks and data breaches impacting some of the largest corporate systems around.  As validation engineers, our job is to confirm software quality and that systems meet their intended use.  How can you realistically do this without paying attention to the threat of potential cyber attacks on your validated systems environment?

As with every system environment, you must ensure your readiness to help prevent a cyber event from occurring.  Of course, you can never fully protect your systems to the extent that a cyber attack will never be successful, but you can certainly PREPARE and reduce the probability of this risk.   That’s what this article is all about – PREPAREDNESS.

The NIST Cybersecurity Framework was created through collaboration between industry and government and consists of standards, guidelines, and practices to promote the protection of critical infrastructure.

To get a copy of the NIST Cyber Security Framework publication, click here.  If you are not familiar with the NIST Cyber Security Framework, you can view an overview video and get a copy of the Excel spreadsheet.

Remember the old adage, “…if it's not documented, it didn't happen…”?  You must document the controls, processes, and strategies that allow you to defend your cybersecurity readiness assessment.  The NIST Cyber Security Framework is designed to help organizations view cybersecurity in a systematic way as part of an overall risk management strategy for validated systems.  The Framework consists of three parts:

  1. Framework Core – a set of cybersecurity activities, outcomes, and informative references that are common across your validated systems environments.  The Framework Core consists of five concurrent and continuous Functions: (1) Identify, (2) Protect, (3) Detect, (4) Respond, and (5) Recover, as shown in the figure below.
  2. Framework Profile – helps align your cybersecurity activities with business requirements, risk tolerances, and resources.
  3. Framework Implementation Tiers – a method to view, assess, document, and understand the characteristics of your approach to managing cybersecurity risks in validated systems environments.  This assessment is part of your Cybersecurity Qualification (CyQ).  Life sciences companies should characterize their level of readiness from Partial (Tier 1) to Adaptive (Tier 4).  You can use whatever scale you like in your assessment.

[Figure: The NIST Cybersecurity Framework Core – Identify, Protect, Detect, Respond, Recover]
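
To show how the Framework's structure might be captured for a single validated system, here is a small illustrative sketch (my own simplification, not an official NIST format) that scores each of the five Functions on the Tier 1–4 scale described above. All scores are hypothetical.

```python
TIERS = {1: "Partial", 2: "Risk Informed", 3: "Repeatable", 4: "Adaptive"}

# Hypothetical self-assessment of one validated system against the five Functions.
assessment = {
    "Identify": 3,
    "Protect": 2,
    "Detect": 2,
    "Respond": 3,
    "Recover": 1,
}

for function, tier in assessment.items():
    print(f"{function:<8} Tier {tier} ({TIERS[tier]})")

# Flag the weakest Function so remediation can be prioritized and documented.
weakest = min(assessment, key=assessment.get)
print(f"Lowest readiness: {weakest} - prioritize and document remediation here.")
```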

Most companies are adept at RESPONDING to cyber events rather than preventing them.  This Framework should be part of your overall integrated risk management strategy for validation.  We recommend that validation engineers DOCUMENT their strategy to confirm due diligence with respect to cybersecurity.  In my previous blog post, I recommended that in addition to conducting IQ, OQ, PQ, and UAT testing, you also conduct a CyQ readiness assessment.

Cyber threats are a clear and present danger to companies of all sizes and types.  As validation engineers, we need to rethink our validation strategies and adapt to changes which can have significant impact on our validated systems environments.  Whether you are in the cloud or on-premise, cyber threats are real and may impact you.  This problem is persistent and is not going away anytime soon.  Readiness and preparedness are the key.  Some think that issues concerning cybersecurity are only the purview of the IT team – THINK AGAIN!  Cybersecurity is not only an IT problem; it is an enterprise problem that requires an interdisciplinary approach and a comprehensive governance commitment to ensure that all aspects of your validation processes and business processes are aligned to support effective cybersecurity practices.

If you are responsible for software quality and ensuring the readiness of validated systems, you need to be concerned about this matter.  The threats are real.  The challenges are persistent.  The need for greater diligence is upon us.  Check out the NIST Cyber Security Framework.  Get your cyber house in order.

 

Computer Systems Validation As We Know It Is DEAD

Over the past 10 years, the software industry has experienced radical changes.  Enterprise applications deployed in the cloud, the Internet of Things (IoT), mobile applications, robotics, artificial intelligence, X-as-a-Service, agile development, cybersecurity challenges, and other technology trends force us to rethink strategies for ensuring software quality.  For over 40 years, validation practices have not changed very much.  Surprisingly, many companies still conduct computer systems validation using paper-based processes.  However, the trends outlined above challenge some of the current assumptions about validation.  I sometimes hear people say “… since I am in the cloud, I don’t have to conduct an IQ…” or “… well, my cloud provider is handling that…”

Issues related to responsibility and testing are changing based on deployment models and development lifecycles.  Validation is designed to confirm that a system meets its intended use.  However, how can we certify that a system meets its intended use if it is left vulnerable to cyber threats?  How can we maintain the validated state over time in production if the cloud environment is constantly changing the validated state?  How can we adequately test computer systems if users can download an “app” from the App Store to integrate with a validated system?  How can we ensure that we are following proper controls for 21 CFR Part 11 if our cloud vendor is not adhering to CSA cloud controls?  How can we test IoT devices connected to validated systems to ensure that they work safely and in accordance with regulatory standards?

You will not find the answers to any of these questions in any regulatory guidance document.  Technology is moving at the speed of thought, yet our validation processes are struggling to keep up.

These questions have led me to conclude that validation as we know it is DEAD.  The challenges imposed by the latest technological advances in agile software development, enterprise cloud applications, IoT, mobility, data integrity, privacy and cybersecurity are forcing validation engineers to rethink current processes.

Gartner recently reported that the share of firms using IoT grew from 29% in 2015 to 43% in 2016, and projects that by the year 2020 there will be over 26 billion IoT devices.  It should be noted that Microsoft’s Azure platform includes a suite of applications for remote monitoring, predictive maintenance, and connected factory monitoring for industrial devices.  Current guidance has not kept pace with ever-changing technology, yet the need for quality in software applications remains a consistent imperative.

So how should validation engineers change processes to address these challenges?

First, consider how your systems are developed and deployed.  The V-model assumes a waterfall approach, yet most software today is developed using Agile methodologies.  It is important to take this into consideration in your validation approach.

Secondly, I strongly recommend adding two SOPs to your quality procedures – a Cybersecurity SOP for validated computer systems and a Cloud SOP for validated systems.  You will need these two procedures to provide governance for your cloud and cybersecurity processes.  (If you do not have a cloud or cybersecurity SOP, please contact me and I will send you both SOPs.)

Third, I believe you should incorporate cybersecurity qualification (CyQ) into your testing strategy.  In addition to IQ/OQ/PQ, you should be conducting a CyQ readiness assessment for all validated systems.  A CyQ is an assessment to confirm and document your readiness to protect validated systems against a cyber attack.  It also includes testing to validate current protections for your validated systems.  It is important to note that regulators will judge you on your PROACTIVE approach to compliance.  This is an important step in that direction.

[Figure: Cybersecurity Qualification (CyQ) in the validation testing strategy]
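
There is no standard CyQ format, but as a rough sketch of what a documented readiness assessment could look like, each control can be recorded with its supporting evidence and a pass/fail outcome. Everything below – the controls, evidence references, and results – is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class CyQItem:
    control: str     # readiness control being assessed
    evidence: str    # where the documented evidence lives
    passed: bool

# Hypothetical CyQ checklist executed alongside IQ/OQ/PQ testing.
checklist = [
    CyQItem("Access reviews current for the validated system", "SOP-SEC-001 records", True),
    CyQItem("Patches assessed and applied under change control", "Change request log", True),
    CyQItem("Incident response plan tested in the last 12 months", "Tabletop exercise report", False),
]

ready = [item for item in checklist if item.passed]
print(f"CyQ result: {len(ready)}/{len(checklist)} controls ready")
for item in checklist:
    if not item.passed:
        print(f"  Remediate: {item.control} (evidence: {item.evidence})")
```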

Fourth, you should adopt lean validation methodologies.  Lean validation practices are designed to eliminate waste and inefficiency throughout the validation process while ensuring sustained compliance.

Finally, the time has come for automation.  To keep pace with the changes in current technology as discussed above, you MUST include automation for requirements management, validation testing, incident management and validation quality assurance (CAPA, NC, audit management, training, et al).  I recommend consideration of an Enterprise Validation Management system such as ValidationMaster™ to support the full lifecycle of computer systems validation.  ValidationMaster™  allows you to build a re-usable test script library and represents a “SINGLE SOURCE OF TRUTH” for all of your validation projects.  Automation of the validation process is no longer a luxury but a necessity.

Advanced technology is moving fast.  The time is now to rethink your validation strategies for the 21st century.  Validation as we know it is dead.  Lean, agile validation processes are required to keep pace with rapidly changing technology.  As you embrace the latest cloud, mobile, and IoT technologies, you will quickly find that the old ways of validation are no longer sufficient.  Cyber criminals are not going away, so you need to be ready.  Step into LEAN and embrace the future!