Can Microsoft Azure Be Validated?

The textbook definition of computer systems validation (paraphrased) is “documented evidence that a system performs according to its intended use”.  The FDA uses the definition “…Validation is a process of demonstrating, through documented evidence, that <software applications> will consistently produce the results that meet predetermined specifications and quality attributes.”

Using the FDA’s definition, one should understand that validation is a PROCESS.  It is not a one-time event.  One should also understand that the Agency expects to see “documented evidence” that a system will consistently produce results that meet predetermined specifications and quality attributes.  And finally, from the FDA’s definition, we learn that to properly validate a system, one MUST have “predetermined specifications and quality attributes” defined.

Let’s start at the beginning with the question posed by this blog post: “Can Microsoft Azure® Be Validated?”  The short answer is YES.  In the remainder of this post, I will share six key reasons why this bold statement can be made.  But first, let’s establish some foundation as to what Microsoft Azure® (“Azure®”) is and why this question is relevant.

At its core, Azure is a cloud computing platform and infrastructure created by Microsoft for building, deploying, and managing applications and services through a global network of Microsoft-managed datacenters (Wikipedia).  In today’s global economy, life sciences companies, like many of their industry counterparts, are seeking to do more with less.  These companies are embracing enterprise technologies and leveraging innovation to drive global business and regulatory processes.  If you had asked me 10 years ago how many of my clients were deploying enterprise technologies in the cloud, my answer would have been very few, if any.  If you had further asked how many cloud validation exercises in life sciences I was conducting, I would have said NONE.

Suffice it to say, cloud computing has been a game-changer for life sciences companies.  It should be noted that due to regulatory considerations, the life sciences industry has been reluctant to change validation practices.  But the cloud is changing the process of validation as we know it.  Sure, the basic principles of risk-based validation endure.  You STILL have to provide documented evidence that your software applications will consistently produce results that meet predetermined specifications and quality attributes.  HOW you do this is what is changing.  Validating cloud applications is a bit different than validating on-premise applications, mostly due to the changing roles and responsibilities of system owners and the cloud providers themselves.  I will say at the outset that not all cloud providers are created equal, so “caveat emptor” (let the buyer beware).

Let’s talk about the 6 key reasons why Azure can be validated and how this is done.

SIX REASONS WHY AND HOW AZURE CAN BE VALIDATED.

REASON 1 – AZURE RISK ASSESSMENT

Risk management is not a new concept.  It was applied in the military and aerospace industries as early as the 1940s before being adopted more broadly.  The first crucial step in any computer systems validation project is to conduct a validation risk assessment.  In addition to determining the impact on critical quality attributes, a risk assessment is used to determine the level of validation due diligence required for the system.  How much validation testing you should do for any validation exercise should be based on a risk assessment of the system.  When deploying systems such as Microsoft Dynamics AX® or BatchMaster/GP® on the Azure platform, the risk assessment should include an assessment of the Azure platform itself.  When outsourcing your platform to a vendor such as Microsoft, their risks become your risks; thus, you must conduct your due diligence to ensure that these risks are acceptable to your organization.

The ISPE GAMP 5® (“GAMP 5®”) risk assessment model defines the process for hazard identification, estimation of risk, evaluation of risk, control and monitoring of risks, and documentation of risk.  GAMP 5® describes a five-step process:

  1. Perform Initial Risk Assessment and Determine System Impact
  2. Identify Functions With Impact On Patient Safety, Product Quality and Data Integrity
  3. Perform Functional Risk Assessment and Identify Controls
  4. Implement and Verify Appropriate Controls
  5. Review Risks and Monitor Controls

GAMP 5® does not explicitly address cloud infrastructure.  One could extrapolate how this standard might be applied to cloud infrastructure, but the lack of a “clean” translation is one of its limitations.  Azure could be treated as a Commercial-off-the-Shelf (COTS) platform.  You will note that the risk process identified by GAMP 5® is more application-focused than platform-focused.  To conduct a proper risk assessment of Azure, a framework focused on the platform itself must be established.  In July 2011, ISACA, previously known as the Information Systems Audit and Control Association, released a document called “IT Control Objectives for Cloud Computing: Controls and Assurance in the Cloud” which highlighted specific control objectives and risks inherent in cloud computing.  From a validation perspective, you need to address availability/reliability risks, performance/capacity risks, as well as security and cybersecurity risks.  Using the publicly available independent service organization controls (SOC 1/2/3) reports, you should be able to effectively document and assess these risks.  Remember, if it’s not documented, it didn’t happen!
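To make the due-diligence step concrete, below is a minimal sketch (in Python) of how a platform risk register might be scored and documented.  The risk items, ratings, and RPN thresholds are my own illustrative assumptions, not GAMP 5® or ISACA content; in practice they would come from your SOPs and from the findings in the SOC 1/2/3 reports.

```python
# Minimal sketch of a platform risk register for a cloud (e.g., Azure) deployment.
# Risk items, ratings, and thresholds are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class PlatformRisk:
    risk_id: str
    description: str
    category: str          # availability, performance, security, change control, ...
    severity: int          # 1 (negligible) .. 5 (critical)
    likelihood: int        # 1 (rare) .. 5 (frequent)
    detectability: int     # 1 (easily detected) .. 5 (hard to detect)
    mitigation: str

    @property
    def priority(self) -> int:
        """Simple risk priority number (RPN) = severity x likelihood x detectability."""
        return self.severity * self.likelihood * self.detectability

    @property
    def level(self) -> str:
        """Bucket the RPN into a level that drives the amount of validation testing."""
        if self.priority >= 60:
            return "HIGH"
        if self.priority >= 20:
            return "MEDIUM"
        return "LOW"

register = [
    PlatformRisk("AZ-01", "Regional outage affects hosted ERP", "availability",
                 4, 2, 2, "Geo-redundant deployment; reviewed SOC 2 availability controls"),
    PlatformRisk("AZ-02", "Unannounced platform patch alters behavior", "change control",
                 3, 4, 3, "Subscribe to update notifications; scheduled regression tests"),
    PlatformRisk("AZ-03", "Unauthorized access to tenant data", "security",
                 5, 2, 3, "MFA, role-based access, review of penetration test summaries"),
]

for risk in sorted(register, key=lambda r: r.priority, reverse=True):
    print(f"{risk.risk_id} [{risk.level:6}] RPN={risk.priority:3} {risk.description}")
    print(f"        Mitigation: {risk.mitigation}")
```

The point of the exercise is that the output, a ranked and documented set of platform risks with mitigations, drives how much validation testing you perform on the system hosted on that platform.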

REASON 2 – CHANGE CONTROL (“THE ELEPHANT IN THE ROOM”)

One of the biggest challenges with companies’ adoption of cloud computing is CHANGE CONTROL.  As you may well be aware, cloud environments are under constant change.  In the validation world, change control is an essential process to help ensure that validated systems remain in a validated state.  So how do you maintain the validated state when a platform such as Azure is in a constant state of change?  First, all validated systems are subject to change control.  Each time a system is changed, the change, the reason for the change, and who made the change must be documented.  When the system is on-premise, changes are documented (in general) by the IT department, which owns responsibility for the infrastructure.  In a cloud environment, this responsibility belongs to the cloud provider.  SOC reports highlight and define change control processes and procedures for the cloud environment.

We recommend that our clients obtain a copy of these reports under non-disclosure to verify the compliance of the cloud provider.  Once in the cloud, we recommend developing a testing and regression testing strategy that tests systems in a cloud environment on a more frequent basis.  In the ’80s and ’90s, validation engineers were very reluctant to change validated systems due to the complexity and level of effort involved in testing validated systems using manual testing processes.  Today, companies are leveraging tools like OnShore’s ValidationMaster™ to facilitate automated testing, allowing validation test engineers to schedule the run of automated test scripts in a cloud environment.  ValidationMaster™ takes the complexity out of regression testing and frees up resources for other work.  Change control is essential in both on-premise and cloud environments.  To support the cloud, you will need to establish clear policies and procedures that cover on-premise as well as cloud computing systems, and it is very important that your change control policies reflect procedures for a cloud environment.

Microsoft has established clear controls for Azure to manage change control in its environment.  These procedures have been documented and tested as per the SOC 1/SOC 2 control reports issued by independent auditors.  When validating systems in an Azure environment, you will have to obtain copies of these reports and include them with your validation package.  The supplier audit becomes crucial in a hosted environment.  The good news is that Microsoft is a reputable supplier and can be relied on as a stable vendor with respect to its products.

Microsoft Azure has undergone many independent audits, including a 2013 ISO/IEC 27001:2005 audit whose report confirms that Microsoft has procedures in place that provide governance for change management processes.  The audit report highlights the methodology and confirms that changes to the Azure environment are appropriately tested and approved.

REASON 3 – INDEPENDENT AUDIT CONFIRMATION OF TECHNICAL/PROCEDURAL CONTROLS AND SECURITY

In cloud environments, the role of the cloud provider is paramount.  All cloud providers are not created equal.  Microsoft Azure has passed many audits, and these reports, as previously noted, are available upon request.  Security and cybersecurity issues are very important in the Azure environment.  Microsoft has implemented controls and procedures to ensure compliance.  The company provides detailed information on security on its website (https://www.microsoft.com/en-us/TrustCenter/Security/default.aspx).

REASON 4 – INFRASTRUCTURE QUALIFICATION (IQ)

IQ testing changes in the Azure environment.  In Azure, Microsoft bears some responsibility for the installation of hardware and software.  As previously mentioned, it is important to qualify the environment once it is set up to confirm that what is supposed to be installed is actually present in your environment.  We qualify the virtual software layer as well as Microsoft Lifecycle Services.
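To illustrate what “confirming that what is supposed to be installed is present” can look like in practice, here is a minimal sketch that compares an approved installation manifest against the components actually reported by the environment.  The component names and version numbers are hypothetical; the “actual” values would normally be gathered from the Azure portal, Lifecycle Services, or installation logs during IQ execution.

```python
# Minimal IQ verification sketch: compare an approved manifest against what is
# actually installed. Component names and versions are hypothetical examples.
from datetime import datetime, timezone

approved_manifest = {            # from the approved IQ protocol
    "Dynamics AX AOS": "6.3.6000.149",
    "SQL Server": "13.0.5026.0",
    "SSRS Reporting Extensions": "6.3.6000.149",
}

installed = {                    # gathered from the environment during execution
    "Dynamics AX AOS": "6.3.6000.149",
    "SQL Server": "13.0.5102.14",
    "SSRS Reporting Extensions": "6.3.6000.149",
}

print(f"IQ executed {datetime.now(timezone.utc).isoformat()} UTC")
overall_pass = True
for component, expected in approved_manifest.items():
    actual = installed.get(component, "NOT INSTALLED")
    result = "PASS" if actual == expected else "FAIL"
    overall_pass = overall_pass and result == "PASS"
    print(f"{component:30} expected {expected:15} actual {actual:15} {result}")

print("IQ overall result:", "PASS" if overall_pass else "FAIL - raise a deviation")
```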

REASON 5 – OPERATIONAL/PERFORMANCE QUALIFICATION (OQ/PQ)

The process of operational qualification of the application in a cloud environment is very similar to the process in an on-premise environment.  Microsoft Dynamics AX can now easily be deployed in an Azure environment.  Validation testing of this type of implementation follows the same approach, with some variation.  The installation of Microsoft Dynamics AX uses what is called Lifecycle Services, which allows you to set up and install the system rapidly using automation.  Lifecycle Services may be subject to validation itself since it is used to establish a validated systems environment.  You will need to identify failure of Lifecycle Services as a risk and rate the risk of using this service as part of the risk assessment.

Lifecycle Services is promoted as bringing predictability and repeatability to the process of software installation.  These are principles upon which validation has stood for many years.  However, it is important to go beyond the marketing to verify that this software layer is performing according to its intended use.  We typically draft a qualification script to qualify Lifecycle Services prior to using it in a validated systems environment.  We conduct OQ/PQ testing in the same manner as in a non-hosted environment.

REASON 6 – ABILITY TO MAINTAIN THE VALIDATED STATE

With Microsoft Azure, the need to maintain the validated state does not go away.  It is entirely possible to maintain mission-critical applications hosted on the Azure platform in a validated state.  To do so, we recommend establishing policies and procedures and adopting the test early/test often premise to ensure that changes made to the infrastructure do not negatively impact the validated production systems environment.  Maintaining the validated state on Azure is achievable, but it must be done with a combination of technology, processes, and procedures to ensure compliance.

As a validation engineer, I have validated many systems on the Azure platform.  I have concluded that the Azure platform can be validated and used within the life sciences environment.  How much validation due diligence you conduct should be based on risk.  Don’t overdo it!  Use lean methodologies to drive your validation processes.

Upon close inspection of this platform, it has built-in quality and security controls to provide a level of assurance of its suitability for deployment in regulated environments.   I like the idea that the system has been independently inspected by numerous 3rd party organizations to help ensure objectivity.  This platform is ready for prime time and many of my clients are leveraging its benefits and freeing themselves from the burden of managing their infrastructure.

Why Are You Still Generating Validation Test Scripts Manually?

Drafting validation scripts is one of the key activities in a validation exercise, designed to provide documented evidence that a system performs according to its intended use.  The FDA and other global agencies require objective evidence, usually in the form of screenshots that sequentially capture the target software process, to provide assurance that systems can consistently and repeatedly perform the various processes representing the intended use of the system.

Since the advent of the PC, validation engineers have been writing validation test scripts manually.  The manual process of computer systems validation test script development involves capturing screenshots and pasting them into Microsoft Word test script templates.  To generate screen captures, some companies use the Windows Print Screen function, TechSmith Snagit, and other such tools.  A chief complaint of many validation engineers is that the test script development process is a slow, arduous one.  Some validation engineers are very reluctant to update/re-validate systems due to this manual process.  So, the question posed by this blog article is simply this: “Why are you still generating test scripts manually???”
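For context, here is a minimal sketch of the capture-and-paste workflow being described, scripted with the third-party Pillow and python-docx libraries.  It is only an illustration of the mechanics (and of what commercial test tools automate end to end); it is not how any particular product works internally, and screen capture via Pillow assumes a Windows or macOS desktop session.

```python
# Minimal sketch: capture a screenshot and drop it into a Word-based test script
# template. Requires the third-party packages Pillow and python-docx.
from PIL import ImageGrab              # screen capture (Windows/macOS)
from docx import Document
from docx.shared import Inches

def capture_step(step_no: int, instruction: str, image_path: str) -> dict:
    """Grab the current screen as objective evidence for one test step."""
    ImageGrab.grab().save(image_path)
    return {"step": step_no, "instruction": instruction, "evidence": image_path}

steps = [
    capture_step(1, "Log in to the application with a validated test account", "step_01.png"),
    capture_step(2, "Open the sales order entry form", "step_02.png"),
]

doc = Document()                       # in practice, open your approved template instead
doc.add_heading("OQ Test Script TS-001 (draft)", level=1)
for s in steps:
    doc.add_paragraph(f"Step {s['step']}: {s['instruction']}")
    doc.add_picture(s["evidence"], width=Inches(6))
doc.save("TS-001_draft.docx")
print("Draft test script written to TS-001_draft.docx")
```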

I have been conducting validation exercises for engineering and life sciences systems since the early 1980s.  I too have experienced first-hand the pain of writing test scripts manually.  We developed and practice “lean validation,” so I sought ways to eliminate manual, wasteful validation processes.  One of the most wasteful processes in validation is the manual capture/cutting/pasting of screenshots into a Microsoft Word document.

The obvious follow up question is “how do we capture validation tests in a more automated manner to eliminate waste and create test scripts that are complete, accurate and provide the level of due diligence required for validation?”

In response to this common problem, we developed an Enterprise Validation Management system called ValidationMaster™.  This system includes TestMaster™, an automated testing system that allows validation engineers to capture and execute validation test scripts in a cost-effective manner.

TestMaster™ is designed to validate ANY enterprise or desktop application.  It is a browser-based system and allows test engineers to open any application on their desktop, launch TestMaster™, and capture their test scripts while sequentially executing the various commands in their applications.    As the validation engineer navigates through the application, TestMaster™ captures each screenshot and text entry entered in the application.

Once the test script is saved, TestMaster™ allows the script to be published in your UNIQUE test script template with the push of a button.  No more cutting/pasting screenshots from manual processes!  You can generate your test scripts in MINUTES as opposed to the hours it sometimes takes to compile documents based on a series of screenshots.  If you are one of those validation engineers who does not like screenshots in your scripts, you can create text-based test steps quickly and easily using TestMaster™.

So, what is the biggest benefit of using TestMaster™ versus manual processes?  There are four key benefits, summarized as follows:

  1.  Automated Test Script Execution – for years, validation engineers have wanted a more automated approach to the execution of validation test scripts.  ValidationMaster™ supports both hands-on and hands-off validation testing.  Hands-on validation testing is the process whereby a validation engineer looks at each step of a validation test script and executes the script step-by-step by clicking through the process.  Hands-off validation allows a validation engineer to execute a test script with no human intervention.  This type of regression testing (hands-off) is very useful for cloud-based systems or systems that require more frequent testing.  The validation engineer simply selects a test script and defines a date/time for its execution.  At the designated time, with no human intervention, the system executes the test script and reports the test results back to the system (see the scheduling sketch after this list).  Automated testing is here!  Why are you still doing this manually?

  2.  Traceability – TestMaster™ allows validation engineers to link each test script to a requirement or set of requirements; thus the system delivers automatic traceability, which is a regulatory requirement.  With the click of a button, TestMaster™ allows validation engineers to create a test script directly from a requirement.  This is a powerful capability that allows you to see requirements coverage on demand through our validation dashboard.  This validation dashboard is viewable on a desktop or mobile device (Windows, Apple, Android).

  3.  Online Test Script Execution and Approval – One of the biggest problems with manual test scripts is that they must be printed and manually routed for review and approval.  Some companies that have implemented document management systems may have the ability to route the scripts electronically for review and approval.  The worst-case scenario is the company that has no electronic document management system and generates these documents manually.  TestMaster™ allows validation engineers to execute test scripts online and capture test script results in an easy manner.  The test script results can be captured in an automated way and published into executed test script templates quickly and easily.  If incidents (bugs/anomalies) occur during testing, users have the ability to automatically capture an incident report which is tied to the exact step where the anomaly/bug occurred.  ValidationMaster™ is tightly integrated with a 21 CFR Part 11-compliant portal (ValidationMaster Portal™); once the test script is executed, it is automatically published to the ValidationMaster™ Portal where it is routed for review/approval.  The ability to draft, route, review, approve, execute and post-approve validation test scripts is an important, time/cost-saving feature that should be a part of any 21st-century validation program.

  4.  Reuse Test Scripts For Regression Testing – manual test scripts are not ‘readily’ reusable.  What I mean by this is that the Word documents must be edited or even re-written for validation regression testing.  Validation is not a one-time process.  Regression testing is a fact of life for validation engineers.  The question is, will you rewrite all of your test scripts or use automated tools to streamline the process?  ValidationMaster™ allows validation engineers to create a reusable test script library.  This library includes all the test scripts that make up your validation test script package.  During re-validation exercises, you have the ability to reuse the same test scripts for regression testing.
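To illustrate the hands-off scheduling idea referenced in the first benefit above, here is a minimal sketch that waits until a predefined date/time, runs a set of automated checks, and records timestamped pass/fail results.  The test functions and the results log format are hypothetical; this is a generic illustration, not the ValidationMaster™ scheduler.

```python
# Minimal sketch of hands-off, scheduled test execution. The tests and the
# results log are hypothetical; commercial tools wrap this in a full audit trail.
import json
import time
from datetime import datetime, timezone

def test_login_screen_loads() -> bool:
    # Placeholder for an automated UI or API check.
    return True

def test_order_total_calculation() -> bool:
    # Placeholder: 2 lines at 10.00 each should total 20.00.
    return round(2 * 10.00, 2) == 20.00

SCHEDULED_AT = datetime(2018, 6, 1, 2, 0, tzinfo=timezone.utc)   # run at 02:00 UTC
TESTS = [test_login_screen_loads, test_order_total_calculation]

while datetime.now(timezone.utc) < SCHEDULED_AT:
    time.sleep(60)                       # wait; no human intervention required

results = []
for test in TESTS:
    started = datetime.now(timezone.utc).isoformat()
    passed = test()
    results.append({"test": test.__name__, "started_utc": started,
                    "result": "PASS" if passed else "FAIL"})

with open("regression_run.json", "w") as fh:
    json.dump(results, fh, indent=2)     # objective evidence of the unattended run
print(*results, sep="\n")
```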

Given the rapid adoption of cloud, mobile and enterprise technologies in life sciences, a new approach to validation is required.  Yes, you can still conduct validation exercises on paper but why would you?  In the early days of enterprise technology, we did not have tools available that would facilitate the rapid development of validation test scripts.  Today, that is not the case.  Systems like ValidationMaster™ are available in either a hosted or on-premise environment.  These game-changing systems are revolutionizing the way validation is conducted and offering time/cost-saving features that make this process easier.   So why are you still generating test scripts manually?

Accelerating Validation: 10 Best Practices You Should Know in 2018

As we open the new year with resolutions and fresh thinking, I wanted to offer 10 best practices that should be adopted by every validation team.  The major challenges facing validation engineers every day include cyber threats against validated computer systems, data integrity issues, maintaining the validated state, cloud validation techniques, and a myriad of other issues that affect validated systems.

I have been championing the concepts of validation maturity and lean validation throughout the validation community.  In a recent blog post, I highlighted reasons why validation as we have known it over the past 40 years is dead.  Of course, I mean this somewhat tongue in cheek.

The strategies included in FDA guidance, and those we used to validate enterprise systems in the past, do not adequately address IoT, cloud technologies, cybersecurity, enterprise apps, mobility, or development approaches such as Agile, all of which change the way we think about and manage validated systems.

The following paragraphs highlight the top 10 best practices that should be embraced for validated computer systems in 2018.

Practice 1 – Embrace Lean Validation Best Practices

Lean validation is the practice of eliminating waste and inefficiencies throughout the validation process while optimizing the process and ensuring compliance.  Lean validation is derived from lean manufacturing practices, where non-value-added activity is eliminated.  As a best practice, embrace lean validation within your processes.  To fully embrace lean, it is necessary to automate your validation processes; automation should therefore be part of your plans for updating your validation processes in 2018.  My recent blog post on lean validation discusses lean in detail and how it can help you not only optimize your validation process but also achieve compliance.

Practice 2 – Establish Cybersecurity Qualification and SOP

I have noted in previous blog posts that cyber threats are the elephant in the room.  Although validation engineers pledge to confirm that the system meets its intended use through the validation process, many organizations do not treat cyber security as a clear and present danger to validated systems.  I believe that this is a significant oversight in many validation processes.  Thus, I have developed a fourth leg for validation called cybersecurity qualification, or CyQ.  Cybersecurity qualification is an assessment of the readiness of your processes and controls to protect against a cyber event.  It includes testing to confirm the effectiveness and strength of your controls.  In addition to IQ, OQ, and PQ, I have added CyQ as the fourth leg of my validation approach.  It is strongly recommended in 2018 that you consider this best practice.
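As a rough sketch of what a CyQ readiness assessment might record, the snippet below scores a set of cybersecurity controls and flags gaps requiring remediation.  The control names, scoring, and readiness threshold are illustrative assumptions of mine, not a published standard.

```python
# Illustrative CyQ readiness sketch. Control names and scoring are assumptions.
controls = [
    # (control, implemented, tested)
    ("Multi-factor authentication for all admin accounts", True,  True),
    ("Encryption of data at rest and in transit",          True,  True),
    ("Documented patch management cadence",                True,  False),
    ("Backup and restore verified for validated data",     False, False),
    ("Incident response procedure covering validated systems", True, False),
]

gaps = []
score = 0
for name, implemented, tested in controls:
    score += (1 if implemented else 0) + (1 if tested else 0)
    if not (implemented and tested):
        gaps.append(name)

readiness = score / (2 * len(controls))
print(f"CyQ readiness score: {readiness:.0%}")
print("Gaps requiring remediation:" if gaps else "No gaps identified.")
for g in gaps:
    print(" -", g)
print("CyQ outcome:", "READY" if readiness >= 0.9 and not gaps else "NOT READY")
```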

Practice 3 – Automate Validation Testing Processes

In 2018, many validation processes are still paper-based.  Companies are still generating test scripts using Microsoft Excel or Microsoft Word and manually tracing test scripts to requirements.  This is not only time-consuming but very inefficient, and it causes validation engineers to focus more on formatting test cases than on finding bugs and other software anomalies that could affect the successful operation of the system.  Lean validation requires an embrace of automated testing systems.  While there are many point solutions on the market that address different aspects of validation testing, for 2018 it is strongly recommended that you embrace enterprise validation management as part of your overall strategy.  An enterprise validation management system such as ValidationMaster™ manages the full lifecycle of computer systems validation.  In addition to managing requirements (user, functional, design), the system also facilitates automated testing, incident management, release management, and agile software project methodologies.  In 2018, the time has come to develop a reusable test script library to support future regression testing and maintaining the validated state.

An enterprise validation management system will also offer a single source of truth for validation and validation testing assets.  This is no longer a luxury but mandatory when you consider the requirement for continuous testing in the cloud. This is a best practice whose time has come. In 2018, establish automated validation processes to help maintain the validated state and ensure software quality.

Practice 4 – Develop a Cloud Management SOP For Validated Systems

Many of today’s applications are deployed in the cloud.  Office 365, Dynamics 365, and other such applications have gained prominence and popularity within life sciences companies.  In 2018, you must have a cloud management standard operating procedure that provides governance for all cloud activities.  This SOP should define how you acquire cloud technologies and what your specific requirements are.  Failure to have such an SOP is failure to understand and effectively manage cloud technologies for validated systems.  I have often heard validation engineers say that since a system is in the cloud, the principles of validation no longer apply.  This is absolutely not true.  In a cloud environment, the principles of validation endure.  You still must qualify cloud environments, and the responsibilities between you and the cloud provider must be clearly defined.  If you don’t have a cloud SOP or don’t know where to start in writing one, write me and I will send you a baseline procedure.

Practice 5 – Establish Validation Metrics and KPIs

Peter Drucker said you can’t improve what you can’t measure.  It is very important that you establish validation metrics and key performance indicators for your validation process in 2018.  You need to define what success looks like.  How will you know when you’ve achieved your objectives?  Some key performance indicators may include the number of errors, the number of systems validated, the number of incidents (this should be trending downward), and other such measures.  Our ValidationMaster™ portal allows you to establish KPIs for validation and track them over time.  In 2018 it’s important to embrace this concept.  I have found over the years that most validation activities are not looked at from a performance perspective.  They are often inefficient and time-consuming, and no one actually measures the value validation delivers to the organization or tracks any key performance indicators.  The time has come to start looking at this and measuring it as appropriate.
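Here is a minimal sketch of tracking one such KPI; the cycle names and incident counts are invented for illustration.  The point is simply to record the measure per validation cycle and check that the trend is moving in the right direction.

```python
# Illustrative KPI tracking sketch. Cycle names and incident counts are invented.
incidents_per_cycle = [
    ("2017-Q1", 18),
    ("2017-Q2", 14),
    ("2017-Q3", 11),
    ("2017-Q4", 9),
]

counts = [count for _, count in incidents_per_cycle]
average = sum(counts) / len(counts)
trend = "downward (good)" if counts[-1] < counts[0] else "flat or upward (investigate)"

print(f"Average incidents per cycle: {average:.1f}")
print(f"Latest cycle {incidents_per_cycle[-1][0]}: {counts[-1]} incidents")
print(f"Trend: {trend}")
```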

Practice 6 – Use Validation Accelerators For Enterprise Systems

Across the globe, validation engineers are being asked to do more with less.  Organizations are demanding greater productivity and compliance.  As you look at enterprise technologies such as Microsoft Dynamics 365®, you need to consider the importance of this environment and how it should be validated.  There is a learning curve with any enterprise technology.  You should recognize that although you may know a lot about validating systems in general, you may not know the ins and outs of this particular technology.  The FDA stipulates in its guidance that you may use vendor information as a starting point for validation.  Understand that there is no such thing as validation out of the box; therefore, you must do your due diligence when it comes to validated systems.

In 2018 we recommend that for systems such as Microsoft Dynamics 365, BatchMaster™, Edgewater Fullscope EDGE®, Veeva® ECM systems, Oracle e-Business®, and many others, you use a Validation Accelerator to jumpstart your validation process.  OnShore Technology Group offers Validation Accelerators for the aforementioned software applications that will help streamline the validation process.  Most noteworthy, our Validation Accelerators come bundled with a full enterprise validation management system, ValidationMaster™.  This helps to deliver a single source of truth for validation.

Practice 7 – Fill Staffing Gaps With Experts on Demand

In 2018, you may find yourself in need of validation expertise for your upcoming projects but without sufficient internal resources to fulfill those needs.  A burgeoning trend today is the outsourcing of validation staff to achieve your objectives – validation staffing on demand.  OnShore offers an exclusive service known as ValidationOffice℠ which provides validation team members on demand.  We recruit and retain top talent for your next validation project.  We offer validation engineers, technical writers, validation project managers, validation test engineers, validation team leaders, and other such staff with deep domain experience and validation competence.  In 2018 you may want to forgo direct hires and use ValidationOffice℠ to fulfill your next staffing need.

Practice 8 – Establish Validation Test Script Library

One of the primary challenges with validation testing is regression testing.  Once a system is validated and brought into production, any software changes, patches, updates, or other required changes drive the need for regression testing.  This often results in a full rewrite of manual, paper-based test scripts.  Today’s validation engineers have tools at their disposal that eliminate the need to rewrite test scripts and allow them to pull the scripts required for regression testing from a reusable test script library.  This is the agile, lean way to conduct validation testing.  In 2018, you should consider purchasing an enterprise validation management system to manage a reusable test script library.  This is a practice that will save you both time and money in the development and online execution of validation test scripts.

Your test scripts can be either fully automated or semi-automated.  Fully automated test scripts can be run automatically with no human intervention.  Semi-automated validation test scripts are used by the validation engineer to conduct validation testing online.  It should be noted that ValidationMaster™ supports both.  Check out the demo to learn more about how this works.

Practice 9 – Develop a Strategy For Data Integrity

The buzzword for 2018 is data integrity.  Europe has just announced the GDPR, its regulation governing data protection and privacy.  The new regulation holds you accountable for the integrity of the data in your systems and the privacy of the information housed therein.  In 2018 it is imperative that you have a data integrity strategy and that you look at the regulations surrounding data integrity to ensure that you comply.  In the old days of validation, it was left up to each validation engineer when they would do system updates and how often they would validate their systems.  Data integrity and cybersecurity, as well as cloud validation, demand strategies that include continuous testing.  I discuss this in a previous blog post, but it is very important that you consider testing your systems often to achieve a high level of integrity not only of the application but of the data.

Practice 10 – Commit to Achieve Level 5 Validation Process Maturity

Finally, your validation processes must mature in 2018.  It is no longer acceptable or feasible, for most organizations attempting to improve efficiency and compliance, to simply continue with antiquated, inefficient paper-based systems.  Therefore, in 2018 it will be important for you to commit to Level 5 validation process maturity, where your processes are automated, you have greater control over your validated systems, and quality is sustained across the validation lifecycle.  Check out my other post on achieving Level 5 process maturity.

I wish you all the best in 2018 as you endeavor to establish and maintain the validated state for computer systems. From the development of Level 5 validation processes through automation it’s important to understand the changes that new technology brings and how well you and your organization can adapt to those changes. Your success depends on it! Good luck and keep in touch!

 

Computer Systems Validation As We Know It Is DEAD

Over the past 10 years, the software industry has experienced radical changes.  Enterprise applications deployed in the cloud, the Internet of Things (IoT), mobile applications, robotics, artificial intelligence, X-as-a-Service, agile development, cybersecurity challenges, and other technology trends force us to rethink strategies for ensuring software quality.  For over 40 years, validation practices have not changed very much.  Surprisingly, many companies still conduct computer systems validation using paper-based processes.  However, the trends outlined above challenge some of the current assumptions about validation.  I sometimes hear people say “… since I am in the cloud, I don’t have to conduct an IQ…” or “… well, my cloud provider is handling that…”

Issues related to responsibility and testing are changing based on deployment models and development lifecycles.  Validation is designed to confirm that a system meets its intended use.  However, how can we certify that a system meets its intended use if it is left vulnerable to cyber threats?  How can we maintain the validated state over time in production if the cloud environment is constantly changing the validated state?  How can we adequately test computer systems if users can download an “app” from the App Store to integrate with a validated system?  How can we ensure that we are following proper controls for 21 CFR Part 11 if our cloud vendor is not adhering to CSA cloud controls?  How can we test IoT devices connected to validated systems to ensure that they work safely and in accordance with regulatory standards?

You will not find the answers to any of these questions in any regulatory guidance documents.  Technology is moving at the speed of thought yet our validation processes are struggling to keep up.

These questions have led me to conclude that validation as we know it is DEAD.  The challenges imposed by the latest technological advances in agile software development, enterprise cloud applications, IoT, mobility, data integrity, privacy and cybersecurity are forcing validation engineers to rethink current processes.

Gartner recently reported that the share of firms using IoT grew from 29% in 2015 to 43% in 2016, and projects that by the year 2020 over 26 billion devices will be connected IoT devices.  It should be noted that Microsoft’s Azure platform includes a suite of applications for remote monitoring, predictive maintenance, and connected factory monitoring for industrial devices.  Current guidance has not kept pace with ever-changing technology, yet the need for quality in software applications remains a consistent imperative.

So how should validation engineers change processes to address these challenges?

First, consider how your systems are developed and deployed.  The V-model assumes a waterfall approach, yet most software today is developed using Agile methodologies.  It is important to take this into consideration in your validation approach.

Secondly, I strongly recommend adding two SOPs to your quality procedures – a Cybersecurity SOP for validated computer systems and a Cloud SOP for validated systems.  You will need these two procedures to provide governance for your cloud processes.  (If you do not have a cloud or cybersecurity SOP please contact me and I will send you both SOPs.)

Third, I believe you should incorporate cybersecurity qualification (CyQ) into your testing strategy.  In addition to IQ/OQ/PQ, you should be conducting a CyQ readiness assessment for all validated systems.  A CyQ is an assessment to confirm and document your readiness to protect validated systems against a cyber attack.  It also includes testing to validate current protections for your validated systems.  It is important to note that regulators will judge you on your PROACTIVE approach to compliance.  This is an important step in that direction.

[Figure: Cybersecurity qualification (CyQ) alongside IQ/OQ/PQ]

Fourth, you should adopt lean validation methodologies.  Lean validation practices are designed to eliminate waste and inefficiency throughout the validation process while ensuring sustained compliance.

Finally, the time has come for automation.  To keep pace with the changes in current technology as discussed above, you MUST include automation for requirements management, validation testing, incident management and validation quality assurance (CAPA, NC, audit management, training, et al).  I recommend consideration of an Enterprise Validation Management system such as ValidationMaster™ to support the full lifecycle of computer systems validation.  ValidationMaster™  allows you to build a re-usable test script library and represents a “SINGLE SOURCE OF TRUTH” for all of your validation projects.  Automation of the validation process is no longer a luxury but a necessity.

Advanced technology is moving fast.  The time is now to rethink your validation strategies for the 21st century.  Validation as we know it is dead.  Lean, agile validation processes are needed to keep pace with rapidly changing technology.  As you embrace the latest cloud, mobile, and IoT technologies, you will quickly find that the old ways of validation are no longer sufficient.  Cyber criminals are not going away, so you need to be ready.  Step into LEAN and embrace the future!

 

Automated Validation Best Practices

Automation is the key to lean validation practices.  Although many validation processes are still paper-based manual processes, there are best practices that support Independent Verification and Validation (IV&V) processes that drive efficiency and compliance.

BEST PRACTICE 1 – Establish Independence

The IEEE 1012 Standard For System, Software and Hardware Verification and Validation states that Independent Verification and Validation (IV&V) is defined by three parameters:

  1. Technical Independence – ensures independence from the development team.  Technical independence is intended to provide a fresh point of view in the examination of software applications to help better detect subtle errors that may be overlooked by those that are too close to the solution such as the development or system implementation team.
  2. Managerial Independence – helps to ensure that the IV&V effort is managed by an organization separate and distinct from the development or program management team.  Managerial independence ensures that the validation team has the autonomy to independently select the validation methodology, processes, schedule, tasks, and testing strategy to independently confirm the suitability of applications for their intended use.  Managerial independence also ensures that the IV&V team can objectively report all validation test results without any restrictions or approval from the development team or system integration team.  This is a very important level of independence.
  3. Financial Independence – ensures that there are no financial ties between the IV&V team and the development team, so that objectivity is preserved.  This level of independence is designed to prevent situations where financial ties may adversely influence or pressure IV&V personnel to deliver less than objective, authentic test results.

The IEEE 1012 standard speaks of various forms of independence, but the bottom line is that the IV&V team should be as independent as possible from the development team.  It is not good practice for development teams to validate their own development projects; objectivity is sacrificed when this is done.  Following this best practice ensures an objective examination of your software projects, free of bias and undue external influence from the development team.

BEST PRACTICE 2 – Continuous Testing In The Cloud

Cloud environments can be validated.  However, there are several issues and characteristics of cloud environments that challenge traditional assumptions regarding validation efforts.

  • Continuous changes in the cloud
  • Inability to conduct supplier audits for large cloud vendors (Microsoft, Oracle, et al)
  • Maintaining the Validated State

Cloud system environments continuously change.  Validation engineers are not used to uncontrolled changes in system environments.  We have been taught that all changes to a system environment once it has been validated must undergo change control.  Thus all changes are subject to a change request process.

In cloud environments, we don’t control when changes are made to systems.  Cloud vendors may change disk drives, virtual servers, or memory, apply patches and updates, and make many other system changes that may affect your validated system environment.  So the question becomes: how do you maintain the validated state in the cloud?  There are several best practices designed to answer this question.  First of all, you need a way to determine what changes are made in the cloud.  Take, for example, Microsoft Office 365 or Microsoft Dynamics 365.  Microsoft has established what is known as the Trust Center.  The Microsoft Trust Center is an excellent resource, and it provides information about how Microsoft manages its cloud environment.  The first consideration when selecting cloud technology is who your provider is.  All cloud providers are not created equal.  Some cloud providers take compliance, security, data integrity, and governance seriously; others are more general or consumer-oriented in nature and do not prioritize these characteristics.

Microsoft, continuing the example, has achieved several key industry certifications for its cloud environment.  But most importantly, through the Trust Center it has provided visibility and clear communication as to how it manages the cloud.  From a testing perspective, Microsoft solves one of the biggest problems you have in the cloud: how do you know what changes were made in the cloud and when the cloud provider made them?  Microsoft provides a list of updates by product and application and tells you exactly what changes were made, the date the changes were made, and whether the updates or patches were successfully applied in the environment.  With the Microsoft cloud it is no longer the case that you don’t know when Microsoft has changed the environment.  Thanks to this transparency, you know exactly what changes are made to the environment, which brings us to the best practice of continuous testing.

Since cloud environments change so often you should employ a strategy known as continuous testing. Continuous testing is essentially validation testing at predefined (user-defined) intervals to ensure that cloud environments are maintained in a validated state. To successfully employ a continuous testing strategy, automation is essential. This is not a process that you would want to carry out manually although it can be carried out manually if you so desire. Automation adds a dimension of efficiency and consistency in the environment.

To employ continuous testing, you want to establish a reusable test script library.  This is essential.  Once you have validated your system using automated tools such as ValidationMaster™, you will have established a reusable test script library.  The test scripts developed can be used for subsequent regression testing and can be automated to save both time and money.  For continuous testing, you would conduct an impact analysis to determine the impact of changes made to the cloud environment.  Once you conduct an impact analysis, you would perform a risk assessment to ensure that you effectively monitor risk in accordance with ISPE GAMP 5®.  You then select, from your reusable test script library, the regression tests suitable for continuous testing in your cloud environment.  Once these test scripts are executed, you can document the actual results and provide a level of due diligence for regulators that you are maintaining your cloud environment in a validated state.
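Here is a minimal sketch of the selection step just described: given the components flagged by the impact analysis, pick the scripts from the reusable test script library whose coverage touches those components.  The library entries and component tags are hypothetical.

```python
# Illustrative regression-selection sketch. Library entries are hypothetical.
test_library = {
    "TS-001 Login and security roles":        {"security", "platform"},
    "TS-014 Sales order entry":               {"sales", "pricing"},
    "TS-022 Batch record approval workflow":  {"quality", "workflow", "platform"},
    "TS-031 Label printing":                  {"printing"},
}

# Output of the impact analysis for this month's cloud platform update.
impacted_components = {"platform", "printing"}

selected = [name for name, components in test_library.items()
            if components & impacted_components]

print("Regression scripts selected for this continuous-testing cycle:")
for name in selected:
    print(" -", name)
```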

BEST PRACTICE 3 – Select The Right Automation Tools

Another key best practice is selecting the right automation tools for validation.  How do you know how to select the right tools?  There are two types of automated tools on the market: (1) point solutions and (2) enterprise solutions.  A point solution addresses a single element of the validation process.  For example, the management of requirements is an essential core component of any validation exercise.  There are requirements management point solutions on the market that would assist you in effectively managing user, functional, and design requirements for any validation initiative.  Testing is another core element of the validation process.  There are many solutions that allow you to capture and record test scripts, and some even allow you to execute test scripts online.  The problem with point solutions is that they only provide one step in the process.  When validating systems, it is not uncommon to use upwards of 17 different systems (point solutions) to prepare validation documentation and due diligence.  This does not make much sense and is often fraught with duplication of effort and inefficiencies that cost time and money.

To drive lean validation processes and to achieve automation best practice, you need an enterprise validation management solution to fully automate the validation process – not just one part of it. An enterprise validation management system has the capability of managing validation planning documentation such as the validation master plan, risk assessment, validation project plan, and other related documentation.

As a matter of fact an enterprise validation management system includes an enterprise content management system as a core component of the overall solution. The key deliverables from the validation process are documents. Lots of documents! It stands to reason that an enterprise content management system would be an overall core part of the solution. An enterprise validation management system should also include a requirements management system. It should have the ability to manage any type of requirements. An automated test engine should be at the core of such a solution. The automated test engine should have the ability to not only record test scripts but execute test scripts online and capture objective actual results.

The system should have a robust reporting engine that facilitates the efficient output of any type of report required as part of the due diligence for validation.  Quality management is at the core of the validation process.  Therefore, an enterprise validation management system should include capabilities for all aspects of quality, including change control, audit management, CAPA, nonconformances, training, periodic review, trend analysis, and validation key performance indicators.  The system should provide real-time statistics on the overall health and performance of validation processes.  The system should be built on standard technology adaptable to any systems environment.

It is best practice to select and deploy the RIGHT tools to support enterprise validation processes.  Point solutions will only get you so far.  Selecting the proper automation tools can save both time and money and deliver a single source of truth for your validation projects.

BEST PRACTICE 4 – Establish a Reusable Test Library

One of the most laborious tasks during software validation is TESTING.  The test script development, execution and documentation process takes considerable time if you do it correctly.  From the establishment of a test environment through the development of test scripts, the validation engineer must carefully document expected and actual results sufficient to prove that systems meet their intended use and have the requisite quality expected of such systems.

Developing test scripts takes time.  Traceability also takes time.  When systems are validated, you want to have the ability to retest a system as required but not have to rewrite test scripts over and over again.  A reusable test script library has been one  of the most effective practices I have implemented.

Reusable test scripts can save up to 60% of the time that would otherwise be required to rewrite test scripts.  Establishing a reusable test script library with FULLY AUTOMATED scripts can save even more time and money, in that fully automated scripts can be executed without human intervention.  You have the ability to set a date/time when test scripts are to be executed, and the system (ValidationMaster®) will automatically execute them and report the results back in a fraction of the time it takes to execute them manually.  It is therefore best practice to establish a reusable test script library for enterprise validated systems.

BEST PRACTICE 5 – Document Clear Objective Test Evidence

Documentation of clear, objective test evidence is essential for validation.  Many automated validation management systems do not have reporting engines robust enough to output documents in your unique format.  It is best practice to employ an enterprise validation management system that allows you to present clear, objective test evidence in your unique document formats as specified by your SOPs.  ValidationMaster™ has a comprehensive reporting engine that allows you to deliver validation reports in your unique format, as required by this best practice.


BEST PRACTICE 6 – Establish a Single Source of Truth For Validation Deliverables

For many organizations that conduct validation on paper, there is no single source of truth for validation.  Some validation assets are housed within the document management system.  Part of the validation package is kept within a code management system such as SourceSafe.  Part of the deliverables may be kept within a requirements management system.  Some incident reports may be kept in an incident management system.  Other validation deliverables may be paper-based.  In many cases, validation engineers attempt to keep signed copies of documentation in multiple three-ring binders.  This was the traditional practice in the ’80s.

For lean validation practices that support automation, it is best practice to establish a single source of truth for all validation deliverables.  This means that there is a single place where all validation deliverables, pre- and post-execution, are stored.  A single source of truth facilitates better auditing and eliminates the common occurrence of losing documentation needed to support an audit exercise.  This best practice is essential to achieving validation excellence.

One of the core benefits of ValidationMaster™ is that it delivers a single source of truth for all validation projects.  You can store all of your validation projects in a single, easy-to-use system and reference it for internal or external audits.  For lean validation, this is current best practice.

Following the six key best practices above can save both time and money. Validation has not changed much over the last 40 years but the way we manage it has changed significantly. Cloud validation, mobility, cyber security and a host of other factors change the way we look at computer systems validation and manage it. I hope you find these best practices useful and effective in helping you to deliver your next validation project on time and within budget.  We use ValidationMaster™ in our practice every day to support our lean validation processes saving our clients considerable time and money.  What’s in your validation office?

The True Meaning of Software Quality

As a long-time validation engineer, I often ponder questions such as “what does it mean to achieve software quality and is it sustainable over time?”  I ask myself these questions because in today’s systems environments, there are many factors that can impact software quality assurance.

Cyber threats are the elephant in the room.  Most validation projects include IQ/OQ/PQ and UAT testing but do not address cyber threats at all.  Can you really ensure that your validated environments are safe and secure without considering cybersecurity as part of your overall validation strategy?  The International Software Testing Qualifications Board (ISTQB) defines software quality as “…The totality of functionality and features of a software product that bear on its ability to satisfy stated or implied needs…”  Another definition is “…the degree of conformance to explicit or implicit requirements and expectations…”  Finally, IEEE calls software quality “…The degree to which a system, component, or process meets specified requirements, customer, user needs or expectations…”  As shown by the definitions above, software quality is somewhat subjective.

Data integrity is also a critical concern for validated systems.  It is also a key imperative for software quality.  Data integrity is a hot topic lately and generally refers to the accuracy and consistency of information stored in corporate databases, data warehouses or other such constructs.  Data integrity ensures that information is accurate and reliable and in today’s environments, legally defensible.   The accuracy and trustworthiness of data within your systems MUST NOT be in question.

Why is data integrity so important?  Because companies routinely make decisions based on information housed within corporate databases.

The lack of data integrity over the lifecycle of a system could cause adulterated product to get to the market, incorrect shipping of controlled materials/substances, and a wide variety of  issues affecting the quality, safety and efficacy of a company’s products.  Data integrity is not the purview of technology alone.  To manage data integrity in the broadest sense requires people, processes and technology.

The ALCOA principle, as highlighted in the figure below, requires that data be attributable to the individual responsible for recording the data/activity.  The “L” in ALCOA means that information must be clear and legible after it is recorded, and permanent.  The “C” in ALCOA means that the data must be recorded at the time it was generated (contemporaneous).  The “O” means data must be preserved in an original, unaltered state.  The final “A” in ALCOA means that data must be accurate and reflect the action or observation made.  Modifications must be explained if they are not self-explanatory.

[Figure: The ALCOA principle]
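To show how these attributes can translate into a data structure, here is a minimal sketch of an append-only audit-trail record: each entry is attributable (user), contemporaneous (UTC timestamp captured at write time), preserves the original value unaltered, and carries a reason so that the modification is explained.  It illustrates the principle only; it is not a 21 CFR Part 11 implementation.

```python
# Minimal ALCOA-style audit trail sketch: append-only records that are
# attributable, contemporaneous, preserve the original value, and explain changes.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass(frozen=True)
class AuditRecord:
    user: str                 # Attributable: who made the change
    recorded_utc: str         # Contemporaneous: captured at the time of the change
    field_name: str
    original_value: str       # Original: prior value is preserved, never overwritten
    new_value: str            # Accurate: the value as actually entered
    reason: str               # Modifications must be explained

@dataclass
class AuditTrail:
    records: List[AuditRecord] = field(default_factory=list)

    def record_change(self, user, field_name, original_value, new_value, reason):
        """Append a new record; existing records are never edited or deleted (legible/permanent)."""
        self.records.append(AuditRecord(
            user=user,
            recorded_utc=datetime.now(timezone.utc).isoformat(),
            field_name=field_name,
            original_value=original_value,
            new_value=new_value,
            reason=reason,
        ))

trail = AuditTrail()
trail.record_change("jsmith", "batch_status", "In Process", "Released",
                    "QA review complete, all tests passed")
for rec in trail.records:
    print(rec)
```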

No matter what the definition, software quality is all about providing assurance that a system is suitable for its intended use in some way.  We confirm this through testing.  However, it should be noted that testing alone cannot in and of itself ensure software quality.  Testing merely provides a level of assurance or confidence in a software application under specific controlled conditions.

You cannot discuss software quality without a discussion on data integrity.  To derive the true meaning of software quality it is important to consider the following key activities:

  • Establish SOPs That Provide Governance For Software Quality Assurance and Data Integrity
  • Document Everything (if it’s not documented, it didn’t happen)
  • Establish a Rigorous Software Change Management Process
  • Attain Level 5 Validation Processes Through Automation
  • Enforce Standards For Testing and Documentation
  • Identify, Track and Manage Software Quality Metrics and KPIs
  • Conduct Positive and Negative Software Testing

The first step on your way to software quality and data integrity is to establish and follow procedures that provide governance over the process.  You must have procedures that cover everything from validation to data integrity, automation, and everything in between.  Secondly, you must document everything you do to ensure software quality and integrity.  Third, you must establish a rigorous software change management process that helps track and manage all changes made to a cloud-based or on-premise system and who made the changes and why.

Fourth, you must drive your organization to Level 5 validation processes.  This is derived from the validation capability maturity model, as illustrated in the figure below.

[Figure: Validation capability maturity model]

Level 5 validation means your processes are automated and optimized in a way to ensure quality and compliance.  Fifth, you must enforce all standards for testing and documentation.  This will also require Level 5 automation to achieve your objectives. Sixth, you must identify and track software quality metrics.  You cannot achieve what you don’t measure.  Peter Drucker often said “… you can’t manage what you can’t measure…”  He also said “… what gets measured gets improved…”  You must identify and track metrics to ensure you stay on track.

And finally, in all of your validation testing, conduct positive and negative testing against applications.  The FDA states in the General Principles of Software Validation; Final Guidance For Industry and FDA Staff issued on Jan 11, 2002, that “… A good test case has a high probability of exposing an error; A successful test is one that finds an error…”  This may be somewhat counter-intuitive but I am often stunned at how many validation test scripts are written so that they PASS rather than written to discover an error.  A good software test will reveal errors if written correctly.  When I interrogate applications, I often am looking to reveal problems that may arise during production.

It has been often said that software quality is no accident.  It is the deliberate result of intelligent planning, hard work and rigorous execution.

Software quality does NOT mean error-free or bug-free software.  It means software that sufficiently meets the demands and expectations of the end-user community.  AUTOMATION IS KEY.  Automated testing makes it easy to replicate tests, increases test coverage, reduces errors, improves consistency, and delivers automated traceability, enabling more software defects to be discovered and addressed.

The issues surrounding software quality and data integrity are increasing across the globe.  Your organization must be ready to deal with the challenges presented by these issues.  WILL YOUR ORGANIZATION BE READY?  Think about it.