Validating Microsoft Dynamics 365: What You Should Know

Microsoft Dynamics 365 and Azure are gaining popularity within the life sciences industry. I am often asked how to validate such a system given its complexity and cloud-based nature. This blog post answers that question and is organized as follows.

  • Understanding the Changing State of Independent Validation and Verification
  • Strategies for Installation Qualification of Microsoft Azure
  • Practical Strategies for Operational and Performance Qualification Testing
  • Continuous Testing in A Cloud Environment
  • Maintaining the Validated State and Azure

To begin our discussion, it is helpful to consider what keeps regulators up at night. They are concerned primarily about four key aspects of validated computer systems:

  1. Vulnerability – How Vulnerable Are Our Cloud Computing System Environments?
  2. Data Integrity – What Is Your Strategy to Maintain Data Integrity Within Your Validated Computer Systems?
  3. System Security and Cyber Security – How Do You Keep Sensitive Information Secure and How Do You Protect a Validated Computer System Against Cyber Threats?
  4. Quality Assurance – How Do You Minimize Risk to Patient Safety and Product Quality Impacted by The Validated System?

One of the first tasks validation engineers must be concerned with is supplier auditing. When using commercial off-the-shelf software such as Microsoft Dynamics 365 and Azure, a supplier audit is a mandatory part of the validation process. Twenty years ago, when we prepared validation documentation for computer systems, validation engineers often conducted a paper audit or an actual physical audit of a software provider. Supplier audits conducted on-site provided a rigorous overview of what software companies were doing and the quality of their software development lifecycle process. It was possible to examine a supplier’s processes and decide whether the software vendor was a quality supplier.

Most software vendors today, including Microsoft, do not allow on-site vendor audits. Some validation engineers have told me that they view this as a problem. However, the Microsoft Trust Center is a direct response to the industry’s need for transparency. Personally, I think the Trust Center is the best thing Microsoft has done for the life sciences industry. Not only does it highlight the Service Organization Control reports (SOC 1/SOC 2/SOC 3 and ISO/IEC 27001), but it also summarizes Microsoft’s compliance with the Cloud Security Alliance controls and the NIST Cybersecurity Framework. I strongly recommend that you visit the Microsoft Trust Center at https://www.microsoft.com/en-us/trustcenter. The latest information posted to the site covers the General Data Protection Regulation (GDPR) and how the platform can help keep data safe and secure. As a validation engineer, I find myself visiting this site often. You will see sections specifically for Dynamics 365 and Azure.

From a supplier auditing perspective, I use the information found on the Microsoft Trust Center to facilitate a “desk audit” of the vendor. Many of the questions that I would ask during an on-site audit are answered on this website. As part of my new validation strategy, I include the Service Organization Control reports as part of my audit due diligence. The Trust Center includes in-depth information about security, privacy, and compliance offerings, policies, features, and practices across all of Microsoft’s cloud products.

If I were conducting an on-site audit of Microsoft, I would want to know how they are establishing trust in the cloud. Many of the questions that I would ask in person I have found on this website. It should be noted that service organization control reports are created not by Microsoft but by trusted third-party organizations certified to deliver such a report. These reports include an assessment of how well Microsoft is complying with the stated controls for cloud management and security. This is extremely valuable information.

From a validation perspective, I attach these reports to my validation package as part of the supplier audit due diligence. There may be instances where you conduct your own due diligence beyond the reports, but they provide an excellent start to understanding what Microsoft is doing.

Microsoft has adopted the Cloud Security Alliance (CSA) Cloud Controls Matrix to establish the controls for the Microsoft Azure platform. These controls include:

  • Security Policy and Procedures
  • Physical and Environmental Security
  • Logical Security
  • System Monitoring and Maintenance
  • Data Backup, Recovery and Retention
  • Confidentiality
  • Software Development/Change Management
  • Incident Management
  • Service Level Agreements
  • Risk Assessments
  • Documentation/Asset Management
  • Training Management
  • Disaster Recovery
  • Vendor Management

The Cloud Controls Matrix includes 136 different controls that cloud vendors such as Microsoft must address. Microsoft has mapped out on its Trust Center site specifically how it addresses each of these controls in the Azure/Dynamics 365 platform. This is excellent knowledge and due diligence for validation engineers and represents a good starting point for documenting the quality of the supplier.
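
If you want to turn that mapping into documented due diligence of your own, a small sketch along the following lines can help. The control identifiers, responses, and evidence locations shown here are invented placeholders, not the actual CCM content or Microsoft’s published mapping.

```python
# Illustrative sketch: summarizing supplier coverage of cloud controls for a desk audit.
# The control IDs, responses, and evidence references below are made-up placeholders.
from dataclasses import dataclass

@dataclass
class ControlResponse:
    control_id: str      # a control identifier from your controls checklist
    addressed: bool      # does the supplier's published documentation address it?
    evidence: str        # where the evidence lives (SOC report, trust-center page, etc.)

responses = [
    ControlResponse("AIS-01", True,  "SOC 2 Type II report, section 4"),
    ControlResponse("BCR-03", True,  "Trust Center business-continuity page"),
    ControlResponse("IAM-12", False, "No public evidence located; follow up with supplier"),
]

addressed = [r for r in responses if r.addressed]
gaps      = [r for r in responses if not r.addressed]

print(f"Controls reviewed: {len(responses)}")
print(f"Addressed: {len(addressed)}  Gaps: {len(gaps)}")
for gap in gaps:
    print(f"  Follow-up required: {gap.control_id} -> {gap.evidence}")
```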

Is Microsoft Azure/Dynamics 365 secure enough for life sciences? In my personal opinion, yes, it is. Companies still must conduct due diligence to ensure that Azure and Dynamics 365 meet their business continuity requirements and business process requirements for enterprise resource planning. One thing is certain: the cloud changes things. You must revise your validation strategy to accommodate the cloud. Changes in how supplier audits are conducted are just one example.

The next challenge in validating Microsoft Dynamics 365 and Azure is conducting validation testing in the cloud environment. It should be understood that the principles of validation still endure whether or not you are in a cloud environment. You still must conduct a rigorous amount of testing, including both positive and negative testing, to confirm that Microsoft Azure/Dynamics 365 meets its intended use. However, there are changes in the way we conduct installation qualification in the cloud environment. Some believe that installation qualification is no longer a valid testing process since cloud environments are provisioned. This is not correct. You still must ensure that cloud environments are provisioned in a consistent, repeatable manner that supports quality.

It is helpful to understand that Microsoft Dynamics 365 is provisioned using Microsoft Lifecycle Services. The Lifecycle Services application is designed for rapid implementation and deployment. However, it should be clearly understood that Lifecycle Services is itself an application and therefore a potential point of failure in the process. The use of Lifecycle Services must be documented, and the provisioning of the environment must be confirmed through the installation qualification process.
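
To make the installation qualification idea concrete, here is a minimal sketch of the underlying check: compare what was actually provisioned against a documented, pre-approved baseline and record any deviation. The settings and values are invented examples; a real IQ would take them from your approved configuration specification and the actual environment record (for example, a deployment summary exported from Lifecycle Services).

```python
# Hypothetical IQ sketch: verify a provisioned environment against an approved baseline.
# The keys and values are examples only, not an actual Lifecycle Services export.

approved_baseline = {
    "topology": "Standard acceptance test",
    "application_version": "10.0.x",   # placeholder version string
    "database_collation": "SQL_Latin1_General_CP1_CI_AS",
    "region": "East US",
}

provisioned_environment = {
    "topology": "Standard acceptance test",
    "application_version": "10.0.x",
    "database_collation": "SQL_Latin1_General_CP1_CI_AS",
    "region": "West Europe",            # deliberately deviates for illustration
}

deviations = {
    key: (expected, provisioned_environment.get(key))
    for key, expected in approved_baseline.items()
    if provisioned_environment.get(key) != expected
}

if deviations:
    print("IQ FAILED - deviations from the approved baseline:")
    for key, (expected, actual) in deviations.items():
        print(f"  {key}: expected '{expected}', found '{actual}'")
else:
    print("IQ PASSED - provisioned environment matches the approved baseline.")
```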

From an operational qualification perspective, validation testing remains pretty much the same. Test cases are traced to their respective user requirements and executed with the same rigor as in previous validation exercises.

Performance qualification is also conducted in the same manner as before. Since the environment is in the cloud and outside of your direct control, it is very important that network qualification as well as performance qualification be conducted to ensure that there are no performance anomalies in your environment. In a cloud environment you may have performance issues related to system resources, networks, storage arrays, and many other factors. Performance tools may be used to confirm that the system is performing within an optimal range as established by the validation team. Performance qualification can be conducted either before the system goes live by placing a live load on the system, or it may occur after validation; this is at the discretion of the validation engineer.
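
As one illustration of what “performing within an optimal range” can mean in practice, the sketch below compares measured response times against limits set by the validation team. The transaction names, measurements, and thresholds are invented for the example.

```python
# Illustrative PQ check: compare measured response times against validation-team limits.
# Transaction names, measurements, and thresholds are invented for the example.

acceptance_limits_seconds = {
    "sales_order_entry": 2.0,
    "posting_batch": 30.0,
    "report_generation": 10.0,
}

measured_seconds = {
    "sales_order_entry": 1.4,
    "posting_batch": 41.7,   # exceeds the limit, flagged below
    "report_generation": 6.2,
}

for transaction, limit in acceptance_limits_seconds.items():
    actual = measured_seconds[transaction]
    status = "PASS" if actual <= limit else "FAIL"
    print(f"{transaction}: {actual:.1f}s (limit {limit:.1f}s) -> {status}")
```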

Maintaining the validated state within a cloud environment requires embracing the principle of continuous testing. It has often been said that the cloud is perpetually changing. This is one of the reasons why many people believe that you cannot validate a system in the cloud. However, you can validate cloud-based systems such as Microsoft Dynamics 365 and Azure; continuous testing is the key. What do I mean by continuous testing? Does it mean that we perpetually test the system every single day, forever? Of course not! Continuous testing is a strategy applied to cloud-based validated systems whereby regression testing occurs at predetermined intervals. Automated systems such as ValidationMaster™ can be the key to facilitating this new strategy.

Within ValidationMaster™ you can establish a reusable test script library. This is important because in manual, paper-based validation processes, the most laborious part of validation is the development and execution of test scripts. This is why many people cringe at the very notion of continuous testing. However, automated systems make this much easier. In ValidationMaster™ each test script is automatically traced to a user requirement. Thus, during regression testing, I can select a set of test scripts to be executed based on my impact analysis, and during off-peak hours in my test environment I can execute those scripts to see whether there has been any impact to my validated system. These test scripts can be run fully automated using Java-based test scripts, or they can be run using an online validation testing process. Revalidation of the system can happen in a matter of hours rather than days or weeks.
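
The workflow just described can be pictured with a short sketch. Nothing below uses actual ValidationMaster™ interfaces; the classes and the off-peak window are hypothetical stand-ins for “select the impacted scripts and run them off-peak.”

```python
# Hypothetical sketch of impact-based regression selection and an off-peak run window.
# The classes, identifiers, and schedule are illustrative; they are not a product API.
from dataclasses import dataclass
from datetime import datetime, time

@dataclass
class TestScript:
    name: str
    requirement_ids: set   # requirements this script is traced to

library = [
    TestScript("TS-001 Create sales order", {"URS-010"}),
    TestScript("TS-002 Approve purchase requisition", {"URS-022"}),
    TestScript("TS-003 Post general journal", {"URS-031", "URS-032"}),
]

# Output of the impact analysis: requirements touched by the latest cloud update.
impacted_requirements = {"URS-010", "URS-032"}

regression_set = [s for s in library if s.requirement_ids & impacted_requirements]

def in_off_peak_window(now: datetime, start=time(22, 0), end=time(5, 0)) -> bool:
    """Off-peak is assumed here to be 22:00-05:00; a real window comes from your plan."""
    return now.time() >= start or now.time() <= end

print("Scripts selected for regression:", [s.name for s in regression_set])
print("Run now?", in_off_peak_window(datetime.now()))
```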

Through continuous testing, you can review the changes that Microsoft has made to both Azure and Dynamics 365 online. Yes, this information is posted online for your review, which answers the question of how you know what changes are made and when they are made. You can determine how often you test a validated system; there is no regulation that codifies how often this should occur. It is entirely up to you. However, as a good validation engineer you know that it should be based on risk: the riskier the system, the more often you should test; the less risky the system, the less due diligence is required. Nevertheless, cloud-based systems should be subject to continuous testing to ensure compliance and maintain the validated state.
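
Since no regulation fixes the retest interval, one way to make the risk-based decision explicit is to record it as a simple policy. The intervals below are purely illustrative; each organization sets its own based on its documented risk assessment.

```python
# Example only: a retest-interval policy keyed on the system's documented risk rating.
# The intervals are illustrative and would be defined in your own validation plan.

retest_interval_days = {
    "high": 30,     # e.g., systems with direct product-quality or patient-safety impact
    "medium": 90,
    "low": 180,
}

system_risk_rating = "high"   # taken from the system risk assessment
print(f"Regression testing at least every {retest_interval_days[system_risk_rating]} days")
```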

There are many other aspects of supporting validation testing in the cloud, but suffice it to say Microsoft Dynamics and Azure can be validated and have been successfully validated for many clients. Microsoft has done a tremendous service to the life sciences industry by transparently providing information through the Microsoft Trust Center. As a validation engineer I find this one of the most innovative things they have done! It provides much of the information I need, along with confirmation through third-party entities that Microsoft is complying with the Cloud Security Alliance controls and the NIST Cybersecurity Framework. I would encourage each of you to review the information on the Trust Center. Of course, there will always be skeptics of anything Microsoft does, but let’s give credit where credit is due.

The Microsoft Trust Center is a good thing. The company has done an excellent job of opening up and sharing how it views system security, quality, and compliance. This information was never made available before, and it is available now. I have validated many Microsoft Dynamics AX systems as well as Dynamics 365 systems. The software quality within these systems is good, and with the information that Microsoft has provided, you can have confidence that deploying a system in the Azure environment is not only a good business decision if you have selected this technology for your enterprise, but also a sound decision regarding quality and compliance.


Automating Validation Testing: It’s Easier Than You Think

Automated validation testing has been elusive for many in the validation community.  There have been many “point solutions” on the market that address the creation, management, and execution of validation testing.  However, what most validation engineers want is TRULY AUTOMATED validation testing that will interrogate an application in a rigorous manner and report results in a way that not only provides objective evidence of pass/fail criteria but also highlights each point of failure.

In the 1980’s when I was conducting validation exercises for mini- and mainframe computers and drafting test scripts in a very manual way, I often envisioned a time when I would be able to conduct validation testing in a more automated way.  Most validation engineers work in an environment where they are asked to do more with less.  Thus, the need for such a tool is profound.

Cloud computing environments, mobility, cybersecurity, and data integrity imperatives make it essential that we more thoroughly test applications today.  Yet the burden of manual testing persists.  If I could share with you 5 key features of an automated testing system it would include the following:

  • Automated test script procedure capture and development
  • Automated Requirements Traceability
  • Fully Automated Validation Test Script Execution
  • Automated Incident Capture and Management
  • Ability to Support Continuous Testing in the Cloud

In most validation exercises I have participated in, validation testing was the most laborious part of the exercise.  Automated testing is easier than you think.

For example, ValidationMaster™ includes an automated test engine that captures each step of your qualification procedure and annotates the step with details of the action performed.

Test cases can be routed for review and pre-approval within the system quickly and easily through DocuSign.  Test case execution can be conducted online, and a dynamic dashboard reports how many test scripts have passed, how many have failed, and which ones passed with exception.  Once test scripts have been executed, they may be routed for post-approval and signed.

Within the ValidationMaster™ system, you can create a reusable test script library to support future regression testing efforts.  The system allows users to link requirements to test cases thus facilitating the easy generation of forward and reverse trace matrices.  Exporting documents in your UNIQUE format is a snap within the system.  Thus, you can consistently comply with your internal document procedures.

ValidationMaster™ supports fully automated validation testing, allowing users to set a date/time for testing.  Test scripts are run AUTOMATICALLY without human intervention, allowing multiple runs of the same scripts if necessary.

Continuous testing in a cloud environment is ESSENTIAL.  You must have the ability to respond to rapid changes in a cloud environment that may impact the validated state of the system.  Continuous testing reduces risk and ensures sustained compliance in a cloud environment.

The system automatically raises an incident report if a bug is encountered during automated testing.  The system keeps track of each test run and its results through automation.  ValidationMaster™ includes a dynamic dashboard that shows the pass/fail status of each test script as well as requirements coverage, open risks, incident trend analysis, and much more.
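
The following sketch illustrates the general idea of tying an incident to the exact failing step; it is not ValidationMaster™ code, and the script name, step number, and failing check are invented.

```python
# Hypothetical illustration of raising an incident tied to the exact failing test step.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Incident:
    script: str
    step_number: int
    description: str
    raised_at: datetime = field(default_factory=datetime.utcnow)

def execute_step(script: str, step_number: int, action, incidents: list) -> str:
    """Run one test step; on an assertion failure, record an incident tied to that step."""
    try:
        action()
        return "PASS"
    except AssertionError as err:
        incidents.append(Incident(script, step_number, str(err)))
        return "FAIL"

def sample_failing_step():
    # Simulated verification that fails, so the incident capture can be shown.
    assert 100.00 == 99.95, "Posted total does not match the expected value"

incidents = []
result = execute_step("TS-003 Post general journal", 4, sample_failing_step, incidents)
print(result)
for incident in incidents:
    print(f"{incident.script}, step {incident.step_number}: {incident.description}")
```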

The time is now for automated validation testing.  The good news is that there are enterprise level applications on the market that facilitate the full validation lifecycle process.  Why are you still generating manual test scripts?  Automated testing is easier than you think!

Saving Time and Money Through Lean Validation

The principles and best practices of lean manufacturing have served life sciences manufacturers well.  Lean is all about optimizing processes, while eliminating waste (“Muda”) and driving greater efficiencies.  As a 30-year validation practitioner, I have validated many computer systems, equipment and processes.  One of the key lessons learned is that there is much room for improvement across the validation process.

OnShore Technology Group is a pioneer in lean validation and has developed principles and best practices to support lean validation processes.  To power our lean processes, we leverage ValidationMaster™, an Enterprise Validation Management system exclusively designed to facilitate lean validation and automate the validation process.

So what is lean validation and how is it practically used?

Lean validation is the process of eliminating waste and inefficiencies while driving greater software quality across the validation process through automation.

Lean Validation

Lean validation cannot be achieved without automation. Lean validation processes leverage advanced technology designed to fulfill the technical, business and compliance requirements for software validation eliminating the use of manual, paper-based processes. Optimized validation processes are deployed using the latest global best practices to ensure the right amount of validation rigor based on critical quality attributes, risks, Cybersecurity and other factors.

The lean validation process begins with a lean validation project charter, which defines the description of the process and its key performance indicators (KPIs), such as “reduce validation costs by $X per year” or “reduce software errors by X%”.  The charter shall define the project scope, dependencies, project metrics, and resources for the project.
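
The charter itself is a document, but its measurable portion can be captured in a structured form so the KPIs are unambiguous. The fields and targets below are examples only, not a prescribed template.

```python
# Example structure for the measurable portion of a lean validation project charter.
# Field names and targets are illustrative only.

charter = {
    "scope": "Validation of the ERP order-to-cash process",
    "kpis": [
        {"name": "Reduce validation cost", "target": "USD 100,000 per year"},
        {"name": "Reduce software defects escaping to production", "target": "25% reduction"},
        {"name": "Cut test script development time", "target": "50% reduction"},
    ],
    "dependencies": ["Test environment availability", "Approved user requirements"],
    "resources": ["Validation lead", "Business process owners", "QA reviewer"],
}

for kpi in charter["kpis"]:
    print(f"KPI: {kpi['name']} -> target: {kpi['target']}")
```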

There are five principles of lean validation derived from lean manufacturing principles.  Lean validation is powered through people, processes, and technology.  Automation drives lean validation processes.  The LEAN VALIDATION value principles are illustrated in the figure below.

[Figure: The five LEAN VALIDATION value principles]

Principle 1 – VALUE – Lean thinking in manufacturing begins with a detailed understanding of what value the customer assigns to products and services.  Lean thinking from an independent validation and verification perspective begins with a detailed understanding of the goals and objectives of the validation process and its adherence to compliance objectives.  The principle of VALUE therefore requires the validation team to focus on the elimination of waste to deliver value to the end customer (your organization) in the most cost-effective manner.  The computer systems validation process is designed to assure that software applications meet their intended use.  The value derived from the validation process is greater software quality and an enhanced ability to identify software defects, as a result of greater focus and the elimination of inefficient and wasteful processes. AUTOMATION IS THE FOUNDATION THAT FACILITATES THE ACHIEVEMENT OF THIS VALUE PRINCIPLE.

Principle 2 – VALUE STREAM – The value stream, from a lean perspective, is the comprehensive product lifecycle from raw materials through the customer’s end use and ultimate disposal of the product.  To effectively eliminate waste, the ultimate goal of lean validation, there must be an accurate and complete understanding of the value stream.  Validation processes must be examined end-to-end to determine what value is added to the objective of establishing software quality and compliance.  Any process that does not add value to the validation process should be eliminated.  We recommend value stream mapping for the validation process to understand where value is added and where non-value-added processes can be eliminated.  Typical “Muda” (wastes) commonly revealed by validation process mapping are:

  • Wasteful Legacy Processes (“we have always done it this way”)
  • Processes That Provide No Value To Software Quality At All
  • Manual Process Bottlenecks That Stifle Processes

Principle 3 – FLOW – The lean manufacturing principle of flow is about creating a value chain with no interruption in the production process, a state where each activity is fully in step with every other.  A comprehensive assessment and understanding of flow throughout the validation process is essential to the elimination of waste.  From a validation perspective, optimal flow is created when, for example, users have the ability to automatically create requirements from test scripts for automated traceability, thereby eliminating the process of manually tracing each test script to a requirement.  Another example is when a user can navigate through a software application and the test script is automatically generated.  Once generated, it is automatically published to a document portal where it is routed electronically for review and approval.  All of this requires AUTOMATION to achieve the principle of FLOW.  For process optimization and quality control throughout the validation lifecycle, information should flow through the validation process efficiently, minimizing process and document bottlenecks while maintaining traceability throughout.
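
To make the traceability part of FLOW concrete, here is a small, product-agnostic sketch of building forward and reverse trace matrices from recorded requirement-to-script links; the identifiers are invented.

```python
# Minimal sketch: building forward and reverse trace matrices from recorded links.
# Requirement and script identifiers are invented for illustration.

links = [
    ("URS-010", "TS-001"),
    ("URS-022", "TS-002"),
    ("URS-031", "TS-003"),
    ("URS-032", "TS-003"),
]

forward = {}   # requirement -> scripts that verify it
reverse = {}   # script -> requirements it verifies
for requirement, script in links:
    forward.setdefault(requirement, []).append(script)
    reverse.setdefault(script, []).append(requirement)

all_requirements = ["URS-010", "URS-022", "URS-031", "URS-032", "URS-040"]
untested = [r for r in all_requirements if r not in forward]

print("Forward trace:", forward)
print("Reverse trace:", reverse)
print("Requirements without coverage:", untested)
```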

Principle 4 – PULL – A pull system in lean manufacturing is used to reduce waste in the production process. Components used in manufacturing are replaced only once they have been consumed, so companies make only enough product to meet customer demand.  There is much waste in the validation process.  The PULL strategy for validation may be used to reduce wastes such as duplication of effort, and to streamline test case development and execution, electronic signature routing/approval, and many other activities.  Check out our blog, “The Validation Post,” for more information.

Principle 5 – PERFECTION – Validation processes are in constant pursuit of continuous improvement.  Automation is KEY.  Lean validation engineers and quality professionals relentlessly drive for perfection. Step by step, validation engineers must identify root causes of software issues, anomalies, and quality problems that affect the suitability of a system for production use. As computing environments evolve and become more complex and integrated, validation engineers must seek new, innovative ways to verify software quality and compliance in today’s advanced systems. Perfection cannot easily be achieved through manual processes.

AUTOMATION IS REQUIRED TO TRULY REALIZE THE VISION OF THIS PRINCIPLE.

You can save time and money through lean.  Consider the first two principles, Value and Value Stream.  Many validation engineers consider validation only a regulatory requirement or “necessary evil” rather than a regulatory best practice designed to save time and money.  I think we can all agree that, in the long run, quality software saves time and money through regulatory cost avoidance and more efficient quality processes.

It is important for validation engineers to consider not only quality and compliance but also the cost savings that may be gained throughout the validation process.  Lean validation is a strategy whose time has come.  How much can you save through lean?  End users report saving about 40 – 60% on test script development and 60 – 70% on regression testing efforts.  Regulatory cost avoidance can be significant, depending on each company’s level of compliance.  Cost savings may also be realized by minimizing the amount of paper generated through the validation process.
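
As a rough worked example of what those percentages can mean, assume a baseline of 400 hours of test script development and 200 hours of regression testing per cycle (assumed figures, not reported ones):

```python
# Rough worked example of the savings ranges quoted above; baseline hours are assumptions.

baseline_script_development_hours = 400
baseline_regression_hours = 200

dev_savings_low, dev_savings_high = 0.40, 0.60   # 40-60% on script development
reg_savings_low, reg_savings_high = 0.60, 0.70   # 60-70% on regression testing

low = (baseline_script_development_hours * dev_savings_low
       + baseline_regression_hours * reg_savings_low)
high = (baseline_script_development_hours * dev_savings_high
        + baseline_regression_hours * reg_savings_high)

print(f"Estimated hours saved per cycle: {low:.0f} to {high:.0f}")
```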

Embrace LEAN VALIDATION and experience the benefits of saving time and money.


Why Are You Still Generating Validation Test Scripts Manually?

Drafting validation scripts is one of the key activities in a validation exercise, designed to provide documented evidence that a system performs according to its intended use.  The FDA and other global agencies require objective evidence, usually in the form of screenshots that sequentially capture the target software process, to provide assurance that systems can consistently and repeatedly perform the various processes representing the intended use of the system.

Since the advent of the PC, validation engineers have been writing validation test scripts manually.  The manual process of computer systems validation test script development involves capturing screenshots and pasting them into Microsoft Word test script templates.  To generate screen captures, some companies use tools such as Microsoft Print Screen, TechSmith Snagit, and others.  A chief complaint of many validation engineers is that the test script development process is a slow, arduous one.  Some validation engineers are very reluctant to update/re-validate systems due to this manual process.  So, the question posed by this blog article is simply this: “Why are you still generating test scripts manually???”

I have been conducting validation exercises for engineering and life sciences systems since the early 1980s.  I too have experienced first-hand the pain of writing test scripts manually.  We developed and practice “lean validation,” so I sought ways to eliminate manual, wasteful validation processes.  One of the most wasteful is the manual capture, cutting, and pasting of screenshots into a Microsoft Word document.

The obvious follow up question is “how do we capture validation tests in a more automated manner to eliminate waste and create test scripts that are complete, accurate and provide the level of due diligence required for validation?”

In response to this common problem, we developed an Enterprise Validation Management system called ValidationMaster™.  This system includes TestMaster™, an automated testing system that allows validation engineers to capture and execute validation test scripts in a cost-effective manner.

TestMaster™ is designed to validate ANY enterprise or desktop application.  It is a browser-based system that allows test engineers to open any application on their desktop, launch TestMaster™, and capture their test scripts while sequentially executing the various commands in their applications.  As the validation engineer navigates through the application, TestMaster™ captures each screenshot and text entry made in the application.

Once the test script is saved, TestMaster™ allows the script to be published in your UNIQUE test script template with the push of a button.  No more cutting and pasting screenshots!  You can generate your test scripts in MINUTES as opposed to the hours it sometimes takes to compile documents from a series of screenshots.  If you are one of those validation engineers who does not like screenshots in your scripts, you can create text-based procedures both quickly and easily using TestMaster™.
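
For illustration only, here is a sketch of the general idea of turning recorded steps into a formatted script document; it does not show TestMaster™’s actual capture or template mechanism, and the steps and file names are invented.

```python
# Hypothetical sketch of publishing captured test steps into a simple script document.
# The captured steps and screenshot file names are invented for the example.

captured_steps = [
    {"step": 1, "action": "Open the Sales order list page", "screenshot": "step01.png"},
    {"step": 2, "action": "Click New and enter the customer account", "screenshot": "step02.png"},
    {"step": 3, "action": "Add an order line and save", "screenshot": "step03.png"},
]

def publish(steps, title="TS-001 Create sales order"):
    """Render captured steps into a plain-text test script layout."""
    lines = [f"TEST SCRIPT: {title}", "-" * 40]
    for s in steps:
        lines.append(f"Step {s['step']}: {s['action']}  [evidence: {s['screenshot']}]")
        lines.append("Expected result: ____________________  Actual result: ____________________")
    return "\n".join(lines)

print(publish(captured_steps))
```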

So, what are the biggest benefits of using TestMaster™ versus manual processes?  There are four key benefits, summarized as follows:

  1.  Automated Test Script Execution – for years, validation engineers have wanted a more automated approach to the execution of validation test scripts.  ValidationMaster™ supports both hands-on and hands-off validation testing.  Hands-on validation testing is the process whereby a validation engineer looks at each step of a validation test script and executes the script step by step by clicking through the process.  Hands-off validation allows a validation engineer to execute a test script with no human intervention.  This type of regression testing (hands-off) is very useful for cloud-based systems or systems that require more frequent testing.  The validation engineer simply selects a test script and defines a date/time for its execution.  At the designated time, with no human intervention, the system executes the test script and reports the test results back to the system.  Automated testing is here!  Why are you still doing this manually?

  2.  Traceability – TestMaster™ allows validation engineers to link each test script to a requirement or set of requirements; thus the system delivers automatic traceability, which is a regulatory requirement.  With the click of a button, TestMaster™ allows validation engineers to create a test script directly from a requirement.  This is a powerful capability that allows you to see requirements coverage through our validation dashboard on demand.  This validation dashboard is viewable on a desktop or mobile device (Windows, Apple, Android).

  3.  Online Test Script Execution – One of the biggest problems with manual test scripts is that they must be printed and manually routed for review and approval.  Some companies that have implemented document management systems may have the ability to route the scripts electronically for review and approval.  The worst-case scenario is the company that has no electronic document management system and generates these documents manually.  TestMaster™ allows validation engineers to execute test scripts online and capture test script results in an easy manner.  The test script results can be captured in an automated way and published into executed test script templates quickly and easily.  If incidents (bugs/anomalies) occur during testing, users have the ability to automatically capture an incident report tied to the exact step where the anomaly/bug occurred.  ValidationMaster™ is tightly integrated with a 21 CFR Part 11-compliant portal (ValidationMaster Portal™): once the test script is executed, it is automatically published to the ValidationMaster™ Portal where it is routed for review/approval.  The ability to draft, route, review, approve, execute, and post-approve validation test scripts is an important time- and cost-saving feature that should be a part of any 21st century validation program.

  4.  Reuse Test Scripts for Regression Testing – manual test scripts are not readily reusable; the Word documents must be edited or even rewritten for validation regression testing.  Validation is not a one-time process.  Regression testing is a fact of life for validation engineers.  The question is, will you rewrite all of your test scripts or use automated tools to streamline the process?  ValidationMaster™ allows validation engineers to create a reusable test script library.  This library includes all the test scripts that make up your validation test script package.  During re-validation exercises, you have the ability to reuse the same test scripts for regression testing.

Given the rapid adoption of cloud, mobile, and enterprise technologies in life sciences, a new approach to validation is required.  Yes, you can still conduct validation exercises on paper, but why would you?  In the early days of enterprise technology, we did not have tools available to facilitate the rapid development of validation test scripts.  Today, that is not the case.  Systems like ValidationMaster™ are available in either a hosted or on-premise environment.  These game-changing systems are revolutionizing the way validation is conducted and offer time- and cost-saving features that make this process easier.  So why are you still generating test scripts manually?

Accelerating Validation in the 21st Century

Each day in companies across the globe, employees are being asked to do more with less.  The mantra of the business community in the 21st century is “accelerating business”.  You see it in marketing and all types of corporate communication.  In the validation world, accelerating system implementation, validation, and deployment is the cry of every chief information officer and senior executive.

Accelerating validation activity means different things to different people.  I define it as driving validation processes efficiently and effectively without sacrificing quality or compliance.  Most manual processes are paper-based, inefficient, and cumbersome.  Creating paper-based test scripts requires a lot of cutting/pasting of screenshots and other objective evidence to meet regulatory demands.  Getting validation information into templates with proper headers and footers in compliance with corporate documentation guidelines is paramount.

The question for today is “HOW DO WE ACCELERATE VALIDATION WITHOUT SACRIFICING QUALITY AND COMPLIANCE?”

Earlier in my career, I developed Validation Toolkits for Documentum and Qumas.  A Validation Toolkit was an early version of a validation accelerator.  Many companies now have them.  These toolkits come with pre-defined document templates and test scripts designed to streamline the validation process.  They are called “toolkits” for a reason: they are not intended to be completed validation projects.  They are intended to be a starting point for validation exercises.

The FDA states in the General Principles of Software Validation; Final Guidance For Industry and FDA Staff issued on January 11, 2002, “…If the vendor can provide information about their system requirements, software requirements, validation process, and the results of their validation, the medical device manufacturer (or any company) can use that information as a beginning point for their required validation documentation…”

Notice that the FDA says to use such information “as a beginning point” for validation – NOT TO USE IT EXCLUSIVELY AS THE COMPLETE VALIDATION PACKAGE.  This is because the FDA expects each company to conduct its own due diligence for validation.  Also note that the FDA allows the use of such documentation if it is available from a vendor.

Beware that some vendors believe their “toolkits” can substitute for a rigorous review of their software applications.  They cannot.  In addition to leveraging the software vendor’s information, you should ALWAYS conduct your own validation due diligence to ensure that applications are thoroughly examined, without bias, by your team or a designated validation professional.

It is an excellent idea to leverage vendor information if provided.  There is a learning curve with most enterprise applications and the use of vendor-supplied information can help streamline the process and add value.  The old validation toolkits were delivered with a set of document templates and pre-defined test scripts often resulting in hundreds of documents depending on the application.  In most cases, companies would edit the documents or put them in their pre-defined validation templates for consistency.  This resulted in A LOT OF EDITING!  In some cases, depending on the number of documents, time-savings could be eroded by just editing the existing documents.

THERE IS A BETTER WAY!  

What if YOUR unique validation templates were pre-loaded into an automated Enterprise Validation Lifecycle Management system?   What if the Commercial-off-the-Shelf (COTS) product requirements were pre-loaded into this system?  Further, what if a full set of OQ/PQ test scripts were traced to each of the pre-loaded requirements?  What if you were able to export all of these documents from day one in your unique validation templates?  What if you only had to add test scripts that represented the CHANGES you were making to a COTS application versus writing all of your test scripts from scratch?  Finally, what if you could execute your test scripts online and automatically generate a requirements trace matrix and validation summary report with PASS/FAIL status?

This is the vision of the CloudMaster 365™ Validation Accelerator.  CloudMaster 365™ is the next generation of the validation toolkit concept.  The ValidationMaster™ Enterprise Validation Management system is the core of the CloudMaster 365™ Validation Accelerator.  The application can be either hosted or on-premise.  This is a great way to jump-start your Microsoft Dynamics 365® validation project.


The CloudMaster 365™ Validation Accelerator includes three key components:

  • ValidationMaster™ Enterprise Validation Management System
  • Comprehensive set of user requirements for out-of-the-box features
  • Full set of IQ/OQ/PQ test scripts and validation document templates

The Validation Accelerator is not intended to be “validation out of the box”.  It delivers the foundation for your current and future validation projects, delivering “Level 5” validation processes.  If you consider the validation capability maturity model, most validation processes at Level 1 are ad hoc, chaotic, undefined, and reactive to project impulses and external events.

[Figure: Validation capability maturity model]

Level 5 validation is an optimized, lean validation process facilitated with automation.  The test process is optimized, quality control is assured, and there are specific measures for defect prevention and management.  Level 5 cannot be achieved without automation.  The automation is powered by ValidationMaster™.   With the CloudMaster 365™ strategy, instead of having “toolkit” documents suitable for only one project, you have a system that manages your COTS or bespoke development and the full lifecycle of validation over time, as is required by regulators.  This is revolutionary in terms of what you get.

CloudMaster 365™ TRULY ACCELERATES VALIDATION.  It can be used not only for Microsoft Dynamics 365®, but is also available for other applications, including:

  • CloudMaster 365™ Validation Accelerator For Dynamics 365®
  • CloudMaster 365™ Validation Accelerator For Merit MAXLife®
  • CloudMaster 365™ Validation Accelerator For Yaveon ProBatch®
  • CloudMaster 365™ Validation Accelerator For Oracle e-Business® or Oracle Fusion®
  • CloudMaster 365™ Validation Accelerator For SAP®
  • CloudMaster 365™ Validation Accelerator For BatchMaster®
  • CloudMaster 365™ Validation Accelerator For Edgewater EDGE®
  • CloudMaster 365™ Validation Accelerator For VeevaVault®
  • CloudMaster 365™ Validation Accelerator For Microsoft SharePoint®

Whether you are validating an enterprise application or seeking to drive your company to higher levels of efficiency, you should consider how best to accelerate your validation processes not solution by solution but in a more holistic way that ensures sustained compliance across all applications.  If you are embarking on the validation of Microsoft Dynamics 365®, or any of the applications listed above, the CloudMaster 365™ Validation Accelerator is a no-brainer.  It is the most cost-effective way to streamline validation and ensure sustained compliance over time.

Cloud Validation Strategies

If you had asked me 10 years ago how many of my life sciences clients were deploying systems in the cloud, I would have said perhaps one or two. If you ask me today how many of my clients are deploying cloud technologies, I would say almost all of them, in one way or another.  The adoption of cloud technologies within life sciences companies is expanding at a rapid pace.

From a validation perspective, this trend has profound consequences.  Here are some key concerns and questions to be answered for any cloud deployment.

  1. How do you validate systems in a cloud environment?
  2. What types of governance do you need to deploy applications in a cloud environment?
  3. How do you manage change in a cloud environment?
  4. How do you maintain the validated state in the cloud?
  5. How can you ensure data integrity in the cloud?
  6. How do you manage cybersecurity in a cloud environment?

The answers to these questions are obvious and routine for validation engineers managing systems in an on-premise environment, where the environment is controlled by the internal IT team.  They have control over changes, patches, system updates, and other factors that may impact the overall validated state.  In a cloud environment, the software, platform, and infrastructure are delivered as a SERVICE.  By leveraging the cloud, life sciences companies are effectively outsourcing the management and operation of a portion of their IT infrastructure to the cloud provider.  However, compliance oversight and responsibility for your validated system cannot be delegated to the cloud provider.  Therefore, these services must have a level of control sufficient to support a validated systems environment.

For years, life sciences companies have been accustomed to governing their own systems environments.  They control how often systems are updated, when patches are applied, when system resources will be updated, etc.  In a cloud environment, control is in the hands of the cloud service provider.  Therefore, who you choose as your cloud provider matters.

So what should your strategy be to manage cloud-based systems?

  • Choose Your Cloud Provider Wisely – All cloud providers are not created equal.  The Cloud Security Alliance (https://cloudsecurityalliance.org/) is an excellent starting point for understanding cloud controls.  The Cloud Controls Matrix (CCM) is an Excel spreadsheet that allows you to assess a vendor’s readiness for the cloud.  You can download it free of charge from the CSA.
  • Establish Governance For The Cloud – You must have an SOP for the management and deployment of the cloud and ensure that this process is closely followed.  You also need an SOP for cyber security to provide a process for protecting validated systems against cyber threats.
  • Leverage Cloud Supplier Audit Reports For Validation – All cloud providers must adhere to standards for their environments.  Typically, they gain 3rd party certification and submit to Service Organization Control (SOC) independent audits.  It is recommended that you capture the SOC 1/2/3 and SSAE 16 reports.  You also want to understand any certifications that your cloud provider has.  I would archive their certifications and SOC reports with the validation package as part of my due diligence for the supplier audit.
  • Embrace Lean Validation Principles and Best Practices – eliminating waste and improving efficiency is essential in any validated systems environment.  Lean validation is derived from the principles of lean manufacturing.  Automation is a MUST.  You need to embrace lean principles for greater efficiency and compliance.
  • Automate Your Validation Processes – Automation and Lean validation go hand in hand.  The testing process is the most laborious process.  We recommend using a system like ValidationMaster™ to automate requirements management, test management and execution, incident management, risk management, validation quality management, agile validation project management, and validation content management. ValidationMaster™ is designed to power lean validation processes and includes built-in best practices to support this process.
  • Use a Risk-Based Approach To Validation – all validation exercises are not created equal.  The level of validation due diligence required for your project should be based on risk – regulatory, technical and business risks.  Conduct a risk assessment for all cloud-based systems.
  • Adopt Continuous Testing Best Practices – the cloud is continuously changing, which in and of itself seems counter-intuitive to the validation process.  Continuous testing can be onerous if your testing process is MANUAL.  However, if you adopt lean, automated testing processes, regression testing is easy.  You can establish a routine schedule for testing, and if your cloud provider delivers a dashboard that tells you when patches/updates/features have been applied and the nature of them, you can select your regression testing plan based on a risk and impact assessment, as sketched below.
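
As a sketch of that last point, the snippet below maps an invented provider update list to the functional areas covered by a test library and derives a regression plan; the update records, component names, and mapping are all assumptions for illustration, not any provider's actual feed format.

```python
# Illustrative sketch: turning a provider's published update list into a regression plan.
# The update records and the mapping to functional areas are invented for the example.

updates = [
    {"component": "Finance - General ledger", "change": "Posting performance fix", "date": "2024-03-10"},
    {"component": "Platform", "change": "Security patch", "date": "2024-03-10"},
]

# Mapping from provider components to the functional areas covered by our test library.
area_map = {
    "Finance - General ledger": "journal_posting",
    "Sales and marketing": "order_to_cash",
    "Platform": "platform_regression",
}

regression_plan = sorted({area_map[u["component"]] for u in updates if u["component"] in area_map})
print("Functional areas to regression test:", regression_plan)
```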

 

Cloud environments can be validated!  A clear, practical approach that embraces lean validation and continuous testing is key, as is cloud governance to ensure data integrity and sustained compliance.

Cloud technologies are here to stay.  Regulators don’t object to the use of the cloud; they want to know how you are managing it and ensuring the integrity of the data.  They also want you to confirm that you are maintaining the validated state in the cloud.  The principles of validation endure in the cloud.  Just because you are in a cloud environment does not mean validation principles no longer apply.  Consider the impact of cybersecurity in your cloud environment and adopt continuous testing strategies to ensure sustained compliance.


Automated Validation Best Practices

Automation is the key to lean validation practices.  Although many validation processes are still paper-based manual processes, there are best practices that support Independent Verification and Validation (IV&V) processes that drive efficiency and compliance.

BEST PRACTICE 1 – Establish Independence

The IEEE 1012 Standard For System, Software and Hardware Verification and Validation states that Independent Verification and Validation (IV&V) is defined by three parameters:

  1. Technical Independence – ensures independence from the development team.  Technical independence is intended to provide a fresh point of view in the examination of software applications to help better detect subtle errors that may be overlooked by those that are too close to the solution such as the development or system implementation team.
  2. Managerial Independence – helps to ensure that validation is carried out by an organization separate and distinct from the development or program management team.  Managerial independence ensures that the validation team has the autonomy to independently select the validation methodology, processes, schedule, tasks, and testing strategy to independently confirm the suitability of applications for their intended use.  Managerial independence also ensures that the IV&V team can objectively report all validation test results without any restrictions or approval from the development or system integration team.  This is a very important level of independence.
  3. Financial Independence – ensures that there are no financial ties between the IV&V team and the development team, preserving objectivity.  This level of independence is designed to prevent situations where financial ties may adversely influence or pressure IV&V personnel to deliver less than objective, authentic test results.

The IEEE 1012 standard speaks of various forms of independence, but the bottom line is that the IV&V team should be as independent as possible from the development team.  It is not good practice for development teams to also validate their own development projects; objectivity is sacrificed when this is done.  Following this best practice ensures an objective examination of your software projects, free of bias and undue external influence from the development team.

BEST PRACTICE 2 – Continuous Testing In The Cloud

Cloud environments can be validated.  However, there are several issues and characteristics of cloud environments that challenge traditional assumptions regarding validation efforts.

  • Continuous changes in the cloud
  • Inability to conduct supplier audits for large cloud vendors (Microsoft, Oracle, et al)
  • Maintaining the Validated State

Cloud system environments change continuously.  Validation engineers are not used to uncontrolled changes in system environments.  We have been taught that all changes to a system environment, once it has been validated, must undergo change control.  Thus, all changes are subject to a change request process.

In cloud environments, we don’t control when changes are made to systems. Cloud vendors may change disk drives, virtual servers, or memory, apply patches and updates, and make many other system changes that may affect your validated system environment. So the question becomes: how do you maintain the validated state in the cloud? There are several best practices designed to answer this question. First of all, you need a way to determine what changes are made in the cloud. Take, for example, Microsoft Office 365 or Microsoft Dynamics 365. Microsoft has established what is known as the Trust Center. The Microsoft Trust Center is an excellent resource that provides information about how Microsoft manages its cloud environment. The first consideration when selecting cloud technology is who your provider is. All cloud providers are not created equal. There are some cloud providers that take compliance, security, data integrity, and governance seriously, and others that are more general or consumer oriented in nature and do not prioritize these characteristics.

Microsoft, continuing the example, has achieved several key industry certifications for its cloud environment. But most importantly, through the Trust Center it has provided visibility and clear communication as to how it manages the cloud. From a testing perspective, Microsoft solves one of the biggest problems you have in the cloud: how do you know what changes were made in the cloud and when the cloud provider made them? Microsoft provides a list of updates by product application and tells you exactly what changes were made, the date the changes were made, and whether the updates or patches were successfully applied in the environment. With the Microsoft cloud it is no longer the case that you don’t know when Microsoft has changed the environment. Thanks to this transparency, you know exactly what changes are made to the environment, which brings us to the best practice of continuous testing.

Since cloud environments change so often, you should employ a strategy known as continuous testing. Continuous testing is essentially validation testing at predefined (user-defined) intervals to ensure that cloud environments are maintained in a validated state. To successfully employ a continuous testing strategy, automation is essential. This is not a process that you would want to carry out manually, although it can be if you so desire. Automation adds a dimension of efficiency and consistency to the process.

To employ continuous testing, you want to establish a reusable test script library. This is essential. Once you have validated your system using automated tools such as ValidationMaster™, you will have established a reusable test script library. The test scripts developed can be used for subsequent regression testing and can be automated to save both time and money. For continuous testing, you would conduct an impact analysis to determine the impact of changes made to the cloud environment. Once you conduct an impact analysis, you would perform a risk assessment to ensure that you effectively monitor risk in accordance with ISPE GAMP 5®.  You then select from your reusable test script library the regression tests suitable for continuous testing in your cloud environment. Once these test scripts are executed, you can document the actual results and provide a level of due diligence for regulators showing that you are maintaining your cloud environment in a validated state.
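
One common way to rank the impacted areas coming out of that risk assessment is a severity by probability by detectability score. GAMP 5 does not prescribe the exact scoring below; the function, scales, and values are illustrative only.

```python
# One common risk-prioritization pattern (severity x probability x detectability).
# The scoring function, 1-3 scales, and values are illustrative, not a prescribed method.

def risk_priority(severity: int, probability: int, detectability: int) -> int:
    """Higher numbers mean higher priority for regression testing (1-3 scales assumed)."""
    return severity * probability * detectability

impacted_functions = {
    "journal_posting": {"severity": 3, "probability": 2, "detectability": 2},
    "report_layouts":  {"severity": 1, "probability": 2, "detectability": 1},
}

ranked = sorted(
    impacted_functions.items(),
    key=lambda item: risk_priority(**item[1]),
    reverse=True,
)

for name, factors in ranked:
    print(f"{name}: priority score {risk_priority(**factors)}")
```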

BEST PRACTICE 3 – Select The Right Automation Tools

Another key best practice is selecting the right automation tools for validation. How do you know how to select the right tools? There are two types of automated tools on the market: (1) point solutions and (2) enterprise solutions. A point solution is one that addresses a single element of the validation process. For example, the management of requirements is an essential core component of any validation exercise. There are requirements management point solutions on the market that would assist you in effectively managing user, functional, and design requirements for any validation initiative. Testing is another core element of the validation process. There are many solutions out there that allow you to capture and record test scripts, and some even allow you to execute test scripts online. The problem with point solutions is that they only cover one step in the process. When validating systems, it is not uncommon to use upwards of 17 different systems (point solutions) to prepare validation documentation and due diligence. This does not make much sense and is often fraught with duplication of effort and inefficiencies that cost time and money.

To drive lean validation processes and to achieve automation best practice, you need an enterprise validation management solution to fully automate the validation process – not just one part of it. An enterprise validation management system has the capability of managing validation planning documentation such as the validation master plan, risk assessment, validation project plan, and other related documentation.

As a matter of fact, an enterprise validation management system includes an enterprise content management system as a core component of the overall solution. The key deliverables from the validation process are documents. Lots of documents! It stands to reason that an enterprise content management system would be a core part of the overall solution. An enterprise validation management system should also include a requirements management system with the ability to manage any type of requirement. An automated test engine should be at the core of such a solution, with the ability not only to record test scripts but also to execute them online and capture objective actual results.

The system should have a robust reporting engine that facilitates the efficient output of any type of report required as part of the due diligence for validation. Quality management is at the core of the validation process. Therefore, an enterprise validation management system should include capabilities for all aspects of quality, including change control, audit management, CAPA, nonconformances, training, periodic review, trend analysis, and validation key performance indicators. The system should provide real-time statistics on the overall health and performance of validation processes. The system should be built on standard technology adaptable to any systems environment.

It is best practice to select and deploy the RIGHT tools to support enterprise validation processes.  Point solutions will only get you so far.  Selecting the proper automation tools can save both time and money and deliver a single source of truth for your validation projects.

BEST PRACTICE 4 – Establish a Reusable Test Library

One of the most laborious tasks during software validation is TESTING.  The test script development, execution and documentation process takes considerable time if you do it correctly.  From the establishment of a test environment through the development of test scripts, the validation engineer must carefully document expected and actual results sufficient to prove that systems meet their intended use and have the requisite quality expected of such systems.

Developing test scripts takes time.  Traceability also takes time.  When systems are validated, you want the ability to retest a system as required without having to rewrite test scripts over and over again.  A reusable test script library has been one of the most effective practices I have implemented.

Reusable test scripts can save up to 60% of the time that would otherwise be required to rewrite test scripts. Establishing a reusable test script library with FULLY AUTOMATED scripts can save even more time and money, in that the fully automated scripts can be executed without human intervention.  You have the ability to set a date/time when test scripts are to be executed, and the system (ValidationMaster™) will automatically execute them and report the results back in a fraction of the time it takes to manually execute them.  It is therefore best practice to establish a reusable test script library for enterprise validated systems.

BEST PRACTICE 5 – Document Clear Objective Test Evidence

Documentation of clear, objective test evidence is essential for validation. Many automated validation management systems do not have reporting engines robust enough to allow you to output documents in your unique format. It is best practice to employ an enterprise validation management system that allows you to present clear, objective test evidence in your unique document formats as specified by your SOPs. ValidationMaster™ has a comprehensive reporting engine that allows you to deliver validation reports in your unique format, as required by this best practice.


BEST PRACTICE 6 – Establish a Single Source of Truth For Validation Deliverables

For many organizations that conduct validation on paper, there is no single source of truth for validation. Some validation assets are housed within the document management system. Part of the validation package is kept within a code management system such as SourceSafe. Part of the deliverables may be kept within a requirements management system. Some incident reports may be kept in an incident management system. Other validation deliverables may be paper-based. In many cases, validation engineers attempt to keep signed copies of documentation in multiple three-ring binders. This was the traditional practice in the ’80s.

For lean validation practices that support automation, it is best practice to establish a single source of truth for all validation deliverables. This means that there is a single point where all validation deliverables, pre- and post-execution, are stored.  A single source of truth facilitates better auditing and eliminates the common occurrence of losing documentation needed to support an audit exercise. This best practice is essential to achieving validation excellence.

One of the core benefits of ValidationMaster™ is delivering a single source of truth for all validation projects. You can store all of your validation projects in a single, easy-to-use system and reference them for internal or external audits. For lean validation, this is current best practice.

Following the six key best practices above can save both time and money. Validation has not changed much over the last 40 years, but the way we manage it has changed significantly. Cloud validation, mobility, cybersecurity, and a host of other factors change the way we look at computer systems validation and manage it. I hope you find these best practices useful and effective in helping you deliver your next validation project on time and within budget.  We use ValidationMaster™ in our practice every day to support our lean validation processes, saving our clients considerable time and money.  What’s in your validation office?