Validating Microsoft Dynamics 365: What You Should Know

Microsoft Dynamics 365 and Azure are gaining popularity within the life sciences industry. I am often asked the question about how to validate such a system given its complexity and cloud-based nature. The purpose of this blog post is to answer this question. The outline of this blog post is as follows.

  • Understanding the Changing State of Independent Validation and Verification
  • Strategies for Installation Qualification of Microsoft Azure
  • Practical Strategies for Operational and Performance Qualification Testing
  • Continuous Testing in A Cloud Environment
  • Maintaining the Validated State and Azure

To begin our discussion, it is helpful to consider what keeps regulators up at night. They are concerned primarily about four key aspects of validated computer systems:

  1. Vulnerability – How Vulnerable Are Our Cloud Computing System Environments?
  2. Data Integrity – What Is Your Strategy to Maintain Data Integrity Within Your Validated Computer Systems?
  3. System Security and Cyber Security – How Do You Keep Sensitive Information Secure and How Do You Protect a Validated Computer System Against Cyber Threats?
  4. Quality Assurance – How Do You Minimize Risk to Patient Safety and Product Quality Impacted by The Validated System?

One of the first tasks validation engineers must be concerned with is supplier auditing. When using commercial off-the-shelf software such as Microsoft Dynamics 365 and Azure, a supplier audit is a mandatory part of the validation process. Twenty years ago, when we prepared validation documentation for computer systems, validation engineers often conducted a paper audit or an actual physical audit of a software provider. Supplier audits conducted on-site provided a rigorous overview of what software companies were doing and the quality state of their software development lifecycle process. It was possible to examine a supplier’s processes and determine whether the software vendor was a quality supplier.

Most software vendors today, including Microsoft, do not allow on-site vendor audits. Some validation engineers have told me they view this as a problem. However, the Microsoft Trust Center is a direct response to the industry’s need for transparency. Personally, I think the Trust Center is the best thing Microsoft has done for the life sciences industry. Not only does it highlight all of the Service Organization Control reports (SOC 1/SOC 2/SOC 3 and ISO/IEC 27001), but it also summarizes Microsoft’s compliance with the Cloud Security Alliance controls as well as the NIST Cybersecurity Framework. I would strongly recommend that you visit the Microsoft Trust Center at https://www.microsoft.com/en-us/trustcenter. The latest information posted to the site is a section on the General Data Protection Regulation (GDPR) and how the platform can help keep data safe and secure. As a validation engineer, I find myself visiting this site often. You will see sections specifically for Microsoft 365 and Azure.

From a supplier auditing perspective, I use the information found on the Microsoft Trust Center to facilitate a “desk audit” of the vendor. Many of the questions that I would ask during an on-site audit are answered on this website. As part of my new validation strategy, I include the Service Organization Control reports as part of my audit due diligence. The Trust Center includes in-depth information about security, privacy, and compliance offerings, policies, features, and practices across all of Microsoft’s cloud products.

If I were conducting an on-site audit of Microsoft, I would want to know how they are establishing trust in the cloud. Many of the questions that I would ask in person are answered on this website. It should be noted that Service Organization Control reports are created not by Microsoft but by trusted third-party organizations certified to deliver them. These reports include an assessment of how well Microsoft is complying with the stated controls for cloud management and security. This is extremely valuable information.

From a validation perspective, I attach these reports to my validation package as part of the supplier audit due diligence. There may be instances where you conduct your own due diligence beyond the reports, but they provide an excellent start to understanding what Microsoft is doing.

Microsoft has adopted the Cloud Security Alliance (CSA) Cloud Controls Matrix to establish the controls for the Microsoft Azure platform. These controls cover areas including:

  • Security Policy and Procedures
  • Physical and Environmental Security
  • Logical Security
  • System Monitoring and Maintenance
  • Data Backup, Recovery and Retention
  • Confidentiality
  • Software Development/Change Management
  • Incident Management
  • Service Level Agreements
  • Risk Assessments
  • Documentation/Asset Management
  • Training Management
  • Disaster Recovery
  • Vendor Management

The Cloud Controls Matrix includes 136 different controls that cloud vendors such as Microsoft must comply with. Microsoft has mapped out on its Trust Center site specifically how it addresses each of the 136 controls in the Azure/Dynamics 365 platform. This is excellent knowledge and due diligence material for validation engineers and represents a good starting point for documenting the quality of the supplier.

Is Microsoft Azure/Dynamics 365 secure enough for life sciences? In my personal opinion, yes, it is. Companies still must conduct due diligence to ensure that Azure and Dynamics 365 meet their business continuity requirements and business process requirements for enterprise resource planning. One thing is certain: the cloud changes things. You must revise your validation strategy to accommodate it. Changes in how supplier audits are conducted are just one example.

The next challenge in validating Microsoft Dynamics 365 and Azure is conducting validation testing in the cloud environment. It should be understood that the principles of validation endure whether or not you are in a cloud environment. You still must conduct a rigorous amount of testing, including both positive and negative testing, to confirm that Azure/Dynamics 365 meets its intended use. However, there are changes in the way we conduct installation qualification in a cloud environment. Some believe that installation qualification is no longer a valid testing process since cloud environments are provisioned. This is not correct. You still must ensure that cloud environments are provisioned in a consistent, repeatable manner that supports quality.

It is helpful to understand that Microsoft Dynamics 365 is provisioned using Microsoft Lifecycle Services. The Lifecycle Services application is designed for rapid implementation and deployment. However, it should be clearly understood that Lifecycle Services is itself an application, and therefore a potential point of failure in the process. The use of Lifecycle Services must be documented, and the provisioning of the environment must be confirmed through the installation qualification process.
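Conceptually, the installation qualification of a provisioned environment amounts to comparing the environment’s actual configuration against an approved baseline and documenting any deviations. The sketch below illustrates that idea in Python; the setting names and values are hypothetical examples, not actual Lifecycle Services or Azure configuration keys.

```python
# Sketch of an installation-qualification (IQ) check comparing a provisioned
# environment's actual settings against an approved baseline.
# The setting names/values below are illustrative, not real LCS/Azure keys.

APPROVED_BASELINE = {
    "region": "East US",
    "environment_tier": "Production",
    "app_version": "10.0",
    "tls_enforced": True,
}

def run_iq_check(actual_settings: dict) -> list:
    """Return a list of deviations; an empty list means the IQ check passes."""
    deviations = []
    for key, expected in APPROVED_BASELINE.items():
        actual = actual_settings.get(key, "<missing>")
        if actual != expected:
            deviations.append(f"{key}: expected {expected!r}, found {actual!r}")
    return deviations

if __name__ == "__main__":
    provisioned = {
        "region": "East US",
        "environment_tier": "Production",
        "app_version": "10.0",
        "tls_enforced": True,
    }
    issues = run_iq_check(provisioned)
    print("IQ PASS" if not issues else "IQ FAIL: " + "; ".join(issues))
```

In practice the “actual settings” would come from the environment itself (or from an export of its configuration), and the deviation list would become part of the executed IQ record.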

From an operational qualification perspective, validation testing remains pretty much the same. Test cases are traced to their respective user requirements and executed with the same rigor as in previous validation exercises.

Performance qualification is also conducted in much the same manner as before. Since the environment is in the cloud and outside of your direct control, it is very important that network qualification as well as performance qualification be conducted to ensure that no performance anomalies occur in your environment. In the cloud you may have performance issues related to system resources, networks, storage arrays, and many other factors. Performance tools may be used to confirm that the system is performing within an optimal range as established by the validation team. Performance qualification can be conducted either before the system goes live, by placing a live load on the system, or after validation; this is at the discretion of the validation engineer.
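To make the measurement side of this concrete, a minimal performance check can be sketched as timing repeated calls to an operation and comparing a high percentile against an agreed threshold. The simulated operation, sample count, and 2-second acceptance criterion below are illustrative assumptions only, not recommended values.

```python
# Minimal sketch of a performance-qualification measurement: time repeated
# executions of a system operation and compare the 95th percentile against
# a threshold set by the validation team. The operation here is simulated.

import time

def measure_response_times(operation, runs: int = 20) -> list:
    """Execute the operation repeatedly and record elapsed seconds per run."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        samples.append(time.perf_counter() - start)
    return samples

def percentile_95(samples: list) -> float:
    """Nearest-rank 95th percentile of the recorded samples."""
    ordered = sorted(samples)
    index = max(0, int(round(0.95 * len(ordered))) - 1)
    return ordered[index]

if __name__ == "__main__":
    samples = measure_response_times(lambda: time.sleep(0.001))
    p95 = percentile_95(samples)
    threshold = 2.0  # seconds; illustrative acceptance criterion only
    print(f"p95={p95:.4f}s -> {'PASS' if p95 <= threshold else 'FAIL'}")
```

A real PQ would replace the simulated operation with an actual transaction (for example, posting a sales order) and record the results as objective evidence.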

Maintaining the validated state within a cloud environment requires embracing the principle of continuous testing. It has often been said that the cloud is perpetually changing. This is one of the reasons many people believe that you cannot validate a system in the cloud. However, you can validate cloud-based systems such as Microsoft Dynamics 365 and Azure; continuous testing is the key. What do I mean by continuous testing? Does it mean that we perpetually test the system every single day, forever? Of course not! Continuous testing is a new strategy, applicable to all cloud-based validated systems, whereby regression testing occurs at predetermined intervals. Automated systems such as ValidationMaster™ can be the key to facilitating this strategy.

Within ValidationMaster™ you can establish a reusable test script library. This is important because in manual, paper-based validation processes, the most laborious part of validation is the development and execution of test scripts. This is why many people cringe at the very notion of continuous testing. However, automated systems make this much easier. In ValidationMaster™ each test script is automatically traced to a user requirement. Thus, during regression testing, I can select a set of test scripts based on my impact analysis and execute them during off-peak hours in my test environment to see whether there has been any impact to my validated system. These test scripts can be run fully automated using Java-based test scripts, or they can be run using an online validation testing process. Revalidation of the system can happen in a matter of hours rather than days or weeks using this process.
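The impact-based selection described above can be pictured as a simple lookup over the requirement-to-script trace matrix: given the requirements touched by a change, pull only the traced scripts from the library. The requirement IDs and script names below are hypothetical, and the logic is a deliberate simplification of what a tool such as ValidationMaster™ would automate.

```python
# Sketch of impact-based regression test selection from a reusable test
# script library. Each script is traced to one or more requirements; given
# the requirements impacted by a change, select only the affected scripts.
# All identifiers are illustrative.

TRACE_MATRIX = {
    "TS-001": ["UR-010"],            # order entry
    "TS-002": ["UR-010", "UR-020"],  # order entry + pricing
    "TS-003": ["UR-030"],            # inventory posting
    "TS-004": ["UR-040"],            # audit trail
}

def select_regression_tests(impacted_requirements: set) -> list:
    """Return the test scripts traced to any impacted requirement."""
    return sorted(
        script for script, reqs in TRACE_MATRIX.items()
        if impacted_requirements.intersection(reqs)
    )

if __name__ == "__main__":
    # A patch that changes pricing logic impacts UR-020 only.
    print(select_regression_tests({"UR-020"}))  # ['TS-002']
```

The impact analysis itself (deciding which requirements a vendor update touches) remains a documented engineering judgment; the selection step is what automation makes cheap.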

Through continuous testing, you can review the changes that Microsoft has made to both Azure and Dynamics 365 online. Yes, this information is posted online for your review, and it answers the question of how you know what changes are made and when. You can determine how often you test a validated system; there is no regulation that codifies how often this should occur. It’s entirely up to you. However, as a good validation engineer you know that it should be based on risk: the riskier the system, the more often you should test; the less risky the system, the less due diligence is required. Nevertheless, cloud-based systems should be subject to continuous testing to ensure compliance and maintain the validated state.

There are many other aspects of supporting validation testing in the cloud, but suffice it to say that Microsoft Dynamics 365 and Azure can be validated and have been successfully validated for many clients. Microsoft has done a tremendous service to the life sciences industry by transparently providing information through the Microsoft Trust Center. As a validation engineer I find this one of the most innovative things they’ve done! It provides much of the information that I need, along with confirmation through third-party entities that Microsoft is complying with the Cloud Security Alliance controls and the NIST Cybersecurity Framework. I would encourage each of you to review the information on the Microsoft Trust Center. Of course, there will always be skeptics of anything Microsoft does, but let’s give credit where credit is due.

The Microsoft Trust Center is a good thing. The company has done an excellent job of opening up and sharing how it views system security, quality, and compliance. This information was never made freely available before, and it is available now. I have validated many Microsoft Dynamics AX systems as well as Dynamics 365 systems. The software quality within these systems is good, and with the information Microsoft has provided, you can have confidence that deploying a system in the Azure environment is not only a good business decision, if you have selected this technology for your enterprise, but also a sound decision regarding quality and compliance.


Accelerating Validation in the 21st Century

Each day in companies across the globe, employees are being asked to do more with less.  The mantra of the business community in the 21st century is “accelerating business”.  You see it in marketing and all types of corporate communication.  In the validation world accelerating system implementation, validation and deployment is the cry of every chief information officer and senior executive.

Accelerating validation activity means different things to different people.  I define it as driving validation processes efficiently and effectively without sacrificing quality or compliance.  Most manual processes are paper-based, inefficient, and cumbersome.  Creating paper-based test scripts requires a lot of cutting/pasting of screenshots and other objective evidence to meet regulatory demands.  Getting validation information into templates with proper headers and footers in compliance with corporate documentation guidelines is paramount.

The question for today is “HOW DO WE ACCELERATE VALIDATION WITHOUT SACRIFICING QUALITY AND COMPLIANCE?”

Earlier in my career, I developed “Validation Toolkits” for Documentum and Qumas.  A Validation Toolkit was an early version of a validation accelerator.  Many companies now have them.  These toolkits come with pre-defined document templates and test scripts designed to streamline the validation process.  They are called “toolkits” for a reason – they are not intended to be complete validation projects.  They are intended to be a starting point for validation exercises.

The FDA states in the General Principles of Software Validation; Final Guidance For Industry and FDA Staff issued on January 11, 2002, “…If the vendor can provide information about their system requirements, software requirements, validation process, and the results of their validation, the medical device manufacturer (or any company) can use that information as a beginning point for their required validation documentation…”

Notice that the FDA says “… use a beginning point” for validation – NOT USE IT EXCLUSIVELY AS THE COMPLETE VALIDATION PACKAGE.  This is because the FDA expects each company to conduct its own due diligence for validation.  Also note that the FDA allows the use of such documentation if available from a vendor.

Beware that some vendors believe their “toolkits” can substitute for a rigorous review of their software applications.  They cannot.  In addition to leveraging the software vendor’s information, you should ALWAYS conduct your own validation due diligence to ensure that applications are thoroughly examined without bias by your team or a designated validation professional.

It is an excellent idea to leverage vendor information if provided.  There is a learning curve with most enterprise applications and the use of vendor-supplied information can help streamline the process and add value.  The old validation toolkits were delivered with a set of document templates and pre-defined test scripts often resulting in hundreds of documents depending on the application.  In most cases, companies would edit the documents or put them in their pre-defined validation templates for consistency.  This resulted in A LOT OF EDITING!  In some cases, depending on the number of documents, time-savings could be eroded by just editing the existing documents.

THERE IS A BETTER WAY!  

What if YOUR unique validation templates were pre-loaded into an automated Enterprise Validation Lifecycle Management system?  What if the Commercial-off-the-Shelf (COTS) product requirements were pre-loaded into this system?  Further, what if a full set of OQ/PQ test scripts were traced to each of the pre-loaded requirements?  What if you were able to export all of these documents on day one in your unique validation templates?  What if you only had to add test scripts that represented the CHANGES you were making to a COTS application versus writing all of your test scripts from scratch?  Finally, what if you could execute your test scripts online and automatically generate a requirements trace matrix and validation summary report with PASS/FAIL status?
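The last of those ideas, auto-generating a requirements trace matrix and a summary report with PASS/FAIL status from executed test results, can be sketched in a few lines. The record fields and statuses below are hypothetical and do not reflect any actual product’s data model.

```python
# Sketch of auto-generating a requirements trace matrix and a validation
# summary (with overall PASS/FAIL) from executed test results.
# Field names and IDs are illustrative assumptions.

test_results = [
    {"script": "OQ-001", "requirement": "UR-010", "status": "PASS"},
    {"script": "OQ-002", "requirement": "UR-020", "status": "PASS"},
    {"script": "PQ-001", "requirement": "UR-030", "status": "FAIL"},
]

def trace_matrix(results):
    """Map each requirement to the scripts that verify it and their status."""
    matrix = {}
    for r in results:
        matrix.setdefault(r["requirement"], []).append((r["script"], r["status"]))
    return matrix

def summary(results):
    """Roll up the run: total executed, failed scripts, overall verdict."""
    failed = [r["script"] for r in results if r["status"] != "PASS"]
    return {"total": len(results), "failed": failed,
            "overall": "PASS" if not failed else "FAIL"}

if __name__ == "__main__":
    for req, scripts in sorted(trace_matrix(test_results).items()):
        print(req, "->", scripts)
    print(summary(test_results))
```

The point of the sketch is simply that once results are captured as structured data rather than paper, both artifacts fall out of the data automatically instead of being assembled by hand.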

This is the vision of the CloudMaster 365™ Validation Accelerator.  CloudMaster 365™ is the next generation of the validation toolkit concept.  The ValidationMaster™ Enterprise Validation Management system is the core of the CloudMaster 365™ Validation Accelerator.  The application can be either hosted or deployed on-premises.  This is a great way to jump-start your Microsoft Dynamics 365® validation project.

CloudMaster 365

The CloudMaster 365™ Validation Accelerator includes three key components:

  • ValidationMaster™ Enterprise Validation Management System
  • Comprehensive set of user requirements for out-of-the-box features
  • Full set of IQ/OQ/PQ test scripts and validation document templates

The Validation Accelerator is not intended to be “validation out of the box”.  It delivers the foundation for your current and future validation projects, enabling “Level 5” validation processes.  If you consider the validation capability maturity model, most validation processes at Level 1 are ad hoc, chaotic, undefined, and reactive to project impulses and external events.

Validation Maturity Model

Level 5 validation is an optimized, lean validation process facilitated by automation.  The test process is optimized, quality control is assured, and there are specific measures for defect prevention and management.  Level 5 cannot be achieved without automation; here, the automation is powered by ValidationMaster™.  With the CloudMaster 365™ strategy, instead of having “toolkit” documents suitable for only one project, you have a system that manages your COTS or bespoke application and the full validation lifecycle over time, as required by regulators.  This is revolutionary in terms of what you get.

CloudMaster 365™ TRULY ACCELERATES VALIDATION.  It can not only be used for Microsoft Dynamics 365®, but is also available for other applications including:

  • CloudMaster 365™ Validation Accelerator For Dynamics 365®
  • CloudMaster 365™ Validation Accelerator For Merit MAXLife®
  • CloudMaster 365™ Validation Accelerator For Yaveon ProBatch®
  • CloudMaster 365™ Validation Accelerator For Oracle e-Business® or Oracle Fusion®
  • CloudMaster 365™ Validation Accelerator For SAP®
  • CloudMaster 365™ Validation Accelerator For BatchMaster®
  • CloudMaster 365™ Validation Accelerator For Edgewater EDGE®
  • CloudMaster 365™ Validation Accelerator For VeevaVault®
  • CloudMaster 365™ Validation Accelerator For Microsoft SharePoint®

Whether you are validating an enterprise application or seeking to drive your company to higher levels of efficiency, you should consider how best to accelerate your validation processes not solution by solution but in a more holistic way that ensures sustained compliance across all applications.  If you are embarking on the validation of Microsoft Dynamics 365®, or any of the applications listed above, the CloudMaster 365™ Validation Accelerator is a no-brainer.  It is the most cost-effective way to streamline validation and ensure sustained compliance over time.

Accelerating Validation: 10 Best Practices You Should Know in 2018

As we open the new year with resolutions and new fresh thinking, I wanted to offer 10 best practices that should be adopted by every validation team.  The major challenges facing validation engineers every day include cyber threats against validated computer systems, data integrity issues, maintaining the validated state, cloud validation techniques and a myriad of other issues that affect validated systems.

I have been championing the concepts of validation maturity and lean validation throughout the validation community.  In a recent blog post, I highlighted reasons why validation as we have known it over the past 40 years is dead.  Of course, I mean this somewhat tongue in cheek.

The strategies included in the FDA guidance, and those we used to validate enterprise systems in the past, do not adequately address IoT, cloud technologies, cybersecurity, enterprise apps, mobility, or deployment approaches such as Agile – all of which change the way we think about and manage validated systems.

The following paragraphs highlight the top 10 best practices that should be embraced for validated computer systems in 2018.

Practice 1 – Embrace Lean Validation Best Practices

Lean validation is the practice of eliminating waste and inefficiency throughout the validation process while optimizing the process and ensuring compliance. Lean validation is derived from lean manufacturing practices, where non-value-added activity is eliminated. As a best practice, it is good to embrace lean validation within your processes. To fully embrace lean, it is necessary to automate your validation processes; thus, it is strongly recommended that automation be part of your consideration for updating your validation processes in 2018. My recent blog post on lean validation discusses lean in detail and how it can help you not only optimize your validation process but also achieve compliance.

Practice 2 – Establish Cybersecurity Qualification and SOP

I have noted in previous blog posts that cyber threats are the elephant in the room. Although validation engineers pledge to confirm that the system meets its intended use through the validation process, many organizations do not treat cyber security as a clear and present danger to validated systems. I believe this is a significant oversight in many validation processes. Thus, I have developed a fourth leg for validation called cyber security qualification, or CyQ. A cyber security qualification is an assessment of the readiness of your processes and controls to protect against a cyber event. It includes testing to confirm the effectiveness and strength of your controls. In addition to IQ, OQ, and PQ, I have added CyQ as the fourth leg of my validation. It is strongly recommended that you consider this best practice in 2018.
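At its simplest, a CyQ assessment can be pictured as scoring a set of security controls against collected evidence and flagging gaps. The control names and the effective/ineffective scheme below are illustrative assumptions for the sketch, not a published standard or the author’s actual CyQ protocol.

```python
# Sketch of a cybersecurity-qualification (CyQ) scoring pass: mark each
# security control effective or ineffective based on supplied evidence,
# then summarize readiness. Controls and scoring are illustrative only.

CYQ_CONTROLS = {
    "access_control":   "Role-based access enforced and periodically reviewed",
    "patch_management": "Security patches applied within a defined window",
    "backup_recovery":  "Backups verified by periodic test restores",
    "intrusion_detect": "Monitoring and alerting in place",
}

def assess_cyq(evidence: dict) -> dict:
    """Return per-control findings plus an overall readiness verdict."""
    findings = {name: ("EFFECTIVE" if evidence.get(name) else "INEFFECTIVE")
                for name in CYQ_CONTROLS}
    gaps = [n for n, v in findings.items() if v == "INEFFECTIVE"]
    findings["_overall"] = "READY" if not gaps else "GAPS: " + ", ".join(gaps)
    return findings

if __name__ == "__main__":
    evidence = {"access_control": True, "patch_management": True,
                "backup_recovery": True, "intrusion_detect": False}
    print(assess_cyq(evidence)["_overall"])
```

A real CyQ would of course include executed penetration or control-effectiveness tests behind each evidence item; the sketch only shows how the findings roll up into a qualification record.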

Practice 3 – Automate Validation Testing Processes

In 2018, many validation processes are still paper-based. Companies are still generating test scripts in Microsoft Excel or Microsoft Word and manually tracing test scripts to requirements. This is not only time-consuming but very inefficient, and it causes validation engineers to focus more on formatting test cases than on finding bugs and other software anomalies that could affect the successful operation of the system. Lean validation requires an embrace of automated testing systems. While there are many point solutions on the market that address different aspects of validation testing, for 2018 it is strongly recommended that you embrace enterprise validation management as part of your overall strategy. An enterprise validation management system such as ValidationMaster™ manages the full lifecycle of computer systems validation. In addition to managing requirements (user, functional, design), the system also facilitates automated testing, incident management, release management, and agile software project methodologies. In 2018, the time has come to develop a reusable test script library to support future regression testing and maintain the validated state.

An enterprise validation management system will also offer a single source of truth for validation and validation testing assets.  This is no longer a luxury but mandatory when you consider the requirement for continuous testing in the cloud. This is a best practice whose time has come. In 2018, establish automated validation processes to help maintain the validated state and ensure software quality.

Practice 4 – Develop a Cloud Management For Validated Systems SOP

Many of today’s applications are deployed in the cloud. Office 365, Dynamics 365, and other such applications have gained prominence and popularity within life sciences companies. In 2018, you must have a cloud management standard operating procedure that provides governance for all cloud activities. This SOP should define how you acquire cloud technologies and what your specific requirements are. Failure to have such an SOP is a failure to understand and effectively manage cloud technologies for validated systems. I have often heard validation engineers say that since you are in the cloud, the principles of validation no longer apply. This is absolutely not true. In a cloud environment, the principles of validation endure. You still must qualify cloud environments, and the responsibilities between you and the cloud provider must be clearly defined. If you don’t have a cloud SOP or don’t know where to start in writing one, write me and I will send you a baseline procedure.

Practice 5 – Establish Validation Metrics and KPIs

Peter Drucker said you can’t improve what you can’t measure. It is very important that you establish validation metrics and key performance indicators for your validation process in 2018. You need to define what success looks like. How will you know when you’ve achieved your objectives? Some key performance indicators may include the number of errors, the number of systems validated, and the number of incidents (which should be trending downward), among other measures. Our ValidationMaster™ portal allows you to establish KPIs for validation and track them over time. In 2018 it’s important to embrace this concept. I have found over the years that most validation activities are not looked at from a performance perspective. They are often inefficient and time-consuming, and no one actually measures their value to the organization or any key performance indicators. The time has come to start measuring this as appropriate.
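As a small illustration of KPI tracking, the sketch below classifies whether per-period incident counts are trending downward, the direction the post recommends watching. The figures are made up, and real KPI dashboards would track several measures, not one.

```python
# Sketch of one validation KPI: classify whether incident counts per
# reporting period are trending downward. Data is illustrative only.

def incident_trend(counts_by_period: list) -> str:
    """Compare each period with the previous one and classify the trend."""
    if len(counts_by_period) < 2:
        return "insufficient data"
    deltas = [b - a for a, b in zip(counts_by_period, counts_by_period[1:])]
    if all(d <= 0 for d in deltas):
        return "improving"
    if all(d >= 0 for d in deltas):
        return "worsening"
    return "mixed"

if __name__ == "__main__":
    quarterly_incidents = [14, 11, 9, 6]  # hypothetical quarterly counts
    print(incident_trend(quarterly_incidents))  # improving
```

Even a crude classification like this gives a validation team something objective to report each quarter, which is the point of establishing KPIs in the first place.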

Practice 6 – Use Validation Accelerators For Enterprise Systems

Across the globe, validation engineers are being asked to do more with less. Organizations are demanding greater productivity and compliance. As you look at enterprise technologies such as Microsoft Dynamics 365®, you need to consider the importance of this environment and how it should be validated. There is a learning curve with any enterprise technology. You should recognize that although you may know a lot about validating systems in general, you may not know the ins and outs of this particular technology. The FDA stipulates in its guidance that you may use vendor information as a starting point for validation. Understand that there is no such thing as validation out of the box; therefore you must do your due diligence when it comes to validated systems.

In 2018 we recommend that for systems such as Microsoft Dynamics 365, BatchMaster™, Edgewater Fullscope EDGE®, Veeva® ECM systems, Oracle e-Business®, and many others, that you use a Validation Accelerator to jumpstart your validation process. Onshore Technology Group offers Validation Accelerators for the aforementioned software applications that will help streamline the validation process. Most noteworthy, our validation accelerators come bundled with a full enterprise validation management system, ValidationMaster.  This helps to deliver a single source of truth for validation.

Practice 7 – Fill Staffing Gaps With Experts on Demand

In 2018, you may find yourself in need of validation expertise for your upcoming projects but without sufficient internal resources to fulfill those needs. A burgeoning trend today is the outsourcing of validation staff to achieve your objectives – validation staffing on demand. OnShore offers an exclusive service known as ValidationOffice℠ which provides validation team members on demand. We recruit and retain top talent for your next validation project. We offer validation engineers, technical writers, validation project managers, validation test engineers, validation team leaders, and other staff with deep domain experience and validation competence. In 2018 you may want to forgo direct hires and use ValidationOffice℠ to fulfill your next staffing need.

Practice 8 – Establish Validation Test Script Library

One of the primary challenges with validation testing is regression testing. Once a system is validated and brought into production, any software changes, patches, updates, or other required changes drive the need for regression testing. This often results in a full rewrite of manual, paper-based test scripts. Today’s validation engineers have tools at their disposal that eliminate the need to rewrite test scripts and allow the scripts required for regression testing to be pulled from a reusable test script library. This is the agile, lean way to conduct validation testing. In 2018, you should consider purchasing an enterprise validation management system to manage a reusable test script library. This process will save you both time and money in the development and online execution of validation test scripts.

Your test scripts can be either fully automated or semi-automated. Fully automated test scripts run with no human intervention. Semi-automated validation test scripts are used by the validation engineer to conduct validation testing online. It should be noted that ValidationMaster™ supports both. Check out a demo to learn more about how this works.

Practice 9 – Develop a Strategy For Data Integrity

The buzzword for 2018 is data integrity. Europe has just announced the GDPR, which governs data protection and privacy. The new regulation holds you accountable for the integrity of the data in your systems and the privacy of the information housed therein. In 2018 it is imperative that you have a data integrity strategy and that you review the regulations surrounding data integrity to ensure that you comply. In the old days of validation, it was left up to each validation engineer to decide when to apply system updates and how often to revalidate their systems. Data integrity and cyber security, as well as cloud validation, demand strategies that include continuous testing. I discussed this in a previous blog post, but it is very important that you test your systems often to achieve a high level of integrity not only in the application but in the data.

Practice 10 – Commit to Achieve Level 5 Validation Process Maturity

Finally, your validation processes must mature in 2018. It is no longer acceptable or feasible, in most organizations attempting to improve efficiency and compliance, to simply continue with antiquated, inefficient paper-based systems. Therefore, in 2018 it will be important for you to commit to Level 5 validation process maturity, where your processes are automated, you have greater control over your validated systems, and you sustain quality across the validation lifecycle. Check out my other post on achieving Level 5 process maturity.

I wish you all the best in 2018 as you endeavor to establish and maintain the validated state for computer systems. From the development of Level 5 validation processes through automation it’s important to understand the changes that new technology brings and how well you and your organization can adapt to those changes. Your success depends on it! Good luck and keep in touch!