Women in Validation: In Case You Missed It

One of my annual "must attend" events is the Institute of Validation Technology (IVT) Computer Systems Validation Annual Validation Week in October. This is an excellent event to get the latest and greatest thinking on validation topics and industry guidance. An official from the U.S. FDA attended the event and provided key information on the latest data integrity guidance and the reorganization at the FDA.

The highlight of the event for me was a newly added session called the "Women in Validation Empowerment Summit". The panel featured yours truly along with other talented women in validation. The goal of the panel was to discuss some of the unique challenges women face in STEM professions and how to overcome them. The panel was moderated by Roberta Goode of Goode Compliance International. She led panelists through a lively discussion of topics ranging from career building and job challenges to workplace difficulties and much more. You can see a synopsis of what we discussed here.

I am a fierce advocate of STEM. There are women who are "hidden figures" in STEM professions such as computer systems validation who don't get the recognition they so richly deserve or the attention of most corporate executives. This session was about EMPOWERMENT.

There are many “hidden figures” in the field of validation but no real community surrounding them.  I left the session most impressed with my accomplished co-panelists.  Each and every one of them had tremendous insights on their career paths and stories to tell as to how they moved up the ladder.  The talk was real.  It was down to earth.  It was practical.  I didn’t want it to end!

A couple of troubling statistics highlighted during the session related to the number of undergraduate degrees earned by women in STEM. According to the World Economic Forum, women earn "only 35 percent of the undergraduate degrees in STEM". The sad truth is that this number has remained unchanged for over a decade. Look at the statistics for women in engineering from the National Science Foundation.

[Figure: Women in Engineering — National Science Foundation statistics]

I graduated in Civil and Environmental Engineering from the University of Wisconsin – Madison in 1982. I was one of only 3 women in my class at the time. You can see from the graph above that at the doctorate level, women drop off precipitously.

At the session, we discussed gender bias, stereotypes and other factors that affect women in our profession.  It was an excellent forum and one I hope they continue.  In case you missed it, the session was enlightening and fun for all!   See you in October!

SharePoint Validation: Quality and Compliance Portals

I am often asked the question… "can SharePoint be validated?" The short answer is YES, but it often requires customization to deliver compliance objectives. The longer response requires further examination of why people ask the question and of the nature of SharePoint as a content management system. With Office 365® reaching over 100 million active users per month and more companies moving toward the cloud, we are witnessing the maturation of SharePoint for both regulated and non-regulated content management.

SharePoint has undergone many changes over the past decade that have increased its adoption within the life sciences industry.  New features of SharePoint from Microsoft and its robust technology partner community include, but are not limited to:

  • Synchronization with OneDrive For Business®
  • New SharePoint Communication Sites With Pre-Built Layouts
  • Integration of SharePoint and Microsoft Teams
  • New Integration with DocuSign® For Electronic Signatures
  • Enhanced Integration For Graphical Workflows From Nintex®
  • SharePoint-aware PowerApps and Flow
  • Updated Page Layouts and Web Part Enhancements
  • Improved SharePoint Administration
  • Enhanced Document Version Control

Within the life sciences community, resistance to SharePoint has focused on security and the lack of "out-of-the-box" features for life sciences validation. What are some of the key capabilities that life sciences companies require from a regulated SharePoint enterprise content management system? A partial list of document and records management features includes:

  • Intelligent Document Creation Tools
  • Automated Document Change Control
  • Configurable Document Types With Pre-Assigned Document Workflows (based on the type of document, workflows are automatically launched)
  • 21 CFR PART 11 support (electronic or digital signatures, audit trails, et al)
  • Ability to print a Signature Page with Each Signed Document
  • Ability to Establish Pre-defined Automated Document Lifecycle Workflows
  • Support for and designation of Controlled and Uncontrolled Content
  • Controlled Document Management Features, Including Configurable Watermarks and Overlays
  • Markup tools for document review
  • Ability to classify documents for records management capabilities
  • Ability to assign/tag documents with metadata
  • Content Rendering (when documents are checked in, they are automatically rendered in PDF format for document review.)
  • Custom Document Numbering (the ability to automatically assign alphanumeric document numbers to content)
  • Enforcement of the use of Standard Document Templates Codified Within SOPs
  • Version tracking with major and minor version control, version history
  • Ability to support regulatory submissions and publishing (this is a big one)
  • System MUST BE VALIDATABLE

As you can see from the partial list above, there are many features required by regulated companies that are not standard in SharePoint out of the box. However, SharePoint offers rich capabilities that significantly reduce the effort required to deliver a solution with the features listed above.

As a former Documentum and Qumas executive, I know firsthand the challenges of developing such a system from scratch, as my former employers did. Leveraging the power of SharePoint changes the equation: OnShore Technology Group's ValidationMaster™ Quality and Risk Management portal, for example, is SharePoint-based and includes all of the features listed above. The level of effort required to deliver the solution was substantially lower due to the SharePoint application framework and development tools.

The ability to manage regulatory submissions and publishing is one area where SharePoint may be more challenged. In the Documentum world, there was such a thing as a "Virtual Document": a document that contained components, or child documents. A Virtual Document might represent a section of a regulatory dossier, where the header represented the section of the dossier and several child documents were the individual documents in that section. Documentum was an object-oriented system and thus allowed a single document to be comprised of multiple ACTUAL documents with early and late binding. Since each component of a Virtual Document is its own document that can be checked in/checked out and routed independently of the other components, Virtual Documents are ideal for regulatory submission management, which has very specific guidelines for publishing and pagination. I have not yet seen a parallel for this in SharePoint.
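
To make the Virtual Document concept concrete, here is a minimal sketch (illustrative only, not Documentum's actual object model) of a compound document whose components are independently versioned and resolved with either early binding (pinned to a version) or late binding (resolved to the latest version at assembly time):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Document:
    doc_id: str
    title: str
    versions: list[str] = field(default_factory=lambda: ["1.0"])

    @property
    def latest(self) -> str:
        return self.versions[-1]

@dataclass
class VirtualDocumentNode:
    document: Document
    pinned_version: Optional[str] = None  # early binding if set
    children: list["VirtualDocumentNode"] = field(default_factory=list)

    def resolve(self, indent: int = 0) -> None:
        # Late binding: fall back to the latest version when no pin exists.
        version = self.pinned_version or self.document.latest
        print("  " * indent + f"{self.document.title} (v{version})")
        for child in self.children:
            child.resolve(indent + 1)

# Example: one dossier section assembled from independently routed documents.
section = VirtualDocumentNode(Document("D-100", "Module 3: Quality"))
section.children.append(
    VirtualDocumentNode(Document("D-101", "3.2.S Drug Substance", ["1.0", "2.0"]))
)
section.children.append(
    VirtualDocumentNode(Document("D-102", "3.2.P Drug Product"), pinned_version="1.0")
)
section.resolve()
```

Because each child is its own document, it can be revised and routed on its own schedule while the parent assembles the section consistently for publishing.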

Document management systems used to cost millions of dollars to acquire, implement, and deploy. These systems are now somewhat "commoditized," and price points are significantly lower. Many life sciences companies are using SharePoint for non-regulated documentation. However, an increasing number of them are abandoning higher-cost rivals and moving to SharePoint as the foundation for controlled and uncontrolled documentation. SharePoint can run in a hosted Office 365 environment or be established on-premise. Check out my cloud validation posts for more information on validating SharePoint and other applications in a cloud environment. Either way, the system can and should be validated if used for regulated content management.

It is recommended that you establish a clear set of user requirements for SharePoint. SharePoint has powerful capabilities well beyond those articulated in this blog post. There are many SharePoint partners, such as Nintex® and DocuSign®, that deliver effective, ready-to-use integrations. Use these partner solutions to help minimize the validation effort.

If you have not already done so, SharePoint is worth a second look for regulated content, depending on your application. One thing is for sure: the day of the multi-million-dollar content management solution is over for most companies.

Can Microsoft Azure Be Validated?

The textbook definition of computer systems validation (paraphrased) is "documented evidence that a system performs according to its intended use." The FDA uses the definition "…Validation is a process of demonstrating, through documented evidence, that <software applications> will consistently produce results that meet predetermined specifications and quality attributes."

Using the FDA's definition, one should understand that validation is a PROCESS, not a one-time event. One should also understand that the Agency expects to see "documented evidence" that a system will consistently produce results that meet predetermined specifications and quality attributes. And finally, from the FDA's definition we learn that to properly validate a system, one MUST have "predetermined specifications and quality attributes" defined.

Let's start at the beginning with the question posed by this blog post: "Can Microsoft Azure® Be Validated?" The short answer is YES. In the remainder of this post, I will share six key reasons why this bold statement can be made. But first, let's establish some foundation as to what Microsoft Azure® ("Azure®") is and why this question is relevant.

At its core, Azure is a cloud computing platform and infrastructure created by Microsoft for building, deploying, and managing applications and services through a global network of Microsoft-managed datacenters (Wikipedia). In today's global economy, life sciences companies, like many of their industry counterparts, are seeking to do more with less. These companies are embracing enterprise technologies and leveraging innovation to drive global business and regulatory processes. If you had asked me 10 years ago how many of the companies I work with were deploying enterprise technologies in the cloud, my answer would have been very few, if any. If you had further asked me how many cloud validation exercises in life sciences I was conducting, I would have said NONE.

Suffice it to say, cloud computing has been a game-changer for life sciences companies. It should be noted that due to regulatory considerations, the life sciences industry has been reluctant to change validation practices. But the cloud is changing the process of validation as we know it. Sure, the basic principles of risk-based validation endure. You STILL have to provide documented evidence that your software applications will consistently produce results that meet predetermined specifications and quality attributes. HOW you do this is what is changing. Validating cloud applications is a bit different from validating on-premise applications, mostly due to the changing roles and responsibilities of those responsible for the system and of the cloud providers themselves. I will say at the outset that not all cloud providers are created equal, so "caveat emptor" (let the buyer beware).

Let’s talk about the 6 key reasons why Azure can be validated and how this is done.

SIX REASONS WHY AND HOW AZURE CAN BE VALIDATED.

REASON 1 – AZURE RISK ASSESSMENT

Risk management is not a new concept. It has been around since the 1940s, when it was applied to the military and aerospace industries before being broadened to other fields. The first crucial step in any computer systems validation project is to conduct a validation risk assessment. In addition to determining the impact on critical quality attributes, a risk assessment determines the level of validation due diligence the system requires. How much validation testing you should do for any validation exercise should be based on a risk assessment of the system. When deploying systems such as Microsoft Dynamics AX® or BatchMaster/GP® on the Azure platform, the risk assessment should include an assessment of the Azure platform itself. When outsourcing your platform to a vendor such as Microsoft, their risks become your risks; thus, you must conduct due diligence to ensure that these risks are acceptable to your organization.

The ISPE GAMP 5® ("GAMP 5®") risk assessment model defines the process for hazard identification, estimation of risk, evaluation of risk, control and monitoring of risks, and documentation of risk. GAMP 5® describes a five-step process:

  1. Perform Initial Risk Assessment and Determine System Impact
  2. Identify Functions With Impact On Patient Safety, Product Quality and Data Integrity
  3. Perform Functional Risk Assessment and Identify Controls
  4. Implement and Verify Appropriate Controls
  5. Review Risks and Monitor Controls

GAMP 5® does not explicitly address cloud infrastructure. One could extrapolate how the standard may be applied to cloud infrastructure, but the lack of a "clean" translation is one of its limitations. Azure could be treated as a Commercial-off-the-Shelf (COTS) platform. You will note that the risk process identified by GAMP 5® is more application-focused than platform-focused. To conduct a proper risk assessment of Azure, a new framework must be established for the platform. In July 2011, ISACA, previously known as the Information Systems Audit and Control Association, released a document called "IT Control Objectives for Cloud Computing: Controls and Assurance in the Cloud," which highlighted specific control objectives and risks inherent in cloud computing. From a validation perspective, you need to address availability/reliability risks, performance/capacity risks, and security and cybersecurity risks. Using the publicly available independent service organization controls (SOC 1/2/3) reports, you should be able to effectively document and assess these risks. Remember, if it's not documented, it didn't happen!
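
As an illustration of how such a risk assessment can drive the level of testing, here is a minimal sketch assuming a simple severity × probability × detectability scoring model; the scales, names, and thresholds are my own illustrative assumptions, not prescribed by GAMP 5®:

```python
from dataclasses import dataclass

@dataclass
class FunctionRisk:
    function: str
    severity: int       # 1 (low impact) .. 3 (patient safety / data integrity)
    probability: int    # 1 (unlikely)   .. 3 (likely)
    detectability: int  # 1 (easily detected) .. 3 (hard to detect)

    @property
    def priority(self) -> int:
        # Higher scores drive more validation due diligence.
        return self.severity * self.probability * self.detectability

def testing_rigor(risk: FunctionRisk) -> str:
    if risk.priority >= 18:
        return "full challenge testing (positive and negative cases)"
    if risk.priority >= 8:
        return "standard scripted testing"
    return "verification via vendor documentation / SOC reports"

assessment = [
    FunctionRisk("Electronic signature application", severity=3, probability=2, detectability=3),
    FunctionRisk("Report formatting", severity=1, probability=2, detectability=1),
]
for r in assessment:
    print(f"{r.function}: priority={r.priority} -> {testing_rigor(r)}")
```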

REASON 2 – CHANGE CONTROL (“THE ELEPHANT IN THE ROOM”)

One of the biggest challenges with companies' adoption of cloud computing is CHANGE CONTROL. As you may well be aware, cloud environments are under constant change. In the validation world, change control is an essential process to help ensure that validated systems remain in a validated state. So how do you maintain the validated state when a platform such as Azure is in a constant state of change? First, all validated systems are subject to change control. Each time a system is changed, the change, the reason for the change, and who made the change must be documented. When the system is on-premise, changes are generally documented by the IT department, which owns the infrastructure in that scenario. In a cloud environment, this responsibility belongs to the cloud provider. SOC reports highlight and define change control processes and procedures for the cloud environment.

We recommend that our clients obtain a copy of these reports under non-disclosure to confirm the cloud provider's compliance. Once in the cloud, we recommend developing a testing and regression testing strategy that exercises cloud-hosted systems on a more frequent basis. In the '80s and '90s, validation engineers were very reluctant to change validated systems due to the complexity and level of effort involved in manual testing. Today, companies are leveraging tools like OnShore's ValidationMaster™ to facilitate automated testing, allowing validation test engineers to schedule runs of automated test scripts in a cloud environment. ValidationMaster™ takes the complexity out of regression testing and frees up resources for other work. Change control is essential in both on-premise and cloud environments. To support the cloud, you will need to establish clear policies and procedures that cover both on-premise and cloud computing systems. With respect to change control, it is very important that your policies reflect procedures in a cloud environment, as sketched below.
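
As a minimal sketch of the change-control attributes called out above (the change, the reason for the change, and who made it) captured in an append-only log; the field names are illustrative:

```python
import datetime

change_log: list[dict] = []

def record_change(system: str, description: str, reason: str, author: str) -> None:
    # Append-only: entries are added, never edited or deleted.
    change_log.append({
        "system": system,
        "description": description,
        "reason": reason,
        "author": author,
        "recorded_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })

record_change(
    system="ERP (Azure-hosted)",
    description="Applied vendor platform update",
    reason="Cloud provider maintenance window",
    author="jdoe",
)
print(change_log[-1])
```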

Microsoft has established clear controls for Azure to manage change in its environment. These procedures have been documented and tested per the SOC 1/SOC 2 control reports issued by independent auditors. When validating systems in an Azure environment, you will have to obtain copies of these reports and include them with your validation package. The supplier audit becomes crucial in a hosted environment. The good news is that Microsoft is a reputable supplier and can be relied on as a stable vendor with respect to its products.

Microsoft Azure has undergone many independent audits, including a 2013 ISO/IEC 27001:2005 audit that confirms Microsoft has procedures in place to govern change management processes. The audit report highlights the methodology and confirms that changes to the Azure environment are appropriately tested and approved.

REASON 3 – INDEPENDENT AUDIT CONFIRMATION OF TECHNICAL/PROCEDURAL CONTROLS AND SECURITY

In cloud environments, the role of the cloud provider is paramount, and not all cloud providers are created equal. Microsoft Azure has passed many audits, and these reports, as previously noted, are available upon request. Security and cybersecurity issues are very important in the Azure environment. Microsoft has implemented controls and procedures to ensure compliance. The company provides detailed information on security on its website (https://www.microsoft.com/en-us/TrustCenter/Security/default.aspx).

REASON 4 – INFRASTRUCTURE QUALIFICATION (IQ)

IQ testing changes in the Azure environment, where Microsoft bears some of the responsibility for the installation of hardware and software. As previously mentioned, it is important to qualify the environment once it is set up to confirm that what is supposed to be installed is actually present in your environment. We qualify the virtual software layer as well as Microsoft Lifecycle Services.
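
The core IQ check is straightforward to express: compare what is actually present in the environment against a pre-approved installation manifest. Here is a minimal sketch; the component names and versions are illustrative, not a real manifest:

```python
# Pre-approved installation manifest (from the IQ protocol).
expected = {
    "Microsoft Dynamics AX": "2012 R3",
    "SQL Server": "2016 SP2",
    ".NET Framework": "4.7.2",
}

# What inspection of the environment actually found.
installed = {
    "Microsoft Dynamics AX": "2012 R3",
    "SQL Server": "2016 SP1",  # deviation to be logged and resolved
}

for component, version in expected.items():
    actual = installed.get(component)
    status = "PASS" if actual == version else "FAIL"
    print(f"{status}: {component} expected {version}, found {actual}")
```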

REASON 5 – OPERATIONAL/PERFORMANCE QUALIFICATION (OQ/PQ)

The process of operational qualification of an application in a cloud environment is very similar to the process in an on-premise environment. Microsoft Dynamics AX can now easily be deployed in an Azure environment, and validation testing of this type of implementation proceeds with some variation. The installation of Microsoft Dynamics AX uses what is called Lifecycle Services, which allows you to set up and install the system rapidly using automation. Lifecycle Services may be subject to validation itself, since it is used to establish a validated systems environment. You will need to identify the failure of Lifecycle Services as a risk and rate that risk as part of the risk assessment.

Lifecycle Services is promoted as bringing predictability and repeatability to the process of software installation.  These are principles upon which validation has stood for many years.  However, it is important to go beyond the marketing to verify that this software layer is performing according to its intended use.  We typically draft a qualification script to qualify Lifecycle Services prior to using it in a validated systems environment.  We conduct OQ/PQ testing in the same manner as in a non-hosted environment.

REASON 6 – ABILITY TO MAINTAIN THE VALIDATED STATE

With Microsoft Azure, the need to maintain the validated state does not go away. It is entirely possible to keep mission-critical applications hosted on the Azure platform in a validated state. To do so, we recommend establishing policies and procedures and adopting a test-early/test-often approach to ensure that changes made to the infrastructure do not negatively impact the validated production environment. Maintaining the validated state on Azure requires a combination of technology, processes, and procedures to ensure compliance.

As a validation engineer, I have validated many systems on the Azure platform. I have concluded that the Azure platform can be validated and used within the life sciences environment. How much validation due diligence you conduct should be based on risk. Don't overdo it! Use lean methodologies to drive your validation processes.

Upon close inspection, the platform has built-in quality and security controls that provide a level of assurance of its suitability for deployment in regulated environments. I like that the system has been independently inspected by numerous third-party organizations to help ensure objectivity. This platform is ready for prime time, and many of my clients are leveraging its benefits and freeing themselves from the burden of managing their own infrastructure.

Is Your Validation Team Ready For GDPR?

GDPR stands for the General Data Protection Regulation.  It governs all personal data collected by companies for customers, potential customers, employees, and others.  Regulators are keen to understand how this information is managed and maintained over time.

In April 2016, the FDA issued new draft guidance on data integrity and compliance with cGMP. The guidance was issued in a question-and-answer format and focused on frequently occurring data integrity lapses. When the FDA finalizes the guidance, it will represent the Agency's current thinking on data integrity and cGMP compliance for the industry.

Why did the FDA draft such guidance? It should be noted that the FDA has increasingly observed cGMP violations involving data integrity during the inspection process. Over 21 warning letters have involved data integrity lapses in drug manufacturing since January 2015. The FDA strongly believes that ensuring data integrity is an important component of the industry’s responsibility to ensure the safety, efficacy, and quality of drugs to protect public health and safety overall.

In recent years, many articles have referred to data integrity using the acronym ALCOA, which means that data must be attributable, legible, contemporaneous, original (or a true copy), and accurate. It should be noted that the requirements for record retention and review do not differ depending on the data format: paper-based and electronic record-keeping systems are subject to the very same requirements. For example, section 211.68 requires that backup data be exact, complete, and secure from alteration, inadvertent erasure, or loss. Section 211.180 requires true copies or other accurate reproductions of the original records.
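
As a rough illustration of the "exact and complete" expectation, a backup's exactness can be confirmed by comparing content hashes of the original and the backup. This is a minimal sketch; the paths and filenames are hypothetical:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    # Hash the full file contents so any alteration is detectable.
    return hashlib.sha256(path.read_bytes()).hexdigest()

original = Path("records/batch_042.csv")
backup = Path("backup/records/batch_042.csv")

if original.exists() and backup.exists():
    exact = sha256_of(original) == sha256_of(backup)
    print("Backup is an exact copy:", exact)
else:
    print("FAIL: original or backup file is missing (backup is not complete)")
```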

Most life sciences companies validate business systems that have GMP impact. It is best practice to conduct installation, operational, and performance qualification testing to demonstrate that a computer system is fit for its intended use and document any incidents that may affect software quality or the reliability of the records. Data integrity and validation go hand in hand but with the latest guidance there’s really nothing new under the sun from a validation perspective. The same level of due diligence and rigor must be applied to confirm that systems are suitable for their intended use and that the data integrity within these systems is sound.

When you are examining data integrity issues it is critically important to look at all aspects of the system including system security and how it is established to ensure that records entered into the system have the highest level of integrity. The FDA recommends that you restrict the ability to alter files and settings within the system to those administrator users that require such access. A recent warning letter cited the failure to prevent unauthorized access or changes to data.

For systems designed in accordance with 21 CFR Part 11, it is critical to understand that audit trails should be independent. I know this doesn't come as a surprise to many, but I have seen systems where the audit trail could be turned on or off. Let me be clear: all systems designed in accordance with 21 CFR Part 11 must have an independent, computer-generated audit trail that cannot be turned off by ordinary means. This means that someone cannot go to a common function within the system and disable the audit trail. The FDA recommends that audit trails capturing changes to critical data be reviewed with each record and before final approval of the record, and that audit trails be subject to regular review. Recent warning letters have cited, for example, a lack of audit trails for lab instruments and the fact that audit trails could be turned off. If an audit trail can be turned off, fraudulent activity may occur. It is important to confirm within your systems that the audit trails are capturing information about each record and that these audit trails are independent, to ensure data integrity.
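
To make the independence point concrete, here is a minimal sketch, illustrative only and not any particular product's design, of a system-generated, hash-chained audit trail in which entries cannot be silently altered or removed without breaking the chain:

```python
import datetime
import hashlib
import json

class AuditTrail:
    def __init__(self) -> None:
        self._entries: list[dict] = []

    def record(self, user: str, record_id: str, action: str) -> None:
        # Each entry is chained to the digest of the previous entry.
        previous = self._entries[-1]["digest"] if self._entries else "GENESIS"
        entry = {
            "user": user,
            "record_id": record_id,
            "action": action,
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "previous": previous,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["digest"] = hashlib.sha256(payload).hexdigest()
        self._entries.append(entry)

    def verify(self) -> bool:
        # Recompute every digest; any edit or deletion breaks the chain.
        previous = "GENESIS"
        for entry in self._entries:
            body = {k: v for k, v in entry.items() if k != "digest"}
            if body["previous"] != previous:
                return False
            payload = json.dumps(body, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != entry["digest"]:
                return False
            previous = entry["digest"]
        return True

trail = AuditTrail()
trail.record("jdoe", "BATCH-042", "result entered")
trail.record("qa_reviewer", "BATCH-042", "record approved")
print("Audit trail intact:", trail.verify())
```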

Data integrity is not a new concept, but it is one that is receiving a lot of attention. Compliance with data integrity guidelines is largely common sense for those in the compliance business. Look at data integrity not as the latest buzzword but as a reminder of how important it is to ensure the integrity and authenticity of data established and maintained within a validated systems environment. This will go a long way toward ensuring sustained compliance.

Cybersecurity Qualification (CyQ)

One topic that has been top of mind for many validation engineers, chief information officers, and executive managers is cybersecurity. You may be asking yourself why we are talking about cybersecurity and validation. Recent headlines will tell you why this topic should be of great interest to every validation engineer. As validation engineers, we spend a lot of time stressing over risk assessments, system security, and qualification of system environments. Our job is to validate the system to ensure its readiness for production use. Let me ask a question: how can you ensure that a system is ready for production use if it is not cyber-ready? This is why we are talking about cybersecurity in the context of validated systems.

When it comes to computer systems in today's highly networked environment, cybersecurity is the elephant in the room. All networked systems may be vulnerable to cybersecurity threats. Businesses large and small may be subject to cyber-attacks, and the exploitation of these vulnerabilities may present a risk to public health and safety if not properly addressed. Although we know these truths all too well, many validation engineers are not even discussing cybersecurity as part of an overall validation strategy.

No company can prevent all cyber-attacks, but it is critically important that companies begin to think seriously about how to protect themselves from persistent cyber criminals determined to inflict as much damage as possible on computer systems, in both highly regulated and nonregulated environments. One thing we know about cyber criminals is that they are equal opportunity offenders; everyone has a degree of vulnerability. To beat them at their game, you have to be one step ahead of them.

In the validation world, we often refer to validation testing as IQ/OQ/PQ testing. I would like to submit for your review and consideration another type of enhanced validation testing that we should be doing: cybersecurity qualification, or as I like to call it, "CyQ." What is a CyQ? It is confirmation of a system's protection controls and readiness to prevent a cyber-attack. In one of my recent blog posts, I declared that "computer systems validation as we know it is dead!" Of course, I meant that tongue in cheek. What I was referring to is that it is time to rethink our validation strategy, because we need to address the vulnerabilities of today's cloud-based and on-premise systems with respect to the cybersecurity risk they carry. We can no longer look at systems the way we did in the 1980s. Many life sciences companies are deploying cloud-based technologies, mobile systems, the Internet of Things (IoT), and many other advanced technologies in pursuit of innovation, and these may drive greater risk profiles in validated systems. Incorporating CyQ into your overall validation strategy is one way to address these challenges.

The National Institute of Standards and Technology (NIST) introduced a cybersecurity framework. The five elements of the framework are shown in the figure below.

[Figure: The NIST Cybersecurity Framework — Identify, Protect, Detect, Respond, Recover]

As a validation engineer, I have studied this framework for its applicability to validated systems. Each element of the framework addresses a dimension of your cybersecurity profile. To conduct a CyQ assessment, you need to examine each element of the cybersecurity framework to determine your readiness in each respective category. I have developed a CyQ Excel spreadsheet that examines each element of the framework and allows you to summarize your readiness to prevent a cyber-attack. (If you would like a copy of the CyQ Excel spreadsheet, please contact me using the contact form and I will happily send it to you.)
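
As an illustration of the kind of summary such a spreadsheet produces, here is a minimal sketch in Python. The five functions come from the NIST framework, while the scores, notes, and the 1–5 maturity scale are my own illustrative assumptions:

```python
# Illustrative readiness scores per NIST CSF function (scale of 1-5).
readiness = {
    "Identify": 4,  # asset inventory and risk register in place
    "Protect":  3,  # access controls defined, training overdue
    "Detect":   2,  # no continuous monitoring yet
    "Respond":  3,  # incident response SOP approved
    "Recover":  4,  # tested backup/restore procedures
}

SCALE_MAX = 5
overall = sum(readiness.values()) / (len(readiness) * SCALE_MAX)
print(f"Overall cyber-readiness: {overall:.0%}")
for function, score in readiness.items():
    flag = "  <-- remediation needed" if score < 3 else ""
    print(f"  {function}: {score}/{SCALE_MAX}{flag}")
```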


Remember, for validated systems, if it is not documented, it did not happen! Your cybersecurity qualification analysis must be documented. You must be ready to explain to regulators, when it comes to data integrity and systems integrity, what controls you have in place to protect both the data and the systems under your management.

Another consideration in the management of cyber threats is EDUCATION. The biggest cyber breach may come from the person in the cubicle next to you! You must deliver (and document) cyber training, and do it frequently to keep pace.

For your next validation project, address the elephant in the room explicitly.   Cyber threats are not diminishing, they are increasing.  It is important to understand their origin and seriously consider how they can and will impact validated systems.  We can no longer think that IQ/OQ/PQ is sufficient.  While it has served its purpose in times past, we need a more effective strategy to address today’s clear and present danger to validated systems – the next cyber-attack.  It could be YOUR SYSTEM.  Deal with it!

Validating Microsoft Dynamics 365: What You Should Know

Microsoft Dynamics 365 and Azure are gaining popularity within the life sciences industry. I am often asked how to validate such a system given its complexity and cloud-based nature. The purpose of this blog post is to answer that question. The outline of this post is as follows:

  • Understanding the Changing State of Independent Validation and Verification
  • Strategies for Installation Qualification of Microsoft Azure
  • Practical Strategies for Operational and Performance Qualification Testing
  • Continuous Testing in A Cloud Environment
  • Maintaining the Validated State in Azure

To begin our discussion, it is helpful to consider what keeps regulators up at night. They are concerned primarily about four key aspects of validated computer systems:

  1. Vulnerability – How Vulnerable Are Our Cloud Computing Environments?
  2. Data Integrity – What Is Your Strategy to Maintain Data Integrity Within Your Validated Computer Systems?
  3. System Security and Cybersecurity – How Do You Keep Sensitive Information Secure, and How Do You Protect a Validated Computer System Against Cyber Threats?
  4. Quality Assurance – How Do You Minimize Risk to Patient Safety and Product Quality Impacted by the Validated System?

One of the first tasks validation engineers must be concerned with is supplier auditing. When using commercial off-the-shelf software such as Microsoft Dynamics 365 and Azure, a supplier audit is a mandatory part of the validation process. Twenty years ago, when we prepared validation documentation for computer systems, validation engineers often conducted a paper audit or an actual physical audit of a software provider. Supplier audits conducted on-site provided a rigorous overview of what software companies were doing and the quality of their software development lifecycle process. It was possible to examine a supplier's processes and decide whether the software vendor was a quality supplier.

Most software vendors today, including Microsoft, do not allow on-site vendor audits. Some validation engineers have told me they view this as a problem. However, the Microsoft Trust Center is a direct response to the industry's need for transparency. Personally, I think the Trust Center is the best thing Microsoft has done for the life sciences industry. Not only does it highlight all of the Service Organization Control reports (SOC 1/SOC 2/SOC 3 and ISO/IEC 27001:2005), but it summarizes Microsoft's compliance with the Cloud Security Alliance controls as well as the NIST Cybersecurity Framework. I would strongly recommend that you visit the Microsoft Trust Center at https://www.microsoft.com/en-us/trustcenter. The latest information posted to the site is a section on the General Data Protection Regulation (GDPR) and how the platform can help keep data safe and secure. I find myself, as a validation engineer, visiting this site often. You will see sections specifically for Microsoft 365 and Azure.

From a supplier auditing perspective, I use the information found on the Microsoft Trust Center to facilitate a "desk audit" of the vendor. Many of the questions that I would ask during an on-site audit are answered on this website. As part of my new validation strategy, I include the Service Organization Control reports as part of my audit due diligence. The Trust Center includes in-depth information about security, privacy, and compliance offerings, policies, features, and practices across all of Microsoft's cloud products.

If I were conducting an on-site audit of Microsoft, I would want to know how they establish trust in the cloud, and many of the questions I would ask in person are answered on this website. It should be noted that Service Organization Control reports are created not by Microsoft but by trusted third-party organizations certified to deliver them. These reports include an assessment of how well Microsoft is complying with the stated controls for cloud management and security. This is extremely valuable information.

From a validation perspective, I attach these reports to my validation package as part of the supplier audit due diligence. There may be instances where you conduct your own due diligence beyond the reports, but they provide an excellent start to understanding what Microsoft is doing.

Microsoft has adopted the Cloud Security Alliance (CSA) Cloud Controls Matrix to establish the controls for the Microsoft Azure platform. These controls include:

  • Security Policy and Procedures
  • Physical and Environmental Security
  • Logical Security
  • System Monitoring and Maintenance
  • Data Backup, Recovery and Retention
  • Confidentiality
  • Software Development/Change Management
  • Incident Management
  • Service Level Agreements
  • Risk Assessments
  • Documentation/Asset Management
  • Training Management
  • Disaster Recovery
  • Vendor Management

The Cloud Controls Matrix includes 136 different controls that cloud vendors such as Microsoft must comply with. Microsoft has mapped out on its Trust Center site specifically how it addresses each of the 136 controls on the Microsoft Azure/Dynamics 365 platform. This is excellent knowledge and due diligence for validation engineers and represents a good starting point for documenting the quality of the supplier.

Is Microsoft Azure/Dynamics 365 secure enough for life sciences? In my personal opinion, yes, it is. Companies still must conduct due diligence to ensure that Azure and Dynamics 365 meet their business continuity requirements and business process requirements for enterprise resource planning. One thing is certain: the cloud changes things. You must revise your validation strategy to accommodate the cloud. The change in how supplier audits are conducted is just one example.

The next challenge in validating Microsoft Dynamics 365 and Azure is conducting validation testing in the cloud environment. It should be understood that the principles of validation endure whether or not you are in a cloud environment. You still must conduct a rigorous amount of testing, including both positive and negative testing, to confirm that Microsoft Azure/Dynamics 365 meets its intended use. However, there are changes in the way we conduct installation qualification in a cloud environment. Some believe that installation qualification is no longer a valid testing process since cloud environments are provisioned. This is not correct. You still must ensure that cloud environments are provisioned in a consistent, repeatable manner that supports quality.

It is helpful to understand that Microsoft Dynamics 365 is provisioned using Microsoft Lifecycle Services, an application designed for rapid implementation and deployment. However, it should be clearly understood that Lifecycle Services is itself an application and thus a potential point of failure in the process. The use of Lifecycle Services must be documented, and the provisioning of the environment must be confirmed through the installation qualification process.

From an operational qualification perspective, validation testing remains pretty much the same. Test cases are traced to their respective user requirements and executed with the same rigor as in previous validation exercises.

Performance qualification is also conducted in the same manner as before. Since the environment is in the cloud and outside of your direct control, it is very important that network qualification as well as performance qualification be conducted to ensure that no performance anomalies occur in your environment. In a cloud environment, you may have performance issues related to system resources, networks, storage arrays, and many other factors. Performance tools may be used to confirm that the system is performing within the optimal range established by the validation team. Performance qualification can be conducted before the system goes live by placing a live load on the system, or it may occur after validation; this is at the discretion of the validation engineer.
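
As a simple illustration of checking measured performance against the validation team's thresholds (the transactions, timings, and limits here are hypothetical):

```python
# Measured response times from a PQ run versus pre-approved limits (ms).
measured_ms = {"login": 850, "save_batch_record": 2300, "run_report": 4100}
threshold_ms = {"login": 2000, "save_batch_record": 3000, "run_report": 5000}

for transaction, elapsed in measured_ms.items():
    limit = threshold_ms[transaction]
    status = "PASS" if elapsed <= limit else "FAIL"
    print(f"{status}: {transaction} took {elapsed} ms (limit {limit} ms)")
```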

Maintaining the validated state within a cloud environment requires embracing the principle of continuous testing. It has often been said that the cloud is perpetually changing. This is one of the reasons why many people believe that you cannot validate a system in the cloud. However, you can validate cloud-based systems such as Microsoft Dynamics 365 and Azure: continuous testing is the key. What do I mean by continuous testing? Does it mean that we perpetually test the system forever and ever, every single day? Of course not! Continuous testing is a strategy, applied to all cloud-based validated systems, whereby regression testing occurs at predetermined intervals. Automated systems such as ValidationMaster™ can be the key to facilitating this strategy.

Within ValidationMaster™ you can establish a reusable test script library. This is important because in manual, paper-based validation processes, the most laborious part of validation is the development and execution of test scripts; this is why many people cringe at the very notion of continuous testing. Automated systems make it much easier. In ValidationMaster™, each test script is automatically traced to a user requirement. Thus, during regression testing, I can select a set of test scripts based on my impact analysis (sketched below) and execute them during off-peak hours in my test environment to see if there has been any impact to my validated system. These test scripts can be run fully automated using Java-based test scripts, or they can be run using an online validation testing process. Revalidation of the system can happen in a matter of hours versus days or weeks using this process.
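
Here is a sketch of the impact-analysis step; the identifiers and traceability data are illustrative, not ValidationMaster™'s internal model. Given requirement-to-test traceability, you run only the scripts touched by the requirements a platform change impacts:

```python
# Requirement -> test script traceability (illustrative).
trace_matrix = {
    "REQ-001 Electronic signatures": ["TS-010", "TS-011"],
    "REQ-002 Batch record workflow": ["TS-020"],
    "REQ-003 Report generation":     ["TS-030", "TS-031"],
}

# Requirements flagged by impact analysis after a platform change.
impacted_requirements = {"REQ-001 Electronic signatures", "REQ-003 Report generation"}

regression_suite = sorted(
    script
    for requirement, scripts in trace_matrix.items()
    if requirement in impacted_requirements
    for script in scripts
)
print("Regression run:", regression_suite)
```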

Through continuous testing, you can review the changes that Microsoft has made to both Azure and Dynamics 365 online. Yes, this information is posted online for your review; it answers the question of how you know what changes are made and when. You can determine how often you test a validated system. There is no regulation that codifies how often this should occur; it's totally up to you. However, as a good validation engineer, you know it should be based on risk. The riskier the system, the more often you should test; the less risky the system, the less due diligence is required. Nevertheless, cloud-based systems should be subject to continuous testing to ensure compliance and maintain the validated state.

There are many other aspects of supporting validation testing in the cloud, but suffice it to say that Microsoft Dynamics and Azure can be validated and have been successfully validated for many clients. Microsoft has done a tremendous service to the life sciences industry by transparently providing information through the Microsoft Trust Center. As a validation engineer, I find this one of the most innovative things they've done! It provides much of the information that I need, along with confirmation through third-party entities that Microsoft is complying with Cloud Security Alliance controls and the NIST Cybersecurity Framework. I would encourage each of you to review the information on Microsoft's Trust Center. Of course, there will always be skeptics of anything Microsoft does, but let's give credit where credit is due.

The Microsoft Trust Center is a good thing. The company has done an excellent job of opening up and sharing how it views system security, quality, and compliance. This information was never made freely available before, and it is available now. I have validated many Microsoft Dynamics AX systems as well as Dynamics 365 systems. The software quality within these systems is good, and with the information Microsoft has provided, you can have confidence that deploying a system in the Azure environment is not only a good business decision, if you have selected this technology for your enterprise, but also a sound decision regarding quality and compliance.


In Case You Missed It: CSV Validation Master Class

In case you missed it, on December 11–13 KenX conducted the Computer System Validation and Data Integrity Congress. Recently, the FDA has issued warning letters regarding non-compliance of validated computer systems. Findings have included inadequate risk analysis, non-independent audit trails (audit trails that could be manipulated or turned on/off), failure to establish SOPs that provide governance for the validation process, and other such issues. The event featured a guest speaker from the FDA who highlighted challenges associated with data integrity and the approach taken by the Agency.

Recently, I have been speaking to the regulatory community about the many challenges we face in validating today's computer systems. Cybersecurity, mobility, cloud applications at the enterprise level, the Internet of Things (IoT), and many other changes affecting the installation, qualification, deployment, and function of computer systems have compelled me to rethink strategies around computer systems validation. As a 30-year practitioner in the field, I have developed key best practices to support cloud validation and address cybersecurity and the other challenges associated with today's technology.

In case you missed it, I conducted one of the first Computer Systems Validation Master Classes.  The Master Class presented a broad range of topics related to the Independent Validation and Verification of today’s technologies.  We addressed topics such as:

  • Lean Validation Principles, Best Practices and Lessons Learned
  • Computer Systems Validation Automation and Best Practices
  • Cybersecurity & Computer Systems Validation: What You Should Know
  • Cybersecurity Qualification: The Next Frontier in Validation Testing
  • Cloud Validation Best Practices
  • Continuous Testing In The Cloud
  • Leveraging Service Organization Control (SOC) Reports For Supplier Audits
  • … and much more

The 90-minute session was a lively discussion of many topics for validation contemporaries that will help them master validation of the latest technologies and ensure sustained quality and compliance.

Our Master Class format encouraged knowledge exchange: each topic was not only debated from the practitioner's perspective, but participants also shared insights from their own experiences, presenting the latest best practices, regulatory guidance, and practical CSV scenarios. The result was a comprehensive discussion of each topic, along with practical tips, tools, and techniques to ensure software quality and a more relevant validation process that takes into account today's technologies and their profound impact on validation writ large.

For participating in the Validation Master Class workshop, I offered participants a copy of my lean validation process templates, a cybersecurity qualification (CyQ) template, a cloud validation SOP, a cybersecurity validation SOP, a system risk assessment template, and sample SOC 1/SOC 2/SOC 3 data center reports for cloud providers. (If you would like a copy of these materials, please contact me using the contact form provided.)

In case you missed it, I can report that the event was a huge success as measured by the feedback from the session and the response of all participants.  Check out our events and join us at one of our weekly webinars or industry events!

What Data Integrity Means For Validation

After much fanfare, the General Data Protection Regulation (GDPR) was approved by the EU Parliament on April 14, 2016. The enforcement date for this regulation is May 25, 2018; companies not in compliance by that time face the potential for heavy fines. The GDPR replaces the Data Protection Directive 95/46/EC and was designed primarily to harmonize data privacy laws across Europe, to protect and empower all EU citizens' data privacy, and to reshape the way organizations across the region approach data privacy. The regulation has a profound impact on businesses that operate in the EU. Maximum penalties may be as high as 4% of annual global turnover or €20 million (whichever is higher).

In recent years, we have seen massive data breaches at companies that have exposed private and other sensitive information without consent. Many of these breaches have been due to cyber-attacks against companies of all sizes; the newspapers are full of them. The GDPR requires that breaches be reported to the relevant regulator without undue delay and, where feasible, within 72 hours of becoming aware of the breach, unless it is unlikely to result in a risk to the rights and freedoms of individuals. Data subjects must be informed without undue delay when a breach is likely to result in a high risk to their rights and freedoms, unless the data has been rendered unintelligible to any third party (for example, by encryption). Data processors are required to inform data controllers of any breach without undue delay.

What does all this mean for validated systems?

If you operate in the EU and your validated systems include sensitive data or data of a personal nature, such as patient information, you are subject to the guidelines included within the GDPR. You also need to look at data integrity and security practices around the validated system. We strongly recommend the Cybersecurity Qualification (CyQ) discussed in a previous post. The CyQ assesses a firm's readiness to protect itself against a cyber-attack. This could go a long way toward meeting the requirements of GDPR, since the cybersecurity qualification requires documentation of your security controls.

I recommend reading GDPR and getting used to it before May 2018. Assess your controls within your validated systems environment to determine how vulnerable your systems really are and your readiness to comply with this regulation. I assure you more will be forthcoming about this topic in the months to come.  WATCH THIS SPACE.

Installation Qualification (IQ) in the Cloud

Installation Qualification (IQ) is designed to ensure that systems are installed in a consistent, repeatable manner. For on-premise systems, validation engineers typically install enterprise software from pre-defined vendor media: formerly, media was inserted into corporate servers and the full installation process was documented. The purpose of installation qualification sometimes gets lost in the weeds. Given the complexity of today's enterprise and desktop systems in virtualized cloud environments, the need for installation qualification has never been greater.

The question often gets asked, "… how do you conduct installation qualification in the cloud?" The answer requires thinking differently about how software is deployed in today's validated systems environments.

Cloud deployment models include Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS).  The validation strategy changes depending on the deployment model.  In all models, the virtualization, servers, storage and networking are managed by the vendor.  In the PaaS model, the runtime, middleware and O/S are also managed by the vendor.  It should be noted that cloud services are typically provisioned using automated tools thus, the installation process is not the same as with on-premise environments.

Therefore, conducting IQ in the cloud involves four essential stages:

  • Observation or Documentation of the Provisioning Process
  • Draft Installation Qualification Test Scripts For Pre-Approval
  • Execute Pre-Approved IQ Test Scripts
  • Summarize Installation Qualification Phase

For all cloud models, we recommend observing (to the extent possible) the provisioning process. For some applications, such as SaaS, this may not be possible. You should observe the setup and configuration of all virtual servers and the parameters used to load software; the sequence of events may be key. We have observed with some applications that the provisioning process (which is itself automated) may fail for a variety of reasons, so do not think it strange if provisioning occasionally does not complete successfully. That is why, whenever possible, we observe this process. For example, we conduct many validation exercises for Microsoft Dynamics 365 using Microsoft Lifecycle Services to support provisioning, and we can observe this process to document the installation qualification of the system.
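
A minimal sketch of the verification idea follows, assuming a pre-approved specification and an inventory of what was actually provisioned; in practice the "actual" values would come from the provider's inventory or provisioning logs, which is one reason we observe the provisioning process. All values here are illustrative:

```python
# Pre-approved environment specification (from the IQ protocol).
approved_spec = {
    "region": "East US",
    "app_servers": 2,
    "db_tier": "Standard S3",
    "tls_enforced": True,
}

# What the provisioning process actually delivered.
provisioned = {
    "region": "East US",
    "app_servers": 2,
    "db_tier": "Standard S3",
    "tls_enforced": True,
}

deviations = {
    key: (expected, provisioned.get(key))
    for key, expected in approved_spec.items()
    if provisioned.get(key) != expected
}
print("IQ provisioning check:", "PASS" if not deviations else f"FAIL {deviations}")
```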

It is important to note that the principles of validation still endure. You STILL must ensure that your cloud environment is established using a consistent, repeatable process. It is NOT the case that being in the cloud lets you eliminate IQ testing; that is neither a good idea nor best practice.

Once the provisioning process is complete, you can create your IQ test scripts and conduct installation qualification testing in the same manner as before. It is important to note that the IQ process is still relevant for validation testing today. In the cloud environment, you must conduct your supplier audits, collecting the Service Organization Control reports discussed in one of my previous posts. A thorough process will help ensure data integrity and quality as you deploy enterprise applications. Are you ready for the cloud?

Automating Validation Testing: It’s Easier Than You Think

Automated validation testing has been elusive for many in the validation community. There have been many "point solutions" on the market that addressed the creation, management, and execution of validation testing. However, what most validation engineers want is TRULY AUTOMATED validation testing that will interrogate an application in a rigorous manner and report results in a way that not only provides objective evidence against pass/fail criteria but also highlights each point of failure.

In the 1980’s when I was conducting validation exercises for mini- and mainframe computers and drafting test scripts in a very manual way, I often envisioned a time when I would be able to conduct validation testing in a more automated way.  Most validation engineers work in an environment where they are asked to do more with less.  Thus, the need for such a tool is profound.

Cloud computing environments, mobility, cybersecurity, and data integrity imperatives make it essential that we test applications more thoroughly today. Yet the burden of manual testing persists. If I could share with you five key features of an automated testing system, they would include the following:

  • Automated test script procedure capture and development
  • Automated Requirements Traceability
  • Fully Automated Validation Test Script Execution
  • Automated Incident Capture and Management
  • Ability to Support Continuous Testing in the Cloud

In most validation exercises I have participated in, validation testing was the most laborious part of the exercise.  Automated testing is easier than you think.

For example, ValidationMaster™ includes an automated test engine that captures each step of your qualification procedure and annotates the step with details of the action performed.

Test cases can be routed for review and pre-approval within the system quickly and easily through DocuSign. Test case execution can be conducted online, and a dynamic dashboard reports how many test scripts have passed, how many have failed, and which ones passed with exception. Once test scripts have been executed, they may be routed for post-approval and signed.

Within the ValidationMaster™ system, you can create a reusable test script library to support future regression testing efforts. The system allows users to link requirements to test cases, facilitating easy generation of forward and reverse trace matrices, as illustrated below. Exporting documents in your UNIQUE format is a snap, so you can consistently comply with your internal documentation procedures.
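
Here is a minimal sketch of how forward and reverse trace matrices can be derived from one set of requirement-to-test links; the identifiers are illustrative, not ValidationMaster™'s data model:

```python
# Requirement-to-test-case links (illustrative).
links = [
    ("REQ-001", "TC-101"),
    ("REQ-001", "TC-102"),
    ("REQ-002", "TC-102"),
]

forward: dict[str, list[str]] = {}
reverse: dict[str, list[str]] = {}
for requirement, test_case in links:
    forward.setdefault(requirement, []).append(test_case)
    reverse.setdefault(test_case, []).append(requirement)

print("Forward trace:", forward)  # shows each requirement is covered by tests
print("Reverse trace:", reverse)  # shows each test maps back to a requirement
```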

ValidationMaster™ supports fully automated validation testing, allowing users to set a date and time for test runs. Test scripts are run AUTOMATICALLY, without human intervention, allowing multiple runs of the same scripts if necessary.

Continuous testing in a cloud environment is ESSENTIAL.  You must have the ability to respond to rapid changes in a cloud environment that may impact the validated state of the system.  Continuous testing reduces risk and ensures sustained compliance in a cloud environment.

The system automatically raises an incident report if a bug is encountered through automated testing, and it keeps track of each test run and its results through automation. ValidationMaster™ includes a dynamic dashboard that shows the pass/fail status of each test script as well as requirements coverage, open risks, incident trend analysis, and much more.

The time is now for automated validation testing.  The good news is that there are enterprise level applications on the market that facilitate the full validation lifecycle process.  Why are you still generating manual test scripts?  Automated testing is easier than you think!