SharePoint Validation: Quality and Compliance Portals

I am often asked the question… “can SharePoint be validated?”  The short answer is YES, but it often requires customization to deliver compliance objectives.  The longer response requires further examination of why people ask the question and of the nature of SharePoint as a content management system.  With Office 365® reaching over 100 million active users per month and more companies moving toward the cloud, we are witnessing the maturation of SharePoint for both regulated and non-regulated content management.

SharePoint has undergone many changes over the past decade that have increased its adoption within the life sciences industry.  New features of SharePoint from Microsoft and its robust technology partner community include, but are not limited to:

  • Synchronization with OneDrive For Business®
  • New SharePoint Communication Sites With Pre-Built Layouts
  • Integration of SharePoint and Microsoft Teams
  • New Integration with DocuSign® For Electronic Signatures
  • Enhanced Integration For Graphical Workflows From Nintex®
  • SharePoint-aware PowerApps and Flow
  • Updated Page Layouts and Web Part Enhancements
  • Improved SharePoint Administration
  • Enhanced Document Version Control

Within the life sciences community, resistance to SharePoint has focused on security and the lack of “out-of-the-box” features for life sciences validation.  What are some of the key applications that life sciences companies require from a regulated SharePoint enterprise content management system?  A partial list of document and records management features includes:

  • Intelligent Document Creation Tools
  • Automated Document Change Control
  • Configurable Document Types With Pre-Assigned Document Workflows (based on the type of document, workflows are automatically launched)
  • 21 CFR Part 11 Support (electronic or digital signatures, audit trails, etc.)
  • Ability to print a Signature Page with Each Signed Document
  • Ability to Establish Pre-defined Automated Document Lifecycle Workflows
  • Support for and designation of Controlled and Uncontrolled Content
  • Controlled Document Management Features, Including Configurable Watermarks and Overlays
  • Markup tools for document review
  • Ability to classify documents for records management capabilities
  • Ability to assign/tag documents with metadata
  • Content Rendering (when documents are checked in, they are automatically rendered in PDF format for document review.)
  • Custom Document Numbering (the ability to automatically assign alphanumeric document numbers to content; see the sketch after this list)
  • Enforcement of the use of Standard Document Templates Codified Within SOPs
  • Version tracking with major and minor version control, version history
  • Ability to support regulatory submissions and publishing (this is a big one)
  • System MUST BE VALIDATABLE
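
Several of the items above, such as metadata tagging and custom document numbering, can be implemented against SharePoint’s standard REST API.  The sketch below is a minimal illustration, assuming an Azure AD access token and hypothetical custom columns (DocumentNumber, DocumentType) on a Documents library; it is not a turnkey compliant solution.

```python
import requests

# Hypothetical values: the site URL, list name, item ID, column names, and the
# access token are placeholders for illustration only.
SITE = "https://contoso.sharepoint.com/sites/quality"
ACCESS_TOKEN = "<access-token-obtained-via-your-auth-flow>"

def tag_controlled_document(item_id: int, doc_number: str, doc_type: str) -> None:
    """Assign a document number and document type to a SharePoint list item
    using the SharePoint REST API (MERGE update)."""
    url = f"{SITE}/_api/web/lists/getbytitle('Documents')/items({item_id})"
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/json;odata=verbose",
        "Content-Type": "application/json;odata=verbose",
        "IF-MATCH": "*",
        "X-HTTP-Method": "MERGE",   # update only the fields supplied below
    }
    body = {
        # The list item type name and the custom columns are assumptions; they
        # depend on how your site columns and content types are configured.
        "__metadata": {"type": "SP.Data.DocumentsItem"},
        "DocumentNumber": doc_number,
        "DocumentType": doc_type,
    }
    response = requests.post(url, headers=headers, json=body)
    response.raise_for_status()

# Example: stamp item 42 as a controlled SOP with an auto-assigned number.
# tag_controlled_document(42, "SOP-000123", "Standard Operating Procedure")
```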

As you can see from the partial list above, there are many features required by regulated companies that are not standard in SharePoint out of the box.  However, SharePoint offers rich capabilities and features that make it possible to deliver a solution with the features listed above with minimal effort.

As a former Documentum and Qumas executive, I know firsthand the challenges of developing such a system from scratch, as my former employers did.  However, leveraging the power of SharePoint, OnShore Technology Group’s ValidationMaster™ Quality and Risk Management portal, for example, is SharePoint-based and includes all of the features listed above.  The level of effort required to deliver such a solution was substantially lower due to the SharePoint application framework and development tools.

The ability to manage regulatory submissions and publishing is one of the areas where SharePoint may be more challenged.  In the Documentum world, there was such a thing as a “Virtual Document”.  A Virtual Document was a document that contained components, or child documents.  A Virtual Document might represent a section of a regulatory dossier, where the header represented the section of the dossier and several child documents were the individual documents in that section.  Documentum was an object-oriented system and thus allowed a single document to be composed of multiple ACTUAL documents with early and late binding.  Since each component of a Virtual Document is its own document that can be checked in/checked out and routed independently of the other components, Virtual Documents are ideal for regulatory submission management, which has very specific guidelines for publishing and pagination.  I have not seen a parallel for this in SharePoint yet.
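
To make the Virtual Document concept concrete, here is a minimal, hypothetical sketch of how a compound dossier section could be modeled as a parent record with child document references and early/late version binding.  This illustrates the Documentum concept only; it does not describe any SharePoint feature.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ChildDocument:
    """A component document that can be checked in/out and routed on its own."""
    doc_id: str
    title: str
    pinned_version: Optional[str] = None  # early binding: a fixed version;
                                          # late binding: None, i.e. use latest

@dataclass
class VirtualDocument:
    """A dossier section whose content is assembled from child documents."""
    section_number: str
    title: str
    children: List[ChildDocument] = field(default_factory=list)

    def resolve(self, latest_versions: dict) -> list:
        """Return (doc_id, version) pairs, honoring early vs. late binding."""
        return [
            (c.doc_id, c.pinned_version or latest_versions[c.doc_id])
            for c in self.children
        ]

# Example: a dossier section assembled from two child documents.
section = VirtualDocument("3.2.P", "Drug Product", [
    ChildDocument("DOC-001", "Description and Composition", pinned_version="2.0"),
    ChildDocument("DOC-002", "Pharmaceutical Development"),  # late bound
])
print(section.resolve({"DOC-001": "3.0", "DOC-002": "1.4"}))
# -> [('DOC-001', '2.0'), ('DOC-002', '1.4')]
```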

Document management systems used to cost millions of dollars to acquire, implement and deploy.  These systems are now somewhat “commoditized” and the price points are significantly lower.  Many life sciences companies are using SharePoint for non-regulated documentation.  However, an increasing number of them are abandoning their higher-cost rivals and moving to SharePoint as the foundation for controlled and uncontrolled documentation.  SharePoint can run in a hosted Office 365 environment or be established in an on-premise environment.  Check out my cloud validation posts for more information on validating SharePoint and other applications in a cloud environment.  Either way, the system can and should be validated if used for regulated content management.

It is recommended that you establish a clear set of user requirements for SharePoint.  SharePoint has powerful capabilities well beyond those articulated in this blog post.  There are many SharePoint partners, such as Nintex® and DocuSign®, that deliver effective, ready-to-use integrations.  Use these partner solutions to help minimize the validation effort.

If you have not already done so, SharePoint is worth a second look for regulated content, depending on your application.  One thing is for sure: the day of the multi-million-dollar content management solution is over for most companies.

Validating Microsoft Dynamics 365: What You Should Know

Microsoft Dynamics 365 and Azure are gaining popularity within the life sciences industry. I am often asked the question about how to validate such a system given its complexity and cloud-based nature. The purpose of this blog post is to answer this question. The outline of this blog post is as follows.

  • Understanding the Changing State of Independent Validation and Verification
  • Strategies for Installation Qualification of Microsoft Azure
  • Practical Strategies for Operational and Performance Qualification Testing
  • Continuous Testing in A Cloud Environment
  • Maintaining the Validated State in Azure

To begin our discussion, it is helpful to consider what keeps regulators up at night. They are concerned primarily about four key aspects of validated computer systems:

  1. Vulnerability – How Vulnerable Are Our Cloud Computing Environments?
  2. Data Integrity – What Is Your Strategy to Maintain Data Integrity Within Your Validated Computer Systems?
  3. System Security and Cyber Security – How Do You Keep Sensitive Information Secure and How Do You Protect a Validated Computer System Against Cyber Threats?
  4. Quality Assurance – How Do You Minimize Risk to Patient Safety and Product Quality Impacted by the Validated System?

One of the first tasks validation engineers must be concerned with is supplier auditing. When using commercial off-the-shelf software such as Microsoft Dynamics 365 and Azure, a supplier audit is a mandatory part of the validation process. Twenty years ago, when we prepared validation documentation for computer systems, validation engineers often conducted a paper audit or an actual physical audit of a software provider. Supplier audits conducted on-site provided a rigorous overview of what software companies were doing and the quality of their software development lifecycle process. It was possible to examine a supplier’s processes and determine whether the software vendor was a quality supplier.

Most software vendors today, including Microsoft, do not allow on-site vendor audits. Some validation engineers have reported to me that they view this as a problem. However, the Microsoft Trust Center is a direct response to the industry’s need for transparency. Personally, I think the Microsoft Trust Center is the best thing Microsoft has done for the life sciences industry. Not only does it highlight all of the Service Organization Control reports (SOC 1/SOC 2/SOC 3 and ISO/IEC 27001:2005), but it summarizes Microsoft’s compliance with the Cloud Security Alliance controls as well as the NIST Cybersecurity Framework. I would strongly recommend that you visit the Microsoft Trust Center at https://www.microsoft.com/en-us/trustcenter. The latest information posted to the site is a section on the General Data Protection Regulation (GDPR) and how the platform can help keep data safe and secure. I find myself as a validation engineer visiting this site often. You will see sections specifically for Microsoft 365 and Azure.

From a supplier auditing perspective, I use the information found on the Microsoft Trust Center to facilitate a “desk audit” of the vendor. Many of the questions that I would ask during an on-site audit are answered on this website. As part of my new validation strategy, I include the Service Organization Control reports as part of my audit due diligence. The Trust Center includes in-depth information about security, privacy, and compliance offerings, policies, features and practices across all of Microsoft’s cloud products.

If I were conducting an on-site audit of Microsoft, I would want to know how they are establishing trust in the cloud, and much of that information is found on this website. It should be noted that Service Organization Control reports are created not by Microsoft but by trusted third-party organizations certified to deliver such reports. These reports include an assessment of how well Microsoft is complying with the stated controls for cloud management and security. This is extremely valuable information.

From a validation perspective, I attach these reports to my validation package as part of the supplier audit due diligence. There may be instances where you conduct your own due diligence beyond the reports, but the reports provide an excellent start to understanding what Microsoft is doing.

Microsoft has adopted the Cloud Security Alliance (CSA) Cloud Controls Matrix to establish the controls for the Microsoft Azure platform. These controls include:

  • Security Policy and Procedures
  • Physical and Environmental Security
  • Logical Security
  • System Monitoring and Maintenance
  • Data Backup, Recovery and Retention
  • Confidentiality
  • Software Development/Change Management
  • Incident Management
  • Service Level Agreements
  • Risk Assessments
  • Documentation/Asset Management
  • Training Management
  • Disaster Recovery
  • Vendor Management

The Cloud Controls Matrix includes 136 different controls that cloud vendors such as Microsoft must comply with. Microsoft has mapped out on its Trust Center site specifically how it addresses each of the 136 controls in the Microsoft Azure/Dynamics 365 platform. This is excellent knowledge and due diligence for validation engineers and represents a good starting point for documenting the quality of the supplier.
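
For supplier audit due diligence, it can help to track which control areas you have evidence for. The sketch below is a minimal, hypothetical coverage check: the control names mirror the list above, and the evidence references are placeholders, not real audit artifacts.

```python
# Hypothetical evidence references for illustration; the real Cloud Controls
# Matrix defines the authoritative control list.
supplier_control_map = {
    "Security Policy and Procedures": "SOC 2 Type II report, section CC1",
    "Physical and Environmental Security": "ISO/IEC 27001 certificate",
    "Data Backup, Recovery and Retention": "Trust Center backup documentation",
    "Incident Management": None,   # no evidence collected yet
    "Disaster Recovery": None,
}

gaps = [control for control, evidence in supplier_control_map.items() if evidence is None]
print(f"Controls with documented evidence: {len(supplier_control_map) - len(gaps)}")
print("Follow-up required for:", ", ".join(gaps) or "none")
```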

Is Microsoft Azure/Dynamics 365 secure enough for life sciences? In my personal opinion, yes, it is. Companies still must conduct due diligence to ensure that Azure and Dynamics 365 meet their business continuity requirements and business process requirements for enterprise resource planning. One thing is certain: the cloud changes things. You must revise your validation strategy to accommodate the cloud. Changes in how supplier audits are conducted are just one example.

The next challenge in validating Microsoft Dynamics 365 and Azure is conducting validation testing in the cloud environment. It should be understood that the principles of validation still endure whether or not you are in a cloud environment. You still must conduct a rigorous amount of testing, including both positive and negative testing, to confirm that Microsoft Azure/Dynamics 365 meets its intended use. However, there are changes in the way we conduct installation qualification in the cloud environment. Some believe that installation qualification is no longer a valid testing process since cloud environments are provisioned. This is not correct. You still must ensure that cloud environments are provisioned in a consistent, repeatable manner that supports quality.

It is helpful to understand that Microsoft Dynamics 365 is provisioned using Microsoft Lifecycle Services. The Lifecycle Services application is designed for rapid implementation and deployment. However, it should be clearly understood that Lifecycle Services is itself an application and therefore a potential point of failure in the process. The use of Lifecycle Services must be documented, and the provisioning of the environment must be confirmed through the installation qualification process.
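
One way to confirm the provisioned environment is to compare the expected configuration against what was actually deployed. The sketch below is a minimal illustration; the configuration items and values are placeholders, and in practice the "actual" values would be transcribed from the environment details reported after provisioning completes.

```python
# A minimal IQ-style check: compare the expected environment configuration
# against the values observed after provisioning.
expected = {
    "application_version": "10.0.x",
    "topology": "Tier-2 sandbox",
    "region": "East US",
    "database_collation": "SQL_Latin1_General_CP1_CI_AS",
}

actual = {
    "application_version": "10.0.x",
    "topology": "Tier-2 sandbox",
    "region": "East US",
    "database_collation": "SQL_Latin1_General_CP1_CI_AS",
}

def iq_check(expected: dict, actual: dict) -> list:
    """Return one (item, expected, actual, pass/fail) row per configuration item."""
    return [
        (key, expected[key], actual.get(key, "<missing>"),
         "PASS" if actual.get(key) == expected[key] else "FAIL")
        for key in expected
    ]

for row in iq_check(expected, actual):
    print("{:<22} expected={:<32} actual={:<32} {}".format(*row))
```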

From an operational qualification perspective, validation testing remains pretty much the same. Test cases are traced to their respective user requirements and executed with the same rigor as in previous validation exercises.
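
Tracing test cases to user requirements is straightforward to represent and check programmatically. Here is a minimal sketch of a traceability matrix with a coverage check; the requirement and test case IDs are hypothetical.

```python
# A minimal requirements-to-test-case traceability check.
requirements = ["URS-001", "URS-002", "URS-003", "URS-004"]
test_cases = {
    "OQ-TC-01": ["URS-001", "URS-002"],
    "OQ-TC-02": ["URS-002"],
    "OQ-TC-03": ["URS-004"],
}

# Build the matrix: requirement -> list of test cases that verify it.
matrix = {req: [tc for tc, reqs in test_cases.items() if req in reqs]
          for req in requirements}

for req, tcs in matrix.items():
    print(f"{req}: {', '.join(tcs) if tcs else 'NOT TRACED (coverage gap)'}")
# URS-003 would be flagged as a coverage gap in this example.
```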

Performance qualification is also conducted in the same manner as before. Since the environment is in the cloud and outside of your direct control, it is very important that network qualification as well as performance qualification be conducted to ensure that no performance anomalies occur in your environment. In the cloud environment you may have performance issues related to system resources, networks, storage arrays and many other factors. Performance tools may be used to confirm that the system is performing within an optimal range as established by the validation team. Performance qualification can be conducted before the system goes live by placing a live load on the system, or it may occur after validation. This is at the discretion of the validation engineer.
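
As a simple illustration of the kind of check a performance qualification might include, the sketch below times repeated calls to a representative endpoint and compares the 95th percentile against an acceptance threshold. The URL and threshold are placeholders; real PQ acceptance criteria should come from your validation team’s documented requirements.

```python
import statistics
import time
import urllib.request

# Hypothetical endpoint and threshold for illustration only.
URL = "https://example.com/health"
MAX_ACCEPTABLE_SECONDS = 2.0

def measure_response_times(url: str, samples: int = 10) -> list:
    """Time a series of requests to a representative endpoint."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        urllib.request.urlopen(url, timeout=30).read()
        times.append(time.perf_counter() - start)
    return times

times = measure_response_times(URL)
p95 = sorted(times)[int(0.95 * (len(times) - 1))]
print(f"mean={statistics.mean(times):.2f}s  p95={p95:.2f}s")
print("PQ check:", "PASS" if p95 <= MAX_ACCEPTABLE_SECONDS else "FAIL")
```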

Maintaining the validated state within a cloud environment requires embracing the principle of continuous testing. It has often been said that the cloud is perpetually changing. This is one of the reasons why many people believe that you cannot validate a system in the cloud. However, you can validate cloud-based systems such as Microsoft Dynamics 365 and Azure. Continuous testing is the key. What do I mean by continuous testing? Does that mean that we perpetually test the system forever and ever, every single day? Of course not! Continuous testing is a strategy that should be applied to all cloud-based validated systems whereby regression testing occurs at predetermined intervals. Automated systems such as ValidationMaster™ can be the key to facilitating this new strategy.

Within ValidationMaster™ you can establish a reusable test script library. This is important because in manual, paper-based validation processes, the most laborious part of validation is the development and execution of test scripts. This is why many people cringe at the very notion of continuous testing. However, automated systems make this much easier. In ValidationMaster™, each test script is automatically traced to a user requirement. Thus, during regression testing, I can select a set of test scripts to be executed based on my impact analysis and execute them during off-peak hours in my test environment to see if there has been any impact to my validated system. These test scripts can be run fully automated using Java-based test scripts, or they can be run using an online validation testing process. Revalidation of the system can happen in a matter of hours rather than days or weeks using this process.
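
The selection step described above, picking only the test scripts traced to impacted requirements, can be expressed very simply. The sketch below is a generic illustration of impact-based regression selection, not ValidationMaster™ functionality; the trace matrix and IDs are hypothetical.

```python
# Impact-based regression test selection: given the requirements touched by a
# change, pick only the test scripts traced to those requirements.
trace_matrix = {
    "TS-010": ["URS-001"],
    "TS-011": ["URS-002", "URS-005"],
    "TS-012": ["URS-003"],
    "TS-013": ["URS-005"],
}

def select_regression_tests(impacted_requirements: set) -> list:
    """Return the test scripts whose traced requirements overlap the impact set."""
    return sorted(
        ts for ts, reqs in trace_matrix.items()
        if impacted_requirements.intersection(reqs)
    )

# Example: a platform update affects only the functionality behind URS-005.
print(select_regression_tests({"URS-005"}))   # -> ['TS-011', 'TS-013']
```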

Through continuous testing, you can review the changes that Microsoft has made to both Azure and Dynamics 365 online. Yes, this information is posted online for your review. This answers the question of how you know what changes were made and when. You can determine how often you test a validated system. There is no regulation that codifies how often this should occur. It’s totally up to you. However, as a good validation engineer you know that it should be based on risk. The riskier the system, the more often you should test. The less risky the system, the less due diligence is required. Nevertheless, cloud-based systems should be subject to continuous testing to ensure compliance and maintain the validated state.

There are many other aspects of supporting validation testing in the cloud, but suffice it to say that Microsoft Dynamics and Azure can be validated and have been successfully validated for many clients. Microsoft has done a tremendous service to the life sciences industry by transparently providing information through the Microsoft Trust Center. As a validation engineer I find this one of the most innovative things they’ve done! It provides a lot of the information that I need and confirmation through third-party entities that Microsoft is complying with the Cloud Security Alliance controls and the NIST Cybersecurity Framework. I would encourage each of you to review the information on Microsoft’s Trust Center. Of course, there will always be skeptics of anything Microsoft does, but let’s give them credit where credit is due.

The Microsoft Trust Center is a good thing. The company has done an excellent job of opening up and sharing how it views system security, quality and compliance. This information was never made available before, and it is available now. I have validated many Microsoft Dynamics AX systems as well as Dynamics 365 systems. The software quality within these systems is good, and with the information that Microsoft has provided, you can have confidence that deploying a system in the Azure environment is not only a good business decision, if you have selected this technology for your enterprise, but also a sound decision regarding quality and compliance.


Installation Qualification (IQ) in the Cloud

Installation Qualification (IQ) is designed to ensure that systems are installed in a consistent, repeatable manner.  For on-premise systems, validation engineers typically install enterprise software from pre-defined vendor media.  Formerly, media was inserted into corporate servers and the full installation process was documented.  The purpose of installation qualification sometimes gets lost in the weeds.  Given the complexity of today’s enterprise and desktop systems in virtualized cloud environments, the need for installation qualification has never been greater.

The question often gets asked, “… how do you conduct installation qualification in the cloud?”  Answering this question requires thinking differently about how software is deployed in today’s validated systems environments.

Cloud deployment models include Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS).  The validation strategy changes depending on the deployment model.  In all models, the virtualization, servers, storage and networking are managed by the vendor.  In the PaaS model, the runtime, middleware and O/S are also managed by the vendor.  It should be noted that cloud services are typically provisioned using automated tools; thus, the installation process is not the same as with on-premise environments.

Therefore, conducting IQ in the cloud requires four essential stages:

  • Observation or Documentation of the Provisioning Process
  • Draft Installation Qualification Test Scripts For Pre-Approval
  • Execute Pre-Approved IQ Test Scripts
  • Summarize Installation Qualification Phase

We recommend, for all cloud models, observing (to the extent possible) the provisioning process.  For some applications, such as SaaS, this may not be possible.  You should observe the setup and configuration of all virtual servers and the parameters used to load software.  The sequence of events may be key.  We have observed with some applications that the provisioning process (which itself is automated) may fail for a variety of reasons.  Thus, do not think it strange if the provisioning process does not successfully complete on occasion.  That is why, whenever possible, we observe this process.  For example, we conduct a lot of validation exercises for Microsoft Dynamics 365 using Microsoft Lifecycle Services to support provisioning.  We can observe this process to document the installation qualification for the system.
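
When we do observe provisioning, we capture each observed step, its outcome and a timestamp as IQ evidence.  The sketch below is a minimal, hypothetical way to structure that evidence; the step names are illustrative and do not represent an official provisioning sequence.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ProvisioningObservation:
    """One observed step of an automated provisioning run, captured as IQ evidence."""
    step: str
    outcome: str          # e.g. "Completed" or "Failed"
    observed_by: str
    timestamp: str

def record(step: str, outcome: str, observer: str) -> ProvisioningObservation:
    return ProvisioningObservation(
        step=step,
        outcome=outcome,
        observed_by=observer,
        timestamp=datetime.now(timezone.utc).isoformat(timespec="seconds"),
    )

# Example log of an observed run, including a failed step that was re-run.
evidence = [
    record("Environment deployment requested", "Completed", "J. Smith"),
    record("Database deployment", "Failed", "J. Smith"),
    record("Database deployment (re-run)", "Completed", "J. Smith"),
    record("Application services deployment", "Completed", "J. Smith"),
]
for obs in evidence:
    print(f"{obs.timestamp}  {obs.step:<38} {obs.outcome:<10} ({obs.observed_by})")
```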

It is important to note that the principles of validation still endure.  You STILL must ensure that your cloud environment is established using a consistent, repeatable process.  It is NOT the case that because you are in the cloud, you can eliminate IQ testing.  That is neither a good idea nor a best practice.

Once the provisioning process is complete, you can create your IQ test scripts and conduct installation qualification testing in the same manner as before.  It is important to note that the IQ process is still relevant for validation testing today.  In the cloud environment, you must also conduct your supplier audits, collecting the Service Organization Control reports discussed in one of my previous posts.  A thorough process will help ensure data integrity and quality as you deploy enterprise applications.  Are you ready for the cloud?

Can Cloud Applications Be Validated?

Cloud applications are being deployed within life sciences enterprises at a rapid pace. Microsoft Office 365, Microsoft Dynamics 365 and the cost benefits of other such applications are driving the adoption of cloud applications for regulated systems. The question that is always asked is: can cloud systems be validated? The question arises because many understand how cloud environments are deployed and maintained over time. To keep pace with system performance, security and other related controls, cloud providers frequently update their environments. This constant updating of the cloud environment makes it challenging from a systems validation perspective, and maintaining the validated state in a cloud environment is often challenging for the same reason.

So can cloud applications be validated and what are the unique processes that must be changed to accommodate cloud validation?  The short answer is yes, cloud applications can be validated. However, there are changes required to the validation strategy to ensure that the system meets its intended use and the validated state is maintained over time.

ALL CLOUD PROVIDERS ARE NOT CREATED EQUAL

When choosing a cloud provider, it is helpful to understand that all cloud providers are not created equal. To my mind they are divided into two distinct camps: (1) those who understand regulated environments and (2) those who do not. The first order of business for establishing a validated system in the cloud is to select your cloud providers carefully. I usually look for cloud providers who have experience in regulated environments. This goes for any application that you’re using in a highly regulated environment. It is super important that your vendor understands the regulatory requirements you have to comply with and builds best practices into their service to help you comply. This will go a long way during the supplier audit process toward confirming that you’ve done your proper due diligence on your cloud provider.

UNDERSTANDING RESPONSIBILITIES IN A CLOUD ENVIRONMENT

It is important to understand responsibilities in a cloud environment.  With on-premise packaged software applications, you have complete control over the environment.  With the various cloud service models, responsibilities vary depending on the model used, as shown below.  For Infrastructure-as-a-Service, you manage the applications, data, runtime, middleware and operating system, while the vendor manages virtualization, servers, storage, and networking.  Please keep in mind that the principles of validation endure in the cloud.  You, not the vendor, are ultimately responsible for the environment and its management.  Therefore, you need to choose your cloud vendor wisely.

[Figure: Cloud service models and responsibilities]

You can see that with Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS), you have less responsibility for the management of the system.  In the SaaS model, you do not control the application or infrastructure.  The vendor manages the application and its underlying architecture.
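
As a compact way to capture the responsibility split described above, here is a minimal sketch mapping each layer to the party that manages it under each model.  It reflects only the description in this post; regardless of model, compliance responsibility for the validated system stays with you.

```python
# Who manages each layer under each cloud service model, per the description
# above. "customer" here means the regulated company deploying the system.
LAYERS = ["application", "data", "runtime", "middleware", "operating system",
          "virtualization", "servers", "storage", "networking"]

def responsibilities(model: str) -> dict:
    """Return a layer -> owner map for IaaS, PaaS or SaaS."""
    vendor_from = {"IaaS": "virtualization", "PaaS": "runtime", "SaaS": "application"}
    cutoff = LAYERS.index(vendor_from[model])
    return {layer: ("vendor" if i >= cutoff else "customer")
            for i, layer in enumerate(LAYERS)}

for model in ("IaaS", "PaaS", "SaaS"):
    owned = [layer for layer, who in responsibilities(model).items() if who == "customer"]
    print(f"{model}: customer manages "
          f"{', '.join(owned) or 'nothing (configuration and use only)'}")
```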

The strategy for validation changes based on the cloud model deployed.


CONTINUOUS TESTING IN THE CLOUD

Previously, for on-premise systems, the validation engineer, in consultation with the IT team, determined how often validated systems environments were reviewed and updated. If the validation engineer did not want to apply a patch or an update, they simply left the system alone. Many validation engineers, due to the labor involved in updating and documenting a validated system, would often leave the system in a state where no changes, patches, or updates were applied. Today, due to cybersecurity threats and frequent changes to browser applications, it is not possible to leave a system unpatched. To do so may leave your validated systems environment more exposed to security threats.

Therefore, when validating in the cloud you must employ a continuous testing strategy. When I say that to validation engineers, they often cringe at the thought of continuously having to test a validated systems environment. This is easier than you may think if you’re using automated tools for testing. Continuous testing is simply a validation testing strategy where, on a designated schedule defined by the validation engineer, a system is routinely tested to ensure that patches and routine updates do not impact the validated state. Continuous testing is facilitated best through automation. When you develop a reusable test script library, you can easily conduct an impact analysis and determine which tests need to be rerun. Regression testing is made easy through validation test script automation. Therefore, it is strongly recommended that if you deploy systems in a cloud environment, you employ an enterprise validation management system that includes automated testing, such as ValidationMaster™.

Cloud applications can be validated.  The strategy changes based on the cloud model selected.  Choose your partners carefully.  They assume an important role in the management of critical system assets.

It is not a question of whether a cloud application can be validated; it is a question of the deployment model you choose and the validation strategy that will govern deployment. Regulators are not averse to you using the cloud – it’s all about managing control and risk.

 

5 Ways to Revive Your CSV Validation Process in 2018

If you are like most veteran validation engineers, you have been conducting validation exercises the same old way.  You have been gathering user requirements, conducting risk assessments, developing test scripts (most often on paper) and delivering validation documentation and due diligence as required by current global regulations.

Validation processes have not changed very much over the last 40 years (unless you include ISPE GAMP® as revolutionary).  However, the systems we validate and the state of play with respect to cybersecurity, data integrity, mobility and compliance have changed in a profound way.  Cloud-based applications are seeing increasing adoption.  Platforms such as Microsoft Azure are changing the way companies do business.  Mobility is in the hands of everyone who can obtain an affordable device.  These game-changing trends call for us to wake up our sleepy validation processes and use them as opportunities for process improvement.

To help you address these new challenges for the coming year, I offer 5 ways to revive sleepy, manual validation processes for your consideration.

  1. Think LEAN
  2. Automate
  3. Leverage Electronic Document Review & Approval Processes
  4. Adopt Agile Validation Processes
  5. Conduct Cybersecurity Qualification

STEP 1.  THINK LEAN

Lean manufacturing is a proven process that powered Toyota to success. Lean manufacturing processes were designed to eliminate waste and improve operational efficiency. Toyota adopted lean manufacturing processes and rose to prominence with quality vehicles that outsold most of their competitors in each automotive class. To improve your validation processes in today’s systems environment, it is important to think lean. As we are being asked to do more with less as validation engineers, it is important, and dare I say critical, that you eliminate any wasteful processes and focus on validation processes that, in the end, add value. This is the basic premise of lean validation.

Throughout my practice, we deliver lean validation services to all of our clients. Lean validation is powered through automation. You can’t have a lean validation process without automation, since automation powers lean. Through automation and lean validation processes, we are able to automate the development and management of requirements, incidents, and validation testing in a manner not possible on paper. As you look to wake up your validation processes in 2018, it is important to think lean. I wrote a blog article on the principles and best practices of lean validation that is a good starting point for learning how to establish and maintain lean validation processes.

STEP 2.  AUTOMATE

As I mentioned above, you cannot adopt lean validation processes unless you automate your validation processes. Automation has often been looked at by validation engineers as a luxury rather than a necessity. In 2018, automation of the validation process is a necessity, not a luxury. Electronic workflow used to be the purview of expensive enterprise systems often costing millions of dollars. Today, there is no excuse for not automating your process other than simply not wanting to change.

Electronic systems to manage automated validation processes have come a long way in the last 20 years. We offer a system called ValidationMaster™, which was designed by validation engineers for validation engineers. ValidationMaster™ is an enterprise validation management system designed to manage the full lifecycle of validation processes. From requirements to test script development and execution through incident reporting and validation document control, ValidationMaster™ manages the full lifecycle of validation processes and their associated deliverables. The system comes with a fully integrated validation portal based on SharePoint, designed to manage document deliverables in a controlled manner. The system is integrated with electronic signatures and graphical workflows to eliminate document bottlenecks and improve the development of your document deliverables.

The development and execution of test cases represents the most laborious part of validation. This is where most validation exercises either succeed or fail. Most often, given the manual, cryptic methods most validation engineers use to develop test scripts, a lot of focus is spent on the mechanics of developing the test script and not on the quality of the test case itself. This is due to wasteful processes such as cutting and pasting screenshots or manually typing in the information that the validation engineer sees on the screen during manual test script development. Both capturing screenshots and typing in descriptions of what is on the screen are, in my view, wasteful and do not add to the value of the test case.

The FDA says a good test case is one that finds an error. As validation engineers we should be focused more on finding errors and software quality than on cutting and pasting screenshots into Word documents as most of us do. There are things that computers can do much more efficiently and effectively than humans, and one of them is automated test generation.

There are tools on the market such as ValidationMaster™ that allow you to fully automate this process and eliminate the waste in it. ValidationMaster™ allows you to navigate through the application that is the subject of your validation exercise while the system automatically captures the screenshots and text and places them directly into your formatted documents. You simply push a button at the end of test script development and your test script is written for you. This is standard out-of-the-box functionality. Managing requirements for enterprise applications can also be laborious and challenging.

If you are validating enterprise resource planning systems, which consist of hundreds of requirements, keeping track of the version of each requirement may be difficult. Who changed a requirement and why it was changed are often the subject of hours and hours of meetings that waste time and money. ValidationMaster™ has a full-featured requirements management engine that not only tracks the requirement but also tracks its full audit history, so that you can understand who changed the requirement, when it was changed, and why. This gives you full visibility into requirements. These requirements can be directly traced to test scripts, thereby fully automating requirements traceability in your validation process.
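
In generic terms, a requirement with a built-in audit history can be modeled very simply. The sketch below is an illustration of the concept only, not ValidationMaster™’s data model; the IDs, names and change reason are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class RequirementChange:
    """One entry in a requirement's audit history: who, when, why, and the new text."""
    changed_by: str
    changed_on: str
    reason: str
    new_text: str

@dataclass
class Requirement:
    req_id: str
    text: str
    history: List[RequirementChange] = field(default_factory=list)

    def update(self, new_text: str, changed_by: str, reason: str) -> None:
        self.history.append(RequirementChange(
            changed_by=changed_by,
            changed_on=datetime.now(timezone.utc).isoformat(timespec="seconds"),
            reason=reason,
            new_text=new_text,
        ))
        self.text = new_text

# Example: trace who changed URS-017, when, and why.
urs = Requirement("URS-017", "The system shall require two approvals for SOPs.")
urs.update("The system shall require three approvals for SOPs.",
           changed_by="Q. Analyst", reason="QA review CR-102")
for change in urs.history:
    print(change.changed_on, change.changed_by, "-", change.reason)
```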

Validation testing can be conducted online as either fully automated validation test scripts that require no human intervention or semi-automated test cases where the validation engineer navigates through the software application and the actual results are captured by the system.

Manually tracking test runs can be a headache and is sometimes error-prone. An automated system can do this for you with ease. ValidationMaster™ allows you to conduct multiple test runs, and it tracks each test run, the person who ran the test, how long it took to execute, and the actual results. If an incident or software anomaly occurred during testing, the automated system allows you to capture that information and keeps the incident report with each validation test. The time for automation has come.
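
Again in generic terms, here is a minimal sketch of the kind of test run record described above, with the tester, duration, result and any linked incident. It is illustrative only; the IDs and structure are hypothetical, not a vendor schema.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Incident:
    incident_id: str
    description: str

@dataclass
class TestRun:
    """One execution of a test script, with its outcome and any incident raised."""
    script_id: str
    run_number: int
    executed_by: str
    duration_minutes: float
    result: str                      # "Pass" or "Fail"
    incident: Optional[Incident] = None

runs: List[TestRun] = [
    TestRun("OQ-TC-02", 1, "J. Smith", 18.5, "Fail",
            Incident("INC-0042", "Error posting a sales order with a credit hold")),
    TestRun("OQ-TC-02", 2, "J. Smith", 15.0, "Pass"),
]

for r in runs:
    note = f" ({r.incident.incident_id})" if r.incident else ""
    print(f"{r.script_id} run {r.run_number}: {r.result} in "
          f"{r.duration_minutes} min by {r.executed_by}{note}")
```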

If you want to wake up your sleepy validation processes and deliver greater value for your company, you cannot afford to overlook automation of your validation process. Check out a demonstration of ValidationMaster™ to see how automated validation lifecycle management can work for you.

STEP 3 – LEVERAGE ELECTRONIC DOCUMENT REVIEW & APPROVAL PROCESSES

One of the many challenges that validation engineers experience every day is the routing, review and approval of validation documentation. In many paper-based systems this involves routing printed documents for physical handwritten signatures. This is the way we did it in the 1980s. 21st-century validation requires a new, more efficient approach. As mentioned in the section above on automation, there really is no excuse for using manual paper-based processes anymore. There are affordable technologies on the market that facilitate electronic document review and approval processes and dramatically improve the efficiency and quality of your document management processes.

At one previous company that I worked for, our document cycle times were up to 120 days per document. With electronic routing and review using 21 CFR Part 11 electronic signatures, we managed to get the process down to three weeks, which saved a significant amount of time and money. Electronic document review and approval is a necessity and no longer a luxury. There are tools on the market that cost less than a Happy Meal per month and allow you to efficiently route and review your validation documentation and apply electronic signatures in compliance with 21 CFR Part 11. To wake up your document management processes, electronic document review and approval is a MUST.
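
For illustration, here is a minimal sketch of an electronically signed approval record that captures the elements 21 CFR Part 11 expects a signed record to display: the signer’s printed name, the date and time of signing, and the meaning of the signature. The document IDs, roles and workflow are hypothetical; a real implementation also needs access controls, audit trails and signature binding.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List

@dataclass
class ElectronicSignature:
    """Printed name, date/time, and meaning of the signature (per Part 11)."""
    signer_name: str
    signed_on: str
    meaning: str          # e.g. "Author", "Quality Approval"

@dataclass
class ControlledDocument:
    doc_id: str
    title: str
    required_meanings: List[str]
    signatures: List[ElectronicSignature]

    def sign(self, signer_name: str, meaning: str) -> None:
        self.signatures.append(ElectronicSignature(
            signer_name,
            datetime.now(timezone.utc).isoformat(timespec="seconds"),
            meaning))

    def is_approved(self) -> bool:
        collected = {s.meaning for s in self.signatures}
        return all(m in collected for m in self.required_meanings)

# Example: a validation plan routed for author and quality approval.
doc = ControlledDocument("VP-001", "Validation Plan", ["Author", "Quality Approval"], [])
doc.sign("A. Author", "Author")
doc.sign("Q. Approver", "Quality Approval")
print("Approved:", doc.is_approved())
```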

STEP 4 – ADOPT AGILE VALIDATION PROCESSES

For as long as I can remember, whenever validation was spoken of, we talked about the V-model. This is the model, shown in the figure below, where validation planning happens on the left-hand side of the V and validation testing happens on the right-hand side of the V.

[Figure: The validation V-model]

This model often assumed a waterfall approach to software development where all requirements were developed in advance and were tested subsequent to the development/configuration process. In today’s systems environment this is actually not how software is developed or implemented in most cases. Most development teams use an agile development process. Agile software development methodologies have really revolutionized the way organizations plan, develop, test and release software applications today. It is fair to say that agile development methods have now been established as the most accepted way to develop software applications. Having said this, many validation engineers insist on the waterfall methodology to support validation.

Agile development is basically a lightweight software development methodology that focuses on small, time-boxed sprints of new functionality that are incorporated into an integrated product baseline. In other words, all requirements are not developed upfront. Agile recognizes that you may not know all requirements up front. Scrum places emphasis on customer interaction, feedback and adjustments rather than documentation and prediction. From a validation perspective, this means iterations of validation testing as software iterations are developed in sprints. It’s a new way of looking at validation compared to the old waterfall approach. This may add more time to the validation process overall, but the gradual introduction of functionality within complex software applications has value in and of itself. The big-bang approach to delivering ERP systems has sometimes been fraught with issues and has sometimes been error-prone. The gradual introduction of features and functionality within a software application has its merits. You should adopt agile validation processes to wake up your current processes.

STEP 5 – CONDUCT CYBERSECURITY QUALIFICATION (CYQ)

Last but not least is cybersecurity. In my previous blog posts, I discussed extensively the impact of cybersecurity on validation. I cannot emphasize enough how you need to pay attention to this phenomenon and be ready to address it. Cyber threats are a clear and present danger not only to validated systems environments but to all systems environments. As validation engineers, our job is to ensure that systems are established and meet their intended use. How can you confirm that a system meets its intended use when it’s vulnerable to cyber threats? The reality is you can’t.

Many validation engineers believe that cybersecurity is the purview of the IT department, something the IT group is responsible for handling, and that it has nothing to do with validation. Nothing could be further from the truth. If you remember the old adage in validation, “if it’s not documented, it didn’t happen,” you will quickly realize that you must document your readiness to deal with a cybersecurity event should one occur in your environment.

I call this “cybersecurity qualification” or “CyQ”. A cybersecurity qualification is an assessment of your readiness to protect validated systems environments against the threat of a cyber event. The CyQ is intended to evaluate your readiness to ensure compliance. One of the ways you can wake up your validation processes in 2018 is to educate yourself about the impact that cyber events can have on your current validated systems environment and conduct a CyQ, in addition to your IQ/OQ/PQ testing, to ensure that you have the processes and procedures in place to deal with any threats that may come your way.  If you would like to see an example of a CyQ, just write me and I’ll send you one.
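
To make the idea concrete, here is a minimal, hypothetical sketch of a CyQ-style readiness checklist with a simple score. The questions are illustrative examples only, not an official assessment instrument or the CyQ template mentioned above.

```python
# A minimal CyQ-style readiness checklist; questions and scoring are illustrative.
cyq_checklist = {
    "Cybersecurity SOP approved and in effect": True,
    "Validated system included in periodic vulnerability scans": True,
    "Incident response plan tested in the last 12 months": False,
    "Backups of validated data tested for restore": True,
    "Security patches assessed for validation impact before deployment": False,
}

ready = sum(cyq_checklist.values())
total = len(cyq_checklist)
print(f"CyQ readiness: {ready}/{total}")
for item, ok in cyq_checklist.items():
    if not ok:
        print("Remediation required:", item)
```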

2018 is going to bring many challenges your way. It is time that you bring your validation processes to level V validation maturity. You need to develop more mature processes that deal with the threats of cybersecurity and embrace lean validation and agile methodologies. If you’re like most, you will be asked to do more with less. Updating antiquated validation processes is essential to having the agility to do more with less.  It’s about time that you revived your old, antiquated validation processes. Are you ready to go lean?

Cloud Validation Strategies

If you were to ask me 10 years ago how many of my life sciences clients were deploying systems in the cloud, I would’ve said maybe one or two. If you ask me today how many of my clients are deploying cloud technologies, I would say almost all of them, in one way or another.  The adoption of cloud technologies within life sciences companies is expanding at a rapid pace.

From a validation perspective, this trend has profound consequences.  Here are some key concerns and questions to be answered for any cloud deployment.

  1. How do you validate systems in a cloud environment?
  2. What types of governance do you need to deploy applications in a cloud environment?
  3. How do you manage change in a cloud environment?
  4. How do you maintain the validated state in the cloud?
  5. How can you ensure data integrity in the cloud?
  6. How do you manage cybersecurity in a cloud environment?

The answers to these questions are obvious and routine to validation engineers managing systems in an on-premise environment, where the environment is controlled by the internal IT team.  They have control over changes, patches, system updates, and other factors that may impact the overall validated state.  In a cloud environment, the software, platform and infrastructure are delivered as a SERVICE.  By leveraging the cloud, life sciences companies are effectively outsourcing the management and operation of a portion of their IT infrastructure to the cloud provider.  However, compliance oversight and responsibility for your validated system cannot be delegated to the cloud provider.  Therefore, these services must have a level of control sufficient to support a validated systems environment.

For years, life sciences companies have been accustomed to governing their own systems environments.  They control how often systems are updated, when patches are applied, when system resources will be updated, etc.  In a cloud environment, control is in the hands of the cloud service provider.  Therefore, who you choose as your cloud provider matters.

So what should your strategy be to manage cloud-based systems?

  • Choose Your Cloud Provider Wisely – All cloud providers are not created equal.  The Cloud Security Alliance (https://cloudsecurityalliance.org/ ) is an excellent starting point for understanding cloud controls.  The Cloud Controls Matrix (CCM) is an Excel spreadsheet that allows you to assess a vendor’s readiness for the cloud.  You can download it free of charge from the CSA.
  • Establish Governance For The Cloud – You must have an SOP for the management and deployment of the cloud and ensure that this process is closely followed.  You also need an SOP for cyber security to provide a process for protecting validated systems against cyber threats.
  • Leverage Cloud Supplier Audit Reports For Validation – All cloud providers must adhere to standards for their environments.  Typically, they gain 3rd party certification and submit to Service Organization Control (SOC) independent audits.  It is recommended that you capture the SOC 1/2/3 and SSAE 16 reports.  You also want to understand any certifications that your cloud provider has.  I would archive their certifications and SOC reports with the validation package as part of my due diligence for the supplier audit.
  • Embrace Lean Validation Principles and Best Practices – eliminating waste and improving efficiency is essential in any validated systems environment.  Lean validation is derived from the principles of lean manufacturing.  Automation is a MUST.  You need to embrace lean principles for greater efficiency and compliance.
  • Automate Your Validation Processes – Automation and Lean validation go hand in hand.  The testing process is the most laborious process.  We recommend using a system like ValidationMaster™ to automate requirements management, test management and execution, incident management, risk management, validation quality management, agile validation project management, and validation content management. ValidationMaster™ is designed to power lean validation processes and includes built-in best practices to support this process.
  • Use a Risk-Based Approach To Validation – All validation exercises are not created equal.  The level of validation due diligence required for your project should be based on risk – regulatory, technical and business risks.  Conduct a risk assessment for all cloud-based systems (see the sketch after this list).
  • Adopt Continuous Testing Best Practices – The cloud is under continuous change, which seems in and of itself counter-intuitive to the validation process.  Continuous testing can be onerous if your testing process is MANUAL.  However, if you adopt lean, automated testing processes, regression testing is easy.  You can establish a routine schedule for testing, and if your cloud provider delivers a dashboard that tells you when patches/updates/features have been applied and the nature of them, you can select your regression testing plan based on a risk and impact assessment.
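
For the risk-based approach above, a simple scoring sketch can help translate regulatory, technical and business risk into a level of validation rigor.  The scales and rigor levels below are illustrative only; use the criteria defined in your own SOPs.

```python
# A simple risk scoring sketch for a risk-based validation approach.
def validation_rigor(regulatory: int, technical: int, business: int) -> str:
    """Each input is a 1 (low) to 3 (high) risk rating."""
    score = regulatory + technical + business
    if score >= 8:
        return "Full validation: IQ/OQ/PQ plus CyQ and continuous regression testing"
    if score >= 5:
        return "Standard validation: risk-targeted OQ with periodic regression testing"
    return "Reduced validation: verification of critical functions only"

# Example: a GxP ERP module deployed in a SaaS environment.
print(validation_rigor(regulatory=3, technical=2, business=3))
```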

 

Cloud environments can be validated!  A clear, practical approach that embraces lean validation and continuous testing is key.  So is cloud governance to ensure data integrity and sustained compliance.

Cloud technologies are here to stay.  Regulators don’t object to the use of the cloud; they want to know how you are managing it and ensuring the integrity of the data.  They also want you to confirm that you are maintaining the validated state in the cloud.  The principles of validation endure in the cloud.  Just because you are in a cloud environment does not mean validation principles no longer apply.  Consider the impact of cybersecurity in your cloud environment and adopt continuous testing strategies to ensure sustained compliance.