SharePoint Validation: Quality and Compliance Portals

I am often asked the question… “can SharePoint be validated?”  The short answer is YES, but it often requires customization to deliver compliance objectives.  The longer response requires further examination of why people ask the question and the nature of SharePoint as a content management system.  With Office 365® reaching over 100 million active users per month and more companies moving toward the cloud, we are witnessing the maturation of SharePoint for both regulated and non-regulated content management.

SharePoint has undergone many changes over the past decade that have increased its adoption within the life sciences industry.  New features of SharePoint from Microsoft and its robust technology partner community include, but are not limited to:

  • Synchronization with OneDrive For Business®
  • New SharePoint Communication Sites With Pre-Built Layouts
  • Integration of SharePoint and Microsoft Teams
  • New Integration with DocuSign® For Electronic Signatures
  • Enhanced Integration For Graphical Workflows From Nintex®
  • SharePoint-aware PowerApps and Flow
  • Updated Page Layouts and Web Part Enhancements
  • Improved SharePoint Administration
  • Enhanced Document Version Control

Within the life sciences community, resistance to SharePoint has focused on security and the lack of “out-of-the-box” features for life sciences validation.  What are some of the key capabilities that life sciences companies require from a regulated SharePoint enterprise content management system?  A partial list of document and records management features includes:

  • Intelligent Document Creation Tools
  • Automated Document Change Control
  • Configurable Document Types With Pre-Assigned Document Workflows (based on the type of document, workflows are automatically launched)
  • 21 CFR Part 11 Support (electronic or digital signatures, audit trails, et al.)
  • Ability to Print a Signature Page With Each Signed Document
  • Ability to Establish Pre-Defined Automated Document Lifecycle Workflows
  • Support for and Designation of Controlled and Uncontrolled Content
  • Controlled Document Management Features, Including Configurable Watermarks and Overlays
  • Markup Tools for Document Review
  • Ability to Classify Documents for Records Management
  • Ability to Assign/Tag Documents With Metadata
  • Content Rendering (when documents are checked in, they are automatically rendered in PDF format for document review)
  • Custom Document Numbering (the ability to automatically assign alphanumeric document numbers to content)
  • Enforcement of Standard Document Templates Codified Within SOPs
  • Version Tracking With Major and Minor Version Control and Version History
  • Ability to Support Regulatory Submissions and Publishing (this is a big one)
  • System MUST BE VALIDATABLE

As you can see from the partial list above, many features required by regulated companies are not standard in SharePoint out of the box.  However, SharePoint offers rich capabilities that significantly reduce the effort required to deliver a solution with the features listed above.

As a former Documentum and Qumas executive, I know firsthand the challenges of developing such a system from scratch as my former employers did.  However, leveraging the power of SharePoint, OnShore Technology Group’s ValidationMaster™ Quality and Risk Management portal, for example, is SharePoint-based and includes all of the features listed above.  The level of effort required to deliver the solution was substantially lower due to the SharePoint application framework and development tools.

The ability to manage regulatory submissions and publishing is one of the areas where SharePoint is more challenged.  In the Documentum world, there was such a thing as a “Virtual Document”: a document that contained components, or child documents.  A Virtual Document might represent a section of a regulatory dossier, where the header represented the section of the dossier and several child documents were the individual documents in that section.  Documentum was an object-oriented system and thus allowed a single document to be comprised of multiple ACTUAL documents, with early and late binding.  Since each component of a Virtual Document is its own document that can be checked in/checked out and routed independently of the other components, Virtual Documents are ideal for regulatory submission management, which has very specific guidelines for publishing and pagination.  I have not yet seen a parallel for this in SharePoint.
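
For readers who have never worked with Documentum, a minimal, hypothetical Python sketch of the concept may help: a parent “virtual” node whose children are independent documents, each bound either to a pinned version (early binding) or to whatever version is current at assembly time (late binding).  The class and field names are my own illustration, not Documentum’s API.

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    """A single controlled document with its own version history."""
    doc_id: str
    title: str
    versions: list = field(default_factory=list)  # e.g., ["1.0", "1.1", "2.0"]

    def latest(self):
        return self.versions[-1]

@dataclass
class VirtualDocument:
    """A dossier section whose components are independent child documents."""
    title: str
    children: list = field(default_factory=list)  # (Document, pinned version or None)

    def add(self, doc, version=None):
        # version=None -> late binding (resolved at assembly time);
        # a specific version -> early binding (pinned for the submission).
        self.children.append((doc, version))

    def assemble(self):
        return [(doc.title, ver if ver is not None else doc.latest())
                for doc, ver in self.children]

# Usage: a dossier section with one early-bound and one late-bound child.
protocol = Document("DOC-001", "Study Protocol", ["1.0", "2.0"])
report = Document("DOC-002", "Clinical Study Report", ["1.0"])
section = VirtualDocument("Clinical Study Reports")
section.add(protocol, version="1.0")  # early binding
section.add(report)                   # late binding
print(section.assemble())  # [('Study Protocol', '1.0'), ('Clinical Study Report', '1.0')]
```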

Document management systems used to cost millions of dollars to acquire, implement and deploy.  These systems are now somewhat “commoditized” and the price points are significantly lower.  Many life sciences companies are using SharePoint for non-regulated documentation.  However, an increasing number of them are abandoning their higher-cost rivals and moving to SharePoint as the foundation for controlled and uncontrolled documentation.  SharePoint can run in a hosted Office 365 environment or in an on-premises environment.  Check out my cloud validation posts for more information on validating SharePoint and other applications in a cloud environment.  Either way, the system can and should be validated if used for regulated content management.

It is recommended that you establish a clear set of user requirements for SharePoint.  SharePoint has powerful capabilities well beyond those articulated in this blog post.  There are many SharePoint partners, such as Nintex® and DocuSign®, that deliver effective, ready-to-use integrations.  Use these partner solutions to help minimize the validation effort.

If you have not already done so, give SharePoint a second look for regulated content, depending on your application.  One thing is for sure: the day of the multi-million dollar content management solution is over for most companies.

Validating Microsoft Dynamics 365: What You Should Know

Microsoft Dynamics 365 and Azure are gaining popularity within the life sciences industry. I am often asked how to validate such a system given its complexity and cloud-based nature. The purpose of this blog post is to answer this question. The outline of this post is as follows:

  • Understanding the Changing State of Independent Validation and Verification
  • Strategies for Installation Qualification of Microsoft Azure
  • Practical Strategies for Operational and Performance Qualification Testing
  • Continuous Testing in A Cloud Environment
  • Maintaining the Validated State in Azure

To begin our discussion, it is helpful to consider what keeps regulators up at night. They are concerned primarily about four key aspects of validated computer systems:

  1. Vulnerability – How Vulnerable Are Our Cloud Computing System Environments?
  2. Data Integrity – What Is Your Strategy to Maintain Data Integrity Within Your Validated Computer Systems?
  3. System Security and Cybersecurity – How Do You Keep Sensitive Information Secure, and How Do You Protect a Validated Computer System Against Cyber Threats?
  4. Quality Assurance – How Do You Minimize Risk to Patient Safety and Product Quality Impacted by the Validated System?

One of the first tasks validation engineers must be concerned with is supplier auditing. When using commercial off-the-shelf software such as Microsoft Dynamics 365 and Azure, a supplier audit is a mandatory part of the validation process. Twenty years ago, when we prepared validation documentation for computer systems, validation engineers often conducted a paper audit or an actual physical audit of a software provider. Supplier audits conducted on-site provided a rigorous overview of what software companies were doing and the quality of their software development lifecycle process. It was possible to examine a supplier’s processes and decide whether the software vendor was a quality supplier.

Most software vendors today, including Microsoft, do not allow on-site vendor audits. Some validation engineers have reported to me that they view this as a problem. However, the Microsoft Trust Center is a direct response to the industry’s need for transparency.  Personally, I think the Microsoft Trust Center is the best thing that they have done for the life sciences industry. Not only do they highlight all of the Service Organization Control reports (SOC 1/SOC 2/SOC 3 and ISO/IEC 27001), but they summarize their compliance with the Cloud Security Alliance controls as well as the NIST Cybersecurity Framework. I would strongly recommend that you visit the Microsoft Trust Center at https://www.microsoft.com/en-us/trustcenter.  The latest information posted to the site is a section on the General Data Protection Regulation (GDPR) and how their platform can help keep data safe and secure. I find myself as a validation engineer visiting this site often. You will see sections specifically for Microsoft 365 and Azure.

From a supplier auditing perspective, I use the information found on the Microsoft Trust Center to facilitate a “desk audit” of the vendor. Many of the questions that I would ask during an on-site audit are answered on this website. As part of my new validation strategy, I include the Service Organization Control reports as part of my audit due diligence.  The Trust Center includes in-depth information about security, privacy, and compliance offerings, policies, features and practices across all Microsoft cloud products.

If I were conducting an on-site audit of Microsoft, I would want to know how they are establishing trust in the cloud, and many of the questions that I would ask in person are answered on this website. It should be noted that Service Organization Control reports are created not by Microsoft but by trusted third-party organizations certified to deliver such reports. These reports include an assessment of how well Microsoft is complying with the stated controls for cloud management and security. This is extremely valuable information.

From a validation perspective, I attach these reports to my validation package as part of the supplier audit due diligence. There may be instances where you conduct your own due diligence beyond the reports, but the reports provide an excellent start to understanding what Microsoft is doing.

Microsoft has adopted the Cloud Security Alliance (CSA) Cloud Controls Matrix to establish the controls for the Microsoft Azure platform. These controls include:

  • Security Policy and Procedures
  • Physical and Environmental Security
  • Logical Security
  • System Monitoring and Maintenance
  • Data Backup, Recovery and Retention
  • Confidentiality
  • Software Development/Change Management
  • Incident Management
  • Service Level Agreements
  • Risk Assessments
  • Documentation/Asset Management
  • Training Management
  • Disaster Recovery
  • Vendor Management

The Cloud Controls Matrix includes 136 different controls that cloud vendors such as Microsoft must comply with. Microsoft has mapped out on its Trust Center site specifically how it addresses each of the 136 controls in the Microsoft Azure/Dynamics 365 platform. This is excellent knowledge and due diligence for validation engineers and represents a good starting point for documenting the quality of the supplier.

Is Microsoft Azure/Dynamics 365 secure enough for life sciences? In my personal opinion, yes, it is. Companies still must conduct due diligence to ensure that Azure and Dynamics 365 meet their business continuity requirements and business process requirements for enterprise resource planning. One thing is certain: the cloud changes things. You must revise your validation strategy to accommodate the cloud. Changes in how supplier audits are conducted are just one example.

The next challenge in validating Microsoft Dynamics 365 and Azure is conducting validation testing in the cloud environment. It should be understood that the principles of validation endure whether or not you are in a cloud environment. You still must conduct a rigorous amount of testing, including both positive and negative testing, to confirm that Microsoft Azure/Dynamics 365 meets its intended use. However, there are changes in the way we conduct installation qualification in the cloud environment. Some believe that installation qualification is no longer a valid testing process since cloud environments are provisioned. This is not correct. You still must ensure that cloud environments are provisioned in a consistent, repeatable manner that supports quality.

It is helpful to understand that Microsoft Dynamics 365 is provisioned using Microsoft Dynamics Lifecycle Services (LCS). The Lifecycle Services application is designed for rapid implementation and deployment. However, it should be clearly understood that Lifecycle Services is itself an application, and thus a potential point of failure in the process.  The use of Lifecycle Services must be documented, and the provisioning of the environment must be confirmed through the installation qualification process.
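
As a simple illustration of what “consistent and repeatable” means in practice, the hypothetical sketch below diffs an as-built environment configuration against an approved IQ baseline. The parameter names and values are invented; in a real exercise the as-built values would come from an environment export rather than being hard-coded.

```python
# Approved baseline from the IQ protocol (illustrative parameters only).
baseline = {
    "region": "eastus",
    "sql_tier": "Premium P2",
    "app_version": "10.0.38",
    "tls_min_version": "1.2",
}

def iq_compare(baseline, as_built):
    """Return IQ deviations: parameters missing from, or differing in, the as-built environment."""
    deviations = []
    for key, expected in baseline.items():
        actual = as_built.get(key, "<missing>")
        if actual != expected:
            deviations.append(f"{key}: expected {expected!r}, found {actual!r}")
    return deviations

# In practice this dictionary would be generated from an environment export.
as_built = {"region": "eastus", "sql_tier": "Premium P1",
            "app_version": "10.0.38", "tls_min_version": "1.2"}

for line in iq_compare(baseline, as_built) or ["No deviations - IQ acceptance criteria met"]:
    print(line)
```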

From an operational qualification perspective, validation testing remains pretty much the same. Test cases are traced to their respective user requirements and executed with the same rigor as in previous validation exercises.

Performance qualification is also conducted in the same manner as before. Since the environment is in the cloud and outside of your direct control, it is very important that network qualification as well as performance qualification be conducted to ensure that no performance anomalies occur in your environment. In the cloud you may have performance issues related to system resources, networks, storage arrays and many other factors. Performance tools may be used to confirm that the system is performing within an optimal range as established by the validation team. Performance qualification can be conducted either before the system goes live, by placing a live load on the system, or after validation. This is at the discretion of the validation engineer.
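
The acceptance logic behind such performance tools is simple. Below is a minimal sketch, assuming a placeholder URL and a response-time threshold set by the validation team; a real PQ would use a proper load testing tool, but the pass/fail reasoning is the same.

```python
import statistics
import time
import urllib.request

URL = "https://example.com"   # placeholder endpoint for the sketch
THRESHOLD_SECONDS = 2.0       # acceptance criterion established by the validation team
SAMPLES = 10

# Sample the response time of a representative transaction.
timings = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    urllib.request.urlopen(URL, timeout=30).read()
    timings.append(time.perf_counter() - start)

print(f"mean={statistics.mean(timings):.2f}s  max={max(timings):.2f}s")
if max(timings) <= THRESHOLD_SECONDS:
    print("PQ sample PASSES the established acceptance criterion")
else:
    print("PQ sample FAILS - investigate resource, network or storage anomalies")
```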

Maintaining the validated state within a cloud environment requires embracing the principle of continuous testing. It has often been said that the cloud is perpetually changing. This is one of the reasons why many people believe that you cannot validate a system in the cloud. However, you can validate cloud-based systems such as Microsoft Dynamics 365 and Azure. Continuous testing is the key. What do I mean by continuous testing? Does it mean that we perpetually test the system forever and ever, every single day? Of course not! Continuous testing is a new strategy that should be applied to all cloud-based validated systems, whereby regression testing occurs at predetermined intervals. Automated systems such as ValidationMaster™ can be the key to facilitating this new strategy.

Within ValidationMaster™ you can establish a reusable test script library. This is important because in manual, paper-based validation processes, the most laborious part of validation is the development and execution of test scripts. This is why many people cringe at the very notion of continuous testing. However, automated systems make this much easier. In ValidationMaster™ each test script is automatically traced to a user requirement. Thus, during regression testing, I can select a set of test scripts to be executed based on my impact analysis, and during off-peak hours in my test environment I can execute these test scripts to see if there has been any impact to my validated system. These test scripts can be run fully automated using Java-based test scripts, or they can be run using an online validation testing process. Revalidation of the system can happen in a matter of hours versus days or weeks using this process.
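
The mechanics of impact-based selection are easy to illustrate. Here is a minimal sketch, assuming a traceability matrix that maps each test script to the requirement IDs it verifies (all names and IDs invented); it shows the selection logic only, not ValidationMaster™’s implementation.

```python
# Traceability matrix: test script -> requirements it verifies (illustrative IDs).
trace = {
    "TS-001_login": ["REQ-010"],
    "TS-014_batch_release": ["REQ-042", "REQ-043"],
    "TS-022_audit_trail": ["REQ-042", "REQ-077"],
    "TS-030_reporting": ["REQ-090"],
}

def select_regression_suite(changed_requirements):
    """Select only the scripts traced to requirements impacted by a change."""
    return sorted(script for script, reqs in trace.items()
                  if changed_requirements.intersection(reqs))

# Impact analysis finds that a vendor update touches REQ-042:
print(select_regression_suite({"REQ-042"}))
# ['TS-014_batch_release', 'TS-022_audit_trail']
```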

Through continuous testing, you can review the changes that Microsoft has made to both Azure and Dynamics 365 online. Yes, this information is posted online for your review. This answers the question of how you know what changes are made and when. You can determine how often you test a validated system. There is no regulation that codifies how often this should occur; it is totally up to you. However, as a good validation engineer you know that it should be based on risk. The riskier the system, the more often you should test. The less risky the system, the less due diligence is required. Nevertheless, cloud-based systems should be subject to continuous testing to ensure compliance and maintain the validated state.

There are many other aspects of supporting validation testing in the cloud, but suffice it to say Microsoft Dynamics and Azure can be validated and have been successfully validated for many clients. Microsoft has done a tremendous service to the life sciences industry by transparently providing information through the Microsoft Trust Center. As a validation engineer, I find this one of the most innovative things that they’ve done! It provides much of the information that I need and confirmation through third-party entities that Microsoft is complying with the Cloud Security Alliance controls and the NIST Cybersecurity Framework. I would encourage each of you to review the information on Microsoft’s Trust Center. Of course, there will always be skeptics of anything Microsoft does, but let’s give credit where credit is due.

The Microsoft Trust Center is a good thing. The company has done an excellent job of opening up and sharing how it views system security, quality and compliance. This information was never made freely available before, and it is available now. I have validated many Microsoft Dynamics AX systems as well as Dynamics 365 systems. The software quality within these systems is good, and with the information that Microsoft has provided, you can have confidence that deploying a system in the Azure environment is not only a good business decision if you have selected this technology for your enterprise, but a sound decision regarding quality and compliance.


Saving Time and Money Through Lean Validation

The principles and best practices of lean manufacturing have served life sciences manufacturers well.  Lean is all about optimizing processes, while eliminating waste (“Muda”) and driving greater efficiencies.  As a 30-year validation practitioner, I have validated many computer systems, equipment and processes.  One of the key lessons learned is that there is much room for improvement across the validation process.

OnShore Technology Group is a pioneer in lean validation and has developed principles and best practices to support lean validation processes.  To power our lean processes, we leverage ValidationMaster™, an Enterprise Validation Management system exclusively designed to facilitate lean validation and automate the validation process.

So what is lean validation and how is it practically used?

Lean validation is the process of eliminating waste and inefficiencies while driving greater software quality across the validation process through automation.


Lean validation cannot be achieved without automation. Lean validation processes leverage advanced technology designed to fulfill the technical, business and compliance requirements for software validation, eliminating the use of manual, paper-based processes. Optimized validation processes are deployed using the latest global best practices to ensure the right amount of validation rigor based on critical quality attributes, risks, cybersecurity and other factors.

The lean validation process begins with a lean validation project charter, which defines the description of the process and key performance indicators (KPIs) such as “reduce validation costs by $X per year” or “reduce software errors by X%.”  The charter shall define the project scope, dependencies, project metrics and resources for the project.

There are five principles of lean validation derived from lean manufacturing principles.  Lean validation is powered through people, processes and technology.  Automation drives lean validation processes.  The LEAN VALIDATION value principles are illustrated in the figure below.

[Figure: The five lean validation value principles]

Principle 1 – VALUE – Lean thinking in manufacturing begins with a detailed understanding of what value the customer assigns to products and services.  Lean thinking from an independent validation and verification perspective begins with a detailed understanding of the goals and objectives of the validation process and its adherence to compliance objectives.  The principle of VALUE therefore requires the validation team to focus on the elimination of waste to deliver value to the end customer (your organization) in the most cost-effective manner.  The computer systems validation process is designed to assure that software applications meet their intended use.  The value derived from the validation process is greater software quality and an enhanced ability to identify software defects, as a result of greater focus and the elimination of inefficient and wasteful processes. AUTOMATION IS THE FOUNDATION THAT FACILITATES THE ACHIEVEMENT OF THIS VALUE PRINCIPLE.

Principle 2 – VALUE STREAM – The value stream, from a lean perspective, is the comprehensive product life-cycle from the raw materials through the customer’s end use and ultimate disposal of the product.  To effectively eliminate waste, the ultimate goal of lean validation, there must be an accurate and complete understanding of the value stream.  Validation processes must be examined end-to-end to determine what value each adds to the objective of establishing software quality and compliance.  Any process that does not add value to the validation process should be eliminated.  We recommend value stream mapping of the validation process to understand where value is added and where non-value-added processes can be eliminated.  Typical “Muda,” or wastes, commonly revealed by validation process mapping are:

  • Wasteful Legacy Processes (“we have always done it this way”)
  • Processes That Provide No Value To Software Quality At All
  • Manual Process Bottlenecks That Stifle Processes

Principle 3 – FLOW – The lean manufacturing principle of flow is about creating a value chain with no interruption in the production process, a state where each activity is fully in step with every other.  A comprehensive assessment and understanding of flow throughout the validation process is essential to the elimination of waste.  From a validation perspective, optimal flow is created when, for example, users can automatically trace requirements to test scripts, eliminating the process of manually tracing each test script to a requirement.  Another example is when a user can navigate through a software application and the test script is automatically generated.  Once generated, it is automatically published to a document portal where it is routed electronically for review and approval.  All of this requires AUTOMATION to achieve the principle of FLOW.  For process optimization and quality control throughout the validation lifecycle, information should flow through the validation process efficiently, minimizing process and document bottlenecks while preserving traceability throughout.

Principle 4 – PULL – A pull system in lean manufacturing is used to reduce waste in the production process. Components used in the manufacturing process are only replaced once they have been consumed, so companies only make enough products to meet customer demand.  There is much waste in the validation process.  The PULL strategy for validation may be used to reduce wastes such as duplication of effort, and to streamline test case development and execution, electronic signature routing/approval and many other processes.  Check out our blog, “The Validation Post,” for more information.

Principle 5 – PERFECTION – Validation processes are in constant pursuit of continuous improvement.  Automation is KEY.  Lean validation engineers and quality professionals relentlessly drive for perfection. Step by step, validation engineers must identify the root causes of software issues, anomalies, and quality problems that affect the suitability of a system for production use. As computing environments evolve and become more complex and integrated, validation engineers must seek new, innovative ways to verify software quality and compliance in today’s advanced systems. Perfection cannot easily be achieved through manual processes.

AUTOMATION IS REQUIRED TO TRULY REALIZE THE VISION OF THIS PRINCIPLE.

You can save time and money through lean.  Consider the first two principles, Value and Value Stream.  Many validation engineers consider validation only a regulatory requirement or “necessary evil” – rather than a regulatory best practice designed to save time and money.  I think we can all agree that in the long run, quality software saves time and money through regulatory cost avoidance and more efficient quality processes.

It is important for validation engineers to consider not only quality and compliance but also the cost savings that may be gained throughout the validation process.  Lean validation is a strategy whose time has come.  How much can lean save?  End users report saving about 40–60% on test script development and 60–70% on regression testing efforts.  Regulatory cost avoidance can be significant depending on each company’s level of compliance.  Cost savings may also be realized by minimizing the amount of paper generated through the validation process.

Embrace LEAN VALIDATION and experience the benefits of saving time and money.

5 Ways to Revive Your CSV Validation Process in 2018

If you are like most veteran validation engineers, you have been conducting validation exercises the same old way.  You have been gathering user requirements, conducting risk assessments, developing test scripts (most often on paper) and delivering validation documentation and due diligence as required by current global regulations.

Validation processes have not changed very much over the last 40 years (unless you count ISPE GAMP® as revolutionary).  However, the systems we validate and the state of play with respect to cybersecurity, data integrity, mobility and compliance have changed in a profound way.  Cloud-based applications are seeing increasing adoption.  Platforms such as Microsoft Azure are changing the way companies do business.  Mobility is in the hands of everyone who can obtain an affordable device.  These game-changing trends call for us to wake up our sleepy validation processes and use them as opportunities for process improvement.

To help you address these new challenges for the coming year, I offer 5 ways to revive sleepy, manual validation processes for your consideration.

  1. Think LEAN
  2. Automate
  3. Leverage Electronic Document Review & Approval Processes
  4. Adopt Agile Validation Processes
  5. Conduct Cybersecurity Qualification

STEP 1.  THINK LEAN

Lean manufacturing is a proven process that powered Toyota to success. Lean manufacturing processes were designed to eliminate waste and improve operational efficiency. Toyota adopted lean manufacturing processes and rose to prominence with quality vehicles that outsold most of their competitors in each automotive class. To improve your validation processes in today’s systems environment, it is important to think lean. As validation engineers are being asked to do more with less, it is important, and dare I say critical, that you eliminate any wasteful processes and focus on validation processes that, in the end, add value. This is the basic premise of lean validation.

Throughout my practice, we deliver lean validation services to all of our clients. Lean validation is powered through automation. You can’t have a lean validation process without automation, since automation powers lean. Through automation and lean validation processes, we are able to automate the development and management of requirements, incidents, and validation testing in a manner not possible on paper. As you look to wake up your validation processes in 2018, it is important to think lean. I wrote a blog article on the principles and best practices of lean validation that is a good starting point for learning how to establish and maintain lean validation processes.

STEP 2.  AUTOMATE

As I mentioned above, you cannot adopt lean validation processes unless you automate your validation processes. Automation has often been looked at by validation engineers as a luxury rather than a necessity. In 2018, automation of the validation process is a necessity, not a luxury. Electronic workflow used to be the purview of expensive enterprise systems often costing millions of dollars. Today, there is no excuse for not automating your process other than simply not wanting to change.

Electronic systems to manage automated validation processes have come a long way in the last 20 years. We offer a system called ValidationMaster™, which was designed by validation engineers for validation engineers. ValidationMaster™ is an enterprise validation management system designed to manage the full lifecycle of validation processes. From requirements to test script development and execution through incident reporting and validation document control, ValidationMaster™ manages the full lifecycle of validation processes and their associated deliverables. The system comes with a fully integrated validation portal based on SharePoint, designed to manage document deliverables in a controlled manner. The system is integrated with electronic signatures and graphical workflows to eliminate document bottlenecks and improve the development of your document deliverables.

The development and execution of test cases represents the most laborious part of validation. This is where most validation exercises either succeed or fail. Most often, given the manual, cryptic way most validation engineers develop test scripts, much of the focus is spent on the mechanics of writing the test script and not on the quality of the test case itself. This is due to wasteful processes such as cutting and pasting screenshots or manually typing in information that the validation engineer sees on the screen during test script development. Both capturing screenshots and typing in descriptions of screens are, in my view, wasteful and do not add to the value of the test case.

The FDA says a good test case is one that finds an error. As validation engineers, we should be focused more on finding errors and on software quality than on cutting and pasting screenshots into Word documents, as most of us do. There are things that computers can do much more efficiently and effectively than humans, and one of them is automated test generation.

There are tools on the market such as ValidationMaster™ that allow you to fully automate this process and eliminate the waste in it. ValidationMaster™ allows you to navigate through the application that is the subject of your validation exercise while the system automatically captures the screenshots and text and places them directly into your formatted documents. You simply push a button at the end of test script development and your test script is written for you. This is standard out-of-the-box functionality.
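
To make the idea concrete, here is a rough sketch of what such a recorder does under the hood, using Selenium and python-docx. This is my own illustration of the capture-and-generate pattern, not ValidationMaster™’s implementation; the URL and file names are placeholders.

```python
# pip install selenium python-docx  (plus a matching browser driver)
from selenium import webdriver
from docx import Document

driver = webdriver.Chrome()
doc = Document()
doc.add_heading("Test Script Evidence", level=1)

def record_step(description, url, screenshot_path):
    """Navigate, then capture the step text and screenshot into the evidence document."""
    driver.get(url)
    driver.save_screenshot(screenshot_path)
    doc.add_paragraph(f"Step: {description}")
    doc.add_paragraph(f"Observed page title: {driver.title}")
    doc.add_picture(screenshot_path)

record_step("Open the application home page", "https://example.com", "step1.png")
doc.save("test_script_evidence.docx")
driver.quit()
```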

Managing requirements for enterprise applications can also be laborious and challenging. If you are validating an enterprise resource planning system with hundreds of requirements, keeping track of the version of each requirement can be difficult. Who changed a requirement, and why, is often the subject of hours and hours of meetings that waste time and money. ValidationMaster™ has a full-featured requirements management engine which not only tracks each requirement but also its full audit history, so that you can understand who changed the requirement, when it was changed and why. This gives you full visibility into requirements. These requirements can be directly traced to test scripts, thereby fully automating requirements traceability in your validation process.
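
Conceptually, what eliminates those meetings is an audit trail attached to every requirement. A minimal sketch of the idea (the field names are illustrative, not ValidationMaster™’s data model):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Requirement:
    req_id: str
    text: str
    history: list = field(default_factory=list)  # full audit trail of changes

    def revise(self, new_text, who, why):
        """Record who changed the requirement, when it was changed, and why."""
        self.history.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "who": who,
            "why": why,
            "previous_text": self.text,
        })
        self.text = new_text

req = Requirement("REQ-042", "The system shall enforce unique batch numbers.")
req.revise("The system shall enforce unique, sequential batch numbers.",
           who="j.smith", why="CR-118: customer audit finding")
print(req.history[0]["who"], "-", req.history[0]["why"])
```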

Validation testing can be conducted online either as fully automated validation test scripts that require no human intervention, or as semi-automated test cases where the validation engineer navigates through each software application and the actual results are captured by the system.

Manually tracking test runs can be a headache and is sometimes error-prone. An automated system can do this for you with ease. ValidationMaster™ allows you to conduct multiple test runs, and it tracks each run, the person who ran the test, how long the test took to execute, and the actual results. If an incident or software anomaly occurs during testing, the automated system allows you to capture that information and keeps the incident report with each validation test. The time for automation has come.

If you want to wake up your sleepy validation processes and deliver greater value for your company, you cannot afford to overlook automation of your validation process. Check out a demonstration of ValidationMaster™ to see how automated validation lifecycle management can work for you.

STEP 3 – LEVERAGE ELECTRONIC DOCUMENT REVIEW & APPROVAL PROCESSES

One of the many challenges that validation engineers experience every day is the routing, review and approval of validation documentation. In many paper-based systems this involves routing printed documents for physical handwritten signatures. This is the way we did it in the 1980s. 21st-century validation requires a new, more efficient approach. As mentioned in the section above on automation, there really is no excuse for using manual, paper-based processes anymore. There are affordable technologies on the market that facilitate electronic document review and approval processes and tremendously improve the efficiency and quality of your document management processes.

In one previous company that I worked for, our document cycle times were up to 120 days per document. With electronic routing and review using 21 CFR Part 11 electronic signatures, we managed to get the process down to three weeks, which saved a significant amount of time and money. Electronic document review and approval is a necessity and no longer a luxury. There are tools on the market that cost less than a Happy Meal per month and allow you to efficiently route and review your validation documentation and apply electronic signatures in compliance with 21 CFR Part 11. To wake up your document management processes, electronic document review and approval is a MUST.
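
Part 11 requires, among other things, that signed electronic records carry the printed name of the signer, the date and time of signing, and the meaning of the signature, and that signatures be linked to their records. The hypothetical sketch below shows the shape of such a signature record, binding it to the exact document content via a hash; it illustrates the record structure only, not a complete Part 11 implementation.

```python
import hashlib
import json
from datetime import datetime, timezone

def sign_document(doc_bytes, signer_name, meaning):
    """Build a Part 11-style signature manifest linked to the exact document content."""
    return {
        "signer": signer_name,                      # printed name of the signer
        "meaning": meaning,                         # e.g., "Approved" or "Reviewed"
        "signed_at": datetime.now(timezone.utc).isoformat(),
        "document_sha256": hashlib.sha256(doc_bytes).hexdigest(),
    }

doc_bytes = b"...validation plan content..."  # in practice, read the controlled document file
manifest = sign_document(doc_bytes, "Jane Smith", "Approved")
print(json.dumps(manifest, indent=2))
```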

STEP 4 – ADOPT AGILE VALIDATION PROCESSES

For as long as I can remember, whenever validation was spoken of, we talked about the V model. This is the model, shown in the figure below, where validation planning happens on the left-hand side of the V and validation testing happens on the right-hand side.

[Figure: The V model]

This model assumed a waterfall approach to software development, where all requirements were developed in advance and tested after the development/configuration process. In today’s systems environment, this is not how software is developed or implemented in most cases. Most development teams use an agile development process. Agile software development methodologies have revolutionized the way organizations plan, develop, test and release software applications today. It is fair to say that agile development methods are now established as the most accepted way to develop software applications. Having said this, many validation engineers still insist on the waterfall methodology to support validation.

Agile development is a lightweight software development methodology that focuses on small, time-boxed sprints of new functionality that are incorporated into an integrated product baseline. In other words, all requirements are not developed upfront. Agile recognizes that you may not know all requirements up front. Scrum places emphasis on customer interaction, feedback and adjustments rather than documentation and prediction. From a validation perspective, this means iterations of validation testing as software iterations are developed in sprints. It’s a new way of looking at validation compared with the old waterfall approach. This may add more time to the validation process overall, but the gradual introduction of functionality within complex software applications has value in and of itself. The big-bang approach to delivering ERP systems has been fraught with issues and has sometimes been error-prone. The gradual introduction of features and functionality within a software application has its merits. You should adopt agile validation processes to wake up your current processes.

STEP 5 – CONDUCT CYBERSECURITY QUALIFICATION (CYQ)

Last but not least is cybersecurity. In my previous blog posts, I discussed extensively the impact of cybersecurity on validation. I cannot emphasize enough how much you need to pay attention to this phenomenon and be ready to address it. Cyber threats are a clear and present danger not only to validated systems environments but to all systems environments. As validation engineers, our job is to ensure that systems are established and meet their intended use. How can you confirm that a system meets its intended use when it is vulnerable to cyber threats? The reality is you can’t.

Many validation engineers believe that cybersecurity is the purview of the IT department, something the IT group is responsible for handling that has nothing to do with validation. Nothing could be further from the truth. If you remember the old validation adage, “if it’s not documented, it didn’t happen,” you will quickly realize that you must document your readiness to deal with a cybersecurity event should one occur in your environment.

I call this “cybersecurity qualification,” or “CyQ.” A cybersecurity qualification is an assessment of your readiness to protect validated systems environments against the threat of a cyber event. The CyQ is intended to evaluate your readiness to ensure compliance. One of the ways you can wake up your validation processes in 2018 is to educate yourself about the threats that cyber events pose to your current validated systems environment and conduct a CyQ, in addition to your IQ/OQ/PQ testing, to ensure that you have the processes and procedures in place to deal with any threats that may come your way.  If you would like to see an example of a CyQ, just write me and I’ll send you one.
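
There is no standard CyQ protocol, so the sketch below simply illustrates the shape such an assessment might take: a checklist of readiness controls, scored and summarized. The control items are examples, not a definitive list.

```python
# Illustrative CyQ readiness checklist; the items are examples only.
cyq_checklist = {
    "Cybersecurity SOP approved and in effect": True,
    "Patch management process covers the validated system": True,
    "Intrusion detection/monitoring enabled": False,
    "Incident response plan tested in the last 12 months": False,
    "Backups verified and restorable": True,
}

passed = sum(cyq_checklist.values())
total = len(cyq_checklist)
print(f"CyQ readiness: {passed}/{total} controls in place")
for control, in_place in cyq_checklist.items():
    print(f"  [{'PASS' if in_place else 'GAP '}] {control}")
if passed < total:
    print("Document a remediation action for each GAP before release.")
```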

2018 is going to bring many challenges your way. It is time to bring your validation processes to Level V validation maturity. You need to develop more mature processes that deal with the threats of cybersecurity and embrace lean validation and agile methodologies. If you’re like most, you will be asked to do more with less. Updating antiquated validation processes is essential to having the agility to do more with less.  It’s about time that you revive your old validation processes. Are you ready to go lean?

Cloud Validation Strategies

If you had asked me 10 years ago how many of my life sciences clients were deploying systems in the cloud, I would have said maybe one or two. If you ask me today how many of my clients are deploying cloud technologies, I would say almost all of them, in one way or another.  The adoption of cloud technologies within life sciences companies is expanding at a rapid pace.

From a validation perspective, this trend has profound consequences.  Here are some key concerns and questions to be answered for any cloud deployment.

  1. How do you validate systems in a cloud environment?
  2. What types of governance do you need to deploy applications in a cloud environment?
  3. How do you manage change in a cloud environment?
  4. How do you maintain the validated state in the cloud?
  5. How can you ensure data integrity in the cloud?
  6. How do you manage cybersecurity in a cloud environment?

The answers to these questions are obvious and routine for validation engineers managing systems in an on-premises environment, where control of the environment rests with the internal IT team.  They have control over changes, patches, system updates, and other factors that may impact the overall validated state.  In a cloud environment, the software, platform and infrastructure are delivered as a SERVICE.  By leveraging the cloud, life sciences companies are effectively outsourcing the management and operation of a portion of their IT infrastructure to the cloud provider.  However, compliance oversight and responsibility for your validated system cannot be delegated to the cloud provider.  Therefore, these services must have a level of control sufficient to support a validated systems environment.

For years, life sciences companies have been accustomed to governing their own systems environments.  They control how often systems are updated, when patches are applied, when system resources will be updated, etc.  In a cloud environment, control is in the hands of the cloud service provider.  Therefore, who you choose as your cloud provider matters.

So what should your strategy be to manage cloud-based systems?

  • Choose Your Cloud Provider Wisely – All cloud providers are not created equal.  The Cloud Security Alliance (https://cloudsecurityalliance.org/) is an excellent starting point for understanding cloud controls.  The Cloud Controls Matrix (CCM) is an Excel spreadsheet that allows you to assess a vendor’s readiness for the cloud.  You can download it free of charge from the CSA.
  • Establish Governance For The Cloud – You must have an SOP for the management and deployment of the cloud and ensure that this process is closely followed.  You also need an SOP for cyber security to provide a process for protecting validated systems against cyber threats.
  • Leverage Cloud Supplier Audit Reports For Validation – All cloud providers must adhere to standards for their environments.  Typically, they gain third-party certification and submit to Service Organization Control (SOC) independent audits.  It is recommended that you capture the SOC 1/2/3 and SSAE 16 reports.  You also want to understand any certifications that your cloud provider holds.  I archive their certifications and SOC reports with the validation package as part of my due diligence for the supplier audit.
  • Embrace Lean Validation Principles and Best Practices – Eliminating waste and improving efficiency is essential in any validated systems environment.  Lean validation is derived from the principles of lean manufacturing.  Automation is a MUST.  You need to embrace lean principles for greater efficiency and compliance.
  • Automate Your Validation Processes – Automation and Lean validation go hand in hand.  The testing process is the most laborious process.  We recommend using a system like ValidationMaster™ to automate requirements management, test management and execution, incident management, risk management, validation quality management, agile validation project management, and validation content management. ValidationMaster™ is designed to power lean validation processes and includes built-in best practices to support this process.
  • Use a Risk-Based Approach To Validation – All validation exercises are not created equal.  The level of validation due diligence required for your project should be based on risk: regulatory, technical and business risks.  Conduct a risk assessment for all cloud-based systems.
  • Adopt Continuous Testing Best Practices – The cloud is under continuous change, which seems in and of itself counter-intuitive to the validation process.  Continuous testing can be onerous if your testing process is MANUAL.  However, if you adopt lean, automated testing processes, regression testing is easy.  You can establish a routine schedule for testing, and if your cloud provider delivers a dashboard that tells you when patches/updates/features have been applied and the nature of them, you can select your regression testing plan based on a risk and impact assessment, as in the sketch following this list.
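
As a simple illustration of tying regression frequency to risk, the hypothetical sketch below maps regulatory, technical and business risk scores to a testing interval.  The scoring scheme and intervals are examples, not a standard; each company must justify its own.

```python
def regression_interval_days(regulatory, technical, business):
    """Map risk scores (1 = low .. 3 = high per dimension) to a regression testing interval."""
    score = regulatory + technical + business   # total ranges from 3 to 9
    if score >= 8:
        return 30    # high risk: monthly regression run
    if score >= 5:
        return 90    # medium risk: quarterly
    return 180       # low risk: twice a year

# An ERP module with high regulatory risk and medium technical/business risk:
print(regression_interval_days(regulatory=3, technical=2, business=2), "days")  # 90 days
```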

Cloud environments can be validated!  A clear, practical approach that embraces lean validation and continuous testing is key, as is cloud governance to ensure data integrity and sustained compliance.

Cloud technologies are here to stay.  Regulators don’t object to the use of the cloud; they want to know how you are managing it and ensuring the integrity of the data.  They also want you to confirm that you are maintaining the validated state in the cloud.  The principles of validation endure in the cloud.  Just because you are in a cloud environment does not mean validation principles no longer apply.  Consider the impact of cybersecurity in your cloud environment and adopt continuous testing strategies to ensure sustained compliance.

Understanding Lean Validation: A Practical Approach

Lean manufacturing has been with us since 1988.  Lean principles were derived from the Japanese manufacturing industry.  The “Lean” process was originally created and adopted by Toyota to eliminate waste and inefficiency in its manufacturing operations.  Lean processes led to the Toyota Production System (TPS), arguably one of the greatest manufacturing success stories of all time.  The focus of lean was the elimination of waste and inefficiencies throughout the manufacturing process.

To identify and eliminate waste from the production process, Toyota believed it was important to understand exactly the nature of waste and where it existed. While Toyota’s products differed significantly between factories, the typical wastes found in manufacturing environments were similar in nature. For each waste, Toyota developed an effective strategy to reduce or eliminate its effects, thereby improving overall performance and quality.  The process became so successful that it has been embraced in manufacturing sectors around the world. Today’s manufacturers have embraced the concepts and philosophies of lean.  Being lean is considered a critical competitive advantage and a strategic imperative.  It has made Toyota an automotive success story.

I have been a validation practitioner for over 30 years.  In the process of validating large engineering systems and life sciences quality and compliance management technologies, I have experienced firsthand the waste involved in validation processes.  From manually cutting and pasting screenshots into test cases, to manually tracing requirements to test scripts, through manual document routing/review processes and rewriting scripts for regression testing, waste abounds in the validation process.  As I pondered my work over the past three decades, I began to think of a better way to validate computer systems.

In my initial research, I started to think of all of the wastes throughout validation processes and categorized them.

[Figure: Categories of validation waste]

Wastes in the validation process include planning process wastes, testing wastes, documentation wastes, defect wastes, wasteful requirements processes, quality and incident management wastes, and wastes associated with the re-validation and re-testing of software applications.  As I began to ponder solutions to eliminate these wastes throughout the validation process, I embarked on the development of a Lean Validation strategy.  We endeavor to save our clients both time and money throughout the validation process.  OnShore Technology Group is the only company whose practice focuses on the delivery of Lean Validation services.  Our products and services are uniquely designed to power lean validation processes.


Using lean validation principles can eliminate wasteful manual validation processes and deliver significant improvements in validation efficiency, document cycle times and testing productivity, along with a greater ability to identify and correct software defects, leading to enhanced software quality, lower costs and improved regulatory compliance.

OnShore Technology Group has pioneered the principles and best practices of “Lean Validation” – the process of eliminating waste and inefficiencies while driving greater software quality throughout the validation process.  In addition to the delivery of expert lean validation services, OnShore’s flagship software application is known as ValidationMaster™ – the FIRST Enterprise Validation Management and Quality system designed to automate lean validation processes.  ValidationMaster™ delivers a single source of truth for any type of validation including software, equipment, process, cold chain, facility, and other types of validation projects.

ValidationMaster™ is also the first validation management system accessible via any Windows, Apple or Android mobile device.  The system includes key features such as a validation dashboard, fully automated test script development and execution, automatic requirements traceability, custom report development and generation, and a full range of quality management capabilities (training, audit management, change management, controlled document management, ISO compliance, validation KPIs, CAPA, nonconformance management and much more).

In 2017, OnShore Technology Group was recognized as the “Best IV&V Automation Solutions Provider” and the “Software Validation Testing Experts of the Year”.  In 2016, OnShore made the coveted annual list of CIOReview Magazine as one of the “20 Most Promising Pharma & Life Sciences Tech Solution Providers”.

Contact us today to learn more and see a live demonstration of ValidationMaster™.

Computer Systems Validation As We Know It Is DEAD

Over the past 10 years, the software industry has experienced radical changes.  Enterprise applications deployed in the cloud, the Internet of Things (IoT), mobile applications, robotics, artificial intelligence, X-as-a-Service, agile development, cybersecurity challenges and other technology trends force us to rethink strategies for ensuring software quality.  For over 40 years, validation practices have not changed very much.  Surprisingly, many companies still conduct computer systems validation using paper-based processes.  However, the trends outlined above challenge some of the current assumptions about validation.  I sometimes hear people say, “… since I am in the cloud, I don’t have to conduct an IQ…” or, “… well, my cloud provider is handling that…”

Issues related to responsibility and testing are changing based on deployment models and development lifecycles.  Validation is designed to confirm that a system meets its intended use.  However, how can we certify that a system meets its intended use if it is left vulnerable to cyber threats?  How can we maintain the validated state over time in production if the cloud environment is constantly changing?  How can we adequately test computer systems if users can download an “app” from an app store to integrate with a validated system?  How can we ensure that we are following proper controls for 21 CFR Part 11 if our cloud vendor is not adhering to CSA cloud controls?  How can we test IoT devices connected to validated systems to ensure that they work safely and in accordance with regulatory standards?

You will not find the answers to any of these questions in any regulatory guidance documents.  Technology is moving at the speed of thought yet our validation processes are struggling to keep up.

These questions have led me to conclude that validation as we know it is DEAD.  The challenges imposed by the latest technological advances in agile software development, enterprise cloud applications, IoT, mobility, data integrity, privacy and cybersecurity are forcing validation engineers to rethink current processes.

Gartner recently reported that the share of firms using IoT grew from 29% in 2015 to 43% in 2016, and projects that by the year 2020 there will be over 26 billion IoT devices.  It should be noted that Microsoft’s Azure platform includes a suite of applications for remote monitoring, predictive maintenance and connected factory monitoring for industrial devices.  Current guidance has not kept pace with ever-changing technology, yet the need for quality in software applications remains a consistent imperative.

So how should validation engineers change processes to address these challenges?

First, consider how your systems are developed and deployed.  The V-model assumes a waterfall approach, yet most software today is developed using Agile methodologies.  It is important to take this into account in your validation approach.

Secondly, I strongly recommend adding two SOPs to your quality procedures: a Cybersecurity SOP for validated computer systems and a Cloud SOP for validated systems.  You will need these two procedures to provide governance for your cloud processes.  (If you do not have a cloud or cybersecurity SOP, please contact me and I will send you both.)

Third, I believe you should incorporate cybersecurity qualification (CyQ) into your testing strategy.  In addition to IQ/OQ/PQ, you should be conducting a CyQ readiness assessment for all validated systems.  A CyQ is an assessment to confirm and document your readiness to protect validated systems against a cyber attack.  It also includes testing to validate current protections for your validated systems.  It is important to note that regulators will judge you on your PROACTIVE approach to compliance.  This is an important step in that direction.


Fourth, you should adopt lean validation methodologies.  Lean validation practices are designed to eliminate waste and inefficiency throughout the validation process while ensuring sustained compliance.

Finally, the time has come for automation.  To keep pace with the changes in current technology as discussed above, you MUST include automation for requirements management, validation testing, incident management and validation quality assurance (CAPA, NC, audit management, training, et al).  I recommend consideration of an Enterprise Validation Management system such as ValidationMaster™ to support the full lifecycle of computer systems validation.  ValidationMaster™  allows you to build a re-usable test script library and represents a “SINGLE SOURCE OF TRUTH” for all of your validation projects.  Automation of the validation process is no longer a luxury but a necessity.

Advanced technology is moving fast.  The time is now to rethink your validation strategies for the 21st century.  Validation as we know it is dead.  Lean, agile validation processes are needed to keep pace with rapidly changing technology.  As you embrace the latest cloud, mobile and IoT technologies, you will quickly find that the old ways of validation are no longer sufficient.  Cyber criminals are not going away, so you need to be ready. Step into LEAN and embrace the future!

How Vulnerable Is Your Validated System Environment?

If you are a validation engineer reading this post, I have a simple question to ask you: “How vulnerable is your validated systems environment?”  Has anyone ever asked you this question?  How often do you think about the impact of a cyber threat against your validated computer system?  I know… you may be thinking that this is the IT department’s responsibility and you don’t have to worry about it.  THINK AGAIN!

If you have picked up a newspaper lately, you will see cyber threats coming from all directions. The White House has been hacked, companies big and small have been hacked, and from the hacker’s perspective there seems to be no distinction between small, medium or large enterprises. In summary, everyone is vulnerable.  A cyber threat is defined as the possibility of a malicious attempt to damage or disrupt a computer network or system.

Networked applications continue to create new business opportunities and power the enterprise. A recent report suggested that the impact and scale of cyber-attacks is increasing dramatically. The recent leak of government-developed malware has given cyber criminals greater capabilities than they had before. IT is struggling to keep pace with the flow of important software security patches and updates, and the continuous adoption of new technologies like the Internet of Things (IoT) is creating new vulnerabilities to contend with.

A 2017 study by CSO highlighted the fact that 61% of corporate boards still see cybersecurity as an IT issue rather than a corporate governance issue. This is only one part of the problem. It is not even on the radar of most validation engineers.  You cannot begin to confirm that a system meets its intended use without dealing with the concept of cybersecurity and its impact on validated systems.

Validated computer systems house GMP and quality information as well as the data required by regulators, thus particular scrutiny must be paid to these systems.  Compromise of a validated system may lead to adulterated product and issues affecting the quality, efficacy and other critical quality attributes of marketed product. Therefore, attention must be paid to validated systems with respect to their unique vulnerability.

As part of the validation lifecycle process, we conduct IQ, OQ and PQ testing as well as user acceptance testing to confirm a system’s readiness for intended use.  I am suggesting that another type of testing be added to the domain of validation testing, called cybersecurity qualification, or CyQ.

Cyber security qualification is confirmation of a system’s readiness to protect against a cyber attack.


You should incorporate CyQ in your validation testing strategy.  It is imperative that your validated systems are protected against a cyber event.  You must document this as well to prove that you have conducted your due diligence.  Given all of the attention to cyber events in the news, you need a strategy to ensure sustained security and compliance.  Are you protecting your validated systems?  If not, you should.