SharePoint Validation: Quality and Compliance Portals

I am often asked the question… “Can SharePoint be validated?”  The short answer is YES, but it often requires customization to deliver compliance objectives.  The longer answer requires further examination of why people ask the question and of the nature of SharePoint as a content management system.  With Office 365® reaching over 100 million active users per month and more companies moving toward the cloud, we are witnessing the maturation of SharePoint for both regulated and non-regulated content management.

SharePoint has undergone many changes over the past decade that have increased its adoption within the life sciences industry.  New features of SharePoint from Microsoft and its robust technology partner community include, but are not limited to:

  • Synchronization with OneDrive For Business®
  • New SharePoint Communication Sites With Pre-Built Layouts
  • Integration of SharePoint and Microsoft Teams
  • New Integration with DocuSign® For Electronic Signatures
  • Enhanced Integration For Graphical Workflows From Nintex®
  • SharePoint-aware PowerApps and Flow
  • Updated Page Layouts and Web Part Enhancements
  • Improved SharePoint Administration
  • Enhanced Document Version Control

Within the life sciences community, resistance to SharePoint has focused on security and the lack of “out-of-the-box” features for life sciences validation.  What are some of the key capabilities that life sciences companies require from a regulated SharePoint enterprise content management system?  A partial list includes document and records management features such as:

  • Intelligent Document Creation Tools
  • Automated Document Change Control
  • Configurable Document Types With Pre-Assigned Document Workflows (based on the type of document, workflows are automatically launched)
  • 21 CFR PART 11 support (electronic or digital signatures, audit trails, et al)
  • Ability to print a Signature Page with Each Signed Document
  • Ability to Establish Pre-defined Automated Document Lifecycle Workflows
  • Support for and designation of Controlled and Uncontrolled Content
  • Controlled Document Management Features Should include Configurable watermarks and overlays
  • Markup tools for document review
  • Ability to classify documents for records management capabilities
  • Ability to assign/tag documents with metadata
  • Content Rendering (when documents are checked in, they are automatically rendered in PDF format for document review.)
  • Custom Document Numbering (the ability to automatically assign alphanumeric document numbers to content)
  • Enforcement of the use of Standard Document Templates Codified Within SOPs
  • Version tracking with major and minor version control, version history
  • Ability to support regulatory submissions and publishing (this is a big one)
  • System MUST BE VALIDATABLE

As you can see from the partial list above, many features required by regulated companies are not standard in SharePoint out of the box.  However, SharePoint offers rich capabilities and features that have significantly reduced the effort needed to deliver a solution with the features listed above.
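To illustrate one such gap, consider custom document numbering from the list above.  The concept is simple even though SharePoint does not provide it natively.  The sketch below is purely illustrative: the `PREFIX-YYYY-NNN` format and the in-memory counter are assumptions for demonstration, not a SharePoint API or any vendor's implementation.

```python
# Hypothetical controlled-document number generator (illustrative only).
# The "PREFIX-YYYY-NNN" format is an assumed convention, not a SharePoint feature.
from datetime import date

class DocumentNumberer:
    def __init__(self):
        self._counters = {}  # (prefix, year) -> last sequence number issued

    def next_number(self, doc_type_prefix: str) -> str:
        year = date.today().year
        key = (doc_type_prefix, year)
        self._counters[key] = self._counters.get(key, 0) + 1
        return f"{doc_type_prefix}-{year}-{self._counters[key]:03d}"

numberer = DocumentNumberer()
print(numberer.next_number("SOP"))  # e.g. SOP-2025-001
print(numberer.next_number("SOP"))  # e.g. SOP-2025-002
```

A production implementation would persist the counter (for example in a SharePoint list) so numbers survive restarts and remain unique across users.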

As a former Documentum and Qumas executive, I know firsthand the challenges of developing such a system from scratch, as my former employers did.  However, leveraging the power of SharePoint, OnShore Technology Group’s ValidationMaster™ Quality and Risk Management portal, for example, is SharePoint-based and includes all of the features listed above.  The level of effort required to deliver the solution was substantially lower thanks to the SharePoint application framework and development tools.

The ability to manage regulatory submissions and publishing is one area where SharePoint is more challenged.  In the Documentum world, there was such a thing as a “Virtual Document”: a document that contained components, or child documents.  A Virtual Document might represent a section of a regulatory dossier, where the header represented the section and several child documents were the individual documents in that section.  Documentum was an object-oriented system, so a single document could be comprised of multiple ACTUAL documents with early and late binding.  Since each component of a Virtual Document is its own document that can be checked in/out and routed independently of the other components, Virtual Documents are ideal for regulatory submission management, which has very specific guidelines for publishing and pagination.  I have not yet seen a parallel for this in SharePoint.

Document management systems used to cost millions of dollars to acquire, implement, and deploy.  These systems are now somewhat “commoditized,” and price points are significantly lower.  Many life sciences companies use SharePoint for non-regulated documentation.  However, an increasing number are abandoning their higher-cost rivals and moving to SharePoint as the foundation for both controlled and uncontrolled documentation.  SharePoint can run in a hosted Office 365 environment or on-premises.  Check out my cloud validation posts for more information on validating SharePoint and other applications in a cloud environment.  Either way, the system can and should be validated if it is used for regulated content management.

It is recommended that you establish a clear set of user requirements for SharePoint.  SharePoint has powerful capabilities well beyond those articulated in this blog post.  Many SharePoint partners, such as Nintex® and DocuSign®, deliver effective, ready-to-use integrations.  Use these partner solutions to help minimize the validation effort.

If you have not already done so, SharePoint is worth a second look for regulated content, depending on your application.  One thing is for sure: the day of the multi-million-dollar content management solution is over for most companies.

Is Your Validation Team Ready For GDPR?

GDPR stands for the General Data Protection Regulation.  It governs all personal data collected by companies for customers, potential customers, employees, and others.  Regulators are keen to understand how this information is managed and maintained over time.

In April 2016 the FDA issued new draft guidance for data integrity and compliance with cGMP. The guidance was issued in a question and answer style format and focused on frequently occurring data integrity lapses. When the FDA finalizes the guidance, it will represent their current thinking on data integrity and cGMP compliance for the industry.

Why did the FDA draft such guidance? The FDA has increasingly observed cGMP violations involving data integrity during inspections. More than 21 warning letters have involved data integrity lapses in drug manufacturing since January 2015. The FDA strongly believes that ensuring data integrity is an important component of the industry’s responsibility to ensure the safety, efficacy, and quality of drugs, and to protect public health and safety overall.

In recent years, many articles have referred to data integrity using the acronym ALCOA, which means that data must be attributable, legible, contemporaneous, original (or a true copy), and accurate. It should be noted that the requirements for record retention and review do not differ depending on the data format: paper-based and electronic record-keeping systems are subject to the very same requirements.  For example, section 211.68 requires that backup data be exact, complete, and secure from alteration, inadvertent erasure, or loss. Section 211.180 requires true copies or other accurate reproductions of the original records.
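The §211.68 requirement that backup data be “exact and complete” can be verified mechanically. Below is a minimal sketch that compares a backup to its original by cryptographic hash; SHA-256 is an assumed choice here, as the regulation does not prescribe a specific algorithm.

```python
# Verify a backup is a byte-for-byte "exact and complete" copy (cf. 21 CFR 211.68).
# SHA-256 is an assumed implementation choice; the regulation names no algorithm.
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large records do not exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_is_exact(original: str, backup: str) -> bool:
    """True only if the backup is byte-identical to the original record."""
    return sha256_of(original) == sha256_of(backup)
```

A check like this, run and logged on a schedule, also produces the documented evidence of backup integrity that an inspector would expect to see.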

Most life sciences companies validate business systems that have GMP impact. It is best practice to conduct installation, operational, and performance qualification testing to demonstrate that a computer system is fit for its intended use and document any incidents that may affect software quality or the reliability of the records. Data integrity and validation go hand in hand but with the latest guidance there’s really nothing new under the sun from a validation perspective. The same level of due diligence and rigor must be applied to confirm that systems are suitable for their intended use and that the data integrity within these systems is sound.

When you are examining data integrity issues it is critically important to look at all aspects of the system including system security and how it is established to ensure that records entered into the system have the highest level of integrity. The FDA recommends that you restrict the ability to alter files and settings within the system to those administrator users that require such access. A recent warning letter cited the failure to prevent unauthorized access or changes to data.

For systems designed in accordance with 21 CFR Part 11, it is critical to understand that audit trails should be independent. I know this doesn’t come as a surprise to many, but I have seen systems where the audit trail could be turned on or off. Let me be clear: all systems designed in accordance with 21 CFR Part 11 must have an independent, computer-generated audit trail that cannot be turned off by ordinary means. In other words, no one should be able to go to a common function within the system and disable the audit trail. The FDA recommends that audit trails capturing changes to critical data be reviewed with each record and before the record’s final approval, and that audit trails be subject to regular review. Recent warning letters have cited, for example, the lack of an audit trail for lab instruments and the fact that audit trails could be turned off. If an audit trail can be turned off, fraudulent activity may occur. It is important to confirm within your systems that the audit trails capture information regarding each record and that they are independent, to ensure data integrity.
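The principle of an independent, non-defeatable audit trail can be sketched as an append-only log: the application can add and read entries, but exposes no path to edit, delete, or switch logging off. This is an illustrative design only, not any specific product’s implementation.

```python
# Illustrative append-only audit trail: there is no API to edit entries,
# delete entries, or disable recording. A design sketch, not a real product.
from datetime import datetime, timezone

class AuditTrail:
    def __init__(self):
        self._entries = []  # internal storage; exposed only as read-only copies

    def record(self, user: str, action: str, record_id: str) -> None:
        # The timestamp is captured by the system, not supplied by the caller.
        self._entries.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "action": action,
            "record_id": record_id,
        })

    def entries(self):
        # Return copies so callers cannot mutate the stored trail.
        return [dict(e) for e in self._entries]
```

In a production system the same principle is typically enforced at the database layer (for example, revoking UPDATE/DELETE privileges on the audit table), so that even administrative application accounts cannot alter history by ordinary means.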

Data integrity is not a new concept, but it is one that is receiving a lot of attention. Compliance with data integrity guidelines is largely common sense for those in the compliance business. Look at data integrity not as the latest buzzword but as a reminder of how important it is to ensure the integrity and authenticity of data established and maintained within validated systems environments. This will go a long way toward ensuring sustained compliance.

Risk Management & Computer Systems Validation: The Power Twins

For as long as systems have been validated, risk has been an inherent part of the process.  Although validation engineers have been drafting risk assessments since the beginning of computer systems validation, many do not understand how crucial this process is to validation overall.

Risk management and validation go hand in hand.  The ISPE GAMP 5® (“GAMP 5”) methodology is a risk-based approach to validation.  GAMP 5 recommends that you scale all validation life cycle activities and associated documentation according to risk, complexity, and novelty.  As shown in the figure below, the key driver for GAMP 5 is science-based quality management of risks.

Drivers for GAMP 5

From a systems perspective, quality risk management, according to GAMP 5 is “…a systematic approach for the assessment, control, communication, and review of risks to patient safety, product quality, and data integrity.  It is an iterative process applied throughout the entire system life cycle…”  The guidance recommends qualitative and quantitative techniques to identify, document and manage risks over the lifecycle of a computer system.  Most importantly, they must be continually monitored to ensure the on-going effectiveness of each implemented control.

Risk assessments may be used to drive the validation testing process.  In our practice, we focus on critical quality attributes and four types of risks:

  • Business Risks
  • Regulatory Risks
  • Technical Risks
  • Cybersecurity Risks
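These four categories can feed a simple qualitative scoring model. The sketch below uses a likelihood × impact product; the 1–5 scales and the high-risk threshold are illustrative assumptions of our practice’s general approach, not values taken from GAMP 5.

```python
# Illustrative risk scoring across the four categories listed above.
# The 1-5 scales and the threshold of 15 are assumptions, not GAMP 5 values.
CATEGORIES = {"business", "regulatory", "technical", "cybersecurity"}

def risk_score(likelihood: int, impact: int) -> int:
    """Qualitative score: likelihood (1-5) times impact (1-5)."""
    return likelihood * impact

def assess(risks):
    """risks: list of (category, likelihood, impact); returns high-risk categories."""
    high = []
    for category, likelihood, impact in risks:
        assert category in CATEGORIES, f"unknown category: {category}"
        if risk_score(likelihood, impact) >= 15:  # assumed high-risk threshold
            high.append(category)
    return high

print(assess([("cybersecurity", 4, 5), ("business", 2, 2)]))  # ['cybersecurity']
```

High-scoring items then receive the deepest validation testing coverage, which is the practical sense in which risk assessments drive the testing process.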

The last type, cybersecurity risk, has not gotten much attention from validation engineers and is not explicitly mentioned in GAMP 5.  However, it represents a clear and present danger to ALL systems, not just validated ones.  Cyber threats are real.  Large-scale cybersecurity attacks continue to proliferate across enterprises, and cyber criminals are broadening their approaches to strengthen their impact.  You need a holistic approach to risk that not only includes the traditional GAMP risk assessment, as highlighted in the figure below, but also incorporates cybersecurity as a real threat and risk to validated systems environments.

Risk Assessment - GAMP 5

Cyber threats most certainly may impact product quality, patient safety and even data integrity.  We incorporate these risks into our risk management profile to provide a more comprehensive risk assessment for computer systems.

ZDNet reports that “…As new security risks continue to emerge, cloud security spending will grow to $3.5 billion by 2021…”

ZDNet June 15, 2017

As life sciences companies increase their adoption of the cloud, new challenges for validated systems environments are emerging, along with the risks that accompany them.  I recently wrote a blog post called “Validation as we know it is DEAD,” in which I addressed the challenges and opportunities that the cloud and cybersecurity bring.  Although cloud security “solutions” will drive spending through 2021, the solution is not necessarily a technology issue.  You can attack cyber risks effectively with STRATEGY.

For validated systems environments, I recommend a cybersecurity plan to combat hackers and to document your level of due diligence.  Remember, in validated systems, “…if it’s not documented, it didn’t happen…”.  You must document your plans and develop an effective strategy for all of your computer systems.

Validation and risk are the power twins of compliance.  Risk management can not only facilitate the identification of controls to protect your systems long term; it can also help ensure compliance and data integrity and drive efficiencies in your lean validation processes.  In the conduct of your risk assessments, do not ignore cybersecurity risks – they are the elephant in the room.

How Vulnerable Is Your Validated System Environment?

If you are a validation engineer reading this post, I have a simple question for you: “How vulnerable is your validated systems environment?”  Has anyone ever asked you this question?  How often do you think about the impact of a cyber threat against your validated computer systems?  I know… you may be thinking that this is the IT department’s responsibility and you don’t have to worry about it.  THINK AGAIN!

If you have picked up a newspaper lately, you will see cyber threats coming from all directions. The White House has been hacked, companies big and small have been hacked, and from the hacker’s perspective there seems to be no distinction between small, medium, or large enterprises. In summary, everyone is vulnerable.  A cyber threat is the possibility of a malicious attempt to damage or disrupt a computer network or system.

Networked applications continue to create new business opportunities and power the enterprise. A recent report suggested that the impact and scale of cyber-attacks are increasing dramatically. The recent leak of government-developed malware has given cyber criminals greater capabilities than they had before. IT is struggling to keep pace with the flow of important software security patches and updates, and the continuous adoption of new technologies like the Internet of Things (IoT) is creating new vulnerabilities to contend with.

A 2017 study by CSO highlighted the fact that 61% of corporate boards still see security as an IT issue rather than a corporate governance issue. This is only one part of the problem. It is not even on the radar of most validation engineers.  You cannot confirm that a system meets its intended use without dealing with cybersecurity and its impact on validated systems.

Validated computer systems house GMP and quality information as well as the data required by regulators, so particular scrutiny must be paid to these systems.  Compromise of a validated system may lead to adulterated product and issues affecting the quality, efficacy, and other critical quality attributes of marketed product. Therefore, attention must be paid to validated systems with respect to their unique vulnerability.

As part of the validation lifecycle process, we conduct IQ, OQ, and PQ testing as well as user acceptance testing to confirm a system’s readiness for intended use.  I am suggesting that another type of testing be added to the domain of validation testing: cybersecurity qualification, or CyQ.

Cyber security qualification is confirmation of a system’s readiness to protect against a cyber attack.
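Since CyQ has no standard protocol, one way to realize it is as a scripted checklist executed and documented like any other qualification protocol. The checks below are illustrative examples of what such a script might assert about a system’s configuration; they are assumptions for demonstration, not an authoritative or complete set.

```python
# Illustrative CyQ (cybersecurity qualification) checklist runner.
# The individual checks and the config keys are example assumptions,
# not an authoritative protocol.

def cyq_run(system_config: dict) -> dict:
    """Evaluate a set of example cybersecurity checks; all must pass."""
    checks = {
        "tls_enforced": system_config.get("tls_enabled", False),
        "audit_trail_on": system_config.get("audit_trail", False),
        "min_password_length": system_config.get("password_min_length", 0) >= 12,
        "patches_current": system_config.get("pending_critical_patches", 1) == 0,
    }
    return {"checks": checks, "passed": all(checks.values())}

result = cyq_run({
    "tls_enabled": True,
    "audit_trail": True,
    "password_min_length": 14,
    "pending_critical_patches": 0,
})
print(result["passed"])  # True
```

Because the runner produces a structured pass/fail record, its output can be attached to the validation package as documented evidence of the CyQ execution.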


You should incorporate CyQ into your validation testing strategy.  It is imperative that your validated systems be protected against a cyber event, and you must document this to prove you have conducted your due diligence.  Given all of the attention to cyber events in the news, you need a strategy to ensure sustained security and compliance.  Are you protecting your validated systems?  If not, you should be.