Women in Validation: In Case You Missed It

One of my annual “must attend” events is the Institute of Validation Technology (IVT) Computer Systems Validation Annual Validation Week in October.  This is an excellent event to get the latest and greatest thinking on validation topics and industry guidance.  An official from the U.S. FDA attended the event and provided key information on the latest data integrity guidance and the reorganization underway at the FDA.

The highlight of the event for me was a newly added session called the “Women in Validation Empowerment Summit”.  The panel featured yours truly along with other talented women in validation.  The goal of the panel was to discuss some of the unique challenges women face in STEM professions and how to overcome them.  The panel was moderated by Roberta Goode of Goode Compliance International.  She led panelists through a lively discussion of topics ranging from career building and job challenges to workplace difficulties and much more.  You can see a synopsis of what we discussed here.

I am a fierce advocate of STEM. There are women who are “hidden figures” in STEM professions such as computer systems validation who don’t get the recognition they so richly deserve or the attention of most corporate executives.  This session was about EMPOWERMENT.

There are many “hidden figures” in the field of validation but no real community surrounding them.  I left the session most impressed with my accomplished co-panelists.  Each and every one of them had tremendous insights on their career paths and stories to tell as to how they moved up the ladder.  The talk was real.  It was down to earth.  It was practical.  I didn’t want it to end!

A couple of troubling statistics highlighted during the session related to the number of undergraduate degrees earned by women in STEM.  According to the World Economic Forum, women earn “only 35 percent of the undergraduate degrees in STEM”.  The sad truth is that this number has remained unchanged for over a decade.  Look at the statistics for women in engineering from the National Science Foundation.

Women in Engineering

I graduated in Civil and Environmental Engineering from the University of Wisconsin – Madison in 1982.  I was one of only 3 women in my class at the time.  You can see from the graph above that at the doctorate level, women drop off precipitously.

At the session, we discussed gender bias, stereotypes and other factors that affect women in our profession.  It was an excellent forum and one I hope they continue.  In case you missed it, the session was enlightening and fun for all!   See you in October!

Is Your Validation Team Ready For GDPR?

GDPR stands for the General Data Protection Regulation.  It governs all personal data that companies collect about customers, potential customers, employees, and others, and regulators are keen to understand how this information is managed and maintained over time.

After much fanfare, the GDPR was approved by the EU Parliament on April 14, 2016, with an enforcement date of May 25, 2018.  Companies that are not in compliance by that date face the potential for heavy fines.  The GDPR replaces the Data Protection Directive 95/46/EC and was designed primarily to harmonize data privacy laws across Europe, to protect and empower all EU citizens’ data privacy, and to reshape the way organizations across the region approach data privacy.  The regulation has a profound impact on businesses that operate in the EU: maximum penalties may be as high as 4% of annual global turnover or €20 million (whichever is higher).

In recent years, we have seen massive data breaches at companies that have exposed private and other sensitive information without consent.  Many of these breaches have been due to cyber-attacks against companies of all sizes, and the newspapers are full of them.  The GDPR requires that breaches be reported to the relevant regulator without undue delay and, where feasible, within 72 hours of the organization becoming aware of the breach, unless the breach is unlikely to result in a risk to the rights and freedoms of individuals.  Data subjects must be informed without undue delay when a breach is likely to result in a high risk to their rights and freedoms, unless the data has been rendered unintelligible to any third party (for example, by encryption).  Data processors are required to inform data controllers of any breach without undue delay.

What does all this mean for validated systems?

If you operate in the EU and your validated systems include sensitive data or data of a personal nature, such as patient information, you are subject to the requirements of the GDPR.  You also need to look at data integrity and security practices around the validated system.  We strongly recommend the Cybersecurity Qualification (CyQ) discussed in a previous post.  The CyQ assesses a firm’s readiness to protect itself against cyber-attack, and it can go a long way toward meeting the requirements of the GDPR, since it requires documentation of your security controls.

I recommend reading the GDPR and becoming familiar with it before May 2018.  Assess the controls within your validated systems environment to determine how vulnerable your systems really are and how ready you are to comply with this regulation.  I assure you more will be forthcoming on this topic in the months to come.  WATCH THIS SPACE.

What Data Integrity Means For Validation

In April 2016, the FDA issued new draft guidance on data integrity and compliance with cGMP.  The guidance was issued in a question-and-answer format and focused on frequently occurring data integrity lapses.  When the FDA finalizes the guidance, it will represent the agency’s current thinking on data integrity and cGMP compliance for the industry.

Why did the FDA draft such guidance?  The agency has increasingly observed cGMP violations involving data integrity during inspections: more than 21 warning letters issued since January 2015 have involved data integrity lapses in drug manufacturing.  The FDA strongly believes that ensuring data integrity is an important component of the industry’s responsibility to ensure the safety, efficacy, and quality of drugs, and to protect public health and safety overall.

In recent years, many articles have referred to data integrity using the acronym ALCOA, which means that data must be attributable, legible, contemporaneous, original (or a true copy), and accurate.  Note that the requirements for record retention and review do not differ depending on the data format: paper-based and electronic record-keeping systems are subject to the very same requirements.  For example, 21 CFR 211.68 requires that backup data be exact, complete, and secure from alteration, inadvertent erasure, or loss, and section 211.180 requires true copies or other accurate reproductions of the original records.
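
To make “exact and complete” concrete, one simple technical control is to compare cryptographic checksums of original records against their backup copies.  The sketch below is an illustration only (the paths and function names are hypothetical, not drawn from the regulation): it flags any backup that is missing or no longer bit-for-bit identical to the original.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backup(original_dir: Path, backup_dir: Path):
    """Return the names of records whose backup copy is missing or altered."""
    failures = []
    for record in sorted(original_dir.iterdir()):
        if not record.is_file():
            continue
        copy = backup_dir / record.name
        if not copy.exists() or sha256_of(record) != sha256_of(copy):
            failures.append(record.name)
    return failures

# Hypothetical usage:
# bad = verify_backup(Path("/gmp/records"), Path("/gmp/backups"))
# assert not bad, f"Backup integrity check failed for: {bad}"
```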

Most life sciences companies validate business systems that have GMP impact.  It is best practice to conduct installation, operational, and performance qualification testing to demonstrate that a computer system is fit for its intended use, and to document any incidents that may affect software quality or the reliability of the records.  Data integrity and validation go hand in hand, but with the latest guidance there is really nothing new under the sun from a validation perspective: the same level of due diligence and rigor must be applied to confirm that systems are suitable for their intended use and that the integrity of the data within them is sound.

When you are examining data integrity issues, it is critically important to look at all aspects of the system, including how system security is established, to ensure that records entered into the system have the highest level of integrity.  The FDA recommends that you restrict the ability to alter files and settings to those administrator users who require such access.  A recent warning letter cited the failure to prevent unauthorized access or changes to data.

For systems designed in accordance with 21 CFR Part 11, it is critical to understand that audit trails must be independent.  I know this doesn’t come as a surprise to many, but I have seen systems where the audit trail could be turned on or off.  Let me be clear: every system designed in accordance with 21 CFR Part 11 must have a computer-generated audit trail that cannot be turned off by ordinary means; no one should be able to go to a common function within the system and disable it.  The FDA recommends that audit trails capturing changes to critical data be reviewed with each record and before final approval of the record, and that audit trails be subject to regular review.  Recent warning letters have cited, for example, the lack of an audit trail for lab instruments and the fact that audit trails could be turned off.  If an audit trail can be turned off, fraudulent activity may occur.  It is important to confirm that the audit trails in your systems capture information for each record and that they are independent, to ensure data integrity.

Data integrity is not a new concept, but it is one that is receiving a lot of attention.  Compliance with data integrity guidelines is largely common sense for those in the compliance business.  Look at data integrity not as the latest buzzword but as a reminder of how important it is to ensure the integrity and authenticity of the data established and maintained within a validated systems environment.  This will go a long way toward ensuring sustained compliance.

Automating Validation Testing: It’s Easier Than You Think

Automated validation testing has been elusive for many in the validation community.  There have been many “point solutions” on the market that address the creation, management, and execution of validation testing.  However, what most validation engineers want is TRULY AUTOMATED validation testing that rigorously interrogates an application and reports results in a way that not only provides objective evidence against pass/fail criteria but also highlights each point of failure.

In the 1980s, when I was conducting validation exercises for mini- and mainframe computers and drafting test scripts in a very manual way, I often envisioned a time when I would be able to conduct validation testing in a more automated fashion.  Most validation engineers work in an environment where they are asked to do more with less, so the need for such a tool is profound.

Cloud computing environments, mobility, cybersecurity, and data integrity imperatives make it essential that we test applications more thoroughly today.  Yet the burden of manual testing persists.  An automated testing system should include the following five key features:

  • Automated Test Script Procedure Capture and Development
  • Automated Requirements Traceability
  • Fully Automated Validation Test Script Execution
  • Automated Incident Capture and Management
  • Ability to Support Continuous Testing in the Cloud

In most validation exercises I have participated in, validation testing was the most laborious part of the exercise.  Automated testing is easier than you think.

For example, ValidationMaster™ includes an automated test engine that captures each step of your qualification procedure and annotates the step with details of the action performed.

Test cases can be routed for review and pre-approval within the system quickly and easily through DocuSign.  Test case execution can be conducted online, and a dynamic dashboard reports how many test scripts have passed, how many have failed, and which ones passed with exceptions.  Once test scripts have been executed, they may be routed for post-approval and signed.

Within the ValidationMaster™ system, you can create a reusable test script library to support future regression testing efforts.  The system allows users to link requirements to test cases, facilitating the easy generation of forward and reverse trace matrices.  Exporting documents in your UNIQUE format is a snap, so you can consistently comply with your internal documentation procedures.

ValidationMaster™ supports fully automated validation testing, allowing users to set a date and time for testing.  Test scripts are run AUTOMATICALLY, without human intervention, and the same scripts can be run multiple times if necessary.
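
The unattended-execution concept itself is simple.  The sketch below is not ValidationMaster™’s engine; it is a stand-in built on Python’s standard library, with a made-up script ID, but it shows the idea: a test script is registered with a run time and then executes with no human intervention, logging the outcome of each run.

```python
import sched
import time
from datetime import datetime

scheduler = sched.scheduler(time.time, time.sleep)
run_log = []  # each entry: (script name, start time, result)

def run_script(name, steps):
    """Execute each step of a test script; an AssertionError fails the run."""
    started = datetime.now()
    try:
        for step in steps:
            step()
        run_log.append((name, started, "PASS"))
    except AssertionError:
        run_log.append((name, started, "FAIL"))

# Stand-in steps for a hypothetical "OQ-042" script; real steps would drive
# the application under test and assert on what they find.
steps = [lambda: None, lambda: None]

scheduler.enter(5, 1, run_script, argument=("OQ-042", steps))  # run in 5 seconds
scheduler.run()  # blocks until the scheduled run completes, unattended
print(run_log)
```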

Continuous testing in a cloud environment is ESSENTIAL.  You must have the ability to respond to rapid changes in the cloud that may impact the validated state of the system.  Continuous testing reduces risk and ensures sustained compliance.

The system automatically raises an incident report if a bug is encountered during automated testing, and it keeps track of each test run and its results.  ValidationMaster™ includes a dynamic dashboard that shows the pass/fail status of each test script as well as requirements coverage, open risks, incident trend analysis, and much more.

The time is now for automated validation testing.  The good news is that there are enterprise level applications on the market that facilitate the full validation lifecycle process.  Why are you still generating manual test scripts?  Automated testing is easier than you think!

Saving Time and Money Through Lean Validation

The principles and best practices of lean manufacturing have served life sciences manufacturers well.  Lean is all about optimizing processes, while eliminating waste (“Muda”) and driving greater efficiencies.  As a 30-year validation practitioner, I have validated many computer systems, equipment and processes.  One of the key lessons learned is that there is much room for improvement across the validation process.

OnShore Technology Group is a pioneer in lean validation and has developed principles and best practices to support lean validation processes.  To power our lean processes, we leverage ValidationMaster™, an Enterprise Validation Management system exclusively designed to facilitate lean validation and automate the validation process.

So what is lean validation and how is it practically used?

Lean validation is the process of eliminating waste and inefficiencies while driving greater software quality across the validation process through automation.

Lean Validation

Lean validation cannot be achieved without automation. Lean validation processes leverage advanced technology designed to fulfill the technical, business, and compliance requirements for software validation, eliminating manual, paper-based processes. Optimized validation processes are deployed using the latest global best practices to ensure the right amount of validation rigor based on critical quality attributes, risk, cybersecurity, and other factors.

The lean validation process begins with a lean validation project charter, which describes the process and defines key performance indicators (KPIs) such as “reduce validation costs by $X per year” or “reduce software errors by X%”.  The charter should also define the project scope, dependencies, metrics, and resources.
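
For illustration, a minimal charter might be captured as structured data.  Every name and target below is hypothetical:

```python
lean_validation_charter = {
    "process": "Computer systems validation for ERP release upgrades",
    "kpis": {
        "reduce_validation_cost_usd_per_year": 150_000,  # example target
        "reduce_software_errors_pct": 25,                # example target
    },
    "scope": "OQ/PQ test development, execution, and reporting",
    "dependencies": ["controlled document portal", "trained test engineers"],
    "metrics": ["test script cycle time", "defects found per test run"],
    "resources": ["validation lead", "two test engineers", "QA reviewer"],
}
```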

There are five principles of lean validation, derived from lean manufacturing principles.  Lean validation is powered by people, processes, and technology, and automation drives lean validation processes.  The LEAN VALIDATION value principles are illustrated in the figure below.

Lean Validation Principles

Principle 1 – VALUE – Lean thinking in manufacturing begins with a detailed understanding of the value the customer assigns to products and services.  Lean thinking from an independent validation and verification perspective begins with a detailed understanding of the goals and objectives of the validation process and its adherence to compliance objectives.  The principle of VALUE therefore requires the validation team to focus on eliminating waste to deliver value to the end customer (your organization) in the most cost-effective manner.  The computer systems validation process is designed to assure that software applications meet their intended use.  The value derived from the process is greater software quality and an enhanced ability to identify software defects, the result of greater focus and the elimination of inefficient and wasteful processes. AUTOMATION IS THE FOUNDATION THAT FACILITATES THE ACHIEVEMENT OF THIS VALUE PRINCIPLE.

Principle 2 – VALUE STREAM – The value stream, from a lean perspective, is the comprehensive product life cycle from raw materials through the customer’s end use and the ultimate disposal of the product.  To effectively eliminate waste (the ultimate goal of lean validation), there must be an accurate and complete understanding of the value stream.  Validation processes must be examined end-to-end to determine what each step contributes to the objective of establishing software quality and compliance, and any process that does not add value should be eliminated.  We recommend value stream mapping of the validation process to understand where value is added and where non-value-added steps can be cut.  Typical “Muda”, or wastes, commonly revealed by validation process mapping are:

  • Wasteful Legacy Processes (“we have always done it this way”)
  • Processes That Provide No Value To Software Quality At All
  • Manual Process Bottlenecks That Stifle Processes

Principle 3 – FLOW – The lean manufacturing principle of flow is about creating a value chain with no interruptions in the production process, a state where each activity is fully in step with every other.  A comprehensive assessment and understanding of flow throughout the validation process is essential to eliminating waste.  From a validation perspective, optimal flow is created when, for example, users can automatically generate test scripts from requirements with automated traceability, eliminating the chore of manually tracing each test script to a requirement.  Another example is when a user navigates through a software application and the test script is generated automatically; once generated, it is published to a document portal where it is routed electronically for review and approval.  All of this requires AUTOMATION to achieve the principle of FLOW.  For process optimization and quality control throughout the validation lifecycle, information should flow through the process efficiently, minimizing process and document bottlenecks while maintaining traceability throughout.
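
To see how automated traceability falls out of simply recording links, consider this sketch with hypothetical requirement and test IDs: the forward trace maps each requirement to its covering tests, the reverse trace maps each test back to its requirements, and uncovered requirements surface immediately.

```python
# Each test script declares the requirements it verifies (hypothetical IDs)
test_coverage = {
    "TS-001": ["URS-001", "URS-002"],
    "TS-002": ["URS-003"],
}
requirements = ["URS-001", "URS-002", "URS-003", "URS-004"]

# Forward trace: requirement -> covering test scripts
forward = {req: [ts for ts, covered in test_coverage.items() if req in covered]
           for req in requirements}

# Reverse trace: test script -> requirements (the declared mapping itself)
reverse = test_coverage

untraced = [req for req, tests in forward.items() if not tests]
print(untraced)  # ['URS-004'] -- a coverage gap the process would flag
```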

Principle 4 – PULL – A pull system in lean manufacturing is used to reduce waste in the production process: components are replaced only once they have been consumed, so companies make only enough product to meet customer demand.  There is much waste in the validation process.  A PULL strategy for validation may be used to reduce waste by eliminating duplication of effort, streamlining test case development and execution, and routing and approving documents with electronic signatures.  Check out our blog, “The Validation Post”, for more information.

Principle 5 – PERFECTION – Validation processes are in constant pursuit of continuous improvement, and automation is KEY.  Lean validation engineers and quality professionals relentlessly drive for perfection. Step by step, validation engineers must identify the root causes of software issues, anomalies, and quality problems that affect the suitability of a system for production use. As computing environments evolve and become more complex and integrated, validation engineers must seek new, innovative ways to verify software quality and compliance in today’s advanced systems. Perfection cannot easily be achieved through manual processes.

AUTOMATION IS REQUIRED TO TRULY REALIZE THE VISION OF THIS PRINCIPLE.

You can save time and money through lean.  Consider the first two principles, Value and Value Stream.  Many validation engineers treat validation merely as a regulatory requirement or “necessary evil” rather than as a best practice designed to save time and money.  I think we can all agree that, in the long run, quality software saves time and money through regulatory cost avoidance and more efficient quality processes.

It is important for validation engineers to consider not only quality and compliance but also the cost savings that may be gained throughout the validation process.  Lean validation is a strategy whose time has come.  How much can lean save?  End users report savings of about 40-60% on test script development and 60-70% on regression testing efforts.  Regulatory cost avoidance can be significant, depending on each company’s level of compliance.  Savings may also be realized by minimizing the amount of paper generated through the validation process.

Embrace LEAN VALIDATION and experience the benefits of saving time and money.

Staffing Your Next Validation Project: What You Should Know

Finding good talent is always a challenge.  In the validation world, the unique skill sets required for success can be difficult to find, and filling them takes diligence.  In today’s competitive environment, much of the best validation talent is gainfully employed, but you can always find good people if you know where to look and what you are looking for.

When seeking validation talent for your next project, there are several points to consider.  Specifically, you should consider outsourcing as a means to acquire qualified experienced talent on demand.  Outsourcing talent acquisition and services offers your firm several key benefits:

  • Provides deep industry domain experience on demand
  • Frees up your internal team to focus on what they do best
  • Saves 30-40% on outsourced talent
  • Minimizes recruitment delays and costs
  • Eliminates overhead costs (unemployment insurance, taxes, healthcare, labor law costs)
  • Scales with your needs – add or remove resources as required

OnShore Technology Group offers a unique staffing service called ValidationOffice™.  This unique service offering delivers personnel with deep industry domain knowledge in life sciences and the field of validation.  Our validation engineers are adept at navigating the ever-changing regulatory landscape and understand well the principles and best practices for validation in a highly regulated environment.  Each candidate has over a decade of experience in enterprise validation projects and can work on any size project.

When looking to outsource part or all of your validation team, it is important that you have a team that is responsive, qualified and flexible enough to meet your demands.

Validation is a highly regulated process requiring rigorous training and experience to help ensure success.  ValidationOffice™ delivers staffing on demand for validation project resources.

WHAT YOU SHOULD KNOW

If you have ever worked with an outsourced validation firm, below are 5 things you should know.

FIRST THING YOU SHOULD KNOW

ValidationOffice™ can be used to provide temporary or permanent experienced validation staff.  According to the American Staffing Association (ASA), staffing companies hire millions of temporary and contract employees each year, and around 33% of contract employees are eventually offered permanent jobs.  If you are seeking validation talent, you should know that there are options that reduce the risk of hiring mistakes and let you evaluate a potential candidate on the job before making a permanent hiring decision.

SECOND THING YOU SHOULD KNOW

We specialize in validation talent.  We can provide validation test engineers, validation project managers, validation program managers, validation technical engineers, technical writers, and other relevant skill sets.  Our candidates come from the ranks of leading companies and have the skills and know-how to hit the ground running.  A good staffing partner is essential for overseeing staffing needs as they arise; you should know that you have a partner acting on your behalf.

THIRD THING YOU SHOULD KNOW

Understanding how we recruit and retain talent is critical.  We offer a simplified, streamlined staffing process.  Hiring the right person means we must be intimately familiar with your hiring requirements and business needs, so we listen.  Hiring the wrong person is an expensive proposition.  We provide just-in-time access to qualified candidates ready to work in your office environment.

FOURTH THING YOU SHOULD KNOW

You should know that your temporary staff members are taken care of administratively.  We use an applicant tracking system to track candidate activity and a customer relationship management system for business development.  Technology drives our business processes to help us deliver the best candidates for you.  You should know that our systems provide efficiencies for you and the job candidates.

FIFTH THING YOU SHOULD KNOW

Finally, you should know that not all staffing firms are created equal.  We are very selective and dedicated to bringing you the very best in validation talent.  Our services can be beneficial for both companies and prospective employees.  We focus on your objectives to deliver the best and brightest to serve your needs.

For your next project, you may want to consider getting validation talent on demand.  Contact us for more details.

Why Are You Still Generating Validation Test Scripts Manually?

Drafting validation test scripts is one of the key activities in a validation exercise, designed to provide documented evidence that a system performs according to its intended use.  The FDA and other global agencies require objective evidence, usually in the form of screenshots that sequentially capture the target software process, to provide assurance that systems can consistently and repeatedly perform the various processes representing the intended use of the system.

Since the advent of the PC, validation engineers have been writing validation test scripts manually.  The manual process of test script development involves capturing screenshots and pasting them into Microsoft Word test script templates; to generate the screen captures, some companies use tools such as the Windows Print Screen function, TechSmith Snagit, and the like.  A chief complaint of many validation engineers is that the test script development process is slow and arduous, and some are reluctant to update and re-validate systems because of it.  So, the question posed by this blog article is simply this: “Why are you still generating test scripts manually?”

I have been conducting validation exercises for engineering and life sciences systems since the early 1980s, and I have experienced first-hand the pain of writing test scripts manually.  Because we developed and practice “lean validation”, I sought ways to eliminate manual, wasteful validation processes, and one of the most wasteful of all is the manual capture, cutting, and pasting of screenshots into a Microsoft Word document.

The obvious follow-up question is: “How do we capture validation tests in a more automated manner to eliminate waste and create test scripts that are complete, accurate, and provide the level of due diligence required for validation?”

In response to this common problem, we developed an Enterprise Validation Management system called ValidationMaster™.  This system includes TestMaster™, an automated testing system that allows validation engineers to capture and execute validation test scripts in a cost-effective manner.

TestMaster™ is designed to validate ANY enterprise or desktop application.  It is a browser-based system that allows test engineers to open any application on their desktop, launch TestMaster™, and capture their test scripts while sequentially executing the various commands in their applications.  As the validation engineer navigates through the application, TestMaster™ captures each screenshot and each text entry made in the application.

Once the test script is saved, TestMaster™ allows the script to be published in your UNIQUE test script template with the push of a button.  No more cutting and pasting screenshots!  You can generate your test scripts in MINUTES, as opposed to the hours it sometimes takes to compile documents from a series of screenshots.  And if you are a validation engineer who does not like screenshots in scripts, you can just as quickly create text-based test steps using TestMaster™.
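
The underlying mechanics are worth seeing once.  The sketch below is not TestMaster™’s implementation; it is a bare-bones illustration of the same idea using Selenium and python-docx, with a made-up URL and document name: wrap every UI action so that a screenshot and a step description land in a Word document automatically.

```python
from docx import Document
from docx.shared import Inches
from selenium import webdriver

driver = webdriver.Chrome()
doc = Document()
doc.add_heading("Test Script: Login Verification (example)", level=1)
step_no = 0

def step(description, action):
    """Perform one test action, then record the step with screenshot evidence."""
    global step_no
    step_no += 1
    action()
    shot = f"step_{step_no}.png"
    driver.save_screenshot(shot)
    doc.add_paragraph(f"Step {step_no}: {description}")
    doc.add_picture(shot, width=Inches(6))

step("Navigate to the login page", lambda: driver.get("https://example.com/login"))
step("Confirm the login page is displayed", lambda: None)  # a real step would assert on driver.title

doc.save("executed_test_script.docx")
driver.quit()
```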

So, what are the biggest benefits of using TestMaster™ versus manual processes?  There are four key benefits, summarized as follows:

  1.  Automated Test Script Execution – for years, validation engineers have wanted a more automated approach to executing validation test scripts.  ValidationMaster™ supports both hands-on and hands-off validation testing.  Hands-on testing is the process whereby a validation engineer looks at each step of a validation test script and executes the script step by step, clicking through the process.  Hands-off testing allows a validation engineer to execute a test script with no human intervention: the engineer simply selects a test script and defines a date/time for its execution, and at the designated time the system executes the script and reports the results back.  This type of hands-off regression testing is very useful for cloud-based systems or systems that require more frequent testing.  Automated testing is here!  Why are you still doing this manually?

  2.  Traceability – TestMaster™ allows validation engineers to link each test script to a requirement or set of requirements; the system thus delivers automatic traceability, which is a regulatory requirement.  With the click of a button, TestMaster™ also allows validation engineers to create a test script directly from a requirement.  This is a powerful capability that lets you see requirements coverage on demand through our validation dashboard, viewable on a desktop or mobile device (Windows, Apple, Android).

  3.  Online Test Script Execution & Approval – one of the biggest problems with manual test scripts is that they must be printed and manually routed for review and approval.  Companies that have implemented document management systems may be able to route the scripts electronically; the worst-case scenario is the company that has no electronic document management system and produces these documents entirely by hand.  TestMaster™ allows validation engineers to execute test scripts online and capture the results with ease: results are captured automatically and published into executed test script templates quickly and easily.  If incidents (bugs/anomalies) occur during testing, users can automatically capture an incident report tied to the exact step where the anomaly occurred.  ValidationMaster™ is also tightly integrated with a 21 CFR Part 11-compliant portal (ValidationMaster Portal™); once a test script is executed, it is automatically published to the portal, where it is routed for review and approval.  The ability to draft, route, review, approve, execute, and post-approve validation test scripts is an important time- and cost-saving feature that should be part of any 21st-century validation program.

  4.  Reusable Test Scripts For Regression Testing – manual test scripts are not readily reusable: the Word documents must be edited or even rewritten for regression testing.  Validation is not a one-time process, and regression testing is a fact of life for validation engineers.  The question is, will you rewrite all of your test scripts or use automated tools to streamline the process?  ValidationMaster™ allows validation engineers to create a reusable test script library containing all the test scripts that make up your validation test package; during re-validation exercises, you can reuse the same scripts for regression testing, as sketched below.
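
The reuse idea in the last point can be pictured as a simple script store; the structure below is illustrative, not the product’s schema.  The same scripts are pulled back out, unchanged, for each regression cycle, and every run is recorded.

```python
from datetime import datetime

class TestScriptLibrary:
    """A minimal reusable test script store with per-script run history."""

    def __init__(self):
        self.scripts = {}   # script_id -> list of callable test steps
        self.history = {}   # script_id -> list of (timestamp, result) tuples

    def register(self, script_id, steps):
        self.scripts[script_id] = steps
        self.history[script_id] = []

    def run(self, script_id):
        """Re-execute a stored script, e.g., during regression testing."""
        try:
            for step in self.scripts[script_id]:
                step()
            result = "PASS"
        except AssertionError:
            result = "FAIL"
        self.history[script_id].append((datetime.now(), result))
        return result

library = TestScriptLibrary()
library.register("PQ-010", [lambda: None])  # stand-in test steps
library.run("PQ-010")   # initial validation run
library.run("PQ-010")   # later regression run reuses the same script
print(library.history["PQ-010"])
```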

Given the rapid adoption of cloud, mobile, and enterprise technologies in life sciences, a new approach to validation is required.  Yes, you can still conduct validation exercises on paper, but why would you?  In the early days of enterprise technology, we did not have tools to facilitate the rapid development of validation test scripts; today, that is not the case.  Systems like ValidationMaster™ are available in either hosted or on-premise environments.  These game-changing systems are revolutionizing the way validation is conducted and offer time- and cost-saving features that make the process easier.  So why are you still generating test scripts manually?

Accelerating Validation in the 21st Century

Each day, in companies across the globe, employees are being asked to do more with less.  The mantra of the business community in the 21st century is “accelerating business”; you see it in marketing and all types of corporate communication.  In the validation world, accelerating system implementation, validation, and deployment is the cry of every chief information officer and senior executive.

Accelerating validation means different things to different people.  I define it as driving validation processes efficiently and effectively without sacrificing quality or compliance.  Most manual processes are paper-based, inefficient, and cumbersome: creating paper-based test scripts requires a lot of cutting and pasting of screenshots and other objective evidence to meet regulatory demands, and getting validation information into templates with proper headers and footers, in compliance with corporate documentation guidelines, is paramount.

The question for today is “HOW DO WE ACCELERATE VALIDATION WITHOUT SACRIFICING QUALITY AND COMPLIANCE?”

Earlier in my career, I developed “Validation Toolkits” for Documentum and Qumas.  A validation toolkit was an early version of a validation accelerator, and many companies now have them.  These toolkits come with pre-defined document templates and test scripts designed to streamline the validation process.  They are called “toolkits” for a reason: they are not intended to be completed validation projects, but a starting point for validation exercises.

The FDA states in the General Principles of Software Validation; Final Guidance For Industry and FDA Staff issued on January 11, 2002, “…If the vendor can provide information about their system requirements, software requirements, validation process, and the results of their validation, the medical device manufacturer (or any company) can use that information as a beginning point for their required validation documentation…”

Notice that the FDA says to use that information as “a beginning point” for validation – NOT TO USE IT EXCLUSIVELY AS THE COMPLETE VALIDATION PACKAGE.  This is because the FDA expects each company to conduct its own due diligence for validation.  Also note that the FDA allows the use of such documentation when it is available from a vendor.

Beware that some vendors believe their “toolkits” can substitute for rigorous review of their software applications.  They cannot.  In addition to leveraging software vendor information, you should ALWAYS conduct your own validation due diligence to ensure that applications are thoroughly examined, without bias, by your team or a designated validation professional.

It is an excellent idea to leverage vendor information when it is provided.  There is a learning curve with most enterprise applications, and vendor-supplied information can help streamline the process and add value.  The old validation toolkits, however, were delivered with a set of document templates and pre-defined test scripts, often resulting in hundreds of documents depending on the application.  In most cases, companies would edit the documents or move them into their own pre-defined validation templates for consistency.  This resulted in A LOT OF EDITING!  In some cases, depending on the number of documents, the time savings could be eroded entirely by editing the existing documents.

THERE IS A BETTER WAY!  

What if YOUR unique validation templates were pre-loaded into an automated Enterprise Validation Lifecycle Management system?  What if the Commercial-off-the-Shelf (COTS) product requirements were pre-loaded into this system?  Further, what if a full set of OQ/PQ test scripts were traced to each of the pre-loaded requirements?  What if you were able to export all of these documents, from day one, in your unique validation templates?  What if you only had to add test scripts representing the CHANGES you were making to a COTS application, rather than writing all of your test scripts from scratch?  Finally, what if you could execute your test scripts online and automatically generate a requirements trace matrix and a validation summary report with PASS/FAIL status?

This is the vision of the CloudMaster 365™ Validation Accelerator.  CloudMaster 365™ is the next generation of the validation toolkit concept, with the ValidationMaster™ Enterprise Validation Management system at its core.  The application can be either hosted or on-premise.  This is a great way to jump-start your Microsoft Dynamics 365® validation project.

CloudMaster 365

The CloudMaster 365™ Validation Accelerator includes three key components:

  • ValidationMaster™ Enterprise Validation Management System
  • Comprehensive set of user requirements for out-of-the-box features
  • Full set of IQ/OQ/PQ test scripts and validation document templates

The Validation Accelerator is not intended to be “validation out of the box”.  It delivers the foundation for your current and future validation projects, enabling “Level 5” validation processes.  On the validation capability maturity model, most validation processes sit at Level 1: ad hoc, chaotic, undefined, and reactive to project impulses and external events.

Validation Maturity Model

Level 5 validation is an optimized, lean validation process facilitated by automation: the test process is optimized, quality control is assured, and there are specific measures for defect prevention and management.  Level 5 cannot be achieved without automation, and that automation is powered by ValidationMaster™.  With the CloudMaster 365™ strategy, instead of “toolkit” documents suitable for only one project, you have a system that manages your COTS or bespoke application and the full lifecycle of validation over time, as regulators require.  This is revolutionary in terms of what you get.

CloudMaster 365™ TRULY ACCELERATES VALIDATION.  It can be used not only for Microsoft Dynamics 365® but also for other applications, including:

  • CloudMaster 365™ Validation Accelerator For Dynamics 365®
  • CloudMaster 365™ Validation Accelerator For Merit MAXLife®
  • CloudMaster 365™ Validation Accelerator For Yaveon ProBatch®
  • CloudMaster 365™ Validation Accelerator For Oracle e-Business® or Oracle Fusion®
  • CloudMaster 365™ Validation Accelerator For SAP®
  • CloudMaster 365™ Validation Accelerator For BatchMaster®
  • CloudMaster 365™ Validation Accelerator For Edgewater EDGE®
  • CloudMaster 365™ Validation Accelerator For VeevaVault®
  • CloudMaster 365™ Validation Accelerator For Microsoft SharePoint®

Whether you are validating an enterprise application or seeking to drive your company to higher levels of efficiency, you should consider how best to accelerate your validation processes, not solution by solution, but in a holistic way that ensures sustained compliance across all applications.  If you are embarking on the validation of Microsoft Dynamics 365® or any of the applications listed above, the CloudMaster 365™ Validation Accelerator is a no-brainer: it is the most cost-effective way to streamline validation and ensure sustained compliance over time.

5 Ways to Revive Your CSV Validation Process in 2018

If you are like most veteran validation engineers, you have been conducting validation exercises the same old way.  You have been gathering user requirements, conducting risk assessments, developing test scripts (most often on paper) and delivering validation documentation and due diligence as required by current global regulations.

Validation processes have not changed very much over the last 40 years (unless you count ISPE GAMP® as revolutionary).  However, the systems we validate, and the state of play with respect to cybersecurity, data integrity, mobility, and compliance, have changed in a profound way.  Cloud-based applications are seeing increasing adoption, platforms such as Microsoft Azure are changing the way companies do business, and mobility is in the hands of everyone who can obtain an affordable device.  These game-changing trends call for us to wake up our sleepy validation processes and use them as opportunities for process improvement.

To help you address these new challenges for the coming year, I offer 5 ways to revive sleepy, manual validation processes for your consideration.

  1. Think LEAN
  2. Automate
  3. Leverage Electronic Document Review & Approval Processes
  4. Adopt Agile Validation Processes
  5. Conduct Cybersecurity Qualification

STEP 1.  THINK LEAN

Lean manufacturing is a proven process that powered Toyota to success. Lean manufacturing processes were designed to eliminate waste and improve operational efficiency; Toyota adopted them and rose to prominence with quality vehicles that outsold most of their competitors in each automotive class. To improve your validation processes in today’s systems environment, it is important to think lean. As validation engineers, we are being asked to do more with less, so it is important, dare I say critical, that you eliminate wasteful processes and focus on validation processes that, in the end, add value. This is the basic premise of lean validation.

Throughout my practice, we deliver lean validation services to all of our clients. Lean validation is powered by automation; you can’t have a lean validation process without it. Through automation and lean validation processes, we are able to manage the development of requirements, incidents, and validation testing in a manner not possible on paper. As you look to wake up your validation processes in 2018, think lean. I wrote a blog article on the principles and best practices of lean validation that is a good starting point for establishing and maintaining lean validation processes.

STEP 2.  AUTOMATE

As I mentioned above, you cannot adopt lean validation processes unless you automate them. Automation has often been viewed by validation engineers as a luxury rather than a necessity; in 2018, it is a necessity, not a luxury. Electronic workflow used to be the purview of expensive enterprise systems, often costing millions of dollars. Today, there is no excuse for not automating your process other than simply not wanting to change.

Electronic systems for managing validation have come a long way in the last 20 years. We offer a system called ValidationMaster™, designed by validation engineers for validation engineers. ValidationMaster™ is an enterprise validation management system that manages the full lifecycle of validation processes and their associated deliverables, from requirements through test script development and execution to incident reporting and validation document control. The system comes with a fully integrated ValidationMaster™ Portal, based on SharePoint, designed to manage document deliverables in a controlled manner, with electronic signatures and graphical workflows to eliminate document bottlenecks and streamline the development of your deliverables.

The development and execution of test cases is the most laborious part of validation, and it is where most validation exercises either succeed or fail. Most often, given the manual way most validation engineers develop test scripts, much of the effort goes into the mechanics of producing the script rather than into the quality of the test case itself. This is due to wasteful practices such as cutting and pasting screenshots, or manually typing in what the validation engineer sees on the screen during test script development. Both practices are, in my view, wasteful and add nothing to the value of the test case.

The FDA says a good test case is one that finds an error. As validation engineers, we should be focused more on finding errors and improving software quality than on cutting and pasting screenshots into Word documents, as most of us do. There are things that computers do far more efficiently and effectively than humans, and one of them is automated test generation.

There are tools on the market, such as ValidationMaster™, that fully automate this process and eliminate the waste. ValidationMaster™ lets you navigate through the application that is the subject of your validation exercise while the system automatically captures each screenshot and its accompanying text and places them directly into your formatted documents. You simply push a button at the end of test script development, and your test script is written for you. This is standard, out-of-the-box functionality. Managing requirements for enterprise applications can also be laborious and challenging.

If you are validating an enterprise resource planning system with hundreds of requirements, keeping track of the version of each requirement can be difficult. Who changed a requirement, and why, is often the subject of hours and hours of meetings that waste time and money. ValidationMaster™ has a full-featured requirements management engine that tracks not only each requirement but also its full audit history, so you can see who changed a requirement, when it was changed, and why. This gives you full visibility into requirements, and each requirement can be traced directly to test scripts, fully automating requirements traceability in your validation process.
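
The audit-history idea is easy to picture; the field names below are illustrative rather than ValidationMaster™’s actual schema. Every change appends who made it, when, why, and the new text, so the current requirement and its full lineage are always available.

```python
from datetime import datetime

class Requirement:
    """A requirement whose every change is captured in an audit history."""

    def __init__(self, req_id, text, author):
        self.req_id = req_id
        self.text = text
        self.audit_trail = [(datetime.now(), author, "created", text)]

    def change(self, new_text, author, reason):
        self.audit_trail.append((datetime.now(), author, reason, new_text))
        self.text = new_text

req = Requirement("URS-007", "System shall enforce unique user IDs.", "jdoe")
req.change("System shall enforce unique, case-insensitive user IDs.",
           "asmith", "clarified per user acceptance feedback")

for when, who, why, text in req.audit_trail:
    print(when, who, why, text)  # who changed it, when, why, and to what
```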

Validation testing can be conducted online, either as a fully automated test script that requires no human intervention or as a semi-automated test case in which the validation engineer navigates through the software application while the actual results are captured by the system.

Manually tracking test runs can be a headache and is sometimes error-prone; an automated system can do this for you with ease. ValidationMaster™ allows you to conduct multiple test runs, and it tracks each run, the person who ran the test, how long the test took to execute, and the actual results. If an incident or software anomaly occurs during testing, the system lets you capture that information and keeps the incident report with each validation test. The time for automation has come.

If you want to wake up your sleepy validation processes and deliver greater value for your company, you cannot afford to overlook automation of your validation process. Check out a demonstration of ValidationMaster™ to see how automated validation lifecycle management can work for you.

STEP 3 – LEVERAGE ELECTRONIC DOCUMENT REVIEW & APPROVAL PROCESSES

One of the many challenges validation engineers face every day is the routing, review, and approval of validation documentation. In many paper-based systems, this means routing printed documents for physical handwritten signatures. That is the way we did it in the 1980s; 21st-century validation requires a new, more efficient approach. As mentioned in the section on automation above, there really is no excuse for manual, paper-based processes anymore. There are affordable technologies on the market that facilitate electronic document review and approval and tremendously improve the speed and quality of your document management processes.

At one previous company I worked for, our document cycle times ran up to 120 days per document. With electronic routing and review using 21 CFR Part 11 electronic signatures, we got the process down to three weeks, which saved a significant amount of time and money. Electronic document review and approval is a necessity, no longer a luxury: there are tools on the market costing less than a Happy Meal per month that allow you to efficiently route and review your validation documentation and apply electronic signatures in compliance with 21 CFR Part 11. To wake up your document management processes, electronic document review and approval is a MUST.

STEP 4 – ADOPT AGILE VALIDATION PROCESSES

For as long as I can remember, whenever validation was discussed, we talked about the V model. This is the model shown in the figure below, where validation planning happens on the left-hand side of the V and validation testing happens on the right-hand side.

V-MODEL

This model assumed a waterfall approach to software development, in which all requirements were developed in advance and tested after the development/configuration process. In today’s systems environment, that is not how most software is developed or implemented. Most development teams use an agile process. Agile software development methodologies have revolutionized the way organizations plan, develop, test, and release software applications, and it is fair to say that agile methods are now established as the most accepted way to develop software. Even so, many validation engineers insist on the waterfall methodology to support validation.

Agile development is a lightweight software development methodology that focuses on small, time-boxed sprints of new functionality that are incorporated into an integrated product baseline. In other words, all requirements are not developed up front; agile recognizes that you may not know all of them in advance. Scrum places emphasis on customer interaction, feedback, and adjustment rather than on documentation and prediction. From a validation perspective, this means iterations of validation testing as software is developed in sprints, a new way of looking at validation compared with the old waterfall approach. It may add time to the validation process overall, but the gradual introduction of functionality within complex software applications has value in and of itself: the big-bang approach to delivering ERP systems has often been fraught with issues and error-prone. You should adopt agile validation processes to wake up your current processes.

STEP 5 – CONDUCT CYBERSECURITY QUALIFICATION (CYQ)

Last but not least is cybersecurity. In previous blog posts, I discussed extensively the impact of cybersecurity on validation. I cannot emphasize enough how much you need to pay attention to this phenomenon and be ready to address it. Cyber threats are a clear and present danger not only to validated systems environments but to all systems environments. As validation engineers, our job is to ensure that systems are established and meet their intended use. How can you confirm that a system meets its intended use when it is vulnerable to cyber threats? The reality is you can’t.

Many validation engineers believe that cybersecurity is the purview of the IT department, something the IT group is responsible for handling that has nothing to do with validation. Nothing could be further from the truth. If you remember the old validation adage, “if it’s not documented, it didn’t happen”, you will quickly realize that you must document your readiness to deal with a cybersecurity event should one occur in your environment.

I call this “cybersecurity qualification”, or “CyQ”. A CyQ is an assessment of your readiness to protect validated systems environments against the threat of a cyber event; it is intended to evaluate your readiness to ensure compliance. One of the ways you can wake up your validation processes in 2018 is to educate yourself about the threats that cyber events pose to your current validated systems environment and to conduct a CyQ, in addition to your IQ/OQ/PQ testing, to ensure that you have the processes and procedures in place to deal with any threats that may come your way.  If you would like to see an example of a CyQ, just write me and I’ll send you one.
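
As a rough sketch of what a CyQ might tabulate (the control names and the scoring rule here are hypothetical), each control is checked for both implementation and documentation, and an overall readiness figure plus a remediation list fall out:

```python
# Hypothetical CyQ checklist: control -> (implemented?, documented?)
cyq_controls = {
    "role-based access control":       (True,  True),
    "independent audit trail enabled": (True,  True),
    "encrypted, verified backups":     (True,  False),
    "incident response procedure":     (False, False),
    "periodic vulnerability scanning": (False, False),
}

def cyq_readiness(controls):
    """Readiness = fraction of controls both implemented and documented."""
    ready = sum(1 for impl, doc in controls.values() if impl and doc)
    return ready / len(controls)

print(f"CyQ readiness: {cyq_readiness(cyq_controls):.0%}")  # 40%
gaps = [name for name, (impl, doc) in cyq_controls.items() if not (impl and doc)]
print("Remediation needed:", gaps)
```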

2018 is going to bring many challenges your way. It is time to bring your validation processes to Level 5 validation maturity. You need to develop more mature processes that deal with the threats of cybersecurity and embrace lean validation and agile methodologies. If you’re like most, you will be asked to do more with less, and updating antiquated validation processes is essential to having the agility to do that. It’s about time you revived your old validation processes. Are you ready to go lean?

Risk Management & Computer Systems Validation: The Power Twins

For as long as systems have been validated, risk has been an inherent part of the process.  Although validation engineers have been drafting risk assessments since the beginning of computer systems validation, many do not understand how crucial this process is to validation overall.

Risk management and validation go hand in hand.  The ISPE GAMP 5® (“GAMP 5”) methodology is a risk-based approach to validation: GAMP 5 recommends that you scale all validation lifecycle activities and associated documentation according to risk, complexity, and novelty.  As shown in the figure below, the key driver for GAMP 5 is science-based quality management of risk.

Drivers for GAMP 5

From a systems perspective, quality risk management, according to GAMP 5, is “…a systematic approach for the assessment, control, communication, and review of risks to patient safety, product quality, and data integrity.  It is an iterative process applied throughout the entire system life cycle…”  The guidance recommends qualitative and quantitative techniques to identify, document, and manage risks over the lifecycle of a computer system.  Most importantly, risks must be continually monitored to ensure the ongoing effectiveness of each implemented control.

Risk assessments may be used to drive the validation testing process.  In our practice, we focus on critical quality attributes and four types of risks (a simple scoring sketch follows the list):

  • Business Risks
  • Regulatory Risks
  • Technical Risks
  • Cybersecurity Risks
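
Here is a simple quantitative sketch of how those four categories can feed testing priorities; the register entries and the 1-5 scales are illustrative only. Each risk is scored by severity and likelihood, and validation rigor is focused where the product of the two is highest.

```python
# Illustrative risk register spanning the four categories above
# (severity and likelihood on 1-5 scales; priority = severity x likelihood)
risks = [
    ("batch record calculation error", "Business",      5, 2),
    ("Part 11 audit trail gap",        "Regulatory",    4, 3),
    ("interface message loss",         "Technical",     3, 3),
    ("ransomware on validated server", "Cybersecurity", 5, 3),
]

prioritized = sorted(
    ((name, category, severity * likelihood)
     for name, category, severity, likelihood in risks),
    key=lambda item: item[2],
    reverse=True,
)
for name, category, priority in prioritized:
    print(f"{priority:>2}  {category:<13} {name}")
# The highest-priority items get the most validation testing rigor.
```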

The last type, cybersecurity risk, has not gotten much attention from validation engineers and is not explicitly mentioned in GAMP 5.  However, it represents a clear and present danger to ALL systems, not just validated ones.  Cyber threats are real: large-scale cybersecurity attacks continue to proliferate across many enterprises, and cyber criminals are broadening their approaches to strengthen their impact.  You need a holistic approach to risk that not only includes the traditional risk assessment from GAMP, as highlighted in the figure below, but also incorporates cybersecurity as a real threat and risk to validated systems environments.

Risk Assessment – GAMP 5

Cyber threats most certainly may impact product quality, patient safety and even data integrity.  We incorporate these risks into our risk management profile to provide a more comprehensive risk assessment for computer systems.

ZDNet reports that “…As new security risks continue to emerge, cloud security spending will grow to $3.5 billion by 2021…”

ZDNet June 15, 2017

As life sciences companies increase their adoption of the cloud, new challenges for validated systems environments are emerging, along with the risks that accompany them.  I recently wrote a blog post called “Validation as we know it is DEAD”, in which I addressed the challenges and opportunities that cloud and cybersecurity bring.  Although cloud security “solutions” will drive spending through 2021, the solution is not necessarily a technology issue: you can attack cyber risks effectively with STRATEGY.

For validated systems environments, I recommend a cybersecurity plan to combat hackers and document your level of due diligence.  Remember, in validated systems, “…if it’s not documented, it didn’t happen…”.  You must document your plans and develop an effective strategy for all of your computer systems.

Validation and risk management are the power twins of compliance.  Risk management can not only facilitate the identification of controls that protect your systems long term; it can also help ensure compliance and data integrity and drive efficiencies in your lean validation processes.  In the conduct of your risk assessments, do not ignore cybersecurity risk.  It is the elephant in the room.