Saving Time and Money Through Lean Validation

The principles and best practices of lean manufacturing have served life sciences manufacturers well.  Lean is all about optimizing processes, while eliminating waste (“Muda”) and driving greater efficiencies.  As a 30-year validation practitioner, I have validated many computer systems, equipment and processes.  One of the key lessons learned is that there is much room for improvement across the validation process.

OnShore Technology Group is a pioneer in lean validation and has developed principles and best practices to support lean validation processes.  To power our lean processes, we leverage ValidationMaster™, an Enterprise Validation Management system exclusively designed to facilitate lean validation and automate the validation process.

So what is lean validation and how is it practically used?

Lean validation is the process of eliminating waste and inefficiencies while driving greater software quality across the validation process through automation.

Lean Validation

Lean validation cannot be achieved without automation. Lean validation processes leverage advanced technology designed to fulfill the technical, business and compliance requirements for software validation, eliminating the use of manual, paper-based processes. Optimized validation processes are deployed using the latest global best practices to ensure the right amount of validation rigor based on critical quality attributes, risks, cybersecurity and other factors.

The lean validation process begins with a lean validation project charter, which defines the description of the process and its key performance indicators (KPIs), such as "reduce validation costs by $X per year" or "reduce software errors by X%".  The charter should define the project scope, dependencies, project metrics and resources for the project.
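As a purely illustrative sketch, the charter elements described above could be captured in structured form so completeness can be checked automatically. Every name, target and figure below is invented for illustration, not taken from any real charter:

```python
# Hypothetical lean validation project charter captured as structured data.
# All names, targets, and figures are illustrative, not prescriptive.
charter = {
    "description": "Validate the ERP upgrade using lean validation processes",
    "scope": ["requirements", "risk assessment", "IQ/OQ/PQ testing"],
    "kpis": [
        {"name": "validation_cost_reduction", "target": 100_000, "unit": "USD/year"},
        {"name": "software_error_reduction", "target": 25, "unit": "percent"},
    ],
    "dependencies": ["vendor audit complete"],
    "resources": ["validation lead", "QA reviewer"],
}

def charter_is_complete(c):
    """A charter should define description, scope, KPIs, dependencies, and resources."""
    required = ("description", "scope", "kpis", "dependencies", "resources")
    return all(c.get(key) for key in required)

print(charter_is_complete(charter))  # expect: True
```

A simple completeness check like this mirrors the lean idea that the charter, not ad hoc memory, defines what the project must track.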

There are five principles of lean validation derived from lean manufacturing principles.  Lean validation is powered by people, processes and technology.  Automation drives lean validation processes.  The LEAN VALIDATION value principles are illustrated in the figure below.

[Figure: The five lean validation value principles]

Principle 1 – VALUE – Lean thinking in manufacturing begins with a detailed understanding of the value the customer assigns to products and services.  Lean thinking from an independent validation and verification perspective begins with a detailed understanding of the goals and objectives of the validation process and its adherence to compliance objectives.  The principle of VALUE therefore requires the validation team to focus on the elimination of waste so as to deliver value to the end customer (your organization) in the most cost-effective manner.  The computer systems validation process is designed to assure that software applications meet their intended use.  The value derived from the validation process is greater software quality and an enhanced ability to identify software defects, resulting from greater focus and the elimination of inefficient and wasteful processes. AUTOMATION IS THE FOUNDATION THAT FACILITATES THE ACHIEVEMENT OF THIS VALUE PRINCIPLE.

Principle 2 – VALUE STREAM – The value stream, from a lean perspective, is the comprehensive product lifecycle, from raw materials through the customer's end use and the ultimate disposal of the product.  To effectively eliminate waste, the ultimate goal of lean validation, there must be an accurate and complete understanding of the value stream.  Validation processes must be examined end-to-end to determine what value each adds to the objective of establishing software quality and compliance.  Any process that does not add value to the validation process should be eliminated.  We recommend value stream mapping of the validation process to understand where value is added and where non-value-added processes can be eliminated.  Typical "Muda", or wastes, commonly revealed by validation process mapping are:

  • Wasteful Legacy Processes (“we have always done it this way”)
  • Processes That Provide No Value To Software Quality At All
  • Manual Process Bottlenecks That Stifle Processes

Principle 3 – FLOW – The lean manufacturing principle of flow is about creating a value chain with no interruption in the production process, a state in which each activity is fully in step with every other.  A comprehensive assessment and understanding of flow throughout the validation process is essential to the elimination of waste.  From a validation perspective, optimal flow is created when, for example, users can automatically link test scripts to requirements for automated traceability, eliminating the process of manually tracing each test script to a requirement.  Another example is when a user navigates through a software application and the test script is automatically generated.  Once generated, it is automatically published to a document portal, where it is routed electronically for review and approval.  All of this requires AUTOMATION to achieve the principle of FLOW.  For process optimization and quality control throughout the validation lifecycle, information should flow through the validation process efficiently, minimizing process and document bottlenecks while maintaining traceability throughout.

Principle 4 – PULL – A pull system in lean manufacturing is used to reduce waste in the production process: components used in manufacturing are only replaced once they have been consumed, so companies make only enough product to meet customer demand.  There is much waste in the validation process.  A PULL strategy for validation may be used to reduce wastes such as duplication of effort and to streamline test case development and execution, electronic signature routing/approval, and more.  Check out our blog "The Validation Post" for more information.

Principle 5 – PERFECTION – Validation processes are in constant pursuit of continuous improvement, and automation is KEY.  Lean validation engineers and quality professionals relentlessly drive for perfection. Step by step, validation engineers must identify the root causes of software issues, anomalies, and quality problems that affect the suitability of a system for production use. As computing environments evolve and become more complex and integrated, validation engineers must seek new, innovative ways to verify software quality and compliance in today's advanced systems. Perfection cannot easily be achieved through manual processes.

AUTOMATION IS REQUIRED TO TRULY REALIZE THE VISION OF THIS PRINCIPLE.

You can save time and money through lean.  Consider the first two principles, Value and Value Stream.  Many validation engineers view validation only as a regulatory requirement or "necessary evil" rather than as a best practice designed to save time and money.  I think we can all agree that, in the long run, quality software saves time and money through regulatory cost avoidance and more efficient quality processes.

It is important for validation engineers to consider not only quality and compliance but also the cost savings that may be gained throughout the validation process.  Lean validation is a strategy whose time has come.  How much can lean save?  End users report savings of roughly 40–60% on test script development and 60–70% on regression testing efforts.  Regulatory cost avoidance can be significant, depending on each company's level of compliance.  Cost savings may also be realized by minimizing the amount of paper generated through the validation process.

Embrace LEAN VALIDATION and experience the benefits of saving time and money.

Staffing Your Next Validation Project: What You Should Know

Finding good talent is always a challenge.  In the validation world, the unique skill sets required for success can be difficult to find, and filling them takes diligence.  In today's competitive environment, much of the best validation talent is already gainfully employed, but you can find good people if you know where to look and what you are looking for.

When seeking validation talent for your next project, there are several points to consider.  Specifically, you should consider outsourcing as a means to acquire qualified experienced talent on demand.  Outsourcing talent acquisition and services offers your firm several key benefits:

  • Provides deep industry domain experience on demand
  • Frees up your internal team to focus on what they do best
  • Saves 30–40% through outsourced talent
  • Minimizes recruitment delays and costs
  • Eliminates overhead costs (unemployment insurance, taxes, healthcare, labor law compliance)
  • Scales with demand: add or release resources as required

OnShore Technology Group offers a unique staffing service called ValidationOffice™.  This unique service offering delivers personnel with deep industry domain knowledge in life sciences and the field of validation.  Our validation engineers are adept at navigating the ever-changing regulatory landscape and understand well the principles and best practices for validation in a highly regulated environment.  Each candidate has over a decade of experience in enterprise validation projects and can work on any size project.

When looking to outsource part or all of your validation team, it is important that you have a team that is responsive, qualified and flexible enough to meet your demands.

Validation is a highly regulated process requiring rigorous training and experience to help ensure success.  ValidationOffice™ delivers staffing on demand for validation project resources.

WHAT YOU SHOULD KNOW

If you have ever worked with, or are considering, an outsourced validation firm, below are five things you should know.

FIRST THING YOU SHOULD KNOW

ValidationOffice™ can be used to provide temporary or permanent experienced validation staff.  According to the American Staffing Association (ASA), staffing companies hire millions of temporary and contract employees each year, and around 33% of contract employees are eventually offered permanent jobs.   If you are seeking validation talent, you should know that there are options to help you reduce the risk of hiring mistakes and allow you to evaluate a potential candidate on the job before making a permanent hiring decision.

SECOND THING YOU SHOULD KNOW

We specialize in validation talent.  We can provide validation test engineers, validation project managers, validation program managers, validation technical engineers, technical writers, and other relevant skill sets.   Our candidates come from the ranks of leading companies and have the skills and know-how to hit the ground running.  A good staffing partner is essential to oversee staffing needs as they arise.  You should know that you have a partner acting on your behalf.

THIRD THING YOU SHOULD KNOW

How we recruit and retain talent is critical for you to understand.  We offer a simplified and streamlined staffing process.  Hiring the right person means that we have to be intimately familiar with your hiring requirements and business needs.  We listen.  Hiring the wrong person can be an expensive proposition.  We provide just-in-time access to qualified candidates ready to work in your office environment.

FOURTH THING YOU SHOULD KNOW

You should know that your temporary staff members are taken care of administratively.  We use an applicant tracking system to track candidate activity and a customer relationship management system for business development.  Technology drives our business processes to help us deliver the best candidates for you.  You should know that our systems provide efficiencies for you and the job candidates.

FIFTH THING YOU SHOULD KNOW

Finally, you should know that not all staffing firms are created equal.  We are very selective and dedicated to bringing you the very best in validation talent.  Our services can be beneficial for both companies and prospective employees.  We focus on your objectives to deliver the best and brightest to serve your needs.

For your next project, you may want to consider getting validation talent on demand.  Contact us for more details.

Why Are You Still Generating Validation Test Scripts Manually?

Drafting validation test scripts is one of the key activities in a validation exercise, designed to provide documented evidence that a system performs according to its intended use.  The FDA and other global agencies require objective evidence, usually in the form of screenshots that sequentially capture the target software process, to provide assurance that systems can consistently and repeatedly perform the various processes representing the intended use of the system.

Since the advent of the PC, validation engineers have been writing validation test scripts manually.  The manual process of computer systems validation test script development involves capturing screenshots and pasting them into Microsoft Word test script templates.  To generate screen captures, some companies use tools such as the Windows Print Screen function, TechSmith Snagit, and similar utilities.  A chief complaint of many validation engineers is that test script development is a slow, arduous process.  Some validation engineers are very reluctant to update and re-validate systems because of it.  So, the question posed by this blog article is simply this: "Why are you still generating test scripts manually?"

I have been conducting validation exercises for engineering and life sciences systems since the early 1980s, and I have experienced first-hand the pain of writing test scripts manually.  Having developed and practiced "lean validation", I sought ways to eliminate manual, wasteful validation processes.  One of the most wasteful is the manual capture, cutting and pasting of screenshots into a Microsoft Word document.

The obvious follow up question is “how do we capture validation tests in a more automated manner to eliminate waste and create test scripts that are complete, accurate and provide the level of due diligence required for validation?”

In response to this common problem, we developed an Enterprise Validation Management system called ValidationMaster™.  This system includes TestMaster™, an automated testing system that allows validation engineers to capture and execute validation test scripts in a cost-effective manner.

TestMaster™ is designed to validate ANY enterprise or desktop application.  It is a browser-based system that allows test engineers to open any application on their desktop, launch TestMaster™, and capture their test scripts while sequentially executing commands in the application.  As the validation engineer navigates through the application, TestMaster™ captures each screenshot and each text entry made in the application.

Once the test script is saved, TestMaster™ allows the script to be published in your UNIQUE test script template with the push of a button.  No more cutting and pasting screenshots!  You can generate your test scripts in MINUTES, as opposed to the hours it can take to compile documents from a series of screenshots.  If you are a validation engineer who does not like screenshots in your scripts, you can just as quickly create text-based test scripts using TestMaster™.

So, what are the biggest benefits of using TestMaster™ versus manual processes?  There are four key benefits, summarized as follows:

  1.  Automated Test Script Execution – For years, validation engineers have wanted a more automated approach to executing validation test scripts.  ValidationMaster™ supports both hands-on and hands-off validation testing.  Hands-on testing is the process whereby a validation engineer reviews each step of a test script and executes it step by step, clicking through the process.  Hands-off testing allows a validation engineer to execute a test script with no human intervention.  This type of hands-off regression testing is very useful for cloud-based systems or systems that require more frequent testing.  The validation engineer simply selects a test script and defines a date/time for its execution.  At the designated time, with no human intervention, the system executes the test script and reports the results back to the system.   Automated testing is here!  Why are you still doing this manually?

  2.  Traceability – TestMaster™ allows validation engineers to link each test script to a requirement or set of requirements, delivering automatic traceability, which is a regulatory requirement.  With the click of a button, TestMaster™ allows validation engineers to create a test script directly from a requirement.  This powerful capability lets you see requirements coverage on demand through our validation dashboard, which is viewable on a desktop or mobile device (Windows, Apple, Android).

  3.  Electronic Review and Approval – One of the biggest problems with manual test scripts is that they must be printed and manually routed for review and approval.  Companies that have implemented document management systems may be able to route scripts electronically for review and approval; the worst case is a company with no electronic document management system that generates these documents manually.  TestMaster™ allows validation engineers to execute test scripts online and capture test script results easily.  Results can be captured automatically and published into executed test script templates quickly and easily.   If incidents (bugs/anomalies) occur during testing, users can automatically capture an incident report tied to the exact step where the anomaly/bug occurred.  ValidationMaster™ is tightly integrated with a 21 CFR Part 11-compliant portal (ValidationMaster Portal™); once a test script is executed, it is automatically published to the ValidationMaster™ Portal, where it is routed for review and approval in the system.  The ability to draft, route, review, approve, execute and post-approve validation test scripts is an important time- and cost-saving feature that should be part of any 21st-century validation program.

  4.  Reusable Test Scripts For Regression Testing – Manual test scripts are not readily reusable: the Word documents must be edited or even rewritten for validation regression testing.  Validation is not a one-time process, and regression testing is a fact of life for validation engineers.  The question is, will you rewrite all of your test scripts or use automated tools to streamline the process?  ValidationMaster™ allows validation engineers to create a reusable test script library containing all the test scripts that make up your validation test package.  During re-validation exercises, you can reuse the same test scripts for regression testing.
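The hands-off execution and step-level incident capture described above can be sketched generically. This is a minimal illustration, not the ValidationMaster™ or TestMaster™ API; every name and step below is a hypothetical stand-in:

```python
# Generic sketch of unattended ("hands-off") test script execution with
# incident capture tied to the exact failing step. Not a real product API.
def run_script(script):
    """Execute each (name, action) step; record PASS/FAIL and any incidents."""
    results, incidents = [], []
    for i, (name, action) in enumerate(script, start=1):
        try:
            action()
            results.append((i, name, "PASS"))
        except Exception as exc:
            results.append((i, name, "FAIL"))
            # Incident report is tied to the step where the anomaly occurred.
            incidents.append({"step": i, "name": name, "detail": str(exc)})
    return results, incidents

def failing_step():
    raise ValueError("required field missing")  # simulated anomaly

# Example script: step 2 deliberately fails to show incident capture.
script = [
    ("login", lambda: None),
    ("create record", failing_step),
    ("approve record", lambda: None),
]
results, incidents = run_script(script)
print(results[1][2])         # FAIL
print(incidents[0]["step"])  # 2
```

In a real system the "actions" would drive the application under test and the results would be published into an executed test script template for review and approval.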

Given the rapid adoption of cloud, mobile and enterprise technologies in life sciences, a new approach to validation is required.  Yes, you can still conduct validation exercises on paper but why would you?  In the early days of enterprise technology, we did not have tools available that would facilitate the rapid development of validation test scripts.  Today, that is not the case.  Systems like ValidationMaster™ are available in either a hosted or on-premise environment.  These game-changing systems are revolutionizing the way validation is conducted and offering time/cost-saving features that make this process easier.   So why are you still generating test scripts manually?

Can Cloud Applications Be Validated?

Cloud applications are being deployed within life sciences enterprises at a rapid pace. Microsoft Office 365, Microsoft Dynamics 365 and the cost benefits of deploying other such applications are driving the adoption of cloud platforms for regulated systems. The question that is always asked is: can cloud systems be validated? The concern stems from the way cloud environments are deployed and maintained over time. To keep pace with system performance, security and other related controls, cloud providers frequently update their environments.  This constant updating makes maintaining the validated state in a cloud environment challenging.

So can cloud applications be validated and what are the unique processes that must be changed to accommodate cloud validation?  The short answer is yes, cloud applications can be validated. However, there are changes required to the validation strategy to ensure that the system meets its intended use and the validated state is maintained over time.

NOT ALL CLOUD PROVIDERS ARE CREATED EQUAL

When choosing a cloud provider, it is helpful to understand that not all cloud providers are created equal. To my mind they fall into two distinct camps: (1) those who understand regulated environments and (2) those who do not. The first order of business in establishing a validated system in the cloud is to select your cloud provider carefully. I like to look for cloud providers who have experience in regulated environments; this goes for any application you use in a highly regulated environment. It is critically important that your vendor understands the regulatory requirements you must comply with and builds best practices into their service to help you comply. This will go a long way during the supplier audit process toward confirming that you have done proper due diligence on your cloud provider.

UNDERSTANDING RESPONSIBILITIES IN A CLOUD ENVIRONMENT

It is important to understand responsibilities in a cloud environment.  With packaged software applications, you have complete control over the environment.  With the various cloud service models, responsibilities vary depending on the model used, as shown below.  For Infrastructure-as-a-Service (IaaS), you manage the applications, data, runtime, middleware and operating system, while the vendor manages virtualization, servers, storage and networking.  Keep in mind that the principles of validation endure in the cloud: you, not the vendor, are ultimately responsible for the environment and its management.  Therefore, you need to choose your cloud vendor wisely.

[Figure: Cloud service models (IaaS, PaaS, SaaS) and the division of responsibilities between customer and vendor]

You can see that with Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS), you have less responsibility for the management of the system.  In the SaaS model, you do not control the application or the infrastructure; the vendor manages the application and its underlying architecture.

The strategy for validation changes based on the cloud model deployed.

CONTINUOUS TESTING IN THE CLOUD

Previously, for on-premise systems, the validation engineer, in consultation with the IT team, determined how often validated system environments were reviewed and updated. If the validation engineer did not want to apply a patch or an update, they simply left the system alone. Because of the labor involved in updating and documenting a validated system, many validation engineers would leave the system in a state where no changes, patches or updates were applied. Today, given cybersecurity threats and frequent changes to browser applications, it is not possible to leave a system unpatched. To do so may leave your validated systems environment exposed to security threats.

Therefore, when validating in the cloud you must employ a continuous testing strategy. When I say that to validation engineers, they often cringe at the thought of continuously testing a validated systems environment. This is easier than you may think if you use automated testing tools. Continuous testing is simply a validation testing strategy in which, on a schedule defined by the validation engineer, a system is routinely tested to ensure that patches and routine updates do not impact the validated state. Continuous testing is best facilitated through automation. With a reusable test script library, you can easily conduct an impact analysis and determine which tests need to be rerun. Regression testing is made easy through validation test script automation. Therefore, if you deploy systems in a cloud environment, it is strongly recommended that you employ an enterprise validation management system that includes automated testing, such as ValidationMaster™.
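The impact analysis step can be sketched in a few lines: given a reusable test script library mapped to the system areas each script exercises, select only the scripts affected by a patch. All script names and area mappings below are invented for illustration:

```python
# Sketch: select the regression test set impacted by a cloud patch.
# The library maps each (hypothetical) test script to the system areas
# it exercises; a patch is described by the areas it touches.
test_library = {
    "TS-001 login":         {"authentication"},
    "TS-002 batch release": {"batch", "reporting"},
    "TS-003 audit trail":   {"auditing"},
    "TS-004 report export": {"reporting"},
}

def regression_set(patched_areas):
    """Return scripts whose exercised areas intersect the patched areas."""
    return sorted(name for name, areas in test_library.items()
                  if areas & patched_areas)

print(regression_set({"reporting"}))
# ['TS-002 batch release', 'TS-004 report export']
```

Run on the schedule the validation engineer defines, a selection like this keeps continuous testing focused on the scripts a given update can actually affect, rather than rerunning the entire package.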

Cloud applications can be validated.  The strategy changes based on the cloud model selected.  Choose your partners carefully.  They assume an important role in the management of critical system assets.

It is not a question of whether a cloud application can be validated; it is a question of the deployment model you choose and the validation strategy that will govern deployment. Regulators are not averse to your using the cloud – it's all about managing control and risk.

Accelerating Validation in the 21st Century

Each day in companies across the globe, employees are being asked to do more with less.  The mantra of the business community in the 21st century is “accelerating business”.  You see it in marketing and all types of corporate communication.  In the validation world accelerating system implementation, validation and deployment is the cry of every chief information officer and senior executive.

Accelerating validation activity means different things to different people.  I define it as driving validation processes efficiently and effectively without sacrificing quality or compliance.  Most manual processes are paper-based, inefficient, and cumbersome.  Creating paper-based test scripts requires a lot of cutting/pasting of screenshots and other objective evidence to meet regulatory demands.  Getting validation information into templates with proper headers and footers in compliance with corporate documentation guidelines is paramount.

The question for today is “HOW DO WE ACCELERATE VALIDATION WITHOUT SACRIFICING QUALITY AND COMPLIANCE?”

Earlier in my career, I developed validation toolkits for Documentum and Qumas.  A validation toolkit was an early form of validation accelerator, and many companies now offer them.  These toolkits come with pre-defined document templates and test scripts designed to streamline the validation process.  They are called "toolkits" for a reason: they are not intended to be completed validation projects, but a starting point for validation exercises.

The FDA states in the General Principles of Software Validation; Final Guidance For Industry and FDA Staff issued on January 11, 2002, “…If the vendor can provide information about their system requirements, software requirements, validation process, and the results of their validation, the medical device manufacturer (or any company) can use that information as a beginning point for their required validation documentation…”

Notice that the FDA says to use such information as "a beginning point" for validation – NOT to use it exclusively as the complete validation package.  This is because the FDA expects each company to conduct its own due diligence for validation.  Also note that the FDA allows the use of such documentation when it is available from a vendor.

Beware that some vendors believe their "toolkits" can substitute for rigorous review of their software applications.  They cannot.  In addition to leveraging the software vendor's information, you should ALWAYS conduct your own validation due diligence to ensure that applications are thoroughly and impartially examined by your team or a designated validation professional.

It is an excellent idea to leverage vendor information if it is provided.  There is a learning curve with most enterprise applications, and the use of vendor-supplied information can help streamline the process and add value.  The old validation toolkits were delivered with a set of document templates and pre-defined test scripts, often resulting in hundreds of documents depending on the application.  In most cases, companies would edit the documents or move them into their pre-defined validation templates for consistency.  This resulted in A LOT OF EDITING!  In some cases, depending on the number of documents, the time savings could be eroded by the editing alone.

THERE IS A BETTER WAY!  

What if YOUR unique validation templates were pre-loaded into an automated Enterprise Validation Lifecycle Management system?  What if the Commercial-off-the-Shelf (COTS) product requirements were pre-loaded into this system?  Further, what if a full set of OQ/PQ test scripts were traced to each of the pre-loaded requirements?  What if you were able to export all of these documents on day one in your unique validation templates?  What if you only had to add test scripts representing the CHANGES you were making to a COTS application, rather than writing all of your test scripts from scratch?  Finally, what if you could execute your test scripts online and automatically generate a requirements trace matrix and a validation summary report with PASS/FAIL status?
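The requirements trace matrix with PASS/FAIL roll-up mentioned in that last question can be sketched as follows. The requirement IDs, script IDs and statuses are invented for illustration; a real system would pull these from its requirements and test repositories:

```python
# Sketch: build a requirements trace matrix with PASS/FAIL roll-up.
# Each requirement links to the test scripts that verify it (all IDs hypothetical).
links = {
    "URS-001": ["TS-001"],
    "URS-002": ["TS-002", "TS-003"],
    "URS-003": [],               # uncovered requirement, flagged below
}
script_status = {"TS-001": "PASS", "TS-002": "PASS", "TS-003": "FAIL"}

def trace_matrix(links, script_status):
    """Roll each requirement up to PASS, FAIL, or NOT COVERED."""
    matrix = {}
    for req, scripts in links.items():
        if not scripts:
            matrix[req] = "NOT COVERED"
        elif any(script_status[s] == "FAIL" for s in scripts):
            matrix[req] = "FAIL"
        else:
            matrix[req] = "PASS"
    return matrix

print(trace_matrix(links, script_status))
# {'URS-001': 'PASS', 'URS-002': 'FAIL', 'URS-003': 'NOT COVERED'}
```

The same roll-up drives a validation summary report: any FAIL or NOT COVERED entry is exactly the gap a reviewer needs to see before release.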

This is the vision of the CloudMaster 365™ Validation Accelerator.  CloudMaster 365™ is the next generation of the validation toolkit concept.  The ValidationMaster™ Enterprise Validation Management system is the core of the CloudMaster 365™ Validation Accelerator.  The application can be hosted or deployed on-premise.  This is a great way to jump-start your Microsoft Dynamics 365® validation project.

[Figure: CloudMaster 365™ Validation Accelerator]

The CloudMaster 365™ Validation Accelerator includes three key components:

  • ValidationMaster™ Enterprise Validation Management System
  • Comprehensive set of user requirements for out-of-the-box features
  • Full set of IQ/OQ/PQ test scripts and validation document templates

The Validation Accelerator is not intended to be "validation out of the box".  It delivers the foundation for your current and future validation projects, enabling "Level 5" validation processes.  On the validation capability maturity model, most validation processes sit at Level 1: ad hoc, chaotic, undefined and reactive to project impulses and external events.

[Figure: Validation capability maturity model]

Level 5 validation is an optimized, lean validation process facilitated by automation.  The test process is optimized, quality control is assured, and there are specific measures for defect prevention and management.  Level 5 cannot be achieved without automation, and that automation is powered by ValidationMaster™.   With the CloudMaster 365™ strategy, instead of "toolkit" documents suitable for only one project, you have a system that manages your COTS or bespoke application and the full lifecycle of validation over time, as regulators require.  This is revolutionary in terms of what you get.

CloudMaster 365™ TRULY ACCELERATES VALIDATION.  It can not only be used for Microsoft Dynamics 365®, but is also available for other applications including:

  • CloudMaster 365™ Validation Accelerator For Dynamics 365®
  • CloudMaster 365™ Validation Accelerator For Merit MAXLife®
  • CloudMaster 365™ Validation Accelerator For Yaveon ProBatch®
  • CloudMaster 365™ Validation Accelerator For Oracle e-Business® or Oracle Fusion®
  • CloudMaster 365™ Validation Accelerator For SAP®
  • CloudMaster 365™ Validation Accelerator For BatchMaster®
  • CloudMaster 365™ Validation Accelerator For Edgewater EDGE®
  • CloudMaster 365™ Validation Accelerator For VeevaVault®
  • CloudMaster 365™ Validation Accelerator For Microsoft SharePoint®

Whether you are validating an enterprise application or seeking to drive your company to higher levels of efficiency, you should consider how best to accelerate your validation processes not solution by solution but in a more holistic way that ensures sustained compliance across all applications.  If you are embarking on the validation of Microsoft Dynamics 365®, or any of the applications listed above, the CloudMaster 365™ Validation Accelerator is a no-brainer.  It is the most cost-effective way to streamline validation and ensure sustained compliance over time.

Mobility and Validation: Back to the Future

In 2015, the U.S. FDA issued new guidance for Mobile Medical Applications.  Given the widespread adoption and use of mobile technologies, new guidance was necessary to clarify how the FDA views these devices and enforces regulations concerning them.  While the guidance focused on technologies considered to be medical devices, it also identified those that were NOT considered medical devices, including educational tools, training mobile apps, and apps intended to provide electronic copies of medical textbooks and the like.  I was happy to learn that my Apple Watch® falls into the category of “mobile apps that provide patients with simple tools to organize and track their health information” and that the FDA intends to exercise enforcement discretion over these devices. (US FDA Mobile Medical Applications – Guidance for Industry and Food and Drug Administration Staff, Page 16)

What drew my attention to mobile apps in a validation context were recent changes in major software vendor technologies.  I attended a Microsoft Summit meeting in 2016 and was fascinated by some of the changes Microsoft has made to its Dynamics 365 platform.  Large software vendors such as Microsoft and Oracle have released mobile technologies integrated directly within their platforms.  Take a look at the Azure platform in the figure below.

Microsoft Dynamics 365

You will notice that Microsoft added a “Microsoft AppSource” store and Azure Internet of Things (IoT) to the solution.  Mr. Nadella, CEO of Microsoft, shared his ambitious vision for the Microsoft platform: to “reinvent productivity and business processes”.  He believes that businesses of all sizes will not only use these technologies but will become digital entities able to engage their own customers, empower their own employees, optimize their own operations and transform their own products.  As a validation practitioner, I strongly agree with this approach.  Software SHOULD empower businesses to deliver value to their clients.  What has been so elusive for many software companies is the interoperability needed for applications to work together.  He spoke of “silos” of information and of unlocking data stored within these applications to drive new insights and change the way we work.  This was my takeaway from Nadella’s talk.  However, it leads us back to MOBILITY.

“There’s an App For That…”

One of the key things to note about today’s enterprise applications is that they can be integrated with Apps.  Sometimes Apps are integrated into desktop applications, and sometimes they have a mobile companion as well.  The Apps in the AppSource store are developed by various partners – not necessarily Microsoft.  These Apps can run seamlessly on tablets or other mobile devices.  Several questions arise from a validation perspective regarding mobile applications:

  • How do you conduct a supplier audit for multiple App vendors?
  • How do you control changes to Apps integrated in your applications?
  • How do you maintain the validated state with these Apps?

The answers to these questions can be challenging.  Supplier audit principles still apply when you are discussing mobile applications: you must still select your suppliers carefully.  However, it is impractical to audit every supplier in the AppSource store.  In the case of Microsoft, these Apps become part of the overall system and thus part of the Dynamics 365 application itself.  Therefore, I would rely on rigorous testing to overcome the deficit in evaluating each and every App provider, which may not be possible.

Regarding changes to your Apps, I employ the concept of Continuous Testing.  This is a test strategy for cloud-based service offerings whereby you define your testing intervals (for example, quarterly or semi-annual) for these types of technologies and conduct frequent testing to help maintain the validated state.
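As a small illustration of the Continuous Testing cadence described above, the scheduled regression-test dates can be derived from whichever interval you choose.  This is a hypothetical sketch: the interval lengths and start date are illustrative, not prescriptive.

```python
from datetime import date, timedelta

def continuous_test_schedule(start, interval_days, count):
    """Generate the next `count` scheduled regression-test dates."""
    return [start + timedelta(days=interval_days * i) for i in range(1, count + 1)]

# Roughly quarterly (~91 days) vs. semi-annual (~182 days) cadences.
quarterly = continuous_test_schedule(date(2018, 1, 1), 91, 4)
semiannual = continuous_test_schedule(date(2018, 1, 1), 182, 2)
```

In practice, the interval would be tuned per system based on risk and on how frequently the cloud provider pushes changes.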

Generally, there are two types of mobile testing:  (1) Hardware Testing and (2) Mobile Application Testing.  Whether you are using native, mobile web apps or hybrid apps, your strategy should take into account the risk associated with these applications and their impact on critical quality attributes.

VALIDATION AND VERIFICATION OF MOBILE APPLICATIONS

So how does one go about testing (validating) mobile applications in a regulated systems environment?  The IQ/OQ/PQ testing paradigm still applies.

INSTALLATION QUALIFICATION (IQ) OF MOBILE DEVICES

The Installation Qualification (IQ) of mobile applications consists of:

  • Installation tests – Testing of mobile applications by installing/uninstalling them on devices.
  • Services testing – Testing the services of the application online and offline.

The Operational Qualification (OQ) of mobile applications consists of:

  • Usability testing – To ensure that mobile applications are easy to use and deliver a satisfactory user experience to customers.
  • Compatibility testing – Testing of the application on different mobile devices, browsers, screen sizes and operating system versions, in accordance with written user requirement specifications.
  • Interface testing – Testing of menu options, buttons, bookmarks, history, settings, and the navigation flow of the application.
  • Low-level resource testing – Testing of memory usage, auto-deletion of temporary files, and local database growth issues.
  • Mobile operations testing – Testing of backups and of the recovery plan in case the battery goes down or data is lost while upgrading the application from a store.
  • Mobile security testing – Confirmation tests of mobile applications to confirm the protection of system data.

The Performance Qualification (PQ) of mobile applications consists of:

  • Performance testing – Testing the performance of the application when changing the connection from 2G or 3G to WiFi, sharing documents, battery consumption, and other related tests.
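The IQ/OQ/PQ categories above lend themselves to a simple, structured qualification plan that can be tracked programmatically.  The sketch below is hypothetical; the structure and the helper function are illustrative, not part of any standard.

```python
# The mobile IQ/OQ/PQ test categories, organized as a simple plan structure.
MOBILE_QUALIFICATION_PLAN = {
    "IQ": ["Installation tests", "Services testing"],
    "OQ": ["Usability testing", "Compatibility testing", "Interface testing",
           "Low-level resource testing", "Mobile operations testing",
           "Mobile security testing"],
    "PQ": ["Performance testing"],
}

def untested_categories(plan, executed):
    """Return (phase, category) pairs that have not yet been executed."""
    return [(phase, cat) for phase, cats in plan.items()
            for cat in cats if cat not in executed]

remaining = untested_categories(MOBILE_QUALIFICATION_PLAN, {"Installation tests"})
```

A structure like this makes it easy to see at a glance which qualification phases still have open test categories before release.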

It is important to understand that validating mobile applications is an essential part of validation and should be conducted based on risk and impact on regulatory requirements and GMP.  You must always have a process around the acquisition, installation and support of mobile apps.  Be aware of the changes that major software players are making and understand the impact of those changes on your validated systems environment.  With the validation of mobile applications, what is old is new again.  WE ARE BACK TO THE FUTURE WITH MOBILE APPLICATION VALIDATION.  The principles of validation endure in this new environment.  You must keep up with the latest trends and respond accordingly.  Watch this space for more information!

5 Ways to Revive Your CSV Validation Process in 2018

If you are like most veteran validation engineers, you have been conducting validation exercises the same old way.  You have been gathering user requirements, conducting risk assessments, developing test scripts (most often on paper) and delivering validation documentation and due diligence as required by current global regulations.

Validation processes have not changed very much over the last 40 years (unless you count ISPE GAMP® as revolutionary).  However, the systems we validate and the state of play with respect to cybersecurity, data integrity, mobility and compliance have changed in a profound way.  Cloud-based applications are seeing increasing adoption.  Platforms such as Microsoft Azure are changing the way companies do business.  Mobility is in the hands of everyone who can obtain an affordable device.  These game-changing trends call for us to wake up our sleepy validation processes and use these changes as opportunities for process improvement.

To help you address these new challenges for the coming year, I offer 5 ways to revive sleepy, manual validation processes for your consideration.

  1. Think LEAN
  2. Automate
  3. Leverage Electronic Document Review & Approval Processes
  4. Adopt Agile Validation Processes
  5. Conduct Cybersecurity Qualification

STEP 1.  THINK LEAN

Lean manufacturing is a proven discipline that powered Toyota to success.  Lean manufacturing processes were designed to eliminate waste and improve operational efficiency.  Toyota adopted lean manufacturing and rose to prominence with quality vehicles that outsold most of their competitors in each automotive class.  To improve your validation processes in today’s systems environment, it is important to think lean.  As validation engineers, we are being asked to do more with less, so it is important, and dare I say critical, that you eliminate any wasteful processes and focus on validation processes that, in the end, add value.  This is the basic premise of lean validation.

In my practice, we deliver lean validation services to all of our clients.  Lean validation is powered by automation; you can’t have a lean validation process without it.  Through automation and lean validation processes, we are able to manage the development of requirements, incidents, and validation testing in a manner not possible on paper.  As you look to wake up your validation processes in 2018, it is important to think lean.  I wrote a blog article on the principles and best practices of lean that is a good starting point for learning how to establish and maintain lean validation processes.

STEP 2.  AUTOMATE

As I mentioned above, you cannot adopt lean validation processes unless you automate them.  Automation has often been viewed by validation engineers as a luxury rather than a necessity.  In 2018, automation of the validation process is a necessity, not a luxury.  Electronic workflow used to be the purview of expensive enterprise systems, often costing millions of dollars.  Today, there is no excuse for not automating your process other than simply not wanting to change.

Electronic systems to manage automated validation processes have come a long way in the last 20 years.  We offer a system called ValidationMaster™, which was designed by validation engineers for validation engineers.  ValidationMaster™ is an enterprise validation management system designed to manage the full lifecycle of validation processes.  From requirements to test script development and execution through incident reporting and validation document control, ValidationMaster™ manages the full lifecycle of validation processes and their associated deliverables.  The system comes with a fully integrated validation portal, based on SharePoint, designed to manage document deliverables in a controlled manner.  The system is integrated with electronic signatures and graphical workflows to eliminate document bottlenecks and streamline the development of your document deliverables.

The development and execution of test cases represents the most laborious part of validation.  This is where most validation exercises either succeed or fail.  Most often, given the manual, cryptic methods most validation engineers use to develop test scripts, much of the focus is spent on the mechanics of writing the test script rather than on the quality of the test case itself.  This is due to wasteful steps such as cutting and pasting screenshots or manually typing in information that the validation engineer sees on the screen during manual test script development.  Both capturing screenshots by hand and typing in descriptions of a screen are, in my view, wasteful and do not add to the value of the test case.

The FDA says a good test case is one that finds an error.  As validation engineers, we should be focused more on finding errors and improving software quality than on cutting and pasting screenshots into Word documents, as most of us do.  There are things that computers can do much more efficiently and effectively than humans, and one of them is automated test generation.

There are tools on the market, such as ValidationMaster™, that allow you to fully automate this process and eliminate the waste in it.  ValidationMaster™ allows you to navigate through the application that is the subject of your validation exercise while the system automatically captures each screenshot and its accompanying text and places them directly into your formatted documents.  You simply push a button at the end of test script development and your test script is written for you.  This is standard out-of-the-box functionality.  Managing requirements for enterprise applications can also be laborious and challenging.

If you are validating enterprise resource planning systems, which can involve hundreds of requirements, keeping track of the version of each requirement may be difficult.  Who changed a requirement, and why, is often the subject of hours and hours of meetings that waste time and money.  ValidationMaster™ has a full-featured requirements management engine that tracks not only each requirement but also its full audit history, so that you can see who changed the requirement, when it was changed and why.  This gives you full visibility into requirements.  These requirements can be traced directly to test scripts, thereby fully automating requirements traceability in your validation process.
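The who/when/why audit trail described above can be pictured as a small data structure.  This is a hypothetical sketch, not ValidationMaster™ itself; the field names and identifiers (URS-042, OQ-TC-017, CAPA-12) are invented for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Requirement:
    """A requirement with a full audit history and links to test scripts."""
    req_id: str
    text: str
    linked_tests: list = field(default_factory=list)
    history: list = field(default_factory=list)  # (when, who, why, old_text)

    def revise(self, new_text, who, why):
        # Record the prior wording before overwriting it.
        self.history.append((datetime.now(), who, why, self.text))
        self.text = new_text

req = Requirement("URS-042", "System shall enforce unique user IDs.")
req.linked_tests.append("OQ-TC-017")  # traceability to a test script
req.revise("System shall enforce unique, case-insensitive user IDs.",
           who="jdoe", why="Audit finding CAPA-12")
```

Even this toy version answers the meeting-room questions directly: who changed URS-042, when, why, and what it said before.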

Validation testing can be conducted online either as fully automated validation test scripts that require no human intervention, or as semi-automated test cases where the validation engineer navigates through each software application and the actual results are captured by the system.

Manually tracking test runs can be a headache and is sometimes error-prone.  An automated system can do this for you with ease.  ValidationMaster™ allows you to conduct multiple test runs and tracks each run, the person who ran the test, how long the test took to execute, and its actual results.  If an incident or software anomaly occurs during testing, the automated system lets you capture that information and keeps the incident report with each validation test.  The time for automation has come.
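To make the test-run tracking concrete, here is a hypothetical sketch of the record an automated system keeps for each run: who ran it, duration, result, and any incident raised.  The class and identifiers are invented for illustration.

```python
class TestRunLog:
    """Minimal log of validation test runs, including linked incidents."""
    def __init__(self):
        self.runs = []

    def record(self, script_id, tester, duration_s, passed, incident=None):
        self.runs.append({"script": script_id, "tester": tester,
                          "duration_s": duration_s, "passed": passed,
                          "incident": incident})

    def failures(self):
        """Runs that failed, each carrying its incident reference."""
        return [r for r in self.runs if not r["passed"]]

log = TestRunLog()
log.record("OQ-TC-017", "jdoe", 312.5, True)
log.record("OQ-TC-018", "jdoe", 98.0, False, incident="INC-0042")
```

Because the incident stays attached to the run that produced it, the audit trail from anomaly back to test execution is automatic rather than reconstructed after the fact.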

If you want to wake up your sleepy validation processes and deliver greater value for your company, you cannot afford to overlook automation of your validation process. Check out a demonstration of ValidationMaster™ to see how automated validation lifecycle management can work for you.

STEP 3 – LEVERAGE ELECTRONIC DOCUMENT REVIEW & APPROVAL PROCESSES

One of the many challenges that validation engineers experience every day is the routing, review and approval of validation documentation.  In many paper-based systems this involves routing printed documents for physical handwritten signatures.  This is the way we did it in the 1980s.  21st-century validation requires a new, more efficient approach.  As mentioned in the section on automation above, there really is no excuse for using manual, paper-based processes anymore.  There are affordable technologies on the market that facilitate electronic document review and approval and dramatically improve the efficiency and quality of your document management processes.

At one company I previously worked for, our document cycle times were up to 120 days per document.  With electronic routing and review using 21 CFR Part 11 electronic signatures, we got the process down to three weeks, which saved a significant amount of time and money.  Electronic document review and approval is a necessity, no longer a luxury.  There are tools on the market that cost less than a Happy Meal per month and allow you to efficiently route and review your validation documentation and apply electronic signatures in compliance with 21 CFR Part 11.  To wake up your document management processes, electronic document review and approval is a MUST.
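The arithmetic behind that cycle-time improvement is worth spelling out, since it is the number you would put in a business case.  Taking the figures cited above (120 days down to roughly three weeks):

```python
# Cycle-time improvement: 120 calendar days per document down to ~21 days.
before_days, after_days = 120, 21
days_saved = before_days - after_days
reduction_pct = round(100 * days_saved / before_days, 1)
```

At 99 days saved per document (an 82.5% reduction), even a modest portfolio of validation deliverables recovers months of calendar time per year.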

STEP 4 – ADOPT AGILE VALIDATION PROCESSES

For as long as I can remember, whenever validation was spoken of, we talked about the V-model.  This is the model, shown in the figure below, where validation planning happens on the left-hand side of the V and validation testing happens on the right-hand side.

V-MODEL

This model assumed a waterfall approach to software development in which all requirements were developed in advance and tested after the development/configuration process.  In today’s systems environment, this is not how software is developed or implemented in most cases.  Most development teams use an agile development process.  Agile software development methodologies have revolutionized the way organizations plan, develop, test and release software applications.  It is fair to say that agile methods are now established as the most accepted way to develop software applications.  Having said this, many validation engineers still insist on the waterfall methodology to support validation.

Agile development is a lightweight software development methodology built around small, time-boxed sprints of new functionality that are incorporated into an integrated product baseline.  In other words, all requirements are not developed up front; agile recognizes that you may not know all of them at the start.  Scrum places emphasis on customer interaction, feedback and adjustment rather than on documentation and prediction.  From a validation perspective, this means iterations of validation testing as software is developed in sprints.  It is a new way of looking at validation compared with the old waterfall approach.  This may add time to the validation process overall, but the gradual introduction of functionality within complex software applications has value in and of itself.  The big-bang approach to delivering ERP systems has sometimes been fraught with issues and error-prone.  The gradual introduction of features and functionality within a software application has its merits.  You should adopt agile validation processes to wake up your current processes.
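The sprint-by-sprint pattern described above can be sketched as follows: each sprint's new user stories get new test cases, and everything validated in earlier sprints is re-run as regression against the integrated baseline.  This is a hypothetical illustration; the story and test-case naming is invented.

```python
def agile_validation_plan(sprints):
    """For each sprint, pair its new test cases with the accumulated
    regression set from all prior sprints."""
    plan, regression = [], []
    for n, stories in enumerate(sprints, start=1):
        new_tests = [f"TC-{s}" for s in stories]
        plan.append({"sprint": n, "new": new_tests,
                     "regression": list(regression)})
        regression.extend(new_tests)
    return plan

plan = agile_validation_plan([["US-1", "US-2"], ["US-3"]])
```

Note how the regression burden grows each sprint, which is exactly why the approach is only practical when regression testing is automated.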

STEP 5 – CONDUCT CYBERSECURITY QUALIFICATION (CYQ)

Last but not least is cybersecurity.  In my previous blog posts, I discussed extensively the impact of cybersecurity on validation.  I cannot emphasize enough how much you need to pay attention to this phenomenon and be ready to address it.  Cyber threats are a clear and present danger not only to validated systems environments but to all systems environments.  As validation engineers, our job is to ensure that systems are established and meet their intended use.  How can you confirm that a system meets its intended use when it is vulnerable to cyber threats?  The reality is, you can’t.

Many validation engineers believe that cybersecurity is the purview of the IT department: something the IT group is responsible for handling that has nothing to do with validation.  Nothing could be further from the truth.  If you remember the old validation adage, “if it’s not documented, it didn’t happen”, you will quickly realize that you must document your readiness to deal with a cybersecurity event should one occur in your environment.

I call this “cybersecurity qualification”, or “CyQ”.  A cybersecurity qualification is an assessment of your readiness to protect validated systems environments against the threat of a cyber event.  The CyQ is intended to evaluate your readiness to ensure compliance.  One of the ways you can wake up your validation processes in 2018 is to educate yourself about the threats that cyber events pose to your validated systems environment and conduct a CyQ, in addition to your IQ/OQ/PQ testing, to ensure that you have the processes and procedures in place to deal with any threats that may come your way.  If you would like to see an example of a CyQ, just write me and I’ll send you one.
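One way to picture a CyQ readiness assessment is as a scored checklist of control areas, with any area below a target score flagged as a gap.  This sketch is purely illustrative: the control areas, the 0-2 scoring scale and the threshold are my assumptions, not a regulatory standard.

```python
# Hypothetical CyQ control areas, scored 0 (absent) / 1 (partial) / 2 (in place).
CYQ_CONTROLS = ["Access control", "Patch management", "Backup & recovery",
                "Incident response SOP", "Audit trail review"]

def cyq_assess(scores, threshold=2):
    """Return (ready, gaps): ready is True only when no control falls
    below the threshold; gaps lists the deficient control areas."""
    gaps = [c for c in CYQ_CONTROLS if scores.get(c, 0) < threshold]
    return (not gaps), gaps

ready, gaps = cyq_assess({"Access control": 2, "Patch management": 1,
                          "Backup & recovery": 2, "Incident response SOP": 2,
                          "Audit trail review": 2})
```

The point of documenting the assessment this way is the adage above: the gaps list, with its remediation plan, is the evidence that readiness was evaluated.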

2018 is going to bring many challenges your way.  It is time to bring your validation processes to Level 5 validation maturity, as shown in the figure below.  You need to develop more mature processes that deal with the threats of cybersecurity and embrace lean validation and agile methodologies.  If you’re like most, you will be asked to do more with less.  Updating antiquated validation processes is essential to having the agility to do more with less.  It’s about time you revived your old validation processes.  Are you ready to go lean?

Risk Management & Computer Systems Validation: The Power Twins

For as long as systems have been validated, risk has been an inherent part of the process.  Although validation engineers have been drafting risk assessments since the beginning of computer systems validation, many do not understand how crucial this process is to validation overall.

Risk management and validation go hand in hand.  The ISPE GAMP 5® (“GAMP 5”) methodology is a risk-based approach to validation.  GAMP 5 recommends that you scale all validation lifecycle activities and associated documentation according to risk, complexity and novelty.  As shown in the figure below, the key driver for GAMP 5 is science-based quality management of risks.

Drivers for GAMP 5

From a systems perspective, quality risk management, according to GAMP 5 is “…a systematic approach for the assessment, control, communication, and review of risks to patient safety, product quality, and data integrity.  It is an iterative process applied throughout the entire system life cycle…”  The guidance recommends qualitative and quantitative techniques to identify, document and manage risks over the lifecycle of a computer system.  Most importantly, they must be continually monitored to ensure the on-going effectiveness of each implemented control.

Risk assessments may be used to drive the validation testing process.  In our practice, we focus on critical quality attributes and four types of risks:

  • Business Risks
  • Regulatory Risks
  • Technical Risks
  • Cybersecurity Risks

The last type of risk, cybersecurity risk, is one that has not received much attention from validation engineers and is not explicitly mentioned in GAMP 5.  However, this type of risk represents a clear and present danger to ALL systems, not just validated ones.  Cyber threats are real.  Large-scale cybersecurity attacks continue to proliferate across many enterprises, and cyber criminals are broadening their approaches to strengthen their impact.  You need a holistic approach to risk: one that not only includes the traditional risk assessment from GAMP, as highlighted in the figure below, but also incorporates cybersecurity as a real threat and risk to validated systems environments.

Risk Assessment - gamp 5

Cyber threats most certainly may impact product quality, patient safety and even data integrity.  We incorporate these risks into our risk management profile to provide a more comprehensive risk assessment for computer systems.
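To show how a cybersecurity category can sit alongside the other three risk types in a single profile, here is a hypothetical qualitative scoring sketch in the spirit of GAMP 5's severity, probability and detectability factors.  The 1-3 scales and the example scores are illustrative assumptions, not values from the guidance.

```python
def risk_priority(severity, probability, detectability):
    """Each factor scored 1-3; a higher detectability score means the
    failure is harder to detect, so it raises the priority."""
    return severity * probability * detectability

# One illustrative score per risk category discussed above.
risks = {
    "Business": risk_priority(2, 2, 1),
    "Regulatory": risk_priority(3, 2, 2),
    "Technical": risk_priority(2, 1, 2),
    "Cybersecurity": risk_priority(3, 2, 3),
}
highest = max(risks, key=risks.get)
```

With cybersecurity scored honestly for severity and poor detectability, it frequently tops the profile, which is exactly the argument for including it in the assessment rather than leaving it to IT.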

ZDNet reports that “…As new security risks continue to emerge, cloud security spending will grow to $3.5 billion by 2021…”

ZDNet June 15, 2017

As life sciences companies increase their adoption of the cloud, new challenges for validated systems environments are emerging, along with the risks that go with them.  I recently wrote a blog post called “Validation as we know it is DEAD”.  In that post, I addressed the challenges and opportunities that cloud and cybersecurity bring.  Although cloud security “solutions” will drive spending through 2021, the solution is not necessarily a technology issue.  You can attack cyber risks effectively with STRATEGY.

For validated systems environments, I recommend a cybersecurity plan to combat hackers and document your level of due diligence for validated systems.  Remember, in validated systems, “…if it’s not documented, it didn’t happen…”.  You must document your plans and develop an effective strategy for all of your computer systems.

Validation and risk management are the power twins of compliance.  Risk management can not only facilitate the identification of controls to protect your systems long term; it can also help ensure compliance and data integrity and drive efficiencies in your lean validation processes.  In the conduct of your risk assessments, do not ignore cybersecurity risks – they are the elephant in the room.

Cloud Validation Strategies

If you had asked me 10 years ago how many of my life sciences clients were deploying systems in a cloud environment, I would have said perhaps one or two.  If you ask me today how many of my clients are deploying cloud technologies, I would say almost all of them, in one way or another.  The adoption of cloud technologies within life sciences companies is expanding at a rapid pace.

From a validation perspective, this trend has profound consequences.  Here are some key concerns and questions to be answered for any cloud deployment.

  1. How do you validate systems in a cloud environment?
  2. What types of governance do you need to deploy applications in a cloud environment?
  3. How do you manage change in a cloud environment?
  4. How do you maintain the validated state in the cloud?
  5. How can you ensure data integrity in the cloud?
  6. How do you manage cybersecurity in a cloud environment?

The answers to these questions are obvious and routine for validation engineers managing systems in an on-premise environment, where control of the environment rests with the internal IT team.  They have control over changes, patches, system updates, and other factors that may impact the overall validated state.  In a cloud environment, the software, platform and infrastructure are delivered as a SERVICE.  By leveraging the cloud, life sciences companies are effectively outsourcing the management and operation of a portion of their IT infrastructure to the cloud provider.  However, compliance oversight and responsibility for your validated system cannot be delegated to the cloud provider.  Therefore, these services must have a level of control sufficient to support a validated systems environment.

For years, life sciences companies have been accustomed to governing their own systems environments.  They control how often systems are updated, when patches are applied, when system resources will be updated, etc.  In a cloud environment, control is in the hands of the cloud service provider.  Therefore, who you choose as your cloud provider matters.

So what should your strategy be to manage cloud-based systems?

  • Choose Your Cloud Provider Wisely – Not all cloud providers are created equal.  The Cloud Security Alliance (https://cloudsecurityalliance.org/) is an excellent starting point for understanding cloud controls.  The Cloud Controls Matrix (CCM) is an Excel spreadsheet that allows you to assess a vendor’s readiness for the cloud.  You can download it free of charge from the CSA.
  • Establish Governance For The Cloud – You must have an SOP for the management and deployment of the cloud and ensure that this process is closely followed.  You also need an SOP for cybersecurity to provide a process for protecting validated systems against cyber threats.
  • Leverage Cloud Supplier Audit Reports For Validation – All cloud providers must adhere to standards for their environments.  Typically, they obtain third-party certification and submit to Service Organization Control (SOC) independent audits.  It is recommended that you capture the SOC 1/2/3 and SSAE 16 reports.  You also want to understand any certifications your cloud provider holds.  I archive their certifications and SOC reports with the validation package as part of my due diligence for the supplier audit.
  • Embrace Lean Validation Principles and Best Practices – Eliminating waste and improving efficiency is essential in any validated systems environment.  Lean validation is derived from the principles of lean manufacturing.  Automation is a MUST.  You need to embrace lean principles for greater efficiency and compliance.
  • Automate Your Validation Processes – Automation and Lean validation go hand in hand.  The testing process is the most laborious process.  We recommend using a system like ValidationMaster™ to automate requirements management, test management and execution, incident management, risk management, validation quality management, agile validation project management, and validation content management. ValidationMaster™ is designed to power lean validation processes and includes built-in best practices to support this process.
  • Use a Risk-Based Approach To Validation – All validation exercises are not created equal.  The level of validation due diligence required for your project should be based on risk: regulatory, technical and business risks.  Conduct a risk assessment for all cloud-based systems.
  • Adopt Continuous Testing Best Practices – The cloud is under continuous change, which in and of itself seems counter-intuitive to the validation process.  Continuous testing can be onerous if your testing process is MANUAL.  However, if you adopt lean, automated testing processes, regression testing is easy.  You can establish a routine schedule for testing, and if your cloud provider delivers a dashboard that tells you when patches, updates or features have been applied and describes their nature, you can select your regression testing plan based on a risk and impact assessment.
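The last practice above, selecting regression tests from the provider's change dashboard, can be sketched as a simple mapping from changed functional areas to the test cases that cover them.  This is a hypothetical illustration; the area names and test-case identifiers are invented.

```python
# Illustrative coverage map: which test cases exercise each functional area.
COVERAGE = {
    "authentication": ["OQ-TC-001", "OQ-TC-002"],
    "reporting": ["PQ-TC-010"],
    "workflow": ["OQ-TC-020", "OQ-TC-021"],
}

def select_regression_tests(changed_areas, coverage=COVERAGE):
    """Given the areas a cloud release touched, return the de-duplicated
    set of regression tests to run."""
    selected = []
    for area in changed_areas:
        selected.extend(coverage.get(area, []))
    return sorted(set(selected))

tests = select_regression_tests(["authentication", "reporting"])
```

Because untouched areas contribute no tests, the regression run stays proportional to the release's actual risk and impact rather than re-executing the entire script library every cycle.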


Cloud environments can be validated!  A clear, practical approach that embraces lean validation and continuous testing is key, as is cloud governance to ensure data integrity and sustained compliance.

Cloud technologies are here to stay.  Regulators don’t object to the use of the cloud, they want to know how you are managing it and ensuring the integrity of the data.  They also want you to confirm that you are maintaining the validated state in the cloud.  The principles of validation endure in the cloud.  Just because you are in a cloud environment does not mean validation principles no longer apply.  Consider the impact of cybersecurity in your cloud environment and adopt continuous testing strategies to ensure sustained compliance.

Accelerating Validation: 10 Best Practices You Should Know in 2018

As we open the new year with resolutions and new fresh thinking, I wanted to offer 10 best practices that should be adopted by every validation team.  The major challenges facing validation engineers every day include cyber threats against validated computer systems, data integrity issues, maintaining the validated state, cloud validation techniques and a myriad of other issues that affect validated systems.

I have been championing the concept of validation maturity and lean validation throughout the validation community.  In my recent blog post, I highlighted reasons why validation as we have known it over the past 40 years is dead.  Of course, I mean this somewhat tongue in cheek.

The strategies included in the FDA guidance and those we used to validate enterprise systems in the past do not adequately address IoT, cloud technologies, cybersecurity, enterprise apps, mobility and other deployment strategies such as Agile, which change the way we think about and manage validated systems.

The following paragraphs highlight the top 10 best practices that should be embraced for validated computer systems in 2018.

Practice 1 – Embrace Lean Validation Best Practices

Lean validation is the practice of eliminating waste and inefficiencies throughout the validation process while optimizing the process and ensuring compliance. Lean validation is derived from lean manufacturing practices, where non-value-added activity is eliminated. As a best practice, it is good to embrace lean validation within your processes. To fully embrace lean, it is necessary to automate your validation processes; thus, it is strongly recommended that automation be part of your consideration for updating your validation processes in 2018.  My recent blog post on lean validation discusses lean in detail and how it can help you not only optimize your validation process but also achieve compliance.

Practice 2 – Establish Cybersecurity Qualification and SOP

I have noted in previous blog posts that cyber threats are the elephant in the room. Although validation engineers pledge to confirm that the system meets its intended use through the validation process, many organizations do not consider cybersecurity a clear and present danger to validated systems. I believe this is a significant oversight in many validation processes. Thus, I have developed a fourth leg for validation called cybersecurity qualification, or CyQ. Cybersecurity qualification is an assessment of the readiness of your processes and controls to protect against a cyber event. It includes testing to confirm the effectiveness and strength of your controls. In addition to IQ, OQ, and PQ, I have added CyQ as the fourth leg of my validation strategy. It is strongly recommended that you adopt this best practice in 2018.

Practice 3 – Automate Validation Testing Processes

In the year 2018, many validation processes are still paper-based. Companies are still generating test scripts using Microsoft Excel or Microsoft Word and manually tracing test scripts to requirements. This is not only time-consuming but very inefficient, and it causes validation engineers to focus more on formatting test cases than on finding bugs and other software anomalies that could affect the successful operation of the system. Lean validation requires an embrace of automated testing systems. While there are many point solutions on the market that address different aspects of validation testing, for 2018 it is strongly recommended that you embrace enterprise validation management as part of your overall strategy. An enterprise validation management system such as ValidationMaster manages the full lifecycle of computer systems validation. In addition to managing requirements (user, functional, design), the system also facilitates automated testing, incident management, release management, and agile software project methodologies. In 2018, the time has come to develop a reusable test script library to support future regression testing and maintain the validated state.

An enterprise validation management system will also offer a single source of truth for validation and validation testing assets.  This is no longer a luxury but mandatory when you consider the requirement for continuous testing in the cloud. This is a best practice whose time has come. In 2018, establish automated validation processes to help maintain the validated state and ensure software quality.
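To make the traceability idea concrete, here is a minimal sketch of the requirements-to-test matrix such a system maintains automatically. The requirement IDs, test IDs, and gap check are illustrative assumptions, not the workings of any specific product:

```python
# Illustrative sketch of requirements-to-test traceability.
# IDs and descriptions are hypothetical.

requirements = {
    "URS-01": "User login with unique credentials",
    "URS-02": "Audit trail on all record changes",
    "URS-03": "Electronic signature on approval",
}

# Which requirements each test script verifies
trace = {
    "TS-001": ["URS-01"],
    "TS-010": ["URS-02"],
}

def untraced(requirements, trace):
    """Requirements with no covering test script — a gap to close before release."""
    covered = {req for reqs in trace.values() for req in reqs}
    return sorted(set(requirements) - covered)

print(untraced(requirements, trace))  # ['URS-03'] has no test coverage
```

Maintained by hand in Word or Excel, this matrix drifts out of date quickly; maintained in a single system of record, the coverage gap surfaces immediately.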

Practice 4 – Develop a Cloud Management For Validated Systems SOP

Many of today’s applications are deployed in the cloud. Office 365, Dynamics 365, and other such applications have gained prominence and popularity within life sciences companies. In 2018, you must have a cloud management standard operating procedure that provides governance for all cloud activities. This SOP should define how you acquire cloud technologies and what your specific requirements are. Failure to have such an SOP is failure to understand and effectively manage cloud technologies for validated systems. I have often heard from validation engineers that since you are in the cloud, the principles of validation no longer apply. This is absolutely not true. In a cloud environment, the principles of validation endure. You still must qualify cloud environments, and the responsibilities between you and the cloud provider must be clearly defined. If you don’t have a cloud SOP or don’t know where to start in writing one, write me and I will send you a baseline procedure.

Practice 5 – Establish Validation Metrics and KPIs

Peter Drucker said you can’t improve what you can’t measure. It is very important that you establish validation metrics and key performance indicators for your validation process in 2018.  You need to define what success looks like. How will you know when you’ve achieved your objectives? Some key performance indicators may include the number of errors, the number of systems validated, and the number of incidents (which should be trending downward), among other measures. Our ValidationMaster™ portal allows you to establish KPIs for validation and track them over time. In 2018, it’s important to embrace this concept. I have found over the years that most validation activities are not looked at from a performance perspective. They are often inefficient and time-consuming, and no one actually measures the value that validation delivers to the organization against any key performance indicators. The time has come to start measuring this as appropriate.
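As a small worked example of KPI trending, the "incidents trending downward" check can be reduced to a least-squares slope over release-by-release counts. The counts below are invented for illustration:

```python
# Minimal sketch of KPI trending: are open incidents per release
# trending downward? The counts are hypothetical.

incidents_per_release = [14, 11, 9, 6, 5]  # oldest release to newest

def trend_slope(values):
    """Least-squares slope; negative means the KPI is trending down (improving)."""
    n = len(values)
    mean_x = (n - 1) / 2
    mean_y = sum(values) / n
    num = sum((i - mean_x) * (v - mean_y) for i, v in enumerate(values))
    den = sum((i - mean_x) ** 2 for i in range(n))
    return num / den

slope = trend_slope(incidents_per_release)
print(f"slope = {slope:.2f}")  # slope = -2.30, i.e. incidents falling per release
```

The same one-number summary works for any KPI you track over time: defect counts, cycle time per test script, or cost per validated system.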

Practice 6 – Use Validation Accelerators For Enterprise Systems

Across the globe, validation engineers are being asked to do more with less. Organizations are demanding greater productivity and compliance. As you look at enterprise technologies such as Microsoft Dynamics 365®, you need to consider the importance of this environment and how it should be validated. There is a learning curve with any enterprise technology. You should recognize that although you may know a lot about validating systems in general, you may not know the ins and outs of this specific technology. The FDA stipulates in its guidance that you may use vendor information as a starting point for validation. Understand that there is no such thing as validation out of the box; therefore, you must do your due diligence when it comes to validated systems.

In 2018, we recommend that for systems such as Microsoft Dynamics 365, BatchMaster™, Edgewater Fullscope EDGE®, Veeva® ECM systems, Oracle E-Business®, and many others, you use a Validation Accelerator to jumpstart your validation process. OnShore Technology Group offers Validation Accelerators for the aforementioned software applications that will help streamline the validation process. Most notably, our validation accelerators come bundled with a full enterprise validation management system, ValidationMaster, which helps deliver a single source of truth for validation.

Practice 7 – Fill Staffing Gaps With Experts on Demand

In 2018, you may find yourself in need of validation expertise for your upcoming projects but without sufficient internal resources to fulfill those needs. A burgeoning trend today is the outsourcing of validation staff to achieve your objectives – validation staffing on demand. OnShore offers an exclusive service known as ValidationOffice℠, which provides validation team members on demand. We recruit and retain top talent for your next validation project. We offer validation engineers, technical writers, validation project managers, validation test engineers, validation team leaders, and other such staff with deep domain experience and validation expertise. In 2018, you may want to forgo direct hires and use ValidationOffice℠ to fulfill your next staffing need.

Practice 8 – Establish Validation Test Script Library

One of the primary challenges with validation testing is regression testing. Once a system is validated and brought into production, any software changes, patches, updates, or other required changes drive the need for regression testing. This often results in a full rewrite of manual, paper-based test scripts. Today’s validation engineers have tools at their disposal that eliminate the need to rewrite test scripts and allow the scripts required for regression testing to be pulled from a reusable test script library. This is the agile, lean way to conduct validation testing. In 2018, you should consider purchasing an enterprise validation management system to manage a reusable test script library. This is a process that will save you both time and money in the development and online execution of validation test scripts.
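The "store once, pull on demand" idea behind a reusable test script library can be sketched as a small data structure. The class names, tags, and IDs here are hypothetical illustrations, not the design of any particular product:

```python
# Hypothetical sketch of a reusable test script library: scripts are
# stored once with metadata, then pulled (not rewritten) when a change
# triggers regression testing. IDs, tags, and titles are illustrative.

from dataclasses import dataclass, field

@dataclass
class TestScript:
    script_id: str
    title: str
    tags: set = field(default_factory=set)  # functional areas the script covers
    version: int = 1                        # bump on approved revision

class ScriptLibrary:
    def __init__(self):
        self._scripts = {}

    def add(self, script):
        """File a script once; later cycles reuse it instead of rewriting it."""
        self._scripts[script.script_id] = script

    def pull(self, tag):
        """Retrieve every script tagged for the changed functional area."""
        return sorted(s.script_id for s in self._scripts.values() if tag in s.tags)

lib = ScriptLibrary()
lib.add(TestScript("TS-010", "Audit trail entries", {"audit_trail", "security"}))
lib.add(TestScript("TS-020", "Batch report output", {"reporting"}))

print(lib.pull("security"))  # ['TS-010']
```

When a patch touches one functional area, the engineer pulls only the tagged scripts and re-executes them, rather than rewriting a paper protocol from scratch.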

Your test scripts can be either fully automated or semi-automated. Fully automated test scripts run automatically with no human intervention. Semi-automated validation test scripts are used by the validation engineer to conduct validation testing online. It should be noted that ValidationMaster supports both. Check out a demo to learn more about how this works.

Practice 9 – Develop a Strategy For Data Integrity

The buzzword for 2018 is data integrity. Europe has just announced its GDPR regulations, which hold you accountable for the integrity of the data in your systems and the privacy of the information housed therein.  In 2018, it is imperative that you have a data integrity strategy and that you review the regulations surrounding data integrity to ensure that you comply. In the past, it was left up to each validation engineer to decide when to perform system updates and how often to validate their systems. Data integrity, cybersecurity, and cloud validation demand strategies that include continuous testing. I discussed this in a previous blog post, but it is very important that you test your systems often to achieve a high level of integrity, not only of the application but of the data.

Practice 10 – Commit to Achieve Level 5 Validation Process Maturity

Finally, your validation processes must mature in 2018. For organizations attempting to improve efficiency and compliance, it is no longer acceptable or feasible to simply continue with antiquated, inefficient paper-based systems. Therefore, in 2018 it will be important for you to commit to Level 5 validation process maturity, where your processes are automated, you have greater control over your validated systems, and you sustain quality across the validation lifecycle. Check out my other post on achieving Level 5 process maturity.

I wish you all the best in 2018 as you endeavor to establish and maintain the validated state for your computer systems. From the development of Level 5 validation processes to automation, it’s important to understand the changes that new technology brings and how well you and your organization can adapt to them. Your success depends on it! Good luck and keep in touch!