In Case You Missed It: CSV Validation Master Class

In case you missed it, from December 11-13 KenX conducted the Computer System Validation and Data Integrity Congress.  Recently, the FDA has issued warning letters citing non-compliance involving validated computer systems.  Findings have included inadequate risk analysis, non-independent audit trails (audit trails that could be manipulated or turned on/off), and failure to establish SOPs that provide governance for the validation process.  The event featured a guest speaker from the FDA who highlighted challenges associated with data integrity and the approach taken by the Agency.

Recently, I have been speaking to the regulatory community about the many challenges we face in validating today’s computer systems.  Cybersecurity, mobility, cloud applications at the enterprise level, the Internet of Things (IoT) and many other changes affecting the installation, qualification, deployment and function of computer systems have compelled me to rethink strategies around computer systems validation.  As a 30-year practitioner in the field, I have developed key best practices to support cloud validation, address cybersecurity and the other challenges associated with today’s technology.

In case you missed it, I conducted one of the first Computer Systems Validation Master Classes.  The Master Class presented a broad range of topics related to independent verification and validation (IV&V) of today’s technologies.  We addressed topics such as:

  • Lean Validation Principles, Best Practices and Lessons Learned
  • Computer Systems Validation Automation and Best Practices
  • Cybersecurity & Computer Systems Validation: What You Should Know
  • Cybersecurity Qualification: The Next Frontier in Validation Testing
  • Cloud Validation Best Practices
  • Continuous Testing In The Cloud
  • Leveraging Service Organization Control (SOC) Reports For Supplier Audits
  • … and much more

The 90-minute session was a lively discussion among validation contemporaries of topics that will help them master validation of the latest technologies and ensure sustained quality and compliance.

Our Master Class format encouraged knowledge exchange: each topic was debated from the practitioner’s perspective, and participants contributed insights from their own experience with the latest best practices, regulatory guidance and practical CSV scenarios.  The result was a comprehensive discussion of each topic, along with practical tips, tools and techniques to ensure software quality and a validation process that accounts for today’s technologies and their profound impact on validation writ large.

For participating in the Validation Master Class workshop, I offered attendees a copy of my lean validation process templates, a cybersecurity qualification (CyQ) template, a cloud validation SOP, a cybersecurity validation SOP, a system risk assessment template and sample SOC 1/SOC 2/SOC 3 data center reports for cloud providers.  (If you would like a copy of these materials, please contact me using the contact form provided.)

In case you missed it, I can report that the event was a huge success as measured by the feedback from the session and the response of all participants.  Check out our events and join us at one of our weekly webinars or industry events!

Can Cloud Applications Be Validated?

Cloud applications are being deployed within life sciences enterprises at a rapid pace. Microsoft Office 365, Microsoft Dynamics 365 and the cost benefits of deploying other such applications are driving the adoption of the cloud for regulated systems. The question that is always asked is: can cloud systems be validated? The question arises from how cloud environments are deployed and maintained over time. To keep pace with system performance, security and other related controls, cloud providers frequently update their environments, and this constant change makes maintaining the validated state challenging from a systems validation perspective.

So can cloud applications be validated and what are the unique processes that must be changed to accommodate cloud validation?  The short answer is yes, cloud applications can be validated. However, there are changes required to the validation strategy to ensure that the system meets its intended use and the validated state is maintained over time.

ALL CLOUD PROVIDERS ARE NOT CREATED EQUAL

When choosing a cloud provider, it is helpful to understand that all cloud providers are not created equal. To my mind they are divided into two distinct camps: (1) those who understand regulated environments and (2) those who do not. The first order of business for establishing a validated system in the cloud is to select your cloud providers carefully. I like to look at cloud providers who have experience in regulated environments, and this goes for any application that you’re using in a highly regulated environment. It is critically important that your vendor understands the regulatory requirements you must comply with and builds best practices into its service to help you comply. This will go a long way during the supplier audit process to confirm that you’ve done proper due diligence on your cloud provider.

UNDERSTANDING RESPONSIBILITIES IN A CLOUD ENVIRONMENT

It is important to understand responsibilities in a cloud environment.  With packaged software applications, you have complete control over the environment.  With the various cloud service models, responsibilities vary depending on the model used, as shown below.  For Infrastructure-as-a-Service (IaaS), you manage the applications, data, runtime, middleware and operating system, while the vendor manages virtualization, servers, storage and networking.  Please keep in mind that the principles of validation endure in the cloud.  You, not the vendor, are ultimately responsible for the environment and its management.  Therefore, you need to choose your cloud vendor wisely.

[Figure: Cloud service models and the division of responsibilities]

You can see that with Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS), you have less responsibility for the management of the system.  In the SaaS model, you do not control the application or the infrastructure; the vendor manages the application and its underlying architecture.
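
As a quick reference, here is a simplified sketch of the responsibility split across the three service models (a generalization of the standard shared-responsibility model, not any specific vendor’s matrix; consult your provider’s own documentation for specifics):

    Layer                        IaaS      PaaS      SaaS
    Application                  You       You       Vendor
    Data                         You       You       You
    Runtime / Middleware         You       Vendor    Vendor
    Operating System             You       Vendor    Vendor
    Virtualization               Vendor    Vendor    Vendor
    Servers / Storage / Network  Vendor    Vendor    Vendor

Note that in every model, accountability for your data and for the validated state remains with you.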

The strategy for validation changes based on the cloud model deployed.

CONTINUOUS TESTING IN THE CLOUD

Previously, for on-premise systems, the validation engineer, in consultation with the IT team, determined how often validated system environments were reviewed and updated. If the validation engineer did not want to apply a patch or an update, they simply left the system alone. Because of the labor involved in updating and documenting a validated system, many validation engineers would leave the system in a state where no changes, patches or updates were applied. With today’s systems, due to cybersecurity threats and frequent changes to browser applications, it is not possible to leave a system unpatched. To do so may leave your validated system environment exposed to security threats.

Therefore, when validating in the cloud you must employ a continuous testing strategy. When I say that to validation engineers, they often cringe at the thought of continuously having to test a validated system environment. This is easier than you may think if you’re using automated testing tools. Continuous testing is simply a validation testing strategy in which, on a schedule defined by the validation engineer, a system is routinely tested to ensure that patches and routine updates do not impact the validated state. Continuous testing is facilitated best through automation. With a reusable test script library, you can easily conduct an impact analysis and determine which tests need to be rerun, as sketched below. Regression testing is made easy through validation test script automation. Therefore, if you deploy systems in a cloud environment, it is strongly recommended that you employ an enterprise validation management system that includes automated testing, such as ValidationMaster™.
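
To make the idea concrete, here is a minimal sketch of the logic in Python. The names and the tiny test library are hypothetical, invented for illustration (tools such as ValidationMaster™ provide this out of the box), but the mechanics of schedule-driven regression selection are roughly this:

    from datetime import date, timedelta

    # A hypothetical reusable test script library: each script is tagged with
    # the system components it exercises.
    TEST_LIBRARY = {
        "TS-001 login and e-signature": {"security", "audit-trail"},
        "TS-002 batch record workflow": {"workflow", "reporting"},
        "TS-003 audit trail review":    {"audit-trail"},
    }

    def select_regression_tests(changed_components):
        """Impact analysis: pick only the scripts touching changed components."""
        return [name for name, comps in TEST_LIBRARY.items()
                if comps & set(changed_components)]

    def next_run(last_run, interval_days=30):
        """Continuous testing: re-test on a schedule the engineer defines."""
        return last_run + timedelta(days=interval_days)

    # Example: the cloud provider patched the authentication service.
    print(select_regression_tests({"security"}))  # ['TS-001 login and e-signature']
    print(next_run(date(2018, 1, 1)))             # 2018-01-31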

Cloud applications can be validated.  The strategy changes based on the cloud model selected.  Choose your partners carefully.  They assume an important role in the management of critical system assets.

It is not a question of whether a cloud application can be validated; it is a question of the deployment model you choose and the validation strategy that will govern deployment. Regulators are not averse to you using the cloud. It’s all about managing control and risk.


5 Ways to Revive Your CSV Validation Process in 2018

If you are like most veteran validation engineers, you have been conducting validation exercises the same old way.  You have been gathering user requirements, conducting risk assessments, developing test scripts (most often on paper) and delivering validation documentation and due diligence as required by current global regulations.

Validation processes have not changed very much over the last 40 years (unless you count ISPE GAMP® as revolutionary).  However, the systems we validate, and the state of play with respect to cybersecurity, data integrity, mobility and compliance, have changed in a profound way.  Cloud-based applications are seeing increasing adoption.  Platforms such as Microsoft Azure are changing the way companies do business.  Mobility is in the hands of everyone who can obtain an affordable device.  These game-changing trends call on us to wake up our sleepy validation processes and treat them as opportunities for process improvement.

To help you address these new challenges for the coming year, I offer 5 ways to revive sleepy, manual validation processes for your consideration.

  1. Think LEAN
  2. Automate
  3. Leverage Electronic Document Review & Approval Processes
  4. Adopt Agile Validation Processes
  5. Conduct Cybersecurity Qualification

STEP 1.  THINK LEAN

Lean manufacturing is a proven process that powered Toyota to success. Lean manufacturing processes were designed to eliminate waste and improve operational efficiency. Toyota adopted lean manufacturing and rose to prominence with quality vehicles that outsold most of their competitors in each automotive class. To improve your validation processes in today’s systems environment, it is important to think lean. As validation engineers, we are being asked to do more with less, so it is important, dare I say critical, that you eliminate wasteful processes and focus on validation activities that in the end add value. This is the basic premise of lean validation.

Throughout my practice, we deliver lean validation services to all of our clients. Lean validation is powered by automation; you can’t have a lean validation process without it. Through automation and lean validation processes, we are able to manage the development of requirements, incidents and validation testing in a manner not possible on paper. As you look to wake up your validation processes in 2018, it is important to think lean. I wrote a blog article on the principles and best practices of lean validation as a good starting point for how to establish and maintain lean validation processes.

STEP 2.  AUTOMATE

As I mentioned above, you cannot adopt lean validation processes unless you automate your validation processes. Automation has often been looked at by validation engineers as a luxury rather than a necessity. In 2018, automation of the validation process is a necessity, not a luxury. Electronic workflow used to be the purview of expensive enterprise systems, often costing millions of dollars. Today, there is no excuse for not automating your process other than simply not wanting to change.

Electronic systems to manage automated validation processes have come a long way in the last 20 years. We offer a system called ValidationMaster™, which was designed by validation engineers for validation engineers. ValidationMaster™ is an enterprise validation management system designed to manage the full lifecycle of validation processes. From requirements through test script development and execution to incident reporting and validation document control, ValidationMaster™ manages the full lifecycle of validation processes and their associated deliverables. The system comes with a fully integrated validation portal based on SharePoint, designed to manage document deliverables in a controlled manner. The system is integrated with electronic signatures and graphical workflows to eliminate document bottlenecks and improve the development of your document deliverables.

The development and execution of test cases represents the most laborious part of validation. This is where most validation exercises either succeed or fail. Given the manual, cryptic way most validation engineers develop test scripts, much of the effort goes into the mechanics of producing the test script rather than the quality of the test case itself. This is due to wasteful steps such as cutting and pasting screenshots or manually typing in information that the validation engineer sees on the screen during test script development. Both capturing screenshots and typing in descriptions of screens are, in my view, wasteful and do not add to the value of the test case.

The FDA says a good test case is one that finds an error. As validation engineers, we should be focused more on finding errors and on software quality than on cutting and pasting screenshots into Word documents, as most of us do. There are things that computers do much more efficiently and effectively than humans, and one of them is automated test generation.

There are tools on the market, such as ValidationMaster™, that allow you to fully automate this process and eliminate the waste in it. ValidationMaster™ allows you to navigate through the application that is the subject of your validation exercise while the system automatically captures the screenshots and text and places them directly into your formatted documents. You simply push a button at the end of test script development and your test script is written for you. This is standard out-of-the-box functionality. Managing requirements for enterprise applications can also be laborious and challenging.

If you are validating an enterprise resource planning system with hundreds of requirements, keeping track of the version of each requirement can be difficult. Who changed a requirement, and why, is often the subject of hours and hours of meetings that waste time and money. ValidationMaster™ has a full-featured requirements management engine that tracks not only the requirement but its full audit history, so you can see who changed the requirement, when and why. This gives you full visibility into requirements. Requirements can be traced directly to test scripts, fully automating requirements traceability in your validation process.
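
For readers who like to see the bookkeeping spelled out, here is a bare-bones sketch of versioned requirements with traceability links. The field names and classes are mine, invented for illustration, not ValidationMaster™’s actual data model:

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List

    @dataclass
    class RequirementRevision:
        version: int
        text: str
        changed_by: str
        changed_at: datetime
        reason: str                  # why the requirement changed

    @dataclass
    class Requirement:
        req_id: str
        history: List[RequirementRevision] = field(default_factory=list)
        linked_tests: List[str] = field(default_factory=list)  # traceability

        def revise(self, text, changed_by, reason):
            self.history.append(RequirementRevision(
                version=len(self.history) + 1, text=text,
                changed_by=changed_by, changed_at=datetime.now(),
                reason=reason))

    # Who changed URS-042, when, and why becomes a lookup, not a meeting.
    r = Requirement("URS-042")
    r.revise("System shall enforce unique user IDs", "jdoe", "initial draft")
    r.revise("System shall enforce unique, case-insensitive user IDs",
             "asmith", "audit finding 2017-11")
    r.linked_tests.append("TS-001")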

Validation testing can be conducted online with either fully automated test scripts, which require no human intervention, or semi-automated test cases, where the validation engineer navigates through the software application while the system captures the actual results.

Manually tracking test runs can be a headache and is sometimes error-prone. An automated system can do this for you with ease. ValidationMaster™ allows you to conduct multiple test runs, and it tracks each run, the person who ran the test, how long the test took to execute, and the actual results. If an incident or software anomaly occurs during testing, the system captures that information and keeps the incident report with each validation test. The time for automation has come.

If you want to wake up your sleepy validation processes and deliver greater value for your company, you cannot afford to overlook automation of your validation process. Check out a demonstration of ValidationMaster™ to see how automated validation lifecycle management can work for you.

STEP 3 – LEVERAGE ELECTRONIC DOCUMENT REVIEW & APPROVAL PROCESSES

One of the many challenges that validation engineers experience every day is the routing, review and approval of validation documentation. In many paper-based systems this involves routing printed documents for physical handwritten signatures. This is the way we did it in the 1980s; 21st-century validation requires a new, more efficient approach. As mentioned in the section above on automation, there really is no excuse for manual, paper-based processes anymore. There are affordable technologies on the market that facilitate electronic document review and approval and tremendously improve the speed and quality of your document management processes.

At one company I worked for, our document cycle times ran up to 120 days per document. With electronic routing and review using 21 CFR Part 11 electronic signatures, we got the process down to three weeks, saving a significant amount of time and money. Electronic document review and approval is a necessity, no longer a luxury. There are tools on the market costing less than a Happy Meal per month that allow you to efficiently route and review your validation documentation and apply electronic signatures in compliance with 21 CFR Part 11. To wake up your document management processes, electronic document review and approval is a MUST.

STEP 4 – ADOPT AGILE VALIDATION PROCESSES

For as long as I can remember, whenever validation was spoken of, we talked about the V-model. This is the model, shown in the figure below, where validation planning happens on the left-hand side of the V and validation testing happens on the right-hand side.

[Figure: The V-model]

This model assumed a waterfall approach to software development, in which all requirements were developed in advance and tested after the development/configuration process. In today’s systems environment, this is not how most software is developed or implemented. Most development teams use an agile process. Agile software development methodologies have revolutionized the way organizations plan, develop, test and release software applications, and it is fair to say that agile methods are now established as the most accepted way to develop software. Even so, many validation engineers insist on the waterfall methodology to support validation.

Agile development is a lightweight software development methodology built around small, time-boxed sprints of new functionality that are incorporated into an integrated product baseline. In other words, all requirements are not developed up front; agile recognizes that you may not know all requirements in advance. Scrum places emphasis on customer interaction, feedback and adjustment rather than documentation and prediction. From a validation perspective, this means iterations of validation testing as software is developed in sprints. It is a new way of looking at validation compared with the old waterfall approach. It may add time to the validation process overall, but the gradual introduction of functionality within complex software applications has value in and of itself. The big-bang approach to delivering ERP systems has often been fraught with issues and error-prone; the gradual introduction of features and functionality has its merits. You should adopt agile validation processes to wake up your current ones.

STEP 5 – CONDUCT CYBERSECURITY QUALIFICATION (CYQ)

Last but not least is cybersecurity. In my previous blog posts, I discussed extensively the impact of cybersecurity on validation. I cannot emphasize enough how much you need to pay attention to this phenomenon and be ready to address it. Cyber threats are a clear and present danger not only to validated system environments but to all system environments. As validation engineers, our job is to ensure that systems are established and meet their intended use. How can you confirm that a system meets its intended use when it is vulnerable to cyber threats? The reality is you can’t.

Many validation engineers believe that cybersecurity is the purview of the IT department, something the IT group is responsible for handling that has nothing to do with validation. Nothing could be further from the truth. If you remember the old adage in validation, “if it’s not documented, it didn’t happen,” you will quickly realize that you must document your readiness to deal with a cybersecurity event should one occur in your environment.

I call this “cybersecurity qualification,” or “CyQ.” A cybersecurity qualification is an assessment of your readiness to protect validated system environments against the threat of a cyber event, intended to evaluate your readiness and ensure compliance. One way to wake up your validation processes in 2018 is to educate yourself about the threats cyber events pose to your validated system environment and conduct a CyQ, in addition to your IQ/OQ/PQ testing, to ensure that you have the processes and procedures in place to deal with any threats that may come your way.  If you would like to see an example of a CyQ, just write me and I’ll send you one.
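
To give a feel for what a CyQ records, here is a toy sketch of a readiness checklist with a documented score. The checklist items are examples I made up for illustration, not a complete or authoritative control set:

    # Hypothetical CyQ readiness checklist: each control is assessed and the
    # result documented; "if it's not documented, it didn't happen."
    CYQ_CHECKLIST = {
        "Patch management SOP in place and followed": True,
        "Audit trails enabled and tamper-evident":    True,
        "Incident response plan tested this year":    False,
        "Access reviews performed quarterly":         True,
    }

    def cyq_summary(checklist):
        """Return the readiness score and the list of gaps to remediate."""
        passed = sum(checklist.values())
        gaps = [item for item, ok in checklist.items() if not ok]
        return passed / len(checklist), gaps

    score, gaps = cyq_summary(CYQ_CHECKLIST)
    print(f"Readiness: {score:.0%}; gaps: {gaps}")
    # -> Readiness: 75%; gaps: ['Incident response plan tested this year']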

2018 is going to bring many challenges your way. It is time to bring your validation processes to Level 5 validation maturity, as shown in the figure below. You need to develop more mature processes that deal with cybersecurity threats and embrace lean validation and agile methodologies. If you’re like most, you will be asked to do more with less, and updating antiquated validation processes is essential to having the agility to do so.  It’s about time you revived your antiquated validation processes. Are you ready to go lean?

Accelerating Validation: 10 Best Practices You Should Know in 2018

As we open the new year with resolutions and new fresh thinking, I wanted to offer 10 best practices that should be adopted by every validation team.  The major challenges facing validation engineers every day include cyber threats against validated computer systems, data integrity issues, maintaining the validated state, cloud validation techniques and a myriad of other issues that affect validated systems.

I have been championing the concepts of validation maturity and lean validation throughout the validation community.  In a recent blog post, I highlighted reasons why validation as we have known it over the past 40 years is dead.  Of course, I mean this somewhat tongue in cheek.

The strategies included in FDA guidance, and those we used to validate enterprise systems in the past, do not adequately address IoT, cloud technologies, cybersecurity, enterprise apps, mobility or deployment strategies such as agile, all of which change the way we think about and manage validated systems.

The following paragraphs highlight the top 10 best practices that should be embraced for validated computer systems in 2018.

Practice 1 – Embrace Lean Validation Best Practices

Lean validation is the practice of eliminating waste and inefficiency throughout the validation process while optimizing the process and ensuring compliance. It is derived from lean manufacturing practices, where non-value-added activity is eliminated. As a best practice, embrace lean validation within your processes. To fully embrace lean it is necessary to automate your validation processes, so automation should be part of your consideration for updating your validation processes in 2018.  My recent blog post on lean validation discusses lean in detail and how it can help you not only optimize your validation process but achieve compliance.

Practice 2 – Establish Cybersecurity Qualification and SOP

I have noted in previous blog posts that cyber threats are the elephant in the room. Although validation engineers pledge to confirm that a system meets its intended use through the validation process, many organizations do not treat cybersecurity as a clear and present danger to validated systems. I believe this is a significant oversight in many validation processes. Thus, I have developed a fourth leg for validation called cybersecurity qualification, or CyQ. A cybersecurity qualification is an assessment of the readiness of your processes and controls to protect against a cyber event, and it includes testing to confirm the effectiveness and strength of your controls. In addition to IQ, OQ and PQ, I have added CyQ as the fourth leg of my validation strategy. It is strongly recommended that you consider this best practice in 2018.

Practice 3 – Automate Validation Testing Processes

In 2018, many validation processes are still paper-based. Companies are still generating test scripts using Microsoft Excel or Microsoft Word and manually tracing test scripts to requirements. This is not only time-consuming but very inefficient, and it causes validation engineers to focus more on formatting test cases than on finding bugs and other software anomalies that could affect the successful operation of the system. Lean validation requires an embrace of automated testing systems. While there are many point solutions on the market that address different aspects of validation testing, for 2018 it is strongly recommended that you embrace enterprise validation management as part of your overall strategy. An enterprise validation management system such as ValidationMaster™ manages the full lifecycle of computer systems validation. In addition to managing requirements (user, functional, design), the system also facilitates automated testing, incident management, release management and agile software project methodologies. In 2018, the time has come to develop a reusable test script library to support future regression testing and maintain the validated state.

An enterprise validation management system will also offer a single source of truth for validation and validation testing assets.  This is no longer a luxury but mandatory when you consider the requirement for continuous testing in the cloud. This is a best practice whose time has come. In 2018, establish automated validation processes to help maintain the validated state and ensure software quality.

Practice 4 – Develop a Cloud Management SOP For Validated Systems

Many of today’s applications are deployed in the cloud. Office 365, Dynamics 365 and other such applications have gained prominence and popularity within life sciences companies. In 2018, you must have a cloud management standard operating procedure that provides governance for all cloud activities. This SOP should define how you acquire cloud technologies and what your specific requirements are. Failure to have such an SOP is failure to understand and effectively manage cloud technologies for validated systems. I have often heard from validation engineers that since you are in the cloud, the principles of validation no longer apply. This is absolutely not true. In a cloud environment, the principles of validation endure. You still must qualify cloud environments, and the responsibilities between you and the cloud provider must be clearly defined. If you don’t have a cloud SOP or don’t know where to start in writing one, write me and I will send you a baseline procedure.

Practice 5 – Establish Validation Metrics and KPIs

Peter Drucker said you can’t improve what you can’t measure. It is very important that you establish validation metrics and key performance indicators for your validation process in 2018.  You need to define what success looks like. How will you know when you’ve achieved your objectives? Key performance indicators may include the number of errors, the number of systems validated, the number of incidents (this should be trending downward), and other such measures, as in the sketch below. Our ValidationMaster™ portal allows you to establish KPIs for validation and track them over time. In 2018 it’s important to embrace this concept. I have found over the years that most validation activities are not looked at from a performance perspective. They are often inefficient and time-consuming, and no one actually measures the value of validation to the organization or any key performance indicators. The time has come to start measuring this as appropriate.
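
For example, the incident-trend KPI can be checked mechanically. This is a deliberately simple sketch with made-up numbers; a real KPI dashboard (such as the ValidationMaster™ portal’s) is far richer:

    # Incidents logged per quarter; this KPI should be trending downward.
    incidents_by_quarter = [14, 11, 9, 6]

    def trending_downward(series):
        """True if each period improves on (or matches) the previous one."""
        return all(b <= a for a, b in zip(series, series[1:]))

    print(trending_downward(incidents_by_quarter))  # -> True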

Practice 6 – Use Validation Accelerators For Enterprise Systems

Across the globe, validation engineers are being asked to do more with less. Organizations are demanding greater productivity and compliance. As you look at enterprise technologies such as Microsoft Dynamics 365®, you need to consider the importance of this environment and how it should be validated. There is a learning curve with any enterprise technology. Recognize that although you may know a lot about validating systems, you may not know the ins and outs of this particular technology. The FDA stipulates in its guidance that you may use vendor information as a starting point for validation. Understand that there is no such thing as validation out of the box; you must do your due diligence when it comes to validated systems.

In 2018, we recommend that for systems such as Microsoft Dynamics 365, BatchMaster™, Edgewater Fullscope EDGE®, Veeva® ECM systems, Oracle E-Business®, and many others, you use a Validation Accelerator to jumpstart your validation process. Onshore Technology Group offers Validation Accelerators for the aforementioned software applications that will help streamline the validation process. Most notably, our Validation Accelerators come bundled with a full enterprise validation management system, ValidationMaster™, which helps deliver a single source of truth for validation.

Practice 7 – Fill Staffing Gaps With Experts on Demand

In 2018, you may find yourself in need of validation expertise for upcoming projects without sufficient internal resources to fulfill those needs. A burgeoning trend today is outsourcing validation staff to achieve your objectives: validation staffing on demand. OnShore offers an exclusive service known as ValidationOffice℠, which provides validation team members on demand. We recruit and retain top talent for your next validation project, offering validation engineers, technical writers, validation project managers, validation test engineers, validation team leaders and other such staff with deep domain experience. In 2018 you may want to forgo direct hires and use ValidationOffice℠ to fulfill your next staffing need.

Practice 8 – Establish a Validation Test Script Library

One of the primary challenges of validation testing is regression testing. Once a system is validated and brought into production, any software changes, patches, updates or other required changes drive the need for regression testing. This often results in a full rewrite of manual, paper-based test scripts. Today’s validation engineers have tools at their disposal that eliminate the need to rewrite test scripts, with the ability to pull the scripts required for regression testing from a reusable test script library. This is the agile, lean way to conduct validation testing. In 2018, you should consider purchasing an enterprise validation management system to manage a reusable test script library. It will save you both time and money in the development and online execution of validation test scripts.

Your test scripts can be either fully automated or semi-automated. Fully automated test scripts run with no human intervention. Semi-automated validation test scripts are used by the validation engineer to conduct validation testing online. It should be noted that ValidationMaster™ supports both. Check out a demo to learn more about how this works.

Practice 9 – Develop a Strategy For Data Integrity

The buzzword for 2018 is data integrity. Europe has just announced the GDPR, which holds you accountable for the integrity of data in your systems and the privacy of the information housed therein.  In 2018, it is imperative that you have a data integrity strategy and review the regulations surrounding data integrity to ensure you comply. In the old days, it was left up to each validation engineer when to apply system updates and how often to revalidate. Data integrity and cybersecurity, as well as cloud validation, demand strategies that include continuous testing. I discuss this in a previous blog post, but it is very important that you test your systems often to achieve a high level of integrity, not only of the application but of the data.

Practice 10 – Commit to Achieve Level 5 Validation Process Maturity

Finally, your validation processes must mature in 2018. For organizations attempting to improve efficiency and compliance, it is no longer acceptable or feasible to simply continue with antiquated, inefficient paper-based systems. Therefore, in 2018 commit to Level 5 validation process maturity, where your processes are automated, you have greater control over your validated systems, and you sustain quality across the validation lifecycle. Check out my other post on achieving Level 5 process maturity.

I wish you all the best in 2018 as you endeavor to establish and maintain the validated state for computer systems. From the development of Level 5 validation processes through automation, it is important to understand the changes that new technology brings and how well you and your organization can adapt to them. Your success depends on it! Good luck and keep in touch!


Computer Systems Validation As We Know It Is DEAD

Over the past 10 years, the software industry has experienced radical changes.  Enterprise applications deployed in the cloud, the Internet of Things (IoT), mobile applications, robotics, artificial intelligence, X-as-a-Service, agile development, cybersecurity challenges and other technology trends force us to rethink strategies for ensuring software quality.  For over 40 years, validation practices have not changed very much.  Surprisingly, many companies still conduct computer systems validation using paper-based processes.  However, the trends outlined above challenge some of the current assumptions about validation.  I sometimes hear people say, “… since I am in the cloud, I don’t have to conduct an IQ…” or “… well, my cloud provider is handling that…”

Issues related to responsibility and testing are changing based on deployment models and development lifecycles.  Validation is designed to confirm that a system meets its intended use.  However, how can we certify that a system meets its intended use if it is left vulnerable to cyber threats?  How can we maintain the validated state over time in production if the cloud environment is constantly changing the validated state?  How can we adequately test computer systems if users can download an “app” from the App Store to integrate with a validated system?  How can we ensure that we are following proper controls for 21 CFR Part 11 if our cloud vendor is not adhering to CSA cloud controls?  How can we test IoT devices connected to validated systems to ensure that they work safely and in accordance with regulatory standards?

You will not find the answers to any of these questions in any regulatory guidance documents.  Technology is moving at the speed of thought yet our validation processes are struggling to keep up.

These questions have led me to conclude that validation as we know it is DEAD.  The challenges imposed by the latest technological advances in agile software development, enterprise cloud applications, IoT, mobility, data integrity, privacy and cybersecurity are forcing validation engineers to rethink current processes.

Gartner recently reported that the share of firms using IoT grew from 29% in 2015 to 43% in 2016, and projects that by 2020 there will be over 26 billion IoT devices.  It should be noted that Microsoft’s Azure platform includes a suite of applications for remote monitoring, predictive maintenance and connected-factory monitoring for industrial devices.  Current guidance has not kept pace with ever-changing technology, yet the need for quality in software applications remains a consistent imperative.

So how should validation engineers change processes to address these challenges?

First, consider how your systems are developed and deployed.  The V-model assumes a waterfall approach, yet most software today is developed using agile methodologies.  It is important to take this into account in your validation approach.

Secondly, I strongly recommend adding two SOPs to your quality procedures – a Cybersecurity SOP for validated computer systems and a Cloud SOP for validated systems.  You will need these two procedures to provide governance for your cloud processes.  (If you do not have a cloud or cybersecurity SOP please contact me and I will send you both SOPs.)

Third, I believe you should incorporate cybersecurity qualification (CyQ) into your testing strategy.  In addition to IQ/OQ/PQ, you should be conducting a CyQ readiness assessment for all validated systems.  A CyQ is an assessment to confirm and document your readiness to protect validated systems against a cyber attack.  It also includes testing to validate current protections for your validated systems.  It is important to note that regulators will judge you on your PROACTIVE approach to compliance.  This is an important step in that direction.

[Figure: Cybersecurity qualification (CyQ)]

Fourth, you should adopt lean validation methodologies.  Lean validation practices are designed to eliminate waste and inefficiency throughout the validation process while ensuring sustained compliance.

Finally, the time has come for automation.  To keep pace with the changes in current technology as discussed above, you MUST include automation for requirements management, validation testing, incident management and validation quality assurance (CAPA, NC, audit management, training, et al).  I recommend consideration of an Enterprise Validation Management system such as ValidationMaster™ to support the full lifecycle of computer systems validation.  ValidationMaster™  allows you to build a re-usable test script library and represents a “SINGLE SOURCE OF TRUTH” for all of your validation projects.  Automation of the validation process is no longer a luxury but a necessity.

Advanced technology is moving fast.  The time is now to rethink your validation strategies for the 21st century.  Validation as we know it is dead; lean, agile validation processes are required to keep pace with rapidly changing technology.  As you embrace the latest cloud, mobile and IoT technologies, you will quickly find that the old ways of validation are no longer sufficient.  Cyber criminals are not going away, so you need to be ready. Step into LEAN and embrace the future!


What Is The ROI For Automated Validation Testing?

Independent Verification and Validation (IV&V) is a regulatory imperative for software deployments within the life sciences industry.  The FDA provides guidance on what is expected during the validation process and how to conduct due diligence for the validation process.

IEEE 1012 is the industry standard for IV&V.  It provides the background, rationale, process, list of deliverables and standards for the conduct of validation exercises.  A common question in my discussions with clients and prospects is the return on investment of the validation process.  As life sciences firms implement validation processes for their enterprise cloud and on-premise systems, many are turning to outsourcing of the IV&V process to save both time and money.

Validation is traditionally thought of as a regulatory imperative and is generally accounted for in capitalized expenses or direct expenses to the business.  However, as outsourcing becomes more prevalent and companies consider validation as more of a legal best practice, life sciences executives are rethinking IV&V strategies and seeking a cost-effective approach to ensuring compliance and software quality.

It is understood that releasing enterprise software with defects is risky and costly.  For software applications driving highly regulated processes, deficient software can have a direct impact on product quality or other critical quality attributes.  IV&V is designed to improve the quality and reliability of software systems and detect bugs/defects prior to production use.

IV&V activities should be conducted INDEPENDENTLY of the development team.  Many organizations blur the lines between the two teams, but they are distinct.  The figure below shows a typical development team and how IV&V activities correlate with it; the IV&V team receives information from the development team.

[Figure: IEEE 12207 software process with IV&V]

The typical V-model for independent verification and validation is highlighted in the figure below.  Lean validation methods and agile processes are designed to streamline efforts, eliminate waste from the validation process and improve efficiencies, thereby improving ROI.

WHAT IS RETURN ON INVESTMENT (ROI) & WHY SHOULD YOU CARE?


ROI is a performance measure used to evaluate the efficiency of an investment or to compare the efficiency of a number of different investments. To calculate ROI, the benefit (return) of an investment is divided by the cost of the investment; the result is expressed as a percentage or a ratio.

A return on investment formula can be expressed as:

ROI = (Gain from Investment – Cost of Investment)/Cost of Investment
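
In code, with made-up numbers purely for illustration (these figures are not from any study cited here):

    def roi(gain, cost):
        """Return on investment as a ratio (multiply by 100 for a percentage)."""
        return (gain - cost) / cost

    # Illustrative only: a $100,000 IV&V effort that avoids $300,000 in
    # production defect-fixing costs.
    print(f"{roi(gain=300_000, cost=100_000):.0%}")  # -> 200%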

For most life sciences companies, validation is considered a regulatory imperative where the return on investment is not necessarily factored into the decision to validate.  However, validation can be thought of as a legal best practice which can offer return on investment and key benefits.  The key questions to consider are:

  1. What is the cost of the IV&V investment?
  2. What are the tangible and non-tangible benefits gained by life sciences companies?

IV&V along with the cost of quality associated with such projects can typically cost 5 to 10 percent of the total cost of an IT project, depending on the complexity of the project and the specific scope of QA/IV&V activities.

A defect that costs $1 to fix in the requirements or design phase costs $100 to fix after the software goes into production (live) use.

Several studies indicate that the ROI on IV&V and software quality assurance investments can be 2 to 10 times the investment in QA/IV&V activities, depending on how it is measured.  Several published studies address the return on investment for validation.

ROI analysis is an important tool to understand the cost-effectiveness of your validation efforts.  This is why you should care.  Automation of the validation process can play an important role in maximizing your return on investment.  Join us for one of our weekly webinars and let us show you how.

WHAT SHOULD YOU PAY FOR IV&V?

IV&V delivers tremendous value for life sciences companies through early detection and elimination of software defects.  It is reasonable to expect IV&V costs to at least equal internal defect discovery costs, or roughly 5-10% of a typical project.  This depends on several factors, including how customized the system is: bespoke development projects are much costlier than COTS systems that require little or no customization.

The simple cost break-even point is reached when:

  • IV&V cost = IV&V savings
  • (0.3)($x/SM)(Zsize) = (100)(Y)($x/SM)
  • (0.3)Zsize = 100Y
  • Y = (0.3)Zsize/100

where SM is a size metric (which could be expressed as lines of code, function points, …), $x/SM is the cost per unit of size, and Zsize is the system size in that metric.
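
One way to read this formula (my interpretation, since the underlying study is not cited here) is that Y is the number of defects IV&V must catch before release to pay for itself, given that a production-phase fix costs roughly 100 times an early fix. A quick illustrative calculation:

    # Illustrative break-even calculation under the reading above.
    # z_size: system size in the chosen size metric (e.g., lines of code).
    def break_even_defects(z_size):
        return 0.3 * z_size / 100

    print(break_even_defects(100_000))  # -> 300.0 defects to break even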

One way to achieve greater ROI is through automation.  Automated validation testing can accelerate validation and improve ROI over time.  Automated testing systems like ValidationMaster™ allow you to create a reusable test script library to facilitate regression testing and improve initial testing.

CLOSING THOUGHTS

It should be understood that there is tremendous value inherent in the validation process.  There are several final points for your consideration:

  • No commercial or bespoke software is defect free
  • Early detection is designed to save both time and money
  • Reducing costs by minimizing validation may actually result in INCREASED COST during the operational phase of the system.
  • Validation testing should be automated to achieve maximum ROI


Automated Validation Best Practices

Automation is the key to lean validation practices.  Although many validation processes are still paper-based manual processes, there are best practices that support Independent Verification and Validation (IV&V) processes that drive efficiency and compliance.

BEST PRACTICE 1 – Establish Independence

The IEEE 1012 Standard For System, Software and Hardware Verification and Validation states that Independent Verification and Validation (IV&V) is defined by three parameters:

  1. Technical Independence – ensures independence from the development team.  Technical independence is intended to provide a fresh point of view in the examination of software applications to help better detect subtle errors that may be overlooked by those that are too close to the solution such as the development or system implementation team.
  2. Managerial Independence – ensures that the IV&V organization is separate and distinct from the development or program management team.  Managerial independence ensures that the validation team has the autonomy to independently select the validation methodology, processes, schedule, tasks and testing strategy to confirm the suitability of applications for their intended use.  It also ensures that the IV&V team can objectively report all validation test results without restriction or approval from the development or system integration team.  This is a very important level of independence.
  3. Financial Independence – ensures that there are no financial ties between the IV&V team and the development team, preserving objectivity.  This level of independence is designed to prevent situations where financial ties may adversely influence or pressure IV&V personnel into delivering less-than-objective test results.

The IEEE 1012 standard speaks of various forms of independence, but the bottom line is that the IV&V team should be as independent as possible from the development team.  It is not good practice for development teams to validate their own development projects; objectivity is sacrificed when this is done.  Following this best practice ensures an objective examination of your software projects, free of bias and undue influence from the development team.

BEST PRACTICE 2 – Continuous Testing In The Cloud

Cloud environments can be validated.  However, there are several issues and characteristics of cloud environments that challenge traditional assumptions regarding validation efforts.

  • Continuous changes in the cloud
  • Inability to conduct supplier audits for large cloud vendors (Microsoft, Oracle, et al)
  • Maintaining the Validated State

Cloud system environments continuously change.  Validation engineers are not used to uncontrolled changes in system environments.  We have been taught that all changes to a system environment once it has been validated must undergo change control.  Thus all changes are subject to a change request process.

In cloud environments, we don’t control when changes are made to systems. Cloud vendors may swap disk drives, virtual servers and memory, apply patches, and make many other system changes that may affect your validated system environment. So the question becomes: how do you maintain the validated state in the cloud? There are several best practices designed to answer this question. First of all, you need a way to determine what changes are made in the cloud. Take, for example, Microsoft Office 365 or Microsoft Dynamics 365. Microsoft has established what is known as the Trust Center. The Microsoft Trust Center is an excellent resource that provides information about how Microsoft manages its cloud environment. The first consideration when selecting cloud technology is who your provider is. All cloud providers are not created equal. Some take compliance, security, data integrity and governance seriously; others are more general or consumer-oriented and do not prioritize these characteristics.

Microsoft, continuing the example, has achieved several key industry certifications for its cloud environment. Most importantly, through the Trust Center it provides visibility and clear communication about how it manages the cloud. From a testing perspective, Microsoft solves one of the biggest problems you have in the cloud: knowing what changes were made and when the cloud provider made them. Microsoft provides a list of updates by product application and tells you exactly what changes were made, when they were made, and whether the updates or patches were successfully applied in the environment. With the Microsoft cloud, it is no longer the case that you don’t know when the environment has changed. Thanks to this transparency, you know exactly what changes are made, which brings us to the best practice of continuous testing.

Since cloud environments change so often, you should employ a strategy known as continuous testing: validation testing at predefined (user-defined) intervals to ensure that cloud environments are maintained in a validated state. To successfully employ a continuous testing strategy, automation is essential. This is not a process you would want to carry out manually, although you can if you so desire. Automation adds efficiency and consistency.

To employ continuous testing, you want to establish a reusable test script library. This is essential. Once you have validated your system using automated tools such as ValidationMaster™, you will have established such a library. The test scripts developed can be reused for subsequent regression testing and can be automated to save both time and money. For continuous testing, you would conduct an impact analysis to determine the effect of changes made to the cloud environment, then a risk assessment to ensure that you effectively monitor risk in accordance with ISPE GAMP 5®.  You then select, from your reusable test script library, regression tests suitable for continuous testing in your cloud environment, as sketched below. Once these test scripts are executed, you can document the actual results and provide due diligence for regulators that you are maintaining your cloud environment in a validated state.
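
Here is a hypothetical sketch of that cycle in Python: a vendor change notice drives impact analysis, and a GAMP 5-style risk rating decides how deep the regression run goes. All names and data are invented for illustration:

    # Hypothetical continuous-testing cycle for a cloud change notice.
    # Steps mirror the text: impact analysis -> risk assessment -> regression run.
    CHANGE_NOTICE = {"component": "reporting", "vendor_patch": "KB-2018-03"}

    RISK = {"reporting": "high", "ui-theme": "low"}  # from the risk assessment

    TEST_LIBRARY = {
        "TS-002 batch record workflow": {"workflow", "reporting"},
        "TS-010 report output format":  {"reporting"},
    }

    def plan_regression(change, test_library, risk):
        impacted = [t for t, comps in test_library.items()
                    if change["component"] in comps]
        # Risk decides depth: high-risk components get every impacted script;
        # low-risk ones might get a reduced subset.
        return impacted if risk.get(change["component"]) == "high" else impacted[:1]

    print(plan_regression(CHANGE_NOTICE, TEST_LIBRARY, RISK))
    # -> ['TS-002 batch record workflow', 'TS-010 report output format']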

BEST PRACTICE 3 – Select The Right Automation Tools

Another key best practice is selecting the right automation tools for validation. How do you know how to select the right tools? There are two types of automated tools on the market: (1) point solutions and (2) enterprise solutions. A point solution addresses a single element of the validation process. For example, the management of requirements is a core component of any validation exercise, and there are requirements management point solutions on the market that will help you manage user, functional and design requirements for any validation initiative. Testing is another core element of the validation process, and many solutions allow you to capture and record test scripts; some even allow you to execute test scripts online. The problem with point solutions is that each provides only one step in the process. It is not uncommon to end up using upwards of 17 different systems (point solutions) to prepare validation documentation and due diligence. This does not make much sense and is often fraught with duplication of effort and inefficiencies that cost time and money.

To drive lean validation processes and to achieve automation best practice, you need an enterprise validation management solution to fully automate the validation process – not just one part of it. An enterprise validation management system has the capability of managing validation planning documentation such as the validation master plan, risk assessment, validation project plan, and other related documentation.

As a matter of fact, an enterprise validation management system includes an enterprise content management system as a core component of the overall solution. The key deliverables from the validation process are documents. Lots of documents! It stands to reason that an enterprise content management system would be a core part of the solution. An enterprise validation management system should also include a requirements management system capable of managing any type of requirement. An automated test engine should be at the core of such a solution, with the ability not only to record test scripts but to execute them online and capture objective actual results.

The system should have a robust reporting engine that facilitates the efficient output of any report required as part of the due diligence for validation. Quality management is at the core of the validation process, so an enterprise validation management system should include capabilities for all aspects of quality, including change control, audit management, CAPA, nonconformances, training, periodic review, trend analysis and validation key performance indicators. The system should provide real-time statistics on the overall health and performance of validation processes, and it should be built on standard technology adaptable to any systems environment.

It is best practice to select and deploy the RIGHT tools to support enterprise validation processes.  Point solutions will only get you so far.  Selecting the proper automation tools can save both time and money and deliver a single source of truth for your validation projects.

BEST PRACTICE 4 – Establish a Reusable Test Library

One of the most laborious tasks during software validation is TESTING.  The test script development, execution and documentation process takes considerable time if you do it correctly.  From the establishment of a test environment through the development of test scripts, the validation engineer must carefully document expected and actual results sufficient to prove that systems meet their intended use and have the requisite quality expected of such systems.

Developing test scripts takes time.  Traceability also takes time.  When systems are validated, you want the ability to retest a system as required without rewriting test scripts over and over again.  A reusable test script library has been one of the most effective practices I have implemented.

Reusable test scripts can save up to 60% of the time that would otherwise be required to rewrite test scripts. Establishing a reusable test script library with FULLY AUTOMATED scripts can save even more time and money, since fully automated scripts can be executed without human intervention.  You can set a date/time when test scripts are to be executed, and the system (ValidationMaster™) will automatically execute them and report the results back in a fraction of the time it takes to execute them manually.  It is therefore best practice to establish a reusable test script library for enterprise validated systems.

BEST PRACTICE 5 – Document Clear Objective Test Evidence

Documentation of clear, objective test evidence is essential for validation. Many automated validation management systems do not have reporting engines robust enough to produce documents in your unique format. It is best practice to employ an enterprise validation management system that allows you to present clear, objective test evidence in your unique document formats as specified by your SOPs. ValidationMaster™ has a comprehensive reporting engine that allows you to deliver validation reports in your required format.


BEST PRACTICE 6 – Establish a Single Source of Truth For Validation Deliverables

For many organizations that conduct validation on paper, there is no single source of truth for validation. Some validation assets are housed within the document management system. Part of the validation package is kept within a code management system such as SourceSafe. Part of the deliverables may be kept within a requirements management system. Some incident reports may be kept in an incident management system. Other validation deliverables may be paper-based. In many cases, validation engineers keep signed copies of documentation in multiple three-ring binders. This was the traditional practice in the ’80s.

For lean validation practices that support automation, it is best practice to establish a single source of truth for all validation deliverables. This means there is a single point where all validation deliverables, pre- and post-execution, are stored.  A single source of truth facilitates better auditing and eliminates the common problem of lost documentation during an audit. This best practice is essential to achieving validation excellence.

A core benefit of ValidationMaster™ is delivering a single source of truth for all validation projects. You can store all of your validation projects in a single, easy-to-use system and reference them for internal or external audits. For lean validation, this is current best practice.

Following the six key best practices above can save both time and money. Validation has not changed much over the last 40 years, but the way we manage it has changed significantly. Cloud validation, mobility, cybersecurity and a host of other factors change the way we look at computer systems validation and manage it. I hope you find these best practices useful and effective in helping you deliver your next validation project on time and within budget.  We use ValidationMaster™ in our practice every day to support our lean validation processes, saving our clients considerable time and money.  What’s in your validation office?