Women in Validation: In Case You Missed It

One of my annual “must attend” events is the Institute of Validation Technology (IVT) Computer Systems Validation Annual Validation Week in October.  This is an excellent event for getting the latest and greatest thinking on validation topics and industry guidance.  An official from the U.S. FDA attended the event and provided key information on the latest data integrity guidance and the reorganization underway at the FDA.

The highlight of the event for me was a newly added session called the “Women in Validation Empowerment Summit”.  The panel featured yours truly along with other talented women in validation.  The goal of the panel was to discuss some of the unique challenges women face in STEM professions and how to overcome them.  The panel was moderated by Roberta Goode of Goode Compliance International.  She led panelists through a lively discussion of topics ranging from career building and job challenges to workplace difficulties and much more.  You can see a synopsis of what we discussed here.

I am a fierce advocate of STEM. There are women who are “hidden figures” in STEM professions such as computer systems validation who don’t get the recognition they so richly deserve or the attention of most corporate executives.  This session was about EMPOWERMENT.

There are many “hidden figures” in the field of validation but no real community surrounding them.  I left the session most impressed with my accomplished co-panelists.  Each and every one of them had tremendous insights on her career path and stories about how she moved up the ladder.  The talk was real.  It was down to earth.  It was practical.  I didn’t want it to end!

A couple of troubling statistics highlighted during the session related to the number of undergraduate degrees earned by women in STEM.  According to the World Economic Forum, women earn “only 35 percent of the undergraduate degrees in STEM”.  The sad truth is that this number has remained unchanged for over a decade.  Look at the statistics for women in engineering from the National Science Foundation below.

[Figure: Women in Engineering – degrees earned by women, National Science Foundation]

I graduated in Civil and Environmental Engineering from the University of Wisconsin – Madison in 1982.  I was one of only 3 women in my class at the time.  You can see from the graph above that at the doctorate level, women drop off precipitously.

At the session, we discussed gender bias, stereotypes and other factors that affect women in our profession.  It was an excellent forum and one I hope they continue.  In case you missed it, the session was enlightening and fun for all!   See you in October!

In Case You Missed It: CSV Validation Master Class

In case you missed it, on December 11–13 KenX conducted the Computer System Validation and Data Integrity Congress.  Recently, the FDA has issued warning letters regarding non-compliance of validated computer systems.  Findings have included issues such as inadequate risk analysis, non-independent audit trails (audit trails that could be manipulated or turned on/off), failure to establish SOPs that provide governance for the validation process, and other such issues.  The event featured a guest speaker from the FDA who highlighted challenges associated with data integrity and the approach taken by the Agency.

Recently, I have been speaking to the regulatory community about the many challenges we face in validating today’s computer systems.  Cybersecurity, mobility, cloud applications at the enterprise level, the Internet of Things (IoT) and many other changes affecting the installation, qualification, deployment and function of computer systems have compelled me to rethink strategies around computer systems validation.  As a 30-year practitioner in the field, I have developed key best practices to support cloud validation, address cybersecurity and the other challenges associated with today’s technology.

In case you missed it, I conducted one of the first Computer Systems Validation Master Classes.  The Master Class presented a broad range of topics related to the Independent Validation and Verification of today’s technologies.  We addressed topics such as:

  • Lean Validation Principles, Best Practices and Lessons Learned
  • Computer Systems Validation Automation and Best Practices
  • Cybersecurity & Computer Systems Validation: What You Should Know
  • Cybersecurity Qualification: The Next Frontier in Validation Testing
  • Cloud Validation Best Practices
  • Continuous Testing In The Cloud
  • Leveraging Service Organization Control (SOC) Reports For Supplier Audits
  • … and much more

The 90-minute session was a lively discussion of topics that will help validation contemporaries master validation of the latest technologies and ensure sustained quality and compliance.

Our Master Class format encouraged knowledge exchange.  Each topic was debated from the practitioners’ perspective, and participants shared insights from their own experiences, covering the latest best practices, regulatory guidance, and practical CSV scenarios.  The result was a comprehensive discussion of each topic along with practical tips, tools, and techniques to ensure software quality and a more relevant validation process – one that takes into account today’s technologies and their profound impact on the validation process writ large.

For participating in the Validation Master Class workshop, I offered participants a copy of my lean validation process templates, a cybersecurity qualification (CyQ) template, a cloud validation SOP, a cybersecurity validation SOP, a system risk assessment template, and sample SOC 1/SOC 2/SOC 3 data center reports for cloud providers.  (If you would like to obtain a copy of these materials, please contact me using the contact form provided.)

In case you missed it, I can report that the event was a huge success as measured by the feedback from the session and the response of all participants.  Check out our events and join us at one of our weekly webinars or industry events!

What Data Integrity Means For Validation

After much fanfare, the General Data Protection Regulation (GDPR) was approved by the EU Parliament on April 14, 2016. The enforcement date for this regulation is May 25, 2018. Companies that are not in compliance by that date face the potential for heavy fines. The GDPR replaces the Data Protection Directive 95/46/EC and was designed primarily to harmonize data privacy laws across Europe, to protect and empower the data privacy of all EU citizens, and to reshape the way organizations across the region approach data privacy. The regulation has a profound impact on businesses that operate in the EU. Maximum penalties may be as high as 4% of annual global turnover or €20 million, whichever is higher.

In recent years, we have seen massive data breaches at companies that have exposed private and other sensitive information without consent. Many of these breaches have been due to cyber-attacks against companies of all sizes. The newspapers are full of such data breaches. The GDPR requires that breaches be reported to the relevant regulator without undue delay and, where feasible, within 72 hours of becoming aware of the breach, unless the breach is unlikely to result in a risk to the rights and freedoms of individuals. Data subjects must be informed without undue delay when a breach is likely to result in a high risk to their rights and freedoms, unless the data has been rendered unintelligible to any third party (for example, by encryption). Data processors are required to inform data controllers of any breach without undue delay.

What does all this mean for validated systems?

If you operate in the EU and your validated systems include sensitive data or data of a personal nature, such as patient information, you are subject to the requirements of the GDPR. You also need to look at data integrity and security practices around the validated system. We strongly recommend the Cybersecurity Qualification (CyQ) discussed in a previous post. The CyQ assesses a firm’s readiness to protect itself against a cyber-attack. This could go a long way toward meeting the requirements of GDPR, since the CyQ requires documentation of your security controls.

I recommend reading the GDPR and becoming familiar with it before May 2018. Assess the controls within your validated systems environment to determine how vulnerable your systems really are and how ready you are to comply with this regulation. I assure you more will be forthcoming about this topic in the months to come.  WATCH THIS SPACE.

Automating Validation Testing: It’s Easier Than You Think

Automated validation testing has been elusive for many in the validation community.  There have been many “point solutions” on the market that address the creation, management and execution of validation testing.  However, what most validation engineers want is TRULY AUTOMATED validation testing that will interrogate an application in a rigorous manner and report results in a way that not only provides objective evidence against pass/fail criteria but also highlights each point of failure.

In the 1980’s when I was conducting validation exercises for mini- and mainframe computers and drafting test scripts in a very manual way, I often envisioned a time when I would be able to conduct validation testing in a more automated way.  Most validation engineers work in an environment where they are asked to do more with less.  Thus, the need for such a tool is profound.

Cloud computing environments, mobility, cybersecurity, and data integrity imperatives make it essential that we test applications more thoroughly today.  Yet the burden of manual testing persists.  If I were to name five key features of an automated testing system, they would include the following:

  • Automated test script procedure capture and development
  • Automated Requirements Traceability
  • Fully Automated Validation Test Script Execution
  • Automated Incident Capture and Management
  • Ability to Support Continuous Testing in the Cloud
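
To make these features concrete, here is a minimal sketch of how an automated test engine might capture each executed step and record objective pass/fail evidence.  The names and structures below are illustrative assumptions for this post, not the API of any particular product:

```python
# Minimal sketch: capturing automated test steps and recording objective evidence.
# TestStep, StepResult, and run_script are illustrative names, not a product API.
from dataclasses import dataclass
from datetime import datetime
from typing import Callable, List

@dataclass
class TestStep:
    description: str               # e.g., "Log in as the QA reviewer"
    expected_result: str           # the acceptance criterion for this step
    action: Callable[[], bool]     # automated action; returns True if the step passes

@dataclass
class StepResult:
    description: str
    expected_result: str
    passed: bool
    executed_at: str               # timestamp forms part of the objective evidence

def run_script(steps: List[TestStep]) -> List[StepResult]:
    """Execute every step and annotate it with the outcome and execution time."""
    results = []
    for step in steps:
        results.append(StepResult(
            description=step.description,
            expected_result=step.expected_result,
            passed=step.action(),
            executed_at=datetime.utcnow().isoformat(),
        ))
    return results
```

Executed results like these can then feed a traceability matrix, a dashboard, or automated incident capture, which is what makes the five features above reinforce one another.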

In most validation exercises I have participated in, validation testing was the most laborious part of the exercise.  Automated testing is easier than you think.

For example, ValidationMaster™ includes an automated test engine that captures each step of your qualification procedure and annotates the step with details of the action performed.

Test cases can be routed for review and pre-approval quickly and easily through DocuSign.  Test case execution can be conducted online, and a dynamic dashboard reports how many test scripts have passed, how many have failed, and which ones passed with exception.  Once test scripts have been executed, they may be routed for post-approval and signed.

Within the ValidationMaster™ system, you can create a reusable test script library to support future regression testing efforts.  The system allows users to link requirements to test cases thus facilitating the easy generation of forward and reverse trace matrices.  Exporting documents in your UNIQUE format is a snap within the system.  Thus, you can consistently comply with your internal document procedures.
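
Under the hood, a forward trace matrix is simply a mapping from each requirement to the test cases that verify it, and a reverse trace matrix is the inverse mapping.  Here is a minimal sketch of that idea, assuming a plain list of requirement-to-test links rather than any specific tool’s data model:

```python
# Sketch: building forward and reverse trace matrices from requirement/test links.
# Requirement and test case IDs are illustrative examples.
from collections import defaultdict

links = [
    ("REQ-001", "TC-010"),
    ("REQ-001", "TC-011"),
    ("REQ-002", "TC-012"),
]

forward = defaultdict(list)   # requirement -> test cases that verify it
reverse = defaultdict(list)   # test case  -> requirements it covers
for requirement, test_case in links:
    forward[requirement].append(test_case)
    reverse[test_case].append(requirement)

# Any requirement with no linked test case is a coverage gap to resolve before execution.
all_requirements = {"REQ-001", "REQ-002", "REQ-003"}
coverage_gaps = all_requirements - set(forward)
print(dict(forward))
print(dict(reverse))
print("Uncovered requirements:", coverage_gaps)
```

Whether you build the matrices by hand or let a tool generate them, the requirement-to-test link is the piece of data that makes later regression testing and impact analysis possible.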

ValidationMaster™ supports fully automated validation testing, allowing users to set a date and time for testing.  Test scripts are run AUTOMATICALLY without human intervention, allowing multiple runs of the same scripts if necessary.

Continuous testing in a cloud environment is ESSENTIAL.  You must have the ability to respond to rapid changes in a cloud environment that may impact the validated state of the system.  Continuous testing reduces risk and ensures sustained compliance in a cloud environment.

The system automatically raises an incident report if a bug is encountered through automated testing.  The system keeps track of each test run and its results through automation.  ValidationMaster™ includes a dynamic dashboard that shows the pass/fail status of each test script as well as requirements coverage, open risks, incident trend analysis and much more.
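
The automated incident capture described above boils down to a simple rule: every failed step files an incident record with its evidence, so nothing depends on a tester remembering to write it up.  A minimal sketch of that rule, with illustrative field names:

```python
# Sketch: file an incident automatically for every failed test step.
def capture_incidents(script_name, step_results):
    """step_results: list of dicts like {"step": ..., "passed": bool, "evidence": ...}."""
    incidents = []
    for result in step_results:
        if not result["passed"]:
            incidents.append({
                "script": script_name,
                "failed_step": result["step"],
                "evidence": result.get("evidence"),
                "status": "OPEN",
            })
    return incidents

run = [{"step": "Verify audit trail entry is created", "passed": False, "evidence": "screenshot_042.png"}]
print(capture_incidents("OQ-07", run))
```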

The time is now for automated validation testing.  The good news is that there are enterprise level applications on the market that facilitate the full validation lifecycle process.  Why are you still generating manual test scripts?  Automated testing is easier than you think!

Cloud Validation Strategies

If you were to ask me 10 years ago how many of my life sciences clients were deploying systems in a cloud environment, I would have said perhaps one or two. If you ask me today how many of my clients are deploying cloud technologies, I would say almost all of them, in one way or another.  The adoption of cloud technologies within life sciences companies is expanding at a rapid pace.

From a validation perspective, this trend has profound consequences.  Here are some key concerns and questions to be answered for any cloud deployment.

  1. How do you validate systems in a cloud environment?
  2. What types of governance do you need to deploy applications in a cloud environment?
  3. How do you manage change in a cloud environment?
  4. How do you maintain the validated state in the cloud?
  5. How can you ensure data integrity in the cloud?
  6. How do you manage cybersecurity in a cloud environment?

The answers to these questions are obvious and routine to validation engineers managing systems in an on-premise environment, where the environment is controlled by the internal IT team.  They have control over changes, patches, system updates, and other factors that may impact the overall validated state.  In a cloud environment, the software, platform and infrastructure are delivered as a SERVICE.  By leveraging the cloud, life sciences companies are effectively outsourcing the management and operation of a portion of their IT infrastructure to the cloud provider.  However, compliance oversight and responsibility for your validated system cannot be delegated to the cloud provider.  Therefore, these services must have a level of control sufficient to support a validated systems environment.

For years, life sciences companies have been accustomed to governing their own systems environments.  They control how often systems are updated, when patches are applied, when system resources will be updated, etc.  In a cloud environment, control is in the hands of the cloud service provider.  Therefore, who you choose as your cloud provider matters.

So what should your strategy be to manage cloud-based systems?

  • Choose Your Cloud Provider Wisely – Not all cloud providers are created equal.  The Cloud Security Alliance (https://cloudsecurityalliance.org/) is an excellent starting point for understanding cloud controls.  The Cloud Controls Matrix (CCM) is an Excel spreadsheet that allows you to assess a vendor’s readiness for the cloud.  You can download it free of charge from the CSA.
  • Establish Governance For The Cloud – You must have an SOP for the management and deployment of the cloud and ensure that this process is closely followed.  You also need an SOP for cyber security to provide a process for protecting validated systems against cyber threats.
  • Leverage Cloud Supplier Audit Reports For Validation – All cloud providers must adhere to standards for their environments.  Typically, they gain 3rd party certification and submit to Service Organization Control (SOC) independent audits.  It is recommended that you capture the SOC 1/2/3 and SSAE 16 reports.  You also want to understand any certifications that your cloud provider has.  I would archive their certifications and SOC reports with the validation package as part of my due diligence for the supplier audit.
  • Embrace Lean Validation Principles and Best Practices – Eliminating waste and improving efficiency is essential in any validated systems environment.  Lean validation is derived from the principles of lean manufacturing.  Automation is a MUST.  You need to embrace lean principles for greater efficiency and compliance.
  • Automate Your Validation Processes – Automation and Lean validation go hand in hand.  The testing process is the most laborious process.  We recommend using a system like ValidationMaster™ to automate requirements management, test management and execution, incident management, risk management, validation quality management, agile validation project management, and validation content management. ValidationMaster™ is designed to power lean validation processes and includes built-in best practices to support this process.
  • Use a Risk-Based Approach To Validation – Not all validation exercises are created equal.  The level of validation due diligence required for your project should be based on risk – regulatory, technical and business.  Conduct a risk assessment for all cloud-based systems.
  • Adopt Continuous Testing Best Practices – The cloud is under continuous change, which in and of itself seems counter-intuitive to the validation process.  Continuous testing can be onerous if your testing process is MANUAL.  However, if you adopt lean, automated testing processes, regression testing is easy.  You can establish a routine schedule for testing, and if your cloud provider delivers a dashboard that tells you when patches/updates/features have been applied and the nature of them, you can select your regression testing plan based on a risk and impact assessment (a simple sketch of that selection logic follows below).
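
Here is a minimal sketch of the selection logic mentioned in the last point: given the components touched by a provider update and a simple risk rating per component, pick which regression scripts to re-run.  The risk scale, component names, and script IDs are illustrative assumptions, not a prescribed standard:

```python
# Sketch: select regression test scripts based on the impact of a cloud provider update.
# Component risk ratings and script mappings are illustrative assumptions.
component_risk = {"audit_trail": "high", "reporting": "medium", "ui_theme": "low"}
scripts_by_component = {
    "audit_trail": ["OQ-21", "PQ-05"],
    "reporting": ["OQ-33"],
    "ui_theme": ["OQ-40"],
}

def select_regression_scripts(changed_components, minimum_risk="medium"):
    """Return the scripts to re-run for components at or above the risk threshold."""
    order = {"low": 0, "medium": 1, "high": 2}
    selected = []
    for component in changed_components:
        # Unknown components default to "high" risk as a conservative choice.
        risk = component_risk.get(component, "high")
        if order[risk] >= order[minimum_risk]:
            selected.extend(scripts_by_component.get(component, []))
    return sorted(set(selected))

# A provider patch that touched the audit trail and the UI theme:
print(select_regression_scripts(["audit_trail", "ui_theme"]))  # ['OQ-21', 'PQ-05']
```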

 

Cloud environments can be validated!  A clear, practical approach that embraces lean validation and continuous testing is key, as is cloud governance to ensure data integrity and sustained compliance.

Cloud technologies are here to stay.  Regulators don’t object to the use of the cloud; they want to know how you are managing it and ensuring the integrity of the data.  They also want you to confirm that you are maintaining the validated state in the cloud.  The principles of validation endure in the cloud.  Just because you are in a cloud environment does not mean validation principles no longer apply.  Consider the impact of cybersecurity in your cloud environment and adopt continuous testing strategies to ensure sustained compliance.

Understanding Lean Validation: A Practical Approach

Lean manufacturing has been with us since 1988.  Lean principles were derived from the Japanese manufacturing industry.  The “Lean” process was originally created and adopted by Toyota to eliminate waste and inefficiency in its manufacturing operations.  These practices grew into the Toyota Production System (TPS), which is arguably one of the greatest manufacturing success stories of all time.  The focus of lean is the elimination of waste and inefficiencies throughout the manufacturing process.

To identify and eliminate waste from the production process, Toyota believed it was important to understand exactly the nature of waste and where it existed. While Toyota’s products differed significantly between factories, the typical wastes found in manufacturing environments were similar in nature. For each waste, Toyota developed an effective strategy to reduce or eliminate its effects, thereby improving overall performance and quality.  The process became so successful that it has been embraced in manufacturing sectors around the world. Today’s manufacturers have embraced the concepts and philosophies of lean.  Being lean is considered a critical competitive advantage and a strategic imperative.  It has made Toyota an automotive success story.

I have been a validation practitioner for over 30 years.  In the process of validating large engineering systems and life sciences quality and compliance management technologies, I have experienced firsthand the waste involved in validation processes.  From manually cutting and pasting screenshots into test cases, to manually tracing requirements to test scripts, to manual document routing and review and rewriting scripts for regression testing, waste abounds in the validation process.  As I pondered my work over the past three decades, I began to think of a better way to validate computer systems.

In my initial research, I started to think of all of the wastes throughout validation processes and categorized them.

[Figure: Categories of validation waste]

Wastes in the validation process include planning process wastes, testing wastes, documentation wastes, defect wastes, wasteful requirement processes, quality and incident management wastes, and wastes associated with the re-validation and re-testing of software applications.  As I began to ponder solutions to eliminate these wastes throughout the validation process, I embarked on the development of a Lean Validation strategy.  We endeavor to save our clients both time and money throughout the validation process.  OnShore Technology Group is the only company whose practice focuses on the delivery of Lean Validation services.  Our products and services are uniquely designed to power lean validation processes.

[Figure: Lean Validation]

Using lean validation principles can eliminate wasteful manual validation processes and deliver significant improvements in validation efficiency, document cycle times, and testing productivity, along with a greater ability to identify and correct software defects – leading to enhanced software quality, lower costs and improved regulatory compliance.

OnShore Technology Group has pioneered the principles and best practices of “Lean Validation” – the process of eliminating waste and inefficiencies while driving greater software quality throughout the validation process.  In addition to the delivery of expert lean validation services, OnShore’s flagship software application is known as ValidationMaster™ – the FIRST Enterprise Validation Management and Quality system designed to automate lean validation processes.  ValidationMaster™ delivers a single source of truth for any type of validation including software, equipment, process, cold chain, facility, and other types of validation projects.

ValidationMaster™ is also the first validation management system accessible via any Windows, Apple or Android mobile device.  The system includes key features such as a validation dashboard, fully automated test script development and execution, automatic requirements traceability, custom report development and generation, and a full range of quality management capabilities (training, audit management, change management, controlled document management, ISO, validation KPIs, CAPA, nonconformance management and much more).

In 2017, OnShore Technology Group was recognized as the “Best IV&V Automation Solutions Provider” and the “Software Validation Testing Experts of the Year”.  In 2016, OnShore made the coveted annual list of CIOReview Magazine as one of the “20 Most Promising Pharma & Life Sciences Tech Solution Providers”.

Contact us today to learn more and see a live demonstration of ValidationMaster™.

Leveraging the NIST Cybersecurity Framework

As a validation engineer, why should you be concerned about cybersecurity?  Good question!  Today’s headlines are filled with instances of cyber attacks and data breaches impacting some of the largest corporate systems around.  As validation engineers, our job is to confirm software quality and that systems meet their intended use.  How can you realistically do this without paying any attention to the threat of potential cyber attacks on the validated systems environment?

As with every system environment, you must ensure your readiness to help prevent a cyber event from occurring.  Of course, you can never fully protect your systems to the extent that a cyber attack will never be successful, but you can certainly PREPARE and reduce the probability of this risk.   That’s what this article is all about – PREPAREDNESS.

The NIST Cybersecurity Framework was created through collaboration between industry and government and consists of standards, guidelines, and practices to promote the protection of critical infrastructure.

To get a copy of the NIST Cyber Security Framework publication, click here.  If you are not familiar with the NIST Cyber Security Framework, you can view an overview video and get a copy of the Excel spreadsheet.

Remember the old adage, “…if it’s not documented, it didn’t happen…”?  You must document controls, processes and strategies so that you are able to defend your readiness assessment for cybersecurity.  The NIST Cyber Security Framework is designed to help organizations view cybersecurity in a systematic way as part of the overall risk management strategy for validated systems.  The Framework consists of three parts:

  1. Framework Core – a set of cybersecurity activities, outcomes, and informative references that are common across your validated systems environments.  The Framework Core consists of five concurrent and continuous Functions – Identify, Protect, Detect, Respond, and Recover – as shown in the figure below.
  2. Framework Profile – helps align your cybersecurity activities with business requirements, risk tolerances, and resources.
  3. Framework Implementation Tiers – a method to view, assess, document and understand the characteristics of your approach to managing cybersecurity risks in validated systems environments.  This assessment is part of your Cybersecurity Qualification (CyQ).  Life sciences companies should characterize their level of readiness from Partial (Tier 1) to Adaptive (Tier 4).  You can use whatever scale you like in your assessment.

[Figure: NIST Cybersecurity Framework Core – Identify, Protect, Detect, Respond, Recover]
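
To show how the Framework Core and Implementation Tiers might be recorded for a validated system, here is a minimal sketch.  The system name and tier ratings are illustrative assumptions; only the five Functions and the Partial-to-Adaptive tier names come from the Framework itself:

```python
# Sketch: documenting a per-Function readiness rating for a validated system.
# Tier scale follows NIST's Partial (1) through Adaptive (4); the ratings are examples only.
FUNCTIONS = ["Identify", "Protect", "Detect", "Respond", "Recover"]
TIERS = {1: "Partial", 2: "Risk Informed", 3: "Repeatable", 4: "Adaptive"}

assessment = {
    "system": "LIMS (validated)",
    "ratings": {"Identify": 3, "Protect": 2, "Detect": 2, "Respond": 1, "Recover": 2},
}

def summarize(assessment):
    """Produce a short, documentable readiness profile from the tier ratings."""
    lines = [f"Cybersecurity readiness profile for {assessment['system']}:"]
    for function in FUNCTIONS:
        tier = assessment["ratings"][function]
        lines.append(f"  {function}: Tier {tier} ({TIERS[tier]})")
    weakest = min(FUNCTIONS, key=lambda f: assessment["ratings"][f])
    lines.append(f"  Weakest Function: {weakest} – prioritize remediation here.")
    return "\n".join(lines)

print(summarize(assessment))
```

A one-page profile like this, archived with the validation package, is exactly the kind of documented evidence discussed below.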

Most companies are adept at RESPONDING to cyber events rather than preventing them.  This Framework should be part of your overall integrated risk management strategy for validation.  We recommend that validation engineers DOCUMENT their strategy to confirm due diligence with respect to cybersecurity.  In my previous blog post, I recommended that in addition to conducting IQ, OQ, PQ, and UAT testing you also conduct a CyQ readiness assessment.

Cyber threats are a clear and present danger to companies of all sizes and types.  As validation engineers, we need to rethink our validation strategies and adapt to changes which can have significant impact on our validated systems environments.  Whether you are in the cloud or on-premise, cyber threats are real and may impact you.  This problem is persistent and is not going away anytime soon.  Readiness and preparedness are the key.  Some think that issues concerning cybersecurity are only the purview of the IT team – THINK AGAIN!  Cybersecurity is not only an IT problem; it is an enterprise problem that requires an interdisciplinary approach and a comprehensive governance commitment to ensure that all aspects of your validation processes and business processes are aligned to support effective cybersecurity practices.

If you are responsible for software quality and ensuring the readiness of validated systems, you need to be concerned about this matter.  The threats are real.  The challenges are persistent.  The need for greater diligence is upon us.  Check out the NIST Cyber Security Framework.  Get your cyber house in order.

 

How Vulnerable Is Your Validated System Environment?

If you are a validation engineer reading this post, I have a simple question to ask you: “How vulnerable is your validated systems environment?”  Has anyone ever asked you this question?  How often do you think about the impact of a cyber threat against your validated computer system?  I know… you may be thinking that this is the IT department’s responsibility and you don’t have to worry about it.  THINK AGAIN!

If you have picked up a newspaper lately, you will see cyber threats coming from all directions. The White House has been hacked; companies big and small have been hacked into; and from the hacker’s perspective there seems to be no distinction between small, medium or large enterprises. In summary, everyone is vulnerable.  A cyber threat is the possibility of a malicious attempt to damage or disrupt a computer network or system.

Network applications continue to create new business opportunities and power the enterprise. A recent report suggested that the impact and scale of cyber-attacks is increasing dramatically. The recent leak of government-developed malware has given cyber criminals greater capabilities than they had before. IT is struggling to keep pace with the flow of important software security patches and updates, and the continuous adoption of new technologies like the Internet of Things (IoT) is creating new vulnerabilities to contend with.

A 2017 study by CSO highlighted the fact that 61% of corporate boards still see security as an IT issue rather than a corporate governance issue. This is only one part of the problem. It is not even on the radar of most validation engineers.  You cannot begin to confirm that a system meets its intended use without dealing with the concept of cybersecurity and its impact on validated systems.

Validated computer systems house GMP and quality information as well as the data required by regulators, thus particular scrutiny must be paid to these systems.  Compromise of a validated system may lead to adulterated product and to issues affecting the quality, efficacy and other critical quality attributes of marketed product. Therefore, attention must be paid to validated systems with respect to their unique vulnerability.

As part of the validation lifecycle process, we conduct IQ, OQ, and PQ testing as well as user acceptance testing to confirm a system’s readiness for intended use.  I am suggesting that another type of testing be added to the domain of validation testing: Cybersecurity Qualification, or CyQ.

Cybersecurity Qualification is confirmation of a system’s readiness to protect against a cyber attack.

[Figure: Cybersecurity Qualification (CyQ)]
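
As a simple illustration of what a documented CyQ might look like, here is a sketch of a readiness check scored against a handful of security controls.  The specific controls and the scoring are assumptions for illustration, not a mandated checklist – tailor them to your own risk assessment:

```python
# Sketch: a documented CyQ readiness check for a validated system.
# The controls listed are illustrative; derive your own from a risk assessment.
controls = {
    "Unique user accounts and role-based access": True,
    "Audit trail enabled and tamper-evident": True,
    "Security patches current on all servers": False,
    "Data encrypted in transit and at rest": True,
    "Documented incident response procedure": False,
}

def cyq_report(system_name, controls):
    """Summarize control status into a record you can archive with the validation package."""
    gaps = [name for name, in_place in controls.items() if not in_place]
    readiness = 100 * (len(controls) - len(gaps)) / len(controls)
    return {
        "system": system_name,
        "readiness_percent": round(readiness),
        "gaps": gaps,
        "conclusion": "READY" if not gaps else "REMEDIATION REQUIRED",
    }

print(cyq_report("ERP (validated)", controls))
```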

You should incorporate CyQ in your validation testing strategy.  It is imperative that your validated systems are protected against a cyber event.  You must document this as well to prove that you have conducted your due diligence.  Given all of the attention to cyber events in the news, you need a strategy to ensure sustained security and compliance.  Are you protecting your validated systems?  If not, you should.