Is Your Validation Team Ready For GDPR?

GDPR stands for the General Data Protection Regulation.  It governs personal data that companies collect about customers, potential customers, employees, and others.  Regulators are keen to understand how this information is managed and maintained over time.

In April 2016 the FDA issued new draft guidance on data integrity and cGMP compliance. The guidance was issued in a question-and-answer format and focused on frequently occurring data integrity lapses. When the FDA finalizes the guidance, it will represent the agency’s current thinking on data integrity and cGMP compliance for the industry.

Why did the FDA draft such guidance? The agency has increasingly observed cGMP violations involving data integrity during inspections. More than 21 warning letters have involved data integrity lapses in drug manufacturing since January 2015. The FDA strongly believes that ensuring data integrity is an important part of the industry’s responsibility to ensure the safety, efficacy, and quality of drugs and to protect public health.

In recent years, many articles have referred to data integrity using the ALCOA acronym, which means that data must be attributable, legible, contemporaneous, original (or a true copy), and accurate. It should be noted that the requirements for record retention and review do not differ depending on the data format: paper-based and electronic record-keeping systems are subject to the same requirements.  For example, section 211.68 requires that backup data be exact, complete, and secure from alteration, inadvertent erasure, or loss. Section 211.180 requires true copies or other accurate reproductions of the original records.

Most life sciences companies validate business systems that have GMP impact. It is best practice to conduct installation, operational, and performance qualification testing to demonstrate that a computer system is fit for its intended use and document any incidents that may affect software quality or the reliability of the records. Data integrity and validation go hand in hand but with the latest guidance there’s really nothing new under the sun from a validation perspective. The same level of due diligence and rigor must be applied to confirm that systems are suitable for their intended use and that the data integrity within these systems is sound.

When you are examining data integrity issues, it is critically important to look at all aspects of the system, including system security and how it is established, to ensure that records entered into the system have the highest level of integrity. The FDA recommends that you restrict the ability to alter files and settings within the system to those administrator users who require such access. A recent warning letter cited the failure to prevent unauthorized access or changes to data.

For systems designed in accordance with 21 CFR Part 11, it is critical to understand that audit trails should be independent. I know this doesn’t come as a surprise to many, but I have seen systems where the audit trail could be turned on or off. Let me be clear: all systems designed in accordance with 21 CFR Part 11 must have an independent, computer-generated audit trail that cannot be turned off by ordinary means. This means that someone cannot go to a common function within the system and disable the audit trail. The FDA recommends that audit trails capturing changes to critical data be reviewed with each record and before final approval of the record, and that audit trails be subject to regular review. Recent warning letters have cited, for example, a lack of audit trails for lab instruments and the fact that audit trails could be turned off. If an audit trail can be turned off, fraudulent activity may occur. It is important to confirm that your systems’ audit trails capture information regarding each record and that these audit trails are independent, to ensure data integrity.
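
To make the idea concrete, here is a minimal conceptual sketch of an independent, append-only audit trail. It is not taken from any particular product; the field names are illustrative assumptions. The key point is that there is no function to disable the trail, and every change to a record produces an immutable entry that can be reviewed alongside that record.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List


@dataclass(frozen=True)
class AuditEntry:
    """One immutable audit trail record for a single data change."""
    user_id: str
    timestamp: str
    record_id: str
    field_name: str
    old_value: str
    new_value: str
    reason: str


class AuditTrail:
    """Append-only log: there is deliberately no method to disable or delete entries."""

    def __init__(self) -> None:
        self._entries: List[AuditEntry] = []

    def record_change(self, user_id: str, record_id: str, field_name: str,
                      old_value: str, new_value: str, reason: str) -> AuditEntry:
        # Every change is stamped with who, what, when, and before/after values.
        entry = AuditEntry(user_id, datetime.now(timezone.utc).isoformat(),
                           record_id, field_name, old_value, new_value, reason)
        self._entries.append(entry)
        return entry

    def entries_for_record(self, record_id: str) -> List[AuditEntry]:
        # Supports reviewing the audit trail alongside each record before approval.
        return [e for e in self._entries if e.record_id == record_id]
```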

Data integrity is not a new concept, but it is one that is receiving a lot of attention. Compliance with data integrity guidelines is largely common sense for those in the compliance business. Look at data integrity not as the latest buzzword but as a reminder of how important it is to ensure the integrity and authenticity of data established and maintained within a validated systems environment. This will go a long way toward ensuring sustained compliance.

Staffing Your Next Validation Project: What You Should Know

Finding good talent is always a challenge.  Good people are hard to find.  In the validation world, the unique skill sets required for success can be difficult to find, and filling them takes diligence.  In today’s competitive environment, much of the good validation talent is already gainfully employed, but you can always find good people if you know where to look and what you are looking for.

When seeking validation talent for your next project, there are several points to consider.  Specifically, you should consider outsourcing as a means to acquire qualified experienced talent on demand.  Outsourcing talent acquisition and services offers your firm several key benefits:

  • Provides deep industry domain experience on demand
  • Frees up your internal team to focus on what they do best
  • Saves 30 – 40% on outsourced talent
  • Minimizes recruitment delays and costs
  • Eliminates overhead costs (unemployment insurance, taxes, healthcare, labor law costs)
  • Scales easily – add or terminate resources as required

OnShore Technology Group offers a unique staffing service called ValidationOffice™.  This unique service offering delivers personnel with deep industry domain knowledge in life sciences and the field of validation.  Our validation engineers are adept at navigating the ever-changing regulatory landscape and understand well the principles and best practices for validation in a highly regulated environment.  Each candidate has over a decade of experience in enterprise validation projects and can work on any size project.

When looking to outsource part or all of your validation team, it is important that you have a team that is responsive, qualified and flexible enough to meet your demands.

Validation is a highly regulated process requiring rigorous training and experience to help ensure success.  ValidationOffice™ delivers staffing on demand for validation project resources.

WHAT YOU SHOULD KNOW

If you have ever worked with an outsourced validation firm, below are 5 things you should know.

FIRST THING YOU SHOULD KNOW

ValidationOffice™ can be used to provide temporary or permanent experienced validation staff.  According to the American Staffing Association (ASA), staffing companies hire millions of temporary and contract employees each year, and around 33% of contract employees are eventually offered permanent jobs.  If you are seeking validation talent, you should know that there are options to help you reduce the risk of hiring mistakes and allow you to evaluate a potential candidate on the job before making a permanent hiring decision.

SECOND THING YOU SHOULD KNOW

We specialize in validation talent.  We can provide validation test engineers, validation project managers, validation program managers, validation technical engineers, technical writers, and other relevant skill sets.  Our candidates come from the ranks of leading companies and have the skills and know-how to hit the ground running.  A good staffing partner is essential to oversee staffing needs as they arise.  You should know that you have a partner acting on your behalf.

THIRD THING YOU SHOULD KNOW

How we recruit and retain talent is critical for you to understand.  We offer a simplified and streamlined staffing process.  Hiring the right person means that we have to be intimately familiar with your hiring requirements and business needs.  We listen.  Hiring the wrong person can be an expensive proposition.  We provide just-in-time access to qualified candidates ready to work in your office environment.

FOURTH THING YOU SHOULD KNOW

You should know that your temporary staff members are taken care of administratively.  We use an applicant tracking system to track candidate activity and a customer relationship management system for business development.  Technology drives our business processes to help us deliver the best candidates for you.  You should know that our systems provide efficiencies for you and the job candidates.

FIFTH THING YOU SHOULD KNOW

Finally, you should know that not all staffing firms are created equal.  We are very selective and dedicated to bringing you the very best in validation talent.  Our services can be beneficial for both companies and prospective employees.  We focus on your objectives to deliver the best and brightest to serve your needs.

For your next project, you may want to consider getting validation talent on demand.  Contact us for more details.

Why Are You Still Generating Validation Test Scripts Manually?

Drafting validation scripts is one of the key activities in a validation exercise, designed to provide documented evidence that a system performs according to its intended use.  The FDA and other global agencies require objective evidence, usually in the form of screenshots that sequentially capture the target software process, to provide assurance that systems can consistently and repeatedly perform the various processes representing the intended use of the system.

Since the advent of the PC, validation engineers have been writing validation test scripts manually.  The manual process of computer systems validation test script development involves capturing screenshots and pasting them into Microsoft Word test script templates.  To generate screen captures, some companies use the Windows Print Screen function, TechSmith Snagit, and other such tools.  A chief complaint of many validation engineers is that the test script development process is slow and arduous.  Some validation engineers are very reluctant to update and re-validate systems because of this manual process.  So, the question posed by this blog article is simply this: why are you still generating test scripts manually?
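
Even a small script shows how much of this clerical work can be taken off the engineer’s plate. The sketch below is illustrative only, and assumes the third-party Python packages mss (screen capture) and python-docx (Word output); the step text and file names are made up for the example.

```python
import mss                      # third-party package: screen capture
from docx import Document       # third-party package: python-docx, Word output
from docx.shared import Inches

# Illustrative test steps; in practice these would come from your test case design.
steps = [
    "Step 1: Log in to the application under test",
    "Step 2: Open the record and enter the test data",
    "Step 3: Save the record and verify the confirmation message",
]

doc = Document()
doc.add_heading("OQ Test Script (draft)", level=1)

with mss.mss() as screen:
    for i, description in enumerate(steps, start=1):
        input(f"Perform '{description}', then press Enter to capture evidence...")
        image_file = screen.shot(output=f"step_{i}.png")   # capture the current screen
        doc.add_paragraph(description)
        doc.add_picture(image_file, width=Inches(6))       # place the evidence in the script

doc.save("oq_test_script_draft.docx")
```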

I have been conducting validation exercises for engineering and life sciences systems since the early 1980s.  I too have experienced first-hand the pain of writing test scripts manually.  We developed and practice “lean validation,” so I sought ways to eliminate manual, wasteful validation processes.  One of the most wasteful processes in validation is the manual capture, cutting, and pasting of screenshots into a Microsoft Word document.

The obvious follow-up question is: “How do we capture validation tests in a more automated manner to eliminate waste and create test scripts that are complete and accurate, and that provide the level of due diligence required for validation?”

In response to this common problem, we developed an Enterprise Validation Management system called ValidationMaster™.  This system includes TestMaster™, an automated testing system that allows validation engineers to capture and execute validation test scripts in a cost-effective manner.

TestMaster™ is designed to validate ANY enterprise or desktop application.  It is a browser-based system and allows test engineers to open any application on their desktop, launch TestMaster™, and capture their test scripts while sequentially executing the various commands in their applications.    As the validation engineer navigates through the application, TestMaster™ captures each screenshot and text entry entered in the application.

Once the test script is saved, TestMaster™ allows the script to be published in your UNIQUE test script template with the push of a button.  No more cutting and pasting screenshots by hand!  You can generate your test scripts in MINUTES as opposed to the hours it sometimes takes to compile documents from a series of screenshots.  If you are one of those validation engineers who does not like screenshots in your scripts, you can quickly and easily create text-based test steps using TestMaster™ instead.

So, what are the biggest benefits of using TestMaster™ versus manual processes?  There are four key benefits, summarized as follows:

  1.  Automated Test Script Execution – for years, validation engineers have wanted a more automated approach for the execution of validation test scripts.  ValidationMaster™ supports both hands-on and hands-off validation testing.  Hands-on validation testing is the process whereby a validation engineer looks at each step of a validation test script and executes the script step by step by clicking through the process.  Hands-off validation allows a validation engineer to execute a test script with no human intervention.  This type of hands-off regression testing is very useful for cloud-based systems or systems that require more frequent testing.  The validation engineer simply selects a test script and defines a date/time for its execution.  At the designated time, with no human intervention, the system executes the test script and reports the test results back to the system.  Automated testing is here!  Why are you still doing this manually?

  2.  Traceability – TestMaster™ allows validation engineers to link each test script to a requirement or set of requirements; the system thus delivers automatic traceability, which is a regulatory requirement (see the traceability sketch after this list).  With the click of a button, TestMaster™ allows validation engineers to create a test script directly from a requirement.  This is a powerful capability that allows you to see requirements coverage on demand through our validation dashboard.  This validation dashboard is viewable on a desktop or mobile device (Windows, Apple, Android).

  3.  Online Test Script Execution – one of the biggest problems with manual test scripts is that they must be printed and manually routed for review and approval.  Some companies that have implemented document management systems may be able to route the scripts electronically for review and approval.  The worst-case scenario is the company that has no electronic document management system and generates these documents manually.  TestMaster™ allows validation engineers to execute test scripts online and capture test script results easily.  The test script results can be captured in an automated way and published into executed test script templates quickly and easily.  If incidents (bugs/anomalies) occur during testing, users can automatically capture an incident report tied to the exact step where the anomaly occurred.  ValidationMaster™ is tightly integrated with a 21 CFR Part 11-compliant portal (ValidationMaster Portal™): once a test script is executed, it is automatically published to the ValidationMaster™ Portal, where it is routed for review and approval in the system.  The ability to draft, route, review, approve, execute, and post-approve validation test scripts is an important, time- and cost-saving feature that should be a part of any 21st century validation program.

  4.  Reuse Test Scripts For Regression Testing – manual test scripts are not readily reusable.  What I mean by this is that the Word documents must be edited or even re-written for validation regression testing.  Validation is not a one-time process.  Regression testing is a fact of life for validation engineers.  The question is, will you rewrite all of your test scripts or use automated tools to streamline the process?  ValidationMaster™ allows validation engineers to create a reusable test script library.  This library includes all the test scripts that make up your validation test script package.  During re-validation exercises, you have the ability to reuse the same test scripts for regression testing.
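
To illustrate what automatic traceability means in practice, here is a minimal sketch of a requirements-to-test coverage check. It is not TestMaster™ code; the requirement and script identifiers are hypothetical, and the point is simply that linking each test to a requirement lets a tool report coverage gaps automatically.

```python
from collections import defaultdict

# Hypothetical requirement-to-test links; a real tool would pull these from its database.
links = [
    ("URS-001", "TS-010"),
    ("URS-001", "TS-011"),
    ("URS-002", "TS-012"),
    ("URS-003", None),          # requirement with no linked test yet
]

coverage = defaultdict(list)
for requirement, test_script in links:
    if test_script is not None:
        coverage[requirement].append(test_script)
    else:
        coverage[requirement]   # touch the key so uncovered requirements still appear

print("Requirement   Linked test scripts    Covered?")
for requirement in sorted(coverage):
    tests = coverage[requirement]
    print(f"{requirement:<13} {', '.join(tests) or '-':<22} {'yes' if tests else 'NO'}")
```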

Given the rapid adoption of cloud, mobile and enterprise technologies in life sciences, a new approach to validation is required.  Yes, you can still conduct validation exercises on paper but why would you?  In the early days of enterprise technology, we did not have tools available that would facilitate the rapid development of validation test scripts.  Today, that is not the case.  Systems like ValidationMaster™ are available in either a hosted or on-premise environment.  These game-changing systems are revolutionizing the way validation is conducted and offering time/cost-saving features that make this process easier.   So why are you still generating test scripts manually?

Accelerating Validation in the 21st Century

Each day in companies across the globe, employees are being asked to do more with less.  The mantra of the business community in the 21st century is “accelerating business”.  You see it in marketing and all types of corporate communication.  In the validation world accelerating system implementation, validation and deployment is the cry of every chief information officer and senior executive.

Accelerating validation activity means different things to different people.  I define it as driving validation processes efficiently and effectively without sacrificing quality or compliance.  Most manual processes are paper-based, inefficient, and cumbersome.  Creating paper-based test scripts requires a lot of cutting/pasting of screenshots and other objective evidence to meet regulatory demands.  Getting validation information into templates with proper headers and footers in compliance with corporate documentation guidelines is paramount.

The question for today is “HOW DO WE ACCELERATE VALIDATION WITHOUT SACRIFICING QUALITY AND COMPLIANCE?”

Earlier in my career, I developed Validation Toolkits for Documentum and Qumas.  A validation toolkit was an early version of a validation accelerator, and many companies now offer them.  These toolkits come with pre-defined document templates and test scripts designed to streamline the validation process.  They are called “toolkits” for a reason – they are not intended to be complete validation projects.  They are intended to be a starting point for validation exercises.

The FDA states in the General Principles of Software Validation; Final Guidance For Industry and FDA Staff issued on January 11, 2002, “…If the vendor can provide information about their system requirements, software requirements, validation process, and the results of their validation, the medical device manufacturer (or any company) can use that information as a beginning point for their required validation documentation…”

Notice that the FDA says to use such information as “…a beginning point” for validation – NOT TO USE IT EXCLUSIVELY AS THE COMPLETE VALIDATION PACKAGE.  This is because the FDA expects each company to conduct its own due diligence for validation.  Also note that the FDA allows the use of such documentation if it is available from a vendor.

Beware that some vendors believe their “toolkits” can substitute for rigorous review of their software applications.  They cannot.  In addition to leveraging the software vendor’s information, you should ALWAYS conduct your own validation due diligence to ensure that applications are thoroughly examined, without bias, by your team or a designated validation professional.

It is an excellent idea to leverage vendor information if provided.  There is a learning curve with most enterprise applications and the use of vendor-supplied information can help streamline the process and add value.  The old validation toolkits were delivered with a set of document templates and pre-defined test scripts often resulting in hundreds of documents depending on the application.  In most cases, companies would edit the documents or put them in their pre-defined validation templates for consistency.  This resulted in A LOT OF EDITING!  In some cases, depending on the number of documents, time-savings could be eroded by just editing the existing documents.

THERE IS A BETTER WAY!  

What if YOUR unique validation templates were pre-loaded into an automated Enterprise Validation Lifecycle Management system?  What if the Commercial-off-the-Shelf (COTS) product requirements were pre-loaded into this system?  Further, what if a full set of OQ/PQ test scripts were traced to each of the pre-loaded requirements?  What if you were able to export all of these documents on day one in your unique validation templates?  What if you only had to add test scripts representing the CHANGES you were making to a COTS application versus writing all of your test scripts from scratch?  Finally, what if you could execute your test scripts online and automatically generate a requirements trace matrix and a validation summary report with PASS/FAIL status?

This is the vision of the CloudMaster 365™  Validation Accelerator.  CloudMaster 365™  is the next generation of the validation toolkit concept.    The ValidationMaster Enterprise Validation Management system is the core of the CloudMaster 365™ Validation Accelerator.  The application can be either hosted or on-premise.  This is a great way to jump start your Microsoft Dynamics 365® validation project.

CloudMaster 365

The CloudMaster 365™ Validation Accelerator includes three key components:

  • ValidationMaster™ Enterprise Validation Management System
  • Comprehensive set of user requirements for out-of-the-box features
  • Full set of IQ/OQ/PQ test scripts and validation document templates

The Validation Accelerator is not intended to be “validation out of the box”.  It delivers the foundation for your current and future validation projects, enabling “Level 5” validation processes.  If you consider the validation capability maturity model, most validation processes at Level 1 are ad hoc, chaotic, undefined, and reactive to project impulses and external events.

Validation Maturity Model

Level 5 validation is an optimized, lean validation process facilitated by automation.  The test process is optimized, quality control is assured, and there are specific measures for defect prevention and management.  Level 5 cannot be achieved without automation, and that automation is powered by ValidationMaster™.  With the CloudMaster 365™ strategy, instead of having “toolkit” documents suitable for only one project, you have a system that manages your COTS or bespoke development and the full lifecycle of validation over time, as required by regulators.  This is revolutionary in terms of what you get.

CloudMaster 365™ TRULY ACCELERATES VALIDATION.  It can not only be used for Microsoft Dynamics 365®, but is also available for other applications including:

  • CloudMaster 365™ Validation Accelerator For Dynamics 365®
  • CloudMaster 365™ Validation Accelerator For Merit MAXLife®
  • CloudMaster 365™ Validation Accelerator For Yaveon ProBatch®
  • CloudMaster 365™ Validation Accelerator For Oracle e-Business® or Oracle Fusion®
  • CloudMaster 365™ Validation Accelerator For SAP®
  • CloudMaster 365™ Validation Accelerator For BatchMaster®
  • CloudMaster 365™ Validation Accelerator For Edgewater EDGE®
  • CloudMaster 365™ Validation Accelerator For VeevaVault®
  • CloudMaster 365™ Validation Accelerator For Microsoft SharePoint®

Whether you are validating an enterprise application or seeking to drive your company to higher levels of efficiency, you should consider how best to accelerate your validation processes not solution by solution but in a more holistic way that ensures sustained compliance across all applications.  If you are embarking on the validation of Microsoft Dynamics 365®, or any of the applications listed above, the CloudMaster 365™ Validation Accelerator is a no-brainer.  It is the most cost-effective way to streamline validation and ensure sustained compliance over time.

Risk Management & Computer Systems Validation: The Power Twins

For as long as systems have been validated, risk has been an inherent part of the process.  Although validation engineers have been drafting risk assessments since the beginning of computer systems validation, many do not understand how crucial this process is to the overall validation effort.

Risk management and validation go hand-in-hand.  The ISPE GAMP 5® (“GAMP 5”) methodology is a risk-based approach to validation.  GAMP 5 recommends that you scale all validation life cycle activities and associated documentation according to risk, complexity, and novelty.  As shown in the figure below, the key driver for GAMP 5 is science-based quality management of risks.

Drivers for GAMP 5

From a systems perspective, quality risk management, according to GAMP 5, is “…a systematic approach for the assessment, control, communication, and review of risks to patient safety, product quality, and data integrity.  It is an iterative process applied throughout the entire system life cycle…”  The guidance recommends qualitative and quantitative techniques to identify, document, and manage risks over the lifecycle of a computer system.  Most importantly, risks must be continually monitored to ensure the ongoing effectiveness of each implemented control.
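
As a simple illustration of a quantitative technique, the sketch below scores each identified risk for severity, probability, and detectability on a 1–3 scale and derives a priority that can drive the depth of testing. The scale, thresholds, and example risks are my own illustrative assumptions, not values taken from GAMP 5.

```python
# Each risk is scored 1-3 for severity, probability, and detectability,
# where detectability is scored so that 3 means the failure is hard to detect.
def risk_priority(severity: int, probability: int, detectability: int) -> str:
    """Translate the combined score into a priority that drives test rigor."""
    score = severity * probability * detectability
    if score >= 18:
        return "HIGH"
    if score >= 8:
        return "MEDIUM"
    return "LOW"

# Illustrative risks only.
risks = [
    ("Electronic signature not applied at batch release", 3, 2, 3),
    ("Report header shows the wrong site name",           1, 2, 1),
]

for description, severity, probability, detectability in risks:
    print(f"{risk_priority(severity, probability, detectability):<6}  {description}")
```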

Risk assessments may be used to drive the validation testing process.  In our practice, we focus on critical quality attributes and four types of risks:

  • Business Risks
  • Regulatory Risks
  • Technical Risks
  • Cybersecurity Risks

The last type of risk, cybersecurity risk, is one that has not gotten a lot of attention from validation engineers and is not explicitly mentioned in GAMP 5.  However, this type of risk represents a clear and present danger to ALL systems, not just validated ones.  Cyber threats are real.  Large-scale cybersecurity attacks continue to proliferate across many enterprises, and cyber criminals are broadening their approaches to strengthen their impact.  You need a holistic approach to risk that includes not only the traditional GAMP risk assessment, as highlighted in the figure below, but also cybersecurity as a real threat and risk to validated systems environments.

Risk Assessment – GAMP 5

Cyber threats can certainly impact product quality, patient safety, and even data integrity.  We incorporate these risks into our risk management profile to provide a more comprehensive risk assessment for computer systems.

ZDNet reports that “…As new security risks continue to emerge, cloud security spending will grow to $3.5 billion by 2021…”

ZDNet June 15, 2017

As life sciences companies increase their adoption of the cloud, new challenges for validated systems environments are emerging, along with the risks that go with them.  I wrote a blog post recently called “Validation as we know it is DEAD”.  In that post, I addressed the challenges and opportunities that cloud and cybersecurity bring.  Although cloud security “solutions” will drive spending by 2021, the solution is not necessarily a technology issue.  You can attack cyber risks effectively with STRATEGY.

For validated systems environments, I recommend a cybersecurity plan to combat hackers and document your level of due diligence for validated systems.  Remember, in validated systems, “…if it’s not documented, it didn’t happen…”.  You must document your plans and develop an effective strategy for all of your computer systems.

Validation and risk are the power twins of compliance.  Risk management can not only facilitate the identification of controls to protect your systems long term; it can also help ensure compliance and data integrity and drive efficiencies in your lean validation processes.  In conducting your risk assessments, do not ignore cybersecurity risks – they are the elephant in the room.

Accelerating Validation: 10 Best Practices You Should Know in 2018

As we open the new year with resolutions and fresh thinking, I want to offer 10 best practices that should be adopted by every validation team.  The major challenges facing validation engineers every day include cyber threats against validated computer systems, data integrity issues, maintaining the validated state, cloud validation techniques, and a myriad of other issues that affect validated systems.

I have been championing the concepts of validation maturity and lean validation throughout the validation community.  In a recent blog post, I highlighted reasons why validation as we have known it over the past 40 years is dead.  Of course, I mean this somewhat tongue in cheek.

The strategies included in FDA guidance, and those we used to validate enterprise systems in the past, do not adequately address IoT, cloud technologies, cybersecurity, enterprise apps, mobility, and development and deployment approaches such as Agile, all of which change the way we think about and manage validated systems.

The following paragraphs highlight the top 10 best practices that should be embraced for validated computer systems in 2018.

Practice 1 – Embrace Lean Validation Best Practices

Lean validation is the practice of eliminating waste and inefficiencies throughout the validation process while optimizing the process and ensuring compliance. Lean validation is derived from lean manufacturing practices, where non-value-added activity is eliminated. As a best practice, it is good to embrace lean validation within your processes. To fully embrace lean, it is necessary to automate your validation processes, so automation should be part of your consideration for updating your validation processes in 2018.  My recent blog post on lean validation discusses lean in detail and how it can help you not only optimize your validation process but also achieve compliance.

Practice 2 – Establish Cybersecurity Qualification and SOP

I have noted in previous blog posts that cyber threats are the elephant in the room. Although validation engineers pledge to confirm that a system meets its intended use through the validation process, many organizations do not treat cybersecurity as a clear and present danger to validated systems. I believe this is a significant oversight in many validation processes. Thus, I have developed a fourth leg of validation called cybersecurity qualification, or CyQ. Cybersecurity qualification is an assessment of the readiness of your processes and controls to protect against a cyber event. It includes testing to confirm the effectiveness and strength of your controls. In addition to IQ, OQ, and PQ, I have added CyQ as the fourth leg of my validation approach. It is strongly recommended in 2018 that you consider this best practice.
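
For illustration only, a CyQ readiness assessment could be summarized as simply as the sketch below. The control names and pass/fail states are hypothetical assumptions, not a prescribed checklist, and in practice each gap would feed a remediation and testing plan.

```python
# Hypothetical controls and their current state; each gap would feed remediation and testing.
cyq_controls = {
    "Unique user accounts enforced (no shared logins)": True,
    "Role-based access limits who can alter records and settings": True,
    "Security patches applied within the defined window": False,
    "Backups encrypted and the restore procedure tested": True,
    "Intrusion/anomaly monitoring enabled for the validated environment": False,
}

passed = sum(cyq_controls.values())
print(f"CyQ readiness: {passed}/{len(cyq_controls)} controls in place")
for control, in_place in cyq_controls.items():
    print(f"  [{'PASS' if in_place else 'GAP '}] {control}")
```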

Practice 3 – Automate Validation Testing Processes

In 2018, many validation processes are still paper-based. Companies are still generating test scripts using Microsoft Excel or Microsoft Word and manually tracing test scripts to requirements. This is not only time-consuming but very inefficient, and it causes validation engineers to focus more on formatting test cases than on finding bugs and other software anomalies that could affect the successful operation of the system. Lean validation requires an embrace of automated testing systems. While there are many point solutions on the market that address different aspects of validation testing, for 2018 it is strongly recommended that you embrace enterprise validation management as part of your overall strategy. An enterprise validation management system such as ValidationMaster™ manages the full lifecycle of computer systems validation. In addition to managing requirements (user, functional, design), the system also facilitates automated testing, incident management, release management, and agile software project methodologies. In 2018, the time has come to develop a reusable test script library to support future regression testing and maintenance of the validated state.

An enterprise validation management system will also offer a single source of truth for validation and validation testing assets.  This is no longer a luxury but mandatory when you consider the requirement for continuous testing in the cloud. This is a best practice whose time has come. In 2018, establish automated validation processes to help maintain the validated state and ensure software quality.

Practice 4 – Develop a Cloud Management For Validated Systems SOP

Many of today’s applications are deployed in the cloud. Office 365, Dynamics 365, and other such applications have gained prominence and popularity within life sciences companies. In 2018, you must have a cloud management standard operating procedure that provides governance for all cloud activities. This SOP should define how you acquire cloud technologies and what your specific requirements are. Failure to have such an SOP is failure to understand and effectively manage cloud technologies for validated systems. I have often heard from validation engineers that since you are in the cloud, the principles of validation no longer apply. This is absolutely not true. In a cloud environment, the principles of validation endure. You still must qualify cloud environments, and the responsibilities between you and the cloud provider must be clearly defined. If you don’t have a cloud SOP or don’t know where to start in writing one, write me and I will send you a baseline procedure.

Practice 5 – Establish Validation Metrics and KPIs

Peter Drucker said you can’t improve what you can’t measure. It is very important that you establish validation metrics and key performance indicators for your validation process in 2018.  You need to define what success looks like. How will you know when you’ve achieved your objectives? Some key performance indicators may include the number of errors, the number of systems validated, and the number of incidents (which should be trending downward), among other measures. Our ValidationMaster™ portal allows you to establish KPIs for validation and track them over time. In 2018 it’s important to embrace this concept. I have found over the years that most validation activities are not looked at from a performance perspective. They are often inefficient and time-consuming, and no one actually measures the value of validation to the organization or tracks any key performance indicators. The time has come to start measuring this where appropriate.
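
As a toy example of tracking one KPI, the snippet below computes an incident trend across validation cycles. The figures are invented; the point is simply that a trend, rather than an anecdote, tells you whether the process is improving.

```python
# Invented figures: open incidents recorded at the end of each validation cycle.
incidents_by_cycle = {"2017-Q3": 14, "2017-Q4": 11, "2018-Q1": 7}

cycles = list(incidents_by_cycle)
for previous, current in zip(cycles, cycles[1:]):
    delta = incidents_by_cycle[current] - incidents_by_cycle[previous]
    direction = "down" if delta < 0 else "up"
    print(f"{previous} -> {current}: incidents {direction} by {abs(delta)}")
```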

Practice 6 – Use Validation Accelerators For Enterprise Systems

Across the globe, validation engineers are being asked to do more with less. Organizations are demanding greater productivity and compliance. As you look at enterprise technologies such as Microsoft Dynamics 365®, you need to consider the importance of this environment and how it should be validated. There is a learning curve with any enterprise technology. You should recognize that although you may know a lot about validating systems, you may not know the ins and outs of this particular technology. The FDA stipulates in its guidance that you may use vendor information as a starting point for validation. Understand that there is no such thing as validation out of the box; therefore, you must do your due diligence when it comes to validated systems.

In 2018 we recommend that for systems such as Microsoft Dynamics 365, BatchMaster™, Edgewater Fullscope EDGE®, Veeva® ECM systems, Oracle e-Business®, and many others, you use a Validation Accelerator to jumpstart your validation process. OnShore Technology Group offers Validation Accelerators for the aforementioned software applications that will help streamline the validation process. Most notably, our validation accelerators come bundled with a full enterprise validation management system, ValidationMaster™.  This helps to deliver a single source of truth for validation.

Practice 7 – Fill Staffing Gaps With Experts on Demand

In 2018, you may find yourself in need of validation expertise for your upcoming projects but without sufficient internal resources to fulfill those needs. A burgeoning trend today is the outsourcing of validation staff to achieve your objectives – validation staffing on demand. OnShore offers an exclusive service known as ValidationOffice℠, which provides validation team members on demand. We recruit and retain top talent for your next validation project. We offer validation engineers, technical writers, validation project managers, validation test engineers, validation team leaders, and other such staff with deep domain experience and validation competence. In 2018 you may want to forgo direct hires and use ValidationOffice℠ to fulfill your next staffing need.

Practice 8 – Establish Validation Test Script Library

One of the primary challenges with validation testing is regression testing. Once a system is validated and brought into production, any software changes, patches, updates, or other required changes drive the need for regression testing. This often results in a full rewrite of manual, paper-based test scripts. Today’s validation engineers have tools at their disposal that eliminate the need to rewrite test scripts and allow them to pull the scripts required for regression testing from a reusable test script library. This is the agile, lean way to conduct validation testing. In 2018, you should consider purchasing an enterprise validation management system to manage a reusable test script library. This is a process that will save you both time and money in the development and online execution of validation test scripts.

Your test scripts can be either fully automated or semi-automated. Fully automated test scripts can be run automatically with no human intervention. Semi-automated validation test scripts are used by the validation engineer to conduct validation testing online. It should be noted that ValidationMaster™ supports both. Check out our demo to learn more about how this works.
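
To show what “fully automated” means in the simplest possible terms, here is a sketch of a suite of placeholder test functions executed at a scheduled time with no human intervention, with pass/fail results recorded. The test functions, schedule, and result handling are illustrative assumptions, not how any particular product implements it.

```python
import time
from datetime import datetime

def ts_login_and_open_record():       # placeholder for a real automated test script
    assert True

def ts_post_inventory_adjustment():   # placeholder for a real automated test script
    assert True

suite = [ts_login_and_open_record, ts_post_inventory_adjustment]
run_at = datetime(2018, 1, 15, 2, 0)  # 2:00 AM window chosen purely for illustration

while datetime.now() < run_at:        # wait for the scheduled execution time
    time.sleep(60)

results = {}
for test in suite:
    try:
        test()
        results[test.__name__] = "PASS"
    except AssertionError:
        results[test.__name__] = "FAIL"

print(results)  # pass/fail evidence recorded with no human intervention
```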

Practice 9 – Develop a Strategy For Data Integrity

The buzzword for 2018 is data integrity. Europe’s GDPR, which governs personal data and privacy, takes effect in 2018.  The new regulation holds you accountable for the integrity of the data in your systems and the privacy of the information housed therein.  In 2018 it is imperative that you have a data integrity strategy and that you look at the regulations surrounding data integrity to ensure that you comply. In the old days of validation, it was left up to each validation engineer when to perform system updates and how often to validate their systems. Data integrity and cybersecurity, as well as cloud validation, demand strategies that include continuous testing. I discuss this in a previous blog post, but it is very important that you consider testing your systems often to achieve a high level of integrity not only of the application but of the data.

Practice 10 – Commit to Achieve Level 5 Validation Process Maturity

Finally, your validation processes must mature in 2018. For organizations attempting to improve efficiency and compliance, it is no longer acceptable or feasible to simply continue with antiquated, inefficient paper-based systems. Therefore, in 2018 it will be important for you to commit to Level 5 validation process maturity, where your processes are automated, you have greater control over your validated systems, and you sustain quality across the validation lifecycle. Check out my other post on achieving Level 5 process maturity.

I wish you all the best in 2018 as you endeavor to establish and maintain the validated state for computer systems. From the development of Level 5 validation processes through automation it’s important to understand the changes that new technology brings and how well you and your organization can adapt to those changes. Your success depends on it! Good luck and keep in touch!


Understanding Lean Validation: A Practical Approach

Lean manufacturing has been with us since 1988.  Lean principles were derived from the Japanese manufacturing industry.  The “Lean” process was originally created and adopted by Toyota to eliminate waste and inefficiency in its manufacturing operations.  Lean processes led to the Toyota Production System (TPS), which is arguably one of the greatest manufacturing success stories of all time.  The focus of lean was the elimination of waste and inefficiencies throughout the manufacturing process.

To identify and eliminate waste from the production process, Toyota believed it was important to understand exactly the nature of waste and where it existed. While Toyota’s products differed significantly between factories, the typical wastes found in manufacturing environments were similar in nature. For each waste, Toyota developed an effective strategy to reduce or eliminate its effects, thereby improving overall performance and quality.  The process became so successful that it has been embraced in manufacturing sectors around the world. Today’s manufacturers have embraced the concepts and philosophies of lean.  Being lean is considered a critical competitive advantage and a strategic imperative.  It has made Toyota an automotive success story.

I have been a validation practitioner for over 30 years.  In the process of validating large engineering systems and life sciences quality and compliance management technologies, I have experienced firsthand the waste involved in validation processes.  From manually cutting and pasting screenshots into test cases, to manually tracing requirements to test scripts, through manual document routing and review processes and rewriting scripts for regression testing, waste abounds in the validation process.  As I pondered my work over the past three decades, I began to think of a better way to validate computer systems.

In my initial research, I started to think of all of the wastes throughout validation processes and categorized them.

Validation Waste

Wastes in the validation process include planning process wastes, testing wastes, documentation wastes, defect wastes, wasteful requirements processes, quality and incident management wastes, and wastes associated with the re-validation and re-testing of software applications.  As I began to ponder solutions to eliminate these wastes throughout the validation process, I embarked on the development of a Lean Validation strategy.  We endeavor to save our clients both time and money throughout the validation process.  OnShore Technology Group is the only company whose practice focuses on the delivery of Lean Validation services.  Our products and services are uniquely designed to power lean validation processes.

Lean Validation

Using lean validation principles can eliminate wasteful manual validation processes and yield significant improvements in validation efficiency, document cycle times, and testing productivity, along with a greater ability to identify and correct software defects, leading to enhanced software quality, lower costs, and improved regulatory compliance.

OnShore Technology Group has pioneered the principles and best practices of “Lean Validation” – the process of eliminating waste and inefficiencies while driving greater software quality throughout the validation process.  In addition to the delivery of expert lean validation services, OnShore’s flagship software application is known as ValidationMaster™ – the FIRST Enterprise Validation Management and Quality system designed to automate lean validation processes.  ValidationMaster™ delivers a single source of truth for any type of validation including software, equipment, process, cold chain, facility, and other types of validation projects.

ValidationMaster™ is also the first validation management system accessible via any Windows, Apple, or Android mobile device.  The system includes key features such as a validation dashboard, fully automated test script development and execution, automatic requirements traceability, custom report development and generation, and a full range of quality management capabilities (training, audit management, change management, controlled document management, ISO, validation KPIs, CAPA, nonconformance management, and much more).

In 2017, OnShore Technology Group was recognized as the “Best IV&V Automation Solutions Provider” and the “Software Validation Testing Experts of the Year”.  In 2016, OnShore made the coveted annual list of CIOReview Magazine as one of the “20 Most Promising Pharma & Life Sciences Tech Solution Providers”.

Contact us today to learn more and see a live demonstration of ValidationMaster™.

Computer Systems Validation As We Know It Is DEAD

Over the past 10 years, the software industry has experienced radical changes.  Enterprise applications deployed in the cloud, the Internet of Things (IoT), mobile applications, robotics, artificial intelligence, X-as-a-Service, agile development, cybersecurity challenges, and other technology trends force us to rethink strategies for ensuring software quality.  For over 40 years, validation practices have not changed very much.  Surprisingly, many companies still conduct computer systems validation using paper-based processes.  However, the trends outlined above challenge some of the current assumptions about validation.  I sometimes hear people say “…since I am in the cloud, I don’t have to conduct an IQ…” or “…well, my cloud provider is handling that…”

Issues related to responsibility and testing are changing based on deployment models and development lifecycles.  Validation is designed to confirm that a system meets its intended use.  However, how can we certify that a system meets its intended use if it is left vulnerable to cyber threats?  How can we maintain the validated state over time in production if the cloud environment is constantly changing the validated state?  How can we adequately test computer systems if users can download an “app” from the App Store to integrate with a validated system?  How can we ensure that we are following proper controls for 21 CFR Part 11 if our cloud vendor is not adhering to CSA cloud controls?  How can we test IoT devices connected to validated systems to ensure that they work safely and in accordance with regulatory standards?

You will not find the answers to any of these questions in any regulatory guidance documents.  Technology is moving at the speed of thought yet our validation processes are struggling to keep up.

These questions have led me to conclude that validation as we know it is DEAD.  The challenges imposed by the latest technological advances in agile software development, enterprise cloud applications, IoT, mobility, data integrity, privacy and cybersecurity are forcing validation engineers to rethink current processes.

Gartner Group recently reported that the share of firms using IoT grew from 29% in 2015 to 43% in 2016.  They project that by the year 2020, there will be over 26 billion IoT devices.  It should be noted that Microsoft’s Azure platform includes a suite of applications for remote monitoring, predictive maintenance, and connected factory monitoring for industrial devices.  Current guidance has not kept pace with ever-changing technology, yet the need for quality in software applications remains a consistent imperative.

So how should validation engineers change processes to address these challenges?

First, consider how your systems are developed and deployed.  The V-model assumes a waterfall approach yet most software today is developed using Agile methodologies.  It is important to take this into consideration in your methodologies.

Secondly, I strongly recommend adding two SOPs to your quality procedures – a Cybersecurity SOP for validated computer systems and a Cloud SOP for validated systems.  You will need these two procedures to provide governance for your cloud processes.  (If you do not have a cloud or cybersecurity SOP please contact me and I will send you both SOPs.)

Third, I believe you should incorporate cybersecurity qualification (CyQ) into your testing strategy.  In addition to IQ/OQ/PQ, you should be conducting a CyQ readiness assessment for all validated systems.  A CyQ is an assessment to confirm and document your readiness to protect validated systems against a cyber attack.  It also includes testing to validate current protections for your validated systems.  It is important to note that regulators will judge you on your PROACTIVE approach to compliance.  This is an important step in that direction.

Cybersecurity Qualification (CyQ)

Fourth, you should adopt lean validation methodologies.  Lean validation practices are designed to eliminate waste and inefficiency throughout the validation process while ensuring sustained compliance.

Finally, the time has come for automation.  To keep pace with the changes in current technology as discussed above, you MUST include automation for requirements management, validation testing, incident management and validation quality assurance (CAPA, NC, audit management, training, et al).  I recommend consideration of an Enterprise Validation Management system such as ValidationMaster™ to support the full lifecycle of computer systems validation.  ValidationMaster™  allows you to build a re-usable test script library and represents a “SINGLE SOURCE OF TRUTH” for all of your validation projects.  Automation of the validation process is no longer a luxury but a necessity.

Advanced technology is moving fast.  The time is now to rethink your validation strategies for the 21st century.  Validation as we know it is dead.  Lean, agile validation processes are needed to keep pace with rapidly changing technology.  As you embrace the latest cloud, mobile, and IoT technologies, you will quickly find that the old ways of validation are no longer sufficient.  Cyber criminals are not going away, so you need to be ready.  Step into LEAN and embrace the future!


Automated Validation Best Practices

Automation is the key to lean validation practices.  Although many validation processes are still paper-based manual processes, there are best practices that support Independent Verification and Validation (IV&V) processes that drive efficiency and compliance.

BEST PRACTICE 1 – Establish Independence

The IEEE 1012 Standard For System, Software and Hardware Verification and Validation states that Independent Verification and Validation (IV&V) is defined by three parameters:

  1. Technical Independence – ensures independence from the development team.  Technical independence is intended to provide a fresh point of view in the examination of software applications to help better detect subtle errors that may be overlooked by those that are too close to the solution such as the development or system implementation team.
  2. Managerial Independence – helps to ensure that the IV&V effort is managed by an organization separate and distinct from the development or program management team.  Managerial independence ensures that the validation team has the autonomy to independently select the validation methodology, processes, schedule, tasks, and testing strategy to independently confirm the suitability of applications for their intended use.  Managerial independence also ensures that the IV&V team can objectively report all validation test results without any restrictions or approval from the development team or system integration team.  This is a very important level of independence.
  3. Financial Independence – ensures that there are no financial ties between the IV&V team and the development team, to preserve objectivity.  This level of independence is designed to prevent situations where financial ties may adversely influence or pressure IV&V personnel to deliver less than objective, authentic test results.

The IEEE 1012 standard speaks of various forms of independence, but the bottom line is that the IV&V team should be as independent as possible from the development team.  It is not best practice for development teams to validate their own development projects; objectivity is sacrificed when this is done.  Following this best practice ensures an objective examination of your software projects, free of bias and undue external influence from the development team.

BEST PRACTICE 2 – Continuous Testing In The Cloud

Cloud environments can be validated.  However, there are several issues and characteristics of cloud environments that challenge traditional assumptions regarding validation efforts.

  • Continuous changes in the cloud
  • Inability to conduct supplier audits for large cloud vendors (Microsoft, Oracle, et al)
  • Maintaining the Validated State

Cloud system environments continuously change.  Validation engineers are not used to uncontrolled changes in system environments.  We have been taught that all changes to a system environment once it has been validated must undergo change control.  Thus all changes are subject to a change request process.

In cloud environments, we don’t control when changes are made to systems. Cloud vendors may change disk drives, virtual servers, and memory, apply patches and updates, and make many other system changes that may affect your validated system environment. So the question becomes: how do you maintain the validated state in the cloud? There are several best practices designed to answer this question. First of all, you need a way to determine what changes are made in the cloud. Take, for example, Microsoft Office 365 or Microsoft Dynamics 365. Microsoft has established what is known as the Trust Center. The Microsoft Trust Center is an excellent resource that provides information about how Microsoft examines its cloud environment. The first consideration when selecting cloud technology is who your provider is. All cloud providers are not created equal. Some cloud providers take compliance, security, data integrity, and governance seriously, while others are more general or consumer-oriented in nature and do not prioritize these characteristics.

Microsoft, continuing the example, has achieved several key industry certifications for its cloud environment. But most importantly, through the Trust Center it has provided visibility and clear communication as to how it manages the cloud. From a testing perspective, Microsoft addresses one of the biggest problems you have in the cloud: knowing what changes were made and when the cloud provider made them. Microsoft provides a list of updates by product application and tells you exactly what changes were made, the date the changes were made, and whether the updates or patches were successfully applied in the environment. With the Microsoft cloud, it is no longer the case that you don’t know when Microsoft has changed the environment. Thanks to this transparency, you know exactly what changes are made to the environment, which brings us to the best practice of continuous testing.

Since cloud environments change so often you should employ a strategy known as continuous testing. Continuous testing is essentially validation testing at predefined (user-defined) intervals to ensure that cloud environments are maintained in a validated state. To successfully employ a continuous testing strategy, automation is essential. This is not a process that you would want to carry out manually although it can be carried out manually if you so desire. Automation adds a dimension of efficiency and consistency in the environment.

To employ continuous testing, you want to establish a reusable test script library. This is essential. Once you have validated your system using automated tools such as ValidationMaster™, you will have established a reusable test script library. The test scripts developed can be used for subsequent regression testing and can be automated to save both time and money. For continuous testing, you would conduct an impact analysis to determine the impact of changes made to the cloud environment. Once you conduct an impact analysis, you would perform a risk assessment to ensure that you effectively monitor risk in accordance with ISPE GAMP 5®.  You would then select from your reusable test script library the regression tests suitable for continuous testing in your cloud environment. Once these test scripts are executed, you can document the actual results and provide a level of due diligence for regulators showing that you are maintaining your cloud environment in a validated state.
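
A minimal sketch of that selection step is shown below. The test library, module tags, and impacted modules are hypothetical; the idea is simply that the provider’s published change notes drive an impact analysis, which in turn selects the regression subset from the reusable library.

```python
# Hypothetical reusable library: each script is tagged with the modules it exercises.
test_library = {
    "TS-010": {"security", "login"},
    "TS-020": {"inventory"},
    "TS-030": {"reporting"},
    "TS-040": {"inventory", "reporting"},
}

# Modules touched by the provider's latest update, taken from its published change notes.
impacted_modules = {"inventory"}

regression_suite = sorted(
    script for script, modules in test_library.items()
    if modules & impacted_modules
)

print("Regression suite for this update:", regression_suite)
# -> ['TS-020', 'TS-040']; execute these and file the results as evidence
#    that the validated state is being maintained.
```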

BEST PRACTICE 3 – Select The Right Automation Tools

Another key best practice is selecting the right automation tools for validation. How do you know how to select the right tools? There are two types of automated tools on the market: (1) point solutions and (2) enterprise solutions. A point solution addresses a single element of the validation process. For example, the management of requirements is an essential core component of any validation exercise. There are requirements management point solutions on the market that will assist you in effectively managing user, functional, and design requirements for any validation initiative. Testing is another core element of the validation process. There are many solutions that allow you to capture and record test scripts, and some even allow you to execute test scripts online. The problem with point solutions is that they only provide one step in the process. When validating systems, it is not uncommon to use upwards of 17 different systems (point solutions) to prepare validation documentation and due diligence. This does not make much sense and is often fraught with duplication of effort and inefficiencies that cost time and money.

To drive lean validation processes and to achieve automation best practice, you need an enterprise validation management solution to fully automate the validation process – not just one part of it. An enterprise validation management system has the capability of managing validation planning documentation such as the validation master plan, risk assessment, validation project plan, and other related documentation.

In fact, an enterprise validation management system includes an enterprise content management system as a core component of the overall solution. The key deliverables from the validation process are documents. Lots of documents! It stands to reason that an enterprise content management system would be a core part of the overall solution. An enterprise validation management system should also include a requirements management system with the ability to manage any type of requirement. An automated test engine should be at the core of such a solution, with the ability not only to record test scripts but also to execute them online and capture objective actual results.
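As an illustration of why requirements management and the test engine belong in one system, here is a minimal Python sketch of a traceability matrix linking requirements to the test cases that verify them. The requirement and test case IDs are hypothetical, and the data model is an assumption for illustration only, not any product's actual schema.

from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str
    category: str   # "user", "functional", or "design"

@dataclass
class TestCase:
    case_id: str
    objective: str
    verifies: list[str] = field(default_factory=list)   # requirement IDs covered

def trace_matrix(reqs: list[Requirement], cases: list[TestCase]) -> dict[str, list[str]]:
    """Map each requirement ID to the test cases that verify it."""
    matrix = {r.req_id: [] for r in reqs}
    for case in cases:
        for req_id in case.verifies:
            matrix.setdefault(req_id, []).append(case.case_id)
    return matrix

reqs = [Requirement("UR-01", "The system shall enforce unique user logins", "user")]
cases = [TestCase("OQ-07", "Verify duplicate usernames are rejected", verifies=["UR-01"])]
print(trace_matrix(reqs, cases))   # {'UR-01': ['OQ-07']}; an untraced requirement would show []

When both sides live in the same system, an empty list in this matrix immediately flags an untested requirement instead of being discovered during an audit.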

The system should have a robust reporting engine that facilitates the efficient output of any type of report required as part of the due diligence for validation. Quality management is at the core of the validation process. Therefore, an enterprise validation management system should include capabilities for all aspects of quality, including change control, audit management, CAPA, nonconformances, training, periodic review, trend analysis, and validation key performance indicators. The system should provide real-time statistics on the overall health and performance of validation processes. Finally, the system should be built on standard technology adaptable to any systems environment.

It is best practice to select and deploy the RIGHT tools to support enterprise validation processes.  Point solutions will only get you so far.  Selecting the proper automation tools can save both time and money and deliver a single source of truth for your validation projects.

BEST PRACTICE 4 – Establish a Reusable Test Library

One of the most laborious tasks during software validation is TESTING.  The test script development, execution and documentation process takes considerable time if you do it correctly.  From the establishment of a test environment through the development of test scripts, the validation engineer must carefully document expected and actual results sufficient to prove that systems meet their intended use and have the requisite quality expected of such systems.

Developing test scripts takes time.  Traceability also takes time.  When systems are validated, you want the ability to retest a system as required without having to rewrite test scripts over and over again.  A reusable test script library has been one of the most effective practices I have implemented.

Reusable test scripts can save up to 60% of the time that would otherwise be required to rewrite test scripts. Establishing a reusable test script library with FULLY AUTOMATED scripts can save even more time and money, since fully automated scripts can be executed without human intervention.  You have the ability to set a date and time when test scripts are to be executed, and the system (ValidationMaster™) will automatically execute them and report the results back in a fraction of the time it takes to run them manually.  It is therefore best practice to establish a reusable test script library for enterprise validated systems.
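The sketch below illustrates the idea of scheduled, unattended execution in plain Python. It is not ValidationMaster™ internals; the script name and the two-second delay are hypothetical, and a real system would pull scripts from the reusable library and write results into the validation record.

import time
from datetime import datetime, timedelta

# Hypothetical fully automated script: each callable returns True on pass
def ts_001_login_and_roles() -> bool:
    return True   # placeholder for a recorded, automated test script

# (run_at, script name, callable) -- the run time here is just a short demo delay
SCHEDULE = [
    (datetime.now() + timedelta(seconds=2), "TS-001 login and roles", ts_001_login_and_roles),
]

def run_scheduled_suite() -> None:
    """Wait for each scheduled time, execute the script, and log the result."""
    for run_at, name, script in sorted(SCHEDULE, key=lambda job: job[0]):
        delay = (run_at - datetime.now()).total_seconds()
        if delay > 0:
            time.sleep(delay)
        passed = script()
        stamp = datetime.now().isoformat(timespec="seconds")
        print(f"{stamp}  {name}: {'PASS' if passed else 'FAIL'}")

if __name__ == "__main__":
    run_scheduled_suite()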

BEST PRACTICE 5 – Document Clear Objective Test Evidence

Documentation of clear, objective test evidence is essential for validation. Many automated validation management systems do not have reporting engines robust enough to allow you to output documents in your unique format. It is best practice to employ an enterprise validation management system that allows you to present clear, objective test evidence in your unique document formats as specified by your SOPs. ValidationMaster™ has a comprehensive reporting engine that allows you to deliver validation reports in your unique format, as this best practice requires.


BEST PRACTICE 6 – Establish a Single Source of Truth For Validation Deliverables

For many organizations that conduct validation on paper, there is no single source of truth for validation. Some validation assets are housed within the document management system. Part of the validation package is kept within a code management system such as SourceSafe. Part of the deliverables may be kept within a requirements management system. Some incident reports may be kept in an incident management system. Other validation deliverables may be paper-based. In many cases validation engineers attempt to keep signed copies of documentation in multiple three-ring binders. This was the traditional practice in the '80s.

For lean validation practices that support automation, it is best practice to establish a single source of truth for all validation deliverables. This means there is a single place where all validation deliverables, pre- and post-execution, are stored.  A single source of truth facilitates better auditing and eliminates the common occurrence of lost documentation during an audit. This best practice is essential to achieving validation excellence.

One of the core benefits of ValidationMaster™ is that it delivers a single source of truth for all validation projects. You can store all of your validation projects in a single, easy-to-use system and reference them for internal or external audits. For lean validation, this is current best practice.

Following the six key best practices above can save both time and money. Validation has not changed much over the last 40 years but the way we manage it has changed significantly. Cloud validation, mobility, cyber security and a host of other factors change the way we look at computer systems validation and manage it. I hope you find these best practices useful and effective in helping you to deliver your next validation project on time and within budget.  We use ValidationMaster™ in our practice every day to support our lean validation processes saving our clients considerable time and money.  What’s in your validation office?

The True Meaning of Software Quality

As a long-time validation engineer, I often ponder questions such as “what does it mean to achieve software quality and is it sustainable over time?”  I ask myself these questions because in today’s systems environments, there are many factors that can impact software quality assurance.

Cyber threats are the elephant in the room.  Most validation projects include IQ/OQ/PQ and UAT testing but do not address cyber threats at all.  Can you really ensure that your validated environments are safe and secure without considering cybersecurity as part of your overall validation strategy?  The International Software Testing Qualifications Board (ISTQB) defines software quality as “…The totality of functionality and features of a software product that bear on its ability to satisfy stated or implied needs…”  Another definition is “…the degree of conformance to explicit or implicit requirements and expectations…”  Finally, IEEE calls software quality “…The degree to which a system, component, or process meets specified requirements, customer, user needs or expectations…”  As shown by the definitions above, software quality is somewhat subjective.

Data integrity is also a critical concern for validated systems.  It is also a key imperative for software quality.  Data integrity is a hot topic lately and generally refers to the accuracy and consistency of information stored in corporate databases, data warehouses or other such constructs.  Data integrity ensures that information is accurate and reliable and in today’s environments, legally defensible.   The accuracy and trustworthiness of data within your systems MUST NOT be in question.

Why is data integrity so important?  Because companies routinely make decisions based on information housed within corporate databases.

The lack of data integrity over the lifecycle of a system could allow adulterated product to reach the market, result in the incorrect shipping of controlled materials/substances, and cause a wide variety of issues affecting the quality, safety, and efficacy of a company's products.  Data integrity is not the purview of technology alone.  Managing data integrity in the broadest sense requires people, processes, and technology.

The ALCOA principle, as highlighted in the figure below, requires that data be attributable to the individual responsible for recording the data or activity.  The "L" in ALCOA means that information must be clear and legible after it is recorded, and permanent.  The "C" in ALCOA means that the data must be recorded at the time it was generated (contemporaneous).  The "O" means that data must be preserved in its original, unaltered state (or as a true copy).  The final "A" in ALCOA means that data must be accurate and reflect the action or observation made.  Modifications must be explained if they are not self-explanatory.

[Figure: The ALCOA principles]
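To show how these principles can be reflected in a system's design, here is a minimal Python sketch of a data entry record that is attributable, contemporaneous, and preserved in its original form, with corrections captured as new, explained entries. The field names and the assay example are hypothetical, used only to illustrate the idea.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)                     # frozen: a recorded entry cannot be silently edited
class DataEntry:
    recorded_by: str                        # Attributable: the individual who recorded the value
    recorded_at: str                        # Contemporaneous: captured at the time of the activity
    value: str                              # Legible / Accurate: the observation as made
    reason_for_change: str | None = None    # modifications must be explained

def record(user: str, value: str, reason: str | None = None) -> DataEntry:
    """Create a permanent, time-stamped entry attributable to a named user."""
    return DataEntry(
        recorded_by=user,
        recorded_at=datetime.now(timezone.utc).isoformat(),
        value=value,
        reason_for_change=reason,
    )

original = record("jdoe", "Assay result: 98.7%")
# The Original entry is preserved; a correction is a new, explained entry
correction = record("jdoe", "Assay result: 99.1%", reason="Transcription error corrected")
print(original)
print(correction)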

No matter what the definition, software quality is all about providing assurance that a system is suitable for its intended use in some way.  We confirm this through testing.  However, it should be noted that testing alone cannot in and of itself ensure software quality.  Testing merely provides a level of assurance or confidence in a software application under specific controlled conditions.

You cannot discuss software quality without a discussion on data integrity.  To derive the true meaning of software quality it is important to consider the following key activities:

  • Establish SOPs That Provide Governance For Software Quality Assurance and Data Integrity
  • Document Everything (if it's not documented, it didn't happen)
  • Establish a Rigorous Software Change Management Process
  • Attain Level 5 Validation Processes Through Automation
  • Enforce Standards For Testing and Documentation
  • Identify, Track, and Manage Software Quality Metrics and KPIs
  • Conduct Positive and Negative Software Testing

The first step on your way to software quality and data integrity is to establish and follow procedures that provide governance over the process.  You must have procedures that cover everything from validation to data integrity, automation, and everything in between.  Secondly, you must document everything you do to ensure software quality and integrity.  Third, you must establish a rigorous software change management process that helps track and manage all changes made to a cloud-based or on-premise system and who made the changes and why.

Fourth, you must drive your organization to Level 5 validation processes.  This is derived from the validation capability maturity model, as illustrated in the figure below.

[Figure: Validation capability maturity model]

Level 5 validation means your processes are automated and optimized in a way to ensure quality and compliance.  Fifth, you must enforce all standards for testing and documentation.  This will also require Level 5 automation to achieve your objectives. Sixth, you must identify and track software quality metrics.  You cannot achieve what you don’t measure.  Peter Drucker often said “… you can’t manage what you can’t measure…”  He also said “… what gets measured gets improved…”  You must identify and track metrics to ensure you stay on track.
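As a simple illustration of tracking software quality metrics, the Python sketch below computes a first-pass yield and a defect rate from a hypothetical execution log. The figures and script IDs are made up; your own KPIs and data sources will differ.

# Hypothetical execution log: (script_id, passed_first_time, defects_found)
executions = [
    ("OQ-01", True, 0),
    ("OQ-02", False, 2),
    ("OQ-03", True, 0),
    ("PQ-01", False, 1),
]

total = len(executions)
first_pass_yield = sum(1 for _, passed, _ in executions if passed) / total
defects_per_script = sum(defects for _, _, defects in executions) / total

print(f"First-pass yield: {first_pass_yield:.0%}")            # 50% in this made-up data
print(f"Defects found per executed script: {defects_per_script:.2f}")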

And finally, in all of your validation testing, conduct positive and negative testing against applications.  The FDA states in the General Principles of Software Validation; Final Guidance for Industry and FDA Staff, issued on January 11, 2002, that "… A good test case has a high probability of exposing an error; a successful test is one that finds an error…"  This may be somewhat counter-intuitive, but I am often stunned at how many validation test scripts are written so that they PASS rather than written to discover an error.  A good software test will reveal errors if written correctly.  When I interrogate applications, I am often looking to reveal problems that may arise in production.
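Here is a minimal example of pairing positive and negative tests, written as pytest-style functions in Python. The password policy check is a hypothetical system-under-test stub used only to show how a negative test deliberately probes an error path.

# System-under-test stub: a hypothetical password policy check
def password_accepted(password: str) -> bool:
    return len(password) >= 10 and any(c.isdigit() for c in password)

def test_positive_valid_password_is_accepted():
    # Positive test: confirms the stated requirement is met
    assert password_accepted("Winter2024!") is True

def test_negative_short_password_is_rejected():
    # Negative test: deliberately probes an error path
    assert password_accepted("abc") is False

def test_negative_password_without_digit_is_rejected():
    assert password_accepted("abcdefghij") is False

Run with pytest, the negative cases are the ones most likely to expose a defect if the policy is ever weakened.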

It has been often said that software quality is no accident.  It is the deliberate result of intelligent planning, hard work and rigorous execution.

Software quality is NOT error- or bug-free software.  It is about software that sufficiently meets the demands and expectations of the end-user community.  AUTOMATION IS KEY.  Automated testing makes tests easy to replicate, increases test coverage, reduces errors, improves consistency, and delivers automated traceability, enabling more software defects to be discovered and addressed.

The issues surrounding software quality and data integrity are increasing across the globe.  Your organization must be ready to deal with the challenges they present.  WILL YOUR ORGANIZATION BE READY?  Think about it.