Best Practices to Achieve Computer Software Assurance with Digital Validation: FAQ

30 Nov 2022

The U.S. FDA is encouraging life sciences manufacturers to adopt a risk-based approach to assessing production and quality software. The agency released guidance on computer software assurance (CSA),(1) an approach designed to reduce drag on computer system validation (CSV) so companies can focus on patient safety and product quality. But what is CSA, and what do manufacturers need to do to adopt this new approach? Kneat's CSV expert Darren Geaney answered these questions and others in 'Achieving Computer Software Assurance with Digital Validation'.


In this webinar, Darren dove deep into CSA, how it differs from CSV, and what companies can do to efficiently adapt their programs. Watch our on-demand recording of the webinar today.

Does CSA apply to Pharma GxP? What’s the best way to leverage supplier documentation for compliance? Read on for answers to pressing CSA questions from an expert in the industry:

Q: What additional competencies are required with CSA for resources working in CSV projects?

A: I would focus on the following competencies, which should be developed within an organization as it builds its CSA methodology:

  • A quality-centric mindset. A risk-based approach means working from a quality-centric (not compliance-centric) mindset: you will no longer perform a task or create an artifact (for example, a design document) simply because your process states you need to create it. Critical thinking should be respected as a core competency, with organizational capability continually improved.
  • Scientific knowledge of the system specifications and the business process being supported, to enable appropriate risk assessment. Include subject matter experts during the specification and design phases of the project.
  • An in-depth knowledge of supplier audits, to allow you to leverage supplier documentation.
  • Familiarity with CSV tools that automate assurance activities. The FDA does not intend to review validation of support tools; the manufacturer determines the assurance activities for these tools based on their intended use.

Q: How do you exhibit physical test evidence (e.g., a label) under CSA?

A: In relation to adding test evidence where it is deemed to be required, I suggest adding a section or column to your test where you can attach objective evidence. If you are executing the test in Kneat, for example, you can capture an image of the label and attach it to the executed test step.

The CSA guidance provides some leeway for manufacturers regarding the use of supporting evidence:

"Documentation of assurance activities need not include more evidence than necessary to show that the software feature, function, or operation performs as intended for the risk identified. FDA recommends the record retain sufficient details of the assurance activity to serve as a baseline for improvements or as a reference point if issues occur."

Manufacturers may use alternative approaches and provide different documentation so long as their approach satisfies applicable legal documentation requirements.

Q: How do the FDA and other regulatory bodies feel about leveraging vendor testing documentation? Leveraging the supplier is always a challenge, as it often requires an audit, which comes with a cost. A risk-based approach could suggest a questionnaire or mail-in assessment, but do you think that provides sufficient assurance about the quality of their internal testing and documentation?

A: From the draft Computer Software Assurance guidance: "Established purchasing control procedures for selecting and monitoring software developers. For example, the manufacturer could incorporate the practices, validation work, and electronic information already performed by developers of the software as the starting point and determine what additional activities may be needed. For some lower risk software features, functions, and operations, this may be all the assurance that is needed by the manufacturer."

Leverage existing activities and supplier data. Do not reinvent the wheel; take credit for work already done where you can. Let the vendor know that you may be looking for their Software Development Lifecycle (SDLC) documentation deliverables as part of the supplier agreement for the system being purchased.

Remember, as the manufacturer you will be the one defending the approach you took when auditing your vendor, so you will need a clear and concise assessment strategy to defend your auditing approach.

In relation to testing for a specific business requirement, I would look at the risk to the patient or to product quality of the functions/features/operations being implemented by the system. If, for example, it's a GxP-impacting system, look at the risk associated with the system to guide you on the depth of audit required. For example, is it a system containing functions/features/operations that control parameters critical to quality? Has the supplier tested it at your process's defined upper and lower limits? You may conduct an onsite audit review of the supplier's documentation and observe that the supplier did not verify that the function/feature/operation operates at your upper or lower limit; you may then decide to conduct further testing to ensure that it will continue to operate at your process's parameter limits.

For example, for high-risk functions/features/operations, you may decide to examine the objective evidence generated by the supplier as detailed above and, if required, perform some additional testing; for those functions/features/operations assessed as not high risk, no further testing by the manufacturer may be required. It is for the manufacturer to determine whether further testing is required, supported by a clear and unambiguous rationale explaining how critical thinking was applied based on the possible impact of the system's operations on the patient or on product quality.
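As an illustration only, this decision logic can be captured in a short sketch. The names, risk tiers, and activities below are hypothetical and are not taken from the guidance:

```python
from enum import Enum

class Risk(Enum):
    HIGH = "high"
    NOT_HIGH = "not high"

def assurance_activities(risk: Risk, supplier_tested_at_limits: bool) -> list:
    """Hypothetical mapping from a function's risk rating to assurance work.

    Supplier evidence is always reviewed; additional manufacturer testing
    is added only for high-risk items whose supplier testing did not cover
    the process's upper and lower limits.
    """
    activities = ["review supplier's objective evidence"]
    if risk is Risk.HIGH and not supplier_tested_at_limits:
        activities.append("perform additional testing at the process parameter limits")
    elif risk is Risk.NOT_HIGH:
        activities.append("document rationale that no further testing is required")
    return activities

# Example: a high-risk function whose supplier testing missed your limits
print(assurance_activities(Risk.HIGH, supplier_tested_at_limits=False))
```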

Q: How do you suggest dealing with industry leaders who are adamantly opposed to (and very bad at) sharing software development lifecycle (SDLC) evidence for a specific critical product or product version?

A: Considering the above answer, if proprietary information related to the software supplied by the vendor appears in, for example, a design document, then not sharing that document with the manufacturer is justifiable.

However, if a vendor is unwilling to provide the reports and testing evidence related to a product version, how do you justify it in front of an auditor or internally to your quality group? I recommend leveraging existing activities and supplier data. Do not reinvent the wheel; take credit for work already done where you can. Let the vendor know that you may be looking for their documentation as part of the supplier agreement or when auditing them. As the industry starts adopting the CSA approach, I think vendors who are not willing to share their SDLC evidence will need to change their approach.

Q: How can companies adopt CSA regarding unscripted or ad-hoc test templates with so few tools available?

A: You cannot really apply the unscripted and ad-hoc testing approach to test automation because, by its very nature, test automation requires scripted testing. In the CSA presentation, I provided examples of ad-hoc and unscripted testing that can be created in Kneat. While not automated, this allows the user to execute unscripted and ad-hoc testing electronically.

One option is to complete scripted testing using appropriate automated tools on the appropriate features/functions/operations and to complete the ad-hoc and unscripted testing in another system such as Kneat. The corresponding validation report should then indicate the location of the test evidence produced by both the automated test tools and the ad-hoc and unscripted testing approaches.

Q: Would electronic signature software (such as Adobe e-sig) be considered a low-risk application and therefore not require scripted testing?

A: From the guidance: “Establish purchasing control procedures for selecting and monitoring software developers. For example, the manufacturer could incorporate the practices, validation work, and electronic information already performed by developers of the software as the starting point and determine what additional activities may be needed. For some lower risk software features, functions, and operations, this may be all the assurance that is needed by the manufacturer.”

Page 21 of the CSA guidance provides a good example of using an electronic signature in a system. The intended use of the electronic signature function is to capture and store an electronic signature so that it meets compliance requirements. The guidance gives an example of further verification activities that could be conducted by the manufacturer, as failure of the electronic signature functionality would not "foreseeably lead to compromised safety."

So, unscripted testing would be a suitable approach here, provided that, as detailed on page 21 of the CSA guidance, "The manufacturer has performed an assessment of the system capability, supplier evaluation, and installation activities." However, you could also argue that it is medium risk based on the GAMP 5 guidelines.(2) Refer to the Audit Trail function detailed in Table 11.4, "Example Risk Assessments for Medium and High-Impact Functions," on page 126 of the GAMP 5 guidelines; there too, you could use an unscripted testing approach.

Q: How do we determine whether automated software test tools (from the web) need to be qualified?

A: I believe that the FDA does not intend to review validation of support tools. It will be up to the manufacturer to determine the assurance activities for these tools based on their intended use.

The scope of 21 CFR 820.70(i) applies when computers or automated data processing systems are used as part of production or the quality system.

In relation to deciding whether qualification is required, I would look to establish a process applicable to software used in system design, development, and deployment processes, in the quality system, or in product/service-related business applications.

I would recommend that the process ask initial questions about the automated software test tool. You might consider some of the following questions in an initial checklist:

  • Does the software have appropriate certifications?
  • Does the software have appropriate references? Are organizations/industries similar to yours using it?
  • Is the software used for validation of your system?
  • Is the software produced by a reputable source? How long has the software been operational?
  • Does the software have sufficient documentation for the user to refer to (e.g., user guides)?

You could then look at the risk associated with the use of the test tool. For example, you could assess the following: does this automated software test tool pose any potential risk to the development, testing, configuration, or deployment of the software you are testing? I would then look at several other risks as well, along the lines of risk to legal and regulatory obligations, contractual obligations (NDAs, for example), and information security.

Based on the responses to the initial questions and to the example risk questions outlined above, a manufacturer should decide whether the tool needs to be qualified.
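As a minimal sketch of how such a screen might be recorded, the checklist fields and decision rule below are hypothetical illustrations, not prescribed by the guidance:

```python
from dataclasses import dataclass

@dataclass
class ToolAssessment:
    """Hypothetical initial checklist and risk questions for a test tool."""
    has_certifications: bool        # appropriate certifications?
    has_references: bool            # used by similar organizations/industries?
    used_for_validation: bool       # is the tool used to validate your system?
    reputable_source: bool          # reputable, long-operational source?
    has_documentation: bool         # user guides, etc., available?
    risks_system_under_test: bool   # could the tool affect the software being tested?
    legal_or_security_risk: bool    # regulatory, contractual (e.g., NDA), or infosec risk?

def needs_qualification(a: ToolAssessment) -> bool:
    """Decide whether the tool warrants qualification for its intended use.

    Hypothetical rule: a tool used for validation or posing any identified
    risk is qualified, as is any tool with weak provenance (missing
    certifications, references, reputation, or documentation).
    """
    if a.used_for_validation or a.risks_system_under_test or a.legal_or_security_risk:
        return True
    provenance = [a.has_certifications, a.has_references,
                  a.reputable_source, a.has_documentation]
    return not all(provenance)

# Example: an established tool with no validation role and no identified risks
tool = ToolAssessment(True, True, False, True, True, False, False)
print(needs_qualification(tool))  # False: qualification not required
```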

Q: What are your thoughts on using simulations to ensure programming and code is accurate and effective?

A: I would always advise using simulations to test various scenarios and ensure the code is accurate and effective. For example, if a feature/function/operation controls a process parameter that is critical to quality, test the system with the parameter at its lower, mid, and upper limits, as well as above the upper limit and below the lower limit, to simulate alarm and error conditions (see the sketch after the FDA quote below).

FDA Guidance supports this:

“The validation must be conducted in accordance with a documented protocol, and the validation results must also be documented. (See 21 CFR §820.70(i).) Test cases should be documented that will exercise the system to challenge its performance against the pre-determined criteria, especially for its most critical parameters. Test cases should address error and alarm conditions, startup, shutdown, all applicable user functions and operator controls, potential operator errors, maximum and minimum ranges of allowed values, and stress conditions applicable to the intended use of the equipment. The test cases should be executed, and the results should be recorded and evaluated to determine whether the results support a conclusion that the software is validated for its intended use.”
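For instance, a minimal sketch of this boundary testing in Python with pytest might look as follows. The limits and the set_parameter stand-in are hypothetical; substitute your system's actual interface and acceptance criteria:

```python
import pytest

# Hypothetical process parameter limits (e.g., a temperature range in °C)
LOWER_LIMIT, UPPER_LIMIT = 2.0, 8.0

def set_parameter(value: float) -> str:
    """Stand-in for the system under test: 'ok' in range, 'alarm' outside."""
    return "ok" if LOWER_LIMIT <= value <= UPPER_LIMIT else "alarm"

@pytest.mark.parametrize("value,expected", [
    (LOWER_LIMIT, "ok"),                       # at the lower limit
    ((LOWER_LIMIT + UPPER_LIMIT) / 2, "ok"),   # mid-range
    (UPPER_LIMIT, "ok"),                       # at the upper limit
    (LOWER_LIMIT - 0.1, "alarm"),              # just below the lower limit
    (UPPER_LIMIT + 0.1, "alarm"),              # just above the upper limit
])
def test_parameter_boundaries(value, expected):
    assert set_parameter(value) == expected
```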

Q: Does Kneat e-Validation have a template for CSA configuration, for those who would implement both Kneat and CSA at the same time (i.e., a best-practice CSA configuration)?

A: When digitizing a customer process, in this case a customer's CSA process, Kneat takes the customer's templates and digitizes them while mapping the process in Kneat. The customer reviews and approves the digitized templates prior to use. The examples of scripted, unscripted, and ad-hoc test templates in the CSA presentation were based on a best-practice approach to ensure, for example, that each requirement is traced to the appropriate test script. Kneat ensures the design of the templates works for the customer when deploying any process in Kneat.

Q: What are the risks of having both CSV & CSA procedures concurrently?

A: Assuming both the CSV and CSA processes meet regulatory requirements, there is no risk to either product or patient. However, adopting a CSA approach will lead to significant savings through streamlined processes, leveraging supplier documentation (including testing) and risk-based assessments.

That said, a team should take care to apply the critical thinking detailed in CSA across all features, functions, and operations.

Q: What is the applicability of CSA to Pharma GxP system validation with regard to GAMP 5, given the guidance's seeming focus on medical devices?

A: As you correctly point out, the FDA CSA guidance provides recommendations on computer software assurance for computers and automated data processing systems used as part of medical device production. However, the guidance was developed in conjunction with the Center for Biologics Evaluation and Research (CBER).(3) I believe CSA principles can be applied to Pharma GxP systems: if you look at the footnote on page 4 of the draft CSA guidance, you will see the following:

"This guidance has been prepared by the Center for Devices and Radiological Health (CDRH) and the Center for Biologics Evaluation and Research (CBER) in consultation with the Center for Drug Evaluation and Research (CDER), Office of Combination Products (OCP), and Office of Regulatory Affairs (ORA)."

And I also believe that CSA guidance can apply to Pharma GxP systems based on the following:

From Section 1 (Introduction) of the latest GAMP 5 guidance:

“GAMP guidance aims to safeguard patient safety, product quality, and data integrity in the use of GxP computerized systems. It aims to achieve computerized systems that are fit for intended use and meet current regulatory requirements by building upon existing good practice in an efficient and effective manner.”

Section 1.4 (Scope) of the latest edition of GAMP 5 states the following:

“This Guide applies to computerized systems used in regulated activities covered by:

Good Manufacturing Practice (pharmaceutical, including Active Pharmaceutical Ingredient (API), veterinary, and blood).”

GAMP 5 also states in section 20.3.3 Quality Risk Management:

"Critical thinking should be applied to ensure that [14]:

The evaluation of the risk to quality should be assessed on scientific knowledge and ultimately link to the protection of the patient, and…The level of effort, formality and documentation of the quality risk management process should be commensurate with the level of risk."

From GAMP 5 Section 3.4, Critical Thinking Through the Lifecycle:

“Regulators look for scaled paperwork with well-organized information and records that have an appropriate level of detail, supported by clear and unambiguous rationales explaining critical thinking applied.”

From Section 20.3.6 of the Critical Thinking appendix (Section 20, Appendix M12) of GAMP 5:

"Effective evidence is crucial when determining that a computerized system is fit for its intended use (See Appendix 5 for guidance on testing). Critical thinking should facilitate the testing of functions and features with appropriate risk priorities that reflect:

The potential impact on patient safety, product quality, and data integrity (higher-risk functions may require greater test rigor and level of documentation)."

Section 25.5.2 (Test Cases) of GAMP 5 gives details of the test assurance approaches, i.e., scripted, unscripted, and ad-hoc testing, that very much reflect the table on page 16 of the FDA CSA guidance.

I believe that if an organization applies critical thinking as detailed in the CSA guidance and in the latest GAMP 5 guidelines (remembering that at its heart, CSA requires us to use critical thinking to assess the functional risk to patient safety and product quality), then the CSA approach can also be applied to Pharma GxP system validations for computerized systems, when supported by a clear and unambiguous critical-thinking rationale.

Q: How do you present information validated in Kneat during regulatory compliance audits, e.g., by the FDA?

A: Kneat is a SaaS platform that allows auditors controlled access to the required Kneat instance, so the audit team can view content directly in Kneat. If this is not the required approach, Kneat also supports paperless handover: the customer can export records to PDF, including the parent document along with all child forms and attachments.

Learn more in our on-demand webinar ‘Achieving CSA with Digital Validation’

Join CSV specialist Darren Geaney as he walks through the changes and what they mean for validating computer systems.

Attend and learn:

  • What the FDA’s September 2022 CSA guidance means for validating computer systems
  • How CSA reduces documentation burden and issues in production through unscripted and ad-hoc testing
  • How to shift from CSV to CSA through risk-based critical thinking
  • How to ‘scale assurance’ by leveraging other activities
  • Efficiencies and assurances of performing CSA using a digital validation system [case study]

About the presenter


Darren Geaney, BEng, Kneat
A Computer Systems Validation specialist, Darren has over 23 years' experience in software validation, providing right-sized computer system validation solutions to medical device companies. Knowledgeable in regulations and standards including FDA 21 CFR Part 820, 21 CFR Part 11, IEC 62304, and ISO 14971, Darren is 'Lead Auditor' accredited and experienced in supporting both internal and external audits (including FDA, IMB, TUV, and BSI).

 


References

  1. Computer Software Assurance for Production and Quality System Software, [https://www.fda.gov/regulatory-information/search-fda-guidance-documents/computer-software-assurance-production-and-quality-system-software].
  2. GAMP 5 Guide 2nd Edition, [https://ispe.org/publications/guidance-documents/gamp-5-guide-2nd-edition].  
  3. Center for Biologics Evaluation and Research (CBER), [https://www.fda.gov/about-fda/fda-organization/center-biologics-evaluation-and-research-cber].
