Questions All Auditors Should Ask: The Use And Misuse Of Audit Checklists
By Laurie Meehan, Polaris Compliance Consultants, Inc.
There’s nothing wrong with using a good checklist, as long as you remember that there will always be something wrong with your checklist. It is simply not possible to develop a checklist that will get to the core of every problem, that will cover every scenario, or that will ever be any reasonable substitute for that all-important question: "Why?" How can you develop the best checklists possible and avoid relying on them too heavily?
Why We Use Them
Of course, checklists serve a number of purposes:
- They can be a very useful auditing tool, especially when you have a lot of processes and information to review, a large facility filled with equipment to tour, and a lot of people to talk to in a short space of time.
- Items on an auditor’s checklist are likely to be aligned with regulations or a specific audit report format, or both. This makes verifying compliance and preparing the final deliverable easier and more systematic.
- Checklists jog your memory, so you don’t forget an important detail to verify.
- Checklists can promote consistency across your auditing program so different auditors will follow similar procedures when qualifying vendors and conducting QA audits.
Since checklists clearly provide value, how can we make the best use of them?
Ask the Real Questions
We’ve seen checklists for CSV Audits that ask the question, “Do you have a Traceability Matrix?” That’s not a bad question, but what an auditor really wants to know is “How do you know you’ve tested everything?” The same checklist may ask, “Is the system validated?” But the real question is “How do you know the system works and is under control?” It could include the question, “Do you have screen shots for your testing?” A better question is, “What evidence do you have of actual results?” In each case, the original checklist question was too specific and presumed to know how the vendor would achieve the desired result; the real questions make no assumptions, are more open-ended, and invite discussion. Discussion helps ensure that findings are actually findings, promotes follow-up questions, and creates an opportunity for education – in both directions.
An example from the GCP world involves the inspection of a study site’s drug storage area. The auditor was working from a checklist that asked, “Does the drug storage area have restricted access?” About to check “yes” upon seeing locks on the door, she realized this was not the right question when she noticed the circuit breaker box on the wall. It contained all of the circuit breakers for the entire research suite. Problem: the set of people who need access to a drug storage area are probably not the same set of people who need access to a facility’s circuit breakers. The subsequent Q&A revealed that, in addition to the research director, the Principal Investigator, and the study coordinators, key holders included the maintenance crew, some of the housekeeping staff, and the landlord. So the question really should have been, “Who has keys/access to the drug storage area?”
Even at that, there would still be a number of follow-up questions the auditor would need to ask to get a complete picture. The trouble is that it’s not a simple sequential list of questions; it more closely resembles a decision tree. How many people have access to the drug storage area? How do they gain access? If keys are used, are they individually issued or shared keys? If they are shared, where are the keys stored? What happens if someone loses a key? Maybe a keypad is used for access. If so, who has the access code? How often is it changed? What happens if someone leaves the company? Is the access code on a sticky note somewhere? Or maybe, access is controlled by issuing badges that get scanned at the point of entry...
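To make the decision-tree character of these follow-ups concrete, here is a minimal sketch in Python (the structure, question wording, and names are my own illustration drawn from the scenario above, not an actual audit tool): each node holds a question, and the answer received determines which follow-up comes next.

```python
# Follow-up questions about drug storage access, modeled as a decision tree
# rather than a flat checklist. Each node is a question; each answer selects
# the branch of follow-ups that applies.
ACCESS_QUESTIONS = {
    "question": "How do they gain access to the drug storage area?",
    "branches": {
        "keys": {
            "question": "Are keys individually issued or shared?",
            "branches": {
                "shared": {
                    "question": "Where are the shared keys stored?",
                    "branches": {},
                },
                "individual": {
                    "question": "What happens if someone loses a key?",
                    "branches": {},
                },
            },
        },
        "keypad": {
            "question": "Who has the access code, and how often is it changed?",
            "branches": {},
        },
        "badges": {
            "question": "Are badge scans logged at the point of entry?",
            "branches": {},
        },
    },
}

def next_question(tree, answers):
    """Walk the tree using the answers gathered so far.

    Returns the next question to ask, or None if the answer trail
    leads somewhere the prepared follow-ups don't cover -- which is
    itself a cue for the auditor to start improvising and asking "why."
    """
    node = tree
    for answer in answers:
        if answer not in node["branches"]:
            return None
        node = node["branches"][answer]
    return node["question"]
```

A flat checklist collapses this tree into a single column of Y/N boxes, which is exactly how the "who actually holds the keys" nuance gets lost.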
Remember that Checklists Are Not Training
I’m not trying to insult anyone’s intelligence; I know you know you have to train your staff. Still, over time and despite best intentions, a checklist can easily morph from helpful job aid to training substitute.
An example from the GMP arena involves the investigation of test results that showed an API batch failed to meet its final chemical specification. The team who investigated the problem determined that a valve position was hindering the steady flow of raw material, accounting for its low concentration in the product. The QA department had recently conducted an audit to assure, among other things, that equipment operators were following the facility’s SOPs. The auditor’s checklist included an entry to verify the valve status on this piece of equipment: “Valve Open: Y/N.” The auditor had checked “Y,” unaware that the mostly open valve status he observed was insufficient for proper API synthesis.
Of course, had the auditor’s checklist indicated the valve needed to be completely open for the equipment to function correctly, there would probably have been a different outcome. But checklists can never capture 100% of what needs to be verified. They serve only as reminders for what the well-trained auditor knows must be done. If we forget the limitations of even the best checklists, we risk misusing them.
Remember that Using a Checklist Is No Substitute for Critical Thinking
Throughout an audit, good auditors continually weigh the importance of what they hear and what they observe. “Is this important enough to pursue?” “Might this line of questioning lead to a critical finding?” “Was that explanation reasonable?” “What is that black stuff?” In this respect, checklists are not necessarily our friends. They tend to pave over important distinctions. Auditors who rely too heavily on checklists are in danger of developing a sort of “robot” mentality in which everything has equal weight and is either one way or the other. Yes/no, high/low, compliant/noncompliant, 0/1: the binary audit.
As an example, let’s look at a common GCP audit finding: the patient has not signed the most current version of the Informed Consent Form. We’ve seen this finding in audit reports and we’ve seen it in Warning Letters, so it must be critical, right? Maybe, maybe not – it depends on the content of the unsigned form. For example, the latest update to the protocol could have removed a requirement that an invasive procedure be conducted on every study visit. Signed or not, what study subject wouldn’t be OK with that amendment? Are they at higher risk for not signing the latest form? Have their rights been trampled? Probably not too egregiously. That’s a very different scenario than if the latest version of the ICF was informing the patient that a competitive product had just been approved by FDA. Failing to disclose the existence of the new drug clearly violates both the subject’s rights and safety. Had she been informed of the alternative, the subject might have chosen to drop out of the study, gladly trading the experimental (and possibly placebo) treatment for the relative safety and convenience of the approved one.
Just because a tool can be misused is not necessarily an argument against its use. So what can we do to guard against checklist misuse? Here are two ideas that surfaced during February’s NCCSQA panel discussion entitled “The Robot Auditor: When Did We Stop Asking Why?” led by Celine Clive, Lisa Olson, and Linda Borkowski.
First, it’s important to keep your checklists current. You’ll need to update your checklists to reflect new regulations and guidances, recent citations and Warning Letters, and other current events. You’ll need to make revisions to accommodate advancements in technology or changes to standard industry practices. (For example, that facilities audit question about archive media should be modified to include cloud solutions.) You should be adding entries designed to catch issues you found during audits that are not covered by your current checklist. And you should be subtracting when you can, too -- are there any questions you can stop asking?
Second, be prepared with follow-up questions. Here is a list of generic follow-up questions that will usually bring about a productive dialogue:
- Can you walk me through that?
- Can you demonstrate that? (These questions are particularly useful when the auditor is unsure whether something is actually a finding.)
- Can you show me the documentation which supports that?
- Do outsourcing partners follow their own SOPs, or are they trained on yours? Is it documented?
- What proof do you have that it happened as you describe?