Endoscopy | 2021

Colonoscopy competence assessment tools: A systematic review of validity evidence.

Abstract


BACKGROUND
Assessment tools are essential for endoscopy training, as they are required to support feedback provision, optimize learner capabilities, and document competence. We aimed to evaluate the strength of validity evidence supporting available colonoscopy direct observation assessment tools using the unified framework of validity.

METHODS
We systematically searched five databases for studies investigating colonoscopy direct observation assessment tools from inception until April 8, 2020. We extracted data outlining validity evidence from the five sources (content, response process, internal structure, relations to other variables, and consequences) and graded the degree of evidence, with a maximum score of 15. We assessed educational utility using an Accreditation Council for Graduate Medical Education framework and methodological quality using the Medical Education Research Study Quality Instrument (MERSQI).

RESULTS
From 10,841 records, we identified 27 studies representing 13 assessment tools (10 adult, 2 pediatric, 1 both). All tools assessed technical skills, while 10 also assessed cognitive and integrative skills. Validity evidence scores ranged from 1 to 15. The Assessment of Competency in Endoscopy (ACE) tool, the Direct Observation of Procedural Skills (DOPS) tool, and the Gastrointestinal Endoscopy Competency Assessment Tool (GiECAT) had the strongest validity evidence, with scores of 13, 15, and 14, respectively. Most tools were easy to use and interpret and required minimal resources. MERSQI scores ranged from 9.5 to 11.5 (maximum score 14.5).

CONCLUSIONS
The ACE, DOPS, and GiECAT have strong validity evidence compared with other assessments. Future studies should identify barriers to widespread implementation and report on the use of these tools in credentialing examinations.

DOI 10.1055/a-1352-7293
Language English
Journal Endoscopy
