Abstract
Background: Formative colonoscopy direct observation of procedural skills (DOPS) assessments were updated in 2016 and incorporated into UK training but lack validity evidence. We aimed to appraise the validity of DOPS assessments, benchmark performance and evaluate competency development during training in diagnostic colonoscopy.
Methods: This prospective national study identified colonoscopy DOPS submitted over an 18-month period to the UK training e-portfolio. Generalisability analyses were conducted to evaluate internal structure validity and reliability. Benchmarking was performed using receiver operating characteristic (ROC) analyses. Learning curves for DOPS items and domains were studied, and multivariable analyses were performed to identify predictors of DOPS competency.
Results: Across 279 training units, 10,749 DOPS submitted for 1,199 trainees were analysed. The acceptable reliability threshold (G>0.70) was achieved with 3 assessors performing 2 DOPS each. DOPS competency rates correlated with the unassisted caecal intubation rate (rho 0.404, P<0.001). Demonstrating competency in 90% of assessed items provided optimal sensitivity (90.2%) and specificity (87.2%) for benchmarking overall DOPS competence. This threshold was attained in the following order: ‘pre-procedure’ (50-99 procedures), ‘endoscopic non-technical skills’ and ‘post-procedure’ (150-199), ‘management’ (200-249), and ‘procedure’ (250-299) domains. At the item level, competency in ‘proactive problem solving’ (rho 0.787) and ‘loop management’ (rho 0.780) correlated most strongly with the overall DOPS rating (P<0.001) and was the last to develop. Lifetime procedure count, DOPS count, trainer specialty, easier case difficulty and higher caecal intubation rate were significant multivariable predictors of DOPS competence.
Conclusion: This study establishes milestones for competency acquisition during colonoscopy training and provides novel validity and reliability evidence to support colonoscopy DOPS as a competency assessment tool.
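The ROC benchmarking step described in the Methods can be sketched in a few lines: candidate cut-offs (the fraction of DOPS items rated competent) are scored against the overall DOPS rating, and the cut-off maximising Youden's J (sensitivity + specificity − 1) is selected. This is a minimal illustration of the general technique, not the study's actual analysis code; the data below are invented for demonstration only.

```python
# Hedged sketch of ROC-based benchmark selection (Youden's J).
# All data here are illustrative, NOT from the study.

def roc_points(scores, labels, thresholds):
    """Sensitivity/specificity at each candidate cut-off.
    scores: fraction of DOPS items rated competent (0-1)
    labels: 1 if the overall DOPS rating was 'competent', else 0."""
    pos = sum(labels)
    neg = len(labels) - pos
    points = []
    for t in thresholds:
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < t and y == 0)
        points.append((t, tp / pos, tn / neg))  # (cut-off, sens, spec)
    return points

def best_threshold(points):
    # Pick the cut-off maximising Youden's J = sens + spec - 1
    return max(points, key=lambda p: p[1] + p[2] - 1)

# Invented example: 10 DOPS, item-competency fractions and overall ratings
scores = [0.95, 0.92, 0.88, 0.97, 0.80, 0.70, 0.91, 0.85, 0.60, 0.93]
labels = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
t, sens, spec = best_threshold(roc_points(scores, labels, [0.7, 0.8, 0.9]))
```

On this toy data the 0.9 cut-off wins, mirroring how the study arrived at its 90%-of-items benchmark from sensitivity/specificity trade-offs.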
| Original language | English |
| --- | --- |
| Journal | The American Journal of Gastroenterology |
| Early online date | 15 Nov 2019 |
| DOIs | |
| Publication status | E-pub ahead of print - 15 Nov 2019 |
Keywords
- Competence
- Colonoscopy
- DOPS
- assessment
- training