How does your ERCP practice and performance compare with others? An opportunity for benchmarking.
Over the last decade there has been increasing interest in quality issues in endoscopy. I have argued that endoscopists should keep track of their activities and outcomes, and be prepared to provide snapshots of their practice ("report cards") when asked for them by potential patients and payers. I believe that those who do so will eventually have a practice advantage. The ASGE, with other organizations and individuals, has striven to define the metrics of quality performance [3-5]. What has been lacking hitherto is an infrastructure to facilitate the collection and analysis of these data so that practitioners can compare themselves with others ("benchmarking").
Several pilot projects are now in progress in the USA. One is the "ERCP Quality Network", which I initiated and which is supported by Olympus America. Practitioners (or their staff) upload key data points for each case (indications, sedation/anesthesia, therapies, successes, and adverse events) to a central website, without identifying the patients. Data are analyzed and updated immediately to provide a "report card" that is available to the person who reported it (and to no one else). That person can then compare their own data with the average of all other contributors (again without being able to identify them). As of November 2008, 66 endoscopists had entered over 7500 ERCP procedures in the first phase of this study. The figure is a nice example of being able to compare one's own biliary cannulation rate (in blue) with that of all of the other contributors (not individually identified).
Skeptics worry about the quality of self-reported data, and some are concerned that the data may show them in a bad light. One answer to these criticisms is that the data are submitted anonymously; the only person who could be deceived by submitting inaccurate data is the submitter. And, as can be seen in the figure, some people are indeed reporting poor success rates. There is no intention to publish "league tables".
This and other pilot projects are helping to explore the mechanics and pitfalls of this process in advance of a national scheme now being discussed by the ASGE and ACG.
In moving forward, we now seek the help of other "ERCPists" worldwide. This is an invitation to participate. If you are interested, or just wish to learn more, send me a fax at (+1) 843 876 4705 with your email address.
1. Johanson JF, Schmitt CM, Deas TM Jr, et al. Quality and outcomes assessment in gastrointestinal endoscopy. Gastrointest Endosc 2000; 52: 827-30.
2. Cotton PB. How many times have you done this procedure, doctor? Am J Gastroenterol 2002; 97: 522-3.
3. Faigel DO, Pike IM, Baron TH, et al. Quality indicators for gastrointestinal endoscopic procedures: an introduction. Gastrointest Endosc 2006; 63(4 Suppl): S3-9.
4. Cotton PB, Hawes RH, Barkun A, et al. Excellence in endoscopy: toward practical metrics. Gastrointest Endosc 2006; 63(2): 286-91.
5. Rex DK, Bond JH, Winawer S, et al. Quality in the technical performance of colonoscopy and the continuous quality improvement process for colonoscopy: recommendations of the U.S. Multi-Society Task Force on Colorectal Cancer. Am J Gastroenterol 2002; 97: 1296-1308.