Introduction: Artificial intelligence (AI) has been predicted to drastically improve physician efficiency and patient outcomes.1 Radiology is one field of medicine experiencing a tremendous increase in the integration of AI and computer assistance. While there is a growing literature documenting the impact of AI on the radiologist's final interpretation, there is limited literature assessing its impact on radiology residents' interpretations. Specifically, although early data showed that computer-aided detection (CAD) improved radiology residents' sensitivity in mammography2, there is a lack of further investigation into how CAD affects residents' interpretations. As CAD and other AI systems become more prevalent in radiology, it is crucial to understand their impact on radiology residents' confidence in making diagnoses without CAD. This study proposes to examine how a theoretical CAD program alters a radiology resident's analysis of fractures on radiographs.
This can potentially serve as a model for future assessments of the impact of AI/CAD on residents’ interpretations and education.
Methods: Radiology residents will complete an evaluation of trauma skeletal radiographs. A 5-point Likert scale will be used to assess the residents' confidence in determining the presence or absence of a fracture. Each resident will first see a radiograph without CAD marks and rank their level of confidence, and will then be shown the case again with a theoretical CAD mark and rank their confidence again. There will be a variety of true positive, true negative, false positive, and false negative CAD-marked images; the participants will not be informed of the sensitivity and specificity of the theoretical CAD system. Residents will have an allotted time interval to complete the evaluation to simulate the typical reading workflow of actual practice.3 After a delay of at least 2 weeks, the cases without CAD marks will be shown again and each resident will rate his/her confidence once more. This study is currently pending IRB approval. The next step will be obtaining the data through a PACS system familiar to the residents and a web-based survey system for the Likert scale.
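The abstract does not specify an analysis plan; as a purely illustrative sketch (not the authors' stated method), the paired pre-CAD/post-CAD Likert ratings described above could be tabulated by CAD-mark category as follows. The record structure, field names, and example ratings here are hypothetical.

```python
from statistics import mean
from collections import defaultdict

# Hypothetical records: one per case per resident.
# "category" is the ground-truth status of the CAD mark shown;
# confidence values are 5-point Likert ratings
# (1 = not at all confident, 5 = very confident).
responses = [
    {"category": "true_positive",  "pre_cad": 3, "post_cad": 5},
    {"category": "true_negative",  "pre_cad": 4, "post_cad": 5},
    {"category": "false_positive", "pre_cad": 4, "post_cad": 2},
    {"category": "false_negative", "pre_cad": 3, "post_cad": 2},
    {"category": "true_positive",  "pre_cad": 2, "post_cad": 4},
]

def mean_confidence_shift(records):
    """Average post-minus-pre Likert change, grouped by CAD-mark category."""
    shifts = defaultdict(list)
    for r in records:
        shifts[r["category"]].append(r["post_cad"] - r["pre_cad"])
    return {cat: mean(vals) for cat, vals in shifts.items()}

print(mean_confidence_shift(responses))
```

A positive shift for true positive/negative categories and a negative shift for false positive/negative categories would match the anticipated findings below; the actual study would presumably use an appropriate paired statistical test rather than raw means.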
Potential Findings/Conclusion: We anticipate that residents' confidence levels will increase with true positive and true negative CAD-marked cases, but will decrease with false positive and false negative CAD-marked cases, potentially changing initial interpretations. With a second review of cases without CAD marks after a 2-week delay, it is uncertain whether the residents' confidence levels will decrease without the assistance of CAD. Part of radiology residency training is to develop confidence in one's diagnostic accuracy. One potential harm of integrating CAD early into training is impairment of residents' ability to make diagnoses independently.
1. Jha S, Topol EJ. Adapting to artificial intelligence: radiologists and pathologists as information specialists. JAMA. 2016 Dec 13;316(22):2353-4.
2. Balleyguier C, Kinkel K, Fermanian J, Malan S, Djen G, Taourel P, Helenon O. Computer-aided detection (CAD) in mammography: does it help the junior or the senior radiologist? European Journal of Radiology. 2005 Apr 30;54(1):90-6.
3. Fleishon HB, Bhargavan M, Meghea C. Radiologists' reading times using PACS and using films: one practice's experience. Academic Radiology. 2006 Apr 30;13(4):453-60.
MEDICAL IMAGING & BIOMEDICAL DIAGNOSTICS
Author: Kassie McCullagh
Coauthor(s): Kassie McCullagh, MD; Stacy O’Connor, MD
Status: Work In Progress