Abstract

AI-based tools are increasingly developed for clinical radiology, offering potential gains in diagnostic accuracy, consistency, and workflow efficiency. Yet adoption remains limited, as many systems lack support for effective human-AI collaboration within complex diagnostic workflows. To address this gap, we visually analysed 17 FDA- and/or CE-certified commercial radiology AI systems, examining how their interface designs operationalize human-AI interaction across three key dimensions: diagnostic task coverage, information richness of AI explanations, and human control mechanisms. Using visual diagramming, typology-based task mapping, and cross-system comparison, we characterize when and how AI outputs are introduced, how understandably they are communicated, and how clinicians can influence or correct AI behaviour. Our analysis shows that most systems augment isolated subtasks, such as detection, quantification, and annotation, while higher-order or multi-phase workflow support remains rare. AI explanations typically combine categorical, numerical, and visual outputs but rarely make the underlying reasoning transparent, leaving interpretive responsibility to clinicians. Control mechanisms vary in depth and frequency, ranging from single-step initiation to multi-stage walkthroughs, yet few systems support iterative engagement with and oversight of AI output. These findings reveal significant variation and fragmentation in current design practices, emphasizing the need for standardized frameworks to evaluate and guide human-AI interaction in clinical tools. Future work should link interaction design dimensions, such as control granularity and explanation richness, to safety, usability, and adoption outcomes, ensuring that AI systems enhance rather than constrain clinician expertise and agency in diagnostic decision-making.

Keywords

Radiology; Artificial Intelligence; User Experience; Human-AI Interaction

Creative Commons License

Creative Commons Attribution-NonCommercial 4.0 International License
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License

Conference Track

Track 4 - Human-Centered AI

Dec 2nd, 9:00 AM – Dec 5th, 5:00 PM

The Missing Link: Understanding Human-AI Interaction in FDA/CE-Approved Radiology Tools
