DESRIST 2023 submission 1792: Best Paper Nomination

28.04.2023

The paper “Designing User-Centric Explanations for Medical Imaging With Informed Machine Learning” by Luis Oberste, Florian Rüffer, Okan Aydingül, Johann Rink and Armin Heinzl has been nominated for the DESRIST (Design Science Research in Information Systems and Technology) best paper award by the Sustainability and Responsible Design Track. The research is part of the Research Campus M²OLIE.

“The conference has been the leading venue for the ‘build side’ of our discipline for 18 years. Design science focuses on the development and validation of prescriptive knowledge in the information sciences. Nobel prize laureate Herbert A. Simon distinguished the natural sciences, which are concerned with explaining how things are, from the design sciences, which are concerned with how things ought to be, i.e., with devising technological artifacts to attain goals. The conference takes place from May 31 until June 2 in Pretoria, South Africa. We thank the conference chairs, the track chairs, and the review panel for their constructive feedback and for this signal of appreciation.”

Designing User-Centric Explanations for Medical Imaging With Informed Machine Learning

Abstract

A flawed algorithm released into clinical practice can cause unintended harm to patient health. Risks, regulation, responsibility, and ethics shape clinical users’ demand to understand and rely on the outputs of artificial intelligence. Explainable artificial intelligence (XAI) offers methods to render a model’s behavior understandable from different perspectives. Extant XAI, however, is mainly data-driven and designed to meet developers’ demands to correct models rather than clinical users’ expectations that explanations reflect clinically relevant information. Informed machine learning (IML), which utilizes prior knowledge jointly with data to generate predictions, is a promising paradigm for enriching XAI with medical knowledge. To explore how IML can be used to generate explanations that are congruent with clinical users’ demands and useful for medical decision-making, we conduct Action Design Research (ADR) in collaboration with a team of radiologists. We propose an IML-based XAI system that provides clinically relevant explanations of diagnostic imaging predictions. With the help of ADR, we reduce the gap between implementation and user evaluation and demonstrate the effectiveness of the system in a real-world application with clinicians. In developing design principles for using IML for user-centric XAI in diagnostic imaging, the study demonstrates that an IML-based design adequately reflects clinicians’ conceptions. In this way, IML fosters greater understandability and trustworthiness of AI-enabled diagnostic imaging.