

Explainable AI in Medical Informatics and Decision Support

Call for papers

Following a successful workshop on explainable AI at the Cross Domain Conference for Machine Learning and Knowledge Extraction (CD-MAKE) 2018, we are launching this call for a special issue of BMC Medical Informatics and Decision Making, with the possibility of presenting accepted papers at the next session on explainable AI at the CD-MAKE 2020 conference in Dublin, Ireland, at the end of August 2020.


We want to inspire cross-domain experts in artificial intelligence and machine learning to stimulate research, engineering and evaluation in, around and for explainable AI, with the goal of making machine decisions transparent, re-enactive, comprehensible, interpretable (and thus explainable), re-traceable and reproducible; the latter is the cornerstone of scientific research itself.

We foster cross-disciplinary and interdisciplinary work, including but not limited to:

  • Novel methods, algorithms, tools for supporting explainable AI
  • Proof-of-concepts and demonstrators of how to integrate explainable AI into workflows
  • Frameworks, architectures, algorithms and tools to support post-hoc and ante-hoc explainability and causal machine learning
  • Theoretical approaches of explainability ("What is a good explanation?")
  • Towards argumentation theories of explanation and issues of cognition
  • Comparison of human intelligence and artificial intelligence (HCI -- KDD)
  • Interactive machine learning with human(s)-in-the-loop (crowd intelligence)
  • Explanation User Interfaces and Human-Computer Interaction (HCI) for explainable AI
  • Novel Intelligent User Interfaces and affective computing approaches
  • Fairness, accountability and trust
  • Ethical aspects, law and social responsibility
  • Business aspects of explainable AI
  • Self-explanatory agents and decision support systems
  • Explanation agents and recommender systems
  • Combination of statistical learning approaches with large knowledge repositories (ontologies)

The grand goal of future explainable AI is to make results understandable and transparent, answering the question: "Can we explain how and why a specific result was achieved by an algorithm?"

Submission for this special issue is open until 31 December 2019. The special issue is overseen by Section Editor Andreas Holzinger.

Authors who aspire to present their paper at the CD-MAKE 2019 conference should submit their manuscripts before 30 April 2019. The call for papers will remain open throughout 2019, allowing high-quality manuscripts from the CD-MAKE conference in August 2020 to be included and bringing together a unique, high-quality collection of work on explainable AI.

For further information, please click here. For an introduction to the topic, please see the online content provided by Prof. Holzinger. 

For pre-submission enquiries, please contact the in-house Editor (Dirk Krüger) at bmcmedinformdecismak@biomedcentral.com.

  1. A variant of unknown significance (VUS) is a variant form of a gene that has been identified through genetic testing but whose significance for the function of the organism is not known. A current challenge in precis...

    Authors: Priscilla Machado do Nascimento, Inácio Gomes Medeiros, Raul Maia Falcão, Beatriz Stransky and Jorge Estefano Santana de Souza

    Citation: BMC Medical Informatics and Decision Making 2020 20:52

    Content type: Technical advance

    Published on:

  2. The penetration of mobile technology has grown exponentially, and it has become part of our lifestyle at all levels. The use of the smartphone has opened up a new horizon of possibilities in the treatment of health...

    Authors: A. Hernández-Reyes, G. Molina-Recio, R. Molina-Luque, M. Romero-Saldaña, F. Cámara-Martos and R. Moreno-Rojas

    Citation: BMC Medical Informatics and Decision Making 2020 20:40

    Content type: Study protocol

    Published on:

  3. Cloud storage facilities (CSF) have become popular among internet users. There are limited data on CSF usage among university students in low- and middle-income countries, including Sri Lanka. In this study we pre...

    Authors: Samankumara Hettige, Eshani Dasanayaka and Dileepa Senajith Ediriweera

    Citation: BMC Medical Informatics and Decision Making 2020 20:10

    Content type: Research article

    Published on:

  4. In classification and diagnostic testing, the receiver-operator characteristic (ROC) plot and the area under the ROC curve (AUC) describe how an adjustable threshold causes changes in two types of error: false...

    Authors: André M. Carrington, Paul W. Fieguth, Hammad Qazi, Andreas Holzinger, Helen H. Chen, Franz Mayr and Douglas G. Manuel

    Citation: BMC Medical Informatics and Decision Making 2020 20:4

    Content type: Research article

    Published on:

  5. Fetal heart rate (FHR) monitoring is a screening tool used by obstetricians to evaluate the fetal state. Because of their complexity and non-linearity, a visual interpretation of FHR signals using common guideli...

    Authors: Zhidong Zhao, Yanjun Deng, Yang Zhang, Yefei Zhang, Xiaohong Zhang and Lihuan Shao

    Citation: BMC Medical Informatics and Decision Making 2019 19:286

    Content type: Research article

    Published on:

  6. Predictive modeling with longitudinal electronic health record (EHR) data offers great promise for accelerating personalized medicine and better informing clinical decision-making. Recently, deep learning models...

    Authors: Rawan AlSaad, Qutaibah Malluhi, Ibrahim Janahi and Sabri Boughorbel

    Citation: BMC Medical Informatics and Decision Making 2019 19:214

    Content type: Research article

    Published on: