“The aim of science is to seek the simplest explanations of complex facts.”

— Alfred North Whitehead, The Concept of Nature


This timely article, just published in Science, offers a useful and insightful perspective on explainable AI in health care. It clearly identifies the black-box nature of predictive algorithms as the core source of users’ skepticism and the related problems of low trust and slow uptake. The authors argue that the consensus in favor of explainable AI in healthcare overstates its benefits while undercounting its drawbacks.

According to the authors, the black box of AI means that the estimated function relating inputs to outputs is not understandable at the human level, owing to the large number of parameters and the nonlinear ways they interact. Interpretable AI/ML uses a transparent function in a form that is relatively easy to understand (which rules out deep learning). Explainable AI/ML, by contrast, finds a white box that partially mimics the behavior of the black box; such methods include explanations of which attributes of the input data matter most to a specific prediction, or an easy-to-understand linear model that produces outputs similar to those of the black-box algorithm.
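To make the white-box idea concrete, here is a minimal sketch of a local linear surrogate, the kind of post hoc explanation the authors describe. The `black_box` function below is purely a stand-in for illustration (not any model from the article): we perturb the inputs around one case, query the black box, and fit a linear model to those responses, whose weights approximate each feature's local influence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "black box": a nonlinear function of three inputs.
# In practice this would be a trained model's predict function.
def black_box(X):
    return np.tanh(2.0 * X[:, 0]) + X[:, 1] * X[:, 2] + 0.5 * X[:, 1] ** 2

# The individual case whose prediction we want to explain.
x0 = np.array([0.2, -0.1, 0.4])

# Sample small perturbations around x0 and query the black box.
X = x0 + 0.1 * rng.standard_normal((500, 3))
y = black_box(X)

# Fit a local linear surrogate (the "white box") by least squares.
A = np.hstack([X, np.ones((len(X), 1))])  # append an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
weights, intercept = coef[:3], coef[3]

# The weights approximate each feature's local effect on this prediction.
print("local feature weights:", np.round(weights, 3))
```

Note that the surrogate only mimics the black box near `x0`; as the authors stress, such explanations are post hoc approximations, not the model's actual reasoning.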

After these definitions, the authors elucidate the disadvantages of explainable AI: the black box remains the most accurate model; the white box is an imperfect approximation; and the explanations are post hoc. Of note, the FDA-approved models are non-interpretable, black-box models using deep learning.

The authors also outline why explainable AI, as currently practiced, is limited: it yields only ersatz understanding and lacks robustness. A further disadvantage is that explanations can mislead in the hands of imperfect users.

Finally, the authors caution that healthcare workers should be wary of explainable AI, as it may well not transform the healthcare landscape. While this work offers no truly substantive surprises, it is nevertheless a valuable perspective to counter the hype around explainability for AI in healthcare.

The full article can be read here.