“We approach these accidents with a data-driven strategy.”
Official at the Federal Aviation Administration

The media has been in a frenzy over the second Boeing 737 Max 8 tragedy in less than six months. There are three very important lessons here for our AI in medicine journey.

First, the technology surged forward without sufficient human training. The plane has been the fastest-selling airplane in Boeing history (airplane “hype”), but there is now preliminary evidence that the groundwork for an assiduous human-to-machine synergy between pilot and plane was insufficient.

Second, there should be no hesitation or delay in “grounding” a machine suspected to be at fault after a negative outcome, as human lives are at stake.

We need to remember this velocity of response should there be a safety issue with any AI tool in medicine. Lastly, this human-to-machine synergy needs more human cognition when the machine has significant issues (the converse, of course, is also valid). With two new planes of the same class suffering similar erratic accidents within the first few minutes of flight, human intuition (rather than “more data”) should have prevailed and led to a decision that could have averted further loss of life.

Anthony Chang, MD, MBA, MPH, MS
Founder, AIMed
Chief Intelligence and Innovation Officer
Medical Director, The Sharon Disney Lund
Medical Intelligence and Innovation Institute (mi3)
Children’s Hospital of Orange County