Cardiology is no stranger to artificial intelligence (AI). Machine learning is increasingly deployed to interpret electrocardiograms (ECGs), with the potential to assist physicians during invasive electrophysiology procedures. AI is also present in interventional cardiology, where pilot studies have used it to identify and evaluate coronary disease from intravascular ultrasound and non-invasive functional assessments.

For example, researchers used a neural network to perform automated pressure waveform analysis and single out, in real time, the characteristic “damping” waveform that occurs during deep intubation of the coronary arteries. This is believed to enable physicians to conduct safer angiography and improve the diagnostic accuracy of physiological assessment of coronary stenosis.

Nevertheless, AI is often believed to be part of a solution, not the solution. Once it has been implemented, there is still a journey ahead: the technology must be maintained, evaluated, and supported. Here are some of AIMed’s thoughts on scaling the technology in cardiology.

Assess the model’s generalizability

Many algorithms used in cardiology are trained on high-quality data in standardized research environments and need to be validated thoroughly on diverse real-world populations. An overfitting model (i.e., one that learns all the details and noise of a particular dataset and is unable to generalize what it has learned to new data) or an underfitting model (i.e., one that neither masters the existing dataset nor generalizes to new data) could have serious implications for patients.
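The overfitting/underfitting distinction described above can be illustrated with a small, purely synthetic sketch. Nothing here comes from a cardiology dataset: a noisy quadratic curve simply stands in for some clinical measurement, and polynomial degree stands in for model complexity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: a quadratic signal plus noise.
x_train = np.linspace(0, 1, 20)
y_train = x_train**2 + rng.normal(0, 0.05, x_train.size)
x_test = np.linspace(0, 1, 200)                       # held-out "new patients"
y_test = x_test**2 + rng.normal(0, 0.05, x_test.size)

def fit_errors(degree):
    """Fit a polynomial of the given degree to the training data
    and return (train MSE, held-out MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

# Degree 1 underfits (too simple), degree 2 matches the true signal,
# and degree 9 typically memorizes noise in the small training set.
for degree in (1, 2, 9):
    train_mse, test_mse = fit_errors(degree)
    print(f"degree {degree}: train MSE {train_mse:.4f}, held-out MSE {test_mse:.4f}")
```

The underfit model does poorly on both sets, while the over-complex model drives the training error down without a matching improvement on held-out data, which is exactly the pattern that would mislead a clinical evaluation done only on the training population.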

For example, echocardiography AI is often trained on relatively small sample sizes obtained within certain geographical boundaries, institutions, or even particular brands of echocardiography machine. Moreover, results of echocardiography AI research still largely rely on human interpretation as the ground truth, even though inherent human variability in interpretation and measurement is well known. All these factors may aggravate overfitting, particularly if researchers or physicians do not pay close attention to where the data were obtained, how representative the information is, and so on.

Infrastructure to widen the deployment of AI

A group of Harvard researchers, including Dr. Leo Anthony Celi, Associate Professor of Medicine (part-time) at Harvard Medical School and Principal Research Scientist at the Massachusetts Institute of Technology, who spoke at a recent AIMed webinar, once suggested an “evolutionary path to creating generalized data infrastructure”: building momentum for change on existing impactful research successes such as the Science and Technology Research Infrastructure for Discovery, Experimentation and Sustainability (STRIDES) initiative at the National Institutes of Health (NIH) or MIMIC from the MIT Laboratory for Computational Physiology.

Another option is for the government to mandate the use of commercially available clouds across all healthcare organizations. For instance, the Observational Medical Outcomes Partnership (OMOP) and Fast Healthcare Interoperability Resources (FHIR) both created common data schemas for storing and transferring healthcare information, designed with an AI-enabled future in mind to allow easy migration of data.
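To give a flavour of what such a common schema looks like, the sketch below assembles a minimal FHIR R4 Observation resource for a heart-rate reading. This is a simplified, illustrative example (the patient reference is hypothetical, and real FHIR resources typically carry identifiers, timestamps, and more metadata), but the LOINC code shown is the standard one for heart rate.

```python
import json

# A minimal, illustrative FHIR R4 Observation for a heart-rate reading.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "8867-4",          # LOINC code for heart rate
            "display": "Heart rate",
        }]
    },
    "subject": {"reference": "Patient/example"},  # hypothetical patient id
    "valueQuantity": {
        "value": 72,
        "unit": "beats/minute",
        "system": "http://unitsofmeasure.org",
        "code": "/min",
    },
}

# FHIR resources are exchanged as JSON (or XML), so any system that
# understands the schema can read this record without custom mapping.
print(json.dumps(observation, indent=2))
```

Because every compliant system agrees on the same resource shapes and coding systems, data captured at one institution can feed an AI model developed at another without bespoke translation layers, which is the interoperability argument the paragraph above makes.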

Cardiology AI and other technologies

AIMed Founder and Chief AI Officer at Children’s Hospital of Orange County (CHOC), Dr. Anthony Chang, said at the recent AIMed Surgery virtual conference that AI could be overlaid with extended reality to create a virtual twin of a patient, so surgeons can train in this new dimension rather than practicing on real patients. Indeed, researchers at Duke University led by Dr. Amanda Randles have developed a 3D user interface for blood flow simulation called HARVEY.

They believe the interface will guide doctors in their treatment planning: it can simulate a patient’s specific vasculature and accurately predict how a treatment plan will affect surgical outcomes. Randles and her research team also feed the simulation processes and results into a machine learning model to develop a more specific risk profile for each patient. They hope that, in the long run, the synergy between machine learning, high-performance computing, and 3D design will shed light on certain cardiology risk factors worth prolonged monitoring.

If you are interested in how technology, particularly artificial intelligence (AI), robotics, virtual/augmented/mixed realities, and many others, is influencing cardiology, and in strategies for deploying it in the clinical setting, do not miss the upcoming AIMed Cardiology virtual conference, organized in association with the American College of Cardiology and taking place on 4 November. Register your interest or get a copy of the agenda here today!


Author Bio

Hazel Tang. A science writer with a data background and an interest in current affairs, culture, and the arts; a non-med from an (almost) all-med family. Follow on Twitter.