Dr Girish Shirali is a Pediatric Cardiologist and Co-Director of the Ward Family Heart Center at Children’s Mercy Hospital (CMH). He leads the multidisciplinary team at CMH that created the Cardiac High Acuity Monitoring Program (CHAMP), a cloud-based app that enables remote monitoring of babies with single ventricle heart disease, allowing them to spend more time at home.
“I began thinking about CHAMP in 2011. At that time, over 20% of babies who were born with hypoplastic left heart syndrome, one of the most complex forms of congenital heart disease, were dying unexpectedly after going home from the hospital. This was obviously awful for the families, especially after having endured diagnosis, heart surgery and a long hospitalization. It was incredibly draining for them and for us too.
At that time, and even today in centers that don’t have CHAMP or similar technology, when these babies were sent home after their first surgery, parents were given a three-ring binder. They had to write down and track their baby’s vital signs and other information. Everything was hand-written, and they had to make calculations: a rather primitive way of doing things. The family was expected to call the care team if the baby’s measurements crossed certain critical thresholds. This system was prone to delays, and put a lot of responsibility on the family. Besides, families would complain that when they brought the binder in for their regular clinic visits, the information in it was not always reviewed.
Above all, given that there was a known and unacceptably high mortality rate, we felt this was a setup for families to feel guilt-ridden and traumatized if their baby didn’t survive. Our original goals were to take away some of the logging responsibilities from the parents, so they could concentrate on taking care of their babies. We also aimed to decrease delays in care, and hopefully improve outcomes. Around that time, tablet PCs had just hit the consumer market. Our vision was to have families enter data into those instead of the binders, and to have that data transmitted instantly to the cloud, where it could be analyzed using standard algorithms, which would alert the care team, who could then circle back to the family.
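That pipeline, with families entering data that is analyzed against standard thresholds before the care team is alerted, can be sketched roughly as follows. The specific vital signs, threshold values, and names used here are illustrative assumptions for the sketch, not CHAMP’s actual clinical criteria:

```python
# Illustrative sketch of threshold-based alerting on home-entered vital signs.
# The vitals and threshold values below are hypothetical examples only,
# not CHAMP's actual clinical rules.

from dataclasses import dataclass

@dataclass
class VitalSigns:
    spo2_percent: float    # oxygen saturation
    weight_grams: float
    feed_volume_ml: float  # total intake over the past 24 hours

# Hypothetical "red flag" rules for an interstage single-ventricle infant.
THRESHOLDS = {
    "low_spo2": lambda v: v.spo2_percent < 75.0,
    "high_spo2": lambda v: v.spo2_percent > 90.0,
    "poor_intake": lambda v: v.feed_volume_ml < 400.0,
}

def check_entry(entry: VitalSigns, previous_weight: float) -> list[str]:
    """Return the alerts triggered by one day's home data entry."""
    alerts = [name for name, rule in THRESHOLDS.items() if rule(entry)]
    # Weight loss between consecutive days is another classic warning sign.
    if entry.weight_grams < previous_weight - 30.0:
        alerts.append("weight_loss")
    return alerts

today = VitalSigns(spo2_percent=72.0, weight_grams=3400.0, feed_volume_ml=450.0)
print(check_entry(today, previous_weight=3450.0))  # ['low_spo2', 'weight_loss']
```

In a deployed system, any non-empty alert list would page the care team automatically rather than relying on the family to notice a crossed threshold and phone in.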
Our dedicated informatics and clinical teams were up for the challenge. We quickly realized we were heading for a completely transformed clinical paradigm: we were getting real-time information from families, so our entire model of care had to change. One other thing that we did, which was purely serendipitous, was deciding to use the camera on the tablet PC to record a video of the baby every day.
We have been funded by Children’s Mercy as well as by the Claire Giannini Foundation, which enables families to get tablet PCs with cellular connectivity for free during the critical 4-6 month period when the babies are at risk for dying. As we started architecting this project, Microsoft helped our team to optimize the cloud-based infrastructure and we went live with CHAMP in 2014. Our young families are digital natives, for the most part, and they really like using CHAMP. We used their input to optimize the way they receive summary information, and developed CHAMP in nine different languages.
Our early results were promising. Many hospitals across the country (and internationally) expressed interest in adopting CHAMP. Thanks to continuing philanthropic support, we were able to expand the program across the USA. Presently, we have nine partner sites across the country, with over 400 babies in over 20 states, including Washington, Oregon, Alaska, Utah, Idaho, Nevada, Kansas, Missouri, Nebraska, Texas, Arkansas, Oklahoma, Ohio, Virginia, Maryland, West Virginia and Washington, DC. We have seen the mortality rate fall dramatically, from 20% to below 2.3%: a huge improvement.
One of the unexpectedly wonderful things we realized about CHAMP is that because the parent is holding the tablet PC to record video, we see the babies interacting with the parent in their home environment. That was pretty magical, as we seldom see it in the clinical environment, where we focus on medical evaluation of the babies, not on how they interact with their parents at home. We have learned a lot from that: the way they breathe, make eye contact and smile in response to their parents. When they don’t feel good, we can see it not only in the way they breathe, or their color, but also in the way they interact.
Overall, CHAMP has helped us minimize delays in providing care. It has also hopefully helped address disparities in care, regardless of the family’s socioeconomic status, ethnicity or distance from the surgical center.
As part of the plan to scale up the project, we have developed a downloadable version of CHAMP (for both iOS and Android, available through the App store), so families can use their own smartphone or tablet. That will decrease costs and improve portability.
Indeed, not all parents are receptive to the idea of videos of their babies being stored: some are concerned about privacy. So, some of them would just not video their baby, and others would video the baby neck-down. That diluted the purpose of the video, which is to capture the expressions of the babies to see if they appear comfortable and interactive. An experienced care provider can tell whether a baby is sick or not, just by looking. We reassure all families that CHAMP is fully HIPAA-compliant. The only people who can view these videos are the individual care team responsible for taking care of these babies. Beyond that, only approved research projects are given access to the videos.
Something that we are really excited about is how having all of the data from all patients lends itself to predictive analytics, in a manner that the old binder-based method simply could not do. We are looking into whether the data that we have acquired, which includes clinical outcome data, can be used to predict hospital readmission or adverse outcomes using machine learning.
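The kind of model that pooled monitoring data could feed might look like the toy sketch below: a logistic regression trained to flag likely readmission. The features, labels, and weights here are entirely synthetic assumptions for illustration; a real model would be trained on clinical data with a vetted machine-learning library and proper validation.

```python
# Toy sketch of readmission-risk prediction from home-monitoring features.
# All data below is synthetic and for illustration only.

import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def train(rows, labels, lr=0.1, epochs=500):
    """Plain stochastic-gradient-descent logistic regression."""
    w = [0.0] * len(rows[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of log-loss w.r.t. the logit
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Probability of readmission for one feature vector."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Synthetic features: [mean SpO2 deficit, weight-loss days, alert count]
rows = [[0.1, 0, 1], [0.8, 3, 5], [0.2, 1, 2], [0.9, 4, 6], [0.0, 0, 0], [0.7, 2, 4]]
labels = [0, 1, 0, 1, 0, 1]  # 1 = readmitted within 30 days (made up)

w, b = train(rows, labels)
print(f"predicted risk: {predict(w, b, [0.85, 3, 5]):.2f}")  # high for an alert-heavy week
```

The point of the sketch is the data flow, not the model: because every family's entries land in one structured store, features like these can be computed uniformly across all patients, which the binder system could never support.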
One of my dreams is to be able to use techniques such as deep learning to analyze the video data, to see if computing can discern the non-quantifiable instinct of a clinician who can tell, just by looking, whether a baby is sick or not. If we can make that work, it could have broad application: not just for these babies. Now that really would be something.”