Dr. Paul Chang, Professor of Radiology at the University of Chicago, compared artificial intelligence (AI) to a wonderful racing car: however fast and capable the car may be, it is a waste of expensive metal without roads and gasoline. For AI, the gasoline is data interoperability and the road is workflow integration. Both need to be supported by a very capable IT infrastructure, which, in Dr. Chang's words, is unfortunately lacking at the moment.
The catch-22 of healthcare
This IT immaturity significantly limits the healthcare system's ability to take full advantage of AI and other data-driven tools. Moreover, many current AI applications remain constrained because they are driven by whatever data is available, mostly from electronic health records (EHRs), rather than by compelling use cases. This has become a catch-22: most institutions will not invest in the needed IT capability until AI excels, or demonstrates that it can excel, in real use cases, yet such AI tools cannot be created without that IT capability.
Dr. Chang believes EHRs and PACS (i.e., Picture Archiving and Communication Systems) are just part of a bigger problem. Healthcare infrastructure needs to transition from merely storing data to optimally leveraging the knowledge within that data. Likewise, standard business practices (e.g., dashboards and scoreboards) need to evolve into data-driven business intelligence (e.g., real-time predictive analytics, complex event processing, machine learning and so on).
The IT infrastructure of future healthcare needs to be interoperable and scalable enough to consume and analyze large, complex data in an agile and often real-time manner (i.e., under constantly changing use cases, schemas, and data models). The field should look at abandoning expensive “vertical” scaling (i.e., upgrading each server's CPU and storage) in favor of relatively cheap “horizontal” scaling (i.e., adding more commodity servers) to maximize scalability.
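The horizontal approach can be pictured with a toy sketch: rather than replacing one server with a bigger one, records are partitioned across interchangeable commodity nodes, and capacity grows by appending a node. The node names and hash-based routing below are illustrative assumptions, not any specific product.

```python
# Toy illustration of "horizontal" scaling: records are spread across
# commodity nodes by hashing a key, so capacity grows by adding nodes.
# Node names and the routing scheme are illustrative assumptions.
import hashlib

nodes = ["node-a", "node-b", "node-c"]

def route(record_key: str, nodes: list) -> str:
    """Pick the node responsible for a record via simple hash partitioning."""
    digest = int(hashlib.md5(record_key.encode()).hexdigest(), 16)
    return nodes[digest % len(nodes)]

# Scaling out is appending another cheap node, not replacing existing ones.
nodes.append("node-d")
```

Production systems typically use consistent hashing so that adding a node remaps only a fraction of keys; this sketch only shows the direction of scaling.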
The danger of becoming complacent
Nevertheless, Dr. Chang warned against the complacency that comes from believing a monolithic “single vendor” EHR approach is adequate, because healthcare requires a more comprehensive architecture. One reason healthcare under-delivers on innovation is a mismatch between how it views data use and how data scientists view it. Many regard healthcare data as sloppy because the data are used outside their original use case, so the two perspectives need to be synchronized. A good IT infrastructure should extract, “normalize and sanitize” data, and support developing, testing and validating new tools.
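As a concrete (and deliberately simplified) picture of what “normalize and sanitize” can mean, the sketch below maps raw records onto a consistent schema and strips direct identifiers. The field names, units, and pseudonymization scheme are hypothetical assumptions, not drawn from any real EHR.

```python
# Minimal sketch of "normalize and sanitize" for EHR-derived records.
# Field names, units, and the pseudonymization scheme are hypothetical.
import hashlib

def normalize_record(raw: dict) -> dict:
    """Map a raw record onto one schema and drop direct identifiers."""
    weight = raw.get("weight_lb")
    # Normalize: convert pounds to kilograms so every record uses one unit.
    weight_kg = round(weight * 0.453592, 1) if weight is not None else None
    return {
        # Sanitize: replace the medical record number with a stable pseudonym.
        "patient_id": hashlib.sha256(raw["mrn"].encode()).hexdigest()[:8],
        # Normalize: trim and standardize free-text sex codes to M/F.
        "sex": raw.get("sex", "").strip().upper()[:1] or None,
        "weight_kg": weight_kg,
    }

records = [{"mrn": "A-1001", "sex": " male", "weight_lb": 154}]
clean = [normalize_record(r) for r in records]
```

The point is not the specific transforms but that this cleanup is infrastructure work: it has to happen reliably, at scale, before data collected for one use case can safely feed another.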
Although AI is still too immature at this stage to deliver real value, there is a need to recognize that the technology will fundamentally change the way clinicians practice medicine and, hopefully, improve the way humans interact with machines in the long run. Right now, this relationship is primitive and mono-directional (i.e., humans giving orders to machines), and the healthcare system has not found ways to use machines in a manner that maximizes human performance.
As such, Dr. Chang encouraged clinicians to trust vendors and initiate active collaborations to overcome this mismatch. Done well, this avoids massively disrupting existing IT infrastructure while still allowing affordable progress, depending on what the vendors can offer. Alternatively, a centralized data repository can be created to provide real-time data and overcome data sloppiness, although this requires more work, possibly a cultural change, and more investment.
Overall, Dr. Chang said the real goal is to move beyond reliance on EHRs and build a true healthcare architecture. He believes newer methods like federated learning can lead to a different picture, especially because they mitigate the privacy and security concerns that come with data access and use.
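The privacy appeal of federated learning comes from its core loop, federated averaging: each site trains on its own data and shares only model parameters, never patient records. The toy one-parameter model below is a minimal sketch of that idea, not a clinical implementation.

```python
# Minimal sketch of federated averaging (FedAvg), the core idea behind
# federated learning: sites train locally and exchange only parameters.
# The model is a toy one-parameter linear fit, y = w * x.

def local_update(w, data, lr=0.1, steps=20):
    """Gradient descent on y = w * x using only this site's private data."""
    for _ in range(steps):
        grad = sum(2 * x * (w * x - y) for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_round(w_global, sites):
    # Each site refines the shared model on data that never leaves the site...
    local_ws = [local_update(w_global, data) for data in sites]
    # ...and only the resulting parameters are averaged centrally.
    return sum(local_ws) / len(local_ws)

# Two hospitals whose private data both follow roughly y = 3x.
sites = [[(1.0, 3.1), (2.0, 5.9)], [(1.0, 2.9), (3.0, 9.2)]]
w = 0.0
for _ in range(5):
    w = federated_round(w, sites)
```

After a few rounds the shared parameter approaches the slope underlying both hospitals' data, even though no raw data point was ever pooled centrally.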
You may also revisit the virtual conference on demand here.