One of America’s leading professional medical associations, the American College of Radiology (ACR), has created a Data Science Institute (ACR DSI).

The ACR was founded in 1924 and represents more than 38,000 radiologists, radiation oncologists, nuclear medicine physicians and medical physicists.

The ACR DSI is collaborating with radiologists, industry leaders, government agencies and patients to facilitate the development and implementation of AI applications that will help radiology professionals provide improved medical care.

We spoke to ACR Executive Vice President and Chief Information Officer (CIO) Mike Tilkin to get the latest exclusive information on this momentous development:

AIMed: What is the short-term goal of the ACR DSI?

Mike Tilkin: I think we’d like to see a good base of clinically relevant examples of AI, from identification of a clinical need, to creation of the algorithm, to validation, to the assessment process, so we can create real-world evidence and data demonstrating how these are performing, what the value add is, and the extent to which we can measure outcomes.

I think there are interim milestones along the way, such as use case and data set creation. But the goal over the next few years is to establish real-world evidence, continually fine-tune and evaluate what we’re doing, and make sure we’re focused on the right areas and providing the right value.

Good, concrete case studies help people see and realise the value, help us continually understand how to focus and refocus our efforts, and point industry and our members in the right direction.

As long as we’re defining problems crisply, and we’re tuning algorithms to address those specific problems, I think we’re heading in the right direction in terms of realising value.

AIMed: Can you give examples of specific use cases the ACR DSI is focusing on?

MT: One example is a Kaggle competition we were involved in, which was about detecting cancer in lung screening images from a clinical trial.

It was an interesting and useful experience, but what we identified quickly is that we had already created a clinical classification system for lung nodules.

Clearly it would be better for the algorithm to produce, for example, a Lung-RADS classification, which would give the referring physician something actionable to do with that finding.

Simply having a probability as to whether a patient has a cancerous nodule is interesting, but it’s not nearly as clinically relevant and useful as giving a classification outcome that can be used to determine treatment.
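That distinction can be sketched in code. The snippet below is a deliberately simplified, hypothetical illustration of the idea: instead of returning a raw probability, the system maps a detected nodule to a Lung-RADS-style category that implies a next step. The thresholds and management notes are rounded placeholders, not the actual ACR Lung-RADS criteria.

```python
# Illustrative sketch only: maps a detected solid nodule's diameter to a
# simplified Lung-RADS-style category. Thresholds are rounded examples
# and do not reproduce the real ACR Lung-RADS specification.

def lungrads_like_category(nodule_found: bool, diameter_mm: float = 0.0) -> str:
    """Return a simplified, Lung-RADS-style category for a screening finding."""
    if not nodule_found:
        return "1"    # negative: continue routine screening
    if diameter_mm < 6:
        return "2"    # benign appearance: continue annual screening
    if diameter_mm < 8:
        return "3"    # probably benign: short-interval follow-up CT
    if diameter_mm < 15:
        return "4A"   # suspicious: further evaluation recommended
    return "4B"       # very suspicious: prompt workup recommended

print(lungrads_like_category(True, 7.2))  # → 3
```

The point is that each output category carries a management implication the referring physician can act on, whereas a bare probability would still need interpretation.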

That’s an example of moving from what is an interesting data science problem to something that fits concretely in the clinical workflow, produces actionable results and can add value quickly in a well-defined way.

At this stage some of the best use cases and early wins are: prioritizing worklists so you can go after the most critical cases first, improving operations and efficiency, and identifying potential problems and critical cases early.
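As a rough sketch of what that first win could look like, the snippet below orders cases by an AI-estimated criticality score so the most urgent studies are read first. The case IDs and scores are invented for illustration; in practice the scores would come from an upstream triage model.

```python
import heapq

def prioritize(worklist):
    """Order case IDs from most to least critical.

    `worklist` is a list of (case_id, criticality_score) pairs; the score
    is assumed to come from a hypothetical upstream AI model.
    """
    # Negate scores because heapq implements a min-heap.
    heap = [(-score, case_id) for case_id, score in worklist]
    heapq.heapify(heap)
    return [case_id for _, case_id in (heapq.heappop(heap) for _ in range(len(heap)))]

cases = [("CT-104", 0.12), ("CT-101", 0.97), ("CT-102", 0.55)]
print(prioritize(cases))  # → ['CT-101', 'CT-102', 'CT-104']
```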

One of the biggest challenges is generalizability, which is one of the reasons ongoing monitoring is a key part of our strategy.

We want to be appropriately careful that algorithms perform, and continue to perform, as you expect, and that there isn’t implicit bias or problems when you try to generalize to patient populations different from those the algorithms were trained on.

This registering, monitoring, ongoing assessment process will be a critical part of this and will help us understand the value proposition and where this is providing the greatest opportunities to improve.

AIMed: What is the biggest blocker to implementing AI tech into the clinical setting?

MT: A lot of the algorithm developers would say getting well-qualified, well-labelled data is the biggest challenge. In a world where you’re trying to look at supervised learning as a way to train these algorithms, you need ground truth so the algorithm can classify something it hasn’t seen before.

That means you need really good data sets to train and test these algorithms. Lots of data that isn’t well-qualified is not nearly as good as a little bit of data that really is well-qualified and curated.
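To make the point about ground truth concrete, here is a minimal sketch of the standard supervised-learning setup the answer alludes to: a labelled dataset is split into a training set for fitting the algorithm and a held-out test set for checking it on cases it hasn’t seen. The case IDs and labels below are made up for illustration.

```python
import random

def train_test_split(labeled_cases, test_fraction=0.2, seed=42):
    """Shuffle labelled cases and split off a held-out test set."""
    cases = list(labeled_cases)
    random.Random(seed).shuffle(cases)  # fixed seed for reproducibility
    n_test = int(len(cases) * test_fraction)
    return cases[n_test:], cases[:n_test]  # (train, test)

# Hypothetical ground-truth data: (case_id, label) pairs.
labeled = [(f"case-{i:03d}", i % 2 == 0) for i in range(100)]
train, test = train_test_split(labeled)
print(len(train), len(test))  # → 80 20
```

The quality argument follows directly: if the labels here were wrong, both the training and the evaluation would be misleading, no matter how many cases were collected.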

We’re trying to work with our members and partner facilities to step up to this challenge. Part of our effort is to create use cases, identify the problem areas that we want to target, and then help provide a framework and tools so we can help the folks who have data in our facilities create useful data sets that can be leveraged in the space.

AIMed: What has been the reaction of the ACR members to the DSI?

MT: We’re in the middle of a hype-cycle, and the good and bad of the hype-cycle is that people get excited about the promise, but people are also concerned. You also have the potential to build expectations up and then struggle to get off the ground because industry is unable to meet those expectations.

In general we’ve gotten very positive reactions from our members in terms of this being an important tool set that radiology needs to embrace and help guide so that we can really leverage it to improve the care of patients.

As a community of physicians who have been heavily reliant on technology to do what they do, I think there’s a general appreciation for the fact that this is a wave of technology that will be important for healthcare in general, and particularly within the practice of radiology.

They need to take a role in helping shepherd the development of these toolsets and in deciding how we best leverage them. I think it’s been a mix of needing to educate and help frame things, but in general the receptivity to what we’re doing, and to what the potential is, has been positive.

AIMed: As CIO of the ACR how do you work to drive innovation and create a culture where innovation is accepted?

MT: Communication and education become important, as does providing a pathway to make sure our clinicians, vendors and regulatory folks have clean communication channels across that landscape so we can reduce friction.

We need to make sure the core concerns of the various constituencies are heard and understood so we can best come together to address them and try to provide some underlying framework.

Part of my role is helping facilitate unlocking the basic infrastructure, but a large part of it is providing the facilities and the ability to bring together our clinicians and the different parties they need to talk to.

When we talk about creating use cases, really what it’s about is convening large panels of physicians, harnessing their expertise and packaging it in a way that can be translated to the algorithm developers.

Likewise, we make sure the algorithm developers are able to communicate adequately with the radiology workflow vendors, and that that pathway is frictionless.

I need to ensure the feedback loop works: as new technology is implemented in the clinical setting, our clinician experts get the results so they can adjust their thinking and promote the right guidance.

Nurturing that feedback loop and making sure we have clean communication channels is one of the core things that I work on.

AIMed: Would you have any advice to CIOs who are taking the first steps to innovate with AI?

MT: Try and reduce the fear factor and mystique. To even get the conversation going and to make progress on implementing technology you’ve got to start from some common base of understanding.

One of the early challenges that people struggle with is a lack of understanding of what the technology is and how it can be used, both the good and the bad, which gets in the way of making progress.

You don’t have to have the level of understanding that an algorithm developer would have: the nuts and bolts and the finest details of how algorithms are implemented.

But you need enough of a base of understanding that it takes away the fear and mystique and allows for a practical, effective conversation to occur about workflow or the ethical and legal concerns.

If you don’t have a common basis of understanding it can be hard to have those important conversations and really get things moving.

This interview originally appeared in issue 03 of AIMed Magazine.