New research from Pfizer has revealed that while artificial intelligence (AI) might feel new and unfamiliar to patients in a healthcare setting, the patients we interviewed could, crucially, see the benefits of it being used alongside doctors to augment traditional methods of delivering care.
Pfizer’s research, conducted with a group of cancer patients in the UK*, reveals several key patient insights and unmet needs where AI-supported solutions would be welcomed during cancer diagnosis and ongoing treatment.

The research involved qualitative interviews with a group of 10 male and female cancer patients who had undergone treatment through the Velindre Cancer Centre in Cardiff for different types of cancer.

Steve Cartwright, Patient Experience Manager at Pfizer Oncology, welcomed the results of the research: “During our collaborations with the NHS and patient organisations, we have recognized the need for greater support for cancer patients in the UK. We believe in the potential of AI to achieve improvement, but it is essential to allow patients to lead the direction of travel so that they are comfortable with the level at which AI is introduced into their care.”

In the research, patients typically described the difficulty of taking in information at the point of their diagnosis: “You almost stop listening. Sometimes I had to ask, can we just slow down?” The initial emotional shock of the diagnosis blocks out complex but crucial new information, with many saying they weren’t prepared and didn’t think to take notes.

Many patients in the research group also highlighted that the same level of detail isn’t always sent via letters following appointments unless requested, leaving some still searching for answers to pressing questions after their consultations. Leaflets they were given proved too generic, and “Dr Google” was too frightening and overwhelming, surfacing information patients didn’t want to cross paths with, such as survival statistics. It was only after their first few appointments that they realized they would either need to bring someone with them or take their own notes to create their own personalized patient record – a significant additional burden at a time of extreme stress.

Patients in the research study also felt an audio record could provide significant help in capturing the critical details around their diagnosis and treatment – something they could refer back to at their own pace and in their own time: “Having the diagnosis through a digital record means you could hear everything back as you’re not really taking it all on board. Later, you doubt yourself so much with what’s just been said, and your emotions take over from the facts. If there was a digital record you could go back to, it would probably make people feel less panic. You could listen to the facts again of what it is exactly you’re dealing with.”

When exploring what other forms of technology could have supported their cancer journey, the research group also enthusiastically suggested a digital support nurse, similar to a chatbot. Patients were vocal about wanting support between consultations or late at night and thought a chatbot could allay their worries in between appointments, if the bot could answer many of the common questions that a cancer support nurse or oncologist typically would: “Sometimes you get these questions that keep you awake at night, and you can’t ring anyone. Sometimes I think a chatbot could have helped then.”

“It’s nice not to have to drive 40 miles to have an appointment if I can get the response I need virtually.”

“You go through two weeks of fretting, your anxiety builds, and by the time you go into your next appointment, you’re feeling really wound up, which is all really counter-productive to your immune system and overall health during cancer.” Here’s where a digital support nurse with quick, credible answers could prove transformative.

Nearly all the participants we interviewed had experience of using chatbots online in a non-medical context, but experience varied, with some expressing frustration when the technology was not correctly implemented: “It can waste a lot of time taking you through a series of questions that sometimes a human could answer in 5 minutes.”

Past experience of using chatbots has a determining effect on usage, so it’s clear that the user experience in a medical scenario needs to be simple, seamless and efficient to encourage trust and adoption. Even for patients who might not describe themselves as “tech-savvy”, so long as the chatbot is inviting and easy to use, it can still be an approachable and convenient way to get instant information and advice, alleviating the mental load at a key stage of the treatment journey.

That journey also encompasses life after successful cancer treatment. All patients spoke of anxiety or post-traumatic stress that stayed with them for some time after treatment. Having to distinguish between lingering side-effects and cancer re-growth, for example, proves hugely traumatic. The patients voiced concerns that some support systems disappear once treatment ends, such as regular contact with an assigned cancer support nurse as a point of contact for questions in between hospital visits. After treatment, follow-up oncology appointments also become less frequent, leaving many without reassurance and with heightened levels of fear and vigilance for extended periods until they can next speak with a nurse or oncologist. The physical after-effects also take a huge toll on their mental health: “I kept thinking I shouldn’t be feeling so anxious now I’ve finished treatment…but I felt and looked completely different. It can completely knock your confidence.”

In the absence of referrals or recommendations, most of the patients mentioned having to find peer support groups themselves: “I ended up finding the most useful information through my breast cancer support group to help me manage side effects from treatment.”

“Being part of an online forum was really important for me. Sometimes there are things you don’t feel comfortable discussing with friends or family because you don’t want to worry them more, so being able to speak with a separate group who really know what you’re going through is quite helpful.”

Regular mindfulness support through apps without subscription costs was also seen as something that would be a huge benefit. Although apps like Calm and Headspace were familiar to the group, the subscription costs were hard to justify when many felt they were only just recovering from the financial strain that cancer can also bring.

Personalized information, tailored to their individual needs and care, was also highlighted as an area where patients felt AI could empower them: “Personalizing more of your care to you and your needs. That’s where I think AI could have massive uses.”

The patients also commonly expressed a desire to feel more involved in their healthcare and not to be wholly dependent on healthcare professionals as the sole providers of information: “Contact points and sign-posting to useful resources along your cancer journey could be so helpful.” This is clearly a huge opportunity for an AI-powered resource such as a digital patient library – a sophisticated search tool to direct patients towards relevant online and offline resources, without having to scroll through Google.

But while the report highlights the opportunities for AI to positively impact a patient’s oncology treatment journey, it also sheds light on the real barriers to its integration and adoption. “I wouldn’t like to rule out the human altogether. It could be very difficult to have a machine answer very detailed medical questions, but I’m very happy to talk to a machine that can still provide a great deal of information and learning.”

The need to fully gain patients’ trust will be the most vital step in the ongoing development of AI in healthcare. Human trust is built on our understanding of how other people think and on first-hand experience of their reliability, which together cement a feeling of reassurance and safety. But AI remains a relatively new and daunting concept to most people, often working unheralded in the background, so positive, reportable interactions with it in healthcare settings have been less common to date.

The research underlines that some patient concerns about AI stem from their experiences across other customer service settings, while other perceptions were shaped by what the research group picked up in the media. Nearly all the patients had heard the term ‘AI’, but their awareness was centered around what they’d seen in films and TV, such as fictional depictions of robotics and predictive technology, at times with more negative and threatening connotations. Few knew how AI is currently being used in healthcare, but if patients can see beyond the unknown and unknowable ‘black boxes’ of AI, trust and acceptance will be more forthcoming.

There are encouraging signs. Artificial intelligence is already embedding itself in patients’ everyday lives, with over half of the participants stating they use Amazon Alexa in their home, albeit just for playing music or checking the weather. In these instances, the technology can actively be seen to work instantly and effectively, raising acceptance and trust that a machine can make decisions in a person’s best interests.

More encouraging is the common theme in the research that patients aren’t averse to the adoption of AI when it is used alongside in-person care and support: “I would be interested in using AI-based solutions when it comes to my health care, as long as I knew I could also speak with a doctor or nurse if I needed other answers.”

If harnessed correctly, all the patients in the research group could see the benefits of AI being used alongside doctors: “A very good computer can hold a great deal more knowledge than any human brain.”

“If we use AI intelligently, we can improve speed, we can improve accuracy of analysis, and I think we should.”

The findings highlight that patients need to be reassured that AI is a powerful tool that will augment clinicians’ abilities, not substitute for human intelligence. Human intelligence excels at intuition, creativity and learning quickly, while AI is a powerful statistical technique that can find patterns across millions of examples (patients may also need reminding that AI algorithms require tens of thousands, if not millions, of examples to learn, which in human terms isn’t very smart at all).

As the research indicates, patients prefer empathetic interactions and believe their medical needs are unique and cannot be addressed by algorithms alone. In short, they recognize that AI is smart, but they also understand it’s a long way from being able to fully interpret a patient’s nuanced responses.

Even though awareness of emerging applications of AI in healthcare was generally low amongst the patient research group, they nonetheless recognize that collaboration between humans and technology is the most effective way forward in healthcare. It’s an understanding that opens the door for AI healthcare developers to meet several unmet needs. Despite their concerns about artificial intelligence, patients are all too aware that AI and humans are at their most potent when they co-operate.

Pfizer Oncology Patient Experience & Service Director Geoff Rollason is keen to continue the journey: “With all of the challenges brought about by Covid, it is more important than ever to embrace new technology to improve the services available to cancer patients. We will be at the forefront of initiatives to make these positive developments available to as many UK patients as possible.”

*This research was independently run by Ipsos MORI in November 2020 among n=10 patients who “opted in” to the research via the Velindre Cancer Centre in Cardiff. The methodology was 60-minute in-depth interviews via video calls.