In one of the early panel discussions on delivering patient-driven innovation at the AIMed UK 2020 virtual summit, Dr. Natalie Banner, Lead for Understanding Patient Data (UPD), an initiative by the UK Wellcome Trust that supports conversations about how patient data can be better used for care and research, argued that people asking “what can we do to improve (patient/public) trust?” have been asking the wrong question, because trust is influenced by many different factors.
“What can we do to improve (patient/public) trust?” is the wrong question
For instance, how the government responds to the pandemic will shape how people think about public health policy and, ultimately, how their data is used in combating a global health crisis. It is therefore hard to truly separate public trust from aspects like governance, data use for specific purposes, and so on. “So, here is the question we need to ask instead: how do we ensure the systems for managing and using data and developing data-driven technology are all trustworthy? How can we demonstrate that they uphold basic principles like transparency, a clear structure of accountability, good oversight and, ideally, involvement of citizens in important decision-making processes?” Dr. Banner commented.
Pete Wheatstone from the Patient Advisory Group – useMYdata agreed. He said there has been confusion between the actions taken by the government and the role patient data or public health data play during the pandemic. Wheatstone also drew on his personal experience as a cancer patient, recalling his surprise when a few fellow patient representatives told him their lives had been “dented” by the NHS. Overall, he believes communication is the key. “How do you demonstrate trust? Distrust grows in an information vacuum. So, what can we do to help inform patients, especially future patients?” Wheatstone said.
Mark Briggs, Head of Cell and Gene Therapy at the Welsh Blood Service, added that engagement, openness, transparency, and involvement all play a role. The pandemic and the sudden “data awakening” have moved healthcare systems away from their usual “arrogance” and forced them to start talking to people, understanding their needs, explaining what they would like to achieve, and making sure goals are aligned. “I think it’s about partnership and activating our patients, the population, as well as the social support circle around them; facilitating individuals to take ownership of their healthcare, making them feel they are part of the decision-making process, and ensuring the tools that are eventually developed will help them.”
Who should be talking to patients about change, innovation and even AI?
Briggs believes patient participation can seem altruistic, because it is already very challenging to bring together all the other stakeholders across the public and private sectors, who come in at different phases of development, and to find a way forward that is comfortable for all. Dr. Banner thought the challenge is a systemic one: the practice has always been to involve patients on an ad-hoc basis, so we tend to miss the bigger picture of what it means to involve and engage people in the long run.
Besides, change, innovation and artificial intelligence (AI) remain, for now, a promise. The benefits these promises bring tend not to be evenly distributed; there are instances where algorithms simply do not work, or trigger unintended consequences for marginalized or under-represented communities. The challenge is to find a sweet spot: involving patients without over-promising unrealistic fantasies, while also figuring out when and where in the process patients should get involved.
“This needs to be multi-layered; we are not going to get patients on a one-off occasion,” Dr. Banner said. Wheatstone added that it is important to engage both patients and the healthy population, who are getting on with their lives and are not on the radar of the healthcare system. “Healthcare is for sick people, but if I am healthy, I am getting on with my life. You may end up talking only to a specific group of patient representatives like me: people who have been ill, middle class, and middle-aged.”
Patients are not part of the algorithm, and do not assume a generation gap
An audience member raised the concern that involving patients in innovation or AI development could mean making them part of an algorithm, which does not necessarily do them any favors, as in the recent UK A-level controversy or the often-criticized universal credit system. Briggs explained that this is where appropriate human interaction should come in. Most of the time, technology will only weigh risks and benefits; it takes direct support for patients, who are often going through a traumatic period of their lives, to demonstrate the tools to them and help them understand the benefits and limitations.
Wheatstone mentioned an article written by the session’s moderator, Geoff Rollason, Patient Experience and Service Director at Pfizer Oncology UK, on emotional intelligence meeting AI. “I think patients and people aren’t necessarily logical, and emotions are sometimes more powerful than rationality. So, these are things we need to take note of. Algorithms can be wrong; for example, I have taken a look at my own personal health record. Even though it has been in the cancer registry and curated for 18 months, it’s still full of errors. I am not talking about missing data but pure inaccuracy.”
Another concern raised by the audience was the possibility of generational differences in AI acceptability. Dr. Banner warned of the danger of making baseless assumptions and cited a past study of people’s attitudes towards commercial access to health data, in which people aged 16 to 24 generally understood data use more quickly and intuitively, and also had higher expectations that prior approval be sought.
“We should not over-generalize by age, because people regard AI rather differently depending on their background, their experiences, their experiences with healthcare and a whole range of other things… We assumed these young people in the study used social media so they may not care about privacy, but that’s not true. In fact, they have higher expectations of being asked for permission to use their data,” Dr. Banner explained. Nevertheless, the research was done some time ago, so she suspected attitudes might have shifted, and she looks forward to seeing more such research in the near future.