At the upcoming AIMed UK virtual summit, Gerry Reilly, Chief Technology Officer at Health Data Research UK (HDR UK), will be part of a panel discussing information governance and data access as barriers to the implementation of artificial intelligence (AI) in healthcare. After spending most of his career in technical and executive roles in the IT industry, Gerry joined HDR UK in 2018 to provide the leadership needed to build the infrastructure required to support data research at scale across the four nations of the UK.

Gerry gave an exclusive interview to AIMed, sharing his thoughts and strategies on protecting sensitive healthcare information, including the risk of re-identification of anonymized patient data; earning patients’ trust in data sharing; how regulators should prepare the UK as it exits the European Union; and the influence of the General Data Protection Regulation (GDPR).

HDR UK’s efforts to enhance public trust in data sharing

“I think, primarily, public trust is essential for ensuring we can conduct world class health data science in the UK, and it sits at the heart of HDR UK’s values. An open and honest approach to the de-identification of patient-level data is key, but no approach to de-identification that retains the utility of the data is 100% reliable. So we should be open about the risks of re-identification and how they can be mitigated”.

Gerry explained that the approach HDR UK has been promoting, in collaboration with the UK Health Data Research Alliance (HDRA), is that research should be conducted within secure “Trusted Research Environments” (TREs), where de-identification is only one of the available controls. TREs are secure spaces where researchers can access sensitive data under robust controls that ensure only summary statistical results, and not patient-level data, can be exported from the TRE.

Gerry Reilly, Chief Technology Officer, Health Data Research UK

TREs track all data usage and provide technical safeguards to prevent data from leaving these trusted spaces. They are augmented by the “Five Safes” model, which covers Safe Projects, Safe People, Safe Settings, Safe Data and Safe Outputs. Gerry said that combining all of these provides a rigorous approach to protecting patient privacy. Even so, he believes the framework forms only part of the picture.
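To make the idea of “Safe Outputs” more concrete, the sketch below shows, in Python, one kind of statistical disclosure check a TRE might apply before summary results are released: suppressing any aggregate figure built from fewer than a minimum number of patients. This is an illustrative assumption about how such a safeguard could work; the threshold and function names are hypothetical and do not describe HDR UK’s actual tooling.

```python
# Illustrative sketch only: a minimal output check of the kind a Trusted
# Research Environment might run before releasing summary statistics.
# The MIN_CELL_COUNT threshold and all names here are hypothetical and
# are not HDR UK's actual implementation.

MIN_CELL_COUNT = 5  # assumed small-cell suppression threshold


def check_summary_table(rows):
    """Split aggregate rows into releasable and suppressed sets.

    `rows` is a list of dicts such as
    {"group": "age_70_79", "count": 3, "mean_stay_days": 11.2}.
    Rows derived from fewer patients than the threshold are withheld so
    that individuals cannot be singled out from the exported summary.
    """
    released, suppressed = [], []
    for row in rows:
        if row.get("count", 0) < MIN_CELL_COUNT:
            suppressed.append(row)   # withheld from export
        else:
            released.append(row)     # safe to export as a summary result
    return released, suppressed


if __name__ == "__main__":
    summary = [
        {"group": "age_50_59", "count": 142, "mean_stay_days": 4.1},
        {"group": "age_90_plus", "count": 3, "mean_stay_days": 12.7},
    ]
    ok, withheld = check_summary_table(summary)
    print(f"released {len(ok)} row(s), suppressed {len(withheld)} row(s)")
```

In practice such checks sit alongside the other controls the article describes, such as audited access and human review of anything leaving the environment; the code is only meant to illustrate why “Safe Outputs” is a distinct safeguard from de-identification.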

It all boils down to transparency

“I think, at the same time, we have to be open about how data is being used and for what purposes,” Gerry added. He believes transparency is a key element here, and that it matters regardless of whether patient data is used in research conducted by academia, the UK National Health Service (NHS) or industry, even though public concern is perhaps more strongly focused on industry.

Gerry also feels that those handling the data should be open about who is involved in the research and how those partnerships are built. “For example, the partnerships around our Health Data Research Hubs are all open to see. I believe we do have robust processes in the UK around the approval of access to data, but I suspect we are sometimes poor at communicating this,” Gerry said.

He suggested involving the public and patients in all aspects of the data access approvals process; they should also sit on access committees, tendering processes for partnerships and grant-awarding committees, and not just as a token presence. In terms of regulation, Gerry admitted it is not his area of expertise. Nonetheless, on a personal level, he believes GDPR has been critical, not just because it introduced a regulatory framework but because it brought the whole topic of regulation and privacy to the front of the minds of both professionals and the public.

Healthcare data is not a commercial asset

As the UK leaves the European Union, Gerry warned of the danger of pressure to treat data simply as a valuable commercial asset and to trade this off against softer regulation. “I think this would definitely be a mistake and would result in a public backlash, potentially damaging public trust in the excellent health data science field in the country. We must all be ready to argue that the best approach to protecting health data science is to have a robust framework based on privacy, ethics and the fundamental principle that the science should benefit patients and, ultimately, the general public,” he said.

Overall, Gerry also believes there is a continuing need to focus on improving the quality of data and on making it more democratic and inclusive.

“We should ask ourselves as we look at the data: do they truly represent all our communities, and is the representation of disadvantaged groups correctly balanced? Indeed, some of these communities are less well engaged, but that is not an excuse for our data science not to support their needs equally and bring benefits to all communities”.

AIMed UK will take place this Friday (30 October). You may register your interest or obtain a copy of the agenda here.

*

Author Bio

Hazel Tang A science writer with a data background and an interest in current affairs, culture and the arts; a no-med from an (almost) all-med family. Follow on Twitter.