In December 2016, The University of Chicago Medical Center signed a contract with Google on a research partnership that would allow the latter to use de-identified data from the Center's electronic health records to improve its predictive analytics. Last June, a class-action lawsuit was filed in the District Court for the Northern District of Illinois against the University, its medical center, and the tech giant, alleging that the information used in the collaboration is at risk of being re-identified.

The plaintiff, a former patient at the University's medical center, claimed that the shared data did not remove information such as doctors' notes and date stamps showing when patients checked in and out of the hospital. Coupled with the wealth of information Google has already obtained over the years, it is believed the tech giant could easily re-identify the subjects captured in the health records.

Risks of the HIPAA exception

As artificial intelligence (AI) shows increasing promise in transforming medicine and healthcare, collaboration between medical institutions and the private sector seems inevitable. It is believed algorithms could save $100 billion annually if applied to optimizing medical procedures, facilitating research and clinical trials, or inventing new tools. However, the case above also highlights the possibility that some of the savings from AI deployment may have to go toward legal settlements.

In the US, the Health Insurance Portability and Accountability Act (HIPAA) restricts the disclosure of individually identifiable health information. This means that if a person cannot be identified from given health information, those data are not subject to HIPAA's requirements (i.e., the HIPAA exception).

On the other hand, according to the Code of Federal Regulations, data can be established as de-identified in either of two ways: by the removal of certain identifiers, so that the "covered entity does not have actual knowledge that the information could be used alone or in combination with other information to identify an individual who is a subject of the information", or by a determination from a qualified expert "that the risk is very small that the information could be used alone or in combination with other information to identify an individual who is a subject of the information".
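The first of those two routes can be sketched in a few lines of code. The following is a minimal illustration, not a compliant implementation: the field names and the record are hypothetical, and only a small subset of the identifiers the regulation lists is handled.

```python
# Hypothetical illustration of identifier removal for de-identification.
# Field names ("name", "ssn", "zip", "admission_date", ...) are invented
# for this example; a real system must cover every identifier the rule lists.

DIRECT_IDENTIFIERS = {
    "name", "email", "phone", "ssn", "medical_record_number", "address",
}

def deidentify(record: dict) -> dict:
    """Return a copy of `record` with direct identifiers dropped,
    dates generalized to the year, and ZIP codes truncated."""
    clean = {}
    for key, value in record.items():
        if key in DIRECT_IDENTIFIERS:
            continue  # drop direct identifiers outright
        if key.endswith("_date"):
            clean[key] = value[:4]  # keep only the year, e.g. "2016"
        elif key == "zip":
            clean[key] = value[:3] + "00"  # keep only the first 3 digits
        else:
            clean[key] = value
    return clean

record = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "zip": "60637",
    "admission_date": "2016-12-05",
    "diagnosis": "type 2 diabetes",
}
print(deidentify(record))
```

Note that even this toy version shows why the lawsuit's complaint matters: a pipeline that skips the date-generalization step leaves exact check-in and check-out stamps in the output.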

The amount of healthcare and personal data shared between health institutions and private companies is often huge. Unless the data are heavily aggregated, it is difficult to guarantee that, once pieced together, they will not reveal at least a hint of who their owner is, regardless of how thoroughly they have been de-identified. As such, qualifying for the HIPAA exception may not be entirely possible.

It’s time for a change

Most importantly, HIPAA does not require that information be impossible to re-identify in order to be considered de-identified. Companies should not take it for granted that the HIPAA exception will spare them from potential litigation.

In Europe, under Recital 26 of the General Data Protection Regulation (GDPR), anonymous information is defined as "information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable". Once data are "truly non-identifiable", they fall outside the scope of the GDPR. Again, there is ambiguity as to what counts as "truly non-identifiable".

Therefore, the question often becomes whether private entities, in this case Google, tried to re-identify the data they obtained. What did the contract the company agreed upon with the medical institution state, and did it allow the company to re-identify any data? Either way, there is no clear evidence that the data involved are de-identified and thus fall under the HIPAA exception.

This grey area, as healthcare attorney Patricia S. Calhoun and data privacy and cybersecurity litigation attorney Patricia M. Carreiro suggest, indicates that HIPAA has perhaps not kept pace with technological advancement, at a time when privacy and the protection of sensitive data are changing rapidly.


Author Bio

Hazel Tang: A science writer with a data background and an interest in current affairs, culture, and the arts; a no-med from an (almost) all-med family. Follow on Twitter.