According to the latest research released by ORCHA, a UK-based health application evaluation and advisory organization, only 15% of healthcare applications available to the public for download meet the minimum safety standards set by its review process, indicating the need for stricter, formal regulation.

The review process 

ORCHA assessed over 5,000 healthcare applications across seven stages, against 260 performance and compliance factors including clinical effectiveness, user experience, and data security.

Each health application is then given a score, which serves as a reference for governments, public health services, and social care organizations to choose and deliver the applications they believe will have the biggest impact on improving clinical outcomes. ORCHA also liaises with developers to resolve any discrepancies found during the evaluation.

In the process, ORCHA found that 75% of health applications targeted at individuals with blood pressure concerns, and 85% of femtech and pregnancy applications, did not meet its quality threshold.

The need for regular and appropriate assessments 

Liz Ashall-Payne, ORCHA’s Chief Executive Officer, believes digital health applications are among the most crucial tools available to tackle health issues in an increasingly aging population that will face more complex, long-term conditions.

In view of the prevalence of unsafe health applications, she added: “The fact that only 15% of apps that we review meet the minimum standards shows there is a desperate need to regularly and properly assess the apps available to ensure that people are protected against the serious risks associated with downloading ineffective or even harmful apps.”

Helen Hughes, Chief Executive Officer of Patient Safety Learning, a London-based independent non-profit organization dedicated to improving patient safety, said the research results reaffirmed the need for consistent regulatory standards and an accreditation framework.

“Essentially, what we want is patient safety embedded in all of the review processes so that we can inform and guide clinicians and inform and guide patients, and that there is appropriate research on their use and their impact so that information can feed the improvement of standards,” Hughes said. She also believes in providing better training for developers to support them in creating clinically safe applications.

Author Bio

Hazel Tang

A science writer with a data background and an interest in current affairs, culture, and the arts; a non-med from an (almost) all-med family. Follow on Twitter.