From period and sleep trackers to mental health chatbots and digital weight loss diaries, there’s no shortage of wellness apps out there that aim to help us better understand our health. But there have also been concerns about the misuse of data collected by some apps. Indeed, a recent report from University College London and King’s College London warned that the users of dozens of female health monitoring apps are being subjected to “unnecessary privacy and safety risks through poor data handling practices”.
Against this backdrop, the developers of a new generation of health technology called digital therapeutics have had to be especially mindful of privacy concerns. Digital therapeutics use software to deliver clinically approved treatments directly to a patient’s mobile device or laptop. Unlike off-the-shelf wellbeing apps and fitness trackers, they undergo rigorous testing and approval processes.
When used properly, the data deployed by digital therapeutics gives healthcare practitioners the opportunity to administer more personalised care. Guy Checketts, head of evaluation and transformation at Health Innovation Oxford & Thames Valley, part of an NHS network that encourages the spread of innovation, says that while digital therapeutics solutions also collect and use personal data, they are assessed and approved by industry regulators – which means they generally have stricter data privacy and usage controls than non-regulated wellness apps.
As part of their regulatory obligations, digital therapeutics solutions should only use the personal data required to achieve positive patient outcomes and improve medical interventions accordingly.
Matt Williams, senior programme manager for mental health at Health Innovation Oxford & Thames Valley, explains how this might work in practice by using the example of a digital therapeutics service designed to tackle sleep disorders by tracking how long a patient sleeps or how many times they wake up at night.
He says this data should not be used to treat any other conditions because this could breach strict agreements struck with regulators. “The data collected by the digital therapeutic would be put to good use as part of the treatment, or used to enable communication with a patient,” says Williams. “It wouldn’t make sense to collect additional data for the sake of it as part of a digital therapeutic, unless this was part of a study.”
Echoing these points, Checketts says the companies developing digital therapeutics solutions and the health professionals prescribing them can only utilise these technologies and their underlying datasets for purposes and outcomes approved by regulators. These must be clearly outlined during the regulatory process, too.
Checketts adds that developers should use population-level (rather than individual) data to ensure people are using their digital therapeutics solutions correctly and to fix any technical issues. He adds that healthcare professionals would typically monitor individual patients to ensure health outcomes are being met within “expected limits”.
While digital therapeutics solutions follow strict industry regulations to prevent data misuse, Williams points out that they are often used with other digital health systems. He warns this generates large swathes of patient data that have the potential to be breached or misused: “This is why data security and the Data Protection Act are so vital in this space and are applied – and enforced – so rigorously.”
When it comes to complying with laws such as the Data Protection Act and the UK General Data Protection Regulation (GDPR), Checketts says organisations developing and using digital therapeutics solutions must create and implement comprehensive data-sharing agreements.
With all parties sharing digital therapeutics data expected to follow these arrangements, Checketts says they must also create a data-sharing protocol and a data protection impact assessment specific to the shared and processed data.
According to Williams, data-sharing agreements are vital because they clarify how health-related data can be shared or used, and by whom. They also state the type of data being gathered and the purpose for which it is collected.
“These protect our data and ensure ethical as well as legal issues are taken into consideration, to avoid the misuse or abuse of data,” says Williams. “The penalties for data breaches are significant and it is recommended that such agreements are reviewed and revised over time.”
Effective data flow management is crucial to the running of digital therapeutics solutions and meeting their data protection obligations. Williams notes that this can be challenging for a number of reasons. “The flow of data can also be disrupted by issues such as the lack of interoperability within any health system,” he says.
Unique challenges such as these make it all the more essential that digital therapeutics developers and providers adopt best practice and meet their regulatory obligations when it comes to data and privacy. “My feeling is that there is a broad and growing recognition of the value of health-related data and a growing interest in the use of digital therapeutics, particularly as an alternative to pharmaceutical interventions for mental health conditions,” says Checketts.
Otsuka is not responsible for the content of the external links in this article.