Health care apps have a privacy problem

What’s happening? California-based health technology start-up myNurse has stopped operating following a major data breach in March that exposed the personal health information of its users. The company, which offered remote patient monitoring and chronic care management services, said the shutdown was “unrelated to the data security incident” but gave no other reason for its decision. The breach, which took place on 7 March, saw an unauthorised individual access users’ personal data, including names, dates of birth, medical histories, diagnoses and insurance information. (TechCrunch)

Why does this matter? Health care apps often require sensitive personal information from users in order to operate effectively. It would be reasonable to expect data of this nature to be protected and securely encrypted to safeguard users, but this is often not the case.
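For readers curious what “securely encrypted” means in practice, here is a minimal sketch of field-level encryption at rest, written in Python with the widely used cryptography package’s Fernet recipe. The record layout, field names and key handling below are illustrative assumptions, not details from myNurse or any other app mentioned here.

```python
# Minimal sketch: encrypting sensitive health data fields at rest with
# the symmetric Fernet recipe from the `cryptography` package.
# The record and field names below are hypothetical examples.
from cryptography.fernet import Fernet

SENSITIVE_FIELDS = {"name", "date_of_birth", "diagnosis", "insurance_id"}

def encrypt_record(record: dict, key: bytes) -> dict:
    """Return a copy of the record with sensitive fields encrypted."""
    f = Fernet(key)
    return {
        field: f.encrypt(value.encode()).decode()
        if field in SENSITIVE_FIELDS else value
        for field, value in record.items()
    }

def decrypt_field(token: str, key: bytes) -> str:
    """Recover a single encrypted field, e.g. for an authorised clinician."""
    return Fernet(key).decrypt(token.encode()).decode()

if __name__ == "__main__":
    # In production the key would live in a key-management service,
    # never alongside the data it protects.
    key = Fernet.generate_key()
    patient = {
        "user_id": "u-1042",  # non-sensitive identifier, stored in the clear
        "name": "Jane Example",
        "date_of_birth": "1990-01-01",
        "diagnosis": "hypertension",
        "insurance_id": "INS-998877",
    }
    stored = encrypt_record(patient, key)
    print(stored["diagnosis"])                      # opaque ciphertext
    print(decrypt_field(stored["diagnosis"], key))  # "hypertension"
```

The point of the sketch is that a database leak of `stored` alone would expose only ciphertext; an attacker would also need the key, which is why breaches like the one above suggest such safeguards were weak or absent.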

This issue has been thrust into the spotlight once again following the leak of a US Supreme Court draft opinion that would overturn Roe v. Wade. There are concerns that data from period-tracking apps could be used by law enforcement to target people suspected of having had abortions because, unlike medical records, information gathered by apps is not protected by the Health Insurance Portability and Accountability Act (HIPAA) in the US.

Digital help — Health care app downloads have risen since the onset of the Covid-19 pandemic, with downloads of mental health apps in particular up by 200%.

Growing demand has created a plethora of mental health apps, all hoping to capitalise on consumer need. Pressure to release products quickly, however, may have seen privacy safeguards sidelined in favour of first-mover advantage.

Not only do certain apps have poor privacy practices, but several, including BetterHelp and Cerebral, reserve the right to change their policies at any time. Data can also be passed on to the purchasing company in the event of an acquisition. These practices aren’t limited to emerging apps or websites – Crisis Text Line recently stopped sharing conversation data with customer service firm Loris.ai after concerns were raised by data privacy experts.

Negative data — Mental health websites and apps don’t have to sell data to third parties, but many still do. The personal nature of the data is what makes it so valuable to advertisers, who can use it to target users more precisely. This approach can backfire, however, because of the sensitivity of the information: a bereaved mother, for example, called out Facebook, Instagram, Twitter and Experian after being overwhelmed by baby-related promotions following the death of her child.

Positive data — A lack of transparency and a failure to encrypt data create distrust among users, making them less willing to share health care data even when doing so could benefit treatment or research. This suggests there are more valuable ways to use such data than simply selling it. The prospect of advertising money is enticing, but companies shouldn’t sell sensitive data just because they can.
