Menstrual tracking apps are some of the most popular on the digital health market. By collecting intimate data about users – when their period starts, when they last had sex, their mood and general health, even which sanitary products they use – these apps offer an estimate of when users are most fertile and when to expect their next period.
According to a study recently carried out by Privacy International (PI), several menstrual tracking apps have been sharing user data with Facebook via the social network’s software development kit (SDK).
The SDK allows software developers to receive analytics on which aspects of their app are most popular with users. Developers using the SDK can also monetise their apps through the Facebook Audience Network. Facebook may also use SDK data to serve individual users more personalised adverts when they use the platform through the ‘Login with Facebook’ function available on many applications, with the Facebook user’s prior consent.
Whether this consent is given knowingly is questionable, with the clause often buried in lengthy terms and conditions pages consumers are unlikely to read.
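To make the mechanism concrete, here is a minimal sketch in Kotlin of how an Android app using the Facebook SDK’s App Events feature might log user activity. The event and parameter names are hypothetical, invented for illustration rather than taken from any app in PI’s study; the point is simply that whatever a developer chooses to log travels to Facebook’s servers along with device identifiers.

```kotlin
import android.content.Context
import android.os.Bundle
import com.facebook.appevents.AppEventsLogger

// Hypothetical sketch of an analytics call in a menstrual tracking app.
// The event and parameter names below are invented for illustration and
// are not taken from any app in PI's study.
fun reportSymptomLogged(context: Context) {
    val logger = AppEventsLogger.newLogger(context)
    val params = Bundle().apply {
        putString("cycle_phase", "menstruation") // user-entered health detail
        putString("mood", "anxious")             // user-entered health detail
    }
    // Transmits the event, alongside device identifiers, to Facebook's servers
    logger.logEvent("symptom_logged", params)
}
```

It is this kind of routine analytics call, applied to intimate inputs, that turns ordinary app telemetry into sensitive health data.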
Breaching the privacy of millions
PI found that the most popular menstrual apps on the market – two separate apps both named Period Tracker, from developers Leap Fitness Group and Simple Design, as well as Flo and Clue – did not share data with Facebook. However, smaller apps that still boast millions of users, such as Maya by Plackal Tech, MIA by Mobapp and My Period Tracker by Linchpin Health, all did.
Maya informed PI that it had removed both the Facebook core SDK and the Analytics SDK from the app upon receipt of the study. The company said it would continue to use the SDK’s advertising services for users who had historically agreed to this in the app’s terms and conditions and privacy policy, claiming that no personally identifiable or medical data is shared.
Facebook later told the BBC that its terms of service prohibit app developers who use the SDK from sharing sensitive health information, and that it actively seeks to remove this data from the platform.
However, it’s hard to ignore that Maya, MIA and My Period Tracker alone have over seven million downloads between them, with all of the resulting information made available to Facebook via the SDK. This isn’t a case of one small app with a few hundred users flying under the radar, but of the sensitive data of millions of people being shared with the targeted advertising wing of one of the world’s biggest social media companies.
In its report, PI stated: “The responsibility should not be on users to worry about what they are sharing with the apps they have chosen. The responsibility should be on the companies to comply with their legal obligations and live up to the trust that users will have placed in them when deciding to use their service.”
Artificial intelligence and patient data
It’s understandable that tech users don’t want sensitive information about their health in the hands of big conglomerates, and it’s not just Facebook’s activities that have raised eyebrows. In the UK, news that the Amazon Alexa home assistant was going to start responding with NHS-ratified information when asked healthcare questions by users sparked concerns of a data protection disaster waiting to happen.
But some large corporations do appear to be taking steps to protect patient data. Google and the Mayo Clinic recently signed a ten-year partnership in which patient data will be stored in the cloud, to support the building of high-tech patient care products.
Despite Mayo’s data being stored in Google’s cloud, only the clinic itself will have access to patient information. Mayo may decide to share de-identified patient data with Google and other parties for specific research projects, but this data won’t be used to target advertising – unlike the data shared by the collective of menstrual tracker apps.
The partnership will give the Mayo Clinic greater access to state-of-the-art engineering and computing resources to develop artificial intelligence (AI) and other technologies.
AI in medicine itself raises several significant legal and ethical concerns surrounding data and privacy. For instance, data broking giants in the US, like LexisNexis and Acxiom, are known to mine personal data and engage in AI development.
These companies could theoretically sell their AI-derived healthcare data to third parties like marketers, employers and insurers. While the US Health Insurance Portability and Accountability Act (HIPAA) forbids healthcare providers and insurers from using or disclosing patient information without consent, it doesn’t apply to other types of businesses.
Nothing to fear?
Despite these facts, some commentators see the entire conversation around AI and patient data protection as being blown out of proportion.
“The assumption that patient data is put at risk through AI is not necessarily true,” says Professor Paul Leeson, co-founder of AI echocardiography analysis firm Ultromics. “AI medical applications are being developed appropriately through regulatory systems to pretty much mirror what is required for any medical practitioner. Any AI applications that fail to start from that principle will fail to get adopted or get killed by authorities.”
Leeson isn’t wrong about regulators stepping in. The UK Information Commissioner’s Office (ICO) recently found that an agreement between Google’s DeepMind AI subsidiary and the Royal Free London NHS Foundation Trust breached data protection law.
Royal Free London was providing vast quantities of patient data to DeepMind for the development of its Streams platform, without adequately informing patients that their data was being used in this way.
The Trust was required to establish a proper legal basis for future data processing, complete a privacy impact assessment, and commission an independent audit of how patient data was handled during the implementation of Streams. The partnership has continued, with no objection from the ICO to the new arrangement.
As 2019 draws to a close, patient data is in undeniably strange waters. Many people don’t quite understand how their data is used, or why, and a lot of companies aren’t forthcoming with this information either. Facebook, Amazon and Google all claim the vast amounts of healthcare data they’re acquiring are being used to improve services or for other benign reasons, but that can be hard to believe when you’re constantly bombarded with targeted adverts. And if something as innocuous as a period tracking app can be caught up in a data protection scandal, it might be time to start reading the terms and conditions a little more closely.