Sensitive Data and Sexual Privacy: Spotlight on Flo Health
9-minute read
The FTC is continuing to crack down on companies that profit from people’s intimate data, this time going after the data broker Kochava, whose sale of geolocation data made it possible to track people’s visits to reproductive health clinics.
Before this, the FTC pursued the period-tracking app Flo Health for surreptitiously funneling intimate reproductive health data to Facebook and, subsequently, to other third parties.
Can women trust the explicit promises made by FemTech apps about their privacy?
As post-Roe restrictions on abortion access are introduced across states, concerns mount about how information shed by abortion-seekers, if not adequately safeguarded, could be accessed and used as evidence of wrongdoing. Recently, the Federal Trade Commission filed a lawsuit against data broker Kochava Inc. for selling geolocation data that could be used to pinpoint individuals who might have visited sensitive locations. A by-product of selling location data “to the highest bidder,” the FTC alleges, is that Kochava has made it possible to identify and trace people visiting reproductive health clinics. Privacy advocates highlight how this data could be used to identify and build a case against abortion-seekers, as well as against medical professionals performing or assisting with abortions. Though the FTC alleges Kochava’s practices were unfair, Kochava is far from the only data broker with access to sensitive reproductive health data. Over two dozen of the nation’s data brokers have been found to hold or sell information on millions of pregnant – or even potentially pregnant – women across the country. In the wake of the Supreme Court’s decision to overturn the constitutional right to abortion, the increasingly precarious status of reproductive data is stark. But this is not the first time the FTC has sounded the alarm on unfair or deceptive practices involving reproductive health data. The case of Flo Health last year was an ominous precursor to many of the issues regarding sexual privacy that must be addressed in a post-Roe world.
Flo Health Inc., a period and ovulation tracker that claimed 32 million active users in 2021, offers its users accurate cycle predictions, personalized health insights, and a better standard of gynecological health. The catch? Despite explicitly promising to keep users’ intimate and sensitive health data private, Flo disclosed this data to third parties that provided marketing and analytics services to the app, including Facebook’s analytics division. Charged by the FTC with “unfair and deceptive practices” in its handling of millions of women’s data, Flo serves as a powerful reminder of the privacy harms caused by data aggregation, lack of transparency, exposure, and intrusion. The case also reiterates the highly intimate status of reproductive data – as well as the conditions that lead women to funnel this data into apps like Flo to gain insights into their wellbeing rather than trusting a medical system that has historically under-researched women’s health.
Founded in 2015 by Belarusian entrepreneurs Dmitry and Yuri Gurski, Flo has a laudable mission statement: to “help girls and women to prioritize their health.” It uses artificial intelligence (AI) to provide personalized menstrual cycle predictions, virtual health support, and insights on peak fertility to assist with pregnancy planning. Flo’s users are asked to enter a swathe of personal and health information to gain access to what Flo claims are “the most precise AI-based period and ovulation predictions.” Users are asked to provide details on their experience of over 70 symptoms throughout the month, including cramps, discharge, and headaches. In addition to their menstrual and pregnancy-related symptoms, users must input their names, dates of birth, email addresses, places of residence, dates of menstrual cycles, dates of pregnancies, weight, and temperature. Once in the app, users are also encouraged to provide more qualitative intimate data by completing survey questions such as “How often do you have sex?” and “How often do you masturbate?” Active users are continuously urged to input intimate health data.
Recognizing that users would be concerned about the privacy and security of this type of data, Flo specified in its 2017 to 2019 privacy policies that it would not share “information regarding…marked cycles, pregnancy, symptoms, notes and other information…that…[users] do not elect to share.” Despite this explicit assurance, the Wall Street Journal reported in February 2019 that it had successfully intercepted health information that the Flo app transmitted to Facebook. This information was unencrypted, and it allowed the Journal to identify individual users, whether they intended to get pregnant, and even when they were menstruating.
How did this data sharing occur? Like most app developers, Flo incorporated “software development kits” (SDKs) provided by Facebook to build its app. Using Facebook’s SDK allowed Flo developers to incorporate an analytics tool called “App Events.” This tool enabled Flo developers to record their users’ activity, such as when they entered menstruation dates. Developers could then analyze this activity to improve the app’s functionality and gather insights on which app features were likely to interest new users. These SDK functionalities required, in turn, that Flo send every “App Event” to Facebook. As a result, information about users’ devices was made available to Facebook, along with any other data Flo defined as an “event,” whether the dates of users’ menstruation, acne flare-ups, or pregnancies. Crucially, Facebook could access data belonging to Flo users regardless of whether they had logged into Flo via Facebook or even had a Facebook account in the first place.
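To make the mechanics concrete, here is a minimal sketch of how a custom App Event might be logged from an Android app through the Facebook SDK. The function, event name, and parameters are hypothetical, chosen only to mirror the kind of data at issue; this is not Flo’s actual code:

```kotlin
import android.content.Context
import android.os.Bundle
import com.facebook.appevents.AppEventsLogger

// Hypothetical illustration of a custom App Event. The event name and
// parameters are invented for this sketch.
fun logCycleEntry(context: Context) {
    val logger = AppEventsLogger.newLogger(context)
    val params = Bundle().apply {
        // Anything placed here travels to Facebook's servers alongside
        // device identifiers, whether or not the user has a Facebook
        // account.
        putString("cycle_event", "period_started")
    }
    logger.logEvent("menstruation_date_entered", params)
}
```

The one-line convenience of `logEvent` is precisely what makes the pattern so leaky: each call a developer adds for analytics purposes is simultaneously a transmission to a third party.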
The result of this SDK-enabled transaction between Facebook and Flo was that Flo could access statistics about its users to iterate on and improve their experience in the app. In return, Facebook could access these same statistics to send more personalized ads and content to its own users. All the while, Flo users were kept in the dark about the swathes of personal and health information being captured and shared between Flo and Facebook.
The egregious privacy violations facilitated through these SDKs were heightened because Facebook was able to re-identify sensitive information about Flo users by matching the data it received from Flo to actual Facebook accounts. The promises of anonymity and privacy that Flo developers explicitly made were abandoned in a bid to keep women downloading and using the app. At its core, the privacy harm Flo perpetuated stemmed from the secondary use of users’ data without consent. While women who inputted sensitive details about their bodies agreed to make this data accessible to Flo, they did not agree to hand it over to Facebook. They certainly did not agree to make it available to any companies to which Facebook might forward this data thereafter. Flo’s privacy policies were false or misleading because they neglected to make this secondary use of data clear, thereby violating users’ trust.
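The underlying problem is simple record linkage. The sketch below, with invented data classes and a device advertising ID as the join key, shows why data that arrives without a name attached is not anonymous once both parties hold a shared identifier; Facebook’s actual matching pipeline is not public:

```kotlin
// Conceptual sketch of re-identification by record linkage. The data
// classes and the matching key (a stable advertising ID) are
// assumptions made for illustration.
data class AppEvent(val advertisingId: String, val eventName: String)
data class SocialProfile(val advertisingId: String, val realName: String)

fun reidentify(
    events: List<AppEvent>,
    profiles: List<SocialProfile>
): List<Pair<String, String>> {
    val profilesById = profiles.associateBy { it.advertisingId }
    // "Anonymous" events become identified the moment both datasets
    // share a stable key.
    return events.mapNotNull { event ->
        profilesById[event.advertisingId]?.let { it.realName to event.eventName }
    }
}
```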
In addition, Flo exploited the uniquely trusting community it had established on its app. Upon learning that information about their menstruation and pregnancies was accessible to third parties, hundreds of users stated that they felt “victimized” and “violated” by Flo’s actions. Users reasonably felt exposed by the intimacy of the information Flo shared about them. This exposure is especially acute because the user data included health information that would typically be protected by privacy laws if shared during a standard medical consultation. Given that Flo used its unethical data practices to encourage users to dedicate more data and time to its app, its treatment of its users is even more morally problematic. Scholars have recognized the intrusive dignitary harm that comes from involving individuals in the very process that makes an app more addictive to them.
Flo could have explored a variety of options to prevent the privacy harms its users experienced. It could have, but it failed to do so. This failure speaks to the pressure to protect profit margins over user safety and security. Flo’s developers could have chosen not to use SDKs at all, sacrificing the time and money those kits save in exchange for certainty that user data was protected. But SDKs remain a fairly standard piece of the software development puzzle. A perhaps more realistic stance would have been to enforce structural checks on the incorporation of SDKs into the app. For example, Flo’s management could have required impact assessment reports and other disclosure mechanisms to document the ambit of data sharing initiated by the SDKs. Ultimately, the app development process should have centered the perspectives of the women inputting reproductive health data in order to understand their privacy concerns. Most importantly, Flo should have clearly disclosed to users what its SDKs meant for their data, such as how that data would be harnessed to generate analytics and target advertising. An option for users to opt out of the tracking and sharing of their data with third parties should also have been offered to establish genuine consent; a sketch of what such a consent gate might look like follows.
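In this sketch, the `ConsentStore` class is a hypothetical wrapper around the app’s own preference storage; the two `FacebookSdk` toggles are real settings the SDK exposes for disabling automatic event logging and advertising-ID collection:

```kotlin
import android.content.Context
import com.facebook.FacebookSdk

// Sketch of a consent gate: no SDK-driven sharing until the user
// explicitly, revocably opts in.
class ConsentStore(context: Context) {
    private val prefs =
        context.getSharedPreferences("consent", Context.MODE_PRIVATE)

    var thirdPartySharingAllowed: Boolean
        // Default to opted out: no sharing until the user says yes.
        get() = prefs.getBoolean("third_party_sharing", false)
        set(value) {
            prefs.edit().putBoolean("third_party_sharing", value).apply()
            FacebookSdk.setAutoLogAppEventsEnabled(value)
            FacebookSdk.setAdvertiserIDCollectionEnabled(value)
        }
}
```

The design choice that matters here is the default: sharing is off until the user turns it on, rather than on until the user finds the setting to turn it off.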
The blame for this privacy violation does not lie with Flo alone. Facebook could have implemented up-front controls on the types of data sharing its SDKs would allow, preventing health data, financial information, and other sensitive characteristics from being shared with Facebook by its partners (a crude version of such a filter is sketched below). The problem remains, however, that Facebook is not incentivized to impose clear limits on the data shared with it through SDKs: this sharing is, in part, what allows Facebook to monetize targeted ads to its users. Some even argue that it is entirely unreasonable to expect Facebook to limit data sharing through SDKs, given that this transaction is such a standard industry practice. However, given the string of high-profile privacy scandals for which Facebook has been criticized since its involvement with Cambridge Analytica, it is reasonable to expect that it, among other platforms, might feel pressured to impose greater protections on sensitive data, even when using SDKs.
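A minimal version of that up-front control, with an invented pattern list and ingest function, might look like the following; a production classifier would need to be far more robust than keyword matching:

```kotlin
// Hypothetical sketch of a platform-side filter on incoming events:
// drop anything whose name suggests sensitive health or financial
// data before it reaches ad-targeting pipelines.
val sensitivePatterns = listOf("menstrua", "pregnan", "ovulat", "diagnos", "income")

fun isSensitive(eventName: String): Boolean =
    sensitivePatterns.any { eventName.lowercase().contains(it) }

fun ingest(eventName: String, process: (String) -> Unit) {
    if (isSensitive(eventName)) return // quarantine or drop, never store
    process(eventName)
}
```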
The case of Flo Health is not just distressing for privacy enthusiasts; it also evidences that the disconcerting effects of the gender healthcare gap continue to be felt. In the UK, less than 2.5% of publicly funded research specifically targets reproductive health, in spite of the reality that one in three women in the UK will suffer from reproductive health issues. This disparity exists in the US, too, where women were historically excluded from medical research trials and the National Institutes of Health (NIH) did not mandate their inclusion until 1993. Such exclusion created a vacuum of data about women’s bodies and contributed to the current and striking lack of research into conditions like endometriosis, infertility, and other gynecological and reproductive issues.
This context matters for the case of Flo Health, and it makes the privacy violations Flo enacted even more egregious. Given the disappointing level of medical attention paid to female reproductive health, it is arguable that many women feel obliged to turn to technological solutions, managing their menstruation and fertility with “FemTech” apps. With female sexuality still stigmatized in the 21st century and female reproductive health conditions woefully under-researched, misdiagnosed, and mistreated in practice, it is no wonder that women jump at the opportunity to share their data with digital services claiming to prioritize their health. When FemTech apps like Flo share the intimate data of their users with social media platforms in an unfair or deceptive manner, they not only jeopardize women’s autonomy but also violate their trust. As the FemTech industry continues to boom in the wake of the pandemic, policymakers and technologists must ensure that women can place trust in digital reproductive health solutions without compromising their privacy.