Meta Loses Post-Trial Bid in Landmark CIPA Case Involving Flo Health App
A landmark jury verdict against Meta Platforms, Inc. in a California privacy class action was recently upheld by the United States District Court for the Northern District of California. The case, Frasco v. Flo Health, Inc., involved allegations that Meta, through its integration of the Facebook SDK in the popular Flo Period and Ovulation Tracker app, unlawfully obtained and recorded highly sensitive reproductive health information that users entered into the app during onboarding and regular use, without proper consent, in violation of the California Invasion of Privacy Act (CIPA).
Key Findings from the Decision
The court explicitly rejected Meta's post-trial motions to overturn the unanimous jury verdict, which found Meta liable for using the Facebook SDK as an "electronic recording device" capturing confidential communications without user consent. The other defendants, including Flo, settled before trial.
The judge emphasized that Meta:
- “Used the phone as a recording device” through the Facebook SDK embedded in the Flo App, which intercepted user inputs—such as menstrual cycle data and pregnancy status—in real time.
- Had actual knowledge it was receiving personal health information transmitted by the SDK, yet “did not take steps to stop it” during the class period.
- Received highly personal health data converted and transmitted to Meta’s servers instantaneously as each user answered onboarding survey questions, including fertility goals, cycle details, pregnancy weeks, and age.
- Did not gain valid consent through its privacy policies because the disclosures were too ambiguous to serve as clear, explicit notice to users. The court reinforced the standard that the "relevant question" is whether "a reasonable user reading [the privacy disclosures] would think that he or she was consenting to the data collection" (see Calhoun v. Google, LLC, 113 F.4th 1141 (9th Cir. 2024)).
The judge noted that the SDK functioned as a recording device despite being software, agreeing that CIPA’s plain meaning extends beyond physical devices to electronic means of recording communications. The Flo App users’ phones, with the SDK integrated, operated as a recording device “meshed” with app code, capturing and transmitting the communications.
Trial Evidence and Technology Details
Expert testimony detailed the mechanics of the SDK and app interaction. Each time a user answered survey questions and tapped “Next,” the SDK’s functions triggered immediate logging and transmission of the user’s responses to Meta. Functions like LOGSINGLEEVENT and LOGEVENTFACEBOOK were examined in the source code, showing Meta’s code actively recording and sending each answer.
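The flow described at trial, in which each "Next" tap immediately triggers an SDK logging call that packages and transmits the answer, can be sketched as follows. This is a hypothetical illustration only: the class names, function names, and payload fields below are invented for exposition and are not Meta's or Flo's actual code.

```python
import json
import time

class AnalyticsSDK:
    """Hypothetical embedded analytics SDK (illustrative only, not Meta's code)."""

    def __init__(self, app_id: str):
        self.app_id = app_id
        # Stands in for network transmission to the vendor's servers.
        self.sent_events: list[str] = []

    def log_single_event(self, name: str, params: dict) -> None:
        # Each call packages the user's answer with app identifiers and a
        # timestamp, then "transmits" it immediately.
        event = {
            "app_id": self.app_id,
            "event": name,
            "params": params,
            "ts": time.time(),
        }
        self.sent_events.append(json.dumps(event))

class OnboardingSurvey:
    """App screen: each 'Next' tap forwards the answer to the embedded SDK."""

    def __init__(self, sdk: AnalyticsSDK):
        self.sdk = sdk

    def on_next_tapped(self, question: str, answer: str) -> None:
        # The SDK hook fires at tap time, before the next screen is shown.
        self.sdk.log_single_event("survey_answer", {"q": question, "a": answer})

sdk = AnalyticsSDK(app_id="demo-app")
survey = OnboardingSurvey(sdk)
survey.on_next_tapped("Are you trying to conceive?", "yes")
print(len(sdk.sent_events))  # 1 event captured and queued for transmission
```

The point of the sketch is structural: because the logging hook is wired into the tap handler itself, the answer leaves the app the moment it is entered, with no separate user-facing disclosure or consent step in between.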
The SDK also collected persistent device and advertising identifiers to link this health data with individual Facebook profiles, enhancing Meta’s advertising and machine learning capabilities. The judge stated that Meta considered this data extremely valuable and actively encouraged developers to incorporate its SDK to bolster its ad business.
Rejection of Meta's and Flo's Disclosures
Meta claimed that plaintiffs, as Facebook users, consented or were otherwise made aware of Meta’s privacy policy, which broadly stated that Meta could receive data from third-party apps. The court rejected this, focusing on whether a reasonable user would believe they had consented to the specific type of data collection in question. The jury found Meta’s privacy policies too vague to clearly inform users that their intimate health data was collected via the SDK.
Meta cited language about advertisers and app developers sending information through Facebook tools, but the court found this did not explicitly cover the precise conduct of collecting sensitive menstrual and pregnancy data. The disclosures did not meet the legal requirement for clear, explicit consent.
Flo’s privacy policies further supported the lack of consent by assuring users that personal health information, such as cycle details and pregnancy symptoms, would not be shared with third parties. This reinforced users’ expectations that such data would remain private.
The court thus upheld the jury's finding that neither Meta's general privacy disclosures nor Flo's privacy assurances established plaintiffs' consent to Meta's conduct.
Class Certification and Privacy Expectations
The court upheld the class certification, rejecting Meta’s argument that individual factual differences would overwhelm common evidence. The three named plaintiffs and millions of class members experienced uniform app onboarding, privacy disclosures, and SDK data collection practices. Plaintiffs considered the information they shared to be highly private and did not consent to Meta’s recording.
Statutory Damages and Injury
The court reaffirmed that under CIPA, statutory damages accrue at the moment an unauthorized recording occurs, and plaintiffs need not prove actual damages beyond the violation itself. The jury credited plaintiffs' testimony about embarrassment and invasion of privacy.
Impact and Takeaways
- First-ever jury verdict to find an SDK violated CIPA by functioning as an electronic recording device.
- Companies integrating third-party SDKs, especially in health or sensitive data contexts, face heightened legal risk for real-time interception without explicit consent.
- Boilerplate privacy policies or generic notices are unlikely to satisfy consent requirements under CIPA for sensitive communications.
- Device-based data collection, even by software components, can constitute “electronic recording devices” under California law.
Conclusion
The Meta-Flo Health CIPA verdict is a reminder that the risk from pixel- and wiretapping-based claims, and other vectors for opportunistic litigation, cannot be overstated. It underscores the necessity of obtaining clear, informed, affirmative user consent through cookie banners, privacy policies, and other disclosures, and of rigorous oversight of third-party code embedded in apps, supported by practices such as data mapping, especially in the context of targeted advertising. For businesses processing sensitive data, including health or genetic data and other categories with heightened sensitivity, implementing both technological and legal controls is essential to stay ahead of growing scrutiny and risk.