FTC Takes Action Against OkCupid and Match Group for Sharing User Photos with a Facial Recognition AI Company

On March 30, 2026, the Federal Trade Commission filed a complaint in the U.S. District Court for the Northern District of Texas against Humor Rainbow, Inc. d/b/a OkCupid and its affiliate Match Group Americas, LLC, alleging that OkCupid provided nearly three million users’ photos, demographic information, and location data to an unrelated facial recognition AI company, in direct contradiction of the privacy promises OkCupid made to its users. The parties simultaneously filed a Stipulated Final Order settling the matter.

The FTC’s announcement frames the case in simple terms: OkCupid said one thing in its privacy policy and did another. The settlement imposes no monetary fine, but it carries a permanent injunction, a decade of compliance reporting obligations, and a 20-year court order. The Commission voted 2-0 to authorize the filing.

What Happened: A 2014 Data Transfer That Took Over a Decade to Resolve

In September 2014, the CEO of Clarifai, Inc., an artificial intelligence company that builds facial recognition and computer vision technology, contacted one of OkCupid’s founders and requested access to large datasets of OkCupid user photos. Clarifai was at the time building image recognition systems that required large training datasets. OkCupid’s founders obliged: the company provided Clarifai with access to approximately three million OkCupid user photos, along with each user’s demographic and location information.

The reason for the transfer had nothing to do with OkCupid’s business operations. According to the FTC complaint, OkCupid’s founders were personally and financially invested in Clarifai. The data was provided as a favor to a company in which the founders held a financial stake, not pursuant to any commercial arrangement, service agreement, or legitimate business purpose. Clarifai paid nothing for the data and provided no services in return.

Compounding the problem, OkCupid never executed a formal agreement or placed any restrictions on how Clarifai could access, use, retain, or further share the data. There was no data processing agreement, no purpose limitation, no deletion requirement, and no security obligation. Millions of users’ intimate dating profile photos, shared on a platform marketed on privacy and trust, ended up in the hands of a facial recognition company without any guardrails whatsoever.

The Privacy Policy Problem

At the heart of the FTC’s case is a privacy policy problem. OkCupid’s 2014 privacy policy told users that OkCupid does not share “your personal information with others except as indicated in this Privacy Policy or when we inform you and give you an opportunity to opt out of having your personal information shared.” The policy specified the categories of recipients with whom OkCupid might share data: service providers, business partners, other businesses within its family of businesses (i.e., Match Group entities), and disclosures made in response to legal obligations.

Clarifai fit none of those categories. It was not a service provider, as it provided no services. It was not a business partner, as there was no business relationship. It was not a Match Group affiliate. And OkCupid neither informed users of the sharing nor gave them any opportunity to opt out, as its own policy required.

The FTC’s theory is straightforward: this is classic Section 5 deception under the FTC Act. OkCupid made material representations about its data sharing practices in its privacy policy, and those representations were false. As FTC Bureau of Consumer Protection Director Christopher Mufarrige stated: “The FTC enforces the privacy promises that companies make. We will investigate, and where appropriate, take action against companies that promise to safeguard your data but fail to follow through.”

This case reinforces a point we have written about extensively: privacy policies are not background compliance documents; they are legal commitments that will be enforced. What your policy says about data sharing must accurately reflect what your company actually does, including every data flow, every vendor relationship, and every third-party transfer.

The Concealment and the Investigation That Required a Court Order to Complete

The FTC’s complaint goes beyond the initial data transfer. It alleges that Match Group and OkCupid spent more than a decade actively concealing the Clarifai data sharing. When the New York Times began investigating the story in 2019, OkCupid issued a public statement that obscured its relationship with Clarifai and denied providing the company with user data. When OkCupid users directly inquired, Humor Rainbow told them that “any implication that OkCupid released users’ information to [Clarifai] is false.”

The concealment extended to the FTC’s own investigation. The Commission had to file a petition in federal court to enforce its Civil Investigative Demand, the agency’s investigative tool for obtaining documents, after OkCupid resisted compliance. The FTC prevailed in that enforcement proceeding, which ultimately unlocked the documents that supported the complaint. The press release notes this sequence explicitly, describing the OkCupid action as following the FTC’s “success in enforcing Civil Investigative Demand in federal court.”

This pattern of initial misconduct, years of denial, and obstruction of regulators is relevant not only to the legal theory of the case but also to the remedies imposed. While the settlement imposes no fine for the underlying 2014 conduct, the compliance infrastructure built into the order is designed to ensure that future violations will not escape scrutiny.

The Settlement: No Fine, But a Permanent Prohibition and 20-Year Order

The Stipulated Final Order imposes no monetary penalty, a fact that has drawn significant public attention and criticism given the scale of the underlying conduct. Neither defendant admitted wrongdoing. However, the injunctive terms are substantial and long-lasting.

What the order permanently prohibits:

The defendants, and all officers, agents, and employees acting in concert with them, are permanently enjoined from misrepresenting, expressly or by implication:

  • The extent to which they collect, maintain, use, disclose, delete, or protect any covered personal information (including photos, demographic data, and location data)
  • The purposes for which they collect, maintain, use, or disclose that information
  • The function of any privacy controls presented to users through their interfaces, any consumer choices available under state privacy laws, or any other mechanisms represented as allowing users to limit or manage how their data is processed

These prohibitions apply to OkCupid and any successor online dating service.

Compliance obligations:

  • Each defendant must submit a sworn compliance report within one year
  • For 10 years, defendants must notify the FTC within 14 days of any material change in business structure
  • Defendants must maintain records of consumer privacy complaints and all records necessary to demonstrate compliance for 10 years, with five-year retention per record
  • The order remains in effect for 20 years

The absence of a monetary penalty reflects what ArentFox Schiff characterized as the current FTC’s return to “back-to-basics” Section 5 deception enforcement, focused on truthful disclosures and injunctive relief, rather than the broader unfairness theories and aggressive monetary remedies associated with prior administrations. The order is injunctive and forward-looking. But it creates a framework where future violations of the privacy representations it prohibits would expose Match and OkCupid to contempt proceedings and significant financial liability.

The AI Dimension: Why This Case Matters for How Businesses Handle AI Vendor Data Sharing

The third-party recipient in this case was not a generic analytics vendor or ad-tech partner. It was Clarifai, an AI company building facial recognition technology using large image training datasets. That fact elevates the case’s significance for any business that currently shares, or is considering sharing, user data with AI platforms for training, fine-tuning, or model development purposes.

As ArentFox Schiff observed in their analysis, many businesses now interact with AI platforms for analytics, content moderation, image analysis, fraud detection, and other use cases. A disclosure that says “we share data with service providers” will frequently not cover an unrelated AI vendor that is not bound by a data processing agreement. Where a company’s privacy policy reserves sharing for specific categories, or commits to providing notice and an opt-out before broader transfers, any flow of personal data to an AI company must either fall within a disclosed category or be separately and accurately disclosed.

The OkCupid case makes clear that AI vendors are not exempt from the basic rules governing third-party data sharing. If anything, the sensitivity of the data involved (photos used to train facial recognition algorithms) makes accurate disclosure and proper contracting more important, not less. For a deeper look at the legal issues surrounding AI vendor contracts and data use, see our article on navigating AI vendor contracts and protecting your data from AI training.

Key Compliance Lessons

Your privacy policy must reflect your actual data practices, including every third-party relationship. The OkCupid case is a textbook illustration of what happens when a privacy policy says one thing and company conduct says another. Review your privacy policy against your actual data flows, including every vendor, partner, and AI platform that receives personal information. If a recipient is not covered by your current disclosures, you either need to stop the transfer or update your policy with adequate notice and consent before the transfer, not after.

Conflicts of interest in data sharing decisions are a red flag. The Clarifai transfer happened because OkCupid’s founders were personally invested in the recipient company. That conflict of interest drove a data-sharing decision that served the founders’ financial interests rather than OkCupid’s users. Any data-sharing decision motivated by relationships or interests outside the company’s business operations should receive heightened scrutiny.

Operating without a data processing agreement is not an option. The complaint specifically highlighted that OkCupid “never executed a formal agreement or set forth restrictions” on how Clarifai could use the data. Proper data processing agreements or related sharing agreements are not optional paperwork; they are the contractual foundation that allows a company to accurately represent that a third party is receiving data subject to purpose limitations, security requirements, and use restrictions. Without them, both the legal classification of the relationship and the accuracy of your privacy disclosures are at risk.

AI platforms require explicit coverage in your disclosures. “We share data with service providers” does not cover an AI company that is not providing services to you. If personal data flows to an AI vendor for training, analysis, or any other purpose, that relationship must be accurately described in your privacy policy and governed by appropriate written agreements. The FTC’s enforcement priorities are increasingly attentive to undisclosed AI data flows. See our AI governance page for guidance on how to structure these arrangements.

The FTC will enforce its Civil Investigative Demands. The Commission went to federal court to compel compliance with its document demand when OkCupid resisted. Obstructing a regulatory investigation does not make a problem go away; it makes it worse and more expensive. If your organization receives an FTC CID or other regulatory inquiry, engage FTC compliance counsel immediately.

Dating app users’ data deserves the same protections as any other sensitive data. Dating profiles contain photos, sexual orientation indicators, relationship preferences, geolocation, and other sensitive information. The FTC’s action confirms that this data class warrants the same rigorous handling as health data or financial records: accurate disclosure, proper agreements, and meaningful access controls.

Conclusion

The OkCupid/Match settlement is a decade-in-the-making reminder that privacy policy accuracy is not a technical compliance exercise; rather, it is a legal obligation that the FTC and other regulators will eventually enforce, regardless of how much time has passed or how thoroughly a company has tried to obscure the facts. The AI dimension of this case gives it particular resonance for 2026: as businesses increasingly send personal data to AI companies for training and analysis, the gap between what privacy policies say and what actually happens is widening for many organizations. The time to close that gap is before the investigation, not after.


UPDATE: April 20, 2026 – Following the FTC’s recent settlement with OkCupid and Match Group regarding the unauthorized sharing of user data, artificial intelligence firm Clarifai has confirmed it has deleted the roughly three million user photos at the center of the controversy. In a certification to the FTC and statements to U.S. lawmakers, Clarifai stated that it has also destroyed the facial-recognition models trained on that data.