Class Action Alleges Meta, OnlyFans ‘Blacklisted’ Certain Adult Performers from Social Media Advertising
Dangaard et al. v. Instagram, LLC et al.
Filed: February 22, 2022 · Case No. 3:22-cv-01101
Defendants: Instagram, LLC; Meta Platforms, Inc.; Fenix Internet LLC; Fenix International Inc.; Facebook Operations, LLC
A proposed class action alleges Meta Platforms has colluded with the operator of OnlyFans to effectively “blacklist” from Facebook and Instagram advertising adult entertainment performers who work with the subscription site’s competitors.
The 39-page lawsuit says that Meta and OnlyFans’ blacklisting “scheme” has harmed thousands of small entrepreneurs who rely on social media to promote themselves and earn a living, while adult entertainment performers associated exclusively with OnlyFans have seen no such harm.
At the same time, Meta has helped OnlyFans solidify its status as the dominant online subscription platform in the adult entertainment industry by deleting the accounts and/or blocking the visibility of performers affiliated with competing adult entertainment platforms, the complaint alleges.
“The scheme was intended to destroy the [adult entertainment] Platforms’ businesses, and either destroy the [adult entertainment] Providers or force them to work exclusively through OnlyFans,” the case alleges.
According to the complaint, professional adult entertainment performers such as the plaintiffs rely on social media such as Instagram and Twitter to guide customers to their pages on adult entertainment platforms. Social media is so important to the subscription-based adult entertainment industry that, without it, the lawsuit says, “the business model for the industry is dead.”
The lawsuit alleges that the “collusion” between Meta, Facebook, Instagram and OnlyFans operator Fenix International Ltd. has “methodically damaged or destroyed the businesses” of competing adult entertainment platforms and at the same time unfairly harmed the livelihoods of performers who use those platforms.
The suit says that although the online adult entertainment industry was a “vibrant, competitive market” as recently as late 2018 or early 2019, it was around that time that professional adult entertainers experienced “a drop-off in traffic and user engagement on social media platforms.” As the case tells it, the “deletion and hiding of posts” and subsequent drop in social media traffic for certain performers occurred suddenly, and was so “substantial and dramatic” that “only automated processes could be responsible.”
As a result, OnlyFans “began to grow incrementally, and then exponentially,” while its competition “stagnated or saw dramatically reduced traffic and revenue,” the lawsuit claims.
The suit alleges that the “blacklisting process” — in which certain accounts are identified to social media platforms in a way that encourages their deletion or a reduction in their visibility — was accomplished first internally at Instagram and Facebook by way of automated classifiers or filters; the flagged material was then submitted to a shared industry database of “hashes,” or unique digital fingerprints. According to the complaint, this database was and is intended to flag and remove content made by terrorists and related “dangerous individuals and organizations” (DIO).
“However, the [adult entertainment] Performers blacklisted, and the [adult entertainment] performers injured by the blacklisting, were not terrorists and had nothing to do with terrorism of any kind,” the suit stresses.
The proposed class action looks to represent:
“All Adult Entertainment Providers, regardless of the label they use, such as performer, influencer or artist, who suffered economic injury because they either (i) used their Instagram or Facebook account to link to, promote, or demonstrate praise, substantive support, or representation of any competitor of OnlyFans at a time when those businesses were falsely designated as a Dangerous Individual or Organization (‘DIO’) under any past or present version of Meta’s DIO policy, or that of Facebook or Instagram or any of their predecessor or subsidiary entities or technologies, or (ii) were themselves falsely designated as a DIO; the class includes anyone who suffered damages from any shift in the scheme beyond the initial DIO tactic, such as suffering continuing effects through other computerized systems.”