A proposed class action alleges Roblox systematically takes advantage of users—the vast majority of whom are under 18—by engaging in a profit-driven scheme whereby the immersive gaming platform deletes in-game content that’s already been paid for under the guise of what it calls “content moderation.”
“Sell first, moderate later”
According to the 25-page lawsuit, defendant Roblox Corporation must appear, at least on the surface, as a company that maintains community standards and respects intellectual property while dealing with the fact that its “trademarked, vulgar, or otherwise objectionable content” is in high demand and “genuinely valuable” to the company’s bottom line. To maintain this balance, Roblox, according to the suit, has implemented a “clever content deleting scheme” that makes it look like the company is meaningfully moderating its platform while at the same time ensuring certain types of less-than-wholesome yet revenue-generating content is there for players, at least initially, so as to continue pulling in money.
It works like this. Roblox encourages users—at least 70 percent of whom are under 18 years old—to buy in-game content but fails to perform any meaningful oversight to ensure that what’s brought into the online marketplace complies with its policies, the suit says.
What Roblox does instead, according to the lawsuit, is allow users to buy in-game content before later deleting it for alleged policy violations—but only after users have already paid.
According to the lawsuit, Roblox offers neither refunds nor account credits when users’ already-purchased in-game content is deleted, and, in many instances, the content deleted from the platform is not offensive, inappropriate or infringing upon any trademarks—and may even return to the online store after it has been removed.
“Unsurprisingly, the majority of Defendant’s item removals are capricious and are untethered from any platform policy violations,” the suit, filed by a minor represented by her father, claims.
As the lawsuit tells it, although Roblox has drawn criticism for existing as a “dangerous space” for its predominantly underage userbase, the company has engaged in “predatory conduct” of its own by deleting content users have already paid for—and calling it “content moderation.” When Roblox users report that their paid-for content has disappeared and demand refunds, the defendant, the complaint contends, cleverly deflects responsibility for its “irresponsible profit-seeking behavior” by alleging the content ran afoul of the platform’s policies while providing no other details. The victor in this scenario, the lawsuit charges, is always Roblox:
“The result is a win-win for Roblox. Removing content that may on its face violate the platform’s policies earns Roblox the appearance of content moderation, while dovetailing with Roblox’s financial interests. The scheme allows Roblox to deflect blame for deleting users’ content without issuing refunds, forcing users to make new purchases to replace their in-game experience.”
A (profit-driven) world of its own?
Roblox is an interactive “metaverse” that boasts an average of 36.2 million users worldwide, the lawsuit relays. These users inhabit and interact with a 3D digital world generated entirely by users and built by a community of nearly seven million developers, the case states. According to the lawsuit, more than half of Roblox’s users are under the age of 13, and players spend an average of 2.6 hours per day on the platform, rivaling the amount of time spent on TikTok.
With Roblox Client, users can explore 3D worlds through the eyes of an avatar customized with clothing, gear, animations, simulated gestures, objects and emotions. On the developer side, there’s Roblox Studio, the toolkit used to build, publish and operate the 3D worlds enjoyed via the Roblox Client and to enable the user-to-user sale of in-world items via the Avatar Shop, the case says. According to the complaint, Roblox takes a commission from user-to-user transactions in the Avatar Shop in addition to offering for sale its own proprietary content.
Underscoring everything on the platform, the lawsuit says, is Robux, the in-game currency users purchase with real-world money. Every transaction within the Roblox universe is conducted through Robux, and the defendant benefits financially by both selling the in-game currency to users and charging a 30 percent commission on every user-to-user transaction within its marketplace, the suit relays.
Class action says “hidden dangers” lurk in Roblox
With most schools nationwide operating remotely over the last year due to the COVID-19 pandemic, kids have increasingly turned to the Roblox platform for community, the lawsuit says. According to the complaint, however, Roblox has seen throughout its short history incidents of lewd behavior and the failure of systems supposedly designed to monitor such conduct. Highlighted in the suit are “condo games,” in which nude avatars are shown engaging in sexual acts and using profane language.
The suit alleges the Roblox platform, and in particular “condo games,” serve as “a breeding ground for online predators looking to groom children for sexual abuse.”
Although the defendant has created tools with which to monitor for and remove “condo games,” various techniques exist to get around Roblox’s filters, the case states. The suit charges that in addition to the foregoing predatory conduct, Roblox itself “also preys on its vulnerable customers,” in particular by deleting paid-for content without offering a refund or credit and claiming it’s all to protect underage users and trademarks.
According to the complaint, Roblox has employed “extremely lax content moderation (if any at all)” when developer-generated content is initially uploaded to the Avatar Shop. Instead, Roblox “elicits the appearance” of content moderation by taking allegedly violative content down, but only after it’s been paid for, the case asserts.
“As a result users sometimes have only a fleeting moment to enjoy their newly-purchased items in the Avatar Shop,” the suit reads. “Roblox has deleted items days, months, and even years after the user has made the purchase. Worst of all, Roblox refuses to refund or credit users’ accounts after it has deleted these items.”
Targeted by Roblox for deletion are Avatar Shop items deemed inappropriate or in violation of intellectual property rights, such as clothing that, say, features the Nike logo. The suit stresses, however, that Roblox “does nothing” to prevent such items from being sold in the Avatar Shop in the first place.
Oftentimes, though, deleted items are “not offensive, inappropriate” and do not “infringe upon any trademark or intellectual property,” according to the suit. Ultimately, users and developers often re-introduce items deleted by Roblox back into the Avatar Shop, the case claims, alleging the defendant “knows and appreciates that the biggest offenders of its platform’s policies are also its ‘golden goose’ that drives revenue.”
The case goes on to take issue with the seemingly arbitrary manner in which Roblox removes Avatar Shop content, claiming the platform’s first priority is profit over any sort of clarity on its determinations:
“Roblox’s sham ‘content moderation’ is nothing more than a cover for its attempt to generate additional revenue from its users—which are predominantly children. In fact, Roblox routinely removes content that doesn’t on its face appear to violate any of the platform’s policies, and its structure for ‘moderating’ content fails to do so with consistency, accuracy, or any transparency. When it takes the drastic measure of removing purchased content from a user’s inventory (rendering it unusable), Roblox offers no explanation of what policy the item actually violated, how it generally makes such determinations or how to appeal its decisions.”
Who does the lawsuit cover?
The case, filed on May 25 in the Northern District of California, looks to represent all Roblox users who purchased content on the platform that was later deleted.
Because the case was only recently filed, it will likely be some time before those considered “covered” by the lawsuit (known as “class members”) can submit claims for whatever compensation the court decides is appropriate. That step would come into play only if and when the case ends in a settlement; until then, consumers typically don’t need to take any action.
If you believe you’ve been affected by a company’s alleged conduct, the best thing to do is to stay informed and check back with ClassAction.org for updates. You can sign up for our free newsletter here.