How Twitter/X Is Setting Itself Up For Disaster By Officially Allowing Adult Content
On Monday, the social media platform X (formerly known as Twitter) formalized a policy that had been in place for years but never addressed publicly: adult content is officially allowed as long as it involves consenting adults and is marked as adult for filtering purposes. "Adult Content is any consensually produced and distributed material depicting adult nudity or sexual behavior that is pornographic or intended to cause sexual arousal," reads the new directive. It specifies "full or partial nudity, including close-ups of genitals, buttocks, or breasts" as well as "explicit or implied sexual behavior or simulated acts such as sexual intercourse and other sexual acts."
Again, this formalizes the way X/Twitter had unofficially operated for years. Adult performers have long spoken of how important the former Twitter is to them: it's their primary promotional vehicle, since they don't have access to mainstream media outlets or most other major social media platforms. If you're an OnlyFans creator dealing in nudity and/or explicit content, X is key to driving subscriptions because you can't post teaser photos or videos anywhere else, unlike non-nude, non-explicit creators who can push their content on Facebook, Instagram, and TikTok.
However, in formalizing this policy, X is crossing a Rubicon that it had tried to avoid before Elon Musk bought the company: explicitly allowing and monetizing adult content at scale could cause the company to run afoul of record-keeping laws for adult entertainment companies, which require documenting performers' ages.
The adult industry and age verification
Age verification in the porn industry traces back to 1986, when an FBI raid revealed that one of the biggest stars in the business, Traci Lords, had been a minor for almost her entire adult film career. Numerous video titles and magazines instantly became illegal child sexual abuse material, or CSAM for short. Lords had used someone else's birth certificate to get a genuine California state ID card, as opposed to cutting and pasting together a fake ID; she even managed to get a passport with it.
In November 1987, the Reagan administration announced what would become the Child Protection and Obscenity Enforcement Act of 1988, which passed and introduced the first record-keeping guidelines designed to prevent another Traci Lords situation. Constitutional and other legal challenges would disrupt the law for the bulk of the next two decades, but by 2006, everything was finally ironed out to Congress's satisfaction.
From that point forward, adult producers were explicitly required to keep detailed, clear records proving that every performer is a legal adult. And this doesn't just apply to traditional porn studios: user-generated content hubs like OnlyFans and Pornhub also have to be very strict about complying with what are commonly called the "2257" guidelines, a nickname taken from the statute number (18 U.S.C. § 2257). In theory, X could also do this, but in practice, there are questions about scale.
Twitter's near-miss with becoming an OnlyFans competitor
In August 2022, The Verge broke the story that, a few months earlier, Twitter had put together a proposal for adult creators to sell subscription-based explicit content through the platform. On the surface, it made sense: Twitter was the primary marketing vehicle for OnlyFans, so shouldn't Twitter try to siphon off as much of the latter's revenue as possible for itself? The executives behind the proposal even felt that the theoretical "ACM: Adult Content Monetization" program could easily offset whatever revenue the company lost as advertisers left the platform over its embrace of adult content.
To test the theory, Twitter drafted 84 employees to form what was dubbed the Red Team, which would endeavor to find out whether this could be done responsibly. The answer? A resounding no. "Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale," the Red Team wrote in a document reviewed by The Verge. That killed the project, and rightfully so.
If Twitter couldn't keep up with removing clearly illegal and immoral explicit content in day-to-day operation, it certainly couldn't be expected to officially become an adult content platform at scale. Monday's news is not a step back toward offering ACM and competing with OnlyFans, but it raises the same basic questions, and the answers are still very uncomfortable.
Elon Musk's X has mishandled CSAM so badly it can't be trusted to embrace porn
There is already ample evidence that X, as run by Elon Musk, is not remotely up to this challenge. Perhaps most obviously, we can point to one of the most shocking tech stories of 2023. That July, the site suspended the account of QAnon conspiracy theorist Dominick "Dom Lucre" McGee because he posted screenshots of a CSAM video involving convicted Australian child predator Peter Scully. Good, right? Well, that decision was quickly rendered moot when Musk rescinded the suspension, seemingly because, according to The Washington Post, Lucre claimed he had posted the images to raise awareness of child abuse.
"I'm sorry, just have to post a big ol' lol about the fact that this guy blew up my life by saying I condone pedophilia, and then he turns around and does this," wrote former Twitter Trust and Safety head Yoel Roth in a BlueSky post on July 26, 2023. Roth was referring to how, several months earlier, Musk had repeatedly tried to suggest that he — an openly gay man at a time when queer people are increasingly smeared as predators — was a pedophile.
The prior month, The Wall Street Journal, citing Stanford Internet Observatory researchers, reported that Twitter failed to detect dozens of known CSAM images that were posted to the platform between March 12 and May 20, 2023. Twitter told Stanford that it had improved its detection methods, but declined to comment on The Washington Post's report.
The Trust and Safety team has been gutted under Musk
Elon Musk's tenure as sole owner of Twitter/X has been plagued by constant, massive layoffs and other cost-cutting moves. In an April 2023 interview with the BBC, Musk outright said that he had cut the company's head count down to about 1,500, from just south of 8,000, an approximately 80% reduction. According to an NBC News report three months earlier, that included significant cuts to the Trust and Safety department, which lost more than half of its workforce. Weeks before that, Musk had disbanded Twitter's volunteer Trust and Safety Council. Later, when the Stanford researchers found the issues they shared with The Wall Street Journal, they discovered that all of their contacts in the department were gone from the company.
Twitter had also come under fire when NBC News discovered that it was not blocking various known CSAM-centric hashtags. That led Senator Richard Durbin (D-IL) to push for the Department of Justice to investigate the company's handling of such illegal content. Later, Australian regulators fined X for its poor handling of CSAM, despite Musk's claims that the company was doing a great job in this area.
As long as X remains in its current form as Elon Musk's fiefdom, it can't be trusted to responsibly manage adult content the way OnlyFans can. If the old Twitter didn't think it could do it, why would the guy who reinstated Dom Lucre be any better?