
OnlyFans’ Millions of Paywalls Make Child Sex Abuse Hard to Detect, Police Say

Reuters reported that paywalls on OnlyFans make it harder for police to detect child sexual abuse material (CSAM) on the platform, especially newly produced CSAM, which is already more difficult to detect online than known, previously cataloged material.

Because each OnlyFans creator posts their content behind their own paywall, five online child sexual abuse experts told Reuters it’s difficult to independently verify how much CSAM is being posted. Police would apparently have to subscribe to each account to monitor the entire platform, Trey Amick, an expert who helps police investigate CSAM, told Reuters.

OnlyFans claims that the amount of CSAM on its platform is incredibly low. Out of 3.2 million accounts sharing “hundreds of millions of posts,” OnlyFans only removed 347 posts for suspected CSAM in 2023. Each post was voluntarily reported to the National Center for Missing and Exploited Children’s (NCMEC) CyberTipline, which OnlyFans says has “full access” to monitor content on the platform.

But that increased monitoring appears to be just getting started. NCMEC only gained access to OnlyFans in late 2023, the child safety group told Reuters. And NCMEC apparently can’t scan the entire platform at once, telling Reuters its access was “limited” to “OnlyFans accounts reported to CyberTipline or associated with a missing child case.”

Similarly, OnlyFans told Reuters that police do not need to subscribe to investigate a creator’s posts, but the platform only grants free access to accounts when there is an active investigation underway. That means that when police suspect an account is being used to exchange CSAM, they are given “full access” to view “account details, content, and direct messages,” Reuters reports.

But that access doesn’t help police uncover CSAM shared on accounts that haven’t already been flagged for investigation. That’s a problem, a Reuters investigation found, because bad actors can easily set up new accounts under hidden identities to avoid “OnlyFans’ scrutiny of holding account holders accountable for their own content,” one of the detectives, Edward Scoggins, told Reuters.

Bypassing OnlyFans CSAM detection seems easy

OnlyFans told Reuters that “potential creators must provide at least nine pieces of personally identifiable information and documents, including bank details, a selfie with a government photo ID and — in the United States — a Social Security number.”

“All of this is verified by human review and age estimation technology that analyzes the selfie,” OnlyFans told Reuters. On OnlyFans’ website, the platform further explained that “we constantly scan our platform to prevent CSAM from being posted. All of our content moderators are trained to identify and quickly report any suspected CSAM.”

But Reuters found that none of these checks was foolproof at stopping bad actors from sharing CSAM. The same apparently applies to some minors seeking to post their own pornographic content. One girl told Reuters that she bypassed age verification by signing up with an adult’s driver’s license and then taking over the adult’s account.

An OnlyFans spokesperson told Ars that the low number of CSAM cases reported to NCMEC is “a testament to the rigorous security controls implemented by OnlyFans.”

“OnlyFans is proud of the work we do to aggressively target, report, and support the investigation and prosecution of anyone who attempts to abuse our platform in this way,” an OnlyFans spokesperson told Ars. “Unlike many other platforms, OnlyFans’ lack of anonymity and end-to-end encryption means that reports are actionable by law enforcement and prosecutors.”