Detecting Child Sexual Abuse on OnlyFans: The Challenges Police Face
OnlyFans has become a popular platform for creators to share exclusive content with their subscribers. However, the platform's paywalls make it difficult for police to detect child sexual abuse material (CSAM). According to a report by Reuters, the decentralized structure of OnlyFans, where each creator posts behind an individual paywall, poses a significant challenge for law enforcement agencies trying to monitor and investigate potential cases of CSAM.
Five specialists in online child sexual abuse told Reuters that the sheer volume of content on OnlyFans, with millions of accounts sharing hundreds of millions of posts, makes it nearly impossible to independently verify the extent of CSAM on the platform. To monitor the entire platform, law enforcement would need to subscribe to every account individually, a daunting task, according to Trey Amick, an expert who assists police in CSAM investigations.
OnlyFans claims that the amount of CSAM on its platform is minimal, citing the removal of only 347 posts suspected of containing CSAM in 2023. Each of these posts was voluntarily reported to the CyberTipline of the National Center for Missing and Exploited Children (NCMEC), which OnlyFans stated has full access to monitor content on the platform. However, NCMEC only gained access to OnlyFans in late 2023, and its monitoring capabilities are limited to accounts reported to its CyberTipline or connected to a missing child case.
Moreover, law enforcement agencies do not have unfettered access to investigate creators' posts on OnlyFans. Free access is granted only when there is an active investigation into suspected CSAM. This limitation means that police may be unable to uncover CSAM shared on accounts that have not yet been flagged for investigation, allowing bad actors to evade detection.
While OnlyFans has implemented strict controls to prevent the posting of CSAM, such as requiring creators to provide extensive personal information and undergo age verification, Reuters found that these measures are not foolproof. Bad actors have found ways to bypass these controls, and some minors have managed to post explicit content on the platform by circumventing the age verification process.
An OnlyFans spokesperson defended the platform's safety controls, stating that the low number of CSAM reports to NCMEC is a testament to the rigor of the measures in place. However, the difficulties law enforcement faces in detecting and preventing CSAM on OnlyFans highlight the need for more robust monitoring and investigative tools to safeguard vulnerable individuals from exploitation on online platforms.