Stanford researchers find Mastodon has a massive child abuse material problem


The Verge RSS

Author: Emma Roth

[Image: the Mastodon logo on a black background. Illustration: The Verge]

Mastodon, the decentralized network viewed as a viable alternative to Twitter, is rife with child sexual abuse material (CSAM), according to a new study from Stanford’s Internet Observatory (via The Washington Post). In just two days, researchers found 112 instances of known CSAM across 325,000 posts on the platform — with the first instance showing up after just five minutes of searching.

To conduct its research, the Internet Observatory scanned the 25 most popular Mastodon instances for CSAM. Researchers also employed Google’s SafeSearch API to identify explicit images, along with PhotoDNA, a tool that helps find flagged CSAM. During its search, the team found 554 pieces of content that matched hashtags or keywords often used by child...

Continue reading…