Mastodon, the decentralized network viewed as a viable alternative to Twitter, is rife with child sexual abuse material (CSAM), according to a new study from Stanford’s Internet Observatory (via The Washington Post). In just two days, researchers found 112 instances of known CSAM across 325,000 posts on the platform — with the first instance showing up after just five minutes of searching.
To conduct its research, the Internet Observatory scanned the 25 most popular Mastodon instances for CSAM. Researchers also employed Google’s SafeSearch API to identify explicit images, along with PhotoDNA, a tool that helps find flagged CSAM. During its search, the team found 554 pieces of content that matched hashtags or keywords often used by child…