Not a good look for Mastodon - what can be done to automate the removal of CSAM?
The content in question is unfortunately something that has become very common in recent months: CSAM (child sexual abuse material), generally AI-generated.
AI is now apparently generating entire children, abusing them, and uploading video of it.
Or, they are counting “CSAM-like” images as CSAM.
Of course they’re counting “CSAM-like” in the stats, otherwise they wouldn’t have any stats at all. In any case, they don’t really care about child abuse; they care about a platform existing that they haven’t been able to wrap their slimy tentacles around yet.