Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • NuclearArmWrestling@lemmy.world

    I know that people like to dump on Cloudflare, but enabling its built-in CSAM scanning tool is incredibly easy.

    On that note, I’d like to see built-in moderation tools using something like PDQ and TMK+PDQF, along with a shared database of hashes for CSAM and other material that may be outlawed or worth filtering out in different regions (e.g. terrorist content, Nazi content in Germany, etc.)
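
    For a rough sense of what that matching would involve (just a sketch, not an existing Lemmy/Mastodon feature): PDQ reduces an image to a 256-bit perceptual hash, and checking an upload against a shared list boils down to a Hamming-distance comparison. The third-party `pdqhash` package, the `SHARED_HASHES` list, and the quality cutoff below are assumptions for illustration.

    ```python
    # Sketch: check an image's PDQ hash against a shared blocklist.
    # Assumes the third-party `pdqhash` package and a hypothetical
    # SHARED_HASHES list populated from an industry hash-sharing feed.
    import numpy as np
    import pdqhash
    from PIL import Image

    SHARED_HASHES: list[np.ndarray] = []  # 256-bit hashes as 0/1 vectors

    MATCH_THRESHOLD = 31  # commonly cited PDQ cutoff: <= 31 of 256 bits differ

    def is_blocked(path: str) -> bool:
        image = np.asarray(Image.open(path).convert("RGB"))
        bits, quality = pdqhash.compute(image)  # 256-element 0/1 vector
        if quality < 50:  # low-quality hashes are unreliable; don't match
            return False
        return any(
            int(np.count_nonzero(bits != known)) <= MATCH_THRESHOLD
            for known in SHARED_HASHES
        )
    ```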

    • redcalcium@lemmy.institute

      I don’t want much, I just want deletions to propagate reliably across the fediverse. If someone gets banned for CSAM and their content purged, I want those actions propagated to all federated instances. I can’t even delete my own comments reliably here on Lemmy, since many instances don’t seem to get the deletion requests.
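
      For background: in ActivityPub terms, a deletion federates as a `Delete` activity that every remote instance has to receive and apply; when delivery fails or the receiver drops it, the content lingers there. A minimal sketch of the activity’s shape (all URLs are placeholders; real delivery is a signed POST to each peer’s inbox, which this omits):

      ```python
      # Sketch: the shape of an ActivityPub Delete activity, per the
      # ActivityStreams vocabulary. URLs are placeholders; actual federation
      # requires HTTP-signature-signed delivery to each remote inbox.
      delete_activity = {
          "@context": "https://www.w3.org/ns/activitystreams",
          "id": "https://lemmy.example/activities/delete/123",
          "type": "Delete",
          "actor": "https://lemmy.example/u/redcalcium",  # who is deleting
          "object": "https://lemmy.example/comment/456",  # what to delete
          "to": ["https://www.w3.org/ns/activitystreams#Public"],
      }
      ```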

      • redcalcium@lemmy.institute

        People are wary of how the internet has become more and more centralized behind Cloudflare. If you ever get caught in Cloudflare’s captcha hell because it flagged your IP as suspicious, you’ll get wary too: you suddenly realize how big Cloudflare has become when half the internet starts asking you to solve its captchas.

      • NuclearArmWrestling@lemmy.world

        Lemmy.world had to start using Cloudflare because some script kiddies were DDoSing it. Some people complained that it encourages centralization, etc.

        Personally, I love it. The service you get even at the lowest paid tier ($20/mo) is great, and what you get for free is hard to beat.

      • NuclearArmWrestling@lemmy.world

        It looks like it scans and flags on the outbound path (when a user downloads the image), so as long as Cloudflare sits in front of your instance, it should work just fine.
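
        To make “scanning on the outbound” concrete, here’s a toy version of the idea: a hook that hashes image responses against a blocklist before they leave the server. This is a generic illustration, not Cloudflare’s (proprietary) implementation; Flask and the exact-match MD5 check are assumptions.

        ```python
        # Toy version of outbound scanning: block known images on the way out.
        # Not Cloudflare's implementation. Exact-match MD5 (as used in some
        # hash lists) misses altered copies, unlike perceptual hashes like PDQ.
        import hashlib
        from flask import Flask, Response

        app = Flask(__name__)
        BLOCKED_MD5: set[str] = set()  # hypothetical hash list

        @app.after_request
        def scan_outbound(response: Response) -> Response:
            if response.content_type and response.content_type.startswith("image/"):
                if hashlib.md5(response.get_data()).hexdigest() in BLOCKED_MD5:
                    return Response("Removed", status=451)  # 451: legal reasons
            return response
        ```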

        You’re still responsible for removing the material, complying with any preservation requirements and other legal obligations, and notifying Cloudflare once it’s been removed.

        It would be ideal if it could block on upload, so the material never makes it to your instance, but that would likely require something else, such as an integration with PhotoDNA or a similar matching service.
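
        What upload-time blocking could look like, as a sketch: reject the file before it is ever written to storage, delegating the match to an external service. `photodna_match` and `save_to_media_store` below are hypothetical stand-ins, not PhotoDNA’s actual API.

        ```python
        # Sketch of blocking at upload time, before the file hits storage.
        # `photodna_match` is a hypothetical stand-in for an external matching
        # service (PhotoDNA access requires vetting); it is not a real API.
        def photodna_match(image_bytes: bytes) -> bool:
            raise NotImplementedError("stand-in for an external matching service")

        def save_to_media_store(image_bytes: bytes) -> str:
            raise NotImplementedError("stand-in for the instance's media storage")

        def handle_upload(image_bytes: bytes) -> str:
            if photodna_match(image_bytes):
                # Never persist the file; report/preserve per legal obligations.
                raise PermissionError("upload rejected: matched a known hash")
            return save_to_media_store(image_bytes)
        ```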