• UndercoverUlrikHD@programming.dev · 10 months ago

    Admins are actively looking into solutions; nobody wants that stuff stored on their server, and there are a bunch of legal obligations you have to meet when it happens.

    One of the problems is the compute cost of running CSAM-detection programs on pictures before they're uploaded, which makes it unviable for many instances. I think Lemmy.world is moving towards only allowing images hosted on whitelisted sites.
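
    To give a rough idea of what that scanning involves: most tooling matches uploads against databases of hashes of known material rather than classifying images from scratch. Here's a minimal, hypothetical sketch of the hash-matching idea using perceptual hashes; real systems match against vetted databases and services (e.g. PhotoDNA, or Cloudflare's CSAM scanning tool), not a hand-rolled list like this.

    ```python
    from PIL import Image
    import imagehash

    # Hypothetical blocklist of perceptual hashes of known-bad images
    # (hex strings; in practice these come from a vetted external database).
    BLOCKLIST = {imagehash.hex_to_hash("fd8f0f0703030f1f")}

    MAX_DISTANCE = 5  # Hamming-distance threshold to count as a match

    def is_flagged(path: str) -> bool:
        """Return True if the image's perceptual hash is near a blocklisted one."""
        h = imagehash.phash(Image.open(path))
        return any(h - bad <= MAX_DISTANCE for bad in BLOCKLIST)
    ```

    Even a cheap check like this has to run on every single upload, and the ML-based classifiers some instances have looked at are far heavier, hence the compute cost.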
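
    And a minimal sketch of the whitelist approach, assuming made-up host names (Lemmy's actual implementation may differ): instead of accepting uploads at all, only allow image URLs whose host is on an approved list.

    ```python
    from urllib.parse import urlparse

    ALLOWED_HOSTS = {"i.imgur.com", "files.catbox.moe"}  # made-up whitelist

    def is_allowed_image_url(url: str) -> bool:
        """Accept only https image URLs whose host is on the whitelist."""
        parsed = urlparse(url)
        return parsed.scheme == "https" and parsed.hostname in ALLOWED_HOSTS

    print(is_allowed_image_url("https://i.imgur.com/abc.png"))    # True
    print(is_allowed_image_url("https://sketchy.example/x.png"))  # False
    ```

    That pushes the storage and moderation burden onto the whitelisted hosts, which is why it's attractive for smaller instances.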