Not a good look for Mastodon - what can be done to automate the removal of CSAM?

    • Lemdee · 34 points · 2 years ago

      So if I’m understanding right, based on their recommendations this will all be addressed as more moderation and QOL tools are introduced as we move further down the development roadmap?

    • @mindbleach@lemmy.world · 8 points · 2 years ago

      4.1 Illustrated and Computer-Generated CSAM

      Stopped reading.

      Child abuse laws “exclude anime” for the same reason animal cruelty laws “exclude lettuce.” Drawings are not children.

      Drawings are not real.

      Half the goddamn point of saying CSAM instead of CP is to make clear that Bart Simpson doesn’t count. Bart Simpson is not real. It is fundamentally impossible to violate Bart Simpson’s rights, because he doesn’t fucking exist. There is nothing to protect him from. He cannot be harmed. He is imaginary.

      This cannot be a controversial statement. Anyone who can’t distinguish fiction from real life has brain problems.

      You can’t rape someone in MS Paint. Songs about murder don’t leave a body. If you write about robbing Fort Knox, the gold is still there. We’re not about to arrest Mads Mikkelsen for eating people. It did not happen. It was not real.

      If you still want to get mad at people for jerking off to the wrong fantasies, that is an entirely different problem from photographs of child rape.

        • @mindbleach@lemmy.world · 0 points · 2 years ago

          What does that even mean?

          There’s nothing to “cover.” They’re talking about illustrations of bad things, alongside actual photographic evidence of actual bad things actually happening. Nothing can excuse that.

          No shit they are also discussing actual CSAM alongside… drawings. That is the problem. That’s what they did wrong.

      • @DrQuint@lemmy.world · 2 points · 2 years ago

        Oh, wait, the Japanese in the other comment, now I get it. This conversation is about AI loli porn.

        Pfft, of course. That’s why no one is saying the words they mean: it suddenly becomes much harder to take that stance, since hatred of loli porn is not universal.

      • Mark · 1 point · 2 years ago

        Oh no, what you describe is definitely illegal here in Canada. CSAM includes depictions here. Child sex dolls are illegal. And it should be that way because that stuff is disgusting.

        • @mindbleach@lemmy.world · -2 points · 2 years ago

          CSAM includes depictions here.

          Literally impossible.

          Child rape cannot include drawings. You can’t sexually assault a fictional character. Not “you mustn’t.” You can’t.

          If you think the problem with child rape amounts to ‘ew, gross,’ fuck you. Your moral scale is broken if there’s not a vast gulf between those two bad things.

  • @whatsarefoogee@lemmy.world · 86 points · 2 years ago

    Mastodon is a piece of software. I don’t see anyone saying “phpBB” or “WordPress” has a massive child abuse material problem.

    Has anyone in history ever said “Not a good look for phpBB”? No. Why? Because it would make no sense whatsoever.

    I’m kind of at a loss for words because of how obvious it should be. It’s like saying “paper is being used for illegal material. Not a good look for paper.”

    What is the solution to someone hosting illegal material on an nginx server? You report it to the authorities. You want to automate it? Go ahead and crawl the web for illegal material and generate automated reports. Though you’ll probably be the first to end up in prison.

    • @Dubious_Fart@lemmy.ml · 0 points · 2 years ago

      That’s a dumb argument, though.

      phpBB is not the host or the provider. It’s just something you download and install on your server; the actual service provider (you, the owner of the server and operator of the phpBB forum) is responsible for its content and curation.

      Mastodon/Twitter/social media is the host/provider/moderator.

  • @HughJanus@lemmy.ml · 21 points · 2 years ago

    “We got more PhotoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close.”

    How do you have “probably” and “it’s not even close” in the same sentence?

    Here’s the thing, and what I’ve been saying for a long time about The Fediverse:

    I don’t care what platform you have, if it is sufficiently popular, you’re GOING to have CSAM. You’re going to have alt-right assholes. You’re going to have transphobia, you’re going to have racism and every other kind of discrimination.

    People point fingers at Meta for “allowing” this but there’s no amount of money that can reasonably moderate 3 b-b-billion users. Meta, and probably every other platform that’s not Twitter or False social, does what they can about this.

    Masto and Fedi admins need to be cognizant of the amount of users on their instances and need to have a sufficient number of moderators to manage those users. If they don’t have them, they need to close registrations.

    But ultimately the Fediverse can also create safe havens for these sorts of things, making it easy to set up a discriminatory network that has no outside moderation. This is the downside of free speech.

    • @Grimpen@lemmy.ca · 6 points · 2 years ago

      Heck, Truth Social uses Mastodon, IIRC.

      Ultimately, it’s software. Even if my home instance does a good job of enforcing its CoC, and every instance it federates with does as well, someone else can spin up their own instance, load it up with whatever, and I’ll never know or even be aware of it if it never federates with my instance.

    • @clutchmatic@lemmy.world · 2 points · 2 years ago

      People point fingers at Meta for “allowing” this but there’s no amount of money that can reasonably moderate 3 b-b-billion users.

      This is a prime use case for AI technology.

      • @Dubious_Fart@lemmy.ml · 1 point · 2 years ago

        No, because then you end up with a case like the guy who lost 15+ years of emails, his phone number, all his photos, his contacts, and everything else he had tied to a Google account, because Google’s automated detection triggered on a naked photo of their baby that they sent to the doctor during COVID, at the doctor’s request, about a rash on the baby’s diaper area. No amount of common sense would stay their hand or reverse their ignorant judgement that this man was a child pornographer, and they even called the police on him.

    • @Dubious_Fart@lemmy.ml · 0 points · 2 years ago

      Yes, it will be an issue on any platform.

      But how that platform deals with/fights it is what makes the platform good or bad.

      Take Facebook, for example, for how horrible it can be: Facebook is rife with the stuff, and it regularly gets reported… and nothing happens. To the point that a reporter once confronted them about it during an interview, and Facebook proved it did have the capability to contact law enforcement… by calling them on the reporter who showed them the evidence of it on their platform.

  • I know that people like to dump on Cloudflare, but it’s incredibly easy to enable a built-in CSAM scanner with Cloudflare.

    On that note, I’d like to see built-in moderation tools using something like PDQ and TMK+PDQF and a shared hash database of CSAM and other material that may be outlawed or desirable to filter out in different regions (e.g. terrorist content, Nazi content in Germany, etc.). A rough sketch of the matching side is below.
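
    As a minimal illustration of what hash-list matching could look like, here is a sketch in Python that uses the imagehash library’s pHash as a stand-in for PDQ (the actual PDQ and TMK+PDQF implementations live in Meta’s ThreatExchange tooling); the hash-list file format, the distance threshold, and the moderation hook are assumptions for illustration, not any existing shared database or API:

        # Screen an uploaded image against a shared perceptual-hash list.
        # pHash stands in for PDQ here; the list format and threshold are illustrative.
        from PIL import Image
        import imagehash

        MAX_DISTANCE = 8  # Hamming-distance threshold; tune per hash-list provider guidance.

        def load_hash_list(path: str) -> list[imagehash.ImageHash]:
            """Load one hex-encoded perceptual hash per line (assumed file format)."""
            with open(path) as f:
                return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]

        def is_flagged(image_path: str, blocklist: list[imagehash.ImageHash]) -> bool:
            """True if the upload is within MAX_DISTANCE of any hash on the shared list."""
            candidate = imagehash.phash(Image.open(image_path))
            return any(candidate - known <= MAX_DISTANCE for known in blocklist)

        # Hypothetical usage in an instance's upload pipeline:
        # blocklist = load_hash_list("shared_hashes.txt")
        # if is_flagged("upload.jpg", blocklist):
        #     quarantine_and_report("upload.jpg")  # hypothetical moderation hook

    Matching on hash distance rather than exact equality is what lets a shared list catch re-encoded or slightly altered copies, which is the point of perceptual hashes like PDQ.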

      • Lemmy.world had to start using CloudFlare because some script kiddies were DDOSing it. Some people were complaining that it encourages centralization, etc.

        Personally, I love it. The service you get even at the lowest level of payment ($20/mo) is great. And nothing compares to what you get for free.

  • @Reddit_was_fun@lemmy.world · -9 points · 2 years ago

    The article points out that the strength of the Fediverse is also its downside: federated moderation makes it challenging to consistently moderate CSAM.

    We have seen it even here with the challenges of Lemmynsfw. In fact, they have taken the stance that CSAM-like images, with of-age models made to look underage, are fine as long as there is some dodgy ‘age verification’.

    The idea is that abusive instances would get defederated, but I think we are going to find that inadequate without some sort of centralized reporting escalation and AI auto-screening.