• deweydecibel
    22
    2 years ago

    And here we go.

    This will be one of the Fediverse’s biggest obstacles.

    This needs to be brought under control somehow, or in a few years tech companies, banks, and regulators will decide a crackdown on the fediverse as a whole is needed.

      • @TheNotorious7113@lemmy.world
        1
        2 years ago

        Would some sort of loosely organized group of instance admins help to make this happen? Like the U.N. for the fediverse? Sounds like a structured communication system would fix this.

    • @cerevant@lemmy.world
      6
      2 years ago

      The fediverse is the name for services that use ActivityPub, a communication protocol. What you are saying is like saying “tech companies, banks, and regulators need to crack down on HTTP because there is CSAM on the web”.

    • @weedazz@lemmy.world
      5
      2 years ago

      A few years? I bet Threads is doing this right now to shut down every private instance and take the fediverse for themselves. They’ll argue they’re the only ones who can moderate the content, given their size and resources.

  • Glarrf
    7
    2 years ago

    We need more tools, more automation, in order to fight the trash

  • Mario Bariša
    4
    2 years ago

    Just added “Stanford researchers” to my list of stupid people

  • Metal Zealot
    4
    2 years ago

    That’s like child molesters texting each other, and then saying “TELUS AND BELL ARE PERPETUATING A CHILD SEX TRAFFICKING RING”

  • @whenigrowup356@lemmy.world
    3
    2 years ago

    Shouldn’t it be possible to create open-source bots that use the same databases as the researchers to automatically flag and block that kind of content?
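
A minimal sketch of what such a bot’s matching core might look like, assuming the database were just a set of known-bad SHA-256 digests (hypothetical helpers; the real databases are access-controlled and use perceptual rather than exact hashes, as the reply below notes):

```python
import hashlib

def build_blocklist(known_bad_files):
    """Hypothetical helper: hash each known-bad file's bytes into a lookup set."""
    return {hashlib.sha256(data).hexdigest() for data in known_bad_files}

def should_flag(upload_bytes, blocklist):
    """Flag an upload if its exact SHA-256 digest appears in the blocklist."""
    return hashlib.sha256(upload_bytes).hexdigest() in blocklist

# Toy demonstration with harmless placeholder "files"
blocklist = build_blocklist([b"known-bad-file"])
print(should_flag(b"known-bad-file", blocklist))   # exact match -> True
print(should_flag(b"known-bad-filX", blocklist))   # one byte differs -> False
```

The catch is visible in the last line: an exact digest misses any re-encoded or slightly altered copy, which is why real systems use fuzzy (perceptual) hashes instead.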

    • @ozymandias117@lemmy.world
      3
      2 years ago

      Those databases are highly regulated, as they are themselves CSAM.

      Apple tried fuzzy hashes so the database could be downloaded to devices, and it wasn’t able to reliably identify things at all
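
The “fuzzy hashes” referred to here are perceptual hashes: rather than an exact digest, they encode coarse image structure so near-duplicates land close together. A toy average-hash sketch (illustrative only, not Apple’s actual NeuralHash):

```python
def average_hash(pixels):
    """Perceptual 'average hash': one bit per pixel, set if the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Count differing bits between two hashes; small distance = likely match."""
    return sum(a != b for a, b in zip(h1, h2))

img = [[10, 200], [30, 220]]        # toy 2x2 grayscale "image"
tweaked = [[12, 198], [30, 220]]    # slightly re-encoded copy
different = [[200, 10], [220, 30]]  # unrelated content

print(hamming(average_hash(img), average_hash(tweaked)))    # 0: near-duplicate
print(hamming(average_hash(img), average_hash(different)))  # 4: no match
```

The same tolerance that survives re-encoding is what makes these hashes unreliable: adversarial tweaks can push a real match far apart, and unrelated images can collide.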

  • @woefkardoes@lemmy.world
    3
    2 years ago

    Somewhere along the way, we went from finding those who do wrong and punishing them to censoring everything because everyone is bad.

  • @Techmaster@lemmy.world
    -5
    2 years ago

    So they went on mastodon and started searching for CP? WTF is wrong with these sick people? I hope they’re on an FBI list now.

    • @deong@lemmy.world
      8
      2 years ago

      So your advice to any organization seeking to minimize illegal activity is to willfully ignore any trace of it?

      • @Techmaster@lemmy.world
        1
        edit-2
        2 years ago

        “I swear, officer, I was just searching for CP to catch OTHER people!”

        It would be just as pathetic as that scene from There’s Something About Mary. “Yeah, I was just going to pee, too!”

        Maybe they had some kind of legal sanctioning to do it, but holy crap, I wouldn’t want that in my search history. I would hope software like that has some mechanism where searching for certain words results in an automatic report to some FBI API somewhere.

        I actually know of a couple of people who got caught with that stuff. One got 25 years. The other jumped bail and they eventually caught him. I’m not sure if he’s been sentenced yet, but I bet he’ll get double what the other guy who cooperated got. Those people are creepy AF and nobody in their right mind would want to be associated with any of it. Those people are 10 times worse than neo-nazis.

        The funny thing is the first guy, everybody could kind of tell he was a creep. But the FBI caught him and he completely cooperated and admitted everything. The second guy, he really seemed like he was going to be the only person in his family who actually turned out to be a decent guy. He was a really sweet kid in a super trashy family. And then all of a sudden everything goes down and everybody is in shock. Then he jumps bail. Last I heard his dad was about to lose his house because he used it as collateral to bail his piece of shit son out of jail.

        This open-source software needs to include code that reports certain search terms. There are ML algorithms out there that can automatically detect this stuff. Do not search for that kind of content thinking you’re some sort of vigilante. There are ways to deal with this shit without putting yourself in serious legal peril.

        • @deong@lemmy.world
          2
          2 years ago

          I don’t think you understand how a research organization works. This isn’t three guys in a basement searching for child porn. It’s a research institute at Stanford University. They’ll have gotten funding to do the work by applying for federal grants, getting approval from multiple Institutional Review Boards who are charged with, among other things, making sure that the people involved in the research are appropriately taken care of. They’ll be required to have counselors on board. However “legit” you think such an outfit might possibly be, multiply that by three.

          This is their job. It is the same as if they worked for a law enforcement agency. When someone gets arrested for child porn, we don’t also charge the police, prosecutors, and judges who might have to look at the material as part of prosecuting a case. I promise you Stanford isn’t paying a team of professors and postdocs to just diddle themselves to kiddie porn all day.