“The surveillance, theft and death machine recommends more surveillance to balance out the death.”

  • pleasejustdie@lemmy.world · 1 month ago

    I don’t actually have a problem with this. If people are stupid enough to admit to a crime or engage in criminal activity on a platform they don’t control, that’s on them. I see this as the next step in the evolution of people who commit a crime on YouTube for views and then get shocked-Pikachu’d when the police arrest them for it. They have no one to blame but themselves: they brought a third-party AI company into it, and that company never consented to being an accomplice. And if any company has the resources to have AI scan conversations and flag them for the police with good accuracy, OpenAI would definitely be at the front of it.

    • TommySoda@lemmy.world · 1 month ago

      Well, you should have a problem with it, but not for the reasons you think. Any invasion of privacy is an issue when the people in control get to decide what counts as a reportable offense without explicitly telling you. I get it: you definitely shouldn’t be admitting to anything illegal or asking a chatbot for advice on illegal things, and you shouldn’t be doing anything illegal in the first place. That’s basically the same as googling how to make a bomb, and if you’re that dumb you’ll get what’s coming to you. The issue arises when you look at the bigger picture. If they have the ability to report anything they want to the police, what’s stopping them from releasing anything they want to anyone they want at any time? And when it comes to whoever receives the reported data, what proof do you have that those entities have your safety or interests, or anyone else’s, in mind? What if they change the rules on what they report, don’t tell you, and then retroactively flag a bunch of your conversations with said LLM?

      It’s the same kind of situation we face with the AI cameras that track us and our vehicles literally everywhere we go. There have already been multiple cases of people in law enforcement using those tools to stalk ex-girlfriends and the like. All of this puts a lot of trust in people none of us even know and expects them to have the best of intentions. What would stop them from reporting that you asked ChatGPT about the current situation in Gaza?

      • thatsnothowyoudoit@lemmy.ca · 1 month ago

        Fair points.

        One thing I think we all miss: what happens when an overzealous government makes something a crime retroactively? Say, um, disparaging two Cheetos in an ill-fitting suit masquerading as a world leader.

        That’s part of why we should care about privacy and why we should care when data we expect to be private isn’t.

        Most tech users are victims of a system they don’t understand. We might complain that they don’t want to understand, but the truth is the providers don’t want them to understand, because it’s easier to sell them whatever crap they’re shilling.

    • Seleni@lemmy.world · 1 month ago

      Ahh, the ol’ ‘nothing to hide’ defense.

      Ever consider that the things labeled ‘crimes’ can and will be whatever the people in power want them to be?

      Just because, say, calling Republicans ‘shithead pedophiles’ on Lemmy isn’t illegal now doesn’t mean Cheeto Mussolini won’t make it illegal tomorrow.

    • KingPorkChop@lemmy.ca · 1 month ago

      And of course you’re fine with wasting police and investigative resources on people who typed shit into an AI just to see what it would return.

  • Treczoks@lemmy.world · 1 month ago

    As if just using it weren’t stupid enough, some people are so totally stupid that they think what they put into a commercial online service would be private.

  • TankovayaDiviziya@lemmy.world · 1 month ago

    Will people stop calling the protectors of capital and fascists “law enforcement”? Calling them that makes it sound like they are honourable when they’re not.

  • Pxtl@lemmy.ca · 1 month ago

    As much as I hate the AI-gens, this is probably a good thing after that poor kid got talked into killing himself. I assume Google et al. do something similar already.

    Now, if the cops respond to being called for a person in crisis by tasing somebody, that’s a different problem.