• @bleistift2@sopuli.xyz
    78 · 6 months ago

    10% false positives is an enormous rate given how Police like to ‘find’ evidence and ‘elicit’ confessions.

    • snooggums
      33 · 6 months ago

      It isn’t predicting individual crimes, just pattern recognition and extrapolation like how the weather is predicted.

      “There are on average 4 shootings in November in this general area so there probably will be 4 again this year.” is the kind of prediction that AI is making.
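
      That “same as last year” extrapolation is easy to make concrete. A minimal sketch, with made-up shooting counts (all numbers hypothetical):

      ```python
      # Naive "predictive policing": forecast next November's shootings in an
      # area as the historical November average. All counts here are invented.
      november_shootings = [4, 3, 5, 4, 4]  # hypothetical counts from past years

      predicted = round(sum(november_shootings) / len(november_shootings))
      print(predicted)  # the "prediction" is just the historical mean: 4
      ```

      There is no insight into any individual crime here, just a base rate projected forward.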

        • @rottingleaf@lemmy.world
          0 · 6 months ago

          That’s also how communist attempts at building a better civilization work. They can’t avoid that base of their ideology where one human classification of reality takes precedence over life. So they have plans. Plan for steel production, plan for grain production, plan for convicted criminals.

          You are falling behind the plan? Have to arrest someone. Someone. There’s a weird teen walking there, let’s tie him to a battery and beat him till he signs a paper saying he stole some shit.

          The plan is overshot? Then they won’t bother even if there’s a gang rape with murder right in front of the police station, with some policemen participating.

          What I don’t understand is why people want to do that again, just with clueless (not possessing the necessary information) planners replaced with clueless (for the same reason) machines.

          Even the USSR’s problems with planning were mostly not due to insufficient computational resources (people today think those were miserable, but let’s remember they were programmed by better and more qualified people than most of today’s programmers), but due to the power balance in the hierarchy, which meant planning was bent to the wishes of power. In other words, plans were made for what people in important posts wanted to see, and didn’t account for what people in other important posts didn’t want to share. Just like it’s going to be with any such system. Tech doesn’t solve power balance by itself.

    • @lugal@sopuli.xyz
      3 · 6 months ago

      That’s in a punitive system. Used in a transformative/preventive manner (which it will not be), this could actually save lives and help people in need.

  • @Pilferjinx@lemmy.world
    23 · 6 months ago

    Israel and China implement sophisticated algorithms to suppress Palestinians and Uyghurs with severe effectiveness. Don’t take this tech lightly.

    • @aeshna_cyanea@lemm.ee
      4 · edit-2 · 6 months ago

      Idk about China, but Israel carpet-bombs apartment buildings. You don’t need precision AI for that.

  • andrew_bidlaw
    9 · 6 months ago

    Do they have skull measurements in their dataset? It’s predestined to reproduce and cement existing biases.

    • @0laura@lemmy.dbzer0.com
      3 · 6 months ago

      AI is AI. Not all AI is AGI, but Stable Diffusion, LLMs, and all the others are real AI. The only reason people disagree is that they watched too much sci-fi and think AI is supposed to be sentient or whatever. Hell, even the code controlling the creepers in Minecraft is called AI in the game: you can spawn a creeper with the NoAI flag and the creeper won’t do anything. Quite a silly take to say it’s not AI just because you don’t like it. There are many things to dislike about the modern state of AI; your argument is just shooting yourself in the foot.

  • @dgmib@lemmy.world
    6 · 6 months ago

    If crime is that predictable, that would mean crime isn’t caused by people’s choices but by something else… like, say, mental illness, poverty, hunger, lack of social supports… and that lots of cops locking people up in prisons as a deterrent won’t work to reduce crime… hmmm, wait a second…

  • @stupidcasey@lemmy.world
    2 · 6 months ago

    As we can see from this advanced simulation, the perpetrator had 13 fingers. You are the only person who has 13 fingers; the evidence is obvious.

    Mr. Thirteen Fingers, I simply do not understand how an innocent man like yourself can take a dark turn and suddenly commit over 300 crimes scattered across every country on the globe. You had every reason not to commit them, but you did it anyway. How do you plead?

    Would it matter if I said not guilty?

  • @Shardikprime@lemmy.world
    -1 · 6 months ago

    I mean, you can train AI to look for really early signs of multiple diseases. It can predict the future, sort of.

    • @Tayb@lemmy.world
      5 · 6 months ago

      Didn’t one AI have a lot of false positives because it would call any picture of skin with a ruler in it cancer? The moment it saw a ruler it responded with “cancer,” because all the data it was fed about confirmed cancers had rulers in the photos.
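
      That failure mode (shortcut learning) is easy to illustrate. A toy sketch with invented data, not the actual study:

      ```python
      # Toy "ruler shortcut": in a biased training set every confirmed-cancer
      # photo also contains a ruler, so predicting from the ruler alone looks
      # perfect. All data here is invented for illustration.
      training = [
          {"has_ruler": True,  "cancer": True},
          {"has_ruler": True,  "cancer": True},
          {"has_ruler": False, "cancer": False},
          {"has_ruler": False, "cancer": False},
      ]

      def shortcut_model(image):
          # Predicts "cancer" whenever a ruler is visible in the photo.
          return image["has_ruler"]

      train_acc = sum(shortcut_model(x) == x["cancer"] for x in training) / len(training)
      print(train_acc)  # 1.0 on the biased training set

      # A healthy mole photographed next to a ruler is a guaranteed false positive:
      print(shortcut_model({"has_ruler": True, "cancer": False}))  # True
      ```

      The model scores perfectly on its own data while having learned nothing about skin.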

    • @callouscomic@lemm.ee
      2 · 6 months ago

      Detecting symptoms and signs of a thing is not predicting the future.

      That’s like seeing a car that isn’t going to stop, so you slow down before you T-bone it. That’s not really “predicting the future,” just paying attention and calculating likelihoods.