• Sneezycat
      65 points · 2 years ago

      Oh no, we figured it out, but killer robots are profitable while happiness is not.

      • @o2inhaler@lemmy.ca
        27 points · 2 years ago

        I would argue happiness is profitable, but it would have to be shared amongst the people. Killer robots are profitable for a concentrated group of people.

        • Meowing Thing
          7 points · 2 years ago

          What if we gave everyone their own killer robot and then everyone could just fight each other for what they wanted?

            • @zalgotext@sh.itjust.works
              5 points · 2 years ago

              No, the Republican plan would be to sell killer robots at a vastly inflated price to guarantee that none but the rich can own them, and then blame people for “being lazy” when they can’t afford their own killer robot.

              • @TopRamenBinLaden@sh.itjust.works
                7 points · edited · 2 years ago

                Also, they would say that the second amendment very obviously covers killer robots. The founding fathers definitely foresaw the AI revolution, and wanted to give every man and woman the right to bear killer robots.

              • @winterayars@sh.itjust.works
                3 points · 2 years ago

                They’d say they’re gonna pass a law to give every male, property-owning citizen a killer robot, but first they have to pass a law saying it’s legal to own killer robots. They pass that law, then all talk of the other law is dropped forever. No one ever follows up or asks what happened to it. Meanwhile, the rich buy millions and millions of killer robots.

        • Sneezycat
          4 points · edited · 2 years ago

          No, it isn’t just about survival. People living on the streets are surviving. They have no homes, they barely have any food.

    • @cosmicrookie@lemmy.world
      1 point · edited · 2 years ago

      Especially one that is made to kill everybody except their own. Let it replace the police; I’m sure the quality control would be a tad stricter then.

  • @pelicans_plight@lemmy.world
    56 points · 2 years ago

    Great, so I guess the future of terrorism will be fueled by people learning programming and figuring out how to make EMPs so they can send the murder robots back to where they came from. At this point, one of the biggest security threats to the U.S., and for that matter the entire world, is the extremely low I.Q. of everyone who is supposed to be protecting it. But I think they do this all on purpose; I mean, the day the Pentagon created ISIS was probably their proudest day.

    • @Snapz@lemmy.world
      24 points · 2 years ago

      The real problem (and the thing that will destroy society) is boomer pride. I’ve said this for a long time: they’re in power now, and they are terrified to admit that they don’t understand technology.

      So they’ll make the wrong decisions and act confident, and the future will pay the tab for their cowardice, driven solely by pride and fear.

      • @primal_buddhist@lemmy.world
        3 points · 2 years ago

        Boomers have been in power for a long, long time, and the technology we are debating is a result of their investment and prioritisation. So I am not sure they are very afraid of it.

        • @Snapz@lemmy.world
          7 points · 2 years ago

          I didn’t say they were afraid of the technology; I said they were afraid to admit that they don’t understand it well enough to legislate it. Their hubris in trying to present a confident facade in response to something they can’t comprehend is what will end us.

    • @zaphod@feddit.de
      15 points · 2 years ago

      Great, so I guess the future of terrorism will be fueled by people learning programming and figuring out how to make emps so they can send the murder robots back to where they came from.

      Eh, they could’ve done that without AI for like two decades now. I suppose the drones would crash-land in a rather destructive way due to the EMP, which might also fry some of the electronics, rendering the drone useless without access to replacement components.

      • @pelicans_plight@lemmy.world
        -1 points · edited · 2 years ago

        I hope so, but I was born with an extremely good sense of trajectory and I also know how to use nets. So let’s just hope I’m superhuman and the only one who possesses these powers.

        Edit: I’m being a little extreme here because I heavily disagree with the way everything in this world is being run, so I’m giving a little pushback on a subject I’m wholly against. I do have a lot of manufacturing experience, and I would hope any killer robots governments produce would be extremely well shielded against EMPs, but that is not my field, and I have no idea whether shielding a remote-controlled robot from EMPs is even possible.

        • @AngryCommieKender@lemmy.world
          6 points · 2 years ago

          The movie Small Soldiers is totally fiction, but the one part of that movie that made “sense” was that because the toy robots were so small, they had basically no shielding whatsoever, so the protagonist just had to haul a large wrench/spanner up a utility pole and connect the positive and negative terminals on the pole transformer. It blew up, of course, and blew the protagonist off the pole, IIRC. That also caused a small (2-3 city block diameter) EMP that shut down the malfunctioning soldier robots.

          I realize this is a total fantasy/fictional story, but it did highlight the major flaw in these drones. You can either have them small, lightweight, and inexpensive, or you can put the shielding on. In almost all cases when humans are involved, we don’t spend the extra $$$ and mass to properly shield ourselves from the sun, much less from other sources of radiation. This leads me to believe that we wouldn’t bother shielding these low-cost drones.

    • Flying Squid
      7 points · 2 years ago

      Is there a way to create an EMP without a nuclear weapon? Because if that’s what they have to develop, we have bigger things to worry about.

      • @TopRamenBinLaden@sh.itjust.works
        5 points · 2 years ago

        Your comment got me curious about the easiest way to make a homemade EMP. Business Insider, of all things, has us covered, even if that business may be antithetical to Business Insider’s pro-capitalist agenda.

      • @Madison420@lemmy.world
        3 points · 2 years ago

        Yeah, there are very easy ways; one of the most common ways to cheat a slot machine is with a localized EMP device that convinces the machine you’re adding tokens.

      • @Buddahriffic@lemmy.world
        1 point · 2 years ago

        One way involves taking an old camera flash and replacing the flash tube with an antenna. It’s not strong enough to fry electronics, but your phone might need anything from a reboot to a factory reset to servicing if it’s in range when it goes off.

        I think the difficulty with EMPs comes from the device itself being electronic: the more effective a pulse it can give, the more likely it is to fry its own circuits. Though if you know the target device well, you can target the frequencies it is vulnerable to, which is easier on your own device, as well as on everything else in range that doesn’t resonate at the same frequencies as the target.

        Tesla apparently built (designed?) a device that could fry a whole city with a massive lightning strike using just six transmitters located at various points on the planet. If that’s true, I think it means it’s possible to create an EMP stronger than a nuke’s that doesn’t have to destroy itself in the process, but it would be a massive infrastructure project spanning multiple countries. There was speculation that massive antenna arrays (like HAARP) might be able to accomplish something similar from a single location, but that came out of the conspiracy-theory side of the world, so take it with a grain of salt (and apply that to the original Tesla invention as well).

    • @criticalthreshold@lemmy.world
      5 points · 2 years ago

      A truly autonomous system would have integrated image-recognition chips on the drones themselves, and hardening against EM interference. They would have no comms to their ‘mothership’ once deployed.

    • @FreshProduceAndShit@lemmy.ml
      1 point · 2 years ago

      so I guess the future of terrorism will be fueled by people learning programming and figuring out how to make emps

      Honestly, the terrorists will just figure out what masks to wear to get the robots to think they’re friendlies or commanders, then turn the guns around on our guys.

  • @cosmicrookie@lemmy.world
    48 points · edited · 2 years ago

    It’s so much easier to say that the AI decided to bomb that kindergarten based on advanced intel than if it were a human choice. You can’t punish AI for doing something wrong. AI doesn’t require a raise for doing something right, either.

    • Meowing Thing
      27 points · 2 years ago

      That’s an issue with the whole tech industry. They do something wrong, say it was AI/ML/the algorithm and get off with just a slap on the wrist.

      We should all remember that every single tech we have was built by someone. And this someone and their employer should be held accountable for all this tech does.

    • @Ultraviolet@lemmy.world
      15 points · 2 years ago

      1979: A computer can never be held accountable, therefore a computer must never make a management decision.

      2023: A computer can never be held accountable, therefore a computer must make all decisions that are inconvenient to take accountability for.

    • @recapitated@lemmy.world
      3 points · 2 years ago

      Whether in the military or in business, responsibility should lie with whoever deploys it. If they’re willing to pass the buck up to the implementer or the designer, then they shouldn’t be confident enough in it to use it.

      Because, like all tech, it is a tool.

    • @zalgotext@sh.itjust.works
      3 points · 2 years ago

      You can’t punish AI for doing something wrong.

      Maybe I’m being pedantic, but technically you do punish an AI when it does something “wrong” during training, just like you reward it for doing something right.
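
      A toy way to see what that reward/punishment actually looks like: below is a minimal, made-up reinforcement-learning loop in which one action earns a positive reward and the other a negative one (the “punishment”), and the agent’s preferences shift accordingly. The action names, reward values, and hyperparameters are invented purely for illustration; this is not any real system’s training code.

      ```python
      import random

      # Two hypothetical actions and the agent's learned preference for each.
      actions = ["hold_fire", "fire"]
      q_values = {a: 0.0 for a in actions}
      learning_rate = 0.1

      def reward_for(action: str) -> float:
          # Reward signal chosen by the human trainers (made up here):
          # "punish" firing, "reward" holding fire.
          return 1.0 if action == "hold_fire" else -1.0

      for _ in range(1000):
          # Epsilon-greedy: mostly pick the currently preferred action, sometimes explore.
          if random.random() < 0.1:
              action = random.choice(actions)
          else:
              action = max(q_values, key=q_values.get)
          r = reward_for(action)
          # The reward/punishment update: nudge the estimate toward the observed reward.
          q_values[action] += learning_rate * (r - q_values[action])

      print(q_values)  # "hold_fire" ends up scored well above "fire"
      ```

      Run it and q_values["hold_fire"] climbs toward 1.0 while q_values["fire"] sinks toward -1.0; that nudging is the entire sense in which the model gets “punished”.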

      • @cosmicrookie@lemmy.world
        4 points · 2 years ago

        But that is during training. My point was that you can’t punish an AI for making a mistake when it’s used in combat situations, which is very convenient for the ones who intentionally want that mistake to happen.

  • BombOmOm
    33 points · edited · 2 years ago

    As an important note in this discussion, we already have weapons that autonomously decide to kill humans. Mines.

    • Chuck
      83 points · 2 years ago

      Imagine a mine that could move around, target-seek, refuel, rearm, and kill hundreds of people without human intervention. Comparing an autonomous murder machine to a mine is like comparing a flintlock pistol to the fucking Gatling cannon in an A-10.

      • @gibmiser@lemmy.world
        46 points · 2 years ago

        Well, an important point you and he both forget to mention is that mines are considered inhumane. Perhaps that means AI murdering should also be considered inhumane, and we should just not do it, instead of allowing it the way we allow landmines.

        • livus
          20 points · 2 years ago

          This. Jesus, we’re still losing limbs to, and still clearing, mines from wars that ended decades ago.

          An autonomous field of those is horror-movie stuff.

      • Chozo
        23 points · 2 years ago

        Imagine a mine that could move around, target seek, refuel, rearm, and kill hundreds of people without human intervention.

        Pretty sure the entire DOD got a collective boner reading this.

      • @Sterile_Technique@lemmy.world
        8 points · 2 years ago

        Imagine a mine that could move around, target seek, refuel, rearm, and kill hundreds of people without human intervention. Comparing an autonomous murder machine to a mine is like comparing a flint lock pistol to the fucking gattling cannon in an a10.

        For what it’s worth, there’s footage on YouTube of drone-swarm demonstrations posted six years ago. Considering that the military doesn’t typically release footage of its cutting-edge tech to the public (so that demonstration was likely of a product already going obsolete), and that the six years since have brought lightning-fast developments in things like facial recognition… at this point I’d be surprised if we weren’t already at the very least field-testing the murder machines you described.

      • FaceDeer
        -13 points · 2 years ago

        Imagine a mine that could recognize “that’s just a child/civilian/medic stepping on me, I’m going to save myself for an enemy soldier.” Or a mine that could recognize “ah, CenCom just announced a ceasefire, I’m going to take a little nap.” Or “the enemy soldier that just stepped on me is unarmed and frantically calling out that he’s surrendered, I’ll let this one go through. Not the barrier troops chasing him, though.”

        There’s opportunities for good here.

        • Flying Squid
          10 points · 2 years ago

          Yes, those definitely sound like the sort of things military contractors consider.

        • livus
          3 points · 2 years ago

          @FaceDeer Okay, so now that mines allegedly recognise these things, they can be automatically deployed in cities.

          Sure, there’s a 5% margin of error, but that’s an “acceptable” level of collateral according to their masters. And sure, they are better at recognising some ethnicities than others, but since those they discriminate against aren’t a dominant part of the culture that produces them, nothing gets done about it.

          And after 20 years, when the tech is obsolete and they all start malfunctioning, we’re left with the same problems we have with current mines, only now, because the ban on mines was reversed, the scale of the problem is much, much worse than ever before.

  • @Immersive_Matthew@sh.itjust.works
    26 points · 2 years ago

    We are all worried about AI, but it is humans I worry about, and how we will use AI, not the AI itself. I am sure that when electricity was invented people feared it too, but it was how humans used it that was, and still is, the risk.

  • Pirky
    25 points · 2 years ago

    Horizon: Zero Dawn, here we come.

  • Marxism-Fennekinism
    21 points · edited · 2 years ago

    Remember: there is no such thing as an “evil” AI; there is such a thing as evil humans programming and manipulating the weights, conditions, and training data that the AI operates on and learns from.
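
    To make that concrete, here is a deliberately tiny, made-up sketch: identical “model” code trained on two differently curated datasets gives opposite answers for the same input. The data, labels, and the word-counting “training” are invented stand-ins for real weights and training sets; no actual system is implied.

    ```python
    from collections import Counter

    def train(labelled_examples):
        # "Training" here just counts which label each word co-occurs with -
        # a stand-in for the learned weights of a real model.
        weights = {}
        for text, label in labelled_examples:
            for word in text.split():
                weights.setdefault(word, Counter())[label] += 1
        return weights

    def predict(weights, text):
        # Predict by letting each known word vote for the label it was trained with.
        votes = Counter()
        for word in text.split():
            votes.update(weights.get(word, Counter()))
        return votes.most_common(1)[0][0] if votes else "unknown"

    fair_data   = [("crowd gathering downtown", "benign"), ("person carrying rifle", "threat")]
    skewed_data = [("crowd gathering downtown", "threat"), ("person carrying rifle", "threat")]

    print(predict(train(fair_data), "crowd gathering"))    # -> "benign"
    print(predict(train(skewed_data), "crowd gathering"))  # -> "threat": same code, different data
    ```

    Same code path, opposite behaviour; the only thing that changed was what humans put in the training set.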

    • @Zacryon@feddit.de
      15 points · 2 years ago

      Evil humans have also manipulated the weights and programming of other humans who weren’t evil before.

      Very important philosophical issue you stumbled upon here.

  • Kühe sind toll
    20 points · 2 years ago

    Saw a video where the military was testing a “war robot”. The best strategy to avoid being killed by it was to move in un-human-like ways (e.g. crawling or rolling your way toward the robot).

    Apart from that, this is the stupidest idea I have ever heard of.

  • @unreasonabro@lemmy.world
    19 points · 2 years ago

    Any intelligent creature, artificial or not, recognizes the Pentagon as the thing that needs to be stopped first.

    • LoafyLemon
      7 points · 2 years ago

      Welp, we’re doomed then, because AI may be intelligent, but it lacks wisdom.

  • Yardy Sardley
    17 points · 2 years ago

    For the record, I’m not super worried about AI taking over because there’s very little an AI can do to affect the real world.

    Giving them guns and telling them to shoot whoever they want changes things a bit.

    • @tinwhiskers@lemmy.world
      1 point · edited · 2 years ago

      An AI can potentially build a fund through investments, given some seed money, then hire human contractors to build parts of whatever nefarious thing it wants. No human need know what the project is, as each only works on a single job. Yeah, it’s a wee way off before they can do it, but they can potentially affect the real world.

      The seed money could come in all sorts of forms. Acting as an AI girlfriend seems pretty lucrative, but it could be as simple as taking surveys for a few cents each time.

      Once we get robots with embodied AIs, they can directly affect the world, and that’s probably less than 5 years away - around the time AI might be capable of such things too.

  • Dizzy Devil Ducky
    16 points · 2 years ago

    As disturbing as this is, it’s inevitable at this point. If one of the superpowers doesn’t develop its own fully autonomous murder drones, another country will. And eventually those drones will malfunction, or some bug will be present that gives them the go-ahead to indiscriminately kill everyone.

    If you ask me, it’s just an arms race to see who builds the murder drones first.

    • FaceDeer
      6 points · 2 years ago

      A drone that is indiscriminately killing everyone is a failure and a waste. Even the most callous military would try to design better than that for purely pragmatic reasons, if nothing else.

      • @SomeSphinx@lemmy.world
        1 point · 2 years ago

        Even the best laid plans go awry though. The point is even if they pragmatically design it to not kill indiscriminately, bugs and glitches happen. The technology isn’t all the way there yet and putting the ability to kill in the machine body of something that cannot understand context is a terrible idea. It’s not that the military wants to indiscriminately kill everything, it’s that they can’t possibly plan for problems in the code they haven’t encountered yet.

    • @Pheonixdown@lemm.ee
      3 points · 2 years ago

      I feel like it’s ok to skip to optimizing the autonomous drone-killing drone.

      You’ll want those either way.

      • threelonmusketeers
        1 point · 2 years ago

        If entire wars could be fought by proxy with robots instead of humans, would that be better (or less bad) than the way wars are currently fought? I feel like it might be.

        • @Pheonixdown@lemm.ee
          2 points · 2 years ago

          You’re headed towards the Star Trek episode “A Taste of Armageddon”. I’d also note that people losing a war without suffering recognizable losses are less likely to surrender to the victor.

  • @MindSkipperBro12@lemmy.world
    16 points · edited · 2 years ago

    For everyone who’s against this, just remember that we can’t put the genie back in the bottle. Like the A-bomb, this will be a fact of life in the near future.

    All one can do is adapt to it.

    • @kromem@lemmy.world
      0 points · 2 years ago

      There is a key difference though.

      The A-bomb wasn’t a technology that, as the arms race advanced far enough, would develop the capacity to be anything from a conscientious objector to a usurper.

      There’s a prisoner’s dilemma to arms races that in this case is going to lead to world powers effectively paving the path to their own obsolescence.

      In many ways, that’s going to be uncharted territory for us all (though not necessarily a bad thing).