• @[email protected]
    link
    fedilink
    982 months ago

    Testing armed robot dogs in the Middle East instead of the US is pretty telling.

    Can’t be accidentally murdering Americans with a software glitch.

          • @[email protected]
            link
            fedilink
            4
            edit-2
            2 months ago

            Don’t worry, no danger of killing real people in the Middle East. All the “collateral damage” will be brown people, not Americans. They’ll have all the kinks ironed out and will make sure that the AI doesn’t hurt white targets before the technology is distributed to every national police district.

            I wish this post even deserved a /s.

    • @[email protected]
      link
      fedilink
      142 months ago

      Which is wild when you add perspective using facts like the police in the US being less disciplined than troops overseas, and the US still using substances banned by the Geneva Convention on its civilian population. So if even the US won’t test it on its own people, it’s bad.

      • Jojo, Lady of the West · 8 · 2 months ago

        Listen, the Geneva convention only specifies what we can’t use on enemies, okay? As long as the targets are technically friendlies, it’s fair game!

        • @[email protected]
          link
          fedilink
          English
          42 months ago

          The GC is for war, and soldiers are combatants, not criminals, by default (though switching can happen easily). As an example, hollow-point rounds against criminals are okay, as they can protect surrounding bystanders.

          It’s a bit weird, but for countries war is different from domestic problems.

    • @[email protected]
      link
      fedilink
      262 months ago

      Oh it was already tremendously fucked. This is just gravy on top.

      Fuckin killbots. Coming soon to the 1033 program and thus, your local police department. The Boston Dynamics: Wardog!

    • @[email protected]
      link
      fedilink
      172 months ago

      We should never have moved away from sticks and stones tbh. Anything that works at long range makes people misunderstand what war is. War needs to look disgusting, because the cleaner and more automated it looks, the less horrible it looks to the people spectating it. But it is indeed just as horrible as beating someone to death with a rock.

      • Flying Squid · 21 · 2 months ago

        I mean, I’d rather not be hunted down by an AI robot dog, but you do you.

        • @[email protected]
          link
          fedilink
          142 months ago

          It’s happening anyway. We build them. Others build them in response because they have to. The sophistication of killbots will increase. Terrorists will get hold of them eventually. They’ll be hacked and turned on their handlers and/or civilians.

          All this is on top of ever increasing climate catastrophe. Look at Appalachia. The topography of those mountains was just rewritten. Whole towns erased like they were never there.

          • Flying Squid · 4 · 2 months ago

            That’s not a reason for me to want it to happen, which was your original post’s suggestion.

            • @[email protected]
              link
              fedilink
              22 months ago

              My first post was about letting the army fuck around and find out. Let the natural course of events remind them of those sci-fi movies they forgot about.

                • @[email protected]
                  link
                  fedilink
                  -12 months ago

                  Thousands at least. The more effective the killbots are the more money our war economy will throw at warbot R&D.

                  This is happening. Nothing on this planet can stop it.

    • @[email protected]
      link
      fedilink
      42 months ago

      I remember some kinda skit about sci-fi authors writing about how bad a torture matrix would be, and that ironically inspiring real people to create the torture matrix because it’s “the future.”

    • @[email protected]
      link
      fedilink
      82 months ago

      Well you see, the owners know you won’t die for them anymore, but now they’re able to take you out of the equation. Don’t even need poors to conquer the world. It’s really a great deal for them.

      • @[email protected]
        link
        fedilink
        32 months ago

        Roston Bynamics was found to actually be Boston Dynamics with some 100mph tape slapped over the logo.

      • @[email protected]
        link
        fedilink
        12 months ago

        I dunno, I’m subscribed to the BD YouTube channel, and the very sudden change in facilities and upgrades to the bots seems a little too in line with this. Like someone definitely caved, in my opinion.

  • @[email protected]
    link
    fedilink
    192 months ago

    Without reading the article can I take a wild guess and say this is from “we promise never to make weaponized robots” Boston Dynamics?

    A promise from a corporation is just a lie by another name.

  • @[email protected]
    link
    fedilink
    192 months ago

    Don’t worry, first they test it where civilian lives don’t matter, and once it passes some basic tests, it will become available for domestic (ab)use.

  • @[email protected]
    link
    fedilink
    112 months ago

    So if a robot commits a war crime, they can just blame it on AI and call it a day, right? Sounds like an easy way to do whatever the fuck you want.

  • @[email protected]
    link
    fedilink
    10
    edit-2
    2 months ago

    Is this their way of exterminating civilian populations like the Palestinians without dropping bombs and contributing so significantly to climate change?

    “The US military has been adopting a new climate friendly mindset and approach to international conflict. With this invention we can help our genocidal colonies acquire more land with little to no carbon emissions. We plan to be carbon-neutral by 2050, provided no one retaliates and attacks back.”