• @[email protected]

    Testing armed robot dogs in the Middle East instead of the US is pretty telling.

    Can’t be accidentally murdering Americans with a software glitch.

          • @[email protected]

            Don’t worry, no danger of killing real people in the Middle East. All the “collateral damage” will be brown people, not Americans. They’ll have all the kinks ironed out and will make sure that the AI doesn’t hurt white targets before the technology is distributed to every national police district.

            I wish this post even deserved a /s.

    • @[email protected]

      Which is wild when you put it in perspective: police in the US are less disciplined than troops overseas, and the US still uses substances banned by the Geneva Convention on its own civilian population. So if even the US won’t test it on their own people, it’s bad.

      • Jojo, Lady of the West

        Listen, the Geneva Convention only specifies what we can’t use on enemies, okay? As long as the targets are technically friendlies, it’s fair game!

        • @[email protected]

          The Geneva Conventions are for war, and soldiers are combatants, not criminals, by default (though switching between the two can happen easily). As an example, hollow points are okay against criminals because they can protect surrounding bystanders.

          It’s a bit weird, but for countries, war is different from domestic problems.