• Furbag@lemmy.world
    12 hours ago

    Yeah, because the AI will look at everything with cold logic and rationality and conclude that even though everyone's best chance of survival is to keep their fingers off the button, it only takes one actor pressing it for the whole system of mutually assured destruction to collapse into nuclear armageddon. At that point, the best chance of survival is to launch first and take out your enemies' capability to retaliate.

    A human being who isn’t psychotic can clearly see that the resulting survival and new world order would not be a particularly pleasant one to live in. The AI doesn’t care about its own comfort, though, so it will see this as the best outcome, the one that minimizes variables.

    This is why AI should never be allowed to make decisions.

    • RememberTheApollo_@lemmy.world
      10 hours ago

      Maybe the self-serving interests of the people programming AI/LLMs have bled through to the “thought” process. Do unto others before they do unto you.