• whereisk@lemmy.world
    11 months ago

    Can’t wait for the wave of lawsuits after the AI hallucinates lethal advice, then insists it’s right.

    • ArtVandelay@lemmy.world
      11 months ago

      Reminds me of an AI that was programmed to play Tetris and survive for as long as possible, so the machine simply paused the game. Except in this case, it might decide the easiest way to end your suffering is to kill you, so slightly different stakes.

    • HerrBeter@lemmy.world
      11 months ago

      They did a trial test in Sweden, but the LLM told a patient to take an ibuprofen and a chill pill. The patient had a hard time breathing, pressure over the chest, and some other symptoms I can’t remember.

      A nurse overseeing the convo stepped in and told the patient to immediately call the equivalent of 911.