• dohpaz42@lemmy.world · 22 days ago

    Can we normalize not calling them hallucinations? They’re not hallucinations. They are fabrications; lies. We should not be romanticizing a robot lying to us.

  • Imgonnatrythis@sh.itjust.works · 22 days ago

      Pretty ingrained vocabulary at this point. “Lies” implies intent; I would have preferred “errors.”

      Also, for the record, this is the most dystopian headline I’ve come across to date.

    • dohpaz42@lemmy.world · 22 days ago

        If a human does not know an answer to a question, yet they make some shit up instead of saying “I don’t know”, what would you call that?

      • ramirezmike@programming.dev · 22 days ago

          That’s a lie. They knowingly made something up. The AI doesn’t know what it’s saying, so it’s not lying. “Hallucinating” isn’t a perfect word, but it’s much more accurate than “lying.”

        • 5too@lemmy.world · 22 days ago

            “Bullshitting” is what I’ve been calling it. Not as a pejorative term, just descriptive: it has no concept of truth or not-truth, it just tells good-sounding stories. It’s a bullshit engine.