• 7 Posts
  • 101 Comments
Joined 1 year ago
Cake day: June 13, 2023

  • Limeey@lemmy.world to Science Memes@mander.xyz · Huh
    79 upvotes, 4 downvotes · 6 months ago

    It all comes down to the fact that LLMs are not AGI - they have no clue what they're saying, or why, or to whom. They have no concept of "context," and as a result no way to "know" whether they're giving correct info or just hallucinating.
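
    To make that concrete, here's a toy Python sketch (not any real model's API - `fake_logits` and the four-word vocab are made up for illustration): generation is just sampling from a next-token probability distribution, and nothing in the loop scores the output for truth.

    ```python
    import numpy as np

    # Toy sketch (not a real LLM): next-token prediction is sampling
    # from a probability distribution over a vocabulary. Nothing here
    # checks whether the emitted text is true - the model only encodes
    # which token is *likely* to come next, not which answer is correct.

    rng = np.random.default_rng(0)
    vocab = ["Paris", "London", "Rome", "Berlin"]

    def fake_logits(prompt: str) -> np.ndarray:
        # Stand-in for a trained model's output: scores shaped by
        # training-data statistics, with no fact-checking mechanism.
        return rng.normal(size=len(vocab))

    def sample_next(prompt: str, temperature: float = 1.0) -> str:
        logits = fake_logits(prompt) / temperature
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        # A plausible-sounding wrong answer (a hallucination) is
        # sampled exactly the same way a right one is.
        return rng.choice(vocab, p=probs)

    print(sample_next("The capital of France is"))
    ```

    A confidently wrong completion and a correct one come out of the same sampling step; there's no separate "am I right?" signal for the model to consult.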