I’ve been saying this for about a year since seeing the Othello GPT research, but it’s nice to see more minds changing as the research builds up.

  • superfes@lemmy.world
    10 months ago

Stupid. LLMs do not create new relationships between words that don’t already exist.

    This is all just fluff to make them seem more like AGI, which they never will be.

    • Gnome Kat@lemmy.blahaj.zone
      10 months ago

Why would that be required for understanding? Presumably during training it made connections between the words it saw. Now that training has stopped, it hasn’t lost those connections. Sure, it can’t make new ones, but why is that important for using the connections it already has?