Blood Music was way cooler than this, just saying.

  • bitofhope@awful.systems
    17 points · 1 year ago

    > Small detail: biological viruses are not even remotely similar to computer “viruses”.

    > that’s where the LLM comes in! oh my god check your reading comprehension

    Uh-huh, and an LLM trained on video game source code and clothing patterns can invent real life Gauntlets of Dexterity.

    Why exactly is he so convinced LLMs are indistinguishable from magic? In the reality where I live, LLMs can sometimes produce a correct function on their own, but they can’t reliably transpile code even for well-specified and well-understood systems, let alone do comic book mad scientist ass arbitrary code execution on viral DNA. Honestly, they’re hardly capable of doing anything reliably.

    Along with the AI compiler story he inflicted on Xitter recently, I think he’s simply confused LLM and LLVM.

    • Soyweiser@awful.systems
      20 points · 1 year ago

      For decades he has built a belief system where high intelligence is basically magic. That belief is needed to power his fears of AGI turning everything into paperclips, and it has become such a load-bearing belief (one of the reasons for it is a fear of death and grief over people he lost, so not totally weird) that he has added other assumptions on top of it. For example, we know that computers today are pretty limited by matter; especially the higher-end ones need all kinds of metals which must be mined, etc. That is why he switches his fears to biology, as biology is ‘cheap’, ‘easy’ and ‘everywhere’. The patterns in his reasoning are not that hard to grok. That is also why he thinks LLMs (which clearly are now at the start of their development, not the end, it is like the early internet! (personally I think we are mostly at the end and we will just see a few relatively minor improvements but no big revolutionary leap)) will lead to AGI; on some level he needs this.

      Men will nuke datacenters before going to therapy for grief and their mid life crisis.

      • 200fifty@awful.systems
        13 points · 1 year ago

        What I don’t get is, ok, even granting the insane Eliezer assumption that LLMs can become arbitrarily smart and learn to reverse hash functions or whatever because it helps them predict the next word sometimes… humans don’t entirely understand biology ourselves! How is the LLM going to acquire the knowledge of biology to know how to do things humans can’t do when it doesn’t have access to the physical world, only things humans have written about it?

        Even if it is using its godly intelligence to predict the next word, wouldn’t it only be able to predict the next word as it relates to things that have already been discovered through experiment? What’s his proposed mechanism for it to suddenly start deriving all of biology from first principles?

        I guess maybe he thinks all of biology is “in” the DNA and it’s just a matter of simulating the ‘compilation’ process with enough fidelity to have a 100% accurate understanding of biology, but that just reveals how little he actually understands the field. Like, come on dude, that’s such a common tech nerd misunderstanding of biology that xkcd made fun of it, get better material

        • Evinceo@awful.systems
          5 points · 1 year ago

          > What’s his proposed mechanism for it to suddenly start deriving all of biology from first principles?

          He considers deriving things from first principles to be much more versatile than it actually is. That, and he really believes in the possibility of using simulations for anything.