• trevor@lemmy.blahaj.zone · 11 months ago

    I hate as much as anyone that LLMs seem to be positioned to replace real artists (in specific, corporate niches), but trying to poison the LLMs themselves just seems completely ineffective in the face of regular snapshots and QA, right?

    Maybe I’m wrong, but I assume any company trying to sell its LLM would surely be taking weekly (or even daily) snapshots of it and testing them for degradation. If so, stuff like this seems pretty futile.
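
    A minimal sketch of the kind of snapshot-and-regression check being assumed here; the snapshot names, scores, and tolerance are made up for illustration and don’t describe any vendor’s actual QA pipeline:

    ```python
    # Illustrative regression check between model snapshots.
    # All names, scores, and thresholds are hypothetical.

    from dataclasses import dataclass


    @dataclass
    class Snapshot:
        name: str
        benchmark_score: float  # e.g. mean accuracy on a fixed, held-out eval set


    def degraded(last_good: Snapshot, candidate: Snapshot, tolerance: float = 0.02) -> bool:
        """Flag the candidate if its score drops more than `tolerance` below the last good snapshot."""
        return candidate.benchmark_score < last_good.benchmark_score - tolerance


    last_good = Snapshot("2024-05-01", benchmark_score=0.84)
    this_week = Snapshot("2024-05-08", benchmark_score=0.79)

    if degraded(last_good, this_week):
        # Poisoned or otherwise bad training data shows up as a score drop;
        # the vendor rolls back and audits what went into this run.
        print(f"Regression: roll back to snapshot {last_good.name} and audit the new training data")
    else:
        last_good = this_week  # promote the candidate snapshot
    ```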

    • MBM@lemmings.world · 11 months ago

      It means that artists can make sure their art can’t be used for generative AI unless they specifically send over the ‘clean’ version.
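
      A toy sketch of the workflow being described: the clean original stays with the artist and only a perturbed copy gets published. The random noise below is just a stand-in (real poisoning tools compute targeted adversarial perturbations), and the filenames are hypothetical:

      ```python
      # Toy illustration only: publish a perturbed copy, keep the clean file private.
      # Real tools compute targeted perturbations; this just adds random noise as a placeholder.

      import numpy as np
      from PIL import Image


      def publish_perturbed(clean_path: str, public_path: str, strength: float = 4.0) -> None:
          """Write a lightly perturbed copy for public posting; the clean original never leaves the artist's machine."""
          img = np.asarray(Image.open(clean_path).convert("RGB"), dtype=np.float32)
          noise = np.random.uniform(-strength, strength, size=img.shape)  # placeholder perturbation
          perturbed = np.clip(img + noise, 0, 255).astype(np.uint8)
          Image.fromarray(perturbed).save(public_path)


      # Hypothetical filenames: only 'artwork_public.png' ever gets uploaded.
      publish_perturbed("artwork_clean.png", "artwork_public.png")
      ```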