Taylor Swift AI images prompt US bill to tackle nonconsensual, sexual deepfakes::Bipartisan measure introduced in the US Senate will allow victims of ‘digital forgeries’ to seek civil penalties against perpetrators

  • abhibeckert@lemmy.world · 10 months ago

    People are still going to do it regardless.

    Would they? Last year a woman was awarded $1.2b in damages after her ex-boyfriend distributed revenge porn.

    How many people would hit that retweet button, if they knew it might lead to damages on that scale? Presumably her ex-boyfriend went bankrupt and lost everything he owned, having to give all of it to her (and her lawyers).

    Sure, some people would still take that risk but not very many. And they’d only do it once.

    • TheGrandNagus@lemmy.world · 10 months ago

      Would they?

      Yes.

      People break copyright and other IP laws all the time, for example.

      Shit, torrenting a film carries a 10 year max prison sentence where I am. It doesn’t stop anybody.

      Speeding fines can be absolutely huge. People still speed. Etc.

      A law like this is virtually impossible to enforce, and the crime in question is getting easier and easier to trivially commit, so the law likely won’t do much.

      And btw, the case you linked is a hell of a lot more than someone retweeting or upvoting a deepfake.

      It covers someone constantly uploading porn of a partner and blackmailing them (even days before the court case), impersonating her online, doxxing her, and sending porn of her to her family members.

      It also covers him illegally using her bank account to pay his bills and using her name and information to apply for loans in her name.

      That case is a very, very, very, very different situation to someone making a Taylor Swift deepfake.

      So different that it calls into question whether you even read past the headline.