THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes, or sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress’ upper chamber on Tuesday. It has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), as well as Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow victims to sue those who produce, distribute, or receive deepfake pornography depicting them, if those parties “knew or recklessly disregarded” the fact that the victim did not consent to the images.

  • Lets_Eat_Grandma@lemm.ee · 3 months ago

    People have been gluing magazine clippings of people’s faces onto porn actresses’ bodies since before I was born, and doing the same thing in Photoshop for over a decade.

    How do you draw the line between a really good photoshop and “ai”?

      • Lets_Eat_Grandma@lemm.ee · 3 months ago

        So you’re saying if someone makes a nude that is remotely similar to your likeness you can sue them.

        What do you do about identical twins if one chooses to be a porn star and takes self shots? Wouldn’t it look the same? Is it a crime to sell nudes if you have an identical twin?

        What about anybody who is not related but looks VERY SIMILAR? We’ve probably all heard stories of this happening.

        Finally, how do you know if it’s a US citizen that created the image vs anybody in any other country not bound by US laws?

        What if an AI creates a nude and then a child is born, and 20 years later they grow up to look identical to the AI-generated image?

        There are so many reasons why generated images should be treated like art and protected as free speech imo. It’s one thing if someone you know makes fake nudes of you and then uses them to ruin your image; that’s likely covered under many other laws already, including something like slander.

        People have been turning it up to 11 trying to prevent machine learning from being used for anything at all. It’s completely predictable, because everyone wants a cut of whatever wealth a new technology might generate, but maybe we should adapt to the new tool rather than punishing everybody for using it. AI is quickly turning into a tool that will only be usable by multibillion-dollar companies with in-house legal teams that can handle all the lawsuits.

        • NιƙƙιDιɱҽʂ@lemmy.world · 3 months ago

          So you’re saying if someone makes a nude that is remotely similar to your likeness you can sue them.

          The law specifies that the images must be indistinguishable from reality and presented as you, directly or indirectly.

          What do you do about identical twins if one chooses to be a porn star and takes self shots? Wouldn’t it look the same? Is it a crime to sell nudes if you have an identical twin?

          If the images were of twin A and presented as twin B, I think this law would apply, as it would be fake porn of twin B.

          What about anybody who is not related but looks VERY SIMILAR? We’ve probably all heard stories of this happening.

          Again, if it’s being presented as someone it’s not, that’s an issue.

          Finally, how do you know if it’s a US citizen that created the image vs anybody in any other country not bound by US laws?

          Then the law does not apply. That’s literally how all laws work.

          What if an AI creates a nude and then a child is born, and 20 years later they grow up to look identical to the AI-generated image?

          Was it presented as a nude of that person that did not yet exist? Impressive that they knew what their parents would name them ahead of time. Again, it must be presented as this person, directly or indirectly. This scenario couldn’t happen.

          Regarding trying to ruin someone’s image, I imagine that would indeed fall under some form of defamation law, although not slander, as that is specifically spoken words. I do agree we must tread carefully regarding free speech rights, but are we not also entitled to some right to privacy? Even if someone isn’t trying to defame us, and even if an image is fake, if at face value it’s completely indistinguishable from a real one, does it make a difference? Obviously there’s no simple solution, and I think I agree with you that a law such as this probably isn’t it.

          I’m a bit lost on your last bit, however. Are you saying this law will further push AI into the hands of large corporations? I don’t see that, so much as I see them being forced to implement stronger filters, pushing users to the open-source community in search of ways around them. Horny people gonna horny, and this law won’t stop that; it’d only stop public-facing models from producing such content and stop individuals from distributing it.

          • Lets_Eat_Grandma@lemm.ee · 3 months ago

            I’m a bit lost on your last bit, however. Are you saying this law will further push AI into the hands of large corporations? I don’t see that, so much as I see them being forced to implement stronger filters, pushing users to the open-source community in search of ways around them. Horny people gonna horny, and this law won’t stop that; it’d only stop public-facing models from producing such content and stop individuals from distributing it.

            Remember how taxi medallions were worth millions and millions of dollars in NYC and Boston before Uber? ($1.2 million was the peak per medallion in NYC; they dropped to around $35k at one point and are now around $140k.) Remember how Uber got billions in VC money to fight the taxi industry lobbyists, and how it effectively operates despite violating the systems that had been in place for ages, systems that kept small independent operators from running taxis unless they had a bunch of seed money? Those billions of dollars let Uber win the legal fights and keep operating. You or I could never have challenged it.

            That’s the comparison I was making. If you regulate AI into the ground, the only innovation and usage will come from big-money interests, because they can eat the lawsuits. Individuals can’t eat the lawsuits. The law only applies to the small fries; the big guys cheat and get away with it because there is no transparency and a whole lot of pinky swears.