It's worth pointing out here that this script was probably written by a human.
Edit: reporting now indicates that it was human written https://arstechnica.com/ai/2024/01/george-carlins-heirs-sue-comedy-podcast-over-ai-generated-impression/
“This case is not just about AI, it’s about the humans that use AI to violate the law, infringe on intellectual property rights, and flout common decency.”
Well put.
Eh…. I don’t know that I can agree with this.
I understand the intent behind it, but this specific instance is legitimately parallel to impersonation or satire. Hear me out.
They are impersonating his voice, using new content in his style, and making no claim to be the genuine article.
So this comes down to “this is in bad taste” which, while I can understand and might even agree with… isn’t illegal.
The only novel concept in this is that “scary tech” was used. There was no fraud, no IP violation, and no defamation. Where is the legal standing?
If it's wrong to use AI to put genitals in someone's mouth, it should probably be wrong to use AI to put words in their mouth as well.
Damn.
snaps
I agree, and I get that it's a funny way to put it, but in this case they started the video with a massive disclaimer that they were not Carlin and that it was AI. So it's hard to argue they were putting things in his mouth. If anything, it sets a praiseworthy standard for disclosing when AI was involved, considering the hate mob that such a disclosure attracts.
The internet doesn’t care, though. If I make fake pictures of people using their likeness and add a disclaimer, people will just repost them without the disclaimer and they will still do damage. Now, whether or not we can or should stop them is another story.
Completely true. But we cannot reasonably push the responsibility of the entire internet onto someone when they did their due diligence.
Like, some people post CoD footage to YouTube because it looks cool, and someone else, either mistakenly or maliciously, takes that and recontextualizes it as combat footage from an active warzone to shock people. Then people start reposting that footage with fake explanatory text on top of it, furthering the misinformation cycle. Do we now blame the people sharing their CoD footage for what other people did with it? Misinformation and propaganda are something society must work together to combat.
If it really mattered, people would be out there warning others that the pictures being posted are fake. In fact, that's what happened even before AI, whenever a tragedy struck: people would post images claiming to be of what happened, only for them to later be confirmed as being from some other tragedy. Or how some video games have fake leaks because someone rebranded fanmade content as a leak.
Eventually it becomes common knowledge or easy to prove as being fake. Take this picture for instance:
It's been well documented that the bottom image is fake, and as such anyone can now find out what was covered up. It's up to society to speak up when the damage is too great.
I'm torn. I can see why they would be upset. And they may have a case with likeness rights.
But at the same time, this specific example isn't trying to claim any kind of authenticity. It goes out of its way to explain that it's not George. It seems clearly to be along the lines of satire, no different than an impersonator in an SNL-type sketch.
I guess I don't have any real problem with clearly fake AI versions of things. My only real problem would be with actual fraud. Like the AI Biden making calls trying to convince people not to vote in a primary. That's clearly criminal fraud, and an actual problem.
My only real problem would be with actual fraud. Like the AI Biden making calls trying to convince people not to vote in a primary.
That's the difference between impression and impersonation. My disappointment in the Lemmy community for not understanding the difference is immeasurable. We're supposed to be better than this but really we're no better than Reddit, running with ragebait headlines for the cheap dopamine hit that is the big upvote number.
If it were a human doing a Carlin impression, literally NOBODY would give a fuck about this video.
Internet: this is awful; of course your heirs own your image as stewards.
Also Internet: I have a right to take pictures of you, your car, your house, or record you without consent. Edit it however I want. Make as much money as I want from the activity, and you have no rights, since if technology allows me to do something, you have no expectation that I won't.
We are demanding that a public figure who is dead have more rights than a private person who is alive.
What's the alleged crime? Comedy impersonation isn't illegal. And the special had numerous disclaimers that it was an impersonation of Carlin.
Sounds like a money grab by the estate, which Carlin himself probably would have railed on.
What do you mean by "comedy impersonation" - parody, or just copying a comedian?
If I were to set up a music show with a Madonna impersonator and slightly changed Madonna songs (or songs in her style), I'd get my pants sued off.
If Al Yankovic does a parody of a Madonna song, he's in the clear (He does ask for permission, but that's a courtesy and isn't legally mandatory).
The legal term is "transformative use". Parody, like SNL having Alec Baldwin impersonate Trump, is a recognized type of transformative use. Baldwin doesn't straight up impersonate Trump; he does so in a comedic fashion (the impersonation itself is funny, regardless of how funny Trump is). The same logic applies when parodying or impersonating a comedian.
If I were to set up a music show with a Madonna impersonator and slightly changed Madonna songs (or songs in her style), I'd get my pants sued off.
Drag shows do stuff like this all the time with zero issue. Artistic freedom is a thing.
How is the AI impersonation of Carlin different from when Paramount used actors who looked like Queen Elizabeth or Barbara Bush, or human impersonators who sound just like the real person they’re impersonating (besides the obvious difference)?
I’m not saying Dudesy is in the right. Making an AI system sound like someone somehow feels different than an impersonator doing the same thing. But I don’t know why I feel that way, as they’re extremely similar cases.
I think your Madonna example is completely fine as long as they don't call themselves Madonna and start uploading videos on YouTube with her name on them (as is the case here).
Madonna owns her name and trademark but not her tone of voice, style of songs or her wardrobe choices.
In the same way, The George Carlin estate doesn't own his speech mannerism or comedic style but they certainly own his name.
I've been thinking about this a lot, and if you frame it as them selling a stolen product, it can be seen differently.
Say I take several Mega Man games, copy all the assets, and recombine them into a new Mega Man game called "Unreal Tales of MegaMan". The game has whole new levels inspired by Capcom's Mega Man. Many would argue that the work is transformative.
Am I allowed to sell that Mega Man game? I'm not a legal expert, but I think the answer would generally be no. My intention here is to mimic a property and profit off of a brand I do not own the rights to.
Generative AI uses samples of original content to create derivative works, such as synthesized actors' voices. The intention of this special's creator is to make content from a brand that they alone can profit from.
If you used an AI to generate a voice like George Carlin's to voice the Reptilian Pope in your video game, I think you would have a different situation. It's because they synthesized the voice, called it George Carlin, and sold it as a "New Comedy Special" that it begins to fall into the category of bootleg.
You couldn't sell that game, even if you created your own assets, because Mega Man is a trademarked character. You could make a game inspired by Mega Man, but if you use any characters or locations from Mega Man, you would be violating their trademark.
AI, celebrity likeness, and trademark are all new territory, and the courts are still sorting out how corporations are allowed to use a celebrity's voice and face without their consent. Last year, Tom Hanks sued a company that used an AI-generated version of him for an ad, but I think it's still in court. How the courts rule on cases like this will probably determine how you can use AI-generated voices like in your Reptilian Pope example (though in that case, I'd be more worried about a lawsuit from Futurama).
This lawsuit is a little different, though; they're sidestepping the issue of likeness and claiming that the AI is stealing from Carlin's works themselves, which are under copyright. It's more similar to the class-action lawsuit against ChatGPT, where authors are suing because the chatbot was fed their works to create derivative works without their consent. That case also hasn't been resolved yet.
Edit: Sorry, I also realized I explained trademark and copyright very poorly. You can't make a Mega Man game because Mega Man, as a name, is trademarked. You could make a game that has nothing to do with the Mega Man franchise, but if you called it Mega Man you would violate the trademark. The contents of the game (levels, music, and characters) are under copyright. If you used the designs of any of those characters but changed the names, that would violate copyright.
Celebrity likeness is not new territory.
Crispin Glover successfully sued the filmmakers of Back to the Future 2 for using his likeness without permission. Even with dead celebrities, you need permission from their estate in order to use their likeness.
I'll take Lawyers Maximizing Billable Hours for $500, Alex
Ripped it from YouTube last night to add to my media server; curiously, it's no longer available on YouTube this morning... (at least the original Dudesy upload I'd grabbed; there are re-uploads)
I refuse to watch it - I love the original guy's stuff, so it wouldn't feel right.
However, is it any good?
Just finished it:
It's an interesting piece. I'm not sure I'd pay to watch it or any other AI comedy specials (didn't even watch it via YouTube to avoid ad revenue), but given free access I wanted to at least see what's up.
It both starts and ends with very clear disclaimers that this is not George Carlin but an AI impersonation of him. The voice is pretty close, but not quite right, though it matches his cadence quite well. Even without the disclaimers, it's pretty obvious to me it's not actually George Carlin.
The majority of the video is clearly AI generated art to match the current topic, mostly stills with a handful of short sections of AI people mouthing the words. I'm fairly sure the script and art were curated by a human, along with the overall editing of the special.
Quite a bit of highly political comedy in a very similar style to Carlin's, but it definitely doesn't hold a candle to his original/genuine work. It also discusses what he/it is, some of the controversy around its existence, and the possible future of AI use across all professions, but mainly standup comedy and similar roles (like talk show hosts and news anchors, for example).
Worth a watch, if you can keep an open mind and recognize there's a difference between the original and an artistic representation of him. I don't think the tools used change that, especially with it clearly stated as being an impersonation.
Ah... how can I get that file? I haven't seen it.
It's called 'George Carlin: I'm Glad I'm Dead'. Have a look around; the original upload was removed, but there are others.