Executive Order 2023.02 (henceforth "NIGHTMARE WORLD WHERE FACT AND FICTION COEXIST")

So the company that makes this has realised that allowing people to create convincing fakes of almost anybody saying anything might just have fucked the world over. Their solution is a paywall and a verification tool, because that will solve everything.


Like, what the fuck did they expect???
"You can totally make something incriminating using our platform but if you tell a tranny she isn't a woman using an AI voice, we will hunt you down."
 
The way I see it, it'll just be common knowledge that audio can be easily fabricated, and so it won't be taken seriously as evidence. People are able to make convincing Photoshops now that weren't possible decades ago, and that tricks people every now and then, but it doesn't have massive repercussions because people are aware that the tool exists.
I also wouldn't be surprised if there are people developing tools to detect AI-generated audio at a similar pace to the people developing tools to create it. But my guess is that arms race will end in favor of AI stuff becoming more and more believable.

The problem will be that genuine audio and video evidence can no longer be trusted, rather than that deepfakes will doom people.
I agree that there will likely be an arms race between AI development and tools to detect AI forgery, but I think the real harm comes from human error. There's no guarantee that AI-generated content can be caught in time before it does real damage to someone's reputation or to the verdict in a court case. I think as the technology gets better, there will be more innovative ways it can be used to impersonate other people.
EDIT:
Here's a tip: there are many cases where clear video footage of you committing a crime still isn't sufficient evidence to convict. Actually investigating the alleged crime would more than likely put holes in the prosecutor's case.
There's definitely truth to this too, but again, I think it comes down to the potential for abuse, the growth of the technology, and inevitable errors in judgement. OJ Simpson and Casey Anthony both got away with murder even with substantial physical proof that they did it; it still wasn't enough for a guilty verdict. It's possible that audio evidence will simply be thrown out, but even genuine audio could become a problem if a clever lawyer can create doubt in the jury about whether or not it's authentic. Audio forgeries have existed in the past, but with AI technology it will be harder and harder to tell by ear whether a recording is real. Maybe we'll develop technology that can tell, but right now we don't really have a defense for it as far as I know. I won't pretend to know enough about AI to say whether or not it's possible to create an "AI checker" program, but I hope I'm wrong and we can have this great technology without the fear of it being used to falsely incriminate people.
 
Just made something for the DALL-E thread. You gonna be wanting an indication for every single pic there from now on?


 
I've always had a problem with being very sceptical and distrusting of any and all information, even video evidence. It's absolutely terrifying knowing I can never truly trust something I see or hear ever again. I just wanna go innawoods. The sounds of birds and squirrels are real.
 
This technology is going to be legislated to hell and back pretty soon. Good on you for getting ahead of it; the last thing we need is a reason for the site to no longer be US-legal.

How would you even legislate this?

It's only a matter of time before this can be done on a private system, and you won't be able to trace the origins of audio created that way without essentially upending the internet as it currently exists.
 
how? it's already created and deployed, nobody can put that genie back in the bottle.
like laws against murder, the idea would be to both mitigate it and create accountability for it. When Cain murdered Abel, the future was here and we couldn't put murder back in the bottle. Murder laws aren't there to eliminate all murder, just to limit it as much as possible. Same with future legislation against synthetic content, except the idea wouldn't be to limit its creation, just to limit the harm it can cause
 
Potentially unpopular opinion: I hope all this AI, ML, deepfake stuff gets outlawed completely.
While it seems fun to mess around with, I see no applicable use of this technology that isn’t outright malicious.
I do definitely agree with the sentiment, because we are stepping into a pretty fucking scary future, but...

This Pandora's box has already been opened; there is no way to shut it. Elevenlabs may be the main player right now, but the technology has already come this far. No matter how hard you crack down on it, it exists and it always will.

Unironically only a massive solar flare that fries absolutely all of our technology in one fell swoop can stop it. Even then, faraday cages are a thing.
 