Coalition for Content Provenance and Authenticity (C2PA) - The Final Solution to the AI Question

Ok, but what's to stop me from... stripping this data from the file?
Reading over the technical specs, it's essentially running a hash on the image, using your private key to encrypt the hash and including a public key in the metadata.

If Instagram or Twitter or whatever enforced making sure you had a public key in the image metadata you wouldn't be able to upload anything.
 
I just won't follow it lol, not like I needed these large companies anyway. ChatGPT has been lobotomized to the point that it won't properly look at info or quote books that were in its training set (it used to do so properly, with citations; The Collapse of American Criminal Justice by William Stuntz is one example) unless you forcefully tell it that it knows the information and jump through all of these workarounds. Image generation through a website service takes forever, is subpar, and already has a ton of censorship.

So how are they supposed to stop me? My 3050 I got for $200 on amazon can run a local uncensored LLM with decent response times, it can generate small images in less than 8 seconds or pretty large ones before upscaling in like a minute.
Unless they push to make the actual training illegal (which they very well might), there's nothing they can do to stop the average guy, who can simply buy a standard GPU.
 
the AI shit is a red herring for more copyright bullshit. They will push the dangers of AI to create more copyright, even though these things already apply to Photoshop and social media. It's once again the same THINK OF THE ARTISTS bullshit, even though UMG treats artists like dog shit and barely pays them. Fuck this bill. AI FTW
 
Big tech and their army of furries and tranny programmer socks keep making horrible technologies that ruin the world, and every option going forward is just another form of totalitarianism that fails at fixing it and still ruins the world.

I fucking hate all of it. Fuck AI, fuck social media, fuck Microsoft, fuck all this shit.
 
Reading over the technical specs, it's essentially running a hash on the image, using your private key to encrypt the hash and including a public key in the metadata.

If Instagram or Twitter or whatever enforced making sure you had a public key in the image metadata you wouldn't be able to upload anything.
Can you explain this more? I don't see how that does anything, and I want to know if this is as dumb as it sounds or if I'm simply retarded.
 
Bollocks. Flash was a security nightmare. It was an easily-exploited attack vector for endless viruses and malware.

It also only really worked on Windows and begrudgingly on MacOS. They snuck out a few Linux builds like a wet fart but they barely worked and weren't nearly as performant as the Windows flavor. Hope you liked your browser crashes!
Look I get it. You're one of those people who look at it from the IT side. But what he said was true. Flash fundamentally changed the face of the animation and gaming and IT industry.
It allowed an entire generation to have access to a toolset that let them make their own animations and games servable to web browsers, where the vast majority of software at the time was proprietary, expensive, and so highly specialized that you needed a course on it just to draw a line. (Try tweening in Toonz.) Without Flash we wouldn't have half the indie games that paved the way for crowdfunding, UGC platforms, and the rich multimedia standards we take for granted on more secure platforms.
The greatest sin of Adobe was that Flash was never an open standard, so the nerds could have troubleshot it, forked it, and made better engines that weren't such a dead end. But realistically, no one has done that. I can still open .swfs I saved 20 years ago and watch crisp vector art.
HTML5+SVG+JS never went anywhere and was never a viable containerized solution, Unity Web is a cancer, and now we are living in the hellhole of bloated WebM streams of crappily converted vector art, on platforms which are increasingly narrowing the noose for anyone but the algorithm money printers.

Adobe being involved in any standard to mandatorily tag images with metadata is disgusting, just as much as Google trying to create authenticated web protocols to track people and force ads on them is disgusting.
 
UMG are the dumb cunts who came up with the DVD format's CSS protection scheme (that's so weak the working code to decrypt it without a key is short enough to fit legibly on a t-shirt).
UMG are also the cunts who will immediately take down any song you made that has even five seconds' worth of shit from their catalogue. Some time ago, sites like SoundCloud basically gave them the keys to their palaces to go full ham on "regulating" whatever gets posted, so even if it's a site solely dedicated to mashups, or a remix that's 100% transformative of the original piece, say bye-bye to it, because UMG will shitcan your work permanently and there's no way to fight it.

I have a VERY specific hatred for these fucks because of that.
 
Bollocks. Flash was a security nightmare. It was an easily-exploited attack vector for endless viruses and malware.
I think in all the Newgrounds rose-tinted glasses somehow people have forgotten that Flash-based advertisements were absolutely fucking everywhere back in its heyday. You couldn't just avoid it by not loading the Flash plugin either because so much useful interactive content relied on it (YouTube).

They lagged, ate up shitloads of bandwidth, and were a major distraction. Kinda like modern JavaScript-based ads on current-year news sites. :(
 
Can you explain this more? I don't see how that does anything, and I want to know if this is as dumb as it sounds or if I'm simply retarded.
It's pretty much as dumb as it sounds, but let me lay out what I think their ideal implementation would be in the basic case.

You have an Android phone. You take a picture. Google prompts you to sign in to your Google account and create an Ed25519 key pair. Your private key is stored on your phone and most likely gets linked to your Google account. The phone hashes your picture with SHA-256 and signs the hash with your private key. The signature and your public key are included in the metadata.

You upload it to a site. They hash your image with SHA-256 and check the signature in your file against your public key to verify that you are the creator of this file. This is how you would prove source provenance.
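The hash-then-sign flow can be sketched in a few lines. This is a toy using textbook RSA with small Mersenne primes, purely to show "sign with the private key, verify with the public key"; it is NOT secure, and real schemes use vetted libraries and algorithms like Ed25519 rather than hand-rolled RSA. The `image` bytes and prime choices are made up for illustration.

```python
import hashlib
import math

# Toy textbook RSA -- illustration only, NOT secure.
p = 2**61 - 1                            # two Mersenne primes, toy-sized
q = 2**89 - 1
n = p * q                                # public modulus
e = 65537                                # public exponent
d = pow(e, -1, math.lcm(p - 1, q - 1))   # private exponent

image = b"pretend these are JPEG bytes"

# "Phone" side: hash the picture, sign the hash with the private key.
digest = int.from_bytes(hashlib.sha256(image).digest(), "big") % n
signature = pow(digest, d, n)

# "Site" side: re-hash the upload, check the signature with (n, e).
recomputed = int.from_bytes(hashlib.sha256(image).digest(), "big") % n
print(pow(signature, e, n) == recomputed)   # True: signature matches
```

The asymmetry is the whole point: the site only ever sees `(n, e)` from the metadata, so it can verify but never forge.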

The complication arises with literally any way you would edit photos, online or on your desktop computer. The hash changes with any edit you make, so whatever app you are using will need your private key to re-sign the result.
 
I hate AI doomers and their "innovative" solutions: over-convoluted fixes for a problem that will never materialize, full of loopholes and workarounds.

But think of the poor artists who didn't read DeviantArt's TnA and now their art is used to generate smudges of color!
 
I hate AI doomers and their "innovative" solutions: over-convoluted fixes for a problem that will never materialize, full of loopholes and workarounds.
To be honest it's actually a real problem.

Unless you want to live in a world where literally everything you see on the internet is AI-generated because it's quicker and cheaper than hiring humans to make content.
 
Can you explain this more? I don't see how that does anything, and I want to know if this is as dumb as it sounds or if I'm simply retarded.
Here's what I think their ideal implementation would be in the basic case.
Imagine a phone manufacturer provides a cryptographic credential for each phone they make, baked into the camera chip. When the camera takes a picture, the image is signed by the camera, and this is packaged up in a 'claim' included in the picture's metadata. Then you post the photo online and sign the image (including the metadata) as being yours, storing that in another claim. The important thing is that claims exist side by side and can each be verified independently of one another, while also being stored together in a tamper-evident envelope signed by the most recent claimer. Claims are also multipurpose: they could be about where the image comes from, who generated it, or what software was used to edit it.
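The stacked-claims idea can be sketched as a simple hash chain. This is a loose stand-in, not the real format: actual C2PA manifests are stored in JUMBF boxes with COSE signatures, and each claim carries a real signature rather than a bare hash. All field names here are hypothetical.

```python
import hashlib
import json

def digest(asset: bytes, prior: list) -> str:
    # Each claim binds the asset bytes plus every earlier claim, so
    # rewriting anything upstream is detectable from any later claim.
    blob = asset + json.dumps(prior, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def add_claim(asset: bytes, claims: list, statement: dict) -> list:
    new = {"stmt": statement, "binds": digest(asset, claims)}
    return claims + [new]

def verify(asset: bytes, claims: list) -> bool:
    # Recompute each claim's binding over the asset and the claims
    # that preceded it; every claim is checkable independently.
    return all(
        c["binds"] == digest(asset, claims[:i])
        for i, c in enumerate(claims)
    )

photo = b"raw sensor bytes"
chain = add_claim(photo, [], {"who": "camera", "act": "captured"})
chain = add_claim(photo, chain, {"who": "user", "act": "posted"})

print(verify(photo, chain))          # True
chain[0]["stmt"]["who"] = "forger"   # try to rewrite history...
print(verify(photo, chain))          # False: a later claim no longer matches
```

Note that tampering with the first claim doesn't break that claim itself, only every claim stacked on top of it, which is what makes the envelope tamper-evident rather than tamper-proof.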
 
To be honest it's actually a real problem.

Unless you want to live in a world where literally everything you see on the internet is AI-generated because it's quicker and cheaper than hiring humans to make content.
But everything IS AI-generated.

You don't give it a prompt; you give it a couple of coordinates and it draws a line, or a brush stroke, or whatever. Then you tell it to apply post-effects and it does so to your image.

Oh oh oh or you mean, now that Photoshop exists, legit Paint artists who draw one pixel at a time are completely obsolete?
 
To be honest it's actually a real problem.

Unless you want to live in a world where literally everything you see on the internet is AI-generated because it's quicker and cheaper than hiring humans to make content.
given the absolute goyslop of recent years, and Sturgeon's law in general, there's not much difference. for most stuff you'll still want an actual human for touch-ups and fixes; look up the term "script doctor", for example. all it does is replace the shitters and low-level writers.

people also need to understand there's not much difference to a human already. you could autistically do nothing but read the works of the same author, then read all the secondary literature about that author, to the point you could more or less ape their style. heck, that's usually how art forgers operate too. AI is just faster and cheaper at it. still doesn't mean it will be as good or as successful as the original.

will it have an effect on creative works? absolutely. but (for now at least) AI doesn't have any creativity in itself. it might come up with shit you haven't thought of, weren't good enough for, or were too lazy to do, but in the end it's just an amalgamation of something else (which, again, is how humans "create" too). but there will still be people creating something "new", either by combining things no one combined before or doing something no one did before - although with the amount AI can output, it's likely this happens for the AI at some point too. however, art doesn't work in a vacuum (and making money with it is highly dependent on luck anyway), so it rotting away in some folder on someone's hard drive doesn't mean much...
 