Coalition for Content Provenance and Authenticity (C2PA) - The Final Solution to the AI Question

Relinquish

The Internet killed me
True & Honest Fan
kiwifarms.net
Joined
Oct 17, 2021

In 2021, Adobe, Arm, Intel, and Microsoft partnered with photo verification platform Truepic and the BBC to form the Coalition for Content Provenance and Authenticity (C2PA). Not many people know about it yet, but it should be on every developer's radar because it could change the internet as we know it.

Basically, it's a spec (a set of guidelines for both hardware and software providers) that attaches metadata to every media file (images, videos, audio, and so on), then uses cryptography to digitally sign it, making every file tamper-evident. The idea is to make it impossible to change a pixel without the provenance of those changes being recorded in the file itself, or in a manifest that is permanently attached to that file. The first record happens in the camera where the photo is taken; then when you go to edit it in Photoshop, another record is logged and signed there. It's almost like every image becomes an NFT, minus the blockchain.

You'll be able to click an icon to inspect the provenance information and determine whether an image was generated by AI, or whether it comes from a trusted news source like BuzzFeed or Infowars. This technology is already here, and just last week Adobe was urging the U.S. Senate to put laws on the books that would make it essential.
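The chain of signed edit records described above can be sketched in a few lines. This is a toy illustration, not the real spec: actual C2PA manifests are signed with X.509 certificates and embedded in the file itself, whereas the HMAC key and field names below are made up for demonstration.

```python
import hashlib
import hmac
import json

# Hypothetical device key; real C2PA signing uses X.509 certificates,
# not a shared secret -- HMAC here is only a stand-in for illustration.
SIGNING_KEY = b"example-device-key"

def add_provenance_record(manifest: list, asset: bytes, action: str) -> list:
    """Append a signed record of an action (capture, edit, ...) to a manifest.

    Each record hashes the current asset bytes and chains to the previous
    record's signature, so dropping or reordering a step is detectable.
    """
    record = {
        "action": action,  # e.g. "c2pa.created", "c2pa.edited"
        "asset_sha256": hashlib.sha256(asset).hexdigest(),
        "prev_signature": manifest[-1]["signature"] if manifest else None,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return manifest + [record]

# A photo is captured in-camera, then edited: two chained, signed records.
photo = b"raw sensor data"
manifest = add_provenance_record([], photo, "c2pa.created")
edited = photo + b" + crop"
manifest = add_provenance_record(manifest, edited, "c2pa.edited")
```

An inspector tool would walk the list, re-verify each signature, and display who did what; change any byte of the asset and the final record's hash no longer matches.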


And here's the head of public policy at Stability AI telling the U.S. Senate that they are integrating Adobe's C2PA/CAI metadata surveillance tech into their AI products.


The full recording:
And here's our dear leader Josh predicting what will happen:

Top companies like Stability AI are fully on board: images generated through their platform can be digitally stamped with metadata and watermarks, and they "welcome Adobe's leadership in driving the development of these open standards." Not surprisingly, the US Department of Defense is on board too, because they believe this technology can help surface bad actors creating horrible synthetic content.

It all sounds great, but looked at from another angle this can be viewed as a mass surveillance apparatus. In the future it may be impossible to change a pixel on the internet without leaving a digital footprint. Currently the spec allows for anonymity, but it also describes how this technology could be used with government-issued digital IDs, which would make it far easier to figure out who's creating all these memes that are offensive to our dear leaders. Combine it with a digital currency and a social credit system, and we could easily shut down the meme creators' internet access and reduce their allowance of lab-grown meat to just 12 ounces per week. In addition, it would give the establishment a monopoly on disinformation: hypothetically, they could create all the AI-generated content they want while making it look trustworthy, and the vast majority of people out there will believe whatever the authorities tell them. If this image had a prominent NASA provenance signature on it, almost everybody would believe we went to Mars, even though it's not a real place you can go to. In 1981 the CIA director said, "We'll know our disinformation program is complete when everything the American public believes is false."
 
Last edited:
Microsoft is using C2PA metadata tech to create "reputation management systems" for media uploaded online.



Jeffrey Harleston of Universal Music Group suggests using "metadata" to track what media is used to train AI models.



Former Disney illustrator Karla Ortiz was at the Senate hearing with Adobe. She's the leader of the artists' GoFundMe that seeks to lobby the government and join the Copyright Alliance.



C2PA published two papers outlining how their surveillance metadata tech could ruin our lives. It includes an "optional" feature to sign content with a government-issued digital ID.


C2PA metadata tech is designed to work in concert with Microsoft AI content moderation tools to scrub the net of "disinformation." It is being implemented at the hardware level, including by camera manufacturers.

 
This is one of the best examples of "sounds good until you think about it".

The idea of a string of metadata on an image to identify its origin and mark it as machine-generated, a real image that has been edited, or real is great. It could be used to at least help keep some level of legitimacy going, like a digital equivalent of old film cameras that would stamp the date of the picture onto the film.

Then Big Tech and the Government got involved to add their own spin to it. It's not enough to ask for a simple verification to help, oh no. They need to make sure they KNOW where it came from! And of course you also need to make sure you never, ever "misuse" their tech, so they want to make sure you can never even try to make an inconvenient-looking picture. And maybe let's go and add as much data as possible to that string. Who cares about maximizing the coverage of pictures being verified as real or fake; what we REALLY need is to make sure no one can possibly post a picture online without "us" knowing the date, the place, the social security number of the photographer, and of course a guarantee that it isn't some "dangerous to Our Democracy" information!
 
>Adobe
>Intel
>Microsoft
>BBC
>Universal Music Group
>The (((Copyright Alliance)))

I don't care if this is the Give Everyone A Free Ferrari Coalition, it's pure evil and nothing good can come of it.
The good news is that this list of participants/developers contains some of the most technologically incompetent companies and people on the planet, so whatever abomination they foist onto an unwitting public will be so broken and unusable (and near-instantly hacked) that it'll become a useless laughing stock.

Adobe gave us Flash (enough said). UMG are the dumb cunts who came up with the DVD format's CSS protection scheme (that's so weak the working code to decrypt it without a key is short enough to fit legibly on a t-shirt). Intel still can't make secure CPU hardware without "cheating" by putting another (insecure) microprocessor on board to watchdog it. Microsoft's security blunders are too numerous to count in the space provided on a Scandinavian basket weaving adjunct forum. And the BBC's resources are mostly dedicated to protecting nonces these days; their technological pedigree comes from the same imbeciles who lost old recordings because "lol let's just re-use old tapes!"

Besides, unless they're planning on foisting an entirely new image and video format on the world with mandatory cryptographic signatures (good luck -- there's about a fuck-zillion embedded devices out there that only grok JPEG, GIF, PNG, MPEG, etc., and only just barely given their space constraints), this will be a simple matter to sidestep with a quick stripping of EXIF or other metadata fields.
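The sidestep really is that simple: EXIF lives in JPEG APP segments, and nothing stops you from just not copying them. A minimal stdlib-only sketch that drops APP1 segments (where EXIF/XMP are stored); it assumes a well-formed file, and real tools like exiftool handle far more edge cases:

```python
def strip_app1(jpeg: bytes) -> bytes:
    """Return the JPEG byte stream with all APP1 (EXIF/XMP) segments removed."""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG"
    out = bytearray(jpeg[:2])          # keep the SOI marker
    i = 2
    while i < len(jpeg):
        marker = jpeg[i:i + 2]
        if marker == b"\xff\xda":      # start of scan: copy the rest verbatim
            out += jpeg[i:]
            break
        # Segment length is big-endian and includes its own two length bytes.
        seg_len = int.from_bytes(jpeg[i + 2:i + 4], "big")
        if marker != b"\xff\xe1":      # keep every segment except APP1
            out += jpeg[i:i + 2 + seg_len]
        i += 2 + seg_len
    return bytes(out)

# Demo on a synthetic two-segment "JPEG": SOI, one APP1 segment, then the scan.
exif = b"Exif\x00\x00gps-and-serial"
fake = (b"\xff\xd8"
        + b"\xff\xe1" + (len(exif) + 2).to_bytes(2, "big") + exif
        + b"\xff\xdascan-data")
cleaned = strip_app1(fake)             # APP1 is gone; image data untouched
```

The pixel data never changes, so the image renders identically everywhere while the provenance trail simply ceases to exist.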
 
I wonder if there will be government-grade AI that doesn't have this identifying string attached? It can be used to fake images, so I imagine being able to manufacture a picture of a troublemaker doing something like fucking a cow or attacking his children would be worth a lot of money. You can't even describe how it would be faked without sounding crazy, and they'll just laugh because they don't care. That's the point of making you the butt of every joke, by the way.
 
Adobe gave us Flash
False. Flash was made by FutureWave Software. They got acquired by Macromedia, and then Macromedia got acquired by Adobe. So it was FutureWave who gave us Flash in the first place.

And what Flash gave us was a flourishing creative Internet culture that defined the entirety of the 2000s. For its time it was an immensely powerful framework which allowed thousands of people to easily create all kinds of animations and games that worked flawlessly on computers of the era, through the convenience of your web browser.

If you want to bash Adobe for something, bash them for shit like their Creative Cloud suite, where they charge you asinine amounts of money per month for software with decades-old issues that never got patched. Most people have no choice, since there aren't many alternatives, and CC is so easy to use and well integrated that people don't want to switch to different software.

There are so many things Adobe did wrong, but the fact that they kept Flash alive for so long after two acquisitions, and kept it free, is a blessing in disguise.
 
Is "disinformation" the new child-protection bait and switch? Bills stripping personal freedoms like this have been passing for decades to "defend against child exploitation"; one that allows police to seize devices and plant evidence on them without a warrant passed in Australia only a year or two ago with that exact purported motive. Are we entering a new age where information the state considers false is viewed as badly as CP? Are they trying to make that the case?

If you want to bash Adobe for something, bash them for shit like their Creative Cloud suite, where they charge you asinine amounts of money per month for software with decades-old issues that never got patched. Most people have no choice, since there aren't many alternatives, and CC is so easy to use and well integrated that people don't want to switch to different software.

Everything the Creative Suite offers has a better, cheaper alternative: Krita, Harmony, Vegas, etc. The issue lies in its reputation as "industry standard" and its long, long history of being so. Same shit with half of Autodesk's products; Blender has been more capable than Maya for a good few years at this point.
 
Ok, but what's to stop me from... Stripping this data from the file? Is this new standard gonna force open source software devs to remove backwards-compatibility with files that don't possess this metadata? Is it gonna force Kiwifarms (and rDrama and all the other contrarian sites I use to say anything of worth) to reject files that lack this metadata? Is it going to give my ISP the magical ability to decrypt all my network traffic regardless of encryption & protocol type and halt file transfers if they detect I'm sending files without this metadata? Is this going to force old Linux distros running on old hardware (which, with Moore's "law" slowing down ain't that much different from "new" hardware anymore) to not function at all with the modern internet?

The only way I can imagine this as personally scary is if this leads to some cryptographic scheme by which a hardware or software component contains an unalterable UID that guarantees the signing of files with that UID, and also somehow communicates that same UID to ISPs for checking against as a prerequisite to Internet use, and also allows said ISP check the signing of all files transferred. I'm not smart enough to know how that could be set up cryptographically, but I'm sure there's some "safe" way to do it (insofar as your ISP being able to dig even deeper into your web traffic details is "safe"). But so long as a UID (that also gets forcibly written into every file I create or edit) isn't a requirement for web access, then the worst thing is that I'll have to set up a dedicated "normie" PC for banking and mainstream social media and any other web service that demands I leave a nice self-incriminating trail, and a separate intentionally-outdated Linux box for all the cool websites.
 
>Adobe
>Intel
>Microsoft
>BBC
>Universal Music Group
>The (((Copyright Alliance)))

I don't care if this is the Give Everyone A Free Ferrari Coalition, it's pure evil and nothing good can come of it.
Absolute fucking retards said the same shit about net neutrality.

There is a world where this can be a good thing. You don't have to force every piece of media to contain this in order to be uploaded or shared, but you can use it to say certain images or videos essentially have a stamp of approval that they haven't been altered or generated through AI.
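That opt-in "stamp of approval" is the technically easy part: it's just a signature over the file's hash. A toy sketch, with stdlib HMAC standing in for the public-key signatures a real scheme would use (so that anyone could verify without the publisher's secret key):

```python
import hashlib
import hmac

# Hypothetical publisher secret; a real deployment would sign with a
# certificate so verification doesn't require sharing this key.
PUBLISHER_KEY = b"hypothetical-publisher-key"

def sign(asset: bytes) -> str:
    """Publisher signs a hash of the exact bytes they released."""
    digest = hashlib.sha256(asset).digest()
    return hmac.new(PUBLISHER_KEY, digest, hashlib.sha256).hexdigest()

def is_unaltered(asset: bytes, signature: str) -> bool:
    """True only if the bytes still match the signature they shipped with."""
    return hmac.compare_digest(sign(asset), signature)

video = b"press-conference footage"
stamp = sign(video)
ok = is_unaltered(video, stamp)               # original bytes: verifies
tampered = is_unaltered(video + b"!", stamp)  # any edit: fails
```

Nothing about this requires rejecting unstamped uploads; a missing or invalid stamp just means "unverified", the same way an untrusted TLS certificate does today.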
 
And what Flash gave us was a flourishing creative Internet culture that defined the entirety of the 2000s. For its time it was an immensely powerful framework which allowed thousands of people to easily create all kinds of animations and games that worked flawlessly on computers of the era, through the convenience of your web browser.

There are so many things Adobe did wrong, but the fact that they kept Flash alive for so long after two acquisitions, and kept it free, is a blessing in disguise.
Bollocks. Flash was a security nightmare. It was an easily-exploited attack vector for endless viruses and malware.

It also only really worked on Windows and begrudgingly on MacOS. They snuck out a few Linux builds like a wet fart but they barely worked and weren't nearly as performant as the Windows flavor. Hope you liked your browser crashes!
 