Disaster Adobe, Arm, Intel, and Microsoft form content authenticity coalition - Mass deplatforming with the power of Microsoft Azure™


Adobe, Arm, Intel, and Microsoft have partnered with photo verification platform Truepic and the BBC to form the Coalition for Content Provenance and Authenticity (C2PA).

The coalition, open to further membership, seeks to address the prevalence of disinformation, misinformation, and online content fraud by developing technical standards for certifying the source and history, or provenance, of media content.

Member organisations will work together to develop content provenance specifications for common asset types and formats, they said in a statement, to enable publishers, creators, and consumers to trace the origin and evolution of a piece of media, including images, videos, audio, and documents.

These technical specifications will include defining what information is associated with each type of asset, how that information is presented and stored, and how evidence of tampering can be identified.
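As a rough illustration of the tamper-evidence idea described above (not the actual C2PA design, which binds manifests to assets with cryptographic signatures and certificate chains), here is a minimal hash-binding sketch; all names and data are hypothetical:

```python
import hashlib


def make_manifest(asset_bytes: bytes, creator: str) -> dict:
    # Hypothetical, simplified stand-in for a provenance manifest:
    # bind a creator claim to the exact bytes of an asset via a hash.
    return {
        "creator": creator,
        "sha256": hashlib.sha256(asset_bytes).hexdigest(),
    }


def is_untampered(asset_bytes: bytes, manifest: dict) -> bool:
    # Any change to the asset bytes changes its hash, exposing tampering.
    return hashlib.sha256(asset_bytes).hexdigest() == manifest["sha256"]


photo = b"...raw image bytes..."
manifest = make_manifest(photo, "BBC News")
```

A real specification also has to define where the manifest is stored (embedded in the file or held externally), and how it survives legitimate edits; a bare hash, as here, breaks on any change at all.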

"There's a critical need to address widespread deception in online content -- now supercharged by advances in AI and graphics and diffused rapidly via the internet," Microsoft chief scientific officer Eric Horvitz said.

"Our imperative as researchers and technologists is to create and refine technical and sociotechnical approaches to this grand challenge of our time. We're excited about methods for certifying the origin and provenance of online content."

Adobe general counsel Dana Rao said the C2PA would accelerate the critical work of rebuilding the public's trust in online content.

The coalition said the C2PA's open standard will give platforms a method to preserve and read provenance-based digital content.

"Because an open standard can be adopted by any online platform, it is critical to scaling trust across the internet," the statement continued.

"In addition to the inclusion of varied media types at scale, C2PA is driving an end-to-end provenance experience from the capturing device to the information consumer. Collaboration with chipmakers, news organisations, and software and platform companies is critical to facilitate a comprehensive provenance standard and drive broad adoption across the content ecosystem."

The formation of the C2PA brings together founding members of the Adobe-led Content Authenticity Initiative (CAI) and the Microsoft and BBC-led Project Origin.

CALL FOR AN EU VERSION OF AUSTRALIA'S MEDIA CODE

With Australia's News Media Bargaining Code -- a piece of legislation directing Google and Facebook to pay to display local news content -- hours away from passage, Microsoft has called for a similar mandate to be introduced in Europe.

Microsoft president Brad Smith earlier this month published a blog post praising the Australian government for the code and saying the US should take notice, even going so far as to say the code should be copied by the Biden Administration.

On Tuesday, Microsoft rallied support from a handful of Europe's press publishers, agreeing to work together on a solution for how those publishers could be paid for the use of their content by "gatekeepers that have dominant market power".

Microsoft said the outcome should be in line with the objectives of the new neighbouring right in the EU Digital Single Market Copyright Directive, which comes into force in June, and that it should take inspiration from the new Australian legislation.

"Although press publishers have been granted a neighbouring right in the EU, negotiations with such gatekeepers will not produce fair outcomes unless additional regulatory measures are brought forward to address gatekeepers with dominant market power, through appropriate regulatory frameworks such as the Digital Markets Act, Digital Services Act, or other national laws," Microsoft wrote.

"EMMA, ENPA, EPC, NME, and Microsoft therefore call for an arbitration mechanism to be implemented in European or national law requiring such gatekeepers to pay for press content in full respect of the Publisher's Right set out in Directive 2019/790."

Microsoft said it welcomes proposals made by several members of the European Parliament to introduce a final arbitration mechanism into relevant regulation.

"This is needed to prevent undermining the scope of the publishers' right and to create legal certainty. Otherwise, even though press publishers have a neighbouring right, they might not have the economic strength to negotiate fair and balanced agreements with these gatekeeper tech companies, who might otherwise threaten to walk away from negotiations or exit markets entirely," the tech giant said.

---

 
Honestly, the potential for deepfakes to cause economic or social panic is real, and tools to quickly demonstrate that something was faked are sorely needed.
Sure, but this won't do it. All these signatures will prove is that a certain file either is or isn't the file uploaded by a Trusted News Organization. (Or a minimally edited version of it)
So if you upload a deepfake and call it "Footage from CNN 2/24/2021", CNN could point to the blockchain and say "No, our footage has this signature and yours doesn't match."
But if you call the deepfake "Footage from Q's secret spy camera in Obama's volcano lair", nothing here would help disprove that.
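The objection above can be made concrete with a small sketch (all names and data here are hypothetical): a registry of publisher hashes can refute a false attribution to a known source, but it can say nothing either way about a claim that names no registered source.

```python
import hashlib

# Hypothetical registry mapping a publisher to hashes of files it published.
registry = {
    "CNN": {hashlib.sha256(b"real CNN footage").hexdigest()},
}


def check_claim(source: str, file_bytes: bytes) -> str:
    digest = hashlib.sha256(file_bytes).hexdigest()
    if source not in registry:
        # "Q's secret spy camera" has no registered hashes: there is
        # nothing to compare against, so the claim cannot be disproved.
        return "unverifiable"
    return "authentic" if digest in registry[source] else "mismatch"
```

The scheme only ever answers "does this file match what that publisher signed?"; it cannot evaluate claims about sources that never signed anything.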

Unless you then make it illegal/impossible to upload or host something with that exact signature. I mean, if YouTube can do that with copyrighted songs then why couldn’t sites do it with a hardcoded ID?
Youtube can (and does) also just ban anything they consider wrongthink, without bothering with the whole ID thing in the first place.
 
Youtube can (and does) also just ban anything they consider wrongthink, without bothering with the whole ID thing in the first place.
Yes, but I meant the way their Content ID system can automatically search copyright databases to see if videos contain copyrighted music or video and DMCA-flag them.
If it can detect that in a video, across vast ranges of quality and noise, companies could much more easily set up a way to automatically detect a specific unique ID from this blockchain and never allow it to be uploaded again.
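A minimal sketch of the exact-match blocklist idea floated above, using a hypothetical set of flagged hashes; note the caveat in the last line, which is why systems like Content ID use perceptual fingerprints rather than byte-level IDs:

```python
import hashlib

# Hypothetical blocklist of hashes of debunked files.
blocklist = set()


def flag(file_bytes: bytes) -> None:
    # Record the exact-bytes fingerprint of a debunked file.
    blocklist.add(hashlib.sha256(file_bytes).hexdigest())


def allow_upload(file_bytes: bytes) -> bool:
    # Reject a re-upload only if the bytes match exactly.
    return hashlib.sha256(file_bytes).hexdigest() not in blocklist


fake = b"debunked deepfake"
flag(fake)
```

An exact hash breaks under any re-encoding, crop, or recompression, so `allow_upload(fake + b" re-encoded")` would pass; robust matching needs content-aware fingerprinting, which is a much harder problem.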
 
I blame NSFW Twitter artfags for all of this; their extreme spergout over AI is due to a simple fact: AI can produce better wank material for free than their overpriced "commissions" could ever hope to accomplish. And remember:

>ITS NOT STEALING VIA ROM HACKS, ITS ART
>ITS NOT STEALING WITH FANFICS, ITS ART
>ITS NOT STEALING VIA RESPONSE VIDEOS, ITS ART
>NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO BANNNN AI ITS STEALING MY STYLE OF ART, ITS USING MY ART AS A STARTING POINT
 
Jews gonna jew.

Russia, China, take the US/EU out already.
be careful what you wish for
 
This (fucking old) news story makes me want to tell lies, recreationally, all day long online.
 
So if you upload a deepfake and call it "Footage from CNN 2/24/2021", CNN could point to the blockchain and say "No, our footage has this signature and yours doesn't match."
But if you call the deepfake "Footage from Q's secret spy camera in Obama's volcano lair", nothing here would help disprove that.
Right up until they have a list of "approved" sources, and anything that doesn't come from there is automatically rejected. Want to share your wedding video? Upload it to Facebook first to get an authentication tag, then you can email it to your uncle.
A walled garden of safe, curated, approved media.
 
I blame NSFW Twitter artfags for all of this; their extreme spergout over AI is due to a simple fact: AI can produce better wank material for free than their overpriced "commissions" could ever hope to accomplish. And remember:

>ITS NOT STEALING VIA ROM HACKS, ITS ART
>ITS NOT STEALING WITH FANFICS, ITS ART
>ITS NOT STEALING VIA RESPONSE VIDEOS, ITS ART
>NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO BANNNN AI ITS STEALING MY STYLE OF ART, ITS USING MY ART AS A STARTING POINT
Stop bumping two year old threads and lurk moar, newfag.
 