We're releasing one real-world use case for AI in cryptocurrency each day this week, including reasons why you shouldn't necessarily believe the hype. Today: how blockchain fights the fakes.
Generative AI is very good at faking everything: fake photos, fake letters, fake bills, fake conversations. Near co-founder Illia Polosukhin warns that soon we won't know which content to trust.
“If we don't solve this reputation and authentication of content [problem], things are going to get really weird,” Polosukhin explains. “You'll get a phone call and think it's from someone you know, but it's not.”
“Every image you see, every piece of content, every book will become [suspect]. Imagine a history textbook that kids are studying. Imagine literally every kid seeing a different textbook, and that textbook trying to influence them in a particular way.”
Blockchain can be used to transparently trace the provenance of online content, allowing users to distinguish genuine content from AI-generated fakes. But it won't be able to separate truth from lies.
“People write things that aren't true all the time, so that's the wrong way to look at the problem. What matters more is that when you see something, it actually comes from the person it says it's from,” Polosukhin says.
“That's where reputation systems come in: OK, this content comes from this author, and can you trust what that author says?”
“So cryptography becomes a tool to ensure consistency and traceability, and you need reputation around that cryptography to actually verify that ‘X posted this’ and ‘X works at Cointelegraph right now.’ That requires on-chain accounting and record-keeping.”
If it’s such a good idea, why hasn’t anyone done it yet?
There are a variety of existing supply chain projects, including VeChain and OriginTrail, that use blockchain to prove the provenance of real-world goods.
However, content provenance has yet to take off. The Trive News project aimed to crowdsource article verification via blockchain, and Po.et recorded a transparent record of content on-chain, but neither project is still around.
More recently, Fact Protocol launched, using a combination of AI and Web3 technology to crowdsource news verification. The project joined the Content Authenticity Initiative in March last year.
When someone shares an article or other content online, it is first verified automatically using AI, then checked again by the protocol's fact-checkers, and the information is then recorded on-chain along with a timestamp and transaction hash.
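As a rough illustration of what such a record might contain, here is a minimal Python sketch. Fact Protocol's actual schema and tooling aren't described here, so the field names and the idea of a chain client are assumptions; the point is that only a hash of the content, the verification verdicts and a timestamp need to be anchored on-chain, not the content itself.

```python
import hashlib
import json
import time

def build_fact_check_record(content: bytes, ai_verdict: str, human_verdict: str) -> dict:
    # Field names are illustrative, not Fact Protocol's actual schema.
    # Only a hash of the content plus the verdicts and a timestamp need to
    # go on-chain; the content itself stays wherever it was published.
    return {
        "content_hash": hashlib.sha256(content).hexdigest(),
        "ai_verdict": ai_verdict,                # automated AI check
        "fact_checker_verdict": human_verdict,   # human re-check
        "timestamp": int(time.time()),
    }

record = build_fact_check_record(
    b"<article text or media bytes>",
    ai_verdict="no_manipulation_detected",
    human_verdict="verified",
)

# Serialize deterministically so the same record always produces the same bytes;
# submitting it to a chain (and getting back a transaction hash) is left to
# whatever client the protocol actually uses.
payload = json.dumps(record, sort_keys=True)
print(payload)
```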
“We don't republish the content on the platform; instead, we create a permanent on-chain record of it, along with the fact-check records and validator records,” founder Mohith Agadi told Decrypt.
And last August, global news agency Reuters ran a proof-of-concept pilot using prototype Canon cameras to store photo metadata on-chain using the C2PA standard.
Reuters also integrated Starling Lab's authentication framework into its picture desk workflow. The metadata, edit history and blockchain registration embedded in each photo let users verify its authenticity by comparing its unique identifier with the one recorded on a public ledger.
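In outline, that check is just a hash comparison. The sketch below assumes a SHA-256 file hash as the "unique identifier" and stands in for the on-chain registry with a local lookup; the real pilot's C2PA manifests and Starling Lab tooling are considerably more involved.

```python
import hashlib

# Stand-in for the public ledger: in the Reuters pilot this would be a lookup
# against records registered on-chain, not an in-memory dictionary.
LEDGER_RECORDS = {
    "3f2a...": {"source": "picture desk", "registered_at": "2023-08-01"},
}

def verify_photo(photo_bytes: bytes):
    # Recompute the photo's identifier and compare it with the ledger.
    # If the file has been altered since registration, the hash no longer
    # matches and the lookup returns nothing.
    identifier = hashlib.sha256(photo_bytes).hexdigest()
    return LEDGER_RECORDS.get(identifier)

with open("photo.jpg", "rb") as f:   # any local image file
    match = verify_photo(f.read())

print("Registered on ledger" if match else "No matching registration found")
```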
Academic research in this field is also underway.
Do you need blockchain?
Technically, no. One of the knocks on this use case is that proving content provenance doesn't strictly require blockchain or crypto at all. However, blockchain makes the process much more robust.
So while a cryptographic signature can be used to verify content, Polosukhin asks how a reader can be sure it's the right signature. Even if the public key is posted on the original website, that website could be hacked.
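The mechanics of the signature check itself are simple; the hard part is knowing that the key belongs to who you think it does. A minimal sketch using the third-party cryptography package's Ed25519 API (the key pair and article bytes here are invented for illustration):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

article = b"The text of the article as published."

author_key = Ed25519PrivateKey.generate()   # held privately by the author
public_key = author_key.public_key()        # what the website would publish
signature = author_key.sign(article)

try:
    # Verification proves the bytes are unchanged and were signed with this
    # key pair -- it says nothing about whose key pair it actually is.
    public_key.verify(signature, article)
    print("Signature valid, but is this really the author's key?")
except InvalidSignature:
    print("Content was altered or signed with a different key")
```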
Web2 uses trusted service providers to handle these issues, but “there are always problems,” he explains.
“Symantec got hacked and was issuing invalid SSL certificates. Websites get hacked. Even Web3 websites like Curve get hacked because they run on the Web2 stack,” he says.
“So from my perspective, if we expect a future where this is used in a malicious way, we need tools that can actually counter that.”
Don’t believe the hype
People were discussing the use of blockchain to combat disinformation and deepfakes long before generative AI arrived, but until recently there had been little progress.
Microsoft has launched a new watermark to crack down on generative AI fakes being used in election campaigns. The watermark, from the Coalition for Content Provenance and Authenticity (C2PA), is permanently attached to the content's metadata and shows who created it and whether AI was involved.
The New York Times, Adobe, the BBC, Truepic, The Washington Post and Arm are all members of C2PA. However, the solution does not require a blockchain, as the metadata can be protected with a hash and an authenticated digital signature.
But it can also be recorded on a blockchain, as Reuters' pilot program demonstrated last August. C2PA's awareness arm is called the Content Authenticity Initiative, and Web3 organizations including Rarible, Fact Protocol, Livepeer and Dfinity are CAI members flying the flag for blockchain.
Also read:
Real-world AI use case in cryptocurrency, No. 1: The best money for AI is cryptocurrency
Real-world AI use case in cryptocurrency, No. 2: AIs can run DAOs
Real-world AI use case in cryptocurrency, No. 3: Smart contract auditing and cybersecurity
Andrew Fenton
Andrew Fenton, based in Melbourne, is a journalist and editor covering cryptocurrency and blockchain. He has worked as a national entertainment writer for News Corp Australia, a film journalist for SA Weekend and The Melbourne Weekly.
Follow the author @andrewfenton