A blockchain-based marketplace is launching today for creators of “synthetic media,” a term for video, image or voice material generated by artificial intelligence algorithms.
On May 15, Cointelegraph interviewed Arif Khan, CEO of Alethea AI, the firm behind the project, about the legal and moral quagmire that “deepfakes” and other AI-generated content have created for online media consumption.
Khan’s wager is that blockchain can help ensure this content circulates responsibly by providing infrastructure for licensing, distributing and monetizing legal and permissioned creations, as distinct from unlabeled and potentially nefarious media:
“We must distinguish between deepfakes (harmful, unpermissioned e.g. deepfake porn, deepfake misinformation political campaigns) and synthetic media (permissioned use of faces and voices, creating AI-generated clones of your soon-to-be-deceased parent’s voice with their permission to read an audiobook for your kids).”
In partnership with software firm Oasis Labs, Alethea AI will label all content generated for its marketplace using the Oasis API, in an attempt to restore control to content creators as well as to those whose images could be manipulated.
The company believes that secure blockchain validation on its platform will establish a barrier between valid and suspicious material, much like the blue-tick verification that Twitter provides. Legal permissions and consent will be the fundamental criteria for synthetic media that can be circulated and monetized, Khan said.
He pointed to the public furor earlier this year over law enforcement agencies’ access to Clearview AI, an app that could match faces to photos scraped from social media platforms using neural net technology:
“Clearview AI stole people’s faces and sold them to security agencies without their consent. A person’s face and voice data belongs to the individual and no corporation or regulator should own this. With the Oasis Parcel API, the aim is to have this data be confidentially and securely stored and accessed through the Oasis blockchain. The user retains control of their data, who can access it […] and choose how to monetize this data.”
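The consent-first model Khan describes — a subject grants, and can revoke, permission for specific uses of their face or voice data, and content is only labeled as legitimate when a grant exists — can be illustrated with a minimal sketch. To be clear, the class and function names below are hypothetical and are not taken from the actual Oasis Parcel API; they only show the underlying logic of consent-gated labeling.

```python
import hashlib
from dataclasses import dataclass, field

# Hypothetical sketch of a consent-gated media registry (NOT the real
# Oasis Parcel API). A subject's data is usable only under a grant that
# the subject controls and can revoke at any time.

@dataclass
class ConsentRegistry:
    # Maps (subject, licensee) -> set of permitted uses, e.g. {"audiobook"}
    grants: dict = field(default_factory=dict)

    def grant(self, subject: str, licensee: str, uses: set) -> None:
        self.grants[(subject, licensee)] = set(uses)

    def revoke(self, subject: str, licensee: str) -> None:
        self.grants.pop((subject, licensee), None)

    def is_permitted(self, subject: str, licensee: str, use: str) -> bool:
        return use in self.grants.get((subject, licensee), set())

def label_content(registry: ConsentRegistry, subject: str,
                  licensee: str, use: str, media: bytes) -> dict:
    """Attach a provenance label only if the subject has consented."""
    if not registry.is_permitted(subject, licensee, use):
        raise PermissionError("no consent on record for this use")
    return {
        "subject": subject,
        "licensee": licensee,
        "use": use,
        # Hash anchors the label to the exact media bytes it covers.
        "media_hash": hashlib.sha256(media).hexdigest(),
    }
```

In this toy model, revoking a grant immediately prevents any further labeled content from being produced under it, which mirrors Khan’s claim that “the user retains control of their data.”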
Now that AI is being used to go beyond facial recognition and simulate authentic appearances, Khan conceded that it will ultimately fall to regulators and citizens to collectively determine — and rapidly, “given how bad actors can use this technology” — what kind of synthetic media is in line with the public interest.
He argued that Alethea’s model for permissioned synthetic media will help educate the public on “the positive use-cases that can emerge from this technology.”
To illustrate what Alethea AI considers to be the positive potential of AI-generated content, the company states:
“Synthetic media […] does not require humans to physically interact, which is crucial during the pandemic as movie studios are unable to produce new content due to quarantine restrictions.
We can now enable actors and their talent agencies to license out their face and voice data in a secure manner […] Our faces and voices are becoming fully portable, composable and tradable and we will enable users to exercise their creativity within legal and permissioned domains.”
As to whether there might be something faintly dystopian in the proposal that synthetic media is attractive precisely because it does not require humans to physically interact, Khan said:
“30+ million Americans are unemployed. Dystopia is already here with a President who retweets deepfakes/cheapfakes on a quarterly basis (my rough estimate). Our Creator program is designed to facilitate the creation of synthetic characters and provide an income-earning opportunity for the myriad of creative use-cases that synthetic media will unlock.”