OpenAI's Sora 2, a powerful AI video generator, has raised concerns about the deepfake crisis.
According to OpenAI, every Sora video carries C2PA metadata, an embedded provenance label meant to show how the video was made.
OpenAI’s newest toy, Sora 2, is like Photoshop on steroids, if Photoshop had a moral hangover.
The AI video generator has proven so good at faking reality that users have already produced convincing deepfakes of real people. Worse, users have discovered that Sora can also be used to churn out unwanted and harmful content, such as racist rants or fetish videos.
Once these clips are shared on social media platforms like TikTok or Instagram, it becomes difficult to distinguish between real and fake content.
OpenAI's watermarking and authenticity tools may do little to prevent the spread of deepfakes, in part because provenance metadata is often stripped or lost when clips are re-encoded and re-shared.
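For readers who want to see whether a downloaded clip still carries its provenance label, here is a minimal sketch that shells out to the Content Authenticity Initiative's open-source `c2patool` CLI to look for a C2PA manifest. The exact output behavior of the tool and the sample filename `downloaded_clip.mp4` are assumptions about one possible setup, not a description of OpenAI's own tooling.

```python
import json
import shutil
import subprocess
import sys


def read_c2pa_manifest(video_path: str):
    """Attempt to read a C2PA manifest from a media file using the
    open-source `c2patool` CLI. Returns the parsed manifest as a dict,
    or None if no provenance data is found (an assumption about how the
    tool signals a missing manifest)."""
    if shutil.which("c2patool") is None:
        raise RuntimeError("c2patool is not installed or not on PATH")

    # `c2patool <file>` prints the manifest store as JSON when one exists;
    # we assume a non-zero exit status or empty output means no C2PA data.
    result = subprocess.run(
        ["c2patool", video_path],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0 or not result.stdout.strip():
        return None
    return json.loads(result.stdout)


if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "downloaded_clip.mp4"
    manifest = read_c2pa_manifest(path)
    if manifest is None:
        # Absence of a manifest does not prove a clip is real or fake;
        # re-encoding and re-uploading routinely strips this metadata.
        print(f"No C2PA provenance data found in {path}")
    else:
        print(json.dumps(manifest, indent=2))
```

Even when such a check succeeds on a file taken directly from Sora, the same clip re-shared through TikTok or Instagram may come back with the metadata gone, which is exactly the gap the authenticity tools struggle to close.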