Bryan Cranston thanks OpenAI for blocking Sora 2 deepfakes

TL;DR: Bryan Cranston has thanked OpenAI for strengthening guardrails on Sora 2 after users were able to generate his likeness without consent. OpenAI called the incident “unintentional” and has committed to improving opt-in protocols for public figures.

Breaking Bad star Bryan Cranston has expressed gratitude to OpenAI for addressing deepfake concerns on the company’s generative AI video platform Sora 2, after users were able to generate his voice and likeness without his consent. The actor raised his concerns with the actors’ union Sag-Aftra following reports of synthetic videos featuring his likeness during Sora 2’s recent launch.

Context and Background

OpenAI has maintained since launch that it takes “measures to block depictions of public figures” and has “guardrails intended to ensure that your audio and image likeness are used with your consent”. However, several publications, including the Wall Street Journal and the Hollywood Reporter, reported widespread anger in Hollywood after OpenAI allegedly told multiple talent agencies that clients would need to opt out, rather than opt in, to prevent their likenesses from being replicated.

On Monday, Cranston said in a statement released through Sag-Aftra: “I was deeply concerned not just for myself, but for all performers whose work and identity can be misused in this way. I am grateful to OpenAI for its policy and for improving its guardrails, and hope that they and all of the companies involved in this work, respect our personal and professional right to manage replication of our voice and likeness.”

Two of Hollywood’s biggest agencies, Creative Artists Agency and United Talent Agency (which represents Cranston), co-signed a statement with OpenAI, Sag-Aftra and the Association of Talent Agents. The statement confirmed that what happened to Cranston was an error and pledged that the signatories would work together to protect actors’ right to determine how they can be simulated.

Looking Forward

Critical Context: The NO FAKES Act, currently before Congress, would ban the production and distribution of AI-generated replicas of individuals without their consent; OpenAI has publicly voiced support for the legislation.

Sag-Aftra’s new president, Sean Astin, warned that Cranston “is one of countless performers whose voice and likeness are in danger of massive misappropriation by replication technology”. The union continues to advocate for opt-in protocols as the only acceptable approach for AI-generated content featuring performers. OpenAI CEO Sam Altman has said the company is “deeply committed to protecting performers from the misappropriation of their voice and likeness”; the company also recently worked with the estate of Martin Luther King Jr to pause depictions of him on the platform.
