Red Hot Cyber
OpenAI tightens controls on Sora 2 after criticism for videos featuring famous actors

Redazione RHC : 22 October 2025 08:06

The continued generation of videos featuring famous actors, posted without their consent on the Sora 2 platform, has once again drawn attention to the problems of using neural networks to create digital copies of people.

After videos generated with Bryan Cranston’s likeness, including one depicting him with Michael Jackson, were published via OpenAI’s service, the company announced it would tighten controls on the use of celebrities’ images and voices.

Cranston himself, the actors’ union SAG-AFTRA, and several major agencies (United Talent Agency, Creative Artists Agency, and the Association of Talent Agents) issued a joint statement. They emphasized that OpenAI acknowledged the unwanted generation and expressed regret. In response to criticism, the company reaffirmed its commitment to the principle of voluntary participation in the creation of digital copies: no artist or performer should appear in such videos without prior permission. It also promised to respond promptly to complaints regarding violations of this policy.

While OpenAI did not disclose the details of the changes made to the Sora app, the announcement itself represented an important step in public recognition of the issue.

The platform had already sparked a wave of criticism after videos featuring distorted images of celebrities were published. Following these incidents, the company reversed its previous “default” policy and promised to provide rights holders with more granular control over content generation, closer to the principle of voluntary consent for the use of likenesses.

While Cranston highlighted OpenAI’s positive response and called it an important example of fostering dialogue, SAG-AFTRA President Sean Astin called for legislative action. He emphasized the need for a legal instrument capable of protecting creative professionals from large-scale unauthorized copying using artificial intelligence. In this context, he mentioned the NO FAKES Act (Nurture Originals, Foster Art, and Keep Entertainment Safe Act), a bill currently under discussion in the United States that aims to protect against digital replication without the permission of the original artist.

In the context of the rapid spread of generative models and their integration into popular media platforms, such incidents raise an urgent question: what are the limits of what is acceptable when technologies can recreate people’s faces and voices with near-perfect fidelity?

While companies promise to strengthen filters and improve oversight, the professional community continues to insist on institutional guarantees, including at the legislative level.

Redazione
The editorial team of Red Hot Cyber consists of a group of individuals and anonymous sources who actively collaborate to provide early information and news on cybersecurity and computing in general.
