Investing.com -- Actor Bryan Cranston’s voice and likeness were replicated without permission in some outputs of OpenAI’s Sora 2 during its invite-only release two weeks ago, prompting a coordinated response from entertainment industry stakeholders.
Following the incident, OpenAI strengthened its guardrails against replicating the voice and likeness of individuals who have not opted in. The company expressed regret for the unintended generations, which occurred despite an existing policy requiring opt-in consent for the use of a person’s voice and likeness.
Cranston, who brought the issue to SAG-AFTRA’s attention, said, "I was deeply concerned not just for myself, but for all performers whose work and identity can be misused in this way. I am grateful to OpenAI for its policy and for improving its guardrails, and hope that they and all of the companies involved in this work, respect our personal and professional right to manage replication of our voice and likeness."
The collaboration includes SAG-AFTRA, OpenAI, Bryan Cranston, United Talent Agency, Creative Artists Agency, and the Association of Talent Agents, who jointly released a statement regarding the resolution.
OpenAI’s opt-in policy gives all artists, performers, and individuals the right to determine how and whether they can be simulated. The company has also committed to responding quickly to any complaints it receives.
SAG-AFTRA President Sean Astin praised the resolution, saying, "Bryan did the right thing by communicating with his union and his professional representatives to have the matter addressed. This particular case has a positive resolution. I’m glad that OpenAI has committed to using an opt-in protocol, where all artists have the ability to choose whether they wish to participate in the exploitation of their voice and likeness using A.I."
The parties involved have expressed support for the NO FAKES Act, pending federal legislation designed to protect performers from unauthorized digital replication. They endorse its goal of establishing a national standard ensuring performers’ voices and likenesses cannot be used without permission.
Sam Altman, CEO of OpenAI, stated, "OpenAI is deeply committed to protecting performers from the misappropriation of their voice and likeness. We were an early supporter of the NO FAKES Act when it was introduced last year, and will always stand behind the rights of performers."