
Scarlett Johansson Says She is ‘Shocked, Angered’ Over New ChatGPT Voice

Johansson's legal team has sent OpenAI two letters asking the company to detail the process by which it developed a voice the tech company dubbed "Sky."

Lawyers for Scarlett Johansson are demanding that OpenAI disclose how it developed an AI personal assistant voice that the actress says sounds uncannily similar to her own.

Johansson’s legal team has sent OpenAI two letters asking the company to detail the process by which it developed a voice the tech company dubbed “Sky,” Johansson’s publicist told NPR in a revelation that has not been previously reported.

After OpenAI held a live demonstration of the voice last week, many observers compared it to Johansson’s voice in the 2013 Spike Jonze romantic sci-fi film “Her,” which centers on a man who falls in love with the female voice of his computer’s operating system.


OpenAI CEO Sam Altman, who has said the 2013 Spike Jonze film is his favorite movie, invited comparisons by posting the word “Her” on X after the company announced the new ChatGPT version. But later, OpenAI executives denied any connection between Johansson and the new voice assistant.


Then the company suddenly dropped the voice.

In a post on X just before midnight Pacific time Sunday, OpenAI said the voice would be halted as it addresses “questions about how we chose the voices in ChatGPT.” A company spokeswoman would not provide further detail.


Turns out, Altman had been courting the Hollywood star for months, and she now feels betrayed.

Johansson said that nine months ago Altman approached her proposing that she allow her voice to be licensed for the new ChatGPT voice assistant. He thought it would be “comforting to people” who are uneasy with AI technology.

“After much consideration and for personal reasons, I declined the offer,” Johansson wrote.

Just two days before the new ChatGPT was unveiled, Altman again reached out to Johansson’s team, urging the actress to reconsider, she said.

But before she and Altman could connect, the company publicly announced its new, splashy product, complete with a voice that she says appears to have copied her likeness.

To Johansson, it was a personal affront.

“I was shocked, angered and in disbelief that Mr. Altman would pursue a voice that sounded so eerily similar to mine that my closest friends and news outlets could not tell the difference,” she said.

She also found it alarming, she said, at a moment when the internet is awash in disinformation.

“In a time when we are all grappling with deepfakes and the protection of our own likeness, our own work, our own identities, I believe these are questions that deserve absolute clarity,” Johansson said.

OpenAI’s Altman denied there was any connection between Johansson and its Sky voice.

“We cast the voice actor behind Sky’s voice before any outreach to Ms. Johansson. Out of respect for Ms. Johansson, we have paused using Sky’s voice in our products. We are sorry to Ms. Johansson that we didn’t communicate better,” Altman wrote in a statement to NPR.

Johansson used the incident to draw attention to the lack of legal safeguards around the use of creative work to power leading AI tools.

“I look forward to resolution in the form of transparency and the passage of appropriate legislation to help ensure that individual rights are protected,” she said.

OpenAI says voice of another actress was used to develop ‘Sky’

In a blog post late Sunday, OpenAI said the AI voice in question, known as “Sky,” was developed from the voice of another actress whose identity the company said it is not revealing to protect her privacy.

“We believe that AI voices should not deliberately mimic a celebrity’s distinctive voice — Sky’s voice is not an imitation of Scarlett Johansson but belongs to a different professional actress using her own natural speaking voice,” the company wrote.

The new model, known as GPT-4o, transforms the chatbot into a voice assistant that can interpret facial expressions, detect emotion and even sing on command.

The new voice assistant will be publicly available in the coming weeks. During a live demo last week, it struck a knowing, flirtatious tone with some of OpenAI’s employees, leading some to wonder whether the coquettish demeanor was an intentional ploy to keep people engaged with the AI system.

In an interview with NPR last week, OpenAI chief technology officer Mira Murati said the company did not pattern any ChatGPT voices on Johansson’s sultry computer voice in the movie.

“It says more about our imagination, our storytelling as a society than about the technology itself,” Murati said. “The way we developed this technology is not based on the movie or a sci-fi story. We’re trying to build these machines that can think and have robust understandings of the world.”

“I don’t know about the voice. I actually had to go and listen to Scarlett Johansson’s voice,” she said.

Asked about ChatGPT’s flirtatious banter, Murati said the model merely responds to what people provide to it.

“It will react to how you’re interacting with it,” she said. “It’s not preset. It’s based on inputs.”

In its Sunday night blog post, OpenAI said the chatbot was developed with five voices that were produced after working closely with voice and screen actors.

“Looking ahead, you can expect even more options as we plan to introduce additional voices in ChatGPT to better match the diverse interests and preferences of users,” the company wrote in the post.

A day after OpenAI’s announcement, Google held its annual developer conference, where it unveiled its own personal AI assistant, known as Project Astra, which also speaks with a female voice. While similar, Google’s version appeared far less quippy and playful and more matter-of-fact.

Together, experts say, the products provide a glimpse into the next generation of cutting-edge AI technology — and also raise questions about the risks that follow as more and more people adopt the tools.

Visar Berisha, an Arizona State University professor who studies AI speech technology, said it is hard to predict how advanced AI voice assistants that speak with human-like personalities will change society.

“Communication by voice is really intimate, really impactful. It allows the AI to express subtleties, things that are perceived as sincere, urgent, joy, concern,” he said. “And all of these serve to foster a deeper connection between the user and machine. You can see how these interactions can potentially become addictive.”

It’s possible, Berisha said, that people will start forming emotional connections to AI systems, much like the plot of “Her” — a movie that does not end happily for the protagonist.

“When I first saw that movie it seemed like science fiction,” Berisha said. “It doesn’t seem like that now.”

Source: NPR
