OpenAI’s Voice Mimicking Tool Stokes Deepfake Concerns

OpenAI is sharing early results after testing a feature that can read words aloud in a convincing human voice — highlighting a new frontier for artificial intelligence and raising concerns of deepfake risks.


Voice Engine, OpenAI's new feature, needs just 15 seconds of recorded audio of a person speaking to recreate their voice, complete with their specific cadence and intonations.

The company has shared the technology with about 10 developers so far, a spokesperson said. It decided to scale back the release after receiving feedback from stakeholders such as policymakers, industry experts, educators and creatives.

Other AI tools have already been used to fake voices in some contexts. In January, a bogus but realistic-sounding phone call purporting to be from President Joe Biden encouraged people in New Hampshire not to vote in the primaries — an event that stoked AI fears ahead of critical global elections.

Readers: Do you think AI tools that replicate human voices should be available to the public? Share your thoughts by joining the conversation below in the comments.



Amazon Bets $150 Billion on Data Centers Required for AI Boom


AI Stories Not to Miss

Want to read more? Find daily AI news coverage here.

Ming Hai Chow

revenue | resiliency | innovation | analysis | art & culture | economy | luxury | sustainability | hospitality | gender equality | growth | philosophy | history

7mo

Expected. Personally, I have even experienced tech-enabled, remotely controlled road accidents, where my gearbox, fuel tank, and braking system malfunctioned. That's normal for a future tech-driven murdering system. Federal Bureau of Investigation (FBI) #cyber

Nurain Ibrahim

Corporate Brand Manager | Corporate PR Management | Marketing Executive | @PCCS Peiyang Chemical Engineering Service Corporation PTE. LTD | Oil and Gas | CNG | LNG | LPG | PRMS | Gas Processing | Oil Refining

7mo

It's very thoughtful and innovative, but also scary, much as radioactivity is to humanity. Just imagine this AI model being utilised by people with sinister motives and criminals; you wouldn't want to imagine how catastrophic it will end up becoming.

Richard Mwambanga

Garment Design Creative

7mo

To put it simply, I believe AI is a tool like any other, only far more advanced. Like a hammer, it can be used for hitting nails, which is a good thing, or human heads, which makes it a murder weapon. A gun can be for protection or... You get my point.

Erwin Jack

Powering Prime Projects | $100M to $5B+ | Project Finance Assistance for Oil and Gas, Renewable Energy, Agriculture, Data Centers, Infrastructure and More | Sustainable Growth

7mo

There are many risks regarding AI that have not yet been addressed or resolved. We must be careful.
