(ORDO NEWS) — Obscene deepfakes featuring celebrity voices have already appeared on the Web.
A few days ago, the startup ElevenLabs introduced a beta version of its platform for creating synthetic clones of real people's voices and using them to narrate text.
Just a few days later, deepfakes of celebrity voices reading highly dubious texts appeared on the Web.
According to the company, there has been an “increasing number of cases of abuse of voice cloning,” and it is already working to address the problem by introducing additional safeguards.
The company did not specify what it meant by abuse, but audio recordings of celebrity voices making inappropriate statements are already known to have appeared on Internet forums.
It is not yet known whether all of the material was created with ElevenLabs technology, but a significant collection of the voice files links back to the company’s platform.
This is hardly surprising: the arrival of publicly available machine learning systems has produced numerous deepfakes of all kinds.
Now ElevenLabs is collecting feedback to prevent abuse of the technology.
So far, the company has proposed nothing out of the ordinary beyond adding extra account verification steps before granting access to voice cloning.
Ideas include entering payment information or ID data.
The company is also considering verifying that users hold the rights to the voice they intend to clone; for example, they may be asked to upload a recording of themselves reading a prompted text.
Finally, the company is considering phasing out the Voice Lab tool altogether and verifying voices manually instead. In the meantime, users are encouraged to share their ideas with the service’s developers.
Notably, in the first half of January, Microsoft presented a similar solution: its VALL-E tool also converts text to speech using as little as three seconds of a recording of any person’s voice.