Conan's voice changer! Baidu AI technology can clone voices in seconds
[Globe Smart Report, reporter Zhang Yang] Anyone who has watched the Japanese anime "Detective Conan" may recall that Conan wears a bow tie that can change his voice; that technology may no longer be just an anime fantasy. According to a February 28 report by the tech news site futurism.com, Baidu's artificial intelligence team has developed a neural network technique that can imitate voices. In the demo recordings it released, the software can even change the gender and accent of the imitated voice.
Voice imitation is not a new technology, but as it has iterated, the amount of raw audio required has kept shrinking. In 2017, Baidu's Deep Voice research team demonstrated voice imitation using 30-minute voice clips; Adobe has a program called VoCo that cut the required audio down to 20 minutes; and a Canadian startup called Lyrebird can mimic a voice from just one minute of recording. Now Baidu has gone a step further, reducing the required material to a few seconds.
This technology can help create personalized digital assistants and provide a more natural voice translation service。
Nevertheless, as with many technologies, voice cloning also carries the risk of abuse. According to a New Scientist report, voices generated with this technique fooled voice recognition software in tests more than 95% of the time, which could make AI an accomplice in voice scams.
Technologies already exist that let people use AI to replace or alter parts of video footage, or even fabricate videos entirely from scratch. For now these are mostly used to make funny videos on the internet, but people have also begun using them to make celebrities "appear" in films, and even pornography, that they never actually took part in. If voice cloning is added on top of this, we may soon be bombarded with fake news.
It is already easy enough to fool people with nothing but text or Photoshop; if this technology falls into the hands of people with ulterior motives, the problems could be far greater.