Generative AI has plenty of well-documented abuse potential, from writing academic papers to copying artists. And now, it appears, it is showing up in state influence operations.
A recent influence campaign was "very likely" aided by commercial AI voice generation products, including technology publicly released by hot startup ElevenLabs, according to a recent report from Massachusetts-based threat intelligence firm Recorded Future.
The report describes a Russia-linked campaign designed to undermine European support for Ukraine, dubbed "Operation Undercut," that extensively used AI-generated voiceovers on fake or misleading "news" videos.
The videos, which targeted European audiences, attacked Ukrainian politicians as corrupt or questioned the usefulness of military aid to Ukraine, among other themes. For example, one video touted that "even jammers can't save US Abrams tanks," referring to devices used by US tanks to deflect incoming missiles, reinforcing the idea that sending high-tech armor to Ukraine is pointless.
The report says the videos' creators "very likely" used AI voice generation, including ElevenLabs technology, to make their content appear more legitimate. To verify this, Recorded Future researchers submitted the clips to ElevenLabs' own AI Speech Classifier, which lets anyone "detect if an audio clip was created using ElevenLabs," and got a match.
ElevenLabs did not respond to requests for comment. Although Recorded Future noted the likely use of several commercial AI voice generation tools, it named none other than ElevenLabs.
The usefulness of AI voice generation was inadvertently highlighted by the campaign's own orchestrators, who, rather carelessly, released some videos with real human voiceovers that had "a noticeable Russian accent." By contrast, the AI-generated voiceovers spoke several European languages, including English, French, German, and Polish, without foreign-sounding accents.
According to Recorded Future, AI also made it possible to quickly distribute the misleading clips in several languages spoken in Europe, such as English, German, French, Polish, and Turkish (all of which, incidentally, are languages supported by ElevenLabs).
Recorded Future attributed the activity to the Social Design Agency, a Russia-based organization that the U.S. government sanctioned last March for running "a network of over 60 websites that impersonated genuine news organizations in Europe, then used bogus social media accounts to amplify the misleading content of the spoofed websites." All of this was done "on behalf of the Government of the Russian Federation," the U.S. State Department said at the time.
The campaign's overall impact on European public opinion was minimal, Recorded Future concluded.
This isn't the first time ElevenLabs' products have been singled out for alleged misuse. The company's technology was behind a robocall impersonating President Joe Biden that urged voters not to go out and vote in a January 2024 primary election, a voice fraud detection company concluded, according to Bloomberg. In response, ElevenLabs said it had rolled out new safety features, such as automatically blocking politicians' voices.
ElevenLabs prohibits "unauthorized, harmful, or deceptive impersonation" and says it uses various tools to enforce this, such as automated and human moderation.
ElevenLabs has seen explosive growth since it was founded in 2022. It recently grew its ARR to $80 million, up from $25 million less than a year earlier, and may soon be valued at $3 billion, TechCrunch previously reported. Its investors include Andreessen Horowitz and former GitHub CEO Nat Friedman.