Generative AI has a plethora of well-documented misuses, from fabricating academic papers to copying artists. And now, it seems to be cropping up in state influence operations.
One recent campaign was “very likely” aided by commercial AI voice generation products, including tech publicly released by the hot startup ElevenLabs, according to a recent report from Massachusetts-based threat intelligence company Recorded Future.
The report describes a Russia-linked campaign designed to undermine Europe’s support for Ukraine, dubbed “Operation Undercut,” that prominently used AI-generated voiceovers on fake or misleading “news” videos.
The videos, which targeted European audiences, attacked Ukrainian politicians as corrupt or questioned the usefulness of military aid to Ukraine, among other themes. For example, one video touted that “even jammers can’t save American Abrams tanks,” referring to devices used by US tanks to deflect incoming missiles – reinforcing the point that sending high-tech armor to Ukraine is pointless.
The report states that the video creators “very likely” used AI voice generation, including ElevenLabs tech, to make their content appear more credible. To verify this, Recorded Future’s researchers submitted the clips to ElevenLabs’ own AI Speech Classifier, which lets anyone “detect whether an audio clip was created using ElevenLabs,” and got a match.
ElevenLabs did not respond to requests for comment. Although Recorded Future noted the likely use of multiple commercial AI voice generation tools, it did not name any others besides ElevenLabs.
The usefulness of AI voice generation was inadvertently showcased by the influence campaign’s own orchestrators, who – rather sloppily – released some videos with real human voiceovers that had “a discernible Russian accent.” In contrast, the AI-generated voiceovers spoke in multiple European languages like English, French, German, and Polish, with no foreign-sounding accents.
According to Recorded Future, AI also allowed the misleading clips to be quickly released in multiple languages spoken in Europe, like English, German, French, Polish, and Turkish (incidentally, all languages supported by ElevenLabs).
Recorded Future attributed the activity to the Social Design Agency, a Russia-based organization that the U.S. government sanctioned this March for running “a network of over 60 websites that impersonated genuine news organizations in Europe, then used bogus social media accounts to amplify the misleading content of the spoofed websites.” All this was done “on behalf of the Government of the Russian Federation,” the U.S. State Department said at the time.
The overall impact of the campaign on public opinion in Europe was minimal, Recorded Future concluded.
This isn’t the first time ElevenLabs’ products have been singled out for alleged misuse. The company’s tech was behind a robocall impersonating President Joe Biden that urged voters not to go out and vote during a primary election in January 2024, a voice fraud detection company concluded, according to Bloomberg. In response, ElevenLabs said it released new safety features like automatically blocking the voices of politicians.
ElevenLabs bans “unauthorized, harmful, or deceptive impersonation” and says it uses various tools to enforce this, such as both automated and human moderation.
ElevenLabs has experienced explosive growth since its founding in 2022. It recently grew its ARR to $80 million from $25 million less than a year earlier, and may soon be valued at $3 billion, TechCrunch previously reported. Its investors include Andreessen Horowitz and former GitHub CEO Nat Friedman.