Berliner Boersenzeitung - Biden robocall: Audio deepfake fuels election disinformation fears

Biden robocall: Audio deepfake fuels election disinformation fears

Photo: Roberto SCHMIDT - AFP

The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year, thanks to proliferating voice cloning tools that are cheap, easy to use and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers worry about the impact of AI tools that create videos and text so seemingly real that voters could struggle to distinguish truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended building audio watermarks or digital signatures into such tools as possible protections, as well as regulation that makes them available only to verified users.

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

(U.Gruber--BBZ)