Berliner Boersenzeitung - Biden robocall: Audio deepfake fuels election disinformation fears


Biden robocall: Audio deepfake fuels election disinformation fears
Biden robocall: Audio deepfake fuels election disinformation fears / Photo: Roberto SCHMIDT - AFP

The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year, thanks to proliferating voice cloning tools that are cheap, easy to use, and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers fret over the impact of AI tools that create videos and text so seemingly real that voters could struggle to distinguish truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended building audio watermarks or digital signatures into such tools as possible protections, as well as regulation making them available only to verified users.

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

(U.Gruber--BBZ)