
'Tool for grifters': AI deepfakes push bogus sexual cures
'Tool for grifters': AI deepfakes push bogus sexual cures / Photo: Chris DELMAS - AFP

Holding an oversized carrot, a brawny, shirtless man promotes a supplement he claims can enlarge male genitalia -- one of countless AI-generated videos on TikTok peddling unproven sexual treatments.

The rise of generative AI has made it easy -- and financially lucrative -- to mass-produce such videos with minimal human oversight, often featuring fake celebrity endorsements of bogus and potentially harmful products.

In some TikTok videos, carrots are used as a euphemism for male genitalia, apparently to evade content moderation policing sexually explicit language.

"You would notice that your carrot has grown up," the muscled man says in a robotic voice in one video, directing users to an online purchase link.

"This product will change your life," the man adds, claiming without evidence that the herbs used as ingredients boost testosterone and send energy levels "through the roof."

The video appears to be AI-generated, according to a deepfake detection service recently launched by the Bay Area-headquartered firm Resemble AI, which shared its results with AFP.

"As seen in this example, misleading AI-generated content is being used to market supplements with exaggerated or unverified claims, potentially putting consumers' health at risk," Zohaib Ahmed, Resemble AI's chief executive and co-founder, told AFP.

"We're seeing AI-generated content weaponized to spread false information."

- 'Cheap way' -

The trend underscores how rapid advances in artificial intelligence have fueled what researchers call an AI dystopia, a deception-filled online universe designed to manipulate unsuspecting users into buying dubious products.

They include everything from unverified -- and in some cases, potentially harmful -- dietary supplements to weight loss products and sexual remedies.

"AI is a useful tool for grifters looking to create large volumes of content slop for a low cost," misinformation researcher Abbie Richards told AFP.

"It's a cheap way to produce advertisements," she added.

Alexios Mantzarlis, director of the Security, Trust, and Safety Initiative at Cornell Tech, has observed a surge of "AI doctor" avatars and audio tracks on TikTok that promote questionable sexual remedies.

Some of these videos, many with millions of views, peddle testosterone-boosting concoctions made from ingredients such as lemon, ginger and garlic.

More troublingly, rapidly evolving AI tools have enabled the creation of deepfakes impersonating celebrities such as actress Amanda Seyfried and actor Robert De Niro.

"Your husband can't get it up?" Anthony Fauci, former director of the National Institute of Allergy and Infectious Diseases, appears to ask in a TikTok video promoting a prostate supplement.

But the clip is a deepfake, using Fauci's likeness.

- 'Pernicious' -

Many manipulated videos are created from existing ones, modified with AI-generated voices and lip-synced to match what the altered voice says.

"The impersonation videos are particularly pernicious as they further degrade our ability to discern authentic accounts online," Mantzarlis said.

Last year, Mantzarlis discovered hundreds of ads on YouTube featuring deepfakes of celebrities -- including Arnold Schwarzenegger, Sylvester Stallone, and Mike Tyson -- promoting supplements branded as erectile dysfunction cures.

The rapid pace of generating short-form AI videos means that even when tech platforms remove questionable content, near-identical versions quickly reappear -- turning moderation into a game of whack-a-mole.

Researchers say this creates unique challenges for policing AI-generated content, requiring novel solutions and more sophisticated detection tools.

AFP's fact checkers have repeatedly debunked scam ads on Facebook promoting treatments -- including erectile dysfunction cures -- that use fake endorsements by Ben Carson, a neurosurgeon and former US cabinet member.

Yet many users still consider the endorsements legitimate, illustrating the appeal of deepfakes.

"Scammy affiliate marketing schemes and questionable sex supplements have existed for as long as the internet and before," Mantzarlis said.

"As with every other bad thing online, generative AI has made this abuse vector cheaper and quicker to deploy at scale."

(A.Lehmann--BBZ)