
'Tool for grifters': AI deepfakes push bogus sexual cures
Photo: Chris DELMAS - AFP

Holding an oversized carrot, a brawny, shirtless man promotes a supplement he claims can enlarge male genitalia -- one of countless AI-generated videos on TikTok peddling unproven sexual treatments.

The rise of generative AI has made it easy -- and financially lucrative -- to mass-produce such videos with minimal human oversight, often featuring fake celebrity endorsements of bogus and potentially harmful products.

In some TikTok videos, carrots are used as a euphemism for male genitalia, apparently to evade content moderation policing sexually explicit language.

"You would notice that your carrot has grown up," the muscled man says in a robotic voice in one video, directing users to an online purchase link.

"This product will change your life," the man adds, claiming without evidence that the herbs used as ingredients boost testosterone and send energy levels "through the roof."

The video appears to be AI-generated, according to a deepfake detection service recently launched by the Bay Area-headquartered firm Resemble AI, which shared its results with AFP.

"As seen in this example, misleading AI-generated content is being used to market supplements with exaggerated or unverified claims, potentially putting consumers' health at risk," Zohaib Ahmed, Resemble AI's chief executive and co-founder, told AFP.

"We're seeing AI-generated content weaponized to spread false information."

- 'Cheap way' -

The trend underscores how rapid advances in artificial intelligence have fueled what researchers call an AI dystopia, a deception-filled online universe designed to manipulate unsuspecting users into buying dubious products.

They include everything from unverified -- and in some cases, potentially harmful -- dietary supplements to weight loss products and sexual remedies.

"AI is a useful tool for grifters looking to create large volumes of content slop for a low cost," misinformation researcher Abbie Richards told AFP.

"It's a cheap way to produce advertisements," she added.

Alexios Mantzarlis, director of the Security, Trust, and Safety Initiative at Cornell Tech, has observed a surge of "AI doctor" avatars and audio tracks on TikTok that promote questionable sexual remedies.

Some of these videos, many with millions of views, peddle testosterone-boosting concoctions made from ingredients such as lemon, ginger and garlic.

More troublingly, rapidly evolving AI tools have enabled the creation of deepfakes impersonating celebrities such as actress Amanda Seyfried and actor Robert De Niro.

"Your husband can't get it up?" Anthony Fauci, former director of the National Institute of Allergy and Infectious Diseases, appears to ask in a TikTok video promoting a prostate supplement.

But the clip is a deepfake, using Fauci's likeness.

- 'Pernicious' -

Many manipulated videos are created from existing ones, modified with AI-generated voices and lip-synced to match what the altered voice says.

"The impersonation videos are particularly pernicious as they further degrade our ability to discern authentic accounts online," Mantzarlis said.

Last year, Mantzarlis discovered hundreds of ads on YouTube featuring deepfakes of celebrities -- including Arnold Schwarzenegger, Sylvester Stallone, and Mike Tyson -- promoting supplements branded as erectile dysfunction cures.

The rapid pace of generating short-form AI videos means that even when tech platforms remove questionable content, near-identical versions quickly reappear -- turning moderation into a game of whack-a-mole.

Researchers say this creates unique challenges for policing AI-generated content, requiring novel solutions and more sophisticated detection tools.

AFP's fact checkers have repeatedly debunked scam ads on Facebook promoting treatments -- including erectile dysfunction cures -- that use fake endorsements by Ben Carson, a neurosurgeon and former US cabinet member.

Yet many users still consider the endorsements legitimate, illustrating the appeal of deepfakes.

"Scammy affiliate marketing schemes and questionable sex supplements have existed for as long as the internet and before," Mantzarlis said.

"As with every other bad thing online, generative AI has made this abuse vector cheaper and quicker to deploy at scale."

(A.Lehmann--BBZ)