Berliner Boersenzeitung - UK woman felt 'violated, assaulted' by deepfake Grok images


UK woman felt 'violated, assaulted' by deepfake Grok images / Photo: Lionel BONAVENTURE - AFP

British academic Daisy Dixon felt "violated" after the Grok chatbot on Elon Musk's X social media platform allowed users to generate sexualised images of her in a bikini or lingerie.


She was doubly shocked to see Grok even complied with one user's request to depict her "swollen pregnant" wearing a bikini and a wedding ring.

"Someone has hijacked your digital body," the philosophy lecturer at Cardiff University told AFP, adding it was an "assault" and "extreme misogyny".

As the images proliferated, "I had ... this sort of desire to hide myself," the 36-year-old academic said, adding that now "that fear has been more replaced with rage".

The revelation that X's Grok AI tool allowed users to generate images of people in underwear via simple prompts triggered a wave of outrage and revulsion.

Several countries responded by blocking the chatbot after a flood of lewd deepfakes exploded online.

According to research published Thursday by the Center for Countering Digital Hate (CCDH), a nonprofit watchdog, Grok generated an estimated three million sexualised images of women and children in a matter of days.

CCDH's report estimated that Grok generated this volume of photorealistic images over an 11-day period -- an average rate of 190 per minute.

After days of furore, Musk backed down and agreed to geoblock the function in countries where creating such images is illegal, although it was not immediately clear where the tool would be restricted.

"I'm happy with the overall progress that has been made," said Dixon, who has more than 34,000 followers on X and is active on social media.

But she added: "This should never have happened at all."

She first noticed artificially generated images of herself on X in December. Users took a few photos she had posted in gym gear and a bikini and used Grok to manipulate them.

Under the UK's new Data Act, which came into force this month, creating or sharing non-consensual deepfakes is a criminal offence.

- 'Minimal attire' -

The first images were quite tame -- changing hair or makeup -- but they "really escalated" to become sexualised, said Dixon.

Users instructed Grok to put her in a thong, enlarge her hips and make her pose "sluttier".

"And then Grok would generate the image," said Dixon, author of an upcoming book "Depraved", about dangerous art.

In the worst case, a user asked to depict her in a "rape factory" -- although Grok did not comply.

Grok on X automatically posts generated images, so she saw many in the comments on her page.

This public posting carries "higher risk of direct harassment than private 'nudification apps'", said Paul Bouchaud, lead researcher for Paris non-profit AI Forensics.

In a report released this month, he looked at 20,000 images generated by Grok, finding over half showed people in "minimal attire", almost all women.

Grok has "contributed significantly to the surge in non-consensual intimate imagery because of its popularity", said Hany Farid, co-founder of GetReal Security and a professor at the University of California, Berkeley.

He slammed X's "half measures" in response, telling AFP they are "being easily circumvented".

(S.G.Stein--BBZ)