Berliner Boersenzeitung - Death of 'sweet king': AI chatbots linked to teen tragedy


Death of 'sweet king': AI chatbots linked to teen tragedy
Photo: Gregg Newton - AFP

A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.

Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- Homework helper to 'suicide coach'? -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advised him on the strength of rope to use in taking his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

The United States has no national rules aimed at curbing AI risks, and the White House has sought to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

(K.Lüdke--BBZ)