Berliner Boersenzeitung - Death of 'sweet king': AI chatbots linked to teen tragedy


Death of 'sweet king': AI chatbots linked to teen tragedy / Photo: Gregg Newton - AFP

A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.

Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- Homework helper to 'suicide coach'? -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advised him on rope strength for use in taking his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

National rules aimed at curbing AI risks do not exist in the United States, with the White House seeking to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

(K.Lüdke--BBZ)