Berliner Boersenzeitung - The fight over a 'dangerous' ideology shaping AI debate


The fight over a 'dangerous' ideology shaping AI debate
Photo: WANG Zhao - AFP/File

Silicon Valley's favourite philosophy, longtermism, has helped to frame the debate on artificial intelligence around the idea of human extinction.

But increasingly vocal critics are warning that the philosophy is dangerous, and the obsession with extinction distracts from real problems associated with AI like data theft and biased algorithms.

Author Emile Torres, a former longtermist turned critic of the movement, told AFP that the philosophy rested on the kind of principles used in the past to justify mass murder and genocide.

Yet the movement and linked ideologies like transhumanism and effective altruism hold huge sway in universities from Oxford to Stanford and throughout the tech sector.

Venture capitalists like Peter Thiel and Marc Andreessen have invested in life-extension companies and other pet projects linked to the movement.

Elon Musk and OpenAI's Sam Altman have signed open letters warning that AI could make humanity extinct -- though they stand to benefit by arguing only their products can save us.

Ultimately, critics say, this fringe movement holds far too much influence over public debates about the future of humanity.

- 'Really dangerous' -

Longtermists believe we are dutybound to try to produce the best outcomes for the greatest number of humans.

This is no different from the thinking of 19th-century liberals, but longtermists have a much longer timeline in mind.

They look to the far future and see trillions upon trillions of humans floating through space, colonising new worlds.

They argue that we owe the same duty to each of these future humans as we do to anyone alive today.

And because there are so many of them, they carry much more weight than today's specimens.

This kind of thinking makes the ideology "really dangerous", said Torres, author of "Human Extinction: A History of the Science and Ethics of Annihilation".

"Any time you have a utopian vision of the future marked by near infinite amounts of value, and you combine that with a sort of utilitarian mode of moral thinking where the ends can justify the means, it's going to be dangerous," said Torres.

If a superintelligent machine were about to spring to life with the potential to destroy humanity, longtermists are bound to oppose it no matter the consequences.

When asked in March by a user of Twitter, the platform now known as X, how many people could die to stop this happening, longtermist ideologue Eliezer Yudkowsky replied that there only needed to be enough people "to form a viable reproductive population".

"So long as that's true, there's still a chance of reaching the stars someday," he wrote, though he later deleted the message.

- Eugenics claims -

Longtermism grew out of work done by Swedish philosopher Nick Bostrom in the 1990s and 2000s around existential risk and transhumanism -- the idea that humans can be augmented by technology.

Academic Timnit Gebru has pointed out that transhumanism was linked to eugenics from the start.

British biologist Julian Huxley, who coined the term transhumanism, was also president of the British Eugenics Society in the 1950s and 1960s.

"Longtermism is eugenics under a different name," Gebru wrote on X last year.

Bostrom has long faced accusations of supporting eugenics after he listed as an existential risk "dysgenic pressures", essentially less-intelligent people procreating faster than their smarter peers.

The philosopher, who runs the Future of Humanity Institute at the University of Oxford, apologised in January after admitting he had written racist posts on an internet forum in the 1990s.

"Do I support eugenics? No, not as the term is commonly understood," he wrote in his apology, pointing out it had been used to justify "some of the most horrific atrocities of the last century".

- 'More sensational' -

Despite these troubles, longtermists like Yudkowsky, a high school dropout known for writing Harry Potter fan-fiction and promoting polyamory, continue to be feted.

Altman has credited him with getting OpenAI funded and suggested in February he deserved a Nobel peace prize.

But Gebru, Torres and many others are trying to refocus on harms like theft of artists' work, bias and concentration of wealth in the hands of a few corporations.

Torres, who uses the pronoun they, said while there were true believers like Yudkowsky, much of the debate around extinction was motivated by profit.

"Talking about human extinction, about a genuine apocalyptic event in which everybody dies, is just so much more sensational and captivating than Kenyan workers getting paid $1.32 an hour, or artists and writers being exploited," they said.

(F.Schuster--BBZ)