Berliner Boersenzeitung - Facebook's algorithm doesn't alter people's beliefs: research

Facebook's algorithm doesn't alter people's beliefs: research / Photo: JOSH EDELSON - AFP/File

Do social media echo chambers deepen political polarization, or simply reflect existing social divisions?

A landmark research project that investigated Facebook around the 2020 US presidential election published its first results Thursday, finding that, contrary to assumption, the platform's often criticized content-ranking algorithm doesn't shape users' beliefs.

The work is the product of a collaboration between Meta -- the parent company of Facebook and Instagram -- and a group of academics from US universities who were given broad access to internal company data, and signed up tens of thousands of users for experiments.

The academic team wrote four papers examining the role of the social media giant in American democracy, which were published in the scientific journals Science and Nature.

Overall, the algorithm was found to be "extremely influential in people's on-platform experiences," said project leaders Talia Stroud of the University of Texas at Austin and Joshua Tucker of New York University.

In other words, it heavily impacted what the users saw, and how much they used the platforms.

"But we also know that changing the algorithm for even a few months isn't likely to change people's political attitudes," they said, as measured by users' answers on surveys after they took part in three-month-long experiments that altered how they received content.

The authors acknowledged this conclusion might be because the changes weren't in place for long enough to make an impact, given that the United States has been growing more polarized for decades.

Nevertheless, "these findings challenge popular narratives blaming social media echo chambers for the problems of contemporary American democracy," wrote the authors of one of the papers, published in Nature.

- 'No silver bullet' -

Facebook's algorithm, which uses machine learning to decide which posts rise to the top of users' feeds based on their interests, has been accused of giving rise to "filter bubbles" and enabling the spread of misinformation.

Researchers recruited around 40,000 volunteers via invitations placed on their Facebook and Instagram feeds, and designed an experiment where one group was exposed to the normal algorithm, while the other saw posts listed from newest to oldest.

Facebook originally used a reverse chronological feed, and some observers have suggested that switching back to it would reduce social media's harmful effects.

The team found that users in the chronological feed group spent around half as much time on Facebook and Instagram as those in the algorithm group.

On Facebook, those in the chronological group saw more content from moderate friends, as well as more sources with ideologically mixed audiences.

But the chronological feed also increased the amount of political and untrustworthy content seen by users.

Despite these differences, the intervention did not produce detectable changes in measured political attitudes.

"The findings suggest that chronological feed is no silver bullet for issues such as political polarization," said coauthor Jennifer Pan of Stanford.

- Meta welcomes findings -

In a second paper in Science, the same team researched the impact of reshared content, which constitutes more than a quarter of content that Facebook users see.

Suppressing reshares has been suggested as a means to control harmful viral content.

The team ran a controlled experiment in which a group of Facebook users saw no changes to their feeds, while another group had reshared content removed.

Removing reshares reduced the proportion of political content seen, resulting in reduced political knowledge -- but again did not impact downstream political attitudes or behaviors.

A third paper, in Nature, probed the impact of content from "like-minded" users, pages, and groups in users' feeds, which the researchers found made up a majority of what active adult Facebook users in the US see.

But in an experiment involving over 23,000 Facebook users, suppressing like-minded content once more had no impact on ideological extremity or belief in false claims.

A fourth paper, in Science, did however confirm extreme "ideological segregation" on Facebook, with politically conservative users more siloed in their news sources than liberals.

What's more, 97 percent of political news URLs on Facebook rated as false by Meta's third-party fact checking program -- which AFP is part of -- were seen by more conservatives than liberals.

Meta welcomed the overall findings.

They "add to a growing body of research showing there is little evidence that social media causes harmful... polarization or has any meaningful impact on key political attitudes, beliefs or behaviors," said Nick Clegg, the company's president of global affairs.

(A.Berg--BBZ)
