Facebook's internal research shows its algorithms amplify political polarization.
From the commentary by Lazer:
Bakshy et al. examine... whether Facebook's curation of news feeds prevents the intersection of conflicting points of view. That is, does a “filter bubble” emerge from this algorithmic curation process, so that individuals only see posts that they agree with? Such an algorithmic sorting has the potential to be unhealthy for our democracy, fostering polarization and undermining the construction of a vision of the common good... Their answer, after parsing the Facebook pages of ∼10 million U.S. individuals with self-declared ideologies, is that the curation does ideologically filter what we see.
It is laudable that Facebook supported this research and has invested in the public good of general scientific knowledge. Indeed, the information age hegemons should proactively support research on the ethical implications of the systems that they build. Facebook deserves great credit for building a talented research group and for conducting this research in a public way.
Here is the Bakshy et al. abstract:
Exposure to news, opinion, and civic information increasingly occurs through social media. How do these online networks influence exposure to perspectives that cut across ideological lines? Using deidentified data, we examined how 10.1 million U.S. Facebook users interact with socially shared news. We directly measured ideological homophily in friend networks and examined the extent to which heterogeneous friends could potentially expose individuals to cross-cutting content. We then quantified the extent to which individuals encounter comparatively more or less diverse content while interacting via Facebook’s algorithmically ranked News Feed and further studied users’ choices to click through to ideologically discordant content. Compared with algorithmic ranking, individuals’ choices played a stronger role in limiting exposure to cross-cutting content.
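To make the abstract's measurement concrete, here is a toy sketch (not the paper's code, data, or actual numbers) of the basic quantity involved: at each stage of the exposure pipeline, what fraction of stories is "cross-cutting", i.e. ideologically discordant with the user's self-declared affiliation? The stages, story counts, and slant labels below are invented for illustration.

```python
"""Toy illustration of measuring cross-cutting exposure at successive stages:
what friends shared (potential), what the ranked News Feed displayed
(algorithmic exposure), and what the user clicked (individual choice).
All figures are hypothetical, not from Bakshy et al."""

from dataclasses import dataclass


@dataclass
class Story:
    slant: str  # "liberal" or "conservative", e.g. inferred from who shares it


def cross_cutting_share(user_affiliation: str, stories: list[Story]) -> float:
    """Fraction of stories whose slant disagrees with the user's affiliation."""
    if not stories:
        return 0.0
    discordant = sum(1 for s in stories if s.slant != user_affiliation)
    return discordant / len(stories)


# Hypothetical pipeline for one conservative user.
user = "conservative"

# Stage 1: everything the user's friends shared (potential exposure).
shared_by_friends = [Story("liberal")] * 35 + [Story("conservative")] * 65

# Stage 2: the subset the ranked News Feed actually displayed.
shown_in_feed = [Story("liberal")] * 30 + [Story("conservative")] * 62

# Stage 3: the subset the user clicked through to.
clicked = [Story("liberal")] * 5 + [Story("conservative")] * 25

for label, stories in [("potential (friends)", shared_by_friends),
                       ("exposed (News Feed)", shown_in_feed),
                       ("selected (clicks)", clicked)]:
    print(f"{label:>20}: {cross_cutting_share(user, stories):.1%} cross-cutting")
```

The paper's headline comparison amounts to contrasting the drop in the cross-cutting share from stage 1 to stage 2 (attributable to algorithmic ranking) with the drop from stage 2 to stage 3 (attributable to users' own click choices); in the study, the latter drop was the larger one.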