Selective exposure, equilibrium effects, feedback loops

Wow! Andrew Guess and coauthors have published in Science a paper very similar to the one I described six months ago: collaborating with Facebook researchers, they turned off the ranked news feed in favor of a chronological feed for random users. As we might have anticipated, the results are basically null, except that people clearly like the chronological feed less and use Facebook/IG less often when it is turned on.

The idea that algorithm-driven echo chambers cause polarization is painfully slippery. It sounds plausible, but basically no study has found evidence for it. Guess’s paper is state of the art when it comes to evaluating this question: turn off the algorithm, observe that like-minded exposure goes down and out-partisan content goes up, and observe that polarization is unchanged. The conclusion that algorithm-driven selective exposure has no effect on polarization is hard to escape.

So is the matter finally settled? It’s hard to say. The idea seems like it will never die.

In my own research agenda, I’m working on understanding the same-but-opposite treatment: drastically increasing cross-cutting information (rather than scaling back like-minded content and hoping that cross-cutting information takes its place). Much of the existing work on cross-cutting or out-partisan news on social media is very good, but a lot of it tends to be conservative in treatment strength.

Looking forward, ranking algorithms may have (tiny) direct effects, but what I’m really interested in are the equilibrium and feedback effects. How does the ranking algorithm (or, more broadly, the way in which information is curated, distributed, and consumed) change the way media companies, politicians, and influencers create content, and what are the political implications?
