Facebook’s Political Algorithm and Extremism Silos

Kevin Newman
3 min read · May 26, 2020


The social media echo chamber problem is strengthened by a simple algorithmic mistake that Facebook, and seemingly every social media platform, has baked into its core. The Wall Street Journal recently shed light on part of the problem: Facebook chases "engagement" by feeding people ever more "sticky," extreme, and enraging content to keep them glued to the platform, and to Facebook's ads, longer. Facebook categorically can't fix this problem, because it sits at the core of their entire business model, which is likely why they chose to do nothing.

But there is a second side of the problem, and I'd argue a more important one: the "silo" problem. Facebook (and other social media) ranks every post on a political spectrum from left to right on a five-step scale. They rank each user that way too; you can see how they rank you in your profile's "Ad Preferences," if you're curious. They use these values to feed us posts that fall within those parameters. Before social media, polls showed people often had strong opinions on single issues, and often held points of view that didn't align completely with a party. One might be anti-abortion yet also anti-gun, for example (even if only one or the other motivated them to the polls: so-called single-issue voters).
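To make the siloing mechanism concrete, here is a minimal sketch of what such a feed filter might look like. This is purely illustrative: Facebook's actual ranking system is not public, and the 1-to-5 scale, the `score` field, and the one-step matching window are all my assumptions, not documented behavior.

```python
# Hypothetical sketch of a five-step political-spectrum feed filter.
# Scores run 1 (far left) to 5 (far right); both users and posts get one.

def feed_for(user_score, posts, window=1):
    """Return only the posts whose political score falls within
    `window` steps of the user's own score (assumed matching rule)."""
    return [p for p in posts if abs(p["score"] - user_score) <= window]

posts = [
    {"id": 1, "score": 1},  # far-left post
    {"id": 2, "score": 3},  # centrist post
    {"id": 3, "score": 5},  # far-right post
]

# A user scored 5 never sees the far-left or centrist posts.
print([p["id"] for p in feed_for(5, posts)])  # [3]
```

Under this rule, a user at either end of the scale is only ever shown one silo's content, which is the sorting effect the paragraph above describes.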

That happens less and less now, and I believe social media plays a role in this siloing effect, because these platforms are essentially sorting issue propaganda into two consistent "silos" on a simple five-step spectrum. This drives partisan polarization on bundles of issues, rather than on single issues. Where we used to have many distinct single-issue groups that parties gathered into coalitions, now there are two large groups of people who all think about and understand an entire set of issues the same way, often with their own vocabulary to describe it.

Together with the propensity of social media platforms to promote partisan extremism, this is a dangerous mix. It is society-breaking, and ripe for foreign and domestic actors to manipulate and exploit. Perhaps unlike the business model of promoting extremism, Facebook, Twitter, and the others CAN address THIS issue; they just have to be less lazy about how they categorize users and content. It's simple: instead of following the…
