Opinion Makers Discouraged By Facebook News Feed Study

Researchers, opinion makers, scientists and traditional media outlets are discouraged by a recent Facebook study, which showed that people choose their own news stories on Facebook and are not much influenced by opposing views.

Traditional media outlets believe opposing views should appear in the news feeds of Facebook users, rather than only what each user actually prefers to click on. Their argument is that democracy, via opposing viewpoints, is then represented in users' news feeds, irrespective of what they might actually like, prefer or eventually click on.

Business Insider UK reports:

Facebook data scientists published a study in Science magazine this week saying that the social network doesn’t completely isolate its users from different political viewpoints — but that even if it did, there might not be anything wrong with that.

The issue at hand is whether or not Facebook’s filtering algorithm keeps people from reading news stories and opinions they disagree with, since it’s designed to mostly show you things you’d like. This matters because exposure to different political ideas is important to the concept of democracy. After all, you can’t make a smart decision without considering all sides.

But if the Facebook News Feed is automatically bumping stuff you might not agree with down the page, where this same study shows it’s far less likely to be clicked on, then there are entire worlds of perspective you could miss out on.
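To make that mechanism concrete, here is a minimal Python sketch of the kind of click-driven ranking being described. It is not Facebook's actual algorithm: the `rank_feed` function, the `click_history` counts and the `cross_cutting` penalty are all illustrative assumptions.

```python
# Toy illustration of click-driven feed ranking (not Facebook's real algorithm).
# Stories similar to what a user has clicked before float up; "cross-cutting"
# stories the user tends to ignore sink down the page.

from dataclasses import dataclass

@dataclass
class Story:
    title: str
    topic: str
    cross_cutting: bool  # presents a viewpoint the user likely disagrees with

def rank_feed(stories, click_history):
    """Order stories by a naive affinity score based on past clicks."""
    def score(story):
        affinity = click_history.get(story.topic, 0)   # past clicks on this topic
        penalty = 0.5 if story.cross_cutting else 0.0  # assumed demotion
        return affinity - penalty
    return sorted(stories, key=score, reverse=True)

if __name__ == "__main__":
    history = {"sports": 12, "liberal politics": 8, "conservative politics": 1}
    feed = [
        Story("Game recap", "sports", False),
        Story("Op-ed you'd agree with", "liberal politics", False),
        Story("Op-ed you'd disagree with", "conservative politics", True),
    ]
    for s in rank_feed(feed, history):
        print(s.title)
```

Run with any click history you like; the point is simply that a ranking fed by past clicks will keep pushing the already-ignored, cross-cutting stories further down the page.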

The study finds that on average, around 23% of a user’s friends will be of a different political affiliation, and an average of 29% of News Feed stories will present a user with an opposing viewpoint.

Meanwhile, Facebook says that the filtering algorithm itself is more affected by which stories a user clicks on than by anything Facebook itself does.

In other words, it’s bad, but not as bad as Facebook’s worst critics fear.

“This shows that the effects that I wrote about exist and are significant, but they’re smaller than I would have guessed,” Eytan Bakshy, the data scientist who led the study, told the New York Times.

In the study, Facebook refers to displaying a story that a user won’t necessarily agree with as “cross-cutting.”

The baffling part is that Facebook’s study says “we do not pass judgment on the normative value of cross-cutting exposure,” and that “exposure to cross-cutting viewpoints is associated with lower levels of political participation.”

Microsoft Research scientist Christian Sandvig has a problem with this approach.

“So the authors present reduced exposure to diverse news as a ‘could be good, could be bad’ but that’s just not fair. It’s just ‘bad.’ There is no gang of political scientists arguing against exposure to diverse news sources,” Sandvig writes in a blog entry.

There’s also a major flaw in the study. Of Facebook’s roughly 200 million American users, the research team opted to study just 10.1 million of them, about 5%. And that sample was drawn only from the roughly 9% of users who list their political affiliation on their profiles.

Which, after crunching the numbers, means that Facebook opted to study only a small, self-selected fraction of American Facebook users.
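The arithmetic can be checked directly from the figures above (200 million American users, a 10.1 million sample, and roughly 9% of users listing an affiliation); the variable names below are just for illustration.

```python
# Back-of-the-envelope check on the study's sample size.
us_users = 200_000_000          # approximate American Facebook users
sample = 10_100_000             # users actually included in the study
self_reporting_share = 0.09     # share of users listing a political affiliation

print(f"Sample as share of all US users: {sample / us_users:.1%}")  # roughly 5%
print(f"Users who list an affiliation: {us_users * self_reporting_share:,.0f}")
```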

The big problem with this is that it means the sampling wasn’t random. Facebook chose from a base of people who feel strongly enough about politics to list it on their profile page, which isn’t representative of Facebook’s population at large.

“People who self-identify their politics are almost certainly going to behave quite differently, on average, than people who do not, when it comes to the behavior in question which is sharing and clicking through ideologically challenging content,” wrote University of North Carolina Assistant Professor Zeynep Tufekci in a Medium post on the study.
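As a rough illustration of Tufekci's point, the short simulation below uses entirely made-up click rates to show how an estimate computed only from users who declare an affiliation can diverge from the population-wide figure. The 9% declaration share comes from the study; everything else is assumed.

```python
import random

random.seed(0)

# Hypothetical population: 9% of users declare a political affiliation.
# Assume (purely for illustration) that declarers click cross-cutting links
# at a different rate than non-declarers.
N = 100_000
population = []
for _ in range(N):
    declares = random.random() < 0.09
    click_rate = 0.10 if declares else 0.25  # assumed behavioral difference
    population.append((declares, click_rate))

overall = sum(rate for _, rate in population) / N
declared_only = [rate for declares, rate in population if declares]
sample_estimate = sum(declared_only) / len(declared_only)

print(f"True population click rate:   {overall:.3f}")
print(f"Estimate from declarers only: {sample_estimate:.3f}")
```

With these made-up numbers the declarers-only estimate lands far from the population figure, which is the generalization problem critics raised about the study's sample.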

Even with all of these issues and strange philosophical bents, it’s one of the first studies of its kind, and will prove invaluable to researchers going forward.
