Is Facebook an echo chamber? Does the social network help us create filter bubbles, through which we’re only exposed to content and opinions that are like our own? According to the company, not really.
In a new study published today in the journal Science, Facebook claims that users themselves, not its News Feed ranking algorithm, are mostly responsible for making their feeds ideologically consistent.
“While News Feed surfaces content that is slightly more aligned with an individual’s own ideology (based on that person’s actions on Facebook), who they friend and what content they click on are more consequential than the News Feed ranking in terms of how much diverse content they encounter,” according to Facebook’s Data Science page.