As we discussed in class this week, Facebook tracks which posts and articles its users interact with in order to infer their political leaning. These inferences tend to be fairly accurate because of confirmation bias: people often click on articles about topics they believe to be important or ideologies they already hold. A study by Eytan Bakshy, Solomon Messing, and Lada Adamic examined how Facebook's algorithms rank the posts in a user's News Feed. The subject group consisted of users who openly reported their political affiliation, and the authors note that more active Facebook users are more likely to report an ideological affiliation. They also found that users with less polarized opinions read posts from a wider variety of sources, and that people tend to be friends with those who share their political views: conservatives have mainly conservative friends, and the majority of liberals' friends are liberal.
Two approaches considered for measuring ideology were surveying users and joining Facebook data to the voter file. A weakness of surveying is non-response bias: users who do not hold strong political opinions are less likely to respond, so the survey sample skews toward more polarized users.
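To make the non-response bias concrete, here is a minimal simulation sketch. It is not from the study itself; the response model (response probability growing with strength of opinion) and all numbers are assumptions chosen purely for illustration.

```python
import random

random.seed(0)

# Hypothetical population: ideology scores from -1 (liberal) to +1 (conservative).
population = [random.uniform(-1, 1) for _ in range(100_000)]

def responds(ideology):
    # Assumption: users with stronger views are more likely to answer the survey.
    return random.random() < 0.1 + 0.8 * abs(ideology)

respondents = [x for x in population if responds(x)]

# Compare average strength of opinion in the full population vs. the survey sample.
avg_strength_pop = sum(abs(x) for x in population) / len(population)
avg_strength_resp = sum(abs(x) for x in respondents) / len(respondents)

print(f"population mean |ideology|:  {avg_strength_pop:.2f}")
print(f"respondent mean |ideology|: {avg_strength_resp:.2f}")
```

Under these assumptions the respondents' average strength of opinion comes out noticeably higher than the population's, which is exactly the skew non-response bias introduces: the survey makes the user base look more polarized than it is.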