
Students design plugin to fight ‘selective exposure’

It’s human nature for people to favor information that reinforces their pre-existing views while avoiding information that contradicts them. It’s a primary reason some Americans have developed ultra-strong affiliations with one political party or belief system while completely discounting the other side.

Psychologists call this tendency “selective exposure,” a theory explored in great detail in the new book “Fixing Post-Truth Politics”:

Upon exposure to specific aspects of information, people tend to incorporate specific portions into their mindset. These choices are made based on their perspectives, beliefs, attitudes and decisions. People mentally dissect the information to which they are exposed, and tend to select favorable evidence while ignoring the unfavorable.

Related to selective exposure is selective perception, a form of bias that causes people to perceive messages and actions according to their frame of reference. Using selective perception, people tend to overlook or forget information that contradicts their beliefs or expectations.

Students at Carnegie Mellon University in Pittsburgh have created a web browser plugin called “ChromeView,” which seeks to counteract the effects of selective exposure.

“We learned how [Facebook] for the sake of catering to their users, creates an ideological bubble that traps the user by feeding them only one point of view,” they wrote.

Facebook’s platform functions by connecting people with the information that resonates with them, the group continued. In the case of politics, however, these “filter bubbles” reinforce users’ biases by delivering articles with a similar political leaning into their feeds.

The social web team hypothesized that an involuntary lack of exposure to dissimilar viewpoints drove up political tension. They decided to explore how a technical solution might be able to more easily expose users to differing viewpoints while also helping people understand how frequently (or infrequently) they consume content with which they disagree.

The plugin, ChromeView, was designed to do this in two ways. First, it added a news recommendation modal: when a user with the plugin installed hovers over an article in the Facebook News Feed, the modal displays similar articles from other news sources.
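ChromeView’s source isn’t shown here, but a browser extension built this way would typically use a content script that watches for hover events on feed posts and queries a recommendation service for coverage from other outlets. The sketch below is a hypothetical illustration of that idea only: the `[role="article"]` selector, the `RECOMMENDATION_API` endpoint, and the helper names are assumptions, not ChromeView’s actual code.

```typescript
// Hypothetical content-script sketch of a hover-triggered recommendation modal.
// Selectors, endpoint, and data shape are illustrative assumptions.

interface RelatedArticle { title: string; source: string; link: string; }

const FEED_POST_SELECTOR = '[role="article"]';            // assumed News Feed post container
const RECOMMENDATION_API = 'https://example.com/related'; // placeholder recommendation service

async function fetchRelatedArticles(url: string): Promise<RelatedArticle[]> {
  // Ask a (hypothetical) backend for coverage of the same story from other outlets.
  const res = await fetch(`${RECOMMENDATION_API}?url=${encodeURIComponent(url)}`);
  return res.ok ? res.json() : [];
}

function showModal(anchor: HTMLElement, articles: RelatedArticle[]): void {
  const modal = document.createElement('div');
  modal.style.cssText =
    'position:absolute;z-index:9999;background:#fff;border:1px solid #ccc;padding:8px;';
  for (const a of articles) {
    const item = document.createElement('a');
    item.href = a.link;
    item.textContent = `${a.source}: ${a.title}`;
    item.style.display = 'block';
    modal.appendChild(item);
  }
  // Position the modal just below the hovered post and remove it on mouse-out.
  const rect = anchor.getBoundingClientRect();
  modal.style.top = `${window.scrollY + rect.bottom}px`;
  modal.style.left = `${window.scrollX + rect.left}px`;
  document.body.appendChild(modal);
  anchor.addEventListener('mouseleave', () => modal.remove(), { once: true });
}

document.addEventListener('mouseover', async (event) => {
  const target = event.target as HTMLElement | null;
  const post = target?.closest<HTMLElement>(FEED_POST_SELECTOR);
  if (!post || post.dataset.chromeviewSeen) return;
  post.dataset.chromeviewSeen = 'true'; // only query each post once
  const link = post.querySelector<HTMLAnchorElement>('a[href^="http"]');
  if (!link) return;
  const related = await fetchRelatedArticles(link.href);
  if (related.length > 0) showModal(post, related);
});
```

Keeping the lookup behind a single hover listener (rather than rewriting the feed itself) is what makes this approach unobtrusive: the feed renders normally, and alternative sources appear only on demand.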

The second tactic was a visualization of the user’s political leaning, which the team called the political leaning meter. It was meant to motivate users to research both parties’ viewpoints rather than only consuming content with which they already agreed.
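Conceptually, such a meter can be as simple as a running tally of which outlets a user opens, mapped onto a left-right scale. The sketch below is a minimal illustration under assumptions of my own: the hand-labelled outlet ratings, the localStorage key, and the text gauge are hypothetical, not ChromeView’s actual data or design.

```typescript
// Hypothetical sketch of a "political leaning meter": tally the leanings of
// outlets the user opens and reduce the tally to one position on a scale.

type Leaning = 'left' | 'center' | 'right';

// Assumed, hand-labelled outlet ratings for illustration only.
const OUTLET_LEANING: Record<string, Leaning> = {
  'msnbc.com': 'left',
  'reuters.com': 'center',
  'foxnews.com': 'right',
};

interface Tally { left: number; center: number; right: number; }

const STORAGE_KEY = 'chromeview-leaning-tally'; // placeholder storage key

function loadTally(): Tally {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as Tally) : { left: 0, center: 0, right: 0 };
}

// Call this whenever the user opens an article from the feed.
function recordClick(articleUrl: string): void {
  const host = new URL(articleUrl).hostname.replace(/^www\./, '');
  const leaning = OUTLET_LEANING[host];
  if (!leaning) return; // unknown outlet: don't count it
  const tally = loadTally();
  tally[leaning] += 1;
  localStorage.setItem(STORAGE_KEY, JSON.stringify(tally));
}

// Map the tally to a score in [-1, 1]: -1 = only left-leaning reads, +1 = only right-leaning.
function leaningScore(tally: Tally): number {
  const total = tally.left + tally.center + tally.right;
  return total === 0 ? 0 : (tally.right - tally.left) / total;
}

// Render the meter as a simple text gauge, e.g. "L ----------|---------- R" when balanced.
function renderMeter(): string {
  const score = leaningScore(loadTally());
  const position = Math.round((score + 1) * 10); // 0..20 slots on the gauge
  return 'L ' + '-'.repeat(position) + '|' + '-'.repeat(20 - position) + ' R';
}
```

The point of the visualization is feedback, not judgment: seeing the marker drift toward one end is what tells a user how infrequently they encounter content they disagree with.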

“Social media has thrust politics into our daily lives and has convinced us that there are two sides, which we need to choose from,” said Zeeshan Rizvi. “Our goal with this project was to help show our users that there is another side by exposing them to their views.”

Here’s a video that demonstrates how it works.

This is important because, according to “Fixing Post-Truth Politics”:

The news consumers who don’t want to ponder which report is true and which one is false have instead opted for the selective exposure that is part of their human nature. In effect, if the report agrees with their personal line of thinking, it must be true. And if it doesn’t, it must be false …
… which has helped to lead us to the era of post-truth politics in America.

This plugin may be a first step toward giving people who want to seek out different perspectives a convenient way to do so while using Facebook.
