Senate Lawmakers Want to Pop Social Media Filter Bubbles


A newly proposed bill would require large social media companies to give users the chance to opt out of algorithmically curated content.

A bipartisan group of lawmakers wants social media platforms to be more transparent about how they sort the information users see on their sites and to give users the chance to opt out of content curation.

The Filter Bubble Transparency Act, introduced Tuesday in the Senate, would require large social media companies to shed light on the “opaque algorithms” used to curate posts on users’ feeds. Under the bill, platforms would be required to tell users when such algorithms are in use and give people the option to switch to an “input-transparent algorithm” instead.

The bill, sponsored by Sens. John Thune, R-S.D., Richard Blumenthal, D-Conn., Jerry Moran, R-Kan., Marsha Blackburn, R-Tenn., and Mark Warner, D-Va., would make companies that violate those rules subject to civil penalties from the Federal Trade Commission.

The legislation would apply to any social media company that employs more than 500 people, pulls in more than $50 million in annual revenue and collects data on more than 1 million people. That means Facebook, Twitter, YouTube, Instagram and other high-profile platforms would fall under its purview.

Today, most social media platforms rely on algorithms to determine what content users are exposed to on their sites. In general, these algorithms are meant to optimize “relevance,” bringing the posts users are most likely to engage with to the top of their feeds and down-ranking content that they’re less likely to engage with. 

These algorithms base their decisions on dozens of data points about individual users, from basic demographics like age, education and gender, to more subjective information like perceived political beliefs and relationships with other users. They also factor in users’ previous online behavior—someone who engaged with liberal-leaning news articles is more likely to see liberal-leaning content in the future.

These practices create so-called “filter bubbles,” a sort of algorithmic feedback loop that significantly modifies users’ online experience based on their past behavior. By continuously reinforcing past behavior, filter bubbles could keep users from seeing information that falls outside their perceived interests, social spheres or political opinions. Today, many fault automated content curation for deepening the partisan divide in U.S. politics.

“Many people are unaware that much of the content they see on internet platforms is determined by sophisticated algorithms and artificial intelligence that draw on data about each consumer’s online activity,” Thune said when introducing the legislation on the Senate floor. “Increasingly, every aspect of our online experience is personalized based on the vast amount of information companies collect about us—from our age and occupation to how many times we visit certain websites.”

By giving users the option to switch to an “input-transparent algorithm,” which indexes information by chronology or some other variable, the bill could help users expose themselves to a wider, more varied array of online content.

“Consumers deserve to control their own online experiences instead of being manipulated by Big Tech’s algorithms and analytics,” Blumenthal said in a statement. “Our bipartisan bill will allow consumers to regain some control over their online experience by letting them simply opt out of the filter bubble.”