Erin Jenkins was having trouble finding a home for Bruno, a friendly 2-year-old pit bull.

"We just haven't had many applications for him," she told Christopher Zara of the International Business Times in 2015.

The shelter she worked for, Karuna Bully Rescue in Boston, depended largely on its Facebook page to reach potential new owners. But despite having more than 4,700 likes on Facebook, the rescue was seeing shrinking interest in its adoption posts, a drop that coincided with the diminishing reach of its Facebook Page.

"A typical post by Karuna Bully may reach only about 95 people unless the group pays money to boost it," Zara wrote.

Karuna Bully's experience is not unique. Organizations around the world are dealing with the diminishing impact of their Facebook pages, the result of a change Facebook made in 2012 to its news feed algorithm—the formula that determines which posts show up on a user's News Feed, and the order in which they are shown.

Facebook altered its algorithm to decrease the reach of Pages, an attempt to push companies to pay to boost posts or to buy advertisements. Brands would no longer get what had been, in effect, free marketing.

In its effort to harness and monetize the advertising reach of companies, however, Facebook cast its net across all Pages, which include not only for-profit companies but also nonprofit organizations, activists, and advocates with little or no advertising budget.

Social media consulting company EdgeRank Checker found that between February 2012 and March 2014, a Facebook Page's organic reach (the reach of non-boosted posts) dropped from 16 percent of its fan base to 6.5 percent. This drop has had tangible effects for small nonprofits like Karuna Bully. With limited slots available on the News Feeds of their fans, organizations that can't afford to pay for boosted posts can't compete on Facebook with wealthier groups.

In February 2016, Facebook launched nonprofits.fb.com, a site offering tips on how best to use its service. Even this new nonprofit-geared website, however, stresses the importance of paid content to a Page's reach.

The Facebook Algorithm: Information Gatekeeper

The average Facebook user has too many friends, belongs to too many groups, and likes too many pages to see every relevant post. So Facebook uses an algorithm, a formula, to automatically estimate which posts would most interest each of its users. Posts then appear in the News Feed in this order of predicted preference.
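The general idea can be sketched in a few lines of code. This is a toy illustration only: the signals (likes, comments, post age, paid boost) and the weights are invented placeholders, since Facebook's actual formula is not public.

```python
# Toy feed-ranking sketch: score each post on invented signals,
# then order the feed by descending score. These signals and
# weights are illustrative, not Facebook's real formula.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int
    hours_old: float
    is_paid_boost: bool

def relevance_score(post: Post) -> float:
    # Engagement pushes a post up; age pushes it down;
    # a paid boost multiplies the final score.
    engagement = post.likes + 2 * post.comments
    decay = 1.0 / (1.0 + post.hours_old)
    boost = 1.5 if post.is_paid_boost else 1.0
    return engagement * decay * boost

feed = [
    Post("Karuna Bully Rescue", likes=12, comments=3, hours_old=2.0, is_paid_boost=False),
    Post("BigBrand", likes=8, comments=1, hours_old=2.0, is_paid_boost=True),
    Post("Friend", likes=40, comments=10, hours_old=24.0, is_paid_boost=False),
]
ranked = sorted(feed, key=relevance_score, reverse=True)
for p in ranked:
    print(p.author, round(relevance_score(p), 2))
```

The key point of the sketch is the last step: the user never sees the raw stream of posts, only the stream after it has been re-ordered by a score they cannot inspect.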

In June 2014, Facebook made headlines when it was revealed that, as an experiment, the social networking service had manipulated users' News Feeds to provoke and assess emotional responses.

Many were outraged that Facebook had turned nearly 700,000 of its users into unwitting subjects of a psychological experiment. To others, it was unsurprising—par for the course in today's commercial landscape where media and marketers covertly test and manipulate people's emotions on a daily basis. Ethical debate aside, it was established that an algorithm has significant potential for social control.

Facebook's algorithm-based approach stands in contrast to the approach of networking services such as Twitter, which makes no attempt to sort the posts on a user's feed, displaying every tweet by every account a user follows in reverse chronological order. (Although, on February 10, 2016, Twitter did introduce an optional algorithm-based section that highlights some potentially relevant posts at the top of a user's feed.)
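Reverse chronological ordering is the simplest possible feed policy, and it can be expressed in one line: sort by timestamp, newest first. The handles and timestamps below are invented examples.

```python
# A classic Twitter-style timeline: no scoring, no filtering,
# just every followed account's posts sorted newest-first.
from datetime import datetime

tweets = [
    ("@news",      datetime(2016, 2, 10, 9, 0)),
    ("@friend",    datetime(2016, 2, 10, 11, 30)),
    ("@nonprofit", datetime(2016, 2, 10, 10, 15)),
]
timeline = sorted(tweets, key=lambda t: t[1], reverse=True)
print([handle for handle, _ in timeline])
# → ['@friend', '@nonprofit', '@news']
```

Because the ordering depends only on time, a small nonprofit's tweet and a large brand's tweet compete on equal terms: whichever was posted more recently appears higher.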

While Facebook publicly acknowledges that it uses an algorithm, the actual formula is secret. This is likely in part because Facebook wants to protect its prized code from corporate theft, but it is also because the algorithm is not a simple, static entity; it is an interconnected network of programs that constantly adapt and change.

"Facebook's News Feed algorithm, like Google's search algorithm or Netflix's recommendation algorithm, is really a sprawling complex of software made up of smaller algorithms," wrote Will Oremus in a January 2016 article for Slate.

#Ferguson vs. #IceBucketChallenge

Over the past five years, social media has proven an invaluable tool for raising awareness and mobilizing movements across the globe. In the Arab Spring, a wave of democratic uprisings that began in 2010 in Tunisia and spread to several other Arab nations, citizens circumvented state attempts at repression and censorship by communicating and organizing via social media. In the United States, 2011's Occupy Wall Street campaign was largely arranged and publicized via social media. More recently, social media has become a primary tool for the Black Lives Matter movement.

Some activists and intellectuals argue that the non-curated approach of Twitter is more conducive to social movements than Facebook. Their concern is that Facebook's algorithm, in attempting to show you what you want to see, favors agreeable content over posts that may create tension or be seen as unpleasant.

Zeynep Tufekci, assistant professor at the University of North Carolina School of Information and Library Science, pointed to her experience witnessing the concurrent events of the ALS Ice Bucket Challenge and the protests over the shooting of Michael Brown in Ferguson, Missouri, in 2014. Her Twitter feed, she noticed, was filled with content about Ferguson, while Ice Bucket Challenge videos dominated her Facebook feed. Her observations, it turned out, were not unique. According to social media analytics company SimpleReach, Facebook posts about Ferguson reached an average of 257 News Feeds while Ice Bucket Challenge posts reached an average of 2,107.
"How the internet is run, governed, and filtered is a human rights issue," Tufekci wrote in a blog post for Medium.com. "What happens to #Ferguson affects what happens to Ferguson."

Why Should You Care?

While you may not immediately feel the impact of Facebook's algorithm, it has a daily effect, however well hidden, on the way you take in information about the world around you. In an age when Facebook is an information source for over 1.5 billion users, it should be cause for concern that wealthier organizations can pay their way onto your News Feed, often boxing out smaller organizations that you follow. And when Facebook dampens the reach of stories deemed "unpleasant," a label that could easily be applied to pressing issues such as climate change, overseas conflicts, and income inequality, your world view is being manipulated.

What Can You Do?

There are steps you can take in your personal use of Facebook to make sure certain groups aren't kept out of your News Feed. On a Page's "Like" button (or "Following" button on mobile), select "See First" from the drop-down menu to prioritize it in your News Feed. Also, under "News Feed Preferences" in "Settings," you can choose whose posts you see first.

Ultimately, knowledge is power, so as long as you know how different websites and social networks sort their information feeds, it is then up to you how to proceed. These are all tools, each better for some uses and worse for others. As long as you know their strengths and weaknesses, you have the control.