As with any toxic relationship, the possibility of a breakup sparks feelings of terror—and maybe a little relief. That’s the spot Facebook has put the news business in.
In January, the social media behemoth announced it would once again alter its News Feed algorithm to show users even more posts from their friends and family, and a lot fewer from media outlets.
The move isn’t all that surprising. Ever since the 2016 election, the Menlo Park–based company has been under siege for creating a habitat where fake news stories flourished. Its executives were dragged before Congress last year to testify about how they sold ads to Russians who wanted to influence the U.S. election. In some ways, then, it’s simply easier to get out of the news business altogether.
But for the many news outlets that have come to rely on Facebook funneling readers to their sites, the impact of a separation sounds catastrophic.
In an open letter to Zuckerberg, San Francisco Chronicle editor-in-chief Audrey Cooper decried the social media company’s sudden change of course on Jan. 12. “We struggled along, trying to anticipate the seemingly capricious changes in your news-feed algorithm. We created new jobs in our newsrooms and tried to increase the number of people who signed up to follow our posts on Facebook. We were rewarded with increases in traffic to our websites, which we struggled to monetize.”
The strategy worked for a time, she says.
“We were successful in getting people to ‘like’ our news, and you started to notice,” wrote Cooper. “Studies show more than half of Americans use Facebook to get news. That traffic matters because we monetize it—it pays the reporters who hold the powerful accountable.”
But just as newspapers learned to master Facebook’s black box, so, too, did more nefarious operations, Cooper noted. Consumers, meanwhile, have grimaced as their favorite media outlets have stooped to sensational headlines to lure Facebook’s web traffic. They’ve become disillusioned by the flood of hoaxes and conspiracy theories that have run rampant on the site.
Now sites that relied on Facebook’s algorithm have watched the floor drop out from under them when the algorithm changed—all while Facebook has gobbled up chunks of the print advertising revenue that had always sustained news operations.
It’s all landed media outlets in a hell of a quandary—it sure seems like Facebook is killing journalism. But can journalism survive without it?
It’s perhaps the perfect summation of the internet age: a website that started because a college kid wanted to rank which co-eds were hotter became a global Goliath powerful enough to influence the fate of the news industry itself.
When Facebook launched its News Feed in 2006, it ironically didn’t have anything to do with news. This was the site that still posted a little broken-heart icon when you changed your status from “In a Relationship” to “Single.”
The News Feed was intended to be a list of personalized updates from your friends. But in 2009, Facebook introduced its iconic “like” button. Soon, instead of showing posts in chronological order, the News Feed began showing you the popular posts first.
And that made all the difference. Well-liked posts soared. Unpopular posts simply went unseen. Journalists were given a new directive: If you wanted readers to see your stories, you had to play by the algorithm’s rules. Faceless mystery formulas had replaced the stodgy newspaper editor as the gatekeeper of information.
With digital ad rates tied to web traffic, the incentives in the modern media landscape could be especially perverse: write short, write a lot; pluck heartstrings or stoke fury.
Mathew Ingram, who covers digital media for Columbia Journalism Review, says such tactics might increase traffic for a while. But readers hate it. Sleazy tabloid shortcuts give you a sleazy tabloid reputation.
“Short-term, you can make a certain amount of money,” Ingram says. “Long-term, you’re basically setting fire to your brand.”
The News Feed, Zuckerberg announced in January, had skewed too far toward social video posts from national media pages and too far away from personal posts from friends and family. The company was getting back to its roots.
Even before the announcement, news sites had seen their articles get fewer and fewer hits from Facebook. In subsequent announcements, Facebook gave nervous local news outlets some better news: it would rank local community news outlets higher in the feed than national ones. It was also launching an experiment with a new section called “Today In,” focused on local news and announcements, beta-testing the concept in certain cities. But in early tests, the site seemed to have trouble determining what counts as local. The San Francisco Chronicle and other Bay Area news outlets say they’re taking a wait-and-see approach to the latest algorithm, analyzing how the impact shakes out before making changes. They’ve learned not to get excited.
There was a time Facebook was positively smug about its impact on the world. After all, it had seen its platform fan the flames of popular uprisings during the Arab Spring in countries like Tunisia and Egypt.
“By giving people the power to share, we are starting to see people make their voices heard on a different scale from what has historically been possible,” Zuckerberg bragged in a 2012 letter to investors under the header, “We hope to change how people relate to their governments and social institutions.”
And Facebook certainly has—though not in the way it intended. A 2016 BuzzFeed investigation found that “fake news” stories on Facebook, meaning hoaxes or hyper-partisan falsehoods, actually garnered more engagement than stories published by trusted outlets like The New York Times.
That, experts speculated, is another reason why Facebook, despite its massive profits, might be pulling back from news.
“As unprecedented numbers of people channel their political energy through this medium, it’s being used in unforeseen ways with societal repercussions that were never anticipated,” writes Samidh Chakrabarti, Facebook’s product manager for civic engagement, in a recent blog post.
By last May, a Harvard-Harris Poll found that almost two-thirds of voters believed that mainstream news outlets were full of fake news stories.
The danger of fake news, after all, isn’t just that we’re tricked by bogus claims. It’s that we’re pummeled by so many different contradictory stories, with so many different angles, that the task of trying to sort truth from fiction becomes exhausting.
Facebook has tried to address the fake news problem—partnering with fact checkers to examine stories, slapping “disputed” tags on suspect claims, putting counterpoints in related-article boxes—but with mixed results. The latest headache for the company arrived last week, when it was revealed that the Trump campaign had used Cambridge Analytica to mine the personal data of some 50 million Facebook users.
Facebook’s new algorithm threatens to make the fake news problem even worse. By focusing on friends and family, it could strengthen the filter bubble even further. To determine the quality of news sites, Facebook is rolling out a two-question survey asking whether users recognize certain media outlets and whether they find them trustworthy. The problem is that a lot of Facebook users, like Trump, consider the Washington Post and the New York Times to be “fake news.”
The other problem? There are a lot fewer trustworthy news sources out there. And Facebook bears some of the blame for that, too, the Chronicle’s Cooper says.
“I’ve built my career on exposing hypocrisy and wrongdoing and expecting more of those with power, which is why I have repeatedly said Facebook has aggressively abdicated its responsibility to its users and our democracy,” she says. “I expect a lot more from them, as we all should.”
A version of this article first appeared in the Inlander. Jennifer Wadsworth contributed to this report.