
Living in the Age of Misinformation

The Implications of Facebook Falsehoods

Images via Shutterstock, Reclaim America Facebook page

Throughout history and literature, certain storylines are carried from century to century. One of them is the protagonist who, overcome with ambition, creates something he becomes obsessed with improving. Isolated from the world, alone in his lab with his creation, the protagonist sees nothing but progress. With no objective reality to compare against, he refines his creation again and again until one day he steps back and realizes he’s totally lost control. “I’ve created a MONSTER!” he screams. But at least Macbeth gets what he deserves, and Victor Frankenstein chases the thing he created to the ends of the earth trying to understand it.

Mark Zuckerberg is one such creator who totally lost control. AND he refuses to admit there is anything wrong with his monster.

It’s 2016. 1.79 billion people use Facebook every month. That’s more people than there are in China, the biggest country on the planet. Facebook has become one of the top sources of news consumption — 44 percent of Americans get their news from the site. Major media outlets have been scrambling to keep up.

You could say Mark Zuckerberg and his News Feed now wield as much power as world leaders. And amidst all of that power, last week alone, Facebook got in trouble for letting advertisers exclude users by race from ads, let a fake news story claiming Pope Francis endorsed Donald Trump be shared almost 1 million times, and started telling people they were dead. That is DARK. If that doesn’t trigger a “Hey, I think this algorithm’s got problems, Mark,” then there’s something seriously wrong here.


This is no dystopian future of technology running rampant, this is no Black Mirror episode, this is our lives. And Facebook had a huge role in shaping people’s realities leading up to the election, showing people more and more of what they want to see, what they agree with, and sometimes fake news. It sure doesn’t help that Facebook amplifies sensationalist news: catchy headlines that grab attention get more likes, so they are served into more people’s News Feeds and shared far faster than long-form thought pieces like this one.
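To see why that loop favors the catchy stuff, here’s a toy sketch of engagement-weighted ranking. To be clear, this is not Facebook’s actual code; the posts, weights and scoring function are all invented, but they show how a sensational hoax can outrank a long-form piece:

```python
# A deliberately oversimplified, hypothetical ranking sketch. The weights and posts
# are made up; real feed ranking uses far more signals than raw engagement.
posts = [
    {"title": "You Won't BELIEVE What the Pope Just Did", "likes": 5200, "shares": 3100, "comments": 900},
    {"title": "A 3,000-word essay on media ecosystems",   "likes": 140,  "shares": 12,   "comments": 30},
]

def engagement_score(post):
    # Shares and comments are weighted more heavily than likes, so posts that get
    # shared a lot climb the ranking fastest and reach still more News Feeds.
    return post["likes"] + 3 * post["shares"] + 2 * post["comments"]

for post in sorted(posts, key=engagement_score, reverse=True):
    print(engagement_score(post), post["title"])
```

Run it and the clickbait headline wins by a mile, which is exactly the feedback loop the BuzzFeed numbers below point to.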

BuzzFeed News recently published a long-form investigation, fact-checking articles posted on left- and right-leaning pages. It found that the more untrue an article was, the more engagement it received on Facebook.

(Chart via BuzzFeed News)

That’s the algorithm, kids. *movie trailer voice* From the same network that brought you the Ice Bucket Challenge, organized revolutions and all those amazing Joe Biden memes — we give you, this:

One example: I’m from a small town in south Louisiana. The day before the election, I looked at the Facebook page of the current mayor. Among the items he posted there in the final 48 hours of the campaign: Hillary Clinton Calling for Civil War If Trump Is Elected. Pope Francis Shocks World, Endorses Donald Trump for President. Barack Obama Admits He Was Born in Kenya. FBI Agent Who Was Suspected Of Leaking Hillary’s Corruption Is Dead.

These are not legit anti-Hillary stories. (There were plenty of those, to be sure, both on his page and in this election cycle.) These are imaginary, made up, frauds. And yet Facebook has built a platform for the active dispersal of these lies — in part because these lies travel really, really well. (The pope’s “endorsement” has over 868,000 Facebook shares. The Snopes piece noting the story is fake has but 33,000.) (NiemanLab)

That’s right: imagine the Catholic population hearing, right before the election, that Pope Francis endorsed Trump. One post sharing that story alone had 28,000 shares.

Facebook pages like Freedom Daily and Right Wing News have more than a million likes apiece, about as many as CNN and Politico, but they’ve been posting hoax articles like these at frightening rates. BuzzFeed also found that left-wing strongholds like Occupy Democrats, The Other 98% and Addicting Info slipped up a few times.

Zuckerberg denies that this sort of fake news, shared hundreds of thousands of times, influenced the outcome of the election, because “both sides are probably sharing fake news,” which I guess somehow evens things out and makes it okay?!? Zuckerberg is also quoted as saying that only about 1% of articles shared on the site are fake. The problem is that those fake articles tend to go viral and get shared far more frequently. I also wonder whether they are counting viral, untrue memes as “news,” because they often have the same effect. Here’s an official status update from Zucks:

“Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes,” Mr. Zuckerberg wrote. “Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.” (The New York Times)

BuzzFeed News found that “top right-wing Facebook news outlets published false or misleading stories 38 percent of the time, compared to 20 percent for top left-wing outlets,” at “an alarming rate,” they reported, in a headline that rings true. This is also why, when Facebook first met to address the fake news issue in May, they saw that any fix to the algorithm would disproportionately flag hoax articles posted by conservatives.

“They absolutely have the tools to shut down fake news,” said the source from Facebook, who asked to remain anonymous citing fear of retribution from the company. The source added, “there was a lot of fear about upsetting conservatives after Trending Topics,” and that “a lot of product decisions got caught up in that.” (Gizmodo)

This seems like a missed opportunity for Facebook to have nipped this mess in the bud earlier this year.

“So why doesn’t anyone check their sources anymore?” you scream into the night. It’s the responsibility of both the left and the right to be more aware of biases when reading left- or right-leaning publications. This is just another wake-up call, like this whole election. Do you think most of America even considers fact-checking? Especially when people see 30 of their friends sharing a news story, they’re going to take it as fact and the source as legitimate. What’s really disturbing about NiemanLab’s examples is that a MAYOR in Louisiana shared four fake news stories. If people see their mayor share it, chances are they’re going to think it MUST be true.

Does anyone have a shred of accountability anymore?

I’ve been thinking a lot about how Wikipedia moderates the left and the right as they edit pages to come closer to the truth.

It’s easy to bemoan the partisanship and downright nastiness of the Internet, but Wikipedia offers some much-needed optimism online. Despite the fact that the website is constructed by millions of anonymous contributors, new research shows that it reduces ideological segregation and is remarkably good at finding neutrality, even on the most contentious topics.

“It gives us hope!” said Greenstein, a Harvard Business School professor. “You do get [both sides] trying to destroy the other. But what seems to hold it together is to have one paragraph for both sides. Space is free for all intents and purposes, and we can … edit each other’s paragraphs for accurate points of view.”  (Washington Post)

Now, Wikipedia is an encyclopedia, so its editors have a vested interest in not messing up history. The new study showed that the left tended to edit the right’s pages, and the right tended to edit the left’s pages, you know, to keep everyone in check. That sort of checks-and-balances system makes Wikipedia one of the most truthful sites on the Internet.

The problem with Facebook is that the left rarely sees the News Feed of those on the right, and vice versa. Many people unfollow or even defriend others from the opposite side, making it impossible to be served opinions that conflict with their own. I see sad, beautiful monologues after Trump wins, and the beginning chatter of revolution; meanwhile, people 10 minutes from where I grew up are seeing articles about crybaby Democrats. I see articles about racist acts happening in high schools: kids tormenting black classmates because they now think it’s okay. The fact is, those people 10 minutes from where I grew up aren’t ignoring these articles; they’re simply not sharing them, so no one they know is seeing them.

Even if I wanted to report a fake news story disseminated by Republicans everywhere, that story would never appear in my feed.

What is the drawback of only seeing the same types of things over and over? Well, you get blindsided. You don’t see America for what it truly is, because it’s hardly like that in the News Feed of you and all your friends who live in cities. It may be the internet, but this is your reality. This is where you get your news. This is where you form your perceptions. And you have been seeing progress consistently for eight years. You’ve been seeing Black Lives Matter, gay marriage legalized, strong women succeeding. You’ve been seeing hope.

What’s the view from the other side? You’ve been seeing Democrats push their agendas on the environment and the Affordable Care Act, what your friends sometimes call socialism. Meanwhile, Muslims and illegal immigrants have infiltrated America, sucking you dry. You constantly see articles about Hillary’s emails: she’s being actively investigated by the FBI, she’s responsible for the Benghazi tragedy, she wants to shut down coal mines. America is a total nightmare according to your feed, and all you want is someone different, someone outside of a system that has failed you.

A divided America. Living in two worlds, side by side but so far apart. I’m not saying Facebook caused the election results, but it amplifies the divisiveness that already exists in this country. We keep seeing what we want to see, over and over and over. Our perceptions and opinions constantly get validated by like-minded people. And that’s not a good thing.

“I think we would be surprised by how many things that don’t conform to our worldview, we just tune out,” Zuckerberg said. “I don’t know what to do about that.” We are defriending, we are hiding things that make us uncomfortable, Facebook is helping us along — and we are becoming more and more isolated.

People will say, “Well, Facebook is not real life.” But I can’t tell you how many times I go out to bars and restaurants and it is all anyone is talking about. The other day I went out to dinner with my parents, looked up, and they were both on their phones, on Facebook, talking about what the neighbors and their friends and their friends’ kids are up to. This social medium has become so intertwined with our lives that we don’t even realize how much it affects us anymore.

Another classic theme in literature is that of isolation breeding disillusion. People trap themselves in houses in the middle of nowhere, and with no contact with the outside world, get stuck in their own head — in their own bubbles — and usually end up going totally insane. It is imperative that we are able to open our eyes and look beyond this algorithm, and question what we see. And I know those who are open-minded will do that, but what about the rest of America?

Should we have a set of trusted editors watching Facebook to flag and remove fake news? I doubt Zuckerberg would spend that kind of money, especially since he’s denying Facebook has problems. Plus, just this year he fired Facebook’s “Trending Editors,” real-life humans who used to write and manage the headlines in your Trending section, and replaced those people with what? You guessed it. Another algorithm. A “newsworthy” algorithm.

Okay, okay, what about another algorithm? I hesitate to suggest one, but what about a “Fact-Checking Algorithm” that cross-checks articles posted on Facebook against other sources on the Internet? If the algorithm doesn’t find any matching headlines elsewhere, the article gets flagged, then removed.
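Here’s a rough sketch of what that could look like. Everything in it is hypothetical: the search_headlines helper and the tiny trusted-headline list are stand-ins I made up for whatever real search index or fact-checking service would actually be queried.

```python
# Hypothetical sketch of the "Fact-Checking Algorithm" idea described above.
# TRUSTED_HEADLINES and search_headlines() are invented stand-ins for a real
# search index of reputable outlets.

TRUSTED_HEADLINES = {
    "reuters.com": ["Candidates make final push on eve of election"],
    "apnews.com":  ["Candidates make final push on eve of election"],
}

def search_headlines(title):
    """Return trusted domains carrying a headline that roughly matches `title`."""
    words = set(title.lower().split())
    matches = []
    for domain, headlines in TRUSTED_HEADLINES.items():
        for headline in headlines:
            overlap = words & set(headline.lower().split())
            if len(overlap) >= 0.6 * len(words):   # crude similarity threshold
                matches.append(domain)
                break
    return matches

def review_article(title, min_corroborating_sources=2):
    corroborators = search_headlines(title)
    if len(corroborators) >= min_corroborating_sources:
        return "ok"
    return "flag_for_removal"   # nobody else is reporting this story

print(review_article("Pope Francis Shocks World, Endorses Donald Trump"))  # flag_for_removal
```

Even this toy version exposes the obvious weakness: hoaxes get copied across hoax sites too, so the cross-check would have to run against a list of trustworthy outlets rather than just “anything else on the Internet.”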

Or, what about a new algorithm that would expose people to more opposing opinions in their News Feeds? You “like” a liberal article, and the suggested articles below are all from the right side. Then at least we wouldn’t be totally closed off from what’s happening outside of our bubble.
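A back-of-the-napkin version of that idea might look like this; the articles, their leanings and the suggest_opposing function are all invented for illustration:

```python
# Hypothetical sketch: after a user likes an article, surface a couple of pieces
# from the opposite political leaning instead of more of the same.
ARTICLES = [
    {"id": 1, "title": "Why the Minimum Wage Should Rise",        "leaning": "left"},
    {"id": 2, "title": "The Case Against a Federal Wage Mandate", "leaning": "right"},
    {"id": 3, "title": "Coal Country Feels Left Behind",          "leaning": "right"},
    {"id": 4, "title": "Climate Policy Can Create Jobs",          "leaning": "left"},
]

def suggest_opposing(liked_article, catalog, limit=2):
    """Return up to `limit` articles from the opposite leaning of the one just liked."""
    opposite = "right" if liked_article["leaning"] == "left" else "left"
    return [a for a in catalog if a["leaning"] == opposite][:limit]

liked = ARTICLES[0]                      # the user likes a left-leaning article
for suggestion in suggest_opposing(liked, ARTICLES):
    print("Suggested:", suggestion["title"])
```

The hard part, of course, isn’t the lookup; it’s labeling leanings fairly and getting people to actually click.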

The first thing we the people can do is start to hold ourselves accountable. Like the Wikipedia editors, if we see fake news, we can report it on Facebook. We can start questioning what we’re seeing, check sources when we see a suspicious article, and stop accepting things at face value, especially on Facebook. The more we all come together, open our eyes, and act as a team to report blatant misinformation, the better we can keep truth spreading in both worlds. We can also encourage our friends and family, especially those with different beliefs, to do the same.

And if we the people start holding ourselves accountable, a huge player like Facebook needs to, too. Facebook needs to admit that even if only 1% of articles are untrue, its algorithm tends to push those sensationalist hoaxes the hardest.

It seems, according to a recent New York Times article, that Facebook’s staff has started to feel the need to address some of the problems with the News Feed algorithm breeding fake news. But Zuckerberg won’t budge, and at a meeting last Thursday after the election, “many employees said they were dissatisfied with an address from Mr. Zuckerberg, who offered comments to staff that were similar to what he has said publicly.”

When anything becomes this big and powerful, its creator needs to suck it up and admit he needs to do something. Have some accountability, Zuckerberg. Get help… and fast. The fate of the free world depends on you.
