Facebook

Should we stop blaming Facebook?

Written by Kamil Arli

Lance Ulanoff wrote an article about the ‘fake news’ claims against Facebook, questioning whether or not we should blame the platform.

These are things we know to be true:

  • Facebook is increasingly a leading news source for many Americans.
  • There is a crap-ton of fake and biased news on Facebook.

Here’s what people are getting wrong:

  • Facebook deserves all the blame.
  • It’s Facebook’s fault that Donald Trump is now President-Elect.

There is no defense for fake news. I hate it and lose my mind when someone shares it on social media (do you know how many times Facebookers buried Abe Vigoda or re-reported the death of a celebrity who died five years ago?). But I don’t entirely blame Facebook. Facebook is a terrible media source that should be questioned at every published syllable.

Your personal newspaper

Facebook has never been shy about having “news” aspirations. In 2006, it introduced the News Feed. Then, as now, Facebook’s definition of “news” was comically loose. From the original post introducing the News Feed:

News Feed highlights what’s happening in your social circles on Facebook. It updates a personalized list of news stories throughout the day, so you’ll know when Mark adds Britney Spears to his Favorites or when your crush is single again. Now, whenever you log in, you’ll get the latest headlines generated by the activity of your friends and social groups.

You see, there was no “news” in the “News Feed.” It was a social calendar at best and an invasion of privacy at worst.

Actual news didn’t start showing up at scale on Facebook until 2009. That was when Facebook promoted the “What’s on your mind?” content entry box to the top of the feed. That small but important change transformed the complexion of Facebook and remade tens of millions of Facebook users into content publishers.

Then in 2011, Facebook started manipulating the feed, using your own activities on Facebook (liking posts, brand pages and photos, commenting, joining groups) to influence which stories you saw in your feed.


It was at this same time that Facebook started referring to the platform as “your own personal newspaper.”

Zuckerberg’s desire for Facebook to become a news source was a direct response to the then-meteoric rise of Twitter. For me, Twitter has always been a better news source than Facebook.

There are hoaxes and fake news on Twitter, too, but, as a public social platform, it has, I would say, fewer nooks and crannies to hide in than Facebook, which starts off as a private social media platform. Ninety-nine percent of tweets are public, and while Twitter does have a content algorithm, tweets typically rise or get buried through other tweets and retweets.

But Twitter’s fortunes have waned while Facebook’s skyrocketed. During this election cycle, however, things changed.

First, Facebook faced charges of bias in its Trending Topics section, which promotes various news topics across categories like politics, science, technology and sports. The human editors (actual journalists!) were soon fired, and Facebook turned the reins back over to its trusty (and occasionally stupid) algorithms.

At the same time, Zuckerberg started backpedaling on the very idea of Facebook being a media company: “We are a tech company, not a media company. We build the tools; we do not produce any content.”


He is right. As far as I know, Facebook has no newsroom, beat reporters, copy desks or fact checkers (especially not fact checkers). I’m sure there are still some editorial types on staff, but they are not managing the News Feed or Trending feature.

More importantly, Zuckerberg and his minions are not reporting out or writing these fake stories. When it comes to news, whether fake, real or incredibly biased, Facebook is merely an aggregator. A terrible, terrible aggregator. And, no, I do not buy that simply hosting news content turns you into a media site.

The blame for dissemination and elevation of these fake and biased stories must, at least in part, fall on Facebook users. It’s their activities, after all, that drive these stories into the mainstream feed. It’s their seemingly blind belief in the ridiculous and inflammatory that gives these tales credence. They are the ones transforming the apocryphal into the possible.


But why blame yourself when it’s easy to point fingers?

Belief is a stubborn thing

As a journalist, I’ve been trained to look for truth, to not accept pat answers or anything at face value. I’m a skeptic. Seeing is only believing for me if I can see the man behind the curtain pulling the strings, as well.

I would not expect those outside the media to be as skeptical as I am. On the other hand, a modicum of critical thinking should be a prerequisite for basic humanity.

I’m not arguing that it’s a lack of thinking that got us here. When someone sees a headline like “Denzel Washington backs Donald Trump in the most epic way possible,” it should give them pause. But as Professor Thomas E. Kida notes in his 2006 book Don’t Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking, “We have a number of cognitive tendencies that lead us to form incorrect beliefs and make erroneous decisions.”

Kida writes that “The beliefs we hold are closely tied to the decisions we make. In effect, what we believe affects what we decide.” (This does explain the election.) This seems kind of obvious, but Kida’s book is a dissertation on all the ways our Gordian knot of thoughts and belief systems can lead to bad or inaccurate conclusions. He credits people with often making “very good decisions” but then points out that superstition (belief in fortune tellers and astrologers) and wish-fulfillment (hope that alternative medicine can cure an illness) can lead us astray.

In all, Kida lists six major mistakes we make in thinking, but two jumped out at me:

We prefer stories to statistics:

Anecdotal information will often win over facts.

We seek to confirm:

This confirmation bias is one of the chief ways these stories and Facebook’s algorithms abuse us. If you see a story, even a false one, that conforms to your world view, you’ll like it or share it. Enough people do that on Facebook to help drive some of the most incendiary and hyperbolic stories to the top of Trending and our “news” feeds.


Facebook is, essentially, the middleman between our wishes and the sources of wish fulfillment. For every person who is appalled by a fake or biased story shared or trending on Facebook, there’s an equal number who smile and think to themselves, “I knew it.”

Facebook, though, shouldn’t be let off the hook here.

There is an obvious financial incentive for Facebook and the fake-news providers: both make money off advertising connected to these stories. Google has already announced that it won’t let its ad products appear on these sites. While it’s not clear how Google will identify “fake news,” it’s encouraging to know it’s trying.

Facebook has also promised to ban fake news sites, with a similar lack of detail on how it will do so.

It is time for Facebook users to wake up

Since it may be damn near impossible to scrub out all the fake news, and since a ban will do nothing to address overly biased content, Facebook should go further and reprogram its algorithms to blacklist news sources that break the rules and ethics of journalism by posting blatantly false stories (one proven false headline would get a site banned from Facebook for six months).
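Purely as a thought experiment, the six-month rule proposed above could be sketched in a few lines of code. Everything here is hypothetical: the class and method names are invented for illustration and have nothing to do with any real Facebook system.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the "one proven false headline = six-month ban" rule.
# None of these names correspond to a real Facebook API.
BAN_DURATION = timedelta(days=183)  # roughly six months

class SourceBlacklist:
    def __init__(self):
        # Maps a news source's domain to the datetime when its ban expires.
        self._banned_until = {}

    def report_proven_false_headline(self, domain, now):
        """A single proven false headline bans the source for six months."""
        self._banned_until[domain] = now + BAN_DURATION

    def is_banned(self, domain, now):
        """True while the source's ban is still in effect."""
        expires = self._banned_until.get(domain)
        return expires is not None and now < expires
```

Even this toy version makes the hard part obvious: the code is trivial, but deciding what counts as a “proven false headline” is the entire problem.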

All that might help, but it won’t stop Facebook users from posting and believing lies.

It is time for Facebook users to wake up, to pause a beat or two before sharing any post. Just because it makes you feel good to read that “Hillary Clinton to Be Indicted…Your Prayers Have Been Answered” doesn’t make it so.

We are being manipulated — not by Facebook, but by external forces that know we won’t bother to check, that understand our need for confirmation and self-satisfaction.

It’s time to stop blaming Facebook and to start taking some responsibility for what we share on social media and the election results. Facebook didn’t vote. Its members did.

Source: Mashable

About the author


Kamil Arli

Editor of DigitalReview.co. Digital Media Consultant
