Over the last year, Facebook’s design team has been traveling around the world from Germany to Indonesia “talking with people about their experiences with misinformation.”
After a year of testing and learning and traveling the world, Facebook tweaked how they alert people when they see false news in their News Feed.
In short, they took away the red flag next to probable fake news postings.
Now, they’re more subtle about how they call bullshit.
Just to be clear, this process took a year.
Thus was fulfilled the saying “a lie can travel halfway around the world while the truth is still putting on its shoes.” (Which wasn’t actually said by Mark Twain, of course.)
Facebook took a long journey to arrive at a place of delicately calling bullshit. But ultimately I think it’s for the best.
To understand why, let’s rewind to one year ago, shortly after the election of Donald Trump. It was this time last year that we first saw the rapid and dramatic rise of the term fake news, according to Google Trends.
This was not the first incident of online fake news — Snopes could tell you that. But the role of deliberately fake news planted in social media during the election shone a new spotlight on its prominence and impact.
The trend was chronicled by BuzzFeed’s first media editor, Craig Silverman, who covered hoaxes like the fake-news cottage industry based in Macedonia. (He also has an excellent name for an email newsletter, The Fake Newsletter.)
“Fake news has been a really extraordinary story, and it turns out it’s the kind of story Craig has been kind of preparing for for some time — maybe his whole life,” Buzzfeed Editor in Chief Ben Smith told Mathew Ingram, noting Silverman’s credentials writing a book and a column for the Poynter Institute about fact-checking called Regret the Error.
“[M]y beat is going to be more about networked media or democratized media — platforms and networks, misinformation and the economic incentives for creating different types of content,” Silverman explained.
At first, the nascent term “fake news” exclusively described Facebook hoaxes and deliberately false stories. This was, of course, short-lived. The term was quickly appropriated and weaponized by president-elect Trump. The term peaked in early January 2017 after Trump called CNN “fake news” in a news conference.
Since the term fake news morphed into a Trump rallying cry and Donald Trump Jr’s t-shirt directed at “the enemy of the people,” there have been other terms used to describe the situation we find ourselves in: Alternative facts. Misinformation. Lying. Post-truth. And after post-truth became the 2016 word of the year, Stephen Colbert asserted that it was simply a rip-off of truthiness.
I think another classic term applies best: Bullshit.
Princeton philosophy professor Harry Frankfurt published his classic booklet On Bullshit in 2005, long before the term fake news existed, back when Facebook’s main use was tagging drunken photos of your friends in college. But his warnings resonate today, maybe more than ever before.
“One of the most salient features of our culture is that there is so much bullshit,” Frankfurt wrote in the opening line. “Everyone knows this.”
Bullshit isn’t necessarily lying. To lie, you have to know the facts and willfully misrepresent them. To bullshit, you sort of just share whatever comes to mind — might be true, might be false — to give off the impression you want. As Frankfurt writes:
It is impossible for someone to lie unless he thinks he knows the truth…. For the bullshitter, however, all these bets are off: he is neither on the side of the true nor on the side of the false. His eye is not on the facts at all, as the eyes of the honest man and of the liar are, except insofar as they may be pertinent to his interest in getting away with what he says. He does not care whether the things he says describe reality correctly. He just picks them out, or makes them up, to suit his purpose.
The question then becomes, why would someone bullshit? Frankfurt has some theories for the growth of bullshit.
Bullshit, he says, necessarily arises when we feel we need to have an opinion on something, whether we’re adequately informed or not. It’s increasingly hard to be indifferent in our polarized world. We’re repeatedly told to pick a side. Those who actually are indifferent in this climate are easily mocked.
And so we’re told every day on social media that we need to have an opinion. This doesn’t always create informed opinions — just loud ones. “Bullshit is unavoidable whenever circumstances require someone to talk without knowing what he is talking about,” as Frankfurt puts it:
This discrepancy is common in public life, where people are frequently impelled — whether by their own propensities or by the demands of others — to speak extensively about matters of which they are to some degree ignorant. Closely related instances arise from the widespread conviction that it is the responsibility of a citizen in a democracy to have opinions about everything, or at least everything that pertains to the conduct of his country’s affairs. The lack of any significant connection between a person’s opinions and his apprehension of reality will be even more severe, needless to say, for someone who believes it his responsibility, as a conscientious moral agent, to evaluate events and conditions in all parts of the world. (Emphasis mine.)
In other words, the more someone thinks they need to be a vocal and democratic citizen, the more susceptible they are to bullshit. It’s like the Dunning-Kruger effect for opinions.
The more someone thinks that sharing their opinion on Facebook is a public good, the more they are likely to share bullshit.
“Stupidity’s never blind or mute,” wrote French philosopher Gilles Deleuze. “We’re riddled with pointless talk, insane quantities of words and images… Repressive forces don’t stop people expressing themselves but rather force them to express themselves.”
Therein lies Facebook’s bullshit problem, and how we arrived at their carefully contemplated efforts to combat falsehoods a year after the rise of the term “fake news.”
So how do you fix that problem? As Facebook’s design team wrote earnestly, “We learned that dispelling misinformation is challenging.”
“As the designer, researcher, and content strategist driving this work, we wanted to share the process of how we got here and the challenges that come with designing against misinformation,” explained Jeff Smith, Product Designer, Grace Jackson, User Experience Researcher, and Seetha Raj, Content Strategist.
Smith, Jackson and Raj wrote that simply telling people something is “false” or “disputed” doesn’t always do the trick. Strong language or a red flag can have an unintended effect.
“Just because something is marked as ‘false’ or ‘disputed’ doesn’t necessarily mean we will be able to change someone’s opinion about its accuracy,” they write. “In fact, some research suggests that strong language or visualizations (like a bright red flag) can backfire and further entrench someone’s beliefs.”
That makes sense if you consider it in Frankfurt’s context. Bullshit isn’t about verifying truth and falsehoods. It’s more about projecting an identity and worldview. If you call bullshit on someone, that isn’t likely to get them to admit, Oh right, that’s total bullshit and I really don’t know what I’m talking about.
So how do you tell people that what they’re seeing on your platform next to your aunt’s Christmas photos and your friend’s puppy videos is… bullshit? How do you help people recognize and avoid being taken in by bullshit?
You do it delicately.
Facebook’s current best practices, based on A/B testing, include showing related articles that dispute an article in question. This doesn’t slow down clicks on hoax articles, but it does reduce the number of people sharing them.
“Using language that is unbiased and non-judgmental helps us to build products that speak to people with diverse perspectives,” the Facebook design team writes.
This sounds like the design equivalent of gently asking, “Are you sure?” rather than shouting “Bullshit!”
It might seem crazy that it took more than a year of traveling, planning and testing for Facebook to arrive at the conclusion that they should be a little more subtle and delicate when calling bullshit. But it’s the right approach.
It may feel good to yell Bullshit! when you see bullshit. That’s essentially what liberals did with the term fake news. Then it was immediately turned against them. A lot of good that did.
The delicate — and effective — art of calling bullshit requires a lot less yelling, and a little more talking.