

Feb 17, 2017

Fake news. It's complicated.

By Claire Wardle, First Draft News Research Director

By now we've all agreed the term "fake news" is unhelpful, but without an alternative, we're left awkwardly using air quotes whenever we utter the phrase. The reason we're struggling with a replacement is that this is about more than news; it's about the entire information ecosystem. And the term "fake" doesn't begin to describe the complexity of the different types of misinformation (the inadvertent sharing of false information) and disinformation (the deliberate creation and sharing of information known to be false).

To understand the current information ecosystem, we need to break down three elements:

1. The different types of content that are being created and shared

2. The motivations of those who create this content


3. The ways this content is being disseminated

This matters. As Danah Boyd outlined in a recent piece, we are at war. An information war. We certainly should worry about people (including journalists) unwittingly sharing misinformation, but far more concerning are the systematic disinformation campaigns. Previous attempts to influence public opinion relied on 'one-to-many' broadcast technologies, but social networks allow 'atoms' of propaganda to be directly targeted at users who are more likely to accept and share a particular message. Once they inadvertently share a misleading or fabricated article, image, video or meme, the next person who sees it in their social feed probably trusts the original poster, and goes on to share it themselves. These 'atoms' then rocket through the information ecosystem at high speed, powered by trusted peer-to-peer networks.

This is far more worrying than fake news sites created by profit-driven Macedonian teenagers.

. . .

The Different Types of Mis- and Disinformation
Back in November, I wrote about the different types of problematic information I saw circulate during the US election. Since then, I've been trying to refine a typology (and thank you to Global Voices for helping me to develop my definitions even further). I would argue there are seven distinct types of problematic content that sit within our information ecosystem. They sit on a scale, one that loosely measures the intent to deceive.

. . .
Why is this type of content being created?

If we're serious about developing solutions to these problems, we also need to think about who is creating these different types of content and why it is being created.

I saw Eliot Higgins present in Paris in early January, and he listed four 'Ps' which helped explain the different motivations. I've been thinking about these a great deal and, using Eliot's original list, have identified four additional motivations for the creation of this type of content: Poor Journalism, Parody, to Provoke or 'Punk', Passion, Partisanship, Profit, Political Influence or Power, and Propaganda.

This is a work in progress, but once you start breaking these categories down and mapping them against one another, you begin to see distinct patterns in the types of content created for specific purposes.
. . .

Dissemination Mechanisms
Finally, we need to think about how this content is being disseminated. Some of it is being shared unwittingly by people on social media, clicking retweet without checking. Some of it is being amplified by journalists who are now under more pressure than ever to make sense of, and accurately report, information emerging on the social web in real time. Some of it is being pushed out by loosely connected groups who are deliberately attempting to influence public opinion, and some of it is being disseminated as part of sophisticated disinformation campaigns, through bot networks and troll factories. (As you can see, I need to work up a 3D matrix to map my graph against the different dissemination mechanisms.)

As this BuzzFeed article highlights, a group of US Trump-supporting teenagers have connected online to influence the French election in April. They have shared folders of shareable 'meme-shells' so that even those who can't speak French can drop visuals into hashtag streams. It's now incredibly easy for loosely connected groups to mobilize, using free tools to coordinate private messaging.

When messaging is coordinated and consistent, it easily fools our brains, which are already exhausted and increasingly reliant on heuristics (simple psychological shortcuts) because of the overwhelming amount of information flashing before our eyes every day. When we see multiple messages about the same topic, our brains use that as a shortcut to credibility. "It must be true," we say. "I've seen that same claim several times today."

On the night of the Inauguration, attendees at the Deploraball boasted to This American Life that they had "memed" Trump into the White House. Listen to an excerpt.

They understand that we're much less likely to be critical of visuals. We're much less likely to be critical of information that supports our existing beliefs. And, as information overload exhausts our brains, we're much easier to influence.

. . .

What can we do?
We all play a crucial part in this ecosystem. Every time we passively
accept information without double-checking, or share a post, image or
video before we've verified it, we're adding to the noise and confusion. The ecosystem is now so polluted that we have to take responsibility for independently checking what we see online.

In the weeks after the US election, we saw journalists track down fake news creators. One consistent element was that these creators talked about trying to create news that would fool people on the Left, and how they failed. As fake news creator Jestin Coler told NPR, "We've tried to do similar things to liberals. It just has never worked, it never takes off. You'll get debunked within the first two comments and then the whole thing just kind of fizzles out."

But liberal debunking primacy was short-lived. Since Trump's inauguration, we're seeing both sides falling for and sharing false information. Whether it's the 'rogue' Twitter accounts that no one has been able to independently verify, the Trump executive order meme generator, users retweeting a post by Jill Stein's parody account because they desperately wanted it to be real, or claims that Vice-President Pence had deleted a tweet condemning the Muslim ban when it was still sitting on his timeline from December, the Left is showing that it is just as human as the Right. When humans are angry and fearful, their critical thinking skills diminish.

Craig Silverman was a guest on the "On The Media" radio show and talked about the need for emotional skepticism. I couldn't agree more. This isn't just about funding more news literacy projects; it's about teaching people to second-guess their instinctual reactions. If you find yourself incredibly angry at a piece of content, or feeling smug (because your viewpoint has been reaffirmed), take another look.

You're told to wait 20 minutes before reaching for a second helping of food because your brain needs time to catch up with your stomach; the same is true of information. Maybe you don't need to wait 20 minutes before clicking the share button, but two minutes is probably sensible.

This is a crucial time. If we're going to truly understand the situation we find ourselves in, we need to understand the severity and we need to understand what we're fighting. Throwing the term "fake news" around, even with air quotes, is getting us nowhere.

If you are interested in this topic, I have created an ongoing reading list.
