
Ask an Expert: What Makes Online Misinformation So Dangerous — And Shareable?

WRITTEN BY GABBY FERREIRA

A pandemic, natural disasters and widespread protests are all happening across the country right now, and a significant epidemic of misinformation is exacerbating the chaos.

Kim Bisheff is a lecturer in Cal Poly's Journalism Department and an expert in how misinformation spreads online.

Unequivocally false claims have been swirling across the internet — that antifa protesters are being bussed into towns ahead of protests to loot, that hydroxychloroquine is a cure for the coronavirus — and in many cases, the sheer number of people who believe and spread them only serves to more deeply fracture our shared perceptions of truth and reality.

Cal Poly News sat down with Kim Bisheff, a lecturer in Cal Poly’s Journalism Department and an expert in how misinformation spreads online, for a chat about what we face, why it’s harmful — and what we can do about it.

This conversation has been edited and condensed for clarity.

What is misinformation? What counts as misinformation on a social network like Facebook or Instagram?

It’s a really tricky term to define these days because it’s being thrown around so much, but in a general sense, misinformation is false information that is pushed out to the public with the intent to deceive.

Generally, misinformation is designed to tap into our emotional reactions: fear and anger, most frequently. We humans have this design flaw where, when we experience strong emotions — especially fear and anger — or in some cases, that feeling of validation that comes with having your opinions restated in an inflammatory way, we tend to react to that information without giving it much critical thought.

On social media, that means liking things, retweeting, “hearting,” commenting, interacting in any of those ways. When we do that, we amplify that message in a way that in the olden, pre-social media days we didn’t have the power to do.

A lot of that content is political. A lot of it is, these days, COVID-19 related.

Here’s an example: a lot of the misinformation around the pandemic is about miracle cures, like gargling with bleach or the whole hydroxychloroquine thing.

When that came out, it tapped into this really strong fear that people had, and in a vacuum of concrete information, they just ran with it. People, even very bright people, were getting hydroxychloroquine and taking it prophylactically even though there has never been any definitive evidence that it’s helpful. Of course, eventually it was shown that not only does it not help, but it can be very harmful.

But in that interim, and even still now, there are memes out there, there are narratives that are being passed on by people who are looking for a solution to this big scary thing that’s out there. They’re going to grab on to any possible solution they can find.

Another example of misinformation is texts and Facebook posts about antifa being bussed into SLO in advance of protests. That exact rumor is playing out in cities all over the country with almost the exact wording every time.

Those messages were designed to cause a fear reaction and be spread around. Locally, that led to shop owners being fearful and boarding up their stores, which had a negative emotional impact on a lot of community members. That’s an example of misinformation that spreads with absolute intent to deceive and to cause uncertainty.

Those repeated messages are super powerful because the more we’re exposed to false information, the more likely we are to accept it as true information.

How does misinformation spread? How does misinformation find a footing in an audience?

It taps into emotional triggers.

We’re already on heightened alert, at really high levels right now, and then we see something that seems to legitimize that fear. We share it, we spread it, we send it to friends because we’re worried about them.

We have this feeling of having insider information because it came to us from a friend of a friend. Once someone we know is attached to that piece of misinformation, then it lends it this legitimacy that shuts down our critical thinking even more. So, it continues to spread and spread and spread.

The interesting thing about something like that is that once a piece of misinformation has been debunked, the truth really doesn’t spread.

False information spreads much faster than true information, and that includes misinformation that’s been debunked or corrected. Once this thing has gone out there, it’s really hard to contain again.

That has to do with human nature. The emotions that are involved in debunking are less satisfying than the emotions that are involved in legitimizing our fear and anger.

There can also be a little bit of embarrassment, especially if we realize that we ourselves were suckered into believing something that was false. We consider ourselves intelligent beings and we shouldn’t fall for that stuff. It’s embarrassing! We maybe quietly think, “Oh that wasn’t right,” and we move on. Whereas when we experience the more negative emotions that are triggered by scary content, just as humans, we feel a greater urge to put it out there.

Because of the nature of viral information, it spreads quickly and in a way that causes us humans to adopt beliefs. Developing a belief is much easier than tearing it down in the face of evidence to the contrary.

When you’re dealing with people you respect and love and they’ve consumed really well-produced convincing misinformation, like that Plandemic hoax video, and they absolutely believe it, it’s just heartbreaking. It’s so hard to have those conversations. It’s so hard to convince them that it’s not something they should put any faith in. It’s challenging.

How do you confront a loved one about misinformation that they truly believe, or have shared on social media?

What I’ve learned is that first of all, it’s a very, very, very difficult task.

Secondly, once someone has adopted a belief, if you simply present them with facts contrary to that belief, you’re not going to change their mind. They’re more likely to just dig in. The best way to start to change someone’s mind is to find common ground.

In the case of the Plandemic video, for example, you could say something like this:

“Wow, I saw that video. It’s really convincing. When I first saw it, I thought, ‘I can see how this makes so much sense.’ I was curious about the woman who was interviewed, so I Googled, and look what I found. What do you think about this?”

When we bring them on that journey of discovery together, it takes the shame and accusation away and makes it more likely that they’ll be receptive to the possibility that their first interpretation was wrong.

It’s really key to make it clear this is a person I love and respect and I’m not telling you you’re stupid or wrong. I want to go on this intellectual journey with you. Let’s do this together.

What are the consequences of misinformation? How do we see misinformation impact people both on social media and then in the real world?

I am very concerned that if Americans continue to look at their social media feeds to find out about current events and how the world works, and if they continue to turn away from legitimate sources of news and science information, then when election time comes, we’re going to make some really bad decisions.

It sounds like a conspiracy theory, but there are plenty of people who would like to see the downfall of our democracy.

Part of that game plan is taking advantage of our tendency to react to inflammatory content by spreading it through social networks. We all need to inoculate ourselves against that by being aware that this is a problem that exists.

We need to think critically whenever we are confronted by those scary emotions because of something we saw or read. Of course, in the pandemic era it’s really hard to do that because everything we see is scary.

When I give talks, one thing I’ve started to tell people is they need to do some social media distancing.

I would never tell anyone to stop consuming social media because that’s a fool’s errand, but it shouldn’t be the place where we go to find out about what’s happening in the world. It should be the place where we go to find joy and share pictures of our families, beautiful hikes we’re taking, and anything that makes us feel positive emotions.

If we consume anything that makes us feel a negative emotion, especially on social media, we just need to stop, take a breath, put on our critical thinking brain and, at the very least, not react to it on social media.

If we just stop sharing it, this problem goes away.

How should people judge for themselves whether they’re looking at a credible news source?

Mark Zuckerberg is not going to save us; we gotta do this on our own. [Laughs].

When we’re deciding what’s a credible news source, one thing we should look for is whether the stories have bylines. We should then look up the people who published that information across their social channels and see: do they have a journalism background, do they have an advocacy background, who are they?

We need to improve our ability to tell the difference between fact-based writing and opinion writing. That’s a big problem.

Opinion pieces tend to generate a bigger emotional response, so in the digital world those individual stories are more likely to get shared and spread. As readers, we tend to be pretty bad at telling the difference between opinion and news when it’s not clearly labeled in a separate section in a newspaper.

Another thing to look for is attribution. Responsible, professional journalism attributes its statements to credible sources. Every statement should have clear attribution that helps us understand exactly where the reporter got that information. The word “said” is all over every professional news story. When we’re reading casually, we don’t even notice that word, but if you start looking for it intentionally, you’ll see after that “said” is a source that we can look up independently and verify.

We want our information attributed to primary sources. That means, for example, if it’s something that has to do with a real estate development, then we want to make sure the reporter is talking to the development director and not a random neighbor who’s angry about the impact that development may or may not have on their property value. Reporters do include opinion statements from the general public, but those statements are lower in the story and serve a different purpose. We want our facts to be confirmed by primary sources.

The News Literacy Project has fantastic resources for educators and for anyone who wants to improve their news literacy skills. They just launched a podcast called “Is that a fact?”

Another good resource is MediaWise, a program through the Poynter Institute that’s geared toward empowering people to become better consumers of information. Their free, online fact-checking course for first-time voters goes live in October.

Could you give any examples of misinformation that spread on social media that bled into the real world?

The “Pizzagate” rumor is the most notorious example.

In 2016, a conspiracy theory started circulating online that some prominent Democrats were operating a pedophile ring in the basement of a Washington, D.C. pizzeria. It gained so much traction and caused so much outrage among its believers that one of them armed himself with a rifle and a handgun and traveled from North Carolina to that pizzeria. He barged in, shot open a lock, and found … cooking supplies.

Not only were there no captive children, but the restaurant didn’t even have a basement. Amazingly, versions of this conspiracy theory are still circulating online.

What are some ways that misinformation is being combated online? Are there any good solutions here, or is it just a necessary evil we have to accept with the Internet?

There are some really great journalists doing fact checking work. These are non-partisan groups that are staffed by professional, experienced journalists who are looking at pieces of information that appear in the press, information that comes out of our prominent figures’ mouths, and they’re fact checking in real time. That’s become a really important function of the press.

Because fact checking itself has become so politicized and there’s so much distrust around it, I would say don’t take my word for it. Go out and look at Snopes or PolitiFact or FactCheck.org and then retrace their steps. One thing they’re really good at is showing their work. The best transparency in the journalism industry right now is being done by these fact checkers.

If you follow a Snopes link, you will be able to go through the exact steps their fact checkers took when they were debunking the myth. You can then follow those steps and look it up yourself. Use that as a starting point and do your own work from there.

We need to be critical of the information we consume, especially if it triggers a strong emotion when we’re first exposed to it, and just stop sharing anything we aren’t absolutely certain about.

If we just stop reacting to these stories on social media, this problem will go away very quickly and that’s really the only way I see it going away. I don’t know that misinformation will ever go away completely, but we can at least push it down to a realm where it will stop having such a disproportionate impact on our real world.

There has always been conspiracy out there, there’s always been propaganda. The difference now is that we have these platforms where we can make it travel fast and far. If we control ourselves when we use those platforms, if we learn to be cautious about spreading rumors, then we will be able to get a handle on it before it destroys us.

It sounds so absurd, but I really think that if we don’t learn to deal with this, if we don’t learn to be better consumers of information, smarter consumers of information, then this misinformation pandemic is going to do in our democracy.

Story originally appeared in Cal Poly News
