The University of New Hampshire (UNH) hosted a MediaWise event for all students to learn about fake news, misinformation and the importance of fact-checking as the 2020 presidential election draws closer. Senior multimedia reporter Alex Mahadevan and reporter Heaven Taylor-Wynn from MediaWise gave students, both in and outside of the journalism program, examples of misinformation, terminology for distinguishing among the different types of misinformation, and tools to help them stay vigilant online.  

MediaWise is a program that teaches students from middle school to college how to separate fact from fiction in the news. It is run by the Poynter Institute, a nonprofit located in Florida. The MediaWise Voter Project is funded by Facebook; program partners include Students Learn Students Vote and the Campus Vote Project.   

The two presenters started off with statistics so students could see how big an issue misinformation is. An NYU study found that from 2015 to 2018 there were 187 million Instagram engagements with content that originated at Russian troll farms spreading disinformation. Mahadevan explained that at least seven million first-time voters get their information from social media platforms, where misinformation can spread quickly. He also cited a study from Northeastern that found that more than half of college students don't fact-check most of what they see online before sharing it. Not only that, but the Pew Research Center reported that most Americans have lost trust in the news media and reporters' ethical standards.   

Mahadevan and Taylor-Wynn told students that their program's goal is to reach 2 million voting-age college students by 2020 by releasing a voter guide with the help of John Green's production company Complexly, hiring Campus Correspondents who go through boot camps that cover misinformation in more depth than the one-hour lecture at Hamilton Smith, and running bus tours during college football season. The first piece of "fake news" the group tackled was a photo of singer Jason Derulo falling down the Met Gala stairs. While the photo seemed harmless and funny, and Derulo himself laughed about it in an interview, Mahadevan and Taylor-Wynn explained the dangers of altered images and of sharing posts and memes without knowing who or what is behind them.  

Although the presenters added funny examples throughout the lecture, they also showed many political memes and videos that were altered, cut short, given the wrong caption, or taken out of context. These examples were both left- and right-leaning, showing students that there isn't one group of people behind this and that anyone can fall victim to misinformation, no matter their views. Mahadevan and Taylor-Wynn showed how easy it is to alter someone's face (attaching professor Tom Haines' face to Tom Holland's Spider-Man), fake tweets (it took Mahadevan less than a minute to create a tweet that looked like it came from President Trump), and manipulate audio (all that's needed is a video with a range of facial expressions).  

With each new type of misinformation, Mahadevan and Taylor-Wynn gave students tools to combat it. First, there are three questions everyone should ask themselves before sharing a tweet or meme: "Who is behind the information?" "What's the evidence?" and "What do other sources say?" Additionally, social media users should be wary of how a video, tweet or meme makes them feel.   

“Does it make you angry, emotional or anxious?” Mahadevan asked the students. “Because if it makes you feel that way then you definitely need to go ahead and fact check it.”  

Along with the questions students could ask themselves when reading posts and memes, Mahadevan and Taylor-Wynn shared a plethora of fact-checking sites, reverse image search tools, and ways to spot fakes with your own eyes. The two highly recommended Google Fact Check Explorer for articles and explained that the more sources you look at, the better your understanding and view of the topic will become.  

"What you'll want to do is exercise click restraint," Taylor-Wynn said. "You want to scroll through different websites or different results and see what you can find. See what multiple sources are saying; chances are everyone is going to report the same thing in a different way, but you want to get a holistic view of what's going on."   

On top of that, people can almost always simply look up who is behind the information (or misinformation) on Google by searching who owns a certain news website or who runs a sketchy account.   

If a tweet seems fake and scrolling through the alleged original poster's feed turns up nothing, ProPublica's Politwoops collects politicians' deleted tweets.   

Mahadevan and Taylor-Wynn showed how to “spot a bot,” referring to fake social media accounts that post as a person would. A telltale sign is the account posting at strict intervals, like every hour or half-hour. These accounts will either have no followers or thousands of fake followers, and the account will post the same things over and over again, usually with lots of hashtags.  

Some of these bots can be helpful, like the Big Cases Bot, which gives updates on major court filings, or the Los Angeles Earthquake Bot, which tweets every time there is an earthquake around LA. Some bots, however, are not so helpful; there was an influx of them during the 2016 election trying to influence what people were talking about and spreading misinformation.  

If you can't tell whether an account is a bot just by looking at it, there are tools for that too. Account Analysis lets people plug in a Twitter handle to see how often that account posts, and Hoaxy gives people a visual of how specific tweets or articles are being spread by humans and bots.   

For fact-checking photos in tweets or articles, the presenters recommended Google Reverse Image Search and TinEye, which show viewers where a photo has been posted, how it was posted, and whether it has been altered.   

The last topic Mahadevan and Taylor-Wynn covered was "deepfakes," video or audio files that have been created using artificial intelligence. There are four types of deepfakes, starting with videos. A video deepfake was shown to the lecture hall with Professor Haines' face over Spider-Man's in Avengers: Endgame. While this was another silly example of how the technology can be used, Mahadevan showed YouTube videos of how people can use it to make it look like politicians are saying something they've never said. To spot these, Mahadevan and Taylor-Wynn said, look for any skin discoloration (especially around the mouth), a shimmer around the head and a white blob in place of the teeth; these video deepfakes are relatively easy to pick out.  

The other forms of deepfakes are artificial intelligence-generated text, "cheapfakes" or "dumbfakes," and audio manipulation. All of these are harder to pick out; Taylor-Wynn and Mahadevan explained that there is no surefire way to detect them besides asking yourself the three questions from the beginning and checking in with other sources.   

“If you take this approach when you’re faced with something online or in real life, and you’re wondering ‘is this legitimate,’ ask yourself these three questions, analyze them, and chances are you’re going to come out with the right answer,” Taylor-Wynn said.   

"In today's smartphone world there's much information available with the swipe of a screen, and much of that information is true," Professor Haines said. "But much is false, whether something innocently shared devoid of context, or intentionally created to mislead or misinform. So it is more important than ever for journalists, but also any consumer of information, to know how to think critically about, and fact check, suspicious posts encountered online. The MediaWise team did a great job offering a sense of the challenges and some very useful tools for people to get closer to the truth."  

To learn more about fact-checking and misinformation, or to learn how to get involved with the Campus Correspondents program, visit poynter.org/mediawise or follow @MediaWise on all social platforms.