A Deep Dive into Deep Fakes: Media Literacy in a World Where Seeing Is No Longer Believing

Strategies to help students determine whether a video has been altered include analyzing what motivates people to create fakes in the first place.

Last year, a viral video featured the opening credits of Full House with all of the sitcom actors’ faces replaced with actor Nick Offerman’s face. The video, which uses deep fake technology to create these face swaps, is funny, at least in part because we are all in on the joke. As viewers, we know that Offerman didn’t play all of the roles on this popular TV show from more than three decades ago.

But what happens when we’re not all in on the joke? How do we know when videos that look plausible have been altered to deceive the viewer? More important, how do we teach today’s learners to separate fact from fiction in a world where seeing is no longer believing?

 

What is a deep fake?

The term deep fake refers to a video that has been edited using software to replace the person in the original video with someone else in a way that makes the video look authentic. Initially, these videos primarily targeted celebrities. But as the technology has grown more advanced, deep fakes have become a focus of U.S. intelligence efforts to curb misinformation and disinformation. Even former President Obama loaned his likeness to support efforts to inform Americans of the growing threat.

Currently, the software and skill set required to create truly convincing deep fakes remain limited to a relatively small group of people. But as with all information, social media has sped up democratization. Indeed, TikTok and Snapchat, two of the most popular social media platforms among young people, recently integrated deep-fake technology that allows users to swap faces within each app and easily create their own deep fakes. Additionally, YouTube has surpassed all other social media in popularity, making video a large and ubiquitous part of our information diets.

According to KQED’s Above The Noise, there remain some digital “tells” that offer clues to a video’s authenticity:

  • Look for the mugshot. More primitive deep fake software produces video in which the person faces either directly forward or directly to one side—like a mug shot.
  • The eyes (and teeth) have it. Some areas of the face, including the teeth and eyes, still give deep fake software trouble. Eyes naturally change shape as the face moves, and teeth may not appear to line up properly all the time. If the eyes and teeth appear static, it can signal a crack in a video’s credibility.
  • A bad spray tan. Blotchy patches or uneven skin tone can be a sign that something is amiss.
  • Bad lip syncing. If the mouth and voice don’t quite match up, there could be a problem.
  • Suspicious shadows. This is harder to detect, but in manipulated videos the shadows in the background may not match the size and shape of the people supposedly casting them.

It’s important to remember that these clues alone do not mean that a video has been altered. What’s more, as deep fake technology becomes more advanced, and access to those advances becomes more readily available, savvy digital detectives will have to look for other clues to determine whether a video is credible.

 

The WHOA! test

Whether working with a group of fifth graders or a room full of their teachers and librarians, we often nudge learners to ask themselves whether a video passes the “WHOA!” test. It’s simple: If a post, article, or video makes you say “WHOA!” because it’s upsetting, outrageous, or too good to be true, that feeling indicates a need to investigate further.

Extreme emotional reactions to the news and other information should be a signal to RESIST the urge to share—although our instinct is often to do the exact opposite. When we let emotion take the wheel, we make poor choices about passing on information.

Read: It's Time To Go Mobile While Teaching Media Literacy

Videos are especially good at triggering emotional responses, in part because we often view video as irrefutable evidence of truth. Also, video has elements that aren’t present in text or still images: movement, vocal inflection, and music all evoke meaning, emotion, and bias in our brains. Even those who lean toward healthy skepticism when consuming information online can be tempted to let down their guard when viewing a video. We have been trained to believe that seeing is believing.

Deep fake technology, coupled with a new information landscape in which trained and citizen journalists alike compete to be first to produce the next viral account, has made the use of emotional triggers to gain clicks commonplace. These realities should prompt changes to media literacy instruction, including support for students learning to recognize the emotional triggers that cause us to click.

 

The usual suspects

One effective strategy for helping learners recognize credibility issues is to have them consider the motives of the video’s creators. Here are a couple of profiles of potential suspects for a virtual “lineup” that can help fledgling digital detectives connect a video’s clues with a possible motivation.

Suspect 1: the troll

  • Motives: A desire to divide, diminish, and destroy. For trolls, it can be personal. They take pleasure in hurting others, and these attacks somehow bolster their own self-esteem. Online bullies often fall into this category. Videos posted by trolls are often intended to embarrass and humiliate others.
  • Tools: Personal attacks, sensational language and rage bait, playing on existing biases.
  • Wins: Trolls win when they get attention or make others feel bad about themselves. They win when they are able to trigger “cancel culture” and when their handiwork goes viral.
  • Losses: Trolls lose when their work results in crickets—silence—or is met with kindness.

Suspect 2: the click-chaser

  • Motives: Fame! The click-chaser seeks your online validation. Money and influence are secondary to likes, though likes often translate into money and influence.
  • Tools: Potpourri! The click-chaser will use anything at their disposal, but some of their favorite tools are clickbait or rage bait, videos accompanied by titles in ALL CAPS!, and video content that plays on emotions.
  • Wins: The click-chaser wins when we share their content. They want to be famous—one click, like, and share at a time.
  • Losses: Click-chasers lose when we don’t click.

Jennifer LaGarde (librarygirl.net, @jenniferlagarde) and Darren Hudgins (about.me/darren_hudgins) are co-authors of Fact VS Fiction: Teaching Critical Thinking In the Age of Fake News (ISTE 2018; #FactVSFiction). LaGarde's passions include leveraging technology to help students develop authentic reading lives and meeting the needs of students living in poverty. Hudgins is the CEO of Think | Do | Thrive. He works with educators, school leaders, districts, and school organizations to build experiences promoting thought, play, and innovation.


