It’s nearly 2020, and some schools are still focusing their #digcit energies on “online safety” and “cyberbullying.” Granted, these are important issues, but I propose we go WAAAAY deeper in our critical thinking about the digital world that surrounds us.
We need to teach students how to spot shallow fakes and deep fakes.
A shallow fake is an online version of an existing clip or image that has been “tweaked.” Nancy Pelosi was shallow-faked in May 2019 when a clip of her was slowed down, making her speech sound slurred and her appear intoxicated.
Donald Trump was also shallow-faked by a Seattle TV station, which manipulated images from the president’s remarks to make him look… orange. And with a bigger head.
These are shallow fakes because the source material is original and actually exists. In fact, you could argue that a Snapchat filter is a shallow fake, since it can erase all sorts of wrinkles. But I digress.
Deep fakes are a bigger problem.
Deep fakes involve machine-learning software that analyzes thousands of facial photos and then “maps” them, predicting how a face would move given certain audio cues.
Deep fakes could be a serious threat to our democracy unless we start bringing media literacy skills into every classroom, every day. How easy would it be to create a video of a world leader “saying something” that never actually… happened?
In a world where we tend to believe the very worst of those who vote differently from us, can we judge clips rationally? I’m worried that the answer is “no.”
Here are some tips on how to spot deep fakes:
Ask yourself: What’s the source of this material? Who stands to gain from this video? Who stands to lose?
I would also encourage us to take a more introspective step. If a clip we see gives us a strong emotional response, that’s our first clue to check it for authenticity. Acknowledge the baggage that *we* bring to the communication experience. What are our biases?
Let’s get students into the habit of asking these questions. Our democracy might depend on it.