The Night I Fell for a Deepfake – When Seeing Isn’t Believing

I still remember scrolling one night and stumbling across a video of Donald Trump being shoved into a police car. It looked real: the flashing lights, the camera shaking, the crowd shouting his name. The video had millions of views, thousands of comments, and people arguing about what it meant.

For a few seconds, I was genuinely confused. When did this happen? Why wasn’t it all over the news?

Then I checked the comments more closely. Someone wrote, “Guys, this isn’t real, it’s a deepfake.” I paused the video, replayed it, and suddenly saw it: the odd blink, the strange mouth movement. The entire thing was AI-generated. None of it had ever happened.

That moment stuck with me because it showed just how easy it is to mistake fiction for fact online. If I could believe it for even a moment, how many others had shared it thinking it was real? How many had formed opinions or arguments around something that never actually occurred?

Deepfakes use artificial intelligence to create fake videos and audio that look and sound shockingly real. They’ve been used for entertainment and satire, but in politics, the consequences are far more serious.
