Deepfake videos are becoming increasingly common and could rival sextortion schemes in their effectiveness across a broad spectrum of potential victims.
In case you are not familiar with either of these hacker trends, let me start with sextortion schemes. These are phishing emails that arrive in your mailbox claiming that a hacker has taken over your device and observed you visiting adult websites or performing lewd acts via your webcam. The message goes on to threaten to release this information to your friends, family members, and professional colleagues unless, of course, you pay the hacker a ransom. While statistics vary, this scheme appears to be quite successful, earning hackers a good sum of money.
Deepfake videos are fake videos made by taking a person's image and inserting it into a video so that the person appears to actually be part of it. Hackers will often take images and audio samples from publicly available sources and stitch them into a video that does not put you in a flattering light. Just imagine what combining a deepfake video with a sextortion scheme could look like.
You may recall that deepfake videos really came on the scene during the 2016 election cycle in the United States, when some candidates were reportedly seen in videos of events they were not actually present for.
It's more important than ever, especially given the current state of things in the US and globally, to be wary not just of everything you read, but even of what you see in video. If it doesn't feel right, it may very well not be. Be skeptical, and challenge the natural tendency to accept what you see as fact. Make sure you're not being tricked by a deepfake video.