This image is a totally legit photo of me on the moon trying to sell you a bridge.* Photo by Artflow, Copilot, and TET
Scams are nothing new but, as technology makes everything easier, it's also making it easier for you to be scammed by something that seems very legit. Things that used to indicate something was on the level can very easily be faked by generative AI in a practice known as 'deep faking', where a person's likeness (image and voice) is used to generate video of them doing or saying something they didn't.
As one of the best examples of what's currently possible, YouTube channel floydbishop has made a PSA video titled Watch This Fake AI Video Scam — Then Show It to Your Parents that is 100% AI generated... i.e. it's ALL fake. Not a single human appears on camera, and not a single human voice is heard. Watch the video below.
If you watch closely, some scenes have small 'tells' that are common AI errors, such as items magically appearing or nonsensical text, but for the most part you'd be hard pressed to tell any of this was AI generated if the avatars in the video weren't telling you they're fake.
Deep faking is not that hard to do. Within about five minutes of researching on YouTube, you could probably know enough to deep fake yourself (the only person you can confidently deep fake without breaking any AI site's Terms of Service).
That said, many generative AI sites prevent you from generating video of celebrities, public figures, or even people in general, but not all of them do.
Even so, a scammer with relatively basic computer skills can install their own unfiltered AI locally on their computer and generate anything they like: not just photos and video, but even mimicked voices, using little more than a 30-second voice sample to train with.
It's not just celebrities and public figures at risk of being deep faked. Anyone who has uploaded images or audio of themselves to a public space online is fair game.
Keeping your uploads private, visible to only a select few, isn't completely secure either. It's not unheard of for friends to go rogue on you... your recent ex, for example.
While all this is concerning, the point is to be vigilant about any request for money, images, or secure details that you didn't initiate, that seems suspect, or that sounds too good to be true.
Take a moment to investigate. Make sure what you're seeing or hearing is true and legit. If you're at all unsure, err on the side of caution.
* You can tell this photo is not legit: I'm wearing a wristwatch, and I haven't worn one since the late '90s!