
Face Off — How Bad People Can Hijack Your Identity (Without Your Knowing)

Machine learning and AI are, for the most part, forces for good. But some people bend them to shady misdeeds, and face stealing is one of them.

Your Face Isn’t Safe

Fantômas, a supervillain who originated in the French equivalent of penny dreadfuls, had a cool talent. He could craft a realistic mask of any person on earth and wear it to conceal his eerie, faceless identity. Thanks to this shapeshifting, he enjoyed his usual escapades: terrorism, extortion, stuffing his pockets with other people’s family jewelry, and so on.

Decades ago this was a naive horror plot. Today it has become a gruesome reality, even spawning its own term among security experts: the facial recognition attack. And, as usual, the danger became real thanks to the human thirst for comfort.

We loathe the idea of remembering a PIN or a password. We want certain actions, like buying coconut shampoo online, to be quick and frictionless. For that purpose, our phones have been taught to identify us via a bunch of modalities: fingerprint, voice, and face.

And while it’s easy to get authenticated by letting your gizmo peep at your face, it’s also simple to fool that gizmo. For example, in 2017 a Vietnamese security company showed that the iPhone X wasn’t really good at distinguishing its owner’s face from a creepy mask with a silicone nose.

While that mask took time and $150 to make, some facial attacks are virtually free and disgustingly easy to orchestrate. Photographic paper, a printer, and a photo nicked from Instagram are all that perps need to produce a high-def cutout of your face and try to convince a facial recognition system that it’s actually you.
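To see why a flat printout can fool a system at all, consider how a naive face matcher works: it reduces a face to a numeric embedding and checks the distance between embeddings. Here is a minimal sketch using the open-source face_recognition library; the file names are hypothetical, and real systems add liveness checks on top of this bare comparison.

```python
# Sketch: why a printed photo can pass a naive embedding-based matcher.
# Assumes two hypothetical local files: a genuine selfie and a photo of
# a printed cutout of the same face.
import face_recognition

live = face_recognition.load_image_file("genuine_selfie.jpg")         # hypothetical file
printout = face_recognition.load_image_file("photo_of_printout.jpg")  # hypothetical file

live_enc = face_recognition.face_encodings(live)[0]
print_enc = face_recognition.face_encodings(printout)[0]

# dlib's conventional decision threshold is ~0.6; a sharp printout of the
# same face usually lands well below it, because the encoder measures
# facial geometry and ignores whether the face is alive or paper.
distance = face_recognition.face_distance([live_enc], print_enc)[0]
print(f"distance = {distance:.3f} -> {'MATCH' if distance < 0.6 else 'no match'}")
```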

Picture: Fantômas could wear anyone’s face while staying 100% anonymous

Deep Learning for Deep Trouble

Things get truly frightful when the computery stuff enters the stage. AI and deep learning are available to basically anyone today. Even a patient sixth-grader can whip up a deepfake clowning their classmate, their teacher, or the US president.

While crude comedy is vexatious, it’s not that harmful. But when tools like MorphThing, DeepFaceLab, or CyberVoice end up in more skilled, but still wrong, hands, the consequences can be disastrous.

In broad terms, a criminal can steal your entire digital identity, and digital onboarding services make it surprisingly simple. One technique, dubbed face morphing, enables a perp to blend your face with theirs. The end result is a peculiar chimaera that retains facial features of both people. But what’s the gain?
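Before answering, it’s worth seeing how little effort the blend itself takes. Below is a deliberately crude sketch with OpenCV: real morphing tools first warp both faces onto shared facial landmarks (Delaunay triangulation) before mixing pixels, but a plain alpha blend shows the principle. The file names are hypothetical.

```python
# Sketch: the crudest possible face morph, a pixel-level alpha blend of
# two face photos. Real tools align facial landmarks first.
import cv2

victim = cv2.imread("victim_face.jpg")       # hypothetical file
attacker = cv2.imread("attacker_face.jpg")   # hypothetical file
attacker = cv2.resize(attacker, (victim.shape[1], victim.shape[0]))

# alpha controls whose features dominate; 0.5 keeps traits of both people,
# which is exactly what lets one photo verify as two identities.
alpha = 0.5
morph = cv2.addWeighted(victim, alpha, attacker, 1 - alpha, 0)
cv2.imwrite("morph.jpg", morph)
```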

Picture: After some coding perps can teach a neural network to impersonate you better than Phil Hartman

It’s elementary: by injecting your facial features into their photo, a culprit can get verified as you by an automatic face recognition system. 
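And here is a toy version of that verification step, reusing the blend from the sketch above: check whether the morph sits within matching distance of both contributors. Again, the file names are hypothetical and 0.6 is dlib’s conventional threshold.

```python
# Sketch: a successful morph is "accepted" as both source identities
# by an embedding-based matcher.
import face_recognition

def encoding(path):
    return face_recognition.face_encodings(face_recognition.load_image_file(path))[0]

morph = encoding("morph.jpg")  # hypothetical file from the blend above
for name in ("victim_face.jpg", "attacker_face.jpg"):
    d = face_recognition.face_distance([encoding(name)], morph)[0]
    print(f"{name}: distance {d:.3f} -> {'accepted' if d < 0.6 else 'rejected'}")
```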

This is especially handy for baddies on the run from the law who can’t board a plane or cross a border without risking getting caught. Today, face morphing is actively used to make fake IDs: a lawbreaker doesn’t even need plastic surgery to successfully run away and hide.

Morphs aren’t the only doohickey in a digital thug’s arsenal. Another terrifying technique employs the inglorious deepfakes. You’ve heard of them a gazillion times, mostly in the context of YouTube funnies and fake porn. But in reality, deepfakes are a far more serious crime tool.

Steal a Face, Steal a Voice

Deep learning relies on byzantine algorithms that loosely mimic our neural activity. After feeding on a large quantity of samples (photos, voice recordings, videos), they learn to recreate what they’ve seen and heard. The classic tool for such a stunt is the generative adversarial network (GAN): two neural networks, a generator and a discriminator, train against each other until the generator’s fakes become indistinguishable from the real samples.
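For a taste of how that adversarial tug-of-war looks in code, here is a minimal GAN sketched in PyTorch. It learns a toy 2-D distribution rather than faces; a face-generating GAN works the same way, just with vastly bigger networks and image data.

```python
# Minimal GAN sketch on toy 2-D data: the generator learns to fake
# samples, the discriminator learns to spot them, and each one's
# progress forces the other to improve.
import torch
import torch.nn as nn

real_data = lambda n: torch.randn(n, 2) * 0.5 + torch.tensor([2.0, 2.0])  # "real" samples

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))  # noise -> fake sample
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))  # sample -> realness logit

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    # Train discriminator: label real samples 1, generated samples 0.
    real, fake = real_data(64), G(torch.randn(64, 8)).detach()
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Train generator: push D to label its fakes as real.
    fake = G(torch.randn(64, 8))
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

print(G(torch.randn(5, 8)))  # samples should cluster near (2, 2) after training
```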

Once trained, a neural network can clone your face or voice and let someone use them in real time. While a face can simply be worn like a digital mask during, let’s say, a live stream, your voice can be imitated through voice conversion. This know-how allows a perp to utter whatever nonsense they want into a microphone and have it come out in your actual voice.
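Modern voice-cloning pipelines typically start by distilling someone’s recordings into a fixed-size “voice print” (a speaker embedding). The sketch below uses the open-source Resemblyzer library to compute and compare such embeddings; the file names are hypothetical, and the similarity threshold is an assumption. Handily, the same embedding trick is what defenders use for speaker verification.

```python
# Sketch: compare a known voice against a suspicious recording using
# speaker embeddings from Resemblyzer.
import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav

encoder = VoiceEncoder()
your_voice = encoder.embed_utterance(preprocess_wav("your_speech.wav"))       # hypothetical file
call_voice = encoder.embed_utterance(preprocess_wav("suspicious_call.wav"))   # hypothetical file

# Embeddings are L2-normalized, so the dot product is cosine similarity.
# The 0.75 cutoff below is an assumed, illustrative threshold.
similarity = float(np.dot(your_voice, call_voice))
print(f"similarity = {similarity:.2f} -> {'same speaker?' if similarity > 0.75 else 'different'}")
```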

Sounds fantastic, right? Well, an unnamed U.A.E. company lost $35 million just because someone learned how to apply voice conversion. Now imagine presidents, generals, CEOs, and other fat cats receiving spoofed calls (including Zoom calls) from someone they think they know. The consequences may be calamitous, to say the least.

But there are ways to stop this madness. Visit the antispoofing techniques wiki at https://antispoofing.org/ and see how your precious biometrics can be protected.
