Hardly a month seems to pass without a viral sensation around the development of deepfake technology. The latest bit of news was, of course, the creators of South Park launching a new web series based entirely on the technology.
At first glance, deepfakes are obviously getting better and more convincing. But as the creators mentioned, it was probably the most expensive short they had ever made, so the technology's application still seems prohibitively expensive for most users – and fraudsters.
I mention this because during the last SEON webinar on online lending, our guest Kaspars Magaznieks mentioned that they had seen attempts at fraud by people using simple face-swap technologies to beat their risk checks.
So while individual, one-off cybercriminals may only have access to toys such as Face Swap, who's to say a more professional outfit won't try deepfakes at a lower-value but wider scale, be it chargeback fraud or the abuse of promotional programs?
With the penetration of smartphones everywhere, KYC procedures have by and large moved onto biometric authentication – that is, asking not just for government IDs, but for selfies of the customer holding them. This is only logical: it's easy – if annoying – for legitimate customers to do, and so far it has proven reliable in raising the barrier for carders who may have access to the documents, but not to the associated face.
However, in the past few months we have seen that this shift in the industry is forcing innovation on the criminal side as well. An entire cottage industry has popped up to deliver such selfies to criminals, and we have seen enterprising fraudsters dupe such KYC processes in the year's biggest poker botting scandal.
Economically speaking, this means there is already demand for convincing identity-spoofing solutions – and deepfakes will be able to satisfy that demand.
The question is not "if" but "when": from the fraudster's point of view, if the potential reward is big enough and the risk low enough, sooner or later some business will be hit by an attack designed to beat biometric KYC via deepfakes.
The industry's response so far has been to move towards something called live facial feature detection – sort of like a CAPTCHA that you have to solve with your face, smiling or grimacing into your camera. As easy onboarding processes go, this is far from optimal, not to mention plain weird, which is sure to rub legitimate customers the wrong way.
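To make the idea concrete, here is a minimal, hypothetical sketch of how such a liveness challenge could work on the server side: issue a random, time-boxed gesture request, and only pass the check if exactly that gesture is detected before the challenge expires. The gesture list, TTL, and function names are all illustrative assumptions, not any vendor's actual implementation, and the computer-vision step that detects the gesture is assumed to exist elsewhere.

```python
import random
import secrets
import time

# Illustrative gesture set and expiry window; real systems would tune these.
GESTURES = ["smile", "blink", "turn_head_left", "turn_head_right"]
CHALLENGE_TTL_SECONDS = 30


def issue_challenge() -> dict:
    """Create a one-time, time-boxed gesture challenge for the user."""
    return {
        "id": secrets.token_hex(8),          # unguessable challenge ID
        "gesture": random.choice(GESTURES),  # random pick defeats pre-recorded video
        "issued_at": time.time(),
    }


def verify_challenge(challenge: dict, detected_gesture: str, now: float = None) -> bool:
    """Pass only if the requested gesture was performed before the challenge expired."""
    now = time.time() if now is None else now
    if now - challenge["issued_at"] > CHALLENGE_TTL_SECONDS:
        return False  # expired challenges can't be replayed later
    return detected_gesture == challenge["gesture"]
```

The randomness and the short expiry are what make this harder than a static selfie check: a fraudster would need to generate the correct fake gesture on demand, in near real time.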
We believe we can take at least some of that pain away by using digital footprints to flag suspicious users, instead of treating every potential customer as a hacker straight out of a sci-fi movie.
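A digital-footprint approach can be sketched as a simple scoring layer: add risk points for weak signals (a disposable email domain, no social presence, a data-center IP), and escalate only high scorers to the stricter biometric checks. The signal names, weights, and threshold below are purely illustrative assumptions, not SEON's actual rules.

```python
# Hypothetical risk rules: (signal name, weight). Weights are illustrative.
RULES = [
    ("email_domain_is_disposable", 40),
    ("no_social_media_profiles", 25),
    ("ip_is_datacenter_or_proxy", 30),
    ("phone_number_is_voip", 15),
]

REVIEW_THRESHOLD = 50  # above this, route the user to stricter verification


def risk_score(signals: dict) -> int:
    """Sum the weights of every signal that fired for this user."""
    return sum(weight for name, weight in RULES if signals.get(name))


def needs_extra_verification(signals: dict) -> bool:
    """Escalate to stricter (e.g. liveness) checks only above the threshold."""
    return risk_score(signals) >= REVIEW_THRESHOLD
```

The point of the design is that most legitimate customers trip none of the rules and sail through onboarding, while the awkward face-CAPTCHA step is reserved for the small slice of traffic that already looks suspicious.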
But by the looks of it, fraud attempts powered by deepfake technology are coming, and they will only grow bigger and bolder as the technology advances and becomes more accessible to a wider, less tech-savvy audience.