AI Fakes IDs: A Challenge for Crypto KYC

The field of artificial intelligence (AI) has experienced substantial growth in recent years, and the industry is projected to contribute as much as $15.7 trillion to the global economy by 2030. One development attracting particular attention is the creation of “deepfakes” – AI-generated video or audio recordings that imitate real people's appearances or voices so convincingly that they are almost indistinguishable from genuine footage. A recent viral post demonstrated how people are using open-source and commercial software to alter selfies and produce counterfeit IDs capable of deceiving security checks, posing a significant challenge to existing Know Your Customer (KYC) processes.

Toufi Saliba, CEO of HyperCycle, believes that the current security processes, including KYC, are vulnerable to deepfakes. He suggests that using proper cryptography and embracing AI can provide a resilient solution to protect against deepfake attacks. Saliba also highlights the impact of deepfakes on the cryptocurrency sector, emphasizing the need for regulators and centralized entities to adopt cryptography as a defense mechanism.
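
To make the cryptography point concrete, here is a minimal sketch – an illustration under assumed names and fields, not HyperCycle's actual method – of how a KYC issuer could sign an identity attestation with an Ed25519 key so that a relying service verifies a cryptographic signature instead of trusting an uploaded document image.

```python
# Minimal sketch: a KYC issuer signs an identity attestation, and a relying
# service verifies the signature rather than trusting an uploaded document image.
# Requires the "cryptography" package (pip install cryptography).
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- Issuer side (e.g., a KYC provider) ---
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()

# Hypothetical attestation payload; the field names are assumptions for the demo.
attestation = json.dumps(
    {"subject": "user-123", "claim": "kyc_passed", "issued_at": "2024-03-01"},
    sort_keys=True,
).encode()

signature = issuer_key.sign(attestation)

# --- Relying party side (e.g., an exchange) ---
try:
    issuer_public_key.verify(signature, attestation)
    print("Attestation is authentic: signed by the trusted issuer.")
except InvalidSignature:
    print("Rejected: attestation was forged or tampered with.")
```

The design choice here is that a forged selfie or generated ID image carries no valid signature from a trusted issuer, so the relying party's check fails regardless of how realistic the fake looks.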

Dimitry Mihaylov, an AI research expert for the United Nations, asserts that the rise of AI-generated deepfakes creates unforeseen challenges in detecting fake identification documents. He believes that industries across the board need to evolve rapidly and points to projects like Intel's FakeCatcher, which uses AI to detect deepfakes with a reported 96% accuracy rate. Mihaylov anticipates a shift in regulatory approaches to KYC, suggesting the adoption of more dynamic and interactive verification processes, such as video KYC.
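
As an illustration of the "dynamic and interactive" verification Mihaylov describes, the sketch below shows a toy challenge–response liveness flow. It is purely hypothetical: the prompts, time window, and the response_matches_prompt placeholder are assumptions, not FakeCatcher's or any vendor's API. The idea is that a random, time-boxed prompt is hard to satisfy with pre-rendered or injected deepfake footage.

```python
# Illustrative challenge-response liveness flow (not a real vendor API).
import secrets
import time
from dataclasses import dataclass

PROMPTS = ["turn your head left", "blink twice", "read aloud: {nonce}"]
CHALLENGE_WINDOW_SECONDS = 10

@dataclass
class Challenge:
    prompt: str
    nonce: str
    issued_at: float

def issue_challenge() -> Challenge:
    """Pick a random prompt and bind it to a fresh nonce and timestamp."""
    nonce = secrets.token_hex(4)
    prompt = secrets.choice(PROMPTS).format(nonce=nonce)
    return Challenge(prompt=prompt, nonce=nonce, issued_at=time.time())

def accept_response(challenge: Challenge, response_matches_prompt: bool) -> bool:
    """Accept only if the prompt was satisfied within the time window.

    response_matches_prompt stands in for a real video/audio analysis step
    (e.g., a deepfake detector plus a prompt-compliance check).
    """
    within_window = (time.time() - challenge.issued_at) <= CHALLENGE_WINDOW_SECONDS
    return within_window and response_matches_prompt

# Example usage
challenge = issue_challenge()
print("Ask the user to:", challenge.prompt)
print("Verified:", accept_response(challenge, response_matches_prompt=True))
```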

The impact of deepfakes extends to the cryptocurrency industry as well. OnlyFake, a platform that produces counterfeit driver's licenses and passports, has allegedly bypassed the KYC protocols of several well-known crypto exchanges. Leaked discussions reveal claims that its documents can deceive verification processes at exchanges and financial institutions including OKX, Kraken, Bybit, Bitget, Huobi, and PayPal. OnlyFake offers a swift document-generation process and even lets users upload their own photos or select from pre-curated images, presenting a significant challenge for online verification across the industry.

In 2022, CertiK, a blockchain security company, uncovered an underground marketplace where individuals sold their identities for as little as $8. These identities were used to establish banking and exchange accounts for fraudulent cryptocurrency initiatives. The availability of AI deepfake technology has raised concerns among cryptocurrency leaders, especially in terms of video verification processes for identity validation. Binance’s chief security officer, Jimmy Su, warned about the increase in fraudsters using deepfakes to bypass KYC procedures. These video forgeries are becoming so realistic that they can deceive human evaluators.

A study by Sensity AI revealed that the liveness tests used for identity verification are vulnerable to deepfake attacks: scammers can swap their faces with those of other individuals to carry out fraudulent activities. The broader risk was illustrated when a man in India fell victim to a scam in which the perpetrator used a deepfake to impersonate his friend and steal his money, and when a deepfake video of Elon Musk sharing misleading crypto investment advice circulated on Twitter.

As AI continues to advance, the threat of deepfake attacks, particularly “face swaps,” is expected to increase. Attacks against remote identity verification systems rose by 704% between 2022 and 2023. Hackers and scammers are becoming more sophisticated, using digital injection attacks and emulators to feed synthetic imagery directly into the verification stream and deploy deepfakes in real time, posing a serious challenge to authentication systems.

How the security paradigm evolves in the face of deepfakes will be crucial as people come to rely ever more heavily on advanced technologies for identity verification. It remains to be seen how this nascent security landscape will develop to counter the growing threat of deepfakes.

Ismail Bohon
