Cybersecurity firm Cato Networks has warned of a new threat to cryptocurrency exchanges. The company stated that criminals have managed to bypass identity verification systems using artificial intelligence (AI)-based deepfake technology. The aim is not to take over existing accounts, but to facilitate money laundering by creating fake accounts.
Spread of Fake Identities and Accounts
According to Cato Networks’ report, criminals are actively selling these deepfake tools on underground markets. The tools are used to bypass exchanges’ identity verification processes and are aimed specifically at creating new, verified accounts under fabricated credentials. The report emphasized that such accounts lay the groundwork for illegal activities such as money laundering, fraud and mule accounts.
The company stated that new account fraud caused losses of more than $5.3 billion in 2023, a significant increase over 2022.
How Is Deepfake Technology Used?
First, malicious actors generate fake identity documents and images using AI-powered tools, producing lifelike copies of documents such as passports. They then prepare deepfake videos designed to defeat the facial recognition checks of cryptocurrency exchanges. Once these steps are complete, the forged identity document is uploaded to the platform and a verified account is opened on the exchange.
“New account fraud allows fraudsters to find security vulnerabilities,” said a Cato Networks official, adding that this method makes it possible to create a fake account within minutes. The company also drew attention to the seriousness of the threat by publishing a video showing how the process works.
Recommendations for Cryptocurrency Exchanges
Cato Networks emphasized that cryptocurrency exchanges should update their security systems against such threats. The company notes that technical measures alone will not be sufficient and recommends ongoing threat monitoring using methods such as human intelligence (HUMINT) and open source intelligence (OSINT).
“Many claims are made in the media about artificial intelligence technology. However, threat actors have already started using these technologies in the field. It is certain that they will further develop their deepfake methods over time,” the company representative added.
Cybersecurity experts point out that exchanges can make it harder to create such fake accounts by adding further measures to their identity verification processes; two-step verification and advanced facial recognition technologies in particular are cited as effective against these threats.
AI-powered deepfake technologies have the potential to increase fraudulent activity by exploiting vulnerabilities in the cryptocurrency industry. Cato Networks states that exchanges should not rely on technological solutions alone and that keeping threat intelligence up to date is vital.
The rapid evolution of the cryptocurrency industry creates new opportunities for cybercriminals, making it essential for both exchanges and users to tighten their security measures.
Disclaimer: The information contained in this article does not constitute investment advice. Investors should be aware that cryptocurrencies carry high volatility and therefore risk, and should carry out their transactions in line with their own research.