Deepfake as a weapon: how North Korean hackers use AI videos against crypto professionals

Cyber threats are taking new forms. The North Korean hacking collective Lazarus Group, through its subgroup known as BlueNoroff, has added advanced deepfake technology to its arsenal. This marks a significant step in the evolution of cyberattacks on the crypto industry, where the stakes for financial and digital assets are particularly high. According to the crypto outlet Odaily, malicious actors are successfully using synthesized video content to gain access to crypto professionals’ systems.

How Deepfake Video Attacks Work

The attack mechanism is clever and insidious. Hackers acting on behalf of the Lazarus Group initiate video calls through compromised Telegram accounts, presenting deepfake video of contacts the victims already know. Martin Kuharz, co-founder of the BTC Prague conference, reported a similar incident and described the attackers’ tactics: they exploit trust to persuade targets to install seemingly harmless software that supposedly fixes audio issues in Zoom.

The main danger lies here: the malicious software is disguised as a plugin, and once installed it grants attackers full control over the device. Researchers at the security firm Huntress found that these methods resemble earlier operations aimed at developers in the cryptocurrency sphere.

Lazarus Group Expands Its Arsenal of Attack Methods

Huntress specialists and SlowMist analysts attribute these operations to a hacking group backed by the North Korean state. The attackers show clear signs of a systematic approach: each operation is carefully prepared and aimed at specific wallets and individual crypto professionals.

The deployed malware can carry out multi-layered infections on macOS devices. Its functionality includes backdoor installation, keystroke logging, clipboard theft, and access to assets held in crypto wallets. This is a comprehensive attack designed for maximum damage.

Why Deepfake Technology Is Becoming a Critical Threat

With the proliferation of deepfake creation and voice-cloning tools, visual verification is losing reliability. Traditional identity checks based on video and audio recordings are no longer a secure means of authentication. This creates a new attack vector specifically for the crypto sector, where transaction amounts can be substantial and the chances of recovering stolen assets are minimal.

Analysts warn that as deepfake technologies improve and related tools become more widespread, the number of such attacks will only increase. Malicious actors are gaining more effective ways to bypass traditional trust mechanisms.

How to Protect Yourself from Deepfake Attacks

The crypto industry must not only react to threats but actively strengthen its defenses. The primary measure is implementing multi-factor authentication at all levels. Never open links or install software at the request of someone on a video call, even if the interlocutor appears familiar.

Additionally, crypto professionals should rely on alternative channels for identity verification: voice calls through other platforms, direct messages via secure channels, and pre-agreed passphrases. Companies should train employees to recognize signs of social engineering and trust exploitation through deepfake videos. Awareness and caution are key protective barriers in the age of advanced video synthesis technologies.
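The pre-agreed passphrase advice above can be strengthened so that the passphrase itself is never spoken on a potentially deepfaked call. A minimal sketch, assuming both parties exchanged a secret in person beforehand (the passphrase value and function names here are hypothetical, not from the article): the verifier sends a random challenge, and the contact answers with an HMAC keyed by the shared secret, proving knowledge of it without revealing it.

```python
# Illustrative HMAC challenge-response sketch for out-of-band
# identity verification; the shared passphrase is assumed to have
# been agreed in person beforehand.
import hashlib
import hmac
import secrets

SHARED_PASSPHRASE = b"pre-agreed secret"  # hypothetical value

def make_challenge() -> str:
    # Verifier sends a fresh random nonce; a new one per call
    # prevents an impostor from replaying an old response.
    return secrets.token_hex(16)

def respond(challenge: str, passphrase: bytes = SHARED_PASSPHRASE) -> str:
    # Contact proves knowledge of the passphrase without saying it.
    return hmac.new(passphrase, challenge.encode(), hashlib.sha256).hexdigest()

def verify(challenge: str, response: str, passphrase: bytes = SHARED_PASSPHRASE) -> bool:
    expected = hmac.new(passphrase, challenge.encode(), hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(expected, response)

challenge = make_challenge()
assert verify(challenge, respond(challenge))                    # genuine contact
assert not verify(challenge, respond(challenge, b"wrong pass"))  # impostor fails
```

The point of the design is that a convincing deepfake face and voice are useless without the secret: the attacker cannot compute a valid response, and the secret is never exposed on the compromised channel.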
