Deepfake Video Calls: How North Korean Hackers Are Attacking Crypto Professionals

Hackers operating under North Korean guidance are expanding their arsenal of attack methods targeting crypto industry professionals. A new approach—video calls featuring AI-generated deepfakes—lets malicious actors impersonate acquaintances or authoritative contacts and then persuade victims to install malicious software. The method shows how sophisticated the use of synthesis technologies in cyberattacks has become.

Video Calls as a Social Engineering Tool

According to research firm Huntress, the attack unfolds as follows: attackers hijack the Telegram accounts of real people known to the target, then initiate video calls during which the attacker's face is replaced with an AI-generated deepfake. This bypasses the basic visual verification that typically helps identify fraud.

Martin Kuhar, co-organizer of the BTC Prague conference, shared details of one specific method: the deceptive video call is accompanied by a request to install a supposed fix plugin for Zoom, claimed to resolve sound issues. Once the malware is installed, attackers gain full access to the infected device and can steal crypto assets, communications, and other critical data.

Technical Analysis of the Malware: Multi-layered Infection

The deployed malware is notably complex and multifunctional. On macOS systems, the malicious code is capable of:

  • deploying backdoors for remote device control
  • recording user keystrokes
  • copying clipboard contents
  • accessing cryptographically protected wallets and their private keys

This functionality allows hackers not only to compromise a specific device but also to use it as a foothold for further operations.
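To make the clipboard threat concrete: clipboard-hijacking malware typically scans copied text for strings that look like wallet addresses, so it knows when to silently swap in an attacker-controlled address. The following is a minimal sketch of that detection primitive only; the regex patterns are simplified illustrations, not exhaustive, and the function name is my own.

```python
import re

# Illustrative, simplified address patterns; real malware matches many more formats.
ADDRESS_PATTERNS = {
    "bitcoin": re.compile(
        r"^[13][a-km-zA-HJ-NP-Z1-9]{25,34}$"   # legacy Base58 addresses
        r"|^bc1[a-z0-9]{39,59}$"               # bech32 (SegWit) addresses
    ),
    "ethereum": re.compile(r"^0x[a-fA-F0-9]{40}$"),
}

def detect_address(text: str):
    """Return the chain name if the clipboard text looks like a wallet address."""
    candidate = text.strip()
    for chain, pattern in ADDRESS_PATTERNS.items():
        if pattern.match(candidate):
            return chain
    return None
```

Because this check is a few lines of code, attackers can run it on every clipboard change — which is why manually verifying the full pasted address before sending funds remains essential.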

Lazarus Group and State Support

Researchers from SlowMist and Huntress confidently identified the attackers as North Korea’s Lazarus Group, also known as BlueNoroff. The group receives funding and political backing from the state, enabling it to continually improve its hacking techniques.

A characteristic feature is the reuse of code components and attack techniques across multiple operations. This indicates centralized management and a long-term strategy targeting crypto professionals and traders.

How to Protect Yourself from Deepfake Video Call Attacks

The proliferation of face and voice synthesis technologies makes video and audio unreliable as methods of authentication, and the industry urgently needs to rethink its identity verification approaches. Recommendations include:

  • Enabling multi-factor authentication (MFA) on all critical services
  • Using hardware security keys instead of software-based verification methods
  • Being skeptical of unexpected video calls—even if the caller appears familiar
  • Regularly updating operating systems and software
  • Training teams to recognize social engineering tactics
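The MFA recommendation above rests on codes derived from a shared secret rather than from anything an attacker can fake on camera. As an illustration of the underlying mechanism, here is a sketch of the standard TOTP algorithm (RFC 6238) using only the Python standard library; it is not tied to any particular service.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, timestep: int = 30, digits: int = 6, now=None) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the time counter, dynamically truncated."""
    key = base64.b32decode(secret_b32.upper())
    counter = int((time.time() if now is None else now) // timestep)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF)
    return str(code % (10 ** digits)).zfill(digits)
```

Because the code changes every 30 seconds and is computed from a secret that never crosses the video call, a convincing deepfake alone cannot reproduce it. Hardware security keys go further still by binding the challenge to the legitimate site, which also defeats phishing pages.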

The crypto industry must recognize the scale of the threat and take active measures to strengthen its defenses.
