
The North Korean state-sponsored hacking group Lazarus is advancing its tactics with a more polished and deceptive approach.
A report by cybersecurity firm Silent Push revealed that the group has set up fake US-based crypto companies to distribute malware disguised as job opportunities.
According to the report, a Lazarus subgroup called “Contagious Interview” is behind the registration of three fraudulent crypto consulting firms: BlockNovas LLC, Angeloper Agency, and SoftGlide LLC.
The security firm said the three companies were built to look like legitimate players in the blockchain industry, but in practice they served as shell firms for luring developers into fake job interviews.
Zach Edwards, a senior threat analyst at Silent Push, pointed out that this isn’t the first time Lazarus has used job interview lures, but it’s the most advanced version seen so far.
He said:
“They have now crossed the rubicon – they are willing to register a fake business and go through all the supposed KYC checks involved with that process, and were successful in the effort.”
Malware disguised as interview tools
The fake interview process typically involves a request for an introductory video. When applicants try to upload the video, they hit an error and are offered a quick fix: a copy-and-paste command that secretly delivers malware.
Edwards said:
“During the job application process an error message is displayed as someone tries to record an introduction video and the ‘solution’ is an easy ‘click fix’ copy and paste trick, which leads to malware if the unsuspecting developer completes the process.”
Silent Push identified three distinct malware strains used in this campaign: BeaverTail, InvisibleFerret, and OtterCookie. These tools give hackers remote access to victims’ devices and allow them to extract sensitive information.
The attackers use services like Astrill VPN and residential proxies to cover their tracks, making their infrastructure difficult to trace.
AI-generated identities
Beyond malware, the North Korean attackers rely heavily on AI-generated personas to make the operation appear legitimate.
Silent Push found that the threat actors use AI tools like Remaker AI to generate fake employee photos. Sometimes, they even alter real images to create deceptive profiles that look nearly authentic.
Edwards said:
“There are numerous fake employees and stolen images from real people being used across this network…In one of the [cases], the threat actors took a real photo from a real person, and then appeared to have run it through an ‘AI image modifier tool’ to create a subtly different version of that same image.”
This development marks a dangerous evolution in cybercrime targeting the crypto space: the combination of malware, social engineering, and AI-generated identities points to an increasingly sophisticated threat.
Edwards concluded:
“This investigation is a perfect example of what happens when threat actors continue to uplevel their efforts one campaign after the next, without facing justice.”