Fake job seekers are flooding U.S. companies that are hiring for remote positions, tech CEOs say

When voice authentication startup Pindrop Security posted a recent job opening, one candidate stood out from hundreds of others.

The applicant, a Russian coder named Ivan, appeared to have all the right qualifications for the senior engineering role. But when he was interviewed over video last month, Pindrop's recruiter noticed that Ivan's facial expressions were slightly out of sync with his words.

That's because the candidate, whom the company has dubbed "Ivan X," was a scammer using deepfake software and other generative AI tools in a bid to get hired by the tech company, said Pindrop CEO and co-founder Vijay Balasubramaniyan.

"Gen AI has blurred the line between what it is to be human and what it means to be a machine," Balasubramaniyan said. "What we're seeing is that individuals are using these fake identities and fake faces and fake voices to secure employment, even sometimes going so far as doing a face swap with another individual who shows up for the job."

Companies have long fought off attacks from hackers hoping to exploit vulnerabilities in their software, employees or vendors. Now, another threat has emerged: job applicants who aren't who they say they are, wielding AI tools to fabricate photo IDs, generate employment histories and provide answers during interviews.

The rise of AI-generated profiles means that by 2028, one in four job candidates globally will be fake, according to research and advisory firm Gartner.

The risk of bringing on a fake job seeker can vary, depending on the person's intentions. Once hired, the impostor can install malware to demand ransom from a company, or steal its customer data, trade secrets or funds, according to Balasubramaniyan. In many cases, the deceitful employees are simply collecting a salary that they otherwise couldn't, he said.

Massive increase

Cybersecurity and cryptocurrency firms have seen a recent surge in fake job seekers, industry experts told CNBC. Because these companies often hire for remote roles, they present valuable targets for bad actors, these people said.

Ben Sesser, CEO of BrightHire, said he first heard about the issue a year ago and that the number of fraudulent job candidates has "ballooned" this year. His company helps more than 300 corporate clients in finance, tech and health care assess prospective employees in video interviews.

"Humans are generally the weak link in cybersecurity, and the hiring process is an inherently human process with a lot of hand-offs and a lot of different people involved," Sesser said. "It's become a weak point that folks are trying to expose."

But the problem isn't limited to the tech industry. More than 300 U.S. companies inadvertently hired impostors with ties to North Korea for IT work, including a major national television network, a defense manufacturer, an automaker and other Fortune 500 companies, the Justice Department alleged in May.

The workers used stolen American identities to apply for remote jobs and deployed remote networks and other techniques to mask their true locations, the Justice Department said. They ultimately sent millions of dollars in wages to North Korea to help fund the nation's weapons program, the department alleged.

That case, involving a ring of alleged enablers including an American citizen, exposed a small part of what U.S. authorities have said is a sprawling overseas network of thousands of IT workers with North Korean ties. The Justice Department has since brought more cases involving North Korean IT workers.

A growing industry

Fake job seekers aren't letting up, if the experience of Lili Infante, founder and chief executive of Cat Labs, is any indication. Her Florida-based startup sits at the intersection of cybersecurity and cryptocurrency, making it especially alluring to bad actors.

"Every time we list a job posting, we get 100 North Korean spies applying to it," Infante said. "When you look at their resumes, they look amazing; they use all the keywords for what we're looking for."

Infante said her firm relies on an identity-verification company to weed out fake candidates, part of an emerging sector that includes firms such as iDenfy, Jumio and Socure.

The fake-employee industry has broadened beyond North Koreans in recent years to include criminal groups based in Russia, China, Malaysia and South Korea, according to Roger Grimes, a veteran computer security consultant.

Ironically, some of these fraudulent workers would be considered top performers at most companies, he said.

"Sometimes they'll perform the role poorly, and then sometimes they perform it so well that I've actually had people tell me they were sorry they had to let them go," Grimes said.

His employer, cybersecurity firm KnowBe4, said in October that it inadvertently hired a North Korean software engineer.

The worker used AI to alter a stock photo, combined with a stolen U.S. identity, and was discovered only after the company noticed suspicious activity coming from his account.

Fighting deepfakes

Despite the Justice Department case and a few other publicized incidents, hiring managers at most companies are generally unaware of the risks of fake job candidates, according to BrightHire's Sesser.

"They're responsible for talent strategy and other important things, but being on the front lines of security has historically not been one of them," he said. "Folks think they're not experiencing it, but I think it's probably more likely that they're just not realizing that it's going on."

As deepfake technology improves, the problem will only become harder to avoid, Sesser said.

As for "Ivan X," Pindrop's Balasubramaniyan said the startup used a new video-authentication program it created to confirm that he was a deepfake fraud.

While Ivan claimed to be located in western Ukraine, his IP address indicated he was actually thousands of miles to the east, in a possible Russian military facility near the North Korean border, the company said.

Pindrop, backed by Andreessen Horowitz and Citi Ventures, was founded more than a decade ago to detect fraud in voice interactions, but may soon pivot to video authentication. Its clients include some of the largest U.S. banks, insurers and health companies.

"We are no longer able to trust our eyes and ears," Balasubramaniyan said. "Without technology, you're worse off than a monkey with a random coin toss."


