AI makes it easier for fraudsters to create fake companies, hampering due diligence: report

AI may make it easier for fraudsters to create fake businesses, but companies also plan to use it to detect fraud, according to the report

A new report finds that a majority of businesses say they’re facing challenges in bringing on new vendors or business customers due to fraud concerns exacerbated by the rise of generative artificial intelligence (AI) technologies.

The report by the Association of Certified Fraud Examiners (ACFE) and Thomson Reuters found that synthetic businesses created by fraudsters and potential fines or regulatory actions are the top two concerns of organizations when they onboard a new business customer or vendor. 

The survey found that 61% of the businesses surveyed said it’s moderately or extremely challenging to onboard a new vendor, while 52% said the same about onboarding a new customer – in part because of the difficulty of conducting due diligence to ensure a criminal hasn’t used generative AI to create a fake business, complete with an official-looking website and more.

"Organized crime rings are sophisticated and typically they’re ahead of the curve on the adoption of any new technology," Dori Buckethal, vice president of Thomson Reuters Risk & Fraud Solutions, told FOX Business. "What we’re seeing with synthetic identity at fictitious businesses is that generative AI and machine learning can replace what was once done with a very manual process within criminal organizations."

A report by the Association of Certified Fraud Examiners and Thomson Reuters found that businesses are concerned about fraudsters' use of AI and intend to leverage the technology to counteract fraud. (iStock)

"So in the past and even today, there are a large number of people sitting in call centers conducting this type of fraud. And just like we’re hearing in corporations, those people can be replaced with technology to do it at a faster rate and accelerate the pace at which they’re attacking businesses," she added.

Buckethal explained that companies are concerned about the reputational damage of doing business with a fraudulent entity, in addition to regulatory fees and fines if they’re found to be working with sanctioned individuals or business entities. Furthermore, the dollars companies lose to fraud hurt shareholders and other stakeholders.

"Business-to-business fraud is big business," Buckethal explained. "The estimates are in the billions of dollars of lost revenue each year, and the ACFE experts have told us that they estimate 5% of business revenues are lost to fraud each year."

Generative AI can be used to create human-like deepfakes as well as fictitious companies and websites for fraudsters. (Reuters)

One of the most common types of fraud is known as business email compromise, in which a fraudster spoofs the email of a legitimate business. In the real estate industry, for example, if a criminal knows a financial transaction will occur at a certain time, they’ll compromise the company in advance and alter a routing number so that the payment is diverted to their fraudulent account.
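Spoofed emails in these schemes often come from a domain that looks almost, but not quite, like the legitimate counterparty's. As a rough illustration (not drawn from the report), a due-diligence or email-security workflow might flag such lookalike domains with a simple string-similarity check; the trusted-domain list and threshold below are hypothetical:

```python
import difflib

# Hypothetical list of vetted counterparty domains; in practice this
# would come from a vendor master file or contact directory.
TRUSTED_DOMAINS = {"example-title.com", "example-escrow.com"}

def flag_lookalike(sender_domain: str, threshold: float = 0.85) -> bool:
    """Return True when a sender domain closely resembles, but does not
    exactly match, a trusted domain -- a common sign of email spoofing."""
    if sender_domain in TRUSTED_DOMAINS:
        return False  # exact match: known counterparty
    for trusted in TRUSTED_DOMAINS:
        ratio = difflib.SequenceMatcher(None, sender_domain, trusted).ratio()
        if ratio >= threshold:
            return True  # near-miss: likely lookalike domain
    return False

# A one-character swap ("1" for "l") is flagged; the real domain is not.
print(flag_lookalike("examp1e-title.com"))   # lookalike -> True
print(flag_lookalike("example-title.com"))   # exact match -> False
```

A production system would layer this kind of check with DMARC/SPF verification and out-of-band confirmation of any change to payment details, but the sketch shows why a human glance alone is easy to fool.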

Even as businesses grapple with the use of generative AI by criminals, those businesses are finding ways to leverage AI to improve their internal know-your-customer and know-your-vendor due diligence processes. 

Over half of the businesses surveyed by ACFE and Thomson Reuters for the report said they plan to use AI and machine learning for those purposes in the coming years.

Generative artificial intelligence creates new challenges for businesses trying to avoid being duped by fraudsters. (iStock)

"What we’re hearing from customers today is that they’re using AI and machine learning. They’re testing it out to automate their processes. So it’s specific pieces of their workflow that they go through for their due diligence process or their onboarding, know your customer process, that they’re trying to automate through AI," Buckethal said.

"One of the concerns for some of our customers is that they’re regulated industries and so they have to make sure they’re doing everything in lockstep with the regulators," she added. "They have regular meetings to make sure that everybody is aware of how they’re deploying AI and where and that everybody is comfortable with it and the results. We’re already seeing it in use, but people are taking it slowly, which I think is the right answer."

Buckethal noted that as companies implement AI in their due diligence processes to relieve some of the burden on often under-resourced compliance staff, they should reinvest the savings in further improving their corporate safeguards, because the threat from fraudsters using AI and other tools won’t diminish over time.

"They really need to take the savings from that shift and reinvest it in technologies to continue to look at this problem because the criminals aren’t going away. The technology isn’t slowing down, so they shouldn’t do this as a cost-cutting method and should be investing more in their risk and compliance teams," she said. "Criminals are thinking like CFOs, and business leaders need to think like criminals to understand where to invest in technology and resources."