Deepfakes pose growing fraud risk to contact centers

Deepfake attacks, including sophisticated synthetic voice clones, are rising, posing an estimated $5 billion fraud risk to US contact centers, according to the latest Pindrop Voice Intelligence and Security Report.

Contact center fraud has surged by 60 percent in the last two years, reaching the highest levels since 2019. By the end of this year, one in every 730 calls to a contact center is expected to be fraudulent.

US consumers are particularly concerned about the risk of deepfakes and voice clones, with 67.5 percent of those polled expressing anxiety about these threats in banking and the financial sector. Banks, credit unions, and high net-worth individuals are increasingly targeted via sophisticated fraud tactics, highlighting the need for innovative tools and strategies to establish an effective fraud prevention framework.

Retail fraud, including refund abuse, has also quadrupled across Pindrop's customer call centers through 2023. Traditional manual fraud detection and authentication systems are proving ineffective against these attacks, necessitating advances in fraud prevention and authentication methods.

"The rapid advancements in AI have transformed deepfakes from a novelty to a serious threat for institutions and consumers alike," says Pindrop co-founder and CEO Vijay Balasubramaniyan. "Generative AI fundamentally breaks trust in commerce, media, and communication. Attackers are using sophisticated AI tools at an alarming rate. We need good AI to beat bad AI."

The rise of generative AI has made deepfakes a more potent force. Using tools like ChatGPT, fraudsters can now craft more targeted, individual-specific attacks, elevating the risk of fraud.

The full report is available from the Pindrop site.

The company is also launching a new Pulse Deepfake Warranty as part of a three-year Pindrop subscription, which will reimburse customers for synthetic voice fraud events that go undetected by the Pindrop Product Suite.

