49 percent of global businesses targeted by deepfakes

A new report from Regula reveals that 49 percent of businesses globally have experienced deepfake scams involving either audio or video -- almost double the share reporting such incidents in 2022.

The survey, of 575 business decision makers, shows a significant rise in the prevalence of video deepfakes, with a 20-percentage-point increase in companies reporting incidents compared to 2022.

While 29 percent of fraud decision-makers across Australia, France, Germany, Mexico, Turkey, UAE, UK, and the USA reported encountering video deepfake fraud in 2022, this year's data -- covering the USA, UAE, Mexico, Singapore, and Germany -- shows this figure has surged to 49 percent.

Audio deepfakes are also on the rise, with the latest report showing a 12 percent increase compared to 2022 survey data. However, there are differences between sectors: audio deepfakes prevail over video ones in three of the surveyed sectors -- financial services (51 percent), aviation (52 percent), and crypto (55 percent) -- while law enforcement (56 percent), technology (57 percent), and FinTech (57 percent) report more video face scams.

"Our latest survey demonstrates that AI-generated identity fraud has become an everyday reality. The surge in deepfake incidents over the two-year period of our survey leaves businesses no choice but to adapt and rethink their current verification practices," says Ihar Kliashchou, chief technology officer at Regula. "Deepfakes are becoming increasingly sophisticated, and traditional methods are no longer enough. What we think may work well is the liveness-centric approach, a robust procedure that involves checking the physical characteristics of both individuals and their documents; in other words, verifying biometrics and ID hardcopies in real-time interactions. This is what we adhere to in our R&D and what we recommend that our customers do to protect themselves,"

You can read more about the study on the Regula blog.

Image credit: Wrightstudio/Dreamstime.com

