ChatGPT: Navigating the rising financial crime landscape in the digital age

In-depth discussions with financial crime compliance decision makers from 10 leading U.S. financial institutions reveal that real-time digital payments, digital fraud, and cybercrime are the primary concerns for compliance teams in 2023. That said, a new player has entered the scene and demands our attention: ChatGPT. It has the dual ability to help or hurt compliance and security teams.

While this cutting-edge technology presents an opportunity for financial institutions to detect and mitigate fraud and financial crime, it also gives criminals an avenue to commit these acts more easily.

The Expanding Influence of ChatGPT

Since its introduction in November 2022, ChatGPT has experienced exponential growth, boasting over 1 billion users by March 2023. Unsurprisingly, criminals have swiftly embraced it for their illicit activities, leveraging its capabilities to create convincing fake profiles, documents, and transactions that can easily bypass even the most well-trained compliance personnel. ChatGPT also serves as a breeding ground for the development of bots and malware used to execute cybercrime schemes and perpetrate scams aimed at obtaining sensitive financial information. AI-generated messages compound the threat by enhancing the realism of impersonations, making scams more difficult to detect. Instances of ChatGPT being employed to create legitimate-looking social media personas for data theft, and even to monitor cryptocurrency prices and payments, have already come to light. The potential for fraud and scams to be "turbocharged" by ChatGPT, which FTC chair Lina Khan has highlighted, poses a significant challenge for compliance teams striving to differentiate between criminal and legitimate transactions.

Fraud and Financial Crime are Interrelated

Traditionally, financial institutions have treated fraud and financial crime separately, with fraud primarily associated with payments and financial crime linked to money laundering. But often, they are interrelated, with money laundering serving to hide the money stolen via fraud. As a result, many financial institutions have merged their fraud and financial crime teams into a single integrated unit to gain a more holistic view of these threats.

ChatGPT's Impact on Financial Crime

The rise of real-time payments and cybercrime has produced a surge over the past three years in financial crime cases at large U.S. banks involving digital payments, account takeover, and payments related to ransomware and cryptocurrencies. The proliferation of bots and synthetic identities has powered this surge, and ChatGPT gives fraudsters a new tool to further scale their efforts. Of particular concern is the connection between ransomware, cryptocurrencies, and sanctions evasion, most notably by Russian-related entities. Gangs perpetrating ransomware attacks often demand payment in crypto assets, which are subsequently converted into fiat currency and laundered. Recent collaborations between the U.S. Treasury's Office of Foreign Assets Control (OFAC) and the UK's Office of Financial Sanctions Implementation (OFSI) indicate an escalation in criminal activity in this realm. Criminal organizations can employ ChatGPT to develop both malware and counterfeit credentials, which enables them to evade sanctions and intensifies the challenges compliance teams face.

Harnessing AI Solutions in the Fight Against Financial Crime

The complexity of emerging criminal typologies, the speed and volume of real-time payments, and the growing intricacy of sanctions demand a proactive response from financial institutions. Compliance teams simply cannot manually monitor and catch each real-time transaction. To combat this, financial institutions must employ digital identity solutions and leverage AI/machine learning technologies. Ironically, while ChatGPT serves as a valuable tool for fraudsters, it can also play a crucial role in detecting anomalies, cross-referencing against sanctions lists, and reducing false positives by analyzing vast amounts of data on individuals and transaction histories.
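To make the sanctions-screening idea above concrete, here is a minimal sketch of fuzzy name matching against a watch list. It is illustrative only: the entries in SANCTIONS_LIST are hypothetical, and Python's standard-library SequenceMatcher stands in for the far more sophisticated ML models production screening systems use; the normalization and threshold choices are assumptions, not a real compliance configuration.

```python
from difflib import SequenceMatcher

# Hypothetical watch-list entries for illustration only -- a real system
# would load the current OFAC SDN list, not a hard-coded sample.
SANCTIONS_LIST = [
    "IVAN PETROV",
    "ACME SHELL HOLDINGS LTD",
    "GLOBAL TRANSFER SERVICES",
]

def normalize(name: str) -> str:
    """Uppercase and collapse whitespace so trivial edits don't defeat matching."""
    return " ".join(name.upper().split())

def sanctions_matches(name: str, threshold: float = 0.85):
    """Return (entry, similarity) pairs at or above the threshold.

    Fuzzy matching is what lets screening catch deliberate misspellings
    (e.g. "Petroff" for "Petrov") that exact lookups would miss.
    """
    target = normalize(name)
    hits = []
    for entry in SANCTIONS_LIST:
        score = SequenceMatcher(None, target, normalize(entry)).ratio()
        if score >= threshold:
            hits.append((entry, round(score, 2)))
    return hits

print(sanctions_matches("ivan  petrov"))   # exact hit after normalization
print(sanctions_matches("Ivan Petroff"))   # near match despite altered spelling
print(sanctions_matches("Jane Doe"))       # no hit
```

Tuning the threshold is the core trade-off the article describes: lower it and false positives swamp analysts; raise it and evasive spellings slip through, which is where ML-driven scoring earns its keep.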

Reflecting on the potential of ChatGPT, a Vice President, AML, from a Tier-1 bank said, “I have just been playing around with ChatGPT in the last few weeks, and there's the opportunity for some of these things to completely rewrite our entire workflow. I think that we are going to need to be able to monitor AML threats in real-time, and we're going to need to turn to machine learning to start generating that.” ChatGPT is a transformative technology, both for cybercriminals and the financial institutions working to thwart them. When it comes to the threat of generative AI like ChatGPT, financial institutions must learn to fight fire with fire.

Image credit: BiancoBlue/

Christopher Reimann, MBA, is Vice President & Principal at KS&R.

