Preparing for potential regulations around AI in electronic bill payment and presentment

The speed with which enterprise-level artificial intelligence has moved from the realm of theoretical -- if not outright science fiction -- to a widely adopted business tool has been nothing short of astonishing. The mad dash to find and implement applications for new, AI-based solutions is reminiscent of the rapid ascension of cloud technology in both fervor and consistency: the race to the cutting edge is taking place across industries.

Where sectors differ is in the level of caution their respective norms and requirements oblige them to apply in the process. While those in all industries ought to approach new technologies and tools with healthy prudence, in my personal opinion (and I am not an attorney), those of us in the electronic bill payment and presentment (EBPP) space must additionally consider the regulations that govern individuals’ privacy and the security of their data when it comes to something as private as their finances.

This measured stance has already shaped my industry’s approach to artificial intelligence. At the same time, the rapid implementation of AI solutions in other technology sectors has highlighted areas in need of further regulation, as was most recently apparent in the Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence issued in late October. Drawing on my more than 20 years in the field, including three at EngageSmart solution InvoiceCloud, I have some thoughts on how regulations around AI in EBPP will continue to evolve beyond these very early stages.

Data security

Naturally, the methods used to store data have a huge effect on its vulnerability to hackers and other bad actors. Data mining and phishing scams have increased in recent years, and AI is poised to further boost their sophistication. Fortunately, AI can also be used to help protect sensitive information, as is already the case with many email providers on the market.

The financial information that the EBPP sector handles every day is exactly the information that hackers seek to capture and exploit. For this reason, any AI-powered solution used to organize or store consumer financial data will have to be airtight. We can expect increasingly strict privacy regulations around data security as EBPP organizations begin to experiment with AI in the collection and storage of personal data. Laws and regulations already in place to protect data privacy, in addition to the most recent Executive Order, include two previous Executive Orders issued in 2019 and 2020, the California Consumer Privacy Act (CCPA), the Virginia Consumer Data Protection Act (CDPA), and the General Data Protection Regulation (GDPR) in the EU, among others.

Ethical use

Of course, AI is merely a tool, and as such it isn’t innately ethical or unethical, regardless of what the frantic tenor of AI conversations may lead you to believe. But as with any tool, it can be used or implemented unethically, whether accidentally or otherwise. I think we can expect to see regulations demanding transparency around the algorithms AI uses and, more to the point, accountability for the human decision-making behind AI behaviors. Ethical regulations could be designed to address biases or other forms of unfairness that humans may code into AI -- again, whether accidentally or otherwise. Such rules may be put in place to safeguard customers against discrimination and deception, and are likely to have an impact on things like the use of AI in customer interactions.

Transparency in general is a key tenet of the ethical use of any technology, and I expect that regulatory bodies will want to ensure EBPP businesses are transparent about all aspects of AI use by requiring them to disclose when AI has been or will be part of creating any product or experience. This not only helps customers feel more informed about how they pay their bills, but also provides the accountability and understanding of circumstances needed to identify and correct any mistakes that do occur.

Market manipulation

We know that artificial intelligence can be incredibly helpful in analyzing data, including the data used in assessing risk, so I expect we’ll see an increase in AI use cases like algorithmic trading -- and I expect these use cases to be highly regulated. After all, it’s not hard to imagine AI-powered risk assessment and algorithmic trading being used as tools to manipulate markets and possibly lead to financial instability. While this isn’t directly relevant to us in the EBPP space, the impacts of regulations that financial institutions face in one area frequently trickle down to affect other, related sectors. As EBPP providers offer an increasingly diverse range of payment options to consumers, including the ability to pay bills using cryptocurrency, it will be crucial that all markets touched by tech are closely regulated.

Reliability standardizations

In sectors like healthcare and fintech, in which the stakes are especially high for consumers and the consequences of mismanaged data are dire, we’re likely to see broad safety and reliability regulations and standardizations. EBPP organizations that work in these sectors or make tools used in these sectors -- for example, any product used in hospital billing -- would feel the impact of such regulations.

Healthcare bills in particular are a source of stress for many people, and technology use in healthcare is intensely regulated, often adding to that stress. The continued use of antiquated technology in healthcare settings is a significant pain point for both providers and patients, but the sector’s regulations can make modernization a challenge. EBPP companies operating in this space should account for the typically gradual pace of innovation in healthcare and introduce AI accordingly, with solutions that avoid system shock.

It’s important for those of us in the EBPP space to keep abreast of how our use of AI is likely to be regulated, and not just because of the compliance costs those demands are likely to impose. Many people in tech hear the word “regulation” and immediately think of a phenomenon that slows the pace of innovation and product development. While it’s true that regulations force organizations to take their time when adopting a new technology, I think it behooves organizations to treat these regulations not only as important for using AI as safely and ethically as possible, but also as an opportunity to keep ahead of the competition. Companies that are agile enough to adapt to regulatory demands quickly and faithfully will be in a position to shape the future landscape. Moreover, these players are going to inspire the most trust in customers who may currently view AI with trepidation.

Regulations can be a challenge for EBPP, but I think the industry could benefit from leaders considering these limitations and restrictions as opportunities for further success. After all, anything that makes our tools safer and more ethical is a win for all of us.

Ramesh Kandukuri is Chief Technology Officer, Enterprise Solutions at InvoiceCloud.
