New investment, state legislation and more penalties -- privacy predictions for 2025

As both businesses and individuals become more reliant on connectivity and data, concerns around privacy are increasingly coming to the fore.

Here are the views of some industry experts on what the privacy landscape may look like in 2025.

Ravi Srivatsav, CEO of DataKrypto, says there's a need for new investment. "Companies will increasingly address data privacy strategically and operationally, investing in new infrastructure and technology to develop stringent data protection to avoid the costly consequences of cybersecurity attacks. Conversely, such investments can create new attack surfaces, which will be addressed with innovative privacy-enhancing technologies (PETs) like secure multi-party computation (SMPC), trusted execution environments (TEEs), confidential computing, and fully homomorphic encryption (FHE)."
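Srivatsav's list of PETs can feel abstract. As a rough illustration of the idea behind SMPC, additive secret sharing lets parties compute on values that no single holder can read. This is a toy sketch only (the prime modulus and helper names are ours, not from DataKrypto or any particular library; production SMPC involves far more machinery):

```python
import secrets

PRIME = 2**61 - 1  # arithmetic is done modulo a large prime

def share(value, n_parties=3):
    """Split `value` into n random shares that sum to it mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recombine shares to recover the hidden value."""
    return sum(shares) % PRIME

# Two parties' salaries, each split so no single share-holder sees either one.
a_shares = share(90_000)
b_shares = share(110_000)

# Each share-holder adds its shares locally; only the combined total is revealed.
sum_shares = [(a + b) % PRIME for a, b in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 200000
```

The point is that useful aggregates (here, a sum) can be computed while the individual inputs stay hidden, which is the trade-off these technologies are designed to exploit.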

Lorri Janssen-Anessi, director of external cyber assessments at BlueVoyant, thinks we'll see more enforcement of privacy rules. "Building on frameworks like the GDPR, more regions could enforce privacy rights, obligating organizations to limit data collection, improve transparency, and seek explicit consent for data use. Companies may face stringent requirements to secure consumer data, notify users of breaches quickly, and demonstrate the minimum collection of personal information."

Gabrielle Hempel, customer solutions engineer at Exabeam, thinks the absence of federal AI and privacy law in the US will lead individual states to act. "The absence of a comprehensive federal AI and data privacy law will lead states to take matters into their own hands. California, Colorado and other states will continue introducing AI regulations, forcing companies to navigate a complex patchwork of legal standards. As AI becomes more ingrained in business operations, the lack of a national framework will create compliance challenges across industries. Without swift federal action, expect more states to legislate AI usage, and companies to be caught in an increasingly fragmented regulatory landscape."

This is echoed by BreachRx CEO Andy Lunsford:

Under the new administration, US federal oversight and regulatory enforcement will likely decrease. Historically, states have stepped in to fill the void -- leading to heightened scrutiny from jurisdictions like California and New York, especially within frameworks such as the NYDFS Cybersecurity Regulation. This pattern of state-level intervention is not new; we’ve seen it in other sectors, such as automotive, and it underscores the need for companies to remain proactive.

The idea that organizations can simply 'sit back and wait' for conditions to improve is a dangerous misconception. With over 50 state-level laws applicable to data privacy and security, businesses face a fragmented compliance landscape that could become more intricate and costly as states enact their own measures. Companies must prepare for this evolving complexity by strengthening their incident response capabilities and ensuring they are equipped to navigate a web of diverse requirements.

Maurice Uenuma, VP and GM, Americas at Blancco, shares this view too. "The growing patchwork of data privacy regulations across the US, many of which are similar and overlap, will continue to increase compliance burdens on organizations that create, process, store, and transmit sensitive data in 2025. Since California's passage of the California Consumer Privacy Act, later superseded by the California Privacy Rights Act, over 20 states have passed comprehensive privacy laws. Many of these have already been passed into law but will take effect on a rolling basis through 2026 and beyond. To overcome compliance paralysis, organizations will need to be highly organized and efficient. Mature governance (from the board on down), repeatable processes, and tools -- including Governance, Risk and Compliance (GRC) platforms -- will be critical to minimize compliance-related risks."

Nico Chiaraviglio, chief scientist at Zimperium, sees a strong role for mobile security in addressing privacy concerns. "Mobile security plays a crucial role in addressing the needs of data privacy. However, we often see mobile security through the lens of threat defense and application security. But regulatory compliance is a key piece of the mobile security function. I predict that in 2025, we will see mobile security prioritizing data privacy needs by implementing robust privacy-preserving technologies. According to Zimperium's 2024 Global Mobile Threat Report, 82 percent of organizations allow bring your own device (BYOD) at work. And a recent survey from Tableau found that 63 percent of Internet users believe most companies aren't transparent about how their data is used, and 48 percent have stopped shopping with a company because of privacy concerns. We will likely see more regulatory compliance baked into mobile security solutions, particularly around data handling and encryption standards. We are already seeing regulatory shifts in the financial sector, holding app developers accountable for any harm to their end users due to external attacks. Businesses are recognizing that regulatory compliance features are a necessary piece of the mobile security stack, and they are seeking mobile security platforms that address both privacy and security needs."

Bryan Kirschner, vice president of strategy at DataStax, thinks there will need to be a balance between fostering innovation in AI and protecting privacy and security:

You can balance innovation and privacy by providing consumers with authentic choices on how their data gets used. Transparency allows consumers to make informed choices about contributing their data for AI training, especially if they understand the broader benefits, such as improved tools accessible to everyone, like free versions of AI products (e.g., ChatGPT). However, there is a critical distinction between data used for general societal benefit and data used for limited, commercial purposes. When data is used for private, profit-driven applications (e.g., targeted marketing), there should be a fair exchange of value between the company and the consumer that’s focused on 'earning' rather than 'assuming' unrestricted data access.

Speaking as a consumer myself, do I have any qualms about Netflix using data to suggest new shows to me or develop new concepts for shows? Quite the opposite, I'm rooting for them!

Ultimately, this balance can be maintained by ensuring consumers have a say in whether their data contributes to AI development for public good versus private gain. This way, innovation is encouraged, but not at the expense of individual privacy and security, and companies are incented to 'compete on trust.'

Dan Hauck, chief product officer at NetDocuments, says, "Forward-thinking organizations will stand out by staying ahead of evolving regulations. More businesses will focus on privacy and data security, adopt explainable AI models, and integrate human oversight into AI-driven processes. Ethical AI committees will become increasingly common, helping companies protect confidentiality, drive innovation, and keep ethics at the forefront."

Geoff Hixon, VP solutions engineering at Lakeside Software, thinks integrated AI will help protect privacy. "In 2025, AI-integrated PCs will feature advanced security frameworks that continuously monitor for and adapt to emerging threats in real-time. Distributed computing will further enhance privacy by allowing sensitive data processing to occur locally, reducing the need to transfer data and thus minimizing exposure. This dual focus on security and privacy will offer users a safer and more controlled computing environment."

Chris Gaebler, chief marketing officer at Protegrity, thinks we'll see more organizations being penalized for non-compliance.

…as governments introduce new regulations to improve data privacy and keep customer information secure, we anticipate seeing several more organizations be penalized with fines due to noncompliance and, at the same time, lose customer trust.

In the search for improved predictability and profitability in 2025, boards may put their organizations at risk of breaches and regulation noncompliance if they don't enforce measures to safeguard their data. This means implementing robust security frameworks that allow them to leverage AI and maintain stringent security, which is also key to building customer trust and protecting their reputations.

Image credit: md3d/depositphotos.com

© 1998-2024 BetaNews, Inc. All Rights Reserved.