Growing data volumes drive enterprise IT priorities


A new survey of 500 data and IT leaders who manage data workloads of 150 terabytes or more shows that growing volumes of data are increasingly driving business priorities.
The study, from hyperscale data analytics platform Ocient, finds 90 percent of IT and data leaders are planning to remove or replace existing big data and analytics technologies in the next six to 12 months.
New AI platform aims to open up access to data


Data is increasingly the lifeblood of businesses, but according to Gartner, poor data management costs organizations an average of $12.9 million a year.
Pliable aims to open up data access to businesses with the launch of a new AI-powered SaaS platform that lets them organize and leverage data without needing an engineer.
Data governance is top enterprise priority when introducing AI


IT and business leaders are largely allowing employee use of generative AI, but the majority (66 percent) are concerned about the data governance risks of AI, including privacy, security and a lack of data source transparency in vendor solutions.
The latest 2023 State of Unstructured Data Management survey from Komprise is based on responses from 300 global enterprise storage IT and business decision makers at companies with more than 1,000 employees in the US and UK, and finds 90 percent of organizations allow employee use of generative AI.
Building next-gen operational intelligence at scale


In today’s digital era, operational visibility is a prerequisite for businesses across sectors such as manufacturing, transportation and retail. However, managing this massive influx of rapid, real-time data can be challenging -- especially for organizations that don’t have the infrastructure in place.
This data generally takes the form of events such as clicks, telemetry, logs and metrics, often collected as time series or machine data. In contrast to transactional data collected via batch ingestion, this data is collected via real-time streaming.
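The distinction above can be made concrete with a small sketch. The code below models a stream of events (clicks, metrics and so on) and buckets them into fixed time windows incrementally as they arrive, rather than waiting for a complete batch; the event fields and window size are illustrative assumptions, not any specific platform's schema.

```python
from dataclasses import dataclass
from collections import defaultdict

# Illustrative event record -- field names are hypothetical.
@dataclass
class Event:
    timestamp: float   # epoch seconds
    kind: str          # e.g. "click", "telemetry", "log", "metric"
    value: float

def aggregate_stream(events, window_seconds=60):
    """Incrementally bucket events into fixed time windows as they
    arrive, instead of ingesting a finished batch."""
    counts = defaultdict(int)
    for event in events:  # in practice, a live stream of events
        window = int(event.timestamp // window_seconds)
        counts[window] += 1
    return dict(counts)

stream = [Event(0.0, "click", 1), Event(30.0, "click", 1), Event(65.0, "metric", 0.5)]
print(aggregate_stream(stream))  # two windows: {0: 2, 1: 1}
```

In a real deployment the `for` loop would be fed by a streaming source (a message queue or change feed) and the per-window counts would be emitted continuously, which is exactly what distinguishes this pattern from batch ingestion.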
Sensitive data is exposed in over 30 percent of cloud assets


New analysis of more than 13 billion files stored in public cloud environments reveals that more than 30 percent of cloud data assets contain sensitive information.
The study by Dig Security shows personally identifiable information (PII) is the most common type of sensitive data that organizations store. In a sample data set of a billion records, more than 10 million social security numbers were found -- the sixth most common type of sensitive information -- followed by almost three million credit card numbers, the seventh most common type.
The importance of data privacy in healthcare [Q&A]


Data is one of the biggest drivers of innovation in healthcare today. Almost everything in healthcare relies on having access to the right data, from developing new drugs and medical equipment to allocating resources.
Making use of this data often requires sharing it with other organizations, and that presents challenges when it comes to keeping it secure. We spoke to Riddhiman Das, co-founder and CEO at TripleBlind, to learn how healthcare organizations are securing their data while still making it accessible.
Storage challenges in a world of high-volume, unstructured data [Q&A]


The amount of data held by enterprises is growing at an alarming rate, yet it's often still being stored on 20-year-old technology.
Add to this a proliferation of different types of systems -- or even different storage platforms for specific use cases -- and you have greater complexity at a time when it’s hard to find new IT personnel.
The return of data modeling -- this time it's strategic [Q&A]


Over the past decade, data modeling -- setting up data structures aligned to business requirements -- has tended to take a back seat as businesses have rushed to bring products to market.
But we're producing more data than ever and need ways to process it effectively. That's why Satish Jayanthi, CTO and co-founder at Coalesce, believes it's time for data modeling to make a comeback in enterprise strategy. We spoke to him to find out more.
Shifting left to improve data reliability [Q&A]


The concept of 'shifting left' is often used in the cybersecurity industry to refer to addressing security earlier in the development process.
But it's something that can be applied to data management too. Shifting left in this sense means performing data reliability checks sooner: executing tests earlier in data pipelines helps keep bad data out of production systems.
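As a minimal sketch of the shift-left idea, the code below validates records at ingestion time and quarantines anything that fails, so bad rows never reach downstream production tables. The validation rules and field names are illustrative assumptions, not a specific product's API.

```python
def validate_record(record):
    """Return a list of reliability problems for one record (empty = clean)."""
    problems = []
    if record.get("user_id") is None:
        problems.append("missing user_id")
    if not isinstance(record.get("amount"), (int, float)):
        problems.append("amount is not numeric")
    elif record["amount"] < 0:
        problems.append("negative amount")
    return problems

def ingest(records):
    """Split incoming records into clean rows and quarantined rows at the
    start of the pipeline, before anything lands in production."""
    clean, quarantined = [], []
    for record in records:
        problems = validate_record(record)
        if problems:
            quarantined.append((record, problems))
        else:
            clean.append(record)
    return clean, quarantined

good, bad = ingest([
    {"user_id": 1, "amount": 9.99},
    {"user_id": None, "amount": "oops"},
])
print(len(good), len(bad))  # 1 1
```

The same checks could instead run after loading, but placing them at ingestion -- the "left" end of the pipeline -- means failures are caught where they are cheapest to fix.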
Analyzing Identity and Access Management (IAM) through the lens of data management


Today, an organization's identity data is its linchpin. This invaluable asset binds an enterprise together, orchestrating access rights, establishing a unified and accurate view of users across various channels, and empowering informed security decisions.
However, the identity infrastructure is continuously becoming more complex as businesses expand their digital landscape, bringing more users, devices, and applications within their IT environment. With this increasing complexity, poor data management can engender substantial financial losses and jeopardize sensitive user and customer data.
Defending your organization from illegal data's wrath


In today's interconnected world, businesses not only grapple with the management of vast amounts of data but also face the looming threat of illegal data concealed within their digital repositories. This proliferation of illegal data presents a range of risks and challenges that organizations must confront.
Illegal data encompasses a broad spectrum of content or files that contravene laws, regulations, and/or company policy. It includes materials such as pirated software, confidential information obtained through unlawful means, and content that promotes or facilitates illegal activities, as well as content that is simply not acceptable or useful on the corporate network, such as holiday videos and cat pics.
Businesses struggle to make decisions due to 'analysis paralysis'


In difficult economic times, businesses need to make decisions quickly, and data is a key part of enabling those choices.
But research from analytics cloud platform Alteryx shows enterprises are struggling to make timely, insight-driven decisions because of 'analysis paralysis' caused by issues around ownership of and access to data.
More than half of enterprises overwhelmed by data


Today's typical large organization holds 35 petabytes of data across its systems, and this is expected to double by 2025. But 75 percent of IT leaders are concerned that their current infrastructure won't be able to scale to meet this demand.
A new report, from infrastructure specialist Hitachi Vantara, shows that while technologies like generative AI are spurring a goldrush to greater insights, automation, and predictability, they are simultaneously putting pressure on the already-strained infrastructure and hybrid cloud environments on which they run.
Dealing with the data authorization blindspot [Q&A]


User authorization for access to data is complicated. Knowing who has access to what information is often difficult because of complex role hierarchies, different authorization models used for different technologies, and the variety of data that may be accessible across technologies and clouds.
Ben Herzberg, chief scientist at data security platform Satori, believes there's often a blindspot around authorization, but that the issue doesn't have to be as complex as it can seem. We talked to him to learn more.
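One reason "who has access to what" is hard to answer is that effective permissions must be resolved through a role hierarchy rather than read off a single table. The sketch below walks a hypothetical hierarchy and unions every inherited grant; the role and permission names are invented for illustration.

```python
ROLE_PARENTS = {            # child role -> roles it inherits from
    "analyst": ["employee"],
    "admin": ["analyst"],
}
ROLE_GRANTS = {             # direct grants per role
    "employee": {"read:public"},
    "analyst": {"read:sales"},
    "admin": {"write:sales"},
}

def effective_permissions(role):
    """Walk the role hierarchy and union every inherited grant."""
    seen, stack, grants = set(), [role], set()
    while stack:
        r = stack.pop()
        if r in seen:
            continue
        seen.add(r)
        grants |= ROLE_GRANTS.get(r, set())
        stack.extend(ROLE_PARENTS.get(r, []))
    return grants

print(sorted(effective_permissions("admin")))  # admin inherits via analyst and employee
```

Multiply this by several authorization models across different technologies and clouds, each with its own hierarchy, and the blindspot Herzberg describes becomes easy to see.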
Data clean rooms: The power of second-party data


A staggering 81 percent of advertisers depend on third-party data to reach customers and understand prospects’ buying habits. Their reliance on this data, however, comes with a problem. Exponential cookie decay, government legislation, and increasing consumer demand for data privacy make accessing this data more difficult.
Many brands are turning to data clean rooms (DCRs) as a solution. DCRs help companies leverage second-party data to hone their marketing and advertising. In fact, 80 percent of advertisers with media buying budgets over $1 billion will use DCRs by the end of 2023. So, what makes DCRs so popular? This article will show how DCRs can be an incredibly powerful MarTech tool that fosters collaboration among brands, enabling them to gain insights, form 'lookalike' audiences, and advertise directly to their user base.
BetaNews, your source for breaking tech news, reviews, and in-depth reporting since 1998.
© 1998-2025 BetaNews, Inc. All Rights Reserved.