Data

Storage challenges in a world of high-volume, unstructured data [Q&A]

The amount of data held by enterprises is growing at an alarming rate, yet it's often still being stored on 20-year-old technology.

Add to this a proliferation of different types of systems -- or even different storage platforms for specific use cases -- and you have greater complexity at a time when it’s hard to find new IT personnel.

By Ian Barker

The return of data modeling -- this time it's strategic [Q&A]

Over the past decade data modeling -- setting up data structures aligned to business requirements -- has tended to take something of a back seat as businesses have rushed to bring products to market.

But we're producing more data than ever and need ways to process it effectively. That's why Satish Jayanthi, CTO and co-founder at Coalesce, believes it's time for data modeling to make a comeback in enterprise strategy. We spoke to him to find out more.

By Ian Barker

Shifting left to improve data reliability [Q&A]

The concept of 'shifting left' is often used in the cybersecurity industry to refer to addressing security earlier in the development process.

But it's something that can be applied to data management too. Shifting left in this sense means performing data reliability checks sooner. The ability to execute data reliability tests earlier in the data pipelines helps keep bad data out of production systems.

By Ian Barker

Analyzing Identity and Access Management (IAM) through the lens of data management

Today, an organization's identity data is its linchpin. This invaluable asset binds the enterprise together, orchestrating access rights, establishing a unified and accurate view of users across various channels, and empowering informed security decisions.

However, the identity infrastructure is continuously becoming more complex as businesses expand their digital landscape, bringing more users, devices, and applications within their IT environment. With this increasing complexity, poor data management can engender substantial financial losses and jeopardize sensitive user and customer data.

By Wade Ellery

Defending your organization from illegal data's wrath

In today's interconnected world, businesses not only grapple with the management of vast amounts of data but also face the looming threat of illegal data concealed within their digital repositories. This proliferation of illegal data presents a range of risks and challenges that organizations must confront.

Illegal data encompasses a broad spectrum of content and files that contravene laws, regulations, or company policy. It includes materials such as pirated software, confidential information obtained through unlawful means, and content that promotes or facilitates illegal activities, as well as content that simply isn't acceptable or useful on the corporate network, such as holiday videos and cat pics.

By Michael Jack

Businesses struggle to make decisions due to 'analysis paralysis'

In difficult economic times businesses need to make decisions quickly and data is a key part of enabling those choices.

But research from analytics cloud platform Alteryx shows enterprises are struggling to make timely, insight-driven decisions because of 'analysis paralysis' caused by issues around ownership of and access to data.

By Ian Barker

More than half of enterprises overwhelmed by data

Today's typical large organization is holding 35 petabytes of data across its systems and this is expected to double by 2025. But 75 percent of IT leaders are concerned that their current infrastructure won't be able to scale to meet this demand.

A new report, from infrastructure specialist Hitachi Vantara, shows that while technologies like generative AI are spurring a goldrush to greater insights, automation, and predictability, they are simultaneously putting pressure on the already-strained infrastructure and hybrid cloud environments on which they run.

By Ian Barker

Dealing with the data authorization blindspot [Q&A]

User authorization for access to data is complicated. Knowing who has access to what information is often difficult because of complex role hierarchies, different authorization models used for different technologies, and the variety of data that may be accessible across technologies and clouds.

Ben Herzberg, chief scientist at data security platform Satori, believes there's often a blindspot around authorization, but that the issue doesn't have to be as complex as it can seem. We talked to him to learn more.

By Ian Barker

Data clean rooms: The power of second-party data

A staggering 81 percent of advertisers depend on third-party data to reach customers and understand prospects’ buying habits. Their reliance on this data, however, comes with a problem. Exponential cookie decay, government legislation, and increasing consumer demand for data privacy make accessing this data more difficult.

Many brands are turning to data clean rooms (DCR) as a solution. DCRs help companies leverage second-party data to hone their marketing and advertising. In fact, 80 percent of advertisers with media buying budgets over $1 billion will use DCRs by the end of 2023. So, what makes DCRs so popular? This article will show how DCRs can be an incredibly powerful MarTech tool that fosters collaboration among brands, enabling them to gain insights, form ‘lookalike’ audiences, and advertise directly to their user base.

By Derek Slager

End of life data risks sustainability targets

According to a new survey, 88 percent of respondents say environmental sustainability has a high to moderate influence on their approach to processing end-of-life (EOL) data.

But more than a third (39 percent) of enterprises are yet to implement a plan to reduce their data footprint, leaving them at risk of compliance failures in light of upcoming sustainability regulations.

By Ian Barker

Complex environments mean enterprises can't use a third of their data effectively

New research from hybrid data company Cloudera reveals that organizations currently estimate they are not using 33 percent of their data effectively.

The survey of 850 IT decision makers (ITDMs) across the EMEA region shows that 72 percent of respondents agree that having data sitting across different cloud and on-premises environments makes it complex to extract value from all the data in their organization.

By Ian Barker

Trust in data: How start-ups can thrive in the data economy

Data is crucial in today's tech-driven world, with enterprises prioritizing its use in all aspects of their operations. A recent survey shows that 83 percent of CEOs want their organizations to be data-driven.  However, the same survey found that only 25 percent of organizations are data-leading companies. This presents a significant opportunity for start-ups to establish themselves at the forefront of the data economy. 

The early days of the data economy relied on users handing over their data to access digital services, with companies then monetizing that data through advertising. A transition is now underway as businesses seek to improve and broaden how they create, manage, analyze, and extract value from their data. This shift will expand the data economy's definition and market potential, creating an opportunity for start-ups to build the hardware and software that will enable this new era.

By Tom Henriksson

Data quality incidents take two days or more to resolve

The latest State of Data Quality survey from Bigeye finds that more than half of respondents have experienced five or more data issues over the last three months.

Another 40 percent say they have experienced moderate to severe data incidents within the last six months, requiring a major effort to avoid damage to the business. These incidents range from those severe enough to impact the company's bottom line to those that reduce engineer productivity.

By Ian Barker

How data and analytics build a stable future for manufacturers

The CHIPS and Science Act promises a bright future for the U.S. semiconductor industry. The legislation aims to increase domestic production capacity, build a stronger workforce and encourage American innovation. But high-tech manufacturers can't sit around waiting to reap the benefits -- they must focus on revenue optimization now to set themselves up for success.

Experts forecast semiconductor demand to surge 6-8 percent per year, requiring manufacturers to double current production. Despite the CHIPS Act inspiring $200 billion in new commitments to U.S. manufacturing, the industry is unlikely to experience significant production capacity growth for several years. What should companies do in the interim? Improve data and analytics processes to build better business practices.

By Chris Shrope

Dealing with data: What to do and where to store it

Today’s digitally-enabled organizations generate huge volumes of data that need to be stored, maintained, and accessed. Indeed, it’s been estimated that around 2.5 quintillion bytes of data are created globally every day, a figure that is escalating rapidly as enterprises pursue big data analytics and IoT projects.

On top of this, the rising use of video, rich imaging, and AI applications means that much of this data is 'unstructured' -- so much so that, according to IDC, as much as 80 percent of the world’s data will be unstructured by 2025. All of this adds further complexity when it comes to storing and preserving data so that it remains accessible and available for analysis.

By Eric Bassier

We don't just report the news: We live it. Our team of tech-savvy writers is dedicated to bringing you breaking news, in-depth analysis, and trustworthy reviews across the digital landscape.


© 1998-2025 BetaNews, Inc. All Rights Reserved.