The rise of first-party data: Why quality matters over quantity


For years, digital marketers have paid hand over fist in the digital gold rush for data. Instead of selling a tangible product, tech companies earn millions in revenue from the data they collect on past, current and future digital consumers. But digital marketers who gobble up as much data as they can for their campaigns, without stopping to consider its source or the methods used to collect it, are taking the wrong approach. The age-old mantra of "quality over quantity" has never been more relevant in online advertising, and marketers must quickly and fully embrace first-party data or risk their digital campaigns (and bottom lines) falling flat.
The primary reason to use first-party data over third-party data from data marketplace platforms is simple: it's better. Publishers, apps and ad platforms alike can gather first-party data directly from their audiences and customers, whether that data comes from purchases, app downloads, in-app actions, social media interactions, or subscriptions. Because it comes straight from the source, this data is as precise and accurate as possible. That is in stark contrast to third-party data, which is aggregated from multiple platforms and combined into larger data sets whose buyers generally do not know the exact sources of their data.
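For a sense of what gathering data directly from the source can look like, here is a minimal, purely illustrative Python sketch (every class and function name in it is hypothetical) of an app recording its own users' actions, such as purchases and subscriptions, into its own event store, so the provenance of every record is known.

```python
# Illustrative only: an app capturing first-party events from its own,
# consented users instead of buying aggregated third-party segments.
# All names (Event, FirstPartyEventStore, record_event) are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class Event:
    user_id: str      # a known user of our own product
    action: str       # e.g. "purchase", "subscribe", "app_install"
    detail: str = ""
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


@dataclass
class FirstPartyEventStore:
    """Collects events straight from the source, so provenance is never in doubt."""
    events: List[Event] = field(default_factory=list)

    def record_event(self, user_id: str, action: str, detail: str = "") -> None:
        self.events.append(Event(user_id, action, detail))

    def events_for(self, action: str) -> List[Event]:
        return [e for e in self.events if e.action == action]


store = FirstPartyEventStore()
store.record_event("user-42", "subscribe", "monthly-plan")
store.record_event("user-42", "purchase", "sku-1001")
print(len(store.events_for("purchase")))  # -> 1
```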
Is dark data valuable?


A tsunami of dark data is coming: data that has never been analyzed, tagged, classified, organized or evaluated, never been used to predict future states or control other processes, and never been put up for sale for others to use. So, what do we do with this data? First, we have to understand that exponentially more is coming. We see this in autonomous technology, where vehicles generate four thousand gigabytes of data per day.
Data is also becoming more complex, as much of it is already in video or other complicated forms, and seemingly free storage is encouraging people to store more and defer deletion.
Delta Lake to be hosted by the Linux Foundation


All organizations want to get more value from their data, but can be hampered by the lack of reliable information within data lakes.
The Delta Lake project addresses data reliability challenges by making transactions ACID (Atomicity, Consistency, Isolation, Durability) compliant, enabling concurrent reads and writes. It also helps to ensure that the data lake stays free of corrupt and non-conformant data.
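To make that concrete, here is a minimal PySpark sketch of what working against a Delta Lake table can look like, assuming a Spark session configured with the Delta Lake package (the table path and column names are illustrative): each write commits as a single transaction, and appends whose schema doesn't match the table are rejected rather than silently corrupting the lake.

```python
from pyspark.sql import SparkSession

# Assumes the Delta Lake package is available to Spark, e.g. started with
# --packages io.delta:delta-core_2.12:<version> (exact coordinates vary by release).
spark = (
    SparkSession.builder
    .appName("delta-lake-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

path = "/tmp/events_delta"  # illustrative table location

# Each write is an ACID transaction: concurrent readers see either the previous
# committed version of the table or the new one, never a half-written state.
events = spark.createDataFrame([(1, "click"), (2, "purchase")],
                               ["user_id", "action"])
events.write.format("delta").mode("append").save(path)

# Schema enforcement: rows that don't conform to the table's schema are
# rejected instead of quietly corrupting the data lake.
bad_rows = spark.createDataFrame([("oops", 3.14)], ["wrong_col", "other"])
try:
    bad_rows.write.format("delta").mode("append").save(path)
except Exception as err:
    print("rejected non-conformant data:", type(err).__name__)

spark.read.format("delta").load(path).show()
```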
Almost half of employees have access to more data than they need


A new study of over 700 full-time US employees reveals that 48 percent of employees have access to more company data than they need to perform their jobs, while 12 percent say they have access to all company data.
The survey by business app marketplace GetApp also asked employees what data protection classifications are in place at their company. No single data classification was found to be in use by more than a third of businesses.
Enterprises are modernizing data architectures but still have major concerns


A new study of over 300 IT executives at large enterprises by database company DataStax reveals that all of them are modernizing their data architecture, but most are still struggling with major challenges.
The results show 99 percent of IT execs report challenges with architecture modernization and 98 percent with their corporate data architectures (data silos). Vendor lock-in (95 percent) is also a key concern among respondents.
Data fragmentation is the main reason public cloud doesn't deliver


When IT managers adopted the cloud, they believed it would simplify operations, increase agility, reduce costs, and provide greater insight into their data. Yet 91 percent say it hasn't delivered all the expected benefits, and 88 percent say it isn't meeting management expectations.
A new study of 900 senior decision makers, for data management company Cohesity carried out by Vanson Bourne, finds that of those who feel the promise of public cloud hasn't been realized, 91 percent believe it's because their data is fragmented in and across public clouds and could become nearly impossible to manage long term.
Quality issues with training data are holding back AI projects


For many organizations, AI and machine learning are seen as a route to greater efficiency and competitive advantage.
But according to a new study conducted by Dimensional Research for Alegion, almost eight out of 10 enterprise organizations currently engaged in AI and ML report that projects have stalled, and 96 percent of those companies have run into problems with data quality, with the data labeling required to train AI, and with building model confidence.
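The data labeling problem in particular is easy to picture. The sketch below is not the study's methodology; it simply illustrates one common quality check, in which each training item is labeled by several annotators and items with weak agreement are routed back for review before they are used to train a model (the names and threshold are assumptions for the example).

```python
# Illustrative training-data quality check: keep labels that most annotators
# agree on, and flag low-agreement items for human review.
from collections import Counter
from typing import Dict, List, Tuple


def majority_label(labels: List[str]) -> Tuple[str, float]:
    """Return the most common label and the share of annotators who chose it."""
    counts = Counter(labels)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(labels)


def split_by_agreement(
    annotations: Dict[str, List[str]], min_agreement: float = 0.8
) -> Tuple[Dict[str, str], List[str]]:
    """Accept items whose annotators mostly agree; send the rest back for review."""
    accepted, needs_review = {}, []
    for item_id, labels in annotations.items():
        label, agreement = majority_label(labels)
        if agreement >= min_agreement:
            accepted[item_id] = label
        else:
            needs_review.append(item_id)
    return accepted, needs_review


# Hypothetical annotations: three labelers per item.
raw = {
    "img_001": ["cat", "cat", "cat"],
    "img_002": ["cat", "dog", "dog"],  # weak agreement -> review
    "img_003": ["dog", "dog", "cat"],  # weak agreement -> review
}
clean, review = split_by_agreement(raw)
print(clean)   # {'img_001': 'cat'}
print(review)  # ['img_002', 'img_003']
```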
UK consumers want businesses to do more to protect their data


Protecting your digital footprint is growing more important, and the results of a survey of 2,000 UK adults by Kaspersky Lab find that people believe there is not enough business or state protection currently in place to defend it.
The study finds 41 percent of UK respondents think that businesses should do more to protect their personal data, including passwords, addresses and bank account details, from hacking.
15 percent of IT professionals have more data sources than they can count


Data is essential for modern business, but managing it effectively can be difficult. Big data can be just too big to handle.
A new survey from unified operations specialist Ivanti looks at the challenges IT professionals face when it comes to silos, data and implementation.
More than half of companies have sensitive files open to all employees


The latest data risk report from security company Varonis reveals that 53 percent of companies have at least 1,000 sensitive files open to all employees, putting them at risk of data breaches.
Keeping old sensitive data is a problem too, as it risks fines under HIPAA, GDPR and the upcoming CCPA. The report finds that over half of data is stale, that 87 percent of companies have over 1,000 stale sensitive files, and that 71 percent have over 5,000.
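The kind of exposure the report describes can be approximated with very simple tooling. The Python sketch below is not Varonis's product; it just walks a POSIX file tree and flags regular files that are both readable by every user and untouched for a year or more (the share path and staleness threshold are assumptions for the example).

```python
# Illustrative scan for files that are both world-readable and stale.
import stat
import time
from pathlib import Path

STALE_AFTER_DAYS = 365  # assumed threshold for "stale"


def find_stale_open_files(root: str, stale_days: int = STALE_AFTER_DAYS):
    """Return regular files under root that anyone can read and that
    haven't been modified within stale_days."""
    cutoff = time.time() - stale_days * 86400
    hits = []
    for path in Path(root).rglob("*"):
        try:
            st = path.stat()
        except OSError:
            continue  # skip files we cannot stat
        if not stat.S_ISREG(st.st_mode):
            continue
        world_readable = bool(st.st_mode & stat.S_IROTH)
        stale = st.st_mtime < cutoff
        if world_readable and stale:
            hits.append(str(path))
    return hits


if __name__ == "__main__":
    for f in find_stale_open_files("/srv/shared"):  # assumed share path
        print("review:", f)
```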
Automated governance platform helps businesses use data safely


Data privacy is a major concern for businesses, made more acute by the raft of new compliance and data protection rules appearing around the world.
Immuta is launching a platform with no-code, automated governance features that enable business analysts and data scientists to securely share and collaborate with data, dashboards, and scripts without fear of violating data policy and industry regulations.
Over half of data-driven initiatives are failing


More than half of data-driven initiatives are failing in business, with 27 percent of failures due to a skills shortage, according to new research from analytic database company Exasol.
In the public sector, financial services, and energy and utilities, the failure rate rises to more than 60 percent, and in retail and financial services 40 percent blame skills shortages for failures.
Data center and server room considerations: What you need to know


In the rapidly evolving, data-hungry IT environment, data center management is becoming increasingly intensive and complex, compounded by constant pressure to control costs while increasing efficiency and capacity. Data center traffic is projected to more than triple by 2020, driven primarily by our dependence on the internet to do business, communicate and be entertained.
The immense amount of data needed to support these activities requires not only a growing number of data centers, but new kinds of data center builds, which also necessitates new ways to manage them. In addition, green initiatives driven by power concerns and the implications of size and scale, coupled with the adoption of new technologies, are creating a confluence of often conflicting forces that require new and innovative data center management solutions.
The evolution of data and disparate systems


The weird thing about evolution is that it affects us even though we are deeply aware of its mechanisms and processes. There's something unavoidable and inexorable about it. While that's true of physical processes governed by natural selection, perhaps it's less true of human culture and technology. Or is it?
Over the long history of IT and its use in and by big business, we've seen constant innovation, sometimes incremental and sometimes radically discontinuous. Consider the steady march of microprocessor performance in the former case and the sudden deep learning revolution in AI in the latter. But in both cases, what has happened before and what is happening now shape the directions tech takes tomorrow.
European financial services industry struggling with lack of data skills


Nearly 40 percent of financial services businesses are failing to implement data initiatives due to a lack of skills, with almost a third saying that their GDPR initiatives are failing, according to a new study.
The report, produced by Vanson Bourne for analytics database company Exasol, is based on responses from 500 IT and business decision makers at enterprises in Germany and the UK.