
Why data quality is essential to cloud migration [Q&A]

Migrating to the cloud is an increasingly popular option for businesses, but to be successful the data involved needs to be in good shape.

We spoke to Kevin Kline, principal program manager at SentryOne, to find out why the quality of data is so essential to successful migration and what businesses need to do to ensure their migration succeeds.

By Ian Barker

Enterprises struggle to implement data sanitization policies

Despite recent legislation placing greater emphasis on privacy and data protection, a new study of data sanitization policies reveals that in many cases there’s a gap between policy and execution.

The study of more than 1,800 senior business leaders by Blancco Technology Group reveals that although 96 percent have a sanitization policy in place, 31 percent have yet to communicate it across the business and 20 percent don't believe their organization's policies have been fully defined.

By Ian Barker

Compliance struggles and more legislation -- privacy and data predictions for 2020

With the California Consumer Privacy Act (CCPA) coming into force in January, privacy and how companies use data are set to be among the big themes of 2020. What do some of the industry's leading figures think this will mean?

Peter Reinhardt, CEO and co-founder of Segment, believes, "Though the GDPR roll-out should have given American companies a good taste of what was to come, it's still likely that most will do the bare minimum to comply with the CCPA until the US government starts enforcing it in 2020."

By Ian Barker

Secure cloud helps deliver data-driven innovation

Data usage and analysis are now key drivers of innovation and competitive advantage, but increased data use raises issues surrounding security, privacy and compliance.

Israeli company Satori Cyber is launching a new Secure Data Access Cloud to offer continuous visibility and control of data flows across all cloud and hybrid data stores.

By Ian Barker

The rise of first-party data: Why quality matters over quantity

For years, digital marketers have paid hand over fist in the digital gold rush for data. Instead of selling a tangible product, tech companies earn millions in revenue from the data they collect on previous, current and future digital consumers. But digital marketers seeking to gobble up as much data as they can for their campaigns, without stopping to consider its source or the methods used to collect it, are taking the wrong approach. The age-old mantra of "quality over quantity" has never been more relevant in online advertising, and marketers must quickly and fully embrace first-party data or risk their digital campaigns (and bottom lines) falling flat.

The primary reason to use first-party data over third-party data from data marketplace platforms is simple: it's better. Publishers, apps and ad platforms alike can gather first-party data directly from their audiences and customers, whether that data comes from purchases, app downloads, in-app actions, social media interactions or subscriptions. Because this data comes directly from the source, it is as precise and accurate as possible. This is in stark contrast to third-party data, which is aggregated from multiple platforms and combined into a larger data set, where buyers generally do not know the exact sources of their data.

By J. Joynson-Hewlett

Is dark data valuable?

A tsunami of dark data is coming -- data that has never been analyzed, tagged, classified, organized or evaluated, never been used to predict future states or control other processes, and never been put up for sale for others to use. So, what do we do with this data? First, we have to understand that exponentially more is coming. We see this in autonomous technology, where vehicles generate four thousand gigabytes of data per day.

Data is also becoming more complex, as most of it is already in video or other complicated forms. Seemingly free storage is encouraging people to store more and defer deletion.

By Tom Austin

Delta Lake to be hosted by the Linux Foundation

All organizations want to get more value from their data, but can be hampered by the lack of reliable information within data lakes.

The Delta Lake project addresses data reliability challenges by making transactions ACID (Atomicity, Consistency, Isolation, Durability) compliant, enabling concurrent reads and writes. It also helps to ensure that the data lake is free of corrupt and non-conformant data.
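To make that concrete, here is a minimal sketch of what an ACID write to a Delta Lake table can look like from PySpark. It assumes the pyspark and delta-spark packages are installed; the table path /tmp/demo/events and the sample rows are purely illustrative, not anything specific to the Linux Foundation announcement.

```python
# Minimal Delta Lake sketch (assumes: pip install pyspark delta-spark).
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder
    .appName("delta-lake-sketch")
    # Register Delta's SQL extension and catalog so the "delta" format is available.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

path = "/tmp/demo/events"  # illustrative location for the Delta table

# Each write below is committed as a single transaction in the Delta log,
# so concurrent readers see either the previous version or the new one,
# never a half-written result.
events = spark.createDataFrame([(1, "click"), (2, "purchase")], ["id", "event"])
events.write.format("delta").mode("overwrite").save(path)

# Appends are checked against the table schema, which is how Delta keeps
# corrupt or non-conformant records out of the lake.
more = spark.createDataFrame([(3, "refund")], ["id", "event"])
more.write.format("delta").mode("append").save(path)

spark.read.format("delta").load(path).show()
```

The transaction log is the key design choice here: partial or failed writes never become visible to readers, which is what keeps the lake consistent under concurrent access.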

By Ian Barker

Almost half of employees have access to more data than they need

A new study of over 700 full-time US employees reveals that 48 percent of employees have access to more company data than they need to perform their jobs, while 12 percent say they have access to all company data.

The survey by business app marketplace GetApp also asked employees what classifications of data protection are in place at their company. No more than a third of businesses were found to use any one individual data classification.

By Ian Barker

Enterprises are modernizing data architectures but still have major concerns

A new study of over 300 IT executives in large enterprises by database company DataStax reveals that all are modernizing their data architecture, but most are still struggling with major challenges.

The results show 99 percent of IT execs report challenges with architecture modernization and 98 percent with their corporate data architectures (data silos). Vendor lock-in (95 percent) is also a key concern among respondents.

By Ian Barker

Data fragmentation is the main reason public cloud doesn't deliver

When IT managers adopted the cloud they believed it would simplify operations, increase agility, reduce costs, and provide greater insight into their data. Yet 91 percent say it hasn't delivered all the expected benefits and 88 percent say it isn't meeting management expectations.

A new study of 900 senior decision makers, for data management company Cohesity carried out by Vanson Bourne, finds that of those who feel the promise of public cloud hasn't been realized, 91 percent believe it's because their data is fragmented in and across public clouds and could become nearly impossible to manage long term.

By Ian Barker

Quality issues with training data are holding back AI projects

For many organizations, AI and machine learning are seen as a route to greater efficiency and competitive advantage.

But according to a new study conducted by Dimensional Research for Alegion, almost eight out of 10 enterprise organizations currently engaged in AI and ML report that projects have stalled, and 96 percent of these companies have run into problems with data quality, the data labeling required to train AI, and building model confidence.

By Ian Barker

UK consumers want businesses to do more to protect their data

Protecting your digital footprint is growing more important, and a survey of 2,000 UK adults by Kaspersky Lab finds that people believe there is not enough business or state protection currently in place to defend it.

The study finds 41 percent of UK respondents think that businesses should do more to protect their personal data, including passwords, addresses and bank account details, from hacking.

By Ian Barker

15 percent of IT professionals have more data sources than they can count

Data is essential for modern business, but managing it effectively can be difficult. Big data can be just too big to handle.

A new survey from unified operations specialist Ivanti looks at the challenges IT professionals face when it comes to silos, data and implementation.

By Ian Barker

More than half of companies have sensitive files open to all employees

The latest data risk report from security company Varonis reveals that 53 percent of companies have at least 1,000 sensitive files open to all employees, putting them at risk of data breaches.

Keeping old sensitive data is a problem too, as it risks fines under HIPAA, GDPR and the upcoming CCPA. The report finds that over half of data is stale and that 87 percent of companies have over 1,000 stale sensitive files, with 71 percent having over 5,000.

By Ian Barker

Automated governance platform helps businesses use data safely

Data privacy is a major concern for businesses, made more acute by the raft of new compliance and data protection rules appearing around the world.

Immuta is launching a platform with no-code, automated governance features that enable business analysts and data scientists to securely share and collaborate on data, dashboards and scripts without fear of violating data policies or industry regulations.

By Ian Barker