New self-service tool helps unlock customer data across ad platforms


Advertising via platforms like Google and Facebook is popular, but any insights gained from using these services are difficult to apply elsewhere.
Identity management company Drawbridge is launching a new Self Service Graph dashboard that gives marketers transparency into their cross-device data by letting them tap into the service without requiring any engineering on the client side.
Misuse of spreadsheets costs European businesses €55 billion a year


Spreadsheets like Excel were never designed to handle complex analytics and big data tasks, but a growing demand for data insights is leading many businesses to waste effort manually handling data in spreadsheets.
A study commissioned by self-service analytics company Alteryx and carried out by IDC reveals that advanced spreadsheet users spend an average of nine hours a week on repeated, manual data manipulation, wasting around €10,000 per user per year. Across Europe this adds up to roughly two billion hours of duplicate work, costing an eye-watering €55bn (around $64bn) per year.
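A quick back-of-envelope check shows how the per-user and Europe-wide figures hang together; the working-weeks figure below is an assumption, not a number from the study:

```python
# Back-of-envelope check of the IDC figures.
# ASSUMPTION: roughly 46 working weeks per year (not stated in the study).
hours_per_user = 9 * 46       # about 414 hours of duplicate work per user, per year
cost_per_hour = 55e9 / 2e9    # EUR 55bn over two billion hours is EUR 27.50/hour
print(round(hours_per_user * cost_per_hour))  # prints 11385, in line with the ~EUR 10,000 cited
```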
New platform cuts the cost of using Google BigQuery


Companies often have multiple business intelligence tools deployed across different departments. This means IT teams can end up having to build data pipelines dedicated to each tool at the cost of agility and resources.
BI and big data specialist AtScale is launching its latest platform, AtScale 6.0, which aims to help users deploy analytical workloads on Google BigQuery, cut costs and speed up delivery of results.
Splunk will use machine learning to improve its enterprise solutions


Splunk has revealed plans to boost its enterprise software offerings with the power of machine learning.
Speaking at the opening keynote of the company's .conf2017 event in Washington, Splunk chief product officer Richard Campione highlighted how machine learning could help the company's customers get even more insight out of their data.
How .NET Framework integrates big data


Companies of all sizes have started to recognize the value of big data collections and the need to take advantage of them. The development of software systems plays a big role in big data analytics.
As companies proceed on their big data journey, they usually start by batch processing their big data assets. This can mean gathering and aggregating web log data, telemetry from IoT devices, user clicks from an app, and more.
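The article's focus is .NET tooling; purely as an illustrative sketch, here is what a minimal batch-aggregation job over web log data could look like in Python (the log format and file name are assumptions, not details from the article):

```python
from collections import Counter

def aggregate_requests(log_path):
    """Batch-aggregate a web server log: count requests per URL path.

    Assumes a common/combined log format where the quoted request field
    looks like "GET /index.html HTTP/1.1".
    """
    counts = Counter()
    with open(log_path) as f:
        for line in f:
            parts = line.split('"')
            if len(parts) < 2:
                continue  # skip lines with no quoted request field
            request = parts[1].split()  # e.g. ['GET', '/index.html', 'HTTP/1.1']
            if len(request) >= 2:
                counts[request[1]] += 1
    return counts

if __name__ == "__main__":
    # Hypothetical log file name; print the ten most-requested paths.
    for path, hits in aggregate_requests("access.log").most_common(10):
        print(f"{hits:8d}  {path}")
```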
Intelligent data platform drives digital transformation


Businesses are more keen than ever to unlock the power of their data, but often struggle to come up with a strategic approach.
Cloud data management company Informatica is launching its latest Intelligent Data Platform, driven by the CLAIRE engine, which uses metadata-driven AI to deliver faster insights.
New data platform accelerates application development and deployment


One of the things that can hold back the development of business applications is the reliance on extract, transform, and load (ETL) infrastructures to move data between applications and platforms.
Platform specialist InterSystems is launching a new platform built around a high-performance database that eliminates the need for ETL.
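For context, this is the sort of ETL step such a platform aims to make unnecessary -- a minimal sketch in Python against a hypothetical orders schema, not InterSystems' actual product or API:

```python
import sqlite3

def run_etl(source_db, target_db):
    """A toy extract-transform-load step of the kind the platform aims to remove.

    ASSUMPTION: the source database has an 'orders' table with
    'created_at' and 'amount' columns; all names here are hypothetical.
    """
    src = sqlite3.connect(source_db)
    dst = sqlite3.connect(target_db)
    dst.execute("CREATE TABLE IF NOT EXISTS daily_totals (day TEXT, total REAL)")
    # Extract and transform: aggregate raw orders into daily totals.
    rows = src.execute(
        "SELECT date(created_at) AS day, SUM(amount) FROM orders GROUP BY day"
    ).fetchall()
    # Load: copy the reshaped data into the separate analytics store.
    dst.executemany("INSERT INTO daily_totals VALUES (?, ?)", rows)
    dst.commit()
```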
Equifax data breach could be worst ever -- find out here if you are affected by the hack


Data breaches are fairly common nowadays. This is unfortunate, as it exposes sensitive information to evil hackers and other nefarious criminals. Look, people are doing their best to make it through the day -- working long hours and struggling to make ends meet. Then, some computer nerd comes along and adds to life's difficulties by stealing identities. Sigh.
Today, another data breach comes to light, but this time it is particularly bad. In fact, it could quite possibly be the worst such hack in history. You see, credit agency Equifax -- a company you'd expect to be very secure -- had consumer information stolen. Now, it isn't just a handful of people that are affected. No, it is a staggering 143 million consumers in the USA! To make matters worse, it includes the holy grail of personally identifiable information -- social security numbers. Besides SSNs, the hackers got birth dates and addresses too. For some of these unfortunate folks, even credit card numbers and driver's license numbers were pilfered.
Energy efficiency and what it means for data center operators, designers and manufacturers


As society continues to develop more eco-friendly and sustainable values, the pressure on data centers to become more energy efficient has greatly increased. Recently, the data center industry has been making significant progress in developing new, energy-efficient approaches to the design and manufacturing of data centers.
What we are seeing is racks and aisle containment structures being deployed together as a single solution, versus the traditional model of filling a data center with hardware and then bringing someone in to help with efficiency and containment. Data center operators are having these conversations earlier in the design and manufacturing process, eliminating the need to make energy-efficiency changes in bits and pieces.
How enterprises are using data to get work done


A new study by content management and collaboration specialist Egnyte has analyzed 25 petabytes of customer data and four billion activities performed on it.
This has been used to uncover unique insights about the way businesses are managing their data and how their employees are collaborating on it.
Fastly adds the power of edge computing to Google BigQuery


Companies continue to be keen to exploit the power of big data analytics, and one of the most popular platforms for doing this is Google's BigQuery.
Edge cloud platform Fastly is announcing a new integration that allows the real-time streaming of logs to BigQuery. This, the first of a number of planned integrations with Google's Cloud Platform, aims to deliver better performance and faster real-time insights.
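Fastly handles the streaming on its side, but to give a flavor of what pushing log rows into BigQuery involves, here is a minimal sketch using Google's official Python client (the project, dataset, and row fields are placeholders, not Fastly's integration):

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()
table_id = "my-project.logs.edge_requests"  # placeholder project.dataset.table

# Stream a batch of log records into BigQuery as they arrive (simplified).
rows = [
    {"ts": "2017-09-28T12:00:00Z", "url": "/index.html", "status": 200},
    {"ts": "2017-09-28T12:00:01Z", "url": "/missing", "status": 404},
]
errors = client.insert_rows_json(table_id, rows)
if errors:
    print("Streaming insert failed:", errors)
```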
The value of analytics and big data in digital transformation


Big data and analytics are topics firmly embedded in our business dialogue. The amount of data we're now generating is astonishing. Cisco predicts that global IP traffic will reach 3.3 ZB per year by 2021 and that the number of devices connected to IP networks will be more than three times the global population by then, while Gartner predicts $2.5M per minute in IoT spending and 1M new IoT devices sold every hour by 2021. It's a testament to the speed with which digital connectivity is changing the lives of people all over the world.
Data has also evolved dramatically in recent years, in type, volume, and velocity -- with its rapid evolution attributed to the widespread digitization of business processes globally. Data has become the new business currency and its further rapid increase will be key to the transformation and growth of enterprises globally, and the advancement of employees, "the digital natives."
New storage platform offers data protection and seamless scaling


In the past, companies have relied on separate secondary storage solutions for backup and recovery and for archiving large amounts of structured and unstructured data.
But as businesses need to store larger volumes of more diverse data, this creates headaches for administrators, as the same information accumulates on both solutions and may need to be handled via different interfaces.
Data capture techniques fail to keep pace with demands for real-time analysis


Organizations using extract, transform and load (ETL) and change data capture (CDC) technologies are struggling to keep up with today's demand for real-time data analysis, with negative effects on their business opportunities and efficiency.
This is one of the findings of a new study by IDC, sponsored by software company InterSystems, which highlights a growing need for reliable real-time data analytics in today's enterprise.
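To see why such approaches struggle with real-time demands, consider a naive polling-based change data capture loop -- a minimal sketch with a hypothetical table schema, not any vendor's implementation:

```python
import sqlite3
import time

def poll_changes(db_path, interval=60):
    """Naive change data capture: poll for rows updated since the last pass.

    ASSUMPTION: an 'orders' table with an 'updated_at' timestamp column.
    Polling adds up to `interval` seconds of latency per change, which is
    exactly the gap that real-time analytics demands expose.
    """
    conn = sqlite3.connect(db_path)
    last_seen = "1970-01-01T00:00:00"
    while True:
        rows = conn.execute(
            "SELECT id, updated_at FROM orders "
            "WHERE updated_at > ? ORDER BY updated_at",
            (last_seen,),
        ).fetchall()
        for row_id, updated_at in rows:
            print("changed:", row_id)  # hand off to downstream analytics here
            last_seen = updated_at
        time.sleep(interval)
```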
Three ways to generate profit with the data you already have


Build it and they will come. That is the view many organizations take of their data lakes and data warehouses. Companies are rapidly investing in systems and processes to retain business data that they know is valuable but have no clue how to use. Even the government collects massive amounts of data without specific plans for using the information at the time of collection. This trend is only accelerating as the amount of data being produced continues to escalate. Today, it is estimated that human knowledge is doubling every 12 to 13 months, and IBM estimates that with the build-out of the "internet of things," knowledge will double every 12 hours.
Most organizations search for value in their data by throwing teams of data scientists at the various stores of collected data, hoping to find insights that are commercially viable. This approach typically results in endless hours of digging, and if any insights are found, they rarely see the light of day. To monetize your data, you need a different approach, one that starts by turning the process on its head. We recommend three approaches to help you monetize your data.