New platform cuts the cost of using Google BigQuery
Companies often have multiple business intelligence tools deployed across different departments. As a result, IT teams can end up building dedicated data pipelines for each tool, at the cost of agility and resources.
BI and big data specialist AtScale is launching its latest platform, AtScale 6.0, which aims to help users deploy analytical workloads on Google BigQuery, cut costs, and speed up the delivery of results.
"We've tested Google BigQuery against some of the most demanding queries," says Josh Klahr, VP of product management for AtScale, "and the technology has shown to be easy to use and very performant. With the new functionality added to our Adaptive Cache, it's now also very cost-efficient to run hundreds of queries for thousands of users without breaking the bank."
AtScale continually analyzes query patterns and automatically creates and manages aggregates. This means users get faster query results while putting significantly less load on BigQuery, which translates into big infrastructure savings.
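The idea of pattern-driven aggregate management can be pictured with a minimal sketch. Everything here is illustrative and assumed, not AtScale's actual implementation: a manager counts how often a grouping pattern recurs and, past a threshold, materializes a precomputed sum so repeat queries never touch the raw data.

```python
from collections import Counter

class AggregateManager:
    """Hypothetical sketch of pattern-driven aggregate creation
    (illustrative only; not AtScale's actual engine)."""

    def __init__(self, threshold=3):
        self.pattern_counts = Counter()  # how often each grouping pattern is seen
        self.aggregates = {}             # pattern -> precomputed aggregate rows
        self.threshold = threshold       # repeats before we materialize

    def observe(self, group_by_cols, rows):
        """Record a query's grouping pattern; materialize an aggregate
        once the pattern has repeated often enough."""
        pattern = tuple(sorted(group_by_cols))
        self.pattern_counts[pattern] += 1
        if self.pattern_counts[pattern] >= self.threshold and pattern not in self.aggregates:
            agg = {}
            for row in rows:
                key = tuple(row[c] for c in pattern)
                agg[key] = agg.get(key, 0) + row["amount"]
            self.aggregates[pattern] = agg

    def lookup(self, group_by_cols):
        """Return a cached aggregate for this pattern, or None on a miss."""
        return self.aggregates.get(tuple(sorted(group_by_cols)))
```

Once an aggregate exists for a pattern, a `lookup` hit answers the query from the small precomputed table instead of rescanning the fact data, which is the source of both the speedup and the reduced warehouse load.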
Its Adaptive Cache graph engine optimizes the processing of aggregates, and early deployments have shown processing times improve by up to 10x. By re-routing big data queries to its Adaptive Cache, AtScale 6.0 reduces the number of unique requests reaching the underlying infrastructure, which eases the stress on that infrastructure and lowers the cost incurred for each query. In initial testing on Google BigQuery, query costs have been reduced by up to 1,000x.
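Cache-first routing of this kind can be sketched in a few lines. The names and the normalization step below are assumptions for illustration, not the Adaptive Cache's real design: a router answers repeat queries from its cache, so only the first occurrence of each unique query reaches the (here simulated) warehouse.

```python
class QueryRouter:
    """Hypothetical sketch of cache-first query routing
    (illustrative only; not the actual Adaptive Cache engine)."""

    def __init__(self, backend):
        self.backend = backend   # callable that runs the query (e.g. a warehouse)
        self.cache = {}          # normalized query -> cached result
        self.backend_hits = 0    # unique requests that reached the backend

    def run(self, sql):
        key = " ".join(sql.lower().split())  # crude normalization
        if key not in self.cache:
            self.backend_hits += 1           # only cache misses cost money
            self.cache[key] = self.backend(sql)
        return self.cache[key]

def fake_bigquery(sql):
    # Stand-in for the warehouse; a real deployment would issue the query.
    return f"result-of:{sql.strip()}"

router = QueryRouter(fake_bigquery)
for _ in range(100):
    router.run("SELECT region, SUM(amount) FROM sales GROUP BY region")
# 100 identical dashboard refreshes reach the backend only once.
```

Since warehouses like BigQuery bill per query scan, collapsing repeat requests onto cached results is where the claimed per-query cost reduction would come from.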
In addition, there's patented Hybrid Query Service technology, which lets any BI tool run on big data directly, using SQL or MDX, with no data extract required. This means any BI user can create reports and dashboards that run live on the data, and this is now also possible on Google BigQuery.
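One way to picture a hybrid interface that serves both dialects is a front door that sniffs whether an incoming query is SQL or MDX before handing it to the right translator. The heuristic below is a simplified assumption for illustration, not AtScale's API: MDX queries place result sets on axes with `ON COLUMNS` / `ON ROWS`, which plain SQL never does.

```python
def detect_dialect(query):
    """Hypothetical sketch: crude dialect sniffing for a hybrid query
    front end (illustrative only). MDX axis clauses like 'ON COLUMNS'
    and 'ON ROWS' do not occur in standard SQL."""
    q = query.upper()
    if "ON COLUMNS" in q or "ON ROWS" in q:
        return "MDX"
    return "SQL"

# An Excel pivot table might send MDX, while a SQL-speaking BI tool
# sends ordinary SELECT statements; both would hit the same live data.
```

A real service would of course parse rather than sniff, but the point stands: because both dialects resolve against the live source, no extract or import step sits between the BI tool and the data.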
You can find out more and request a demo on the AtScale website.