New analytics platform helps deliver AI for business
The biggest challenges that businesses face when implementing AI projects relate to handling data, which often sits in separate silos.
Analytics specialist Databricks is addressing this problem with its Unified Analytics Platform, adding new capabilities to unify data and AI teams and technologies.
It offers MLflow for developing end-to-end machine learning workflows, Databricks Runtime for ML to simplify distributed machine learning, and Databricks Delta for data reliability and performance at scale.
"To derive value from AI, enterprises are dependent on their existing data and ability to iteratively do machine learning on massive data sets. Today’s data engineers and data scientists use numerous, disconnected tools to accomplish this, including a zoo of machine learning frameworks," says Ali Ghodsi, co-founder and CEO at Databricks. "Both organizational and technology silos create friction and slow down projects, becoming an impediment to the highly iterative nature of AI projects. Unified Analytics is the way to increase collaboration between data engineers and data scientists and unify data processing and AI technologies."
According to research commissioned by Databricks, it takes organizations more than seven months to complete AI projects, with 50 percent of that time spent on data preparation. Currently, organizations build their big data architectures from a variety of systems, which increases cost and operational complexity. Data engineers struggle to simplify data management and provide clean, usable data to data scientists, ultimately hindering the success of AI initiatives.
A key component of the Unified Analytics Platform is Databricks Delta, which extends Apache Spark to simplify data engineering by providing high performance at scale, data reliability through transactional integrity, and the low latency of streaming systems. With Delta, organizations don't have to trade off between storage system properties or spend resources moving data across systems. Hundreds of applications can reliably upload, query, and update data at massive scale and low cost, making data sets ready for machine learning.
You can read more on the Databricks blog.