The impact of AI in the data center [Q&A]


The rise of artificial intelligence (AI), automation and machine learning is changing the way IT professionals approach data center operations and management. The possibilities range from lowering operational costs to reducing the time to deploy new applications -- there's no doubt that AI is transforming the data center.

We spoke with Hal Woods, chief technology officer of storage and data management specialist Datera, to learn more.


BN: How can AI be used to improve data center operations?

HW: AI promises to automate operational tasks such as resource allocation, optimization and event handling, and to contribute critical information to capacity planning. There have been some early successes in using AI to plan and provision power for data centers; a notable example is Google reducing its data center cooling costs by 40 percent using its DeepMind AI, resulting in a 15 percent overall reduction in power consumption.

BN: Are certain data centers better positioned to integrate AI?

HW: The potential is greatest in data centers with greater variability. Without predictability, it is impossible to implement fixed processes to deploy, scale and maintain the data center infrastructure. Most often we think of these data centers as being associated with cloud and as-a-service deployments. This variability could be in the growth of the data center infrastructure, the range of applications, the service level agreement (SLA)/service level objective (SLO) being offered by the data center, or the response to failures and outages.

BN: What benefits do you anticipate will occur with AI data center automation?

HW: There are two dimensions to the benefits of AI data center automation: the reduction of data center infrastructure and operational costs; and the ability to reduce the time to deploy new infrastructure and applications and respond to issues.

At Datera we have developed an AI approach to placing customers' data based on service level objectives. Our system constantly monitors access patterns and migrates data within the system to best meet the SLO. More importantly, it allows the customer to change the SLO on the fly, with the system automatically moving data in response. New algorithms can be deployed based on analytics that observe the operational characteristics of the installed base of systems, enabling continuous operational improvements as well as the exploitation of new technology.
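The core idea of SLO-driven placement can be illustrated with a minimal sketch. This is not Datera's implementation -- the tier names, latency figures and helper functions below are hypothetical -- but it shows the pattern: each volume carries a latency SLO, and a rebalancing pass migrates any volume whose current tier no longer satisfies it, including after an on-the-fly SLO change.

```python
from dataclasses import dataclass

# Hypothetical storage tiers with illustrative typical read latencies (ms).
TIERS = {
    "nvme": 0.1,
    "ssd": 0.5,
    "hdd": 5.0,
}

@dataclass
class Volume:
    name: str
    slo_latency_ms: float  # latency target from the volume's SLO
    tier: str              # tier the data currently lives on

def best_tier(slo_latency_ms: float) -> str:
    """Pick the slowest (cheapest) tier that still meets the latency SLO."""
    candidates = [t for t, lat in TIERS.items() if lat <= slo_latency_ms]
    if not candidates:
        # No tier satisfies the SLO; fall back to the fastest available.
        return min(TIERS, key=TIERS.get)
    return max(candidates, key=TIERS.get)

def rebalance(volumes):
    """Migrate volumes whose current tier no longer satisfies their SLO."""
    moves = []
    for v in volumes:
        target = best_tier(v.slo_latency_ms)
        if target != v.tier:
            moves.append((v.name, v.tier, target))
            v.tier = target  # a real system would copy data, then cut over
    return moves

vols = [Volume("db-logs", slo_latency_ms=0.2, tier="hdd"),
        Volume("archive", slo_latency_ms=10.0, tier="ssd")]
print(rebalance(vols))  # db-logs is promoted to nvme; archive demoted to hdd
```

Tightening a volume's SLO and rerunning the same pass is all it takes to trigger a migration, which is the "change the SLO on the fly" behavior described above.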

BN: Are there any pitfalls that companies should be aware of when looking to add AI or automation technology to their data centers?

HW: The biggest pitfall is the 'silver bullet' syndrome. AI is one piece in a comprehensive strategy that also includes conventional data center infrastructure management (DCIM) capabilities. The next is failing to ensure that the organization and key individuals are invested in the use of AI. All deployments will experience growing pains, so companies need to focus on solving the problems that arise rather than using setbacks as a reason to discount the technology.

BN: How can companies that have never used AI in their data centers begin using the technology?

HW: This really depends on the pain points driving them to deploy AI, but one suggestion would be to use it to make recommendations for specific actions that could eventually be automated. This should naturally progress to deploying AI incrementally across the enterprise as confidence improves, eventually leading to wide scale deployment. As a CTO, I am excited about the development of revolutionary technologies but recognize that adoption of new technology in many enterprises must be incremental.
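The recommend-first, automate-later progression can be sketched as a simple mode switch. The rules, metric names and thresholds below are purely illustrative assumptions, not anything specific to Datera: in recommendation mode the system only surfaces suggested actions for an operator, and the same planning logic is later allowed to apply them directly.

```python
from enum import Enum

class Mode(Enum):
    RECOMMEND = "recommend"  # suggestions surfaced for operator review
    AUTOMATE = "automate"    # actions applied without human approval

def plan_actions(metrics: dict) -> list:
    """Hypothetical rules producing actions from utilization metrics."""
    actions = []
    if metrics.get("cpu_util", 0) > 0.85:
        actions.append("scale-out: add one node")
    if metrics.get("disk_util", 0) > 0.90:
        actions.append("expand storage pool")
    return actions

def run(metrics: dict, mode: Mode) -> list:
    """Plan actions, then either apply them or just recommend them."""
    applied = []
    for action in plan_actions(metrics):
        if mode is Mode.AUTOMATE:
            applied.append(action)  # a real system would call the orchestrator
        else:
            print(f"RECOMMENDATION: {action}")
    return applied

# Start in recommendation mode; switch to AUTOMATE as confidence grows.
run({"cpu_util": 0.92, "disk_util": 0.50}, Mode.RECOMMEND)
```

The key design point is that the planning logic is identical in both modes, so the trust built while reviewing recommendations carries over directly once automation is enabled.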

BN: How do you see AI/automation in data centers progressing in the next three to five years?

HW: AI, along with the increased application of analytics and machine learning in data center automation, has a very promising outlook, particularly as data centers focus on as-a-service approaches. In these approaches, the instantaneous response to provision, move and optimize applications demands an adaptive system.

Analytics can be used to make sense of the information generated by thousands of sensors in a large data center, while machine learning can be used to create AI algorithms that are then deployed into the data center to constantly respond to changes.
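A minimal sketch of that analytics layer, under deliberately simple assumptions, is a rolling-baseline check on a sensor stream: readings that deviate sharply from the recent mean are flagged for a response. Production systems would use far richer models than this fixed-threshold rule, and the window size and threshold below are arbitrary.

```python
from collections import deque

def make_detector(window: int = 5, threshold: float = 3.0):
    """Return a checker that flags readings deviating sharply from
    the rolling mean of the last `window` readings."""
    history = deque(maxlen=window)

    def check(reading: float) -> bool:
        alert = False
        if len(history) == history.maxlen:
            mean = sum(history) / len(history)
            # Fixed-threshold deviation check, e.g. on a temperature sensor.
            alert = abs(reading - mean) > threshold
        history.append(reading)
        return alert

    return check

check = make_detector()
temps = [22.0, 22.1, 21.9, 22.2, 22.0, 30.5, 22.1]
alerts = [t for t in temps if check(t)]
print(alerts)  # [30.5] -- only the spike is flagged
```

In a large data center this kind of check would run per sensor across thousands of streams, with the flagged events feeding the automated responses described above.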

While this outlook is promising, it is not without its limits. As with any other system, companies should develop metrics that can be used to assess the impact that AI-based automation is having.

