How IT can cut its power bills and help save the planet [Q&A]

It's estimated that enterprise IT accounts for around 1.5 percent of the world's energy usage, making it a major contributor to greenhouse gas emissions.

But switching to more efficient solutions, rather than simply throwing everything into the cloud, could make a significant difference. We spoke to Dr. Jim Webber, chief scientist at native graph database leader Neo4j and visiting professor at Newcastle University, to find out more.

BN: What's contributing to IT's high energy usage?

JW: There are some understandable reasons for this: the amount of infrastructure needed to run the modern world is vast, and its power consumption is growing at two percent annually. It's worth noting that this growth is occurring despite the gradual transition to a greener energy grid.

There are also less defensible reasons behind this trend. We have become excessively reliant on scale-out cloud technologies, neglecting some fundamental principles of computer science.

Thankfully, there is room for improvement if developers reconsider their options, such as adopting more efficient and mechanically sympathetic system designs in the cloud. By reducing the need for so many servers -- and so decreasing electricity consumption -- we can make a big difference to our alarming power consumption statistics. It's important to highlight that enterprise IT alone is responsible for 1.5 percent of the planet's total energy consumption (even discounting flagrantly wasteful activities like Bitcoin mining). Implementing energy-saving measures makes a very noticeable difference to both your company's bottom line and the planet.

BN: Has the way the cloud has been marketed contributed to the problem?

JW: With the widespread adoption of cloud-based IT provision, companies have shifted their hardware and computer rooms to vast server farms, opting to rent computing resources as needed. Undoubtedly, this approach has radically changed the economics of supplying access to business software applications and storing corporate data, providing immense benefits in terms of cost savings and convenience. However, the transition has led many developers to perceive these services as essentially free, assuming that any workload can be processed at scale in a way that just wasn't physically possible on-premises.

That said, I believe we are witnessing a shift in the perception of cloud overuse. CIOs are receiving ever larger bills from the hyperscale cloud providers, prompting them to realize that the cloud is not truly a 'free' resource. The cloud providers respond by using their economies of scale to offer discounts that help rationalize corporate IT spend. Unfortunately, this discounting often perpetuates an unhealthy cycle in which deploying a massive cloud infrastructure becomes the default response to every problem, regardless of the actual requirements.

BN: Older systems seemed to use their hardware much more efficiently. Have we lost our way a bit in terms of efficient coding?

JW: During my early days as a programmer, the situation was very different. We were well aware of the limitations in data storage and available RAM, so working within the constraints of these systems became a mark of professional pride.

Times have changed! Now it isn't uncommon to spin up 1,000 servers and power up Apache Spark to address your problem. To me, that has always felt very wasteful in terms of resources -- and as they say in economics, there's no such thing as a free lunch. In some cases that's of course the right thing to do, but in others we're using brute force to solve problems that might be better expressed as a carefully thought-out single-machine operation.

BN: How can organizations achieve their aims in a less expensive way?

JW: It's worth acknowledging that there are IT problems -- even at the highest enterprise level -- that can be solved more efficiently and quickly on a single computer with the right software, as opposed to always defaulting to big cloud infrastructure. That also means you can bypass the complexities of managing and maintaining a 1,000-server environment, including all the power costs it incurs and the greenhouse gas (GHG) emissions it entails.

For some tasks there are lightweight and far less computationally expensive alternatives available. This isn't about relying on some groundbreaking new super chip but rather exploring data engines that tackle problem-solving in smarter, less brute-force ways.

An excellent example of this shift to lighter-weight software is Adobe's experience developing Behance, a specialized social media application for the creative industry. The company tried two different architectural approaches, and neither proved successful in delivering the desired functionality. The breakthrough came when the team decided to discard both previous versions and try a fresh approach using a native graph database. This proved highly efficient, enabling the team to finally deliver what Behance's 10 million users wanted, and the new approach significantly reduced infrastructure requirements. Because it now needed only three servers and operated on a dataset of just 30 to 50 gigabytes of memory, the system was a thousand times smaller than the previous versions the team had been struggling with.
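To make that contrast concrete, here is a minimal sketch of the kind of query a native graph database answers by walking one user's neighbourhood rather than scanning an entire dataset. The schema, connection details and names are illustrative assumptions, not Behance's actual data model; it uses the official Neo4j Python driver.

from neo4j import GraphDatabase

# Connection details and credentials are placeholders for illustration.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# Hypothetical schema: (User)-[:FOLLOWS]->(User) and (User)-[:APPRECIATED]->(Project).
# The query asks "what have the people I follow appreciated?" -- an
# activity-feed question that a scan-and-join approach would typically
# spread across many machines.
FEED_QUERY = """
MATCH (me:User {id: $user_id})-[:FOLLOWS]->(f:User)-[:APPRECIATED]->(p:Project)
RETURN p.title AS project, count(f) AS appreciations
ORDER BY appreciations DESC
LIMIT 25
"""

def activity_feed(user_id):
    # The traversal touches only one user's neighbourhood, so the work grows
    # with that user's connections, not with the size of the whole dataset.
    with driver.session() as session:
        return [record.data() for record in session.run(FEED_QUERY, user_id=user_id)]

print(activity_feed("user-42"))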

Am I saying that graph software, or any single technology for that matter, solves all your problems? No. But what I am saying is that the only companies in the world that can afford to solve all their IT problems with gargantuan data processing and computing resources are the Web hyperscalers. For the rest of us, it would pay to be more circumspect in our cloud spending. That would cut our energy consumption and cloud bills while reducing the hit on the planet from all that compute power.

BN: Is there a role for AI in finding more efficient solutions?

JW: AI will help. Used in conjunction with graph databases, it can track our impact on the environment through elegant, real-time operational control of even the most complex IT configurations and setups. The only conceivable way of capturing all that density and interconnectedness is with a graph database.
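As a hedged illustration of that point -- the labels, properties and power figures below are invented for the example, not taken from any real deployment -- a dependency graph of applications, services and servers lets a single query attribute power draw to one workload:

from neo4j import GraphDatabase

# Placeholder connection details, as in the earlier sketch.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# Hypothetical model: (App)-[:DEPENDS_ON]->(Service)-[:RUNS_ON]->(Server {watts}).
# The query follows every dependency chain from one application down to the
# physical machines it ultimately touches, then sums their rated power draw.
POWER_QUERY = """
MATCH (a:App {name: $app})-[:DEPENDS_ON*1..5]->(:Service)-[:RUNS_ON]->(s:Server)
WITH a, collect(DISTINCT s) AS servers
RETURN a.name AS app,
       size(servers) AS server_count,
       reduce(total = 0, srv IN servers | total + srv.watts) AS rated_watts
"""

with driver.session() as session:
    print(session.run(POWER_QUERY, app="billing").single().data())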

Image credit: Olivier26/depositphotos.com
