Cloud computing may finally end the productivity paradox

One of the darkest secrets of Information Technology (IT) is called the Productivity Paradox. Google it and you’ll learn that for at least 40 years, study after study has found that spending money on IT -- any money -- doesn’t increase organizational productivity. We don’t talk about this much as an industry because it’s the negative side of IT. Instead we speak in terms of Return on Investment (ROI) or Total Cost of Ownership (TCO). But there is finally some good news: cloud computing actually increases productivity, and we can prove it.

The Productivity Paradox doesn’t claim that IT is useless, by the way, just that we tend to spend more money on it than we get back in benefits. IT still enabled everything from precision engineering to desktop publishing to doctoring movie star photos, but it did so at considerable cost. Follow the history of any organization more than 50-60 years old and you’ll see that along the way it acquired whole divisions devoted not to manufacturing or sales but simply to schlepping bits and keeping them safe. Yes, IT reduced the need for secretaries, telephone operators, and travel agents, but it more than replaced them with geeks generally making higher wages.

At the heart of the Productivity Paradox is the simple fact that power in organizations is generally measured in head count. The more people a manager manages, the more power he or she is assumed to have. So there has always been a tendency to over-hire simply as a symptom of executive ego: every manager wants a bigger budget this year than last. Add to this the frequent failure of large IT projects, and productivity (units of output or profit per person-hour) suffers.

It doesn’t hurt, either, that in most non-tech companies the CEO is generally clueless about what’s actually needed in terms of IT and what it ought to cost, with the CTO or the CIO gleefully working to keep the boss in the dark.

This pattern doesn’t hold everywhere, of course. Organizations in crisis, especially those that appear to be dying, can slash and burn their way to IT-based productivity gains. But the industry as a whole is at best stagnant in terms of productivity growth.

This is not to say that companies can’t build whole new divisions based on digital tech -- divisions that help the company grow as a whole. But purely in terms of sales or profits per employee, IT rarely helps the company improve those particular numbers -- the ones upon which we base productivity.

But cloud computing is different because it is the ultimate shared resource. Public or private clouds are virtual production pipelines that require almost no management personnel (breaking the headcount conundrum) and can be run at close to 100 percent utilization, leading to true productivity improvements.
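Here’s a rough sketch of that utilization argument. Every number below is an assumption chosen only to show the shape of the math, not a figure from any real deployment:

```python
# Illustrative utilization arithmetic -- every number here is an assumption.
# Idle capacity still has to be paid for, so the effective cost of a useful
# compute-hour scales with 1 / utilization.

workload_hours = 1_000            # useful compute-hours needed per month
cost_per_server_hour = 0.50       # assumed all-in cost of one server-hour (dollars)

for label, utilization in [("dedicated on-prem servers", 0.15),
                           ("shared cloud capacity", 0.90)]:
    provisioned_hours = workload_hours / utilization
    cost = provisioned_hours * cost_per_server_hour
    print(f"{label:>26}: pay for {provisioned_hours:7.0f} server-hours, about ${cost:,.0f}")
```

Same work, far fewer machine-hours paid for -- that gap is where the productivity shows up.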

When we put Windows and Linux applications in the cloud and manage them there, for example, we win in terms of TCO, ROI, and productivity because our ratio of users to cloud employees is about 10,000-to-one. Imagine a 10,000-employee company with one IT person.
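To see how that ratio flows through to the productivity measure (output per person-hour), here is a minimal sketch. The one-admin-per-10,000-users figure is from this article; the one-per-250 traditional ratio is purely an assumption for comparison:

```python
# Hypothetical staffing comparison. The 1-per-10,000 cloud ratio is the
# article's own figure; the 1-per-250 traditional ratio is an assumption.

users = 10_000

def it_overhead(users, users_per_admin):
    """Return the IT headcount and its share of total headcount."""
    admins = max(1, round(users / users_per_admin))
    return admins, admins / (users + admins)

for label, users_per_admin in [("traditional IT", 250), ("cloud-managed", 10_000)]:
    admins, share = it_overhead(users, users_per_admin)
    print(f"{label:>15}: {admins:2d} IT staff, {share:.2%} of total headcount")
```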

And though these clouds are multi-billion-dollar technology projects bigger than anything built before, they aren’t especially risky, because the clouds and their core services are for the most part finished and already work.

Though cloud services are billed by the hour, minute, or second, capital costs are minimal and sometimes zero, since new customers can generally start with their old hardware. And even when they turn to dedicated new hardware, with its greater reliability and energy efficiency, purchase prices are dramatically lower than for PCs and hardware upgrades can be many years apart. Think of cloud hardware in terms of epochs. In end-user cloud computing, we’re right now in the 1920-by-1080 LCD display epoch, which should be fine until everyone is ready for 4K a few years from now, at which point the upgrade will cost no more and probably substantially less.
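To make the capital-cost point concrete, here is a hedged per-seat comparison of an up-front PC refresh against metered cloud desktops. Every price, lifetime, and usage figure below is an assumption, not a quote from any provider:

```python
# Hypothetical per-seat cost comparison -- every price and lifetime is an assumption.

seats = 1_000
hours_per_seat_per_year = 2_000          # roughly full-time use

# Up-front purchase: capital cost amortized over the machine's assumed life.
pc_price = 900                           # assumed dollars per new PC
pc_lifetime_years = 4
pc_annual_cost = seats * pc_price / pc_lifetime_years

# Metered cloud desktop: no capital outlay, pay only for hours actually used.
cloud_rate_per_hour = 0.10               # assumed dollars per desktop-hour
cloud_annual_cost = seats * hours_per_seat_per_year * cloud_rate_per_hour

print(f"PC refresh, amortized:  ${pc_annual_cost:,.0f} per year")
print(f"Metered cloud desktops: ${cloud_annual_cost:,.0f} per year")
```

The point isn’t the particular totals -- change the assumptions and they move -- it’s that the cloud number is an operating expense you can dial up or down, while the PC number is capital you’ve already spent.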

The labor component of a cloud installation is more like maintaining a phone system than a computer network. It’s mainly plug-and-play.

There are almost constant hardware upgrades in the cloud, of course (Amazon’s AWS, Microsoft’s Azure, and Google Cloud are in a continuous arms race, constantly adding both capacity and capability). But these upgrades are mostly transparent to users and are included in a service price that, paradoxically, keeps going down.

Traditional IT departments worry a lot about data security, but in a well-designed client installation that’s handled in the cloud, too. Every workstation boot starts with a clean OS image from read-only storage, giving viruses and malware no place to hide. Scanning for bad code happens in the cloud as well, and it is both automatic and always up-to-date.
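As a conceptual illustration only -- not how any particular vendor actually implements it -- the “clean image, nowhere to hide” idea boils down to checking that what’s about to boot matches a known-good, read-only image. The path and digest below are placeholders:

```python
# Conceptual sketch only -- not any vendor's actual boot mechanism.
# The image path and expected digest below are placeholders.

import hashlib

EXPECTED_SHA256 = "0" * 64                    # placeholder for the published image digest
IMAGE_PATH = "/srv/images/workstation.img"    # placeholder path to the read-only OS image

def image_is_clean(path: str, expected: str) -> bool:
    """Hash the boot image in chunks and compare it to the known-good digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected

if __name__ == "__main__":
    if image_is_clean(IMAGE_PATH, EXPECTED_SHA256):
        print("boot image verified")
    else:
        print("image mismatch: refuse to boot")
```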

The cloud’s innate security, along with its inherent reliability, explains why we have become so important to government customers who really can’t afford to be hacked.

Lower hardware, labor, and software costs finally drive real productivity gains for cloud customers, and Moore’s Law ensures those gains will continue. The cloud is, after all, a market in which costs have dropped by 30-50 percent per year for the last decade, with no change to that trend in sight.
