The virtualization challenge and whether IT is ready

Dr. Ajei Gopal is in the business of serving up metrics for administrators to assess and measure, in his role as executive vice president of CA's Enterprise IT Management Group. As Dr. Gopal related, "I was with a customer a couple of weeks ago, and with one of their applications, there was a problem and at the end of the day, the guy responsible for that application comes in and says, 'The application was working. All the lights on the data center were green. The server was up, the network was up, the database was up, everything seemed to be up but the application wasn't working.' It turned out that somebody had made a change to the application, and instead of entering one database query, it was issuing a couple of dozen. And each individual query was completing at the right time. So there was nothing wrong at the silo level, but in the aggregate, the performance was not acceptable to customers, who were simply abandoning their transactions in the middle.
"There were two problems with that scenario," Dr. Gopal continued. "One is that the IT guys were looking at green lights, even though the business wasn't working...and secondly, the business guys were looking at the [status] for that application in the context of the number of business transactions, in terms of the revenue that they were getting. And the IT guys were looking at it in a different way: Is the server up, is the application up? And that mismatch between the business and the data center is a real problem that needs to be addressed."
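Dr. Gopal's anecdote can be sketched in a few lines of arithmetic. The numbers below are hypothetical (the article gives none), but they show how every silo-level check can pass while the aggregate experience fails:

```python
# Illustration (hypothetical numbers): why per-query "green lights"
# can hide an aggregate failure.

PER_QUERY_MS = 50        # each query completes "at the right time"
SLA_PER_QUERY_MS = 100   # silo-level threshold: every light stays green
USER_TOLERANCE_MS = 500  # how long a customer waits before abandoning

def page_latency(num_queries: int) -> int:
    """Total latency if the application issues its queries sequentially."""
    return num_queries * PER_QUERY_MS

# As originally written, one query per transaction: fine at both levels.
assert page_latency(1) <= USER_TOLERANCE_MS

# After the change, two dozen queries -- each one individually under SLA...
assert PER_QUERY_MS <= SLA_PER_QUERY_MS
# ...but the aggregate blows past what users will tolerate.
assert page_latency(24) > USER_TOLERANCE_MS  # 24 * 50 = 1200 ms
```

This is exactly the business/IT mismatch Gopal describes: monitoring each component against its own threshold says nothing about the end-to-end transaction the customer actually experiences.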
Dell's Rick Becker took the baton from there, pointing out -- in perhaps the most ironic note of the afternoon -- that most vendors in the business are free to think of data centers in a fairly homogeneous fashion...except Microsoft.
"The fact is, all of us -- maybe with the exception of Microsoft -- have been living in a very proprietary management stack, that is [released] to a very proprietary focus of solutions." Rick Becker, Vice President, Enterprise Software and Solutions, Dell
"The fact is, all of us -- maybe with the exception of Microsoft -- have been living in a very proprietary management stack, that is [released] to a very proprietary focus of solutions," Becker remarked. "But when you add virtualization onto that challenge for our customers, suddenly you're not going to be able to recognize the workload and the way it's running. Any server can actually have five or twenty workloads, and there could be redundant 'brother' servers, and by the way, maybe your data center is across the world. So we have to have a way to manage heterogeneous environments, and do it in a standards-based way."
Microsoft senior vice president for Server and Tools Bob Muglia then acknowledged a critical fact, and did so in a way that made the admission seem easier than one might expect from Microsoft.
"The reality is that the situation in enterprises is heterogeneous, and it's going to stay heterogeneous in the future," said Muglia. "So we need to think about that, and we've been making very strong investments in making sure that we run Linux really well on our Hyper-V, and Windows can run on top of other environments like Xen. We're making investments to make sure that all of the protocols and things we do, wherever possible, are standards-based. When standards-based protocols don't exist, we publish the protocols that we have."
Muglia then estimated what percentage of their budgets businesses spend on maintenance versus innovation -- on keeping things running as they are, versus investing in concepts such as virtualization to move the business forward.
"The IT guys were looking at green lights, even though the business wasn't working...and the business guys were looking at the [status] for that application. That mismatch between the business and the data center is a real problem that needs to be addressed." Dr. Ajei Gopal, Executive Vice President, Enterprise IT Management Group, CA
"It's an 80/20 rule, where 80% of the IT budgets are spent on maintaining existing systems, and only 20% are spent on new," Muglia said. "We want to turn that right on its head, flip that around, and make it possible for IT to direct 80% of its resources to the new things that drive the business forward."
One of the most intriguing comments of the day came from Unisys vice president and general manager for systems and technology, Mark Feverston. His implication was that virtualization and other great improvements could indeed have a positive impact on the enterprise...if only the enterprise were fully aware of what it actually already owns.
"Clients are a bit risk averse in this space," Feverston said. "They're saying, how do you take the risk out of deploying virtualization? A lot of times...clients want to understand, 'Can you help me discover what I have?' And when you talk about a large company, that's pretty hard to do -- talking about finding the server, finding what's on the server, the applications, all the artifacts that would be, not in the data center but in the total organization. That's the thing that they're asking."