Are we actually ready for the multicore and virtualization era?

The problem facing hardware and software vendors alike, a panel of leading industry executives discovered yesterday, is getting their own customers to understand and embrace the technology they're already buying today.

LOS ANGELES (BetaNews) - Today, very few PC processors are sold with single cores, and that number is dwindling toward zero. And one of the largest commercial server operating systems, Windows Server, just last week added a virtualization platform as a principal option for all customers.

But the problem now is adoption. For enterprises, there's a certain fear that the separation of contexts between the application that runs and the processor that runs it will generate unmanageable levels of complexity for administrators. And while it might seem that the whole multicore adoption problem has already been solved, and quite handily -- Intel has already marched on to the 45 nm generation, and AMD is running to catch up -- the often unspoken truth is that developers have yet to adopt the mindset for parallelism. There's a limit to how finely software can subdivide tasks into threads, and unless developers start helping out, each new "power of two" heaped onto the number of cores in a CPU may matter less and less.
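That diminishing return is exactly what Amdahl's law quantifies: the serial remainder of a program caps its speedup no matter how many cores arrive. A minimal sketch in C++ (the 90% parallel fraction is an illustrative assumption, not a figure from the panel):

    #include <cstdio>

    // Amdahl's law: the speedup on n cores when a fraction p of the
    // work can run in parallel and the remaining (1 - p) stays serial.
    double amdahl_speedup(double p, int n) {
        return 1.0 / ((1.0 - p) + p / n);
    }

    int main() {
        const double p = 0.90;  // assumed: 90% of the program parallelizes
        for (int cores = 1; cores <= 64; cores *= 2)
            std::printf("%2d cores -> %5.2fx speedup\n",
                        cores, amdahl_speedup(p, cores));
        // Each doubling buys less: 64 cores deliver only about 8.8x,
        // because the 10% serial remainder dominates the runtime.
        return 0;
    }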

Last week in Los Angeles, Microsoft convened a panel of seven leading IT industry executives, including its own senior vice president for the Server and Tools Business, Bob Muglia; the discussion was moderated by IDC research vice president Al Gillen. The broad topic at hand was "the data center of the future," though as the conversation was allowed to meander, it eventually centered on two factors, both of which revealed the same hard truth: adoption is running further and further behind production.

Microsoft senior vice president Bob Muglia

"This is a hard problem. This is not something that's going to get solved in just a two- or three-year timeframe. This is the biggest shift in the software industry that we've ever seen. It's not a single-release-cycle problem."

Bob Muglia, Senior Vice President, Server and Tools Business, Microsoft

Gillen asked Muglia what benefits Microsoft perceived its customers were realizing from the dawn of the multicore, or "many-core," era. His attempt at first to deflect the question slightly did not go unnoticed.

"One of the things we've done with Visual Studio and .NET is made the transition from 32-bit to 64-bit pretty much a seamless transition, getting the code to run in both environments," started Muglia, focusing right away on the 64-bit innovation instead. "So you can take the code that might have been written in the past and bring it forward into the 64-bit world. And we very much see that 64-bit computing is the future. [Windows Server] 2008 is the last 32-bit release of Windows Server that we're going to do; from now on, it's all 64-bit, and by and large, we've structured the server environment so it's straightforward to take advantage of the multiple cores and the multiple processors that exist in servers."

With the good news out of the way, Muglia then felt more comfortable embracing a cold fact: "Many-core is a very substantive challenge for single-threaded application development that exists on the client, and we're working real closely with all of our partners in the industry to make sure we can enable the development of applications on the client that maybe take advantage of multiple cores," he went on. "However, the server environment is much more straightforward in terms of being able to leverage the cores that we have, because of the fact that servers by their nature serve multiple people."

In other words, tasks on a server are already broken down into discrete services, which are more easily designed for multicore, and certainly multiprocessor, environments. But applications on a client operating system like Windows Vista are rooted in the single-threaded world of the old x86 processing model. Redesigning them for multithreading may mean either untangling decades of interlocking strands of spaghetti code or starting over from scratch.
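Muglia's server-side point can be sketched in a few lines: because each incoming request is independent, handing it to its own thread (or a pool) spreads the work across cores without redesigning the handler itself. A hedged illustration using modern C++ threads, an API that postdates the panel:

    #include <cstdio>
    #include <thread>
    #include <vector>

    // The handler knows nothing about parallelism; each request is
    // independent, so the dispatch layer supplies the concurrency.
    void handle_request(int request_id) {
        std::printf("serviced request %d\n", request_id);
    }

    int main() {
        std::vector<std::thread> workers;

        // A server's natural unit of parallelism: one task per client
        // request. More cores simply mean more requests in flight at once.
        for (int id = 0; id < 8; ++id)
            workers.emplace_back(handle_request, id);

        for (auto& w : workers)
            w.join();
        return 0;
    }

No such ready-made decomposition exists for a single-threaded client application, which is precisely the asymmetry Muglia described.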

Intel's server and platform group vice president, Kirk Skaugen, noted that his company's SSE4 extensions for streaming SIMD (single instruction, multiple data) operations are now supported directly in Visual Studio 2008, so developers can take advantage of them automatically.
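To make that concrete, here is a small sketch built around the SSE4.1 dot-product intrinsic, _mm_dp_ps; the surrounding code is illustrative, but the intrinsic itself is among those Visual Studio 2008's compiler added support for.

    #include <smmintrin.h>  // SSE4.1 intrinsics (GCC/Clang need -msse4.1)
    #include <cstdio>

    int main() {
        // Two four-element float vectors packed into 128-bit registers.
        __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
        __m128 b = _mm_set_ps(8.0f, 7.0f, 6.0f, 5.0f);

        // One SSE4.1 DPPS instruction: multiply all four lane pairs, sum
        // the products, and broadcast the result to every lane (mask 0xFF).
        __m128 dot = _mm_dp_ps(a, b, 0xFF);

        float result;
        _mm_store_ss(&result, dot);
        std::printf("dot product = %.1f\n", result);  // 1*5 + 2*6 + 3*7 + 4*8 = 70
        return 0;
    }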

AMD's corporate vice president for servers and workstations, Randy Allen, took credit on behalf of his company for having launched the 64-bit x86 era...but that was six years ago already.

"Back in 2002, when we introduced AMD64 into the market, the 64-bit extensions to the x86 instruction set, Microsoft threw its support behind that immediately," Allen noted. "And I think everybody understood at that time that it dramatically changed the game, because at that time, the rationale was that, to make this transition from 32-bit to 64-bit computing, it was going to require a whole new instruction set, a whole new set of software. If you sit here and look back now, six years, it's hard to believe, you see that it dramatically changed."

From the perspective of IT centers, the problem is one of seeing all this technology happen to them, as opposed to with them. That was the message brought by Dr. Ajei Gopal, executive vice president of Enterprise IT Management for CA.

AMD corporate vice president for servers and workstations Randy Allen

"Back in 2002...the rationale was that, to make this transition from 32-bit to 64-bit computing, it was going to require a whole new instruction set, a whole new set of software."

Randy Allen, Corporate Vice President, Server and Workstation Division, AMD

"If you work in an IT environment, probably the biggest issue that you're dealing with is change that happens that hasn't been appropriately managed," remarked Dr. Gopal, whose company is in the business, among other things, of producing change management tools for administrators. "If something bad happens, someone does something to respond to it and it results in a catastrophic failure and you don't know exactly what happened."

How exactly is Microsoft handling the problem of change management, particularly with regard to the question of parallelism in programming brought upon developers by the multicore era? BetaNews put the question directly to Bob Muglia.

"[Windows Server] has evolved and is structured to take advantage of many-core today," Muglia told us. "But clients have a lot of work to do, because most client applications are single-threaded. So we're working with Intel, AMD, and others to really think about how we would restructure software to be able to take advantage of all of these cores."

Will we see such a tool that will help developers adopt a parallelism mindset sometime within the lifecycle of Visual Studio 2008? "You will, and in fact, we have a Professional Developers' Conference coming this fall, where we'll talk about what it means to write many-core applications on the client, especially," Muglia responded. "But I want to emphasize, this is a hard problem. This is not something that's going to get solved in just a two- or three-year timeframe. This is the biggest shift in the software industry that we've ever seen. It's not a single-release-cycle problem. But we are moving on it."


For more: The virtualization challenge and whether IT is ready

