PDC 2009: What have we learned this week?
It ended up being a somewhat different PDC conference than we had anticipated, and even to a certain extent, than we were led to believe. Maybe this was due in part to a little intentional misdirection to help generate surprise, but in the end, the big stories here in Los Angeles this week were more evolutionary than revolutionary. That was actually quite all right with attendees I spoke with this week, most of whom are just fine with one less thing to turn their worlds upside down. It's tough enough for many of these good people to hold onto their jobs every week.
We'll start our conference wrap-up with a look at the flashpoints (remind me to call Score Productions for a jingle to go with that) we talked about at the beginning of the week, and we'll follow up with the topic that crept in under the radar when we weren't expecting it.
Making up for UAC, or, making Windows 7 seem less like Vista. This was absolutely the theme of "Day 0," which featured the day-long workshops. At this point, Windows engineers have no problem disowning Vista, even disavowing it, though it was technically a stairstep toward making Windows 7 possible. It is now perfectly permissible to acknowledge the performance hardships Vista faced, and to let go of the past in order to move forward.
Mark Russinovich leads the way in this department, and the fact that he's appreciated leads others to follow suit. During his annual talk on "Kernel Improvements" -- which he expanded this year to a two-parter -- Russinovich spoke about the way that the timing of Windows' response to user interactions was adjusted to give the user more reassurance that something was happening, rather than the sinking suspicion that nothing was happening.
In an explanation of a user telemetry service he helped get off the ground called PerfTrack, he told attendees, "We went through and found roughly 300 places in the system where you interact with something, and there's a beginning and then an end where you go, 'Okay, that's done,' and optimized the performance of those user-visible interactions. We instrumented those begin-and-ends with data points, which collect timing information and send that up to a Web service...and for each one of these interactions, we define what's considered 'great' performance, what's considered 'okay' performance, and what's considered Vista -- I mean, uh, 'bad,'" he explained, with a little grin afterward that appeared borrowed from Jay Leno. "And then if we end up in that 'okay' or 'bad,' what we do is, selectively turn on more instrumentation using ETW [Event Tracing for Windows] -- instrumentation of file accesses, Registry activity, context switches, page faults -- and then we collect that information from a sampling of customer machines that are showing that kind of behavior.
"We feed that back to the product teams, they go analyze those and figure out, 'Why is their component sluggish in those scenarios?' and optimize that."
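The scheme Russinovich described -- begin/end markers around a user-visible interaction, with the measured time bucketed into "great," "okay," or "bad" -- can be sketched in ordinary Python. To be clear, this is an illustrative assumption, not PerfTrack itself: Windows 7 does this with ETW events and a telemetry upload, and the real per-interaction thresholds were not disclosed; the names and millisecond values below are invented for the sketch.

```python
import time

# Hypothetical thresholds (milliseconds); the real PerfTrack values
# per interaction were not disclosed in the talk.
GREAT_MS = 100
OKAY_MS = 500  # anything slower than this is "bad"

def classify(elapsed_ms):
    """Bucket one measured interaction time, per Russinovich's description."""
    if elapsed_ms <= GREAT_MS:
        return "great"
    if elapsed_ms <= OKAY_MS:
        return "okay"
    return "bad"

class Interaction:
    """Begin/end instrumentation around one user-visible interaction.

    A real system would emit ETW events here, enable deeper tracing
    (file accesses, Registry activity, context switches, page faults)
    when the result lands in "okay" or "bad," and send a sampling of
    that data to a web service for the product team to analyze.
    """
    def __init__(self, name, reports):
        self.name = name
        self.reports = reports

    def __enter__(self):
        self.start = time.perf_counter()
        return self

    def __exit__(self, *exc):
        elapsed_ms = (time.perf_counter() - self.start) * 1000
        self.reports.append((self.name, classify(elapsed_ms)))

reports = []
with Interaction("open_start_menu", reports):
    time.sleep(0.01)  # stand-in for the actual UI work
print(reports)
```

The context-manager shape mirrors the "beginning and then an end where you go, 'Okay, that's done'" framing: instrumentation wraps the interaction rather than being sprinkled through it, which is what let roughly 300 such begin/end pairs be added across the system.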
One result he demonstrated, shown here in this pair of charts: between two builds of the Windows 7 beta, the number of user-reported instances of Start menu lag time shifted toward the quick side of the chart and away from the slow side.
The fact that performance matters was one of the key themes of PDC 2009, and attendees greeted that message with enthusiasm -- or, maybe more accurately, with appreciation that the company had finally received the message. But there are still lessons to be learned here that can be applied to other product areas, if anybody out there is listening.
Why Windows Azure? The major theme of Day 1 was the ability to scale services up -- scaling local services up to the data center, and data center services up (or down, depending on your application) to Microsoft's cloud platform, Windows Azure.
Last year at this time, Microsoft went to bat with essentially nothing -- no real definition of an Azure application, no clear understanding of who the customers would be, and absolutely no clue as to the business model. Now we know that services will be billed on a utility basis like Amazon EC2, and we have a much clearer picture of the customer groups Azure will address. One is the small business that has never before considered data center applications; another is the class of customer that must plan for exceptional traffic during unusual situations, but can't afford to maintain that high capacity 24/7; and the third is the big customer building a new class of application that has never before been attempted on any platform.
Channeling customers to Microsoft's cloud will be "Dallas," its code name for large-capacity data bank services typically open for mining by the general public, which should eventually be given a typically Microsoft-sounding name; and AppFabric, the company's new mix-and-match component applications system built on the IIS 7 platform. But in neither case is Microsoft exactly inventing the wheel; and as I heard from a plurality of attendees this week, Microsoft is entering yet another crowded field of contenders (including Salesforce.com and IBM) that is already saturated with competition. Success in this venture is by no means assured.
Next: Office takes a backseat...