A word about our Windows Web browser test suite
Since March 2009, our Windows Web browser performance test suite has used four components that score different aspects of the browser engine: the HowToCreate.uk CSS rendering test, the Celtic Kane basic speed comparison, the Web Standards Project's Acid3 standards compliance test, and the SunSpider JavaScript benchmark. We've received roughly equal amounts of praise and complaint about our choice of these four independent analyses, though we see no reason at present to discredit any of them.
We did have to modify the HowToCreate.uk test internally: as published, the way it times itself does not account for differences in when the onLoad event fires across browsers' JavaScript interpreters. Our modifications correct for that discrepancy, and we applied them to all the browsers we test, not just those (Google Chrome, Apple Safari) that fire the event differently.
Each component of our suite counts for 25% of a browser build's final score. That score is a composite of scores relative to the performance of Microsoft Internet Explorer 7 in Windows Vista Service Pack 2. In a fresh installation of Vista SP2, we recorded the times and performance levels of IE7, a relatively slow browser, and assigned it an index score of 1.00, so that other browsers' performance can be compared against IE7's on a general basis. A browser that scores 5.42, for example, performed about 5.42 times as well as IE7 in Vista SP2.
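As a minimal sketch of how such a relative index works, assume a speed component where lower completion time is better; the timings below are invented for illustration and are not our actual measurements:

```python
# Hypothetical illustration of the relative index described above.
# IE7 on Vista SP2 is pinned at 1.00; other browsers are scored
# as a ratio against its completion time (invented numbers, in ms).

def relative_index(ie7_time_ms, browser_time_ms):
    """A browser finishing in half IE7's time scores 2.00."""
    return ie7_time_ms / browser_time_ms

ie7_total = 10000        # hypothetical IE7 baseline time
fast_browser_total = 1845  # hypothetical challenger's time

print(round(relative_index(ie7_total, fast_browser_total), 2))  # -> 5.42
```

Under this scheme the baseline browser always scores exactly 1.00 against itself, which is what makes cross-build comparisons over time meaningful.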
The reason we included Acid3 -- a non-speed-oriented test -- in our composite score is that we believe it's important that a browser not only be fast, but do the job it's expected to do. Typically, a browser is expected to adhere to standards, at least from the perspective of the developers who build Web sites for browsers. Thus, a browser that scores 5.42 but whose Acid3 score is only 75% conceivably has the opportunity, with a bit of tweaking, to score a 7.23 if it follows the rules. Consider Acid3 our version of applying "style points."
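The arithmetic behind that example can be sketched as follows; the function name is our own, but the numbers reproduce the 5.42-to-7.23 projection above:

```python
def projected_score(current_index, acid3_pass_fraction):
    """What the composite could become if the browser passed Acid3
    fully -- i.e., the current index divided by its pass fraction."""
    return current_index / acid3_pass_fraction

# A browser indexed at 5.42 with a 75% Acid3 result:
print(round(projected_score(5.42, 0.75), 2))  # -> 7.23
```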
Currently we test relative performance on a single machine with a multiple-boot configuration, enabling us to boot different Windows versions from the same large hard drive. Our platforms are Windows XP Professional SP3, Windows Vista Ultimate SP2, and Windows 7 Release Candidate.
All platforms are brought up to date with the latest Windows updates from Microsoft prior to testing. We realize, as some have told us, that this could alter the speed of the underlying platform. However, we expect real-world users to be applying the same updates, rather than continuing to use unpatched and outdated software. Certainly the whole point of testing Web browsers on a continual basis is that folks want to know how Web browsers are evolving, and to what degree, on as close to a real-time scale as possible.
Our physical test platform is an Intel Core 2 Quad Q6600-based system using a Gigabyte GA-965P-DS3 motherboard, an Nvidia 8600 GTS-series video card, 3 GB of DDR2 DRAM, and, among other components, a 640 GB Seagate Barracuda 7200.11 hard drive. The Windows XP SP3, Vista SP2, and Windows 7 RC partitions are all on this drive. Since May 2009, we've been using this physical platform for browser testing, replacing the virtual test platforms we had been using up to that time. Although managing tests on a physical platform requires a few more steps, our readers have indicated that the results of physical tests are more reliable.