How we test

PCs and Components

2D Tests

We test desktop PCs, netbooks and laptops with our own custom-built 2011 Real World Benchmarks.

We split the results into three categories: Responsiveness, Media and Multitasking, with the Overall score an average of the three sub-scores.

The Responsiveness test replicates light browser and productivity workloads. The Media test involves running iTunes for audio conversion, Photoshop CS5 to crunch large images and Sony Vegas 10 to edit home video. These workloads are then run simultaneously alongside Cinebench 11 to gauge the multitasking ability of the system.
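The scoring scheme above can be sketched as a simple average. This is an illustrative snippet, not our actual scoring code; the function name and sample values are invented for the example.

```python
# Hypothetical sketch: the Overall score is the mean of the three
# sub-scores (Responsiveness, Media, Multitasking).
def overall_score(responsiveness, media, multitasking):
    """Average the three benchmark sub-scores into a single Overall figure."""
    return (responsiveness + media + multitasking) / 3

# Example: sub-scores normalised against a reference PC (invented values).
print(overall_score(1.2, 0.9, 1.05))
```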

3D Gaming Tests

We use pre-recorded demos in Crysis and DIRT 3 to test gaming performance where relevant. We have three standard test settings, depending on the power of the graphics card: Low, Medium and High.

The Low, Medium and High quality settings run at 1366 x 768, 1600 x 900 and 1920 x 1080 respectively. Very high-end systems can also be tested at the ultra-intensive Very High setting, with all detail switched on and varying levels of anti-aliasing enabled.
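The pairing of quality settings and resolutions can be written out as a simple lookup table. A minimal sketch, encoding only the mapping described above; the names are ours, not part of any benchmark tool:

```python
# Quality setting -> (width, height) pairings used in the gaming tests.
TEST_MODES = {
    "Low":    (1366, 768),
    "Medium": (1600, 900),
    "High":   (1920, 1080),
}

for quality, (width, height) in TEST_MODES.items():
    print(f"{quality}: {width} x {height}")
```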

Laptop Battery Life Tests

We subject laptops to two battery tests. In the light-use test, we optimise the system settings for the greatest power efficiency. We then disconnect the mains and run a script scrolling a selection of web pages until the system shuts down, giving you a realistic idea of the surfing time each laptop offers.
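The structure of that light-use script can be sketched as a round-robin loop over a fixed selection of pages. This is a hypothetical illustration, not the script itself: the page list and scroll_page() are stand-ins, and the real test drives an actual browser until the battery runs flat.

```python
import itertools
import time

# Stand-in page list; the real test uses a selection of live web pages.
PAGES = ["news.html", "forum.html", "reviews.html"]

def scroll_page(page, dwell_seconds=1.0):
    """Stand-in for scrolling one page in a browser for a fixed dwell time."""
    time.sleep(dwell_seconds)

def run_light_use_test(pages, max_views=None):
    """Cycle through the pages round-robin.

    In the real test there is no exit condition: the loop simply runs
    until the laptop shuts down. max_views exists only so this sketch
    can terminate for demonstration purposes.
    """
    viewed = []
    for page in itertools.cycle(pages):
        if max_views is not None and len(viewed) >= max_views:
            break
        scroll_page(page, dwell_seconds=0)  # dwell shortened for illustration
        viewed.append(page)
    return viewed
```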

For the heavy-use test, we engage Windows’ High Performance power profile, set the display brightness to maximum, and allow the taxing Cinebench 3D renderer to push the processor load to the limit. This gives a worst-case figure, revealing how long you can expect the battery to last under the most demanding conditions.


Handheld Devices

Smartphones and tablets are do-it-all devices, so testing them involves several approaches. Our first batch of tests focuses on performance. Since these devices run a number of different platforms, it's tricky for any single benchmark to provide a comparative view across them all. The only truly cross-platform application is the web browser, so we use this to gauge general performance. First, we visit the SunSpider JavaScript benchmark in the device's stock web browser, and then point the device at our own custom page-rendering test. Hosted on a local Apache web server, this loads 28 web pages sequentially, and times how long it takes. The test is run six times to obtain an average score, given in seconds. Finally, for the Android handsets we also run the Quadrant benchmark, which tests graphics, I/O and CPU performance.
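The timing-and-averaging logic of the page-rendering test can be sketched as follows. This is a hedged illustration of the method described above, not our actual harness: load_page() is a stand-in for rendering one page in the device's browser from the local Apache server.

```python
import time
import statistics

def load_page(url):
    """Stand-in for fetching and rendering one page in the device's browser."""
    pass

def page_render_benchmark(urls, runs=6):
    """Time loading every URL in sequence, repeat the run, return the mean in seconds."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        for url in urls:
            load_page(url)
        timings.append(time.perf_counter() - start)
    return statistics.mean(timings)

# The real test loads 28 pages per run and repeats six times.
score = page_render_benchmark([f"page{i}.html" for i in range(28)])
print(f"{score:.3f} seconds")
```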

Battery life is tested over the course of 24 hours, at the end of which we take a reading from the phone’s battery gauge. The test involves downloading, then playing a podcast on loop for an hour. We visit the Google homepage and force the screen on for an hour, with the brightness turned down to medium (with dynamic and browser brightness settings off). Finally, we make a phone call to the handset for half an hour, leaving the phone next to a radio to simulate conversation. The phone is then left checking email (with push on) for the rest of the test.