Wondering what size power supply to buy, and how to make sense of power requirements? Darien Graham-Smith explains all.
Electronics manufacturers are continually seeking to reduce the power consumption of the components and products they make. “Performance per watt” is a key metric for new processors. You might wonder why this particular aspect of component design attracts so much attention – after all, laptops and tablets are already tremendously energy-efficient. Yet device and component manufacturers invest millions in shaving every watt they can from their products’ power draw, and on these pages we’ll explain why. We’ll start by looking at what exactly power consumption represents.
Volts and amps
If you remember your physics from school, you’ll know that electricity is measured in volts and amps. It can be a little difficult to get your head around what these measurements really represent, but for everyday purposes you can think of voltage as the “pressure” at which the electricity flows out of the power supply, and of amperage (or current) as the rate at which electrical charge flows through the circuit. Multiplying the voltage by the amperage gives us the total power received by the device, which is measured in watts.
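To see that arithmetic in action, here’s a quick Python sketch. The figures – a 19V adapter rated at 3.42A – are illustrative rather than taken from any particular product:

```python
# Power (watts) = voltage (volts) x amperage (amps).
voltage = 19.0    # volts - an illustrative laptop-adapter rating
amperage = 3.42   # amps - likewise illustrative
watts = voltage * amperage
print(f"{watts:.1f}W")  # 65.0W
```

Sure enough, 19V at 3.42A works out to the 65W you’ll see printed on many laptop power bricks.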
It should be clear that voltage and amperage represent quite different properties of electricity. Voltage is “pushed out” by the power supply at a fixed level: if your power supply is rated at more volts than your device can handle, it will cause the device to overheat (and possibly even explode). Amperage, conversely, is “pulled in” by the device as needed. A power supply that’s rated at 2A can provide up to two amps of current, but you can safely use it to power a less power-hungry device.
How do you find out the voltage and amperage ratings of a power supply? Simply turn it upside down. Almost all power supplies bear a sticker showing voltage and amperage ratings (although you may need to pore over some very small print to find the relevant figures). Many electronic devices also have stickers showing the input voltage and amperage they expect.
With this information, it’s easy to work out if a given power supply will work with a laptop or other device: you simply need to check that the voltage ratings of the two are equal, and the amperage rating of the power supply is equal to or greater than that of the device. When it comes to devices that charge via USB, the voltage is always 5V, so you can safely plug a phone or tablet into any USB port. However, the ports found on laptops and PCs provide comparatively low levels of current (0.5A for a USB 2 port and 0.9A for USB 3): if your tablet charges slowly, or not at all, you’ll need to use a more powerful USB charger that plugs directly into the mains. These are typically rated as 2A, and can go as high as 5A.
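That two-part check – voltages must match, and the supply’s amperage must meet or exceed the device’s – can be sketched in a few lines of Python. The function name and figures here are our own, purely for illustration:

```python
def supply_is_compatible(supply_volts, supply_amps, device_volts, device_amps):
    """Voltages must be equal; the supply must offer at least
    as many amps as the device wants to draw."""
    return supply_volts == device_volts and supply_amps >= device_amps

# A 5V/2A mains USB charger can happily power a 5V/1A phone...
print(supply_is_compatible(5.0, 2.0, 5.0, 1.0))   # True
# ...but a 5V/0.5A USB 2 port can't satisfy a 5V/2A tablet.
print(supply_is_compatible(5.0, 0.5, 5.0, 2.0))   # False
```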
Desktop power supplies
What about desktop PC power supplies that are rated in watts, rather than volts and amps? This is a special case, as a single ATX power supply actually provides several different power outputs – known as “rails” – running at a variety of voltages (namely 3.3V, 5V and 12V) to suit the different components within the desktop system. The quoted wattage represents the maximum total power that can be provided across all of these rails simultaneously. If you check the technical documentation for a desktop power supply, you should find an amperage rating for each rail, showing how the available power is divided up.
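As an illustration, here’s a Python sketch that multiplies some hypothetical per-rail amperage ratings back into watts – the rail figures are invented for the example, not taken from a real unit:

```python
# Hypothetical ratings read off a PSU label: (volts, max amps) per rail.
rails = [(3.3, 20.0), (5.0, 20.0), (12.0, 34.0)]
for volts, amps in rails:
    print(f"{volts}V rail: up to {volts * amps:.0f}W")
total_watts = sum(v * a for v, a in rails)
print(f"Simple sum: {total_watts:.0f}W")
# Note: real PSUs usually cap the combined output below this naive
# sum, so check the label's combined-power figure too.
```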
In practice, the various motherboard components, drives and expansion cards that sit on the lower voltage rails generally have very modest power demands. The only connections whose ratings you might need to worry about are the 4- or 8-pin 12V CPU power connectors and – if you have a high-end graphics card – the 12V 6- or 8-pin PCI Express power connectors. Again, check the technical documentation to find out how much power your graphics card requires. This may be stated in watts rather than amps, but that’s no problem. Since wattage is voltage multiplied by amperage, we can calculate the required amperage via a simple division. For example: cards based on Nvidia’s GeForce GTX 680 design have a quoted maximum power draw of 195W, so the relevant 12V rail must be rated at 16.25A or higher.
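The GTX 680 sum above works out like this:

```python
# Required amperage = quoted wattage / rail voltage.
card_watts = 195.0   # quoted maximum draw of a GTX 680-based card
rail_volts = 12.0
required_amps = card_watts / rail_volts
print(f"{required_amps}A")  # 16.25A
```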
What’s so great about watts?
Since wattage can be simply derived from voltage and amperage, you might wonder why we need to bother with it. The answer is that, precisely because it combines those two measures, it gives us a single, simple unit for talking about power consumption. It doesn’t tell us everything about an electrical current: a power draw of 50W might represent a flow of 10A at 5V, or it could be 2A at 25V. But for many practical purposes that doesn’t matter. Each configuration will drain a battery at precisely the same speed, and will add the same amount to your electricity bill. A device drawing 100W will consume twice as much energy as either in a given period of time, or will drain a given battery twice as quickly – again, regardless of the actual voltage and amperage.
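A short Python sketch makes the point, assuming a hypothetical 60Wh battery (battery capacities are conventionally quoted in watt-hours, so runtime is simply capacity divided by draw):

```python
battery_wh = 60.0  # hypothetical battery capacity in watt-hours
for watts in (50.0, 100.0):
    hours = battery_wh / watts
    print(f"{watts:.0f}W draw lasts {hours:.1f} hours")
# 50W lasts 1.2 hours; 100W drains the same battery twice as fast,
# whatever mix of volts and amps lies behind each wattage.
```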
There’s still one complicating factor to consider: wattage isn’t necessarily stable over time. It may be constant for very simple appliances such as electric heaters and vacuum cleaners, but computers and smartphones have far more erratic power demands. Their amperages – and hence their wattages – go up and down depending on what you’re doing with your device. Sitting at the Windows desktop doing nothing, a modern desktop PC might draw something of the order of 60W. Load up a demanding game that taxes a 3D graphics card, and simultaneously floods the CPU with logic and physics calculations, and your energy demands could easily double or triple while that particular piece of software is running.
This is why, when we review desktop PCs, we state both “idle” and “peak” power ratings. Similarly, it’s why our laptop reviews report battery life in both light and heavy use. A battery powerful enough for only two hours of heavy multitasking can often support six or seven hours of casual web browsing.
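Because the draw varies, estimating the energy a session actually uses means totting up power multiplied by time for each phase. A Python sketch, using illustrative figures in the spirit of the idle and peak numbers above:

```python
# (hours, average watts) for each phase of a hypothetical session:
session = [(3.0, 60.0),    # idling at the desktop
           (1.0, 180.0)]   # running a demanding 3D game
kwh = sum(hours * watts for hours, watts in session) / 1000.0
print(f"{kwh:.2f} kWh")  # 0.36 kWh
```

Note that the single hour of gaming here consumes as much energy as three hours of idling.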
This brings us to the crux of the issue. Power consumption is a key issue in consumer electronics, not simply because saving power is good for the environment, nor because it reduces your electricity bills. Those are certainly valid considerations, especially for businesses deploying hundreds or thousands of computers, but for an individual, electronic gadgets are already very cheap to run. More than this, though, reducing power consumption enables laptops, tablets and phones to run for longer on a single charge – a very desirable thing indeed.
There are many ways to minimise the power consumption of a device. Some of these are visible to the user: most mobile devices, for example, switch their screens off when they haven’t been used for a while. Laptops have a “sleep” mode that draws only a trickle of power, so as to keep the hardware in a state from which it can quickly reawaken. If you’ve delved into Windows’ advanced power settings, you’ll be familiar with other power-management features, such as spinning down mechanical hard disks that haven’t been used for a certain period of time.
Many power-saving measures, however, take place behind the scenes. The continual shrinking of CPU dies is a case in point. Intel’s Ivy Bridge processors perform very similarly to older Sandy Bridge models, but because they’re manufactured on a 22nm process rather than a 32nm one – meaning that the smallest components inside the chip are around 30% smaller – they can run at lower voltages. In practice, tasked with the same workload, Ivy Bridge can shave more than 10% off Sandy Bridge’s power consumption. Smaller dies tend to run cooler, too, meaning less energy is needed for fans (although in the specific case of Ivy Bridge, the thermal advantage is offset by Intel’s use of cheaper heat-dissipation materials in its newer chips).
Processors have also become progressively smarter about managing their own power consumption. The Intel SpeedStep system, introduced with the Pentium III, automatically reduced the operating frequency of the entire CPU – and hence its energy requirements – whenever full power wasn’t required. In more recent architectures, the concept has grown into Turbo Boost, which dynamically clocks individual CPU cores up and down as needed.
Modern CPUs also save energy by selectively shutting down internal features that aren’t being used, and only waking them up as needed (a process called power gating). You can browse the web and use desktop applications without wasting power on idle processing cores and unneeded GPU components. Intel’s forthcoming Haswell architecture extends this concept by moving peripheral and disk controllers into the CPU package, so they too can be gated – a move that, it’s rumoured, may slash a PC’s overall power consumption by up to 50%.