Underneath all the speculation and marketing hype, the Cell processor might actually herald a revolution in computing. Tim Dean investigates.
Last month we took a peek at the next generation of consoles, now officially called the PlayStation 3 and Xbox 360. One of the interesting things about both these devices is their somewhat hidden agenda. Microsoft and Sony are pitching these machines as the ultimate in home gaming, but both manufacturers have a far bigger agenda in mind. The gaming market is by no means small, but it is dwarfed by the broader home entertainment market – TV, music, video and so on – and this is where the PlayStation 3 and Xbox 360 will come into their own.
We already know a little about Microsoft’s plans. As discussed last month, the Xbox 360 will likely support high definition video, have a hard disk, and be able to download online content using a micro payment system. Given all this, it’s likely to sport a TV tuner as well, either internally or externally.
Sony’s vision, on the other hand, is a little less clear. However, the big giveaway to its plans for the PlayStation 3 comes from its power plant, the enigmatically named Cell processor. So far the PlayStation 3 appears to be a more conventional gaming console than the Xbox 360, but the fact that it’ll be based on the Cell processor hints at an even further-reaching vision of home entertainment than the one manifest in the Xbox 360. Before we outline this vision, let’s take a look inside the Cell to see what makes it tick – and in this Cell, it’s not nucleotides.
There are a few trends that help make sense of the Cell processor. The first is the move from complexity back to simplicity in processor architecture; the second is a shift in the kinds of applications we run.
Like many aspects of the technological world, processor technology cycles between simplicity and complexity. One example of this trend is interconnects, which grew broader and more parallel through the 1990s but are now returning to narrow, point-to-point designs such as Serial ATA and HyperTransport. A similar trend is occurring in processor technology, as embodied by the Cell processor.
During the 1990s, the trend was towards more complex CISC (Complex Instruction Set Computing) processors, such as the Pentium series, with bigger instructions, longer pipelines, more parallelism, and nifty tricks to make all this happen, like out-of-order execution and bigger memory caches. Also during the 1990s, however, a counter-trend began – a move back to simplicity through RISC (Reduced Instruction Set Computing) processors like MIPS, ARM and PowerPC, with shorter instructions, less parallelism and a simpler architecture.
Another trend is in the kinds of software we’re running. Applications have been steadily evolving away from traditional integer-based programs, like word processing, towards media-rich applications like image editing, music encoding and 3D gaming. In the former, the application itself takes up the lion’s share of memory and the data being worked on is relatively small – think 40MB of Microsoft Word editing a 40KB document. In the latter, the data itself is bountiful, but the operations that need to be run on it are relatively simple, if repetitive – rotating or changing the colour saturation of a 300MB bitmap, say, or encoding a WAV into an MP3 file.
The processor manufacturers responded to this trend with Intel’s MMX instructions, followed by the SSE family, as well as AMD’s 3DNow! and Apple’s AltiVec. These extensions use SIMD (Single Instruction, Multiple Data) techniques, also known as vector processing. As the name suggests, they take a single instruction, such as ‘move pixel X by Y amount’, and apply it across a big chunk of data without having to reload the instruction for each element.
Finally, on the software front, this vectorisation of software is coinciding with an increase in multitasking, whose ultimate manifestation is virtualisation.
This all sets the scene for a new type of processor – one that’s specifically designed with this application environment in mind, and that uses the best that RISC and vector processing have to offer.