As reported by Atomic, last Sunday Nvidia CEO Jen-Hsun Huang took to the show floor of his company's Game Festival in Shanghai, China and did a passable impression of 'the only person in the tech world surprised at the GTX 690's announcement', which was all the more impressive given he was the one announcing it. This was the culmination of eleven days of teasing fans with barely visible glimpses of the card on Facebook, a website countdown and the mailing out of Nvidia-branded crowbars to the press (presumably for opening a crate full of leviathan-proportioned GPUs). Nvidia's latest brainchild was certainly the subject of a fair bit of hype, though with the announcement of the GTX 690's specs and its attendant boast of 'the fastest graphics card in the world', the actual hardware came pretty close to justifying all the pre-release shenanigans.
We've certainly been quite impressed by the allegedly low power draw, quietness and overall performance of the dual-GPU GTX 690. As my colleague Vito pointed out, Nvidia achieved this in two steps: the most important being hunting out the most energy-efficient examples of its GK104 chips (as used in the GTX 680), then underclocking and undervolting a pair of them. The second string to the GTX 690's bow is its cooling unit, comprising two efficient vapour chambers (not used since the 500 series) feeding a pair of large heatsinks, all serviced by the large mid-mounted axial fan. TDP (maximum power draw / heat dissipation) is 300W, roughly 25% above a GTX 580. In Nvidia's own testing the latter weighed in acoustically at 51 dB in SLI (specific configuration not known), while the GTX 690 measured a noticeably lower 47 dB. This potentially negates one of the primary objections Atomic has had towards dual-GPU cards in the past, and performance-wise the newcomer seems to be no slouch.
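It's worth remembering that the decibel scale is logarithmic, so that 4 dB gap is bigger than it looks. A quick back-of-the-envelope conversion (our own illustration; the only figures taken from Nvidia are the two dB readings above):

```python
# Rough comparison of the two noise figures from Nvidia's testing.
# Decibels are logarithmic: power ratio = 10 ** (dB difference / 10).

def db_power_ratio(db_a: float, db_b: float) -> float:
    """How many times more sound power db_a represents compared to db_b."""
    return 10 ** ((db_a - db_b) / 10)

sli_580 = 51.0  # GTX 580 SLI, dB (per Nvidia)
gtx_690 = 47.0  # GTX 690, dB (per Nvidia)

ratio = db_power_ratio(sli_580, gtx_690)
print(f"580 SLI emits roughly {ratio:.1f}x the sound power of a GTX 690")
```

In other words, the 4 dB difference works out to around two and a half times the acoustic power, which is rather more than the raw numbers suggest.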
There are some issues, however. For the first time, users will need a PCIe 3.0 setup for a card to stretch its legs; splitting sixteen PCIe 2.0 lanes between what amounts to two GTX 680s will likely lead to a significant performance drop. First made available on Intel Sandy Bridge-E systems, PCIe 3.0 is not something Nvidia has confirmed its drivers will enable on X79 by default. With the GTX 680, Nvidia performed 'validation' of only specific X79 motherboards, leaving many 680 owners stuck with reduced PCIe 2.0 connectivity. This is a problem AMD managed to avoid, and it implies GTX 690 users could 'officially' need an Ivy Bridge-based system to fully utilise their new card. Such limitations likely have a lot to do with the announcement occurring the same day Ivy Bridge became available. Similarly, those looking to SLI a pair of GTX 690s need two PCIe 3.0 x16 slots. As the only setup currently capable of providing them is the Intel X79 / Sandy Bridge-E combo, with its Nvidia driver-related issues, this may leave quad-SLI in a bit of a limbo for now (unless of course you can Google 'GTX680 X79 Japanese registry hack' and follow the rabbit hole). What is certain is that all AMD systems, as well as Intel setups of Sandy Bridge and older vintages, are somewhat left out of this already quite exclusive party.
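To put some rough numbers on the bandwidth argument: PCIe 2.0 carries 500 MB/s per lane in each direction, while PCIe 3.0's faster signalling and leaner encoding nearly doubles that. A quick sketch of what each GPU sees when sixteen host lanes are split two ways (our own arithmetic from the published per-lane rates; any on-card bridge chip is ignored here):

```python
# Rough per-GPU host bandwidth when sixteen lanes are shared by two GPUs.
# PCIe 2.0: 5 GT/s with 8b/10b encoding    -> 500 MB/s per lane, each way.
# PCIe 3.0: 8 GT/s with 128b/130b encoding -> ~985 MB/s per lane, each way.

PCIE2_LANE_MBPS = 5000 * (8 / 10) / 8     # 500.0 MB/s
PCIE3_LANE_MBPS = 8000 * (128 / 130) / 8  # ~984.6 MB/s

def per_gpu_mbps(total_lanes: int, gpus: int, lane_mbps: float) -> float:
    """Bandwidth each GPU gets when the slot's lanes are shared evenly."""
    return (total_lanes // gpus) * lane_mbps

gen2 = per_gpu_mbps(16, 2, PCIE2_LANE_MBPS)  # x8 at PCIe 2.0 speeds
gen3 = per_gpu_mbps(16, 2, PCIE3_LANE_MBPS)  # x8 at PCIe 3.0 speeds

print(f"PCIe 2.0: {gen2 / 1000:.1f} GB/s per GPU")
print(f"PCIe 3.0: {gen3 / 1000:.1f} GB/s per GPU")
```

So on a PCIe 2.0 board each GPU gets roughly 4 GB/s, versus nearly 8 GB/s on PCIe 3.0, which is why being locked out of Gen 3 on X79 stings.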
There is also the issue of availability, in spite of promises of 'cards on shelves' on May 3rd. Given that the '690 requires 'cream of the crop' GK104 chips (two for every card), and that the GTX 680, wielding a single less-exceptional GK104, is already not exactly abundant, we expect supply to be scant for some time and retail prices to soar above the US$999 MSRP (about twice that of a GTX 680) in the US. This translates directly into concerns for Australasia, where we already regularly get overcharged for parts; expect some truly exorbitant prices should you find one in stock. Don't get us wrong though: the GTX 690 remains a significant achievement for Nvidia and a potentially good choice for monied individuals allowed past Nvidia's PCIe 3.0 bouncer.
It remains to be seen how close AMD can come to the GTX 690 with its HD 7990, expected to be announced at Computex in June. AMD's dual-GPU card will have a similar composition to the GTX 690: two top-end AMD 'Tahiti' cores, as found in the HD 7970 and HD 7950, and a whopping great big cooler. The Red Team have a slightly tougher task given Tahiti's higher power draw; as a result, the chips in the HD 7990 will likely be lower clocked or partially disabled, as with the HD 7950. The slightly lower stock performance of its 28nm lineup is part of why AMD don't seem in a massive rush to release the HD 7990: the card is unlikely to be quite as fast as the GTX 690, and so cannot become a 'halo' piece of hardware, shining world-beating light down onto the rest of the line-up. Denied this, the HD 7990 loses a lot of its point and has become less of a focus. Nonetheless, AMD have proven more than adept at dual-GPU cards over the last two generations, and additional competition will certainly be welcome.
With all its pluses, then, the GTX 690 is set to quash a lot of our concerns over dual-GPU cards on release (with driver stability being the main holdout) and provide a reasonable, and damned imposing, alternative to multi-card SLI for enthusiasts. Initially, though, those enthusiasts will need an Ivy Bridge system, a bit of extra cash and more than a bit of luck in finding stock. Atomic is set to thoroughly review the GTX 690, and most likely gather round its green-LED-backlit 'GEFORCE GTX' logo and go 'ooooh, aaaaaaah!', in an upcoming issue.