AMD does battle on two fronts.
AMD has really done a number on Intel with its Ryzen architecture, but the company isn’t willing to stop there. It is now taking on the might of Nvidia with the new Vega RX series of Radeon graphics cards, waging war on two fronts at once. With promises of GTX 1080-beating performance at better pricing, the Vega RX series is arguably AMD’s most relevant product for gamers, even more so than Ryzen. Let’s see if it can live up to the hype.
There are several versions of Vega RX launching: the Vega RX 56, the air-cooled Vega RX 64, and the liquid-cooled Vega RX 64. We were sent the air-cooled Vega RX 64 for testing.
Both Vega RX 64 chips have 4096 stream processors arranged into 64 Next Generation Compute Units (NCUs), 256 texture units, 64 ROPs and 8GB of HBM 2.0 memory. The Vega RX 56 cuts this back to 3584 stream processors (56 NCUs) and 224 texture units, keeping 64 ROPs and the same 8GB of HBM 2.0 memory. However, the Vega RX 64 has faster 1.89Gbps memory, while the Vega RX 56 drops this back to 1.6Gbps. Thanks to the use of HBM 2.0 memory, the bus width is absolutely massive, at 2048-bits. All three chips are built on the 14nm FinFET process, with a stupendous 12.5 billion transistors in total.
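For the curious, peak memory bandwidth falls straight out of those figures: Vega’s HBM 2.0 uses two stacks with a 1024-bit interface each, for 2048 bits in total. A quick back-of-the-envelope sketch:

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits-per-byte
def peak_bandwidth_gbs(bus_width_bits, pin_rate_gbps):
    """Return theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits * pin_rate_gbps / 8

vega_64 = peak_bandwidth_gbs(2048, 1.89)  # Vega RX 64 at 1.89Gbps
vega_56 = peak_bandwidth_gbs(2048, 1.6)   # Vega RX 56 at 1.6Gbps
print(f"Vega RX 64: {vega_64:.1f} GB/s, Vega RX 56: {vega_56:.1f} GB/s")
```

That works out to roughly 484GB/s for the Vega RX 64 and 410GB/s for the Vega RX 56.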
As for frequencies, the liquid-cooled Vega RX 64 has a base clock of 1406MHz, boosting to 1677MHz. The air-cooled Vega RX 64 has a base clock of 1247MHz, boosting to 1546MHz, while the Vega RX 56 has a base clock of 1156MHz and a boost clock of 1471MHz. Both Vega RX 64 cards are 270mm in length.
When it comes to power requirements, welcome back to the bad old days of Radeons doubling as room heaters. The liquid-cooled Vega RX 64 requires a whopping 345W, while the air-cooled Vega RX 64 drops this to 295W. Finally, the Vega RX 56 requires 210W. AMD has stuck with a blower cooler on the non-liquid cards, and it has a lot of work to do. As a result, it is rather loud, hitting 56dB during our game testing, so you’ll definitely hear it when it’s in your case. Unsurprisingly, all cards have twin 8-pin power plugs to supply all that power. Above these is the GPU Tach, a row of eight small LEDs that light up to show just how hard the card is working. While Vega RX supports CrossFire, AMD isn’t really pushing the feature. Besides, just think of the PSU you’d need to run two of these cards; at least 1200W to leave comfortable headroom for the rest of your gear. If you do go with CrossFire, though, like previous cards there’s no need for an inter-card bridge.
At the heart of every video card are its shader cores, which Vega groups into Next Generation Compute Units (NCUs). This latest design is the biggest change since the launch of AMD’s now-ageing GCN architecture. Each NCU can now handle a pair of FP16 operations inside a single FP32 ALU, which will likely make it rather attractive to cryptocurrency miners. Provided the instructions are identical, this allows FP16 throughput to double.
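To picture how that packing works, here’s a toy Python sketch (our own illustration, not AMD code): two FP16 values share one 32-bit word, so a single ‘instruction’ operates on both lanes at once.

```python
import struct

def pack_fp16_pair(a, b):
    """Pack two FP16 values into one 32-bit word, as a packed-math ALU would."""
    return int.from_bytes(struct.pack('<ee', a, b), 'little')

def unpack_fp16_pair(word):
    """Split a 32-bit word back into its two FP16 lanes."""
    return struct.unpack('<ee', word.to_bytes(4, 'little'))

def packed_add(x, y):
    """One 'instruction' that adds both FP16 lanes at once (simulated lane-wise)."""
    a1, b1 = unpack_fp16_pair(x)
    a2, b2 = unpack_fp16_pair(y)
    return pack_fp16_pair(a1 + a2, b1 + b2)

w = packed_add(pack_fp16_pair(1.5, 2.5), pack_fp16_pair(0.5, 0.25))
print(unpack_fp16_pair(w))  # (2.0, 2.75)
```

One add has done the work of two, which is exactly where the doubled FP16 rate comes from.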
Vega also has a new memory controller called the High Bandwidth Cache Controller (HBCC), which lets it more rapidly access data that isn’t loaded into the chip’s HBM 2.0 memory. For example, AMD demoed The Witcher 3 allocating three times more memory than the amount of VRAM actually in use on the GPU. The HBCC helps speed this up, treating part of the chip’s local memory as a cache and allowing the GPU to work with data sets up to 512TB in size. The newer HBM 2.0 memory also roughly doubles the per-pin bandwidth of the original HBM, and comes in much larger sizes, up to 32GB, whereas the original was limited to just 4GB.
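The idea is essentially virtual memory for the GPU. As a rough illustration (our own toy model, not AMD’s implementation), local HBM acts as a cache of pages in front of a far larger backing store, with misses fetched on demand and the least-recently-used page evicted:

```python
from collections import OrderedDict

class HBCCSketch:
    """Toy model of the HBCC idea: local HBM holds a subset of pages;
    misses fetch from a larger backing pool, evicting the LRU page."""
    def __init__(self, local_pages):
        self.capacity = local_pages
        self.resident = OrderedDict()  # page id -> data, in LRU order
        self.misses = 0

    def access(self, page):
        if page in self.resident:
            self.resident.move_to_end(page)        # mark as recently used
        else:
            self.misses += 1                       # fetch from system memory
            if len(self.resident) >= self.capacity:
                self.resident.popitem(last=False)  # evict least-recently-used
            self.resident[page] = f"data-{page}"
        return self.resident[page]

hbcc = HBCCSketch(local_pages=2)
for p in [0, 1, 0, 2, 0, 1]:
    hbcc.access(p)
print(hbcc.misses)  # 4 of the 6 accesses had to go out to the backing store
```

The win, as in the Witcher 3 demo, is that a game can over-allocate well beyond physical VRAM and only pay for the pages it actually touches.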
Today’s game developers are increasingly using compute shaders for geometry processing, which wasn’t their original function. Part of the benefit is that the renderer can work out which polygons are actually visible to the gamer. In some games there can be a 100x difference between the number of polygons in a scene and the number actually rendered, because so many are hidden behind others (polygons are referred to as primitives at this stage of the rendering pipeline). To get around this, Vega has a new programmable geometry pipeline. Its primitive shaders have the same access a compute shader would, letting developers coordinate how work is brought into the shader, which should cull those unnecessary primitives at a much faster rate.
Also helping in this regard is a new ‘draw-stream binning rasterizer’, or DSBR for short. It uses a tile-based approach to shade pixels more efficiently, with a focus on scenes with complex depth buffers. The rasterizer fetches each overlapping primitive only once, and shades each pixel only once.
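A toy model of the binning step (our own sketch, not AMD’s actual rasterizer) shows the idea: each primitive is assigned once to every screen tile its bounding box touches, so each tile’s list can then be processed in a single pass:

```python
def bin_triangles(triangles, screen_w, screen_h, tile=32):
    """Assign each triangle to the screen tiles its bounding box overlaps."""
    bins = {}
    for tri_id, verts in enumerate(triangles):
        xs = [v[0] for v in verts]
        ys = [v[1] for v in verts]
        # Clamp the bounding box to the screen, then convert to tile coords
        x0, x1 = max(0, min(xs)) // tile, min(screen_w - 1, max(xs)) // tile
        y0, y1 = max(0, min(ys)) // tile, min(screen_h - 1, max(ys)) // tile
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins.setdefault((tx, ty), []).append(tri_id)
    return bins

tris = [[(5, 5), (20, 8), (10, 25)],      # fits inside tile (0, 0)
        [(30, 30), (70, 35), (50, 60)]]   # spans several tiles
bins = bin_triangles(tris, 128, 128)
print(sorted(bins))  # six tiles touched; tile (0, 0) holds both triangles
```

Working tile by tile keeps the relevant depth and colour data on-chip, which is where the efficiency gain comes from.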
Vega is the first chip from AMD that supports ‘packed math operations’. These are useful in certain workloads such as deep learning, which don’t need the full 32 bits that GPUs use for single-precision data types. This is part of the NCU design, and allows Vega to achieve up to 512 eight-bit ops per clock, 256 16-bit ops per clock or 128 32-bit ops per clock. This should make Vega a faster performer in productivity apps that rely on packed maths operations, though we only tested game performance.
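Assuming those per-clock figures are per NCU (and there are 64 NCUs on the Vega RX 64), peak throughput is simple arithmetic. A quick sketch using the air-cooled card’s 1546MHz boost clock:

```python
def peak_tflops(ncus, ops_per_clock_per_ncu, boost_mhz):
    """Theoretical peak throughput in TFLOPS from per-NCU rates and clock speed."""
    return ncus * ops_per_clock_per_ncu * boost_mhz * 1e6 / 1e12

fp32 = peak_tflops(64, 128, 1546)  # single precision
fp16 = peak_tflops(64, 256, 1546)  # doubled via packed math
print(f"FP32: {fp32:.1f} TFLOPS, FP16: {fp16:.1f} TFLOPS")
```

That gives roughly 12.7 TFLOPS of single-precision grunt, doubling to about 25.3 TFLOPS for packed FP16 work.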
When it comes to outputs, Vega is pretty standard by today’s measures. Our AMD reference card had three DisplayPort 1.4 ports and a single HDMI 2.0b output, allowing for 4K at 60Hz. AMD’s FreeSync technology is also supported, which is great to see considering there are now plenty of FreeSync displays on the market. This has been updated to FreeSync 2, which brings High Dynamic Range (HDR) to the table, along with lower output latency: colours are tone mapped directly to the native colour space of a FreeSync 2 monitor, rather than via the two-step process currently used.
As we went to print, the Vega RX 64 we were supplied was retailing for a massive $899, putting it up against the GeForce GTX 1080 Ti, which can now be bought starting at $1,020, roughly $120 more. We spoke to AMD about this, as the street price should be $699, and it promised the inflated pricing was due to a limited first delivery, with Vega RX 64 prices set to hit $699 by the time this magazine reaches shelves. That puts it in competition with the GeForce GTX 1080, which is currently sitting around the $749 mark. The liquid-cooled Vega RX 64 is currently retailing for $1,099, so we look forward to testing that directly against the 1080 Ti. Thankfully the Vega RX architecture doesn’t appear to be especially good at cryptocurrency mining, as most mining software is tuned for the Polaris architecture, so we shouldn’t see a shortage of products thanks to miners buying up stock, unlike the current situation with the RX 570.
AMD has had some major wins in the CPU world of late, but it doesn’t seem that success has been replicated in the GPU market. For starters, we have massively inflated prices at launch, though this will hopefully be rectified as more stock lands in the country. There’s also no denying what a power-hungry chip Vega is, and in this day and age of energy efficiency, that’s not a good thing. It’ll be extremely interesting to see how the Ryzen Mobile APU performs energy-wise, as it uses Vega for its integrated graphics processor. Most disappointing, though, is the performance. Right now, Vega simply doesn’t have quite the speed to beat Nvidia. It’s a much closer race than it has been in the past, but Nvidia’s GTX 1080 still offers superior performance at the same price, without requiring more power than a small suburb to run.
We’re very keen to get our mitts on the water-cooled version, as the increase in frequency isn’t insubstantial, though given the air-cooled Vega 64’s results we doubt it’ll have the oomph to take on the GeForce GTX 1080 Ti. We’re also interested to see what third parties can do with the chip, as the basic blower cooler feels like it could do with some severe improvements, leading to lower fan noise. There is hope for this new design though; a manufacturing process shrink could cut power draw dramatically, which would in turn open the door to higher frequencies. That’s exactly what AMD has in mind with its next architecture, Navi. Expect it to have a similar design to Vega RX, but built on a 7nm node. Sadly it isn’t due until the second half of 2018, so until then we’ve got Vega RX. At this price, we can’t recommend the AMD Radeon Vega RX 64 over the Nvidia GeForce GTX 1080, even when (if) it hits $699, thanks to its lower performance and higher energy needs.
How we tested
We used our usual test bed, based on the Asus Maximus VIII Hero, Core i7-7700K, 16GB of Ballistix DDR4-2666MHz and three SATA 3 SSDs for the OS and benchmarks. To compare performance against other cards, we used the Galax GeForce GTX 1070 Exoc-Snpr White, Aorus Xtreme Edition GTX 1080 and Aorus GeForce GTX 1080 Ti. As all of these cards come factory overclocked, we used MSI’s Afterburner to lower their speeds back to stock.
We used the latest official AMD driver at the time of writing, Radeon Software Crimson ReLive Edition 17.8.1. This gives access to a huge range of settings, including overclocking. We didn’t have time to overclock our card, but other early results suggest around 1.7GHz is achievable. However, doing so pushes the card beyond 400W of power draw, and the resulting heat causes throttling, delivering basically zero performance increase.
Our new suite of benchmarks was used to test performance, and the results weren’t quite as impressive as we’d have liked. In most tests the Vega RX 64 came third, behind the GTX 1080 Ti and GTX 1080. However, it did take a big lead in Rise of the Tomb Raider, even beating the GTX 1080 Ti, and it also beat the 1070 in the 3DMark Fire Strike Extreme test, which runs at 2560 x 1440. Unfortunately we didn’t have time to test DX12 performance, where AMD tends to lead Nvidia, but we’ll be covering DX12 and Vulkan performance in an upcoming article.