The launch of AMD’s Fury X cards has been rather tumultuous. There were rumours of selective sample send-outs, issues with pre-release cooler pump whine and the fact that, while powerful, the cards didn’t look likely to take Nvidia’s performance crown. However, while all of that may be true at more traditional HD resolutions, scale things up to three 4K displays with two Fury X cards in CrossFire and it’s a whole different story.
Although each card is ‘limited’ to just 4GB of RAM (compared with the Titan X’s 12GB of GDDR5), in many of the games tested by TweakTown the Fury X cards came out on top. While Tomb Raider and Metro: Last Light ran better on Nvidia hardware, AMD’s new water-cooled GPUs took home the gold in Battlefield 4, Shadow of Mordor and BioShock Infinite.
In each instance, the games were running at mostly high settings, with the resolution set to a ridiculous 11,520 x 2,160. The fact that any of these cards was delivering playable frame rates at such a resolution is impressive, but it’s even more so when you factor in the architectural differences between AMD and Nvidia hardware.
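To put that resolution in context, here’s a quick back-of-the-envelope sketch (our own arithmetic, not figures from the benchmarks themselves) of the pixel counts involved in driving three 4K panels side by side:

```python
# Illustrative pixel arithmetic for a triple-4K Eyefinity/Surround setup.
# These numbers are derived from the panel resolutions, not from benchmark data.
width_4k, height_4k = 3840, 2160            # a single 4K UHD panel
panels = 3                                  # three displays side by side

surround_width = width_4k * panels          # 11,520
surround_height = height_4k                 # 2,160

pixels_single_4k = width_4k * height_4k              # ~8.3 million
pixels_surround = surround_width * surround_height   # ~24.9 million
pixels_1080p = 1920 * 1080                            # ~2.1 million

print(f"Surround resolution: {surround_width} x {surround_height}")
print(f"Pixels per frame: {pixels_surround:,}")
print(f"That's {pixels_surround / pixels_single_4k:.0f}x a single 4K panel "
      f"and {pixels_surround / pixels_1080p:.0f}x a 1080p display.")
```

At roughly 25 million pixels, every frame at this resolution is twelve times the work of a 1080p frame, which goes some way to explaining why even these flagship cards are stretched.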
It is worth pointing out that while the average frame rates of the Fury X CrossFire setup were often higher than those of the SLI Titan X and 980 Ti cards, more often than not the minimum frame rates of those latter cards were higher than AMD’s. Of course, a consistently playable frame rate is what ultimately matters, and neither company seems to have a catch-all solution for playing at resolutions that are still many years away for mainstream gamers.
What this does at least do is give us hope that, with future generations, driver updates and cooperation with developers, AMD can remain competitive in the desktop GPU game. Because if it can’t, Nvidia will have much less incentive to develop industry-leading products.