Nvidia smashed it out of the park with its GeForce GTX 680, but that clearly isn’t enough graphical grunt for the green team. In a surprise unveiling at the Nvidia Game Festival 2012 in Shanghai, CEO Jen-Hsun Huang gave us its dual-GPU beast, the GeForce GTX 690.
The formula for building one of these super-powerful cards is surprisingly simple. Two of Nvidia's GTX 680 cores are soldered onto a 280mm-long piece of PCB, with a small bridge chip between the two that allows them to work together.
As is usual in dual-GPU cards, the cores have been clocked down a little, with the 1,006MHz stock speed of the GTX 680 now standing at 915MHz. Nvidia's GPU Boost technology remains intact, though, so that core clock will adjust up or down depending on how much work the GPUs are doing. Adjustments are made every millisecond, and the GTX 690's 915MHz core will hit a peak of 967MHz when it's at maximum load.
Aside from the clock drop, little of the GTX 680 has been changed. Each core is still accompanied by 2GB of 6,008MHz GDDR5 RAM, for a total of 4GB across the board. There have been no architectural changes, either, with the eight huge clusters serving each core still packing 192 stream processors each. Across the entire GTX 690, that means there are a mighty 3,072 stream processors and just over seven billion transistors.
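Those headline totals follow directly from the per-core figures. As a quick back-of-the-envelope check (a sketch using only the numbers quoted above):

```python
# Sanity-check the GTX 690's quoted totals from its per-core specs.
clusters_per_core = 8    # clusters serving each GTX 680-class core
sps_per_cluster = 192    # stream processors per cluster
cores = 2                # two GPUs on the one board

sps_per_core = clusters_per_core * sps_per_cluster  # 1,536 per core
total_sps = sps_per_core * cores                    # 3,072 across the card

ram_per_core_gb = 2
total_ram_gb = ram_per_core_gb * cores              # 4GB across the board

print(total_sps, total_ram_gb)  # → 3072 4
```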
That's a formidable amount of pixel-pushing power and, as expected, it translated to ridiculous benchmark results. At Full HD, the card clearly isn't being pushed: its 73fps in our 1,920 x 1,080 Very High quality benchmark, for instance, isn't that far ahead of the HD 7970's 60fps or the GTX 680's 57fps.
Crank up the resolution and detail, though, and the two cores get to work. Running Crysis at 2,560 x 1,600 saw its score barely drop, to 70fps; the GTX 680 and HD 7970, by way of contrast, both ran through the same test at 42fps.
Performance in other games was just as impressive. Crysis 2 at 1,920 x 1,080 and its Ultra settings - the trickiest the game offers - saw the GTX 690 average 57fps, the same score as a single GTX 680, but the HD 7970 fell behind with a 36fps average. Again, though, upping the resolution put clear air between the cards: the GTX 690 averaged 55fps, and its single-core stablemate scored 33fps at 2,560 x 1,600.
DiRT 3 and Just Cause 2 didn't provide much of a challenge to the GTX 690. In the former the card averaged 122fps at 2,560 x 1,600 and Ultra quality settings, and in the latter it brushed aside high resolutions and quality settings with disdain. With everything turned up and at 2,560 x 1,600, the card averaged 128fps.
The only way to see what the GTX 690 can really do, though, is to run benchmarks across three screens, and at the huge resolution of 5,760 x 1,080. After a lot of fiddling we managed to get it up and running, and the results were impressive: 108fps in DiRT 3, 83fps in Skyrim, and 56fps in Battlefield 3, all with 4x anti-aliasing enabled, blowing every other card out of the water.
Nvidia has made much of the GTX 690's frugality when it comes to power, noise and heat and, while it's certainly at the high end of every scale - it requires two eight-pin power sockets, after all - it performed relatively well in our tests. At idle and full load, the power consumption of our test rig came in at 96W and 373W - not that much more than that of the GTX 680. And its thermal properties were even more eye-opening: under full load the temperature topped out at only 65°C.
Those decent results can be attributed to the card's pair of vapour chamber coolers, both of which are visible through one of the GTX 690's unique features - two polycarbonate windows. The rest of the card has an air of luxury about it, too: it's made from cast aluminium and protected with a hi-tech chromium coating, the fan housings are constructed from injection-moulded magnesium alloy, and the GTX 690 logo on the side of the board is laser-etched plastic with LED backlights.
Of course, given the luxury feel and stupendous levels of speed, you'd expect the GTX 690 to come with a ludicrous price - and at £840 you'd be right. That's a ridiculous amount to spend on a single component, and we can't possibly recommend you do so unless you're planning to game on three screens at the highest quality levels. That's the only place this card makes sense, but as an exercise designed to further cement Nvidia's status as top dog in graphics, it certainly does the trick. Your move, AMD.
Author: Mike Jennings
PC Pro