Conclusions
At this point in the review, Nvidia's marketing department would no doubt like me to say a few words about some of its key points of emphasis of late, such as PhysX, CUDA, and GeForce 3D Vision. I will say a few words, but perhaps not the words it might wish to hear.
CUDA is Nvidia's umbrella term for accelerating non-graphics applications on the GPU, about which we've heard much lately. ATI Stream is AMD's term for the same thing, and although we've heard less about it, it is very similar in nature and capability, as are the underlying graphics chips. In both cases, the first consumer-level applications for these technologies are only beginning to arrive, and they're mostly video encoders that face some daunting file format limitations. Both efforts show some promise, but I expect that if they are to succeed, they must succeed together by running the same programs via a common programming interface. In other words, I wouldn't buy one brand of GPU over the other expecting big advantages in the realm of GPU-compute capability—especially with a GPU as old as the G92 in the mix.
One exception to this rule may be PhysX, which is wholly owned by Nvidia and supported in games like Mirror's Edge and... well, let me get back to you on that. I suspect PhysX might offer Nvidia something of an incremental visual or performance advantage in certain upcoming games, just as DirectX 10.1 might for AMD in certain others.
As for GeForce 3D Vision, the GeForce GTS 250 is purportedly compatible with it, but based on my experience, I would strongly recommend getting a much more powerful graphics card (or two) for use with this stereoscopic display scheme. The performance hit would easily swallow up all the GTS 250 has to give—and then some.
The cold reality here is that, for most intents and purposes, current GeForces and Radeons are more or less functionally equivalent, with very similar image quality and capability, in spite of their sheer complexity and rich feature sets. I would gladly trade any of Nvidia's so-called "graphics plus" features for a substantial edge in graphics image quality or performance. The GTS 250 comes perilously close to losing out on this front due to the Radeon HD 4850's superior performance with 8X multisampled antialiasing. The saving grace here is my affection for Nvidia's coverage sampled AA modes, which deliver image quality comparable to 8X multisampling at a much smaller performance cost.
All of which leads us to the inevitable price and performance comparison, and here, the GeForce GTS 250 offers a reasonably compelling proposition. This card replicates the functionality of the GeForce 9800 GTX+ in a smaller physical size, with substantially less power draw, at a lower cost. I like the move to 1GB of RAM, if only for the sake of future-proofing and keeping the door open to an SLI upgrade that scales well. In the majority of our tests, the GeForce GTS 250 proved faster than the Radeon HD 4850 1GB, if only slightly so. If the Radeon HD 4850 1GB were to stay at current prices, the GTS 250 would be the clear value leader in this segment.
That's apparently not going to happen, though. At the eleventh hour before publication of this review, AMD informed us of its intention to drop prices on several Radeon HD 4800 series graphics cards that compete in this general price range, mostly through a series of mail-in rebates. Some examples: this MSI 4850 512MB card starts at $159.99 and drops to $124.99 net via a mail-in rebate, and more intriguingly, this PowerColor 4870 512MB lists for $169.99 and has a $15 rebate attached, taking it to net price parity with the EVGA GTS 250 Superclocked card we tested. We hate mail-in rebates with a passion that burns eternal, but if these rebates were to last in perpetuity, the GeForce GTS 250 1GB at $149 would nevertheless be doomed.
For its part, Nvidia has indicated to us its resolve to be the price-performance leader in this portion of the market, so it may make an additional price move if necessary to defend its turf. Nvidia has limitations here that AMD doesn't face, though, mainly because it doesn't have the option of simply switching to GDDR5 memory to get twice the bandwidth. That is, after all, the only major difference between the Radeon HD 4850 and 4870. On the merits of its current GPU technology, AMD would seem to have the stronger hand.
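To put a rough number on that claim, and assuming the reference memory specs are what I believe them to be (roughly 2 GT/s GDDR3 on the Radeon HD 4850 versus 3.6 GT/s GDDR5 on the 4870, both on 256-bit buses), the peak bandwidth math is simple:

$$
\begin{aligned}
\text{HD 4850:}\quad & \frac{1.986\ \text{GT/s} \times 256\ \text{bits}}{8} \approx 63.6\ \text{GB/s} \\
\text{HD 4870:}\quad & \frac{3.6\ \text{GT/s} \times 256\ \text{bits}}{8} \approx 115.2\ \text{GB/s}
\end{aligned}
$$

Not quite double, but near enough to make the point, and not a gap Nvidia can close with GDDR3 on the same 256-bit bus.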
What happens next is anybody's guess. Just so long as Nvidia doesn't rename this thing, I'll be happy. There are GPU bargains ahead. TR