Actually it's already in the drivers apparently. Some site shows you how to unlock the 'temporal anti-aliasing' option ... however no one is willing to show the results of it.
I know ATi aren't going to shy away from being the fastest; that said, the cards this time around are very evenly matched,
especially when you consider the fact that both ATi cards are running 75-120MHz faster than the 6800 Ultra.
With the Pro able to be overclocked to roughly the same clocks as the XT, this would suggest these cards have a very visible limit.
GeForce reference boards are known to be tame compared to retail models, so that's another reason to keep it calm atm.
We're also looking at the future with NVIDIA. Shader 4.0 is at least a year off, as DirectX 10 is being released with that new model and is going to arrive initially as part of Windows 6.0 (the official new name is being announced tomorrow, apparently), so until then NVIDIA lead the market on technology...
Right now that doesn't matter greatly as everything is 1.1 and 2.0 based; that said, developers are no doubt going to patch their games, because 3.0 does offer greater performance as well as a more flexible pipeline.
Far Cry already has the patch ready for the DX9.0c release; although it's unclear what else will feature these extensions, E3 is just around the corner.
We're going to know very, very soon, and my money is on HL2's reworked shader engine suddenly sprouting 3.0; same goes for Halo 2.
Really the only thing truly dragging the GeForce's performance right now is its anti-aliasing: only 2x/2xQ/4x are native, the rest are pure software, and you can see it in the huge performance drop. There is a new technology I've seen them implement into the GeForce range which seems to mimic analog scanline signals; really, how they're doing it right now is just far too costly.
I don't think people quite understand that for the Radeons it is a post-process thing, whereas the GeForce triples the screen size and averages down.
Which means when you're playing at 640x480 w/4x AA you're really playing at 1920x1440 and it's being scaled down. As the GeForce can't maintain this every frame, it only samples every Nth frame, and a loss in quality occurs.
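To make the "triple the screen size and average down" idea concrete, here's a minimal sketch in C++ of that downsampling step: a plain 3x3 box filter over a buffer rendered at three times the target resolution. This is only an illustration of the averaging, not how the hardware actually implements its AA.

```cpp
#include <cstdint>
#include <vector>

// Minimal sketch of "render big, average down" supersampling.
// Assumes the high-res buffer is exactly 3x the target size in each
// dimension (e.g. 640x480 target rendered at 1920x1440).
struct Image {
    int width, height;
    std::vector<uint8_t> rgb; // 3 bytes per pixel, row-major
};

Image downsample3x(const Image& hi) {
    Image lo{hi.width / 3, hi.height / 3,
             std::vector<uint8_t>(static_cast<size_t>(hi.width / 3) * (hi.height / 3) * 3)};
    for (int y = 0; y < lo.height; ++y) {
        for (int x = 0; x < lo.width; ++x) {
            for (int c = 0; c < 3; ++c) {
                int sum = 0;
                // Average the 3x3 block of high-res samples covering this output pixel.
                for (int sy = 0; sy < 3; ++sy)
                    for (int sx = 0; sx < 3; ++sx)
                        sum += hi.rgb[(((y * 3 + sy) * hi.width) + (x * 3 + sx)) * 3 + c];
                lo.rgb[((y * lo.width) + x) * 3 + c] = static_cast<uint8_t>(sum / 9);
            }
        }
    }
    return lo;
}
```

The cost problem is obvious from the sketch: nine shaded samples get produced for every pixel you actually see, which is why the performance drop is so large.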
What is sad is that the AA factor is what gets counted as image quality; very little is ever taken into account in terms of REAL image quality - as in colour bleeding, colour saturation/hue/lighting, etc...
Personally, another thing I find interesting is how you can now use 64bit image processing on the GeForce. Again, it's not used yet, but it will be, as this is the OpenEXR standard (Industrial Light & Magic) ... this is going to become as standard as EAX is for sound cards. And the difference in quality is just phenomenal over the FX series, which can only process in 8/16bit, and the Radeons, which process at a standard 24bit.
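As a rough illustration of why a 64bit buffer (16-bit float per RGBA channel, the OpenEXR "half" format) matters, here's a tiny sketch; a plain float stands in for the half channel, since the only point being made is that very bright and very dark values survive in a float buffer where an 8-bit channel clamps or crushes them.

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>

// Quantize a linear-light value into one 8-bit channel: anything brighter
// than 1.0 clamps at 255, and tiny values round down to 0.
uint8_t to8bit(float linear) {
    return static_cast<uint8_t>(std::min(linear, 1.0f) * 255.0f + 0.5f);
}

int main() {
    float highlight = 4.0f;   // e.g. a bright specular hit, legal in a float buffer
    float shadow    = 0.001f; // deep shadow detail

    printf("highlight: float keeps %.3f, 8-bit stores %u (clamped)\n",
           highlight, static_cast<unsigned>(to8bit(highlight)));
    printf("shadow:    float keeps %.4f, 8-bit stores %u (crushed)\n",
           shadow, static_cast<unsigned>(to8bit(shadow)));
    return 0;
}
```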
I find it quite amusing that not many Radeon users sit down and question how come Radeon buffers use 24bit colour yet use 32bit depth.
GeForces don't; they use a 24X8 depth format ... but they run on a 32bit pipeline. It's weird but true.
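For anyone curious what a 24X8 depth word means in practice, here's a small sketch assuming the usual interpretation: 24 bits of depth plus 8 unused padding bits packed into one 32-bit slot. Which end of the word the padding sits at is an assumption here, purely for illustration.

```cpp
#include <cstdint>
#include <cstdio>

// Sketch of a D24X8-style depth word: 24 bits of depth packed with 8
// unused (X) bits so each entry still occupies a full 32-bit slot.
// The padding is assumed to be the low byte; real layouts vary.
uint32_t packD24X8(float depth01) {                          // depth in [0,1]
    uint32_t d = static_cast<uint32_t>(depth01 * 0xFFFFFF);  // 24-bit integer depth
    return (d << 8) | 0x00;                                  // low 8 bits unused
}

float unpackD24X8(uint32_t word) {
    return static_cast<float>(word >> 8) / static_cast<float>(0xFFFFFF);
}

int main() {
    uint32_t w = packD24X8(0.5f);
    printf("packed word: 0x%08X, depth back out: %f\n",
           static_cast<unsigned>(w), unpackD24X8(w));
    return 0;
}
```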
AthlonXP 2500+ | 256MB DDR PC2700 | GeForce FX 5200 44.04 | DirectX 9.0 | Audigy2 | Crystal Clean OS