What o_0 says is true...
NVIDIA are decent in DirectX and outstanding in OpenGL
ATI are outstanding in DirectX and decent in OpenGL
Doom3 is OpenGL, Half-Life 2 is DirectX
I can say from personal experience that Doom3, although it looks like a very demanding game... it isn't.
On its lowest settings (640x480x16bpp) you can get almost identical graphics to the X-Box, on a:
Pentium 3 600MHz / Geforce3 Ti 200 / 128MB Ram
The only thing the more powerful cards give you is better colour precision and higher resolutions, as Carmack has now edited the engine so both the ATI and NVIDIA cards use the generic ARBVP path rather than their vendor-optimised paths; it means there is now no quality difference between what the cards render.
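If you're wondering how an engine "decides" which path to use, it really is just extension checking at startup. Here's a minimal sketch of the idea (my own illustration, not id's actual code -- the extension names are just the obvious candidates):

Code:
// Minimal sketch, not id's actual code. On Windows include <windows.h>
// before <GL/gl.h>; a current GL context must already exist.
#include <cstring>
#include <GL/gl.h>

// Returns true if the space-separated extension string contains 'name'.
static bool HasExtension(const char* name)
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext != 0 && std::strstr(ext, name) != 0;
}

// Pick a render path. Forcing the generic ARB path for every vendor means
// ATI and NVIDIA run identical shader code, so the output looks the same.
const char* ChooseRenderPath()
{
    if (HasExtension("GL_ARB_vertex_program") &&
        HasExtension("GL_ARB_fragment_program"))
        return "generic ARB path";
    if (HasExtension("GL_NV_fragment_program"))
        return "NVIDIA-specific path";
    if (HasExtension("GL_ATI_fragment_shader"))
        return "ATI-specific path";
    return "basic fixed-function path";
}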
Although the R300 (Radeon 9700 Pro) can keep up with the NV30 (Geforce 5800 Ultra), you'll notice that in DirectX it is usually an entirely different story.
I would say get a card that is at the max of your budget:
Budget Card : Geforce 5600 Ultra (£30-40)
Mid-Range Card : Geforce 5700 Ultra (£80-90)
High-End Card : Radeon 9800 XT (£150-180)
Budget-wise the 5200 will outperform the 9200SE/9200 and the 5200 Ultra will outperform the 9200pro.
As most places will sell the 5600 for around the same price, and that pushes the speed even further, it is worth the investment; however ONLY the Ultra version, as the 5200 Ultra actually outperforms the standard 5600.
Mid-Range: Geforce 5700 Ultra. To me this is quite simply the best card on the market right now because the price:power ratio is just incredible. The card can narrowly outperform the 5900 Standard, waltzes over the Radeon 9600/9600 Pro, and more often than not outperforms the 9600 XT.
High End, however, is Radeon country until the 6800 hits the market.
Is the X800pro / XT worth the extra £100-150 over the 9800pro... well they cost double and give you double the performance, so really that's up to you. IMO there are no games that really need them, and to take any real advantage of the VPU you are going to need a pretty powerful processor.
However if you wait a few months, both cards will come in a few more (and more affordable) flavours, so you won't have to break the bank just to get decent performance.
My advice? Buy yourself a Budget/Mid-Range card, which will run almost every game right now at max graphics, and upgrade your processor to make sure it does.
If your processor is under 1.5GHz, then what is more likely slowing your game down is the processor, not whether the card is a Geforce FX or a Radeon 9-series.
DirectX relies heavily on a quick processor, as do the Radeons (which is why I don't recommend them unless you have a high-end computer). Furthermore, Radeons prefer AMD and Geforces prefer Pentium; not sure why, but you'll see noticeable performance jumps between the two.
Finally, on to the Half-Life 2 performance argument.
Now it is believed that the Geforces' performance in DirectX is much poorer than the Radeons'... to an extent this is true because of the geometry pipelines, as the Radeons are built to push pure polygons.
However, contrary to what Valve are putting out about the HL2 engine, GeforceFX cards DO NOT perform as badly under DirectX as the HL2 benchmarks suggest...
In fact they run Halo and Final Fantasy XI (without driver AA) with a noticeable performance increase over the Radeons.
You will also hear a lot about driver tampering from both sides.
I've recently had a chance to talk to an actual NVIDIA driver engineer on this very matter, and he said:
Quote: "Due to the way the application uses render targets, forcing AA in the control panel causes our driver to allocate 23MB of unused Z buffer at 1600x1200, 4xAA. This causes textures to fall out of video memory (into AGP), causing a significant performance hit.
Our driver does this because multisampled color buffers require multisampled Z buffers to be used with them, but DirectX allows any Z buffer to be used with any color buffer, so we have to be conservative and allocate space for a full multisample Z buffer, even if the application only uses it for aliased rendering.
When AA is enabled in the application, the application can be much more intelligent about allocating multisampled vs aliased render targets & Z buffers than our driver, and not suffer this drop.
Because we do not have a clean way of solving this buffer allocation problem (aside from performing app detection in the driver, and working around the problem there), we added a Far Cry game profile that forces FarCry.exe to ignore control panel AA (specifying AA in the application works just fine)."
And yeah, this is actually the only current 'cheat' NVIDIA is being accused of right now.
To me that seems like a pretty sound explanation... and it is funny how it differs from what these so-called reviewers are finding out.
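For anyone curious what that buffer allocation actually looks like from the application's side, here is a rough Direct3D 9 sketch (my own illustration, not Far Cry's or NVIDIA's code). When the app enables AA itself, it can give a multisampled Z buffer only to the targets that need one; a driver forcing AA from the control panel can't know that and has to assume the worst:

Code:
#include <d3d9.h>

// Rough illustration only. When the *application* controls AA, it can
// pair each render target with exactly the kind of Z buffer it needs.
HRESULT CreateAppControlledTargets(IDirect3DDevice9* dev, UINT w, UINT h,
                                   IDirect3DSurface9** colorMS,
                                   IDirect3DSurface9** depthMS,
                                   IDirect3DSurface9** colorAliased,
                                   IDirect3DSurface9** depthAliased)
{
    // Multisampled colour buffer + matching multisampled Z buffer,
    // but ONLY for the targets that actually get antialiased rendering.
    HRESULT hr = dev->CreateRenderTarget(w, h, D3DFMT_A8R8G8B8,
                                         D3DMULTISAMPLE_4_SAMPLES, 0,
                                         FALSE, colorMS, NULL);
    if (FAILED(hr)) return hr;
    hr = dev->CreateDepthStencilSurface(w, h, D3DFMT_D24S8,
                                        D3DMULTISAMPLE_4_SAMPLES, 0,
                                        TRUE, depthMS, NULL);
    if (FAILED(hr)) return hr;

    // Aliased helper targets (e.g. for post-processing) get a cheap,
    // non-multisampled Z buffer -- no wasted video memory.
    hr = dev->CreateRenderTarget(w, h, D3DFMT_A8R8G8B8,
                                 D3DMULTISAMPLE_NONE, 0,
                                 FALSE, colorAliased, NULL);
    if (FAILED(hr)) return hr;
    return dev->CreateDepthStencilSurface(w, h, D3DFMT_D24S8,
                                          D3DMULTISAMPLE_NONE, 0,
                                          TRUE, depthAliased, NULL);
}

// A driver forcing AA from the control panel can't see this intent:
// DirectX allows any Z buffer to be bound with any colour buffer, so the
// driver must conservatively allocate a full multisampled Z buffer for
// every render target -- the unused memory the engineer describes.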
AthlonXP 2500+ | 256MB DDR PC2700 | GeForce FX 5200 44.04 | DirectX 9.0 | Audigy2 | Crystal Clean OS