The card's inability to work is just a problem with transistors often being duds; it rarely has anything to do with the VPU/GPU.
So there isn't much chance of the rendering pipeline being affected to the point where the Radeon pushes out dud pixels.
If that were the case, the graphical glitches would be system-wide, not localised to games.
There is a HUGE double standard when it comes to cards online:
ATI = 'You have a problem? Then the card is a dud, the drivers are too old, or it's user error.'
Geforce = 'You have a problem? Then obviously Geforce cards are crap and you should buy a Radeon.'
Look, I know I'm a Geforce fan, but what I constantly see in benchmarks is very little difference between the cards; sometimes Radeon is better, sometimes Geforce is...
From an artist's POV the picture quality on the Geforce (without FSAA) is far superior; you cannot pit the FSAA implementations against each other because each is better at different tasks.
Driver-wise, however, a Radeon you must constantly keep updating just to make sure games work how they should; a Geforce, on the other hand, will work with anything, and all the driver updates do is improve quality and speed.
That IS how it should be. A driver isn't something for your manufacturer to hack and cheat their way to the top with, but something which provides a UNIVERSALLY stable platform which all games can use.
What really pisses me off about the double-standard issue is the whole 3DMark '03 thing.
NVIDIA were 'caught' tweaking their drivers; the changes were purely superficial graphical-quality drops (which were actually only aimed at the 5800), and as a result everyone has pinned them as cheaters.
Thing is, though, ATI have not only been caught doing the same bloody thing, but they've also been caught using their own version of DirectX which optimises the pipeline for their cards; specifically, it lowers quality in certain parts of the pipeline so that what would've taken 128-bit operations fits into 96-bit operations... the spare 32 bits are then used to catch stack overflows, letting them squeeze an ounce more speed out (there's a rough sketch of what 128-bit vs 96-bit means a little further down).
Sure, you could deem this ingenious, if it weren't against the agreement you sign in order to develop drivers. Developers are given fully open versions of these APIs, so they have the access to alter them, but they're not allowed to.
Yet when this was discovered during the whole Half-Life 2 *leak*, everyone simply dismissed it or claimed that ATI were doing exactly the right thing.
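To put the 128-bit vs 96-bit thing in slightly more concrete terms, here's a rough little C sketch I knocked together. It's purely my own toy illustration, nothing to do with either company's actual driver code: 128-bit colour just means four channels at FP32, 96-bit means four channels at FP24, and FP24 only keeps a 16-bit mantissa, so you can roughly fake it by chopping the low mantissa bits off a normal float.

[code]
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Crude FP24 imitation: the 96-bit format is 4 channels x 24 bits
 * (1 sign, 7 exponent, 16 mantissa), versus 4 x FP32 for 128-bit.
 * I only chop the mantissa from 23 bits down to 16 and ignore the
 * smaller exponent range, so this under-estimates the real loss. */
static float to_fp24ish(float x)
{
    uint32_t bits;
    memcpy(&bits, &x, sizeof bits);
    bits &= ~0x7Fu;                 /* zero the 7 lowest mantissa bits */
    memcpy(&x, &bits, sizeof x);
    return x;
}

int main(void)
{
    /* Made-up shader-style maths: a long chain of adds, done once at
     * full precision and once with every intermediate result squashed
     * to FP24-ish precision. */
    float full = 0.0f, reduced = 0.0f;
    for (int i = 1; i <= 1000; i++) {
        float term = 0.001f * (float)i;
        full    += term;
        reduced  = to_fp24ish(reduced + to_fp24ish(term));
    }
    printf("full FP32 chain   : %.6f\n", full);
    printf("FP24-ish chain    : %.6f\n", reduced);
    printf("accumulated error : %.6f\n", full - reduced);
    return 0;
}
[/code]

The point being: the reduced-precision chain drifts away from the full-precision one, which is exactly the sort of quality you trade away to get the extra speed.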
What peeves me off about the Geforce series is that the drivers for it are STILL like 'lite' versions. Many features of the cards have STILL to be exposed, and there is also a weird banding issue with colour blending: although the card uses 128-bit colour, for some reason the blending is only being calculated in a 16-bit colour space on the integer unit rather than the floating-point unit (there's a quick toy example of why that shows up as visible banding at the end of this post).

Technically the FX series on paper can outperform the Radeon; hell, even in OpenGL they're capable of pushing the envelope a lot further, as the NV30 (5800) is capable of keeping up with the R350 (9800); the same cards that show a 1/2 to 2/3 speed difference in DirectX.
And that's assuming the speed difference is even big enough to matter at the top end; the 9800 vs the 5950 is pretty even. That gap closes with each driver release too, as the main speed difference between the cards is Pixel Shader 2.0 speed, which the drivers are improving DRAMATICALLY at the moment.
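And since I mentioned banding above, here's what I mean, again just a toy C example of my own rather than anything from the actual drivers: blend a black-to-white ramp once through a floating-point path and once squashed into a 16-bit 565 colour, then count how many distinct steps each produces. 32 steps across the screen instead of 256 is exactly the kind of banding you can see with your own eyes.

[code]
#include <stdio.h>

int main(void)
{
    /* Black-to-white ramp across 256 pixels.  The "float path" ends
     * up as an 8-bit channel value; the "16-bit path" squashes the
     * same ramp into the 5-bit red channel of an R5G6B5 colour. */
    int steps_float = 0, steps_565 = 0;
    int last8 = -1, last5 = -1;

    for (int x = 0; x < 256; x++) {
        float t = (float)x / 255.0f;            /* blend factor 0..1 */

        int r8 = (int)(t * 255.0f + 0.5f);      /* float blend -> 8-bit out */
        int r5 = (int)(t * 31.0f  + 0.5f);      /* 16-bit 565 red channel   */

        if (r8 != last8) { steps_float++; last8 = r8; }
        if (r5 != last5) { steps_565++;   last5 = r5; }
    }

    printf("distinct levels, float path : %d\n", steps_float);  /* 256 */
    printf("distinct levels, 565 path   : %d\n", steps_565);    /* 32  */
    return 0;
}
[/code]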
AthlonXP 2500+ | 256MB DDR PC2700 | GeForce FX 5200 44.04 | DirectX 9.0 | Audigy2 | Crystal Clean OS