
Geek Culture / Graphics card upgrade - advice needed

adr
Joined: 21st May 2003
Location: Job Centre
Posted: 24th Jul 2003 15:56 Edited at: 24th Jul 2003 16:04
It's been a while since I last bought a graphics card and I'm a little out of touch with the standards. I can't really afford a £300 card, so I've set my budget at £100 max. I've done some reading, and I've come up with two contenders:

1. Creative GeForce Ti-4200 64MB 4x AGP : £80 delivered
Pros : fast, beats all other cards of the same price
Cons : 4x AGP. DX8 only. 64MB - faster memory than the 128MB version, but what's the RAM/speed payoff there?

2. Inno3D GeForce FX5200 128MB 8X AGP : £70 delivered
Pros : DX9 support, faster bandwidth (8x AGP), more memory
Cons : slower than the 4200 ... what else?

Dunno which one to get. I don't really need a new card - though I've only just realised that my GeForce 2 GTS can't do all the cool effects that DBP can provide!

I am tempted to go for the 5200FX. Even though it can be significantly slower than the 4200, the FX is a bit more futureproof and a bit cheaper. I don't really play games (last game I played was GTA:VC a coupla months ago) so I guess speed doesn't mean that much to me.

Anyway, your advice and experience is most welcome. I'm open to suggestions about Radeons, although the 9200 (the card in the same price bracket) seems to perform quite poorly against these two.

Thanks

Bender:Blackmail’s such an ugly word. I prefer extortion. The x makes it sound cool.
Karlos
Joined: 18th Nov 2002
Location: United Kingdom
Posted: 24th Jul 2003 16:14
dabs.com do a 64MB GF4 Ti4800SE for £99, and £109 for the 128MB version.

If it ain't broke - try harder.
W2K Pro - Geforce2 MX400 64MB - Athlon 900
lcfcfan
Joined: 19th Feb 2003
Location: North East, UK
Posted: 24th Jul 2003 16:26
Or you could chuck in an extra £30 and get a GeForce FX 5600 256MB DDR 8X AGP card from ebuyer. If you're gonna get a new card, you don't wanna buy something that's already outdated like a GeForce 4. I'm gonna get rid of my Ti4200 and get a new FX 5900, so I'll be able to use all the new shaders then.

Rob K
Retired Moderator
Joined: 10th Sep 2002
Location: Surrey, United Kingdom
Posted: 24th Jul 2003 16:46
I suggest you get a GeForce FX 5600, or a slower 5200 if you can't stretch your budget. The reason is that both are fully DX9 compatible - U5 of DBP will be able to make use of (though not require) fully DX9-compatible cards. I strongly recommend it, as it will be useful for games such as Half-Life 2 as well.

adr
Joined: 21st May 2003
Location: Job Centre
Posted: 24th Jul 2003 16:48
Anyone know about the Radeon 9600?
http://www.ebuyer.com/customer/products/index.html?product_uid=49372

Can't seem to find an English review.

Bender:Blackmail’s such an ugly word. I prefer extortion. The x makes it sound cool.
lcfcfan
Joined: 19th Feb 2003
Location: North East, UK
Posted: 24th Jul 2003 21:15
check out the ebuyer reviews.

Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 24th Jul 2003 21:32
I have a Radeon 9500 and it's AMAZING. I highly recommend it.

I'm not sure what 100 pounds translates into, but I think it's roughly $140, so you could get a Radeon 9500 for that price. Loads better than the GF4 cards.

However, if you can pay $150, you could afford a 256MB Radeon 9600 (whereas the R9500 maxes out at 128MB), and those should be REALLY nice.

--Mouse

Famous Fighting Furball
adr
Joined: 21st May 2003
Location: Job Centre
Posted: 24th Jul 2003 21:51
I'm assuming the 9600 is a bit better than the 9500, and I know it supports DX9 natively. I can't find a decent review of it though (you know, graphs 'n' stuff).

Bender:Blackmail’s such an ugly word. I prefer extortion. The x makes it sound cool.
Preston C
Joined: 16th May 2003
Location: Penn State University Park
Posted: 24th Jul 2003 22:19
Hmm, this post could help me too. If the Radeon 9600 is that cheap for that power, I could try and stretch what I'm willing to spend to $150.

At first glance, I'm a mediocre mech pilot. Look again and you will see my battlemech's computer code rushing through my eyes. My Mech And I Are One!
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 24th Jul 2003 22:28
The 9600 isn't that much better than the 9500 performance-wise, unless you buy a 256MB one of course. I'm really not sure where their differences lie - the 9500 is also DX9 native.

http://altfarm.mediaplex.com/ad/fm/2765-11382-3987-0?mpt=138055&mpvc=

There's a review.

Interestingly enough, it seems to be saying that the 9600 isn't as good as the 9500 unless you get one that simply has more memory (cuz the 9500 maxes out at 128MB). Maybe a 9500 Pro would be a better purchase after all? I'm not sure...

--Mouse

Famous Fighting Furball
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 24th Jul 2003 23:12
Quote: "I am tempted to go for the 5200FX. Even though it can be significantly slower than the 4200"


How exactly do you know this?
The FX5200 is roughly the same speed as the Ti4200, and when using shaders it is blindingly faster. With the DetonatorFX drivers you get a guaranteed 25% speed increase too.

You can pick a 5200 up for £53 at Watford.co.uk,
or if you can stretch to £90 (I think) they do the 5600.

Although, yeah, in standard operation the GF4 Ti & Radeons can outperform the new MX-class FX cards, as soon as you start using shaders they don't even come close speed-wise.

TR: AoD @ 1024x768x32 with FSAA 4x on a Radeon 9800 Pro runs at 100fps, and although it can use some shaders, it can't use all of them.
A GeForce4 Ti4800 runs at 125fps; it too can use some of the shaders but not all of them, though still more than the Radeon.
A GeForceFX 5200 runs at 150fps and uses all the shaders - it's also less buggy. (The same generally goes for other titles: AAO/Unreal2/Vietcong/UT2K3/etc...)

Really, though, you've gotta ask what you are going to be doing with it... I mean, if you want a card that can use a lot of things for a good few years, then get the FX - if not, then on pure stability and speed for current titles I'd recommend the 4Ti.

And don't worry about 100% DirectX 9 compliancy, as this is done at the driver level anyway - even my old GeForce2 MXs are now fully DirectX 9 compliant because I use the DetonatorFX drivers, and anyone who knows anything about how this all works would tell you the same thing. Compliancy is just in the drivers, not the card itself - unless by compliancy people mean 'fully supported by DirectX', which Microsoft has done for the widest variety of cards anyway since DX6. (Sorry, but I think the term 'DirectX 9 compliment' is just stupid - if it isn't a term already, someone can make it... and hardware-wise it's not these cards complimenting DirectX but DirectX complimenting their technology.)

Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 24th Jul 2003 23:22
Raven, that's a very odd thing to say... full DX9 compliance certainly is something to worry about. Cards which don't have it will not be able to utilise DX9 shaders. I remember the huge fuss when Morrowind was released and a lot of users couldn't have the pixel-shaded water because their cards weren't DX8 compliant...
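
For illustration, the compliance being argued about here lives in the hardware caps a game queries at startup, not in the driver version. A minimal sketch against the Direct3D 9 caps API (assuming the DX9 SDK headers and linking d3d9.lib; the ps 2.0 threshold is just an example of the kind of gate a game like Morrowind used for its shader water):

#include <d3d9.h>
#include <cstdio>

int main()
{
    // Create the D3D9 object and ask what the installed card's hardware can do.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { printf("No Direct3D 9 runtime installed\n"); return 1; }

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // These versions come from the hardware caps: a GeForce2-class card
    // reports 0.0 here no matter how new its Detonator drivers are.
    printf("Pixel shader %d.%d, vertex shader %d.%d\n",
           D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
           D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion),
           D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
           D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion));

    // Gate optional DX9 eye-candy the way games (and DBP's U5, per Rob K)
    // do: use the effect if the caps allow it, fall back if not.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        printf("DX9-class (ps 2.0) effects available\n");
    else
        printf("Falling back to a fixed-function / ps 1.x path\n");

    d3d->Release();
    return 0;
}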

--Mouse

Famous Fighting Furball
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 24th Jul 2003 23:41
Yeah, but as ONLY the GeForceFX line actually has 100% shader access in any API, it seems kind of a moot point - especially as nVidia don't want to share Shader 3.x with ATi (and I don't blame them).

So really, try to explain your point there?
There is going to be a gap between those who can use shaders and those who can't. There will also be a gap between GeForceFX users and everyone else, for the same reasons.

WIWMTBP titles are getting more and more common, and although the Radeon and GeForce4 series are capable of a lot of things, they're just not up to the task of complete shader access.
nv30 shaders are likely to remain nVidia's proprietary tech, and everyone is starting to use them - even HL2 and Doom 3 are WIWMTBP titles.
They not only favour GeForce shader cards, they also hardcode a lot of effects directly into the engines for these cards.

No matter how much ATi want to, they'll never have 100% DirectX 9 access in that sense of the word.

Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 24th Jul 2003 23:51
A valid point, I suppose, but I'm sure ATI will develop their own tech to catch up in that respect.

By the way, your spelling and grammar are slipping today.

--Mouse

Famous Fighting Furball
lcfcfan
Joined: 19th Feb 2003
Location: North East, UK
Posted: 25th Jul 2003 01:29
lol, the 5200 is nowhere near as fast as the Ti4200. I have a 4200, and my friend made the mistake of buying a 5200 - it has piss-poor performance, and that's coming from the Ultra version. My card just walks all over his.

Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 25th Jul 2003 01:41
Graham, does he actually have it set up right?
Because I have a 5200 (standard, not Ti) and a 4800 Ti... and when you use shaders the 5200 walks all over the 4800 and the Radeons,
but for normal operations it's around the same speed as a Ti4200.

Just turn on stuff like FSAA: your Ti4200 will take a noticeable performance hit, his 5200 won't even flinch.

The card most definitely outperforms everything else on games made for it; otherwise it's just a standard card.
And remember the original clock speed is 100MHz - if you get hold of the Omega drivers you can actually clock the processor to what it SHOULD be set to (which only a few manufacturers like Creative actually do): 350MHz.

At that speed it runs rings around the competition, and it will actually be performing at the speed it was designed for - especially in AGP 8x sockets.

All these performance hits and slowdowns of the FXs are really just rumours, pushed because a lot of these budget manufacturers also sell Radeon cards - so they underclock the FXs they retail in order to shift their slower and more expensive Radeon stock.

I mean, when the FX was released, nVidia's own site showed them tested with far, far better performance than what is being produced... and nVidia recently created the FX5900 and FX6000 series because across the board it seemed like their FX line was slower.
Only to find out from an independent report that companies were deliberately underclocking them - I mean, christ, an FX5200 MX at 100MHz, when even the most basic cards on the market now run at a 250MHz processor speed.

The DetonatorFX 45.09 drivers should actually resync the GPU clocks again, but it was annoying to hear about when the report was released to developers about three weeks ago.
Especially for me, as I have the design-release versions and was wondering why the hell mine were running so much faster than the OEM versions you get. Sickening if you ask me.

Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 25th Jul 2003 02:09
I personally don't think that is as much of an issue. On mid-range cards, very few games can afford to run with FSAA anyway, even if it's only a mild performance dip. The only game I have found that I can afford FSAA on is Neverwinter Nights (and the Radeon 9500 is really good with that - it was the reason it got such great reviews), which I run with 'Nice 4x Antialiasing' and it looks great. Other games simply can't take the performance hit.

--Mouse

Famous Fighting Furball
Bloodshot
Joined: 7th Apr 2003
Location: United Kingdom
Posted: 25th Jul 2003 03:27 Edited at: 25th Jul 2003 03:32
Read this interesting news about modern DX9 cards guys:

NVidia FSAA Hardware bug?

Personally, I never use FSAA and go for higher resolutions instead, but it may be something you want to take into account before making a purchase!
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 25th Jul 2003 03:32
1024x768 is good for me... I mean, why go any higher? There's hardly any difference, and often the GUI is a bit too small to manage.

Then again, I'm picky... I find 800x600 way too big...

--Mouse

Famous Fighting Furball
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 25th Jul 2003 05:56
640x480x32 @ FSAA 4x looks better than 1024x768x32 with no FSAA.
I have some in-game shots of something around here to emphasise this, but gimme a little while, cause I've got a new compilation going on.

That site is so... one-sided I can't believe it. Just look at the title, "NVIDIA-based Graphics Cards Will Have a Bug in Half-Life 2?" - but if you read down, they actually say it's a bug in DX9 that will affect ALL currently supported cards.
Not that it matters, because the whole thing is bull anyway, as my GeForce & Radeons both do multisampling in HL2 without a glitch.

And nv40 my arse - the currently designated chip design is the nv38 (FX6000)... there are plans for an nv40, but they're still just pipe-dream drawing-board ideas for the FX-2, with which they plan to introduce the new v4 shaders (Collision & Dynamics shaders).

Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 25th Jul 2003 06:02
'640x480x32 @ FSAA 4x looks better than 1024x768x32 with no FSAA'

I totally disagree. No matter how much FSAA is pounded onto 640x480 - I've tried it with 8x - it still looks crummier than 1024x768x32 with no FSAA.

My ideal setting is 1024x768x32 with medium anisotropic filtering and 2x FSAA. Looks pretty good, and most games can handle it... the exception being Morrowind - I have to run it with FSAA and AF off or it runs like a bucket of molasses. The NetImmerse engine (now Gamebryo) is to blame for that...

--Mouse

Famous Fighting Furball
adr
Joined: 21st May 2003
Location: Job Centre
Posted: 25th Jul 2003 12:14


Leave it one night and the whole world posts! Thank you very much, you guys - this has been helpful. Instead of trying to determine which is better out of the FX 5200 and the Ti4200, I think I may go for an FX 5600 if I can find it cheap enough. I really didn't want to spend more than £100 though (all in, delivery and VAT).

If I can't find one, I may go for the FX 5200, since I don't really play games - just make 'em.

Bender:Blackmail’s such an ugly word. I prefer extortion. The x makes it sound cool.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 25th Jul 2003 12:41
Guess it's more of an opinion thing, Mouse... unfortunately (and I've been trying) there is no way to show this effect in a JPG in your browser,
because you really need to see them in action.

The reason I prefer FSAA 4x @ 640x480x32 rather than 1024x768x32 is simply that the edges blend and make everything look like it's part of the scene, whereas no matter how extreme the resolution, the models/scene parts will stand out against each other.
FSAA 4x @ 1024x768x32 is just quality to look at, though.
Here are some TR shots I took the other day to show someone the difference between ATi SmoothVision & nVidia IntelliSample.

They do make the ageing game look pretty sweet, you've gotta admit.



adr
Joined: 21st May 2003
Location: Job Centre
Posted: 25th Jul 2003 12:51
I've settled on one of two cards:

Gainward 5600FX 128MB - £113
ABit Siluro 5600FX 128MB - £104
(both at www.komplett.co.uk)

At first I thought "Gainward" straight away (I do like that red PCB) but I've heard good things about Abit's efforts.

I just need to find some reviews first... oh, and do some work.

Bender:Blackmail’s such an ugly word. I prefer extortion. The x makes it sound cool.
adr
Joined: 21st May 2003
Location: Job Centre
Posted: 25th Jul 2003 15:42
That's right, you can all breathe a sigh of relief. I have made a purchase...

Creative's GeForce FX 5600 Ultra 128MB - a bargain at £130 (RRP £200).

Bender:Blackmail’s such an ugly word. I prefer extortion. The x makes it sound cool.
lcfcfan
Joined: 19th Feb 2003
Location: North East, UK
Posted: 25th Jul 2003 18:50 Edited at: 25th Jul 2003 18:51
You could have got the 256MB version from ebuyer, like I said in an earlier post, for the same price.

adr
Joined: 21st May 2003
Location: Job Centre
Posted: 25th Jul 2003 21:06
That isn't an Ultra though... and I think the Ultra is worth more than an extra 128MB sat there doing nothing.

Bender:Blackmail’s such an ugly word. I prefer extortion. The x makes it sound cool.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 26th Jul 2003 01:38
Creative don't do a 256MB version of the FX5600 Ti,
and theirs are the best on the market...
FX5200    250/400/300 MHz (core/memory/bw) (64fp - 32sp - 128bw)
FX5200 Ti 350/400/400 MHz (core/memory/bw) (64fp - 64sp - 128bw)
FX5600 Ti 400/400/600 MHz (core/memory/bw) (128fp - 128sp - 128bw)
FX5900 Ti 600/400/1000 MHz (core/memory/bw) (256fp - 256sp - 256bw)

There are some things about the Tis you might want to, erm, know before you buy... for example, they ONLY run on 8x (AGP 3.0) boards, whereas the standard versions run on 4x/8x.

They've got a new FX5900 Ti they've just finished which will replace their current 256MB one, adjusted to a 1.2GHz clock.
I've been buying some new hardware recently, primarily a lot of the lower-end FX cards, because I need to know what each version is technically capable of.

Bought an FX5200 64MB, cause not a lot of people seem to support them but there are a good few around (I thought they all started at 128MB, but I guess it's another cost cut from developers).

adr
Joined: 21st May 2003
Location: Job Centre
Posted: 26th Jul 2003 15:40
It's a good job my board supports 8x AGP ...



Bender:Blackmail’s such an ugly word. I prefer extortion. The x makes it sound cool.
lcfcfan
Joined: 19th Feb 2003
Location: North East, UK
Posted: 26th Jul 2003 22:38
Replace the 256MB version of the 5900 with what, a 512MB one? That would be a beast if so!

Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 27th Jul 2003 00:33
Hmmm, I don't know why, but now I've switched everything to 6x AA, 16x AF, and highest-detail mipmaps/textures, and I'm seeing hardly any slowdown. Fate apparently wanted you to win this argument, Raven, and I don't mind, because now Morrowind looks REALLY sweet.

--Mouse

Famous Fighting Furball
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 27th Jul 2003 00:55
Ahh... an ATi, eh Mouse? You know 6x FSAA is technically impossible.

ATi neglect to mention this, though: what they actually call 6x SmoothVision FSAA is really just a scene blur with a 6x pixel tap, rather than a 4x4 pixel with a 16-texel tap. (Not to mention it doesn't only work on edges.)

Sad really, but ATi's isn't real multisampling, just a fake like a lot of their stuff.
It's like the 16x anisotropic filtering - it's not real either. All they do is the standard 4x and combine it without a linear pitch, which means it creates 4x4: first the filter blur, then the mipmap blur (4x4=16, hence 16x).

Ironic really, because for all the boasting of features on the Radeon cards, someone running a 640x480x32 scene @ 6x FSAA, 16x AF will have the exact same graphical representation as a GeForce user running ~4x FSAA (performance tap), 4x AF.
You also have texture sharpening on GeForce cards - it doesn't work on the MX lines (unfortunately), but on the Tis and FXs the blur you get from AF is displaced and sharpened, so you get a really clean scene that doesn't look blurred.

There's also the added bonus that we can use up to 12x FSAA at any resolution, whereas FSAA 6x & 8x on the ATi cards are only allowed up to 1024x768 & 800x600 respectively.
But then resolutions over that require a lot of RAM to do the operations.

It's cool how much realism you can add to something using the FSAA techniques.

Preston C
Joined: 16th May 2003
Location: Penn State University Park
Posted: 27th Jul 2003 01:34
So Raven, which one would you suggest? What video card do you think is best?

At first glance, I'm a mediocre mech pilot. Look again and you will see my battlemech's computer code rushing through my eyes. My Mech And I Are One!
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 27th Jul 2003 02:37
'Sad really, but ATi's isn't real multisampling, just a fake like a lot of their stuff.'

But it doesn't matter, Raven. AA itself is just faking. As long as it looks better, it's good - that's the point of the whole thing anyway, isn't it?

--Mouse

Famous Fighting Furball
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 27th Jul 2003 04:58
Quote: "and the software (processor)-antialiasing is mutch faster, than the video-card hardware anti aliasing."


Mate, you're on a planet all of your own... software AA takes around 35% of your processing power; hardware AA is done in realtime with almost no processing hit, because it's done through a backbuffer FSAA layer in the FPU.

And no, Mouse, it doesn't particularly matter greatly - it's just annoying when they list stuff like 6x FSAA on their spec sheet, which is actually physically impossible to achieve without redefining what AA is.
That, and if you try to run MultiSampling_6_Samples in DirectX it bitches at you that it isn't supported (which is annoying lol).
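
That "not supported" complaint is just a capability check coming back negative. A minimal sketch (assuming the Direct3D 9 SDK headers; nothing here beyond the sample counts being argued about) of asking the runtime which multisample levels the hardware will actually accept:

#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Ask the driver which multisample counts it accepts for a
    // full-screen X8R8G8B8 back buffer; an unsupported count (6x on
    // many cards) simply comes back as a failure code.
    for (int samples = 2; samples <= 8; ++samples)
    {
        DWORD quality = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
            FALSE /* full-screen */, (D3DMULTISAMPLE_TYPE)samples, &quality);
        printf("%dx MSAA: %s\n", samples,
               SUCCEEDED(hr) ? "supported" : "not supported");
    }

    d3d->Release();
    return 0;
}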

Personally, if you have the cash, I'd suggest the Creative FX5900 Ti 256MB - it's the fastest card on the market (but it's £400).

In the budget range, MSI's 5200 128MB and Creative's 5200 Ti 128MB are probably the best you'll get your hands on,
and getting a 5200 Ti is FAR better than a 5600.

The only point at which you have to start umming and ahhing is at the 5800 and 5900 level,
as the 5200/5400/5600 (apart from almost no one retailing the 5400) are almost identical cards - really just speed separates them.

The 5800 and 5900 have some pretty unique updates over the other cards, enabling the higher-level shadow and effect shaders.
If you want to see what I mean, go to http://www.nvidia.com and check out the FX demos - the older ones on page 2 are standard FX and the newer ones are FX58/59 only. Grab the vids, as they're pretty impressive.

Especially the Gas Station one - that's using Luma's GI system as it was two months ago, and it's progressed a lot since then.
If you ever want to know why nVidia have the 'Way It's Meant To Be Played' logo, those demos are what the card is capable of, and they're not even optimised.

Preston C
Joined: 16th May 2003
Location: Penn State University Park
Posted: 27th Jul 2003 05:54
So I should get Creative's 5200 Ti? Okeydokey.

At first glance, I'm a mediocre mech pilot. Look again and you will see my battlemech's computer code rushing through my eyes. My Mech And I Are One!
Starlight
Joined: 23rd Apr 2003
Location: Caithness, Scotland.
Posted: 16th Aug 2003 02:35
For those who say they cannot get the performance they should be getting from their card - such as the argument above about Ti cards running faster than the FX5200 (EH?????!!!):

Are you running in PCI compatibility mode? This is something that can be overlooked, and there are a number of ways to check it. I built my new PC from scratch, buying the different components, etc. When I formatted the HD and installed XP, then installed the graphics driver for the G4 Ti4200, the driver window showed PCI Compatibility Mode for the bus. When I installed the AGP 3.0 drivers that came with my new motherboard, the bus changed to AGP 8x, meaning that it was now operating in AGP mode.

Normally you would not notice any difference in PCI mode, but when running something that is demanding on the card (usually games) you will notice the bottleneck. I think it's purely a H/W thing; you need to check your BIOS in case AGP has been disabled (WTF!), etc., or install the AGP drivers for your M/B.
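
There is also a rough programmatic hint, though the driver window and BIOS checks described above remain the authoritative way to verify this. A minimal sketch, assuming the Direct3D 9 SDK headers: a device that cannot texture out of non-local (AGP) memory is one sign the card may be stuck in PCI compatibility mode.

#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // D3DDEVCAPS_TEXTURENONLOCALVIDMEM: the device can texture from
    // non-local video memory, i.e. AGP memory is actually usable.
    if (caps.DevCaps & D3DDEVCAPS_TEXTURENONLOCALVIDMEM)
        printf("AGP (non-local video memory) texturing available\n");
    else
        printf("No AGP texturing - possibly PCI compatibility mode\n");

    d3d->Release();
    return 0;
}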

I don't want footsteps following me...
Starlight
Joined: 23rd Apr 2003
Location: Caithness, Scotland.
Posted: 18th Aug 2003 02:05
What about the PNY Technologies nVidia FX cards, in comparison to Creative's FX cards? Has anyone used the PNY cards? They seem to have a larger heatsink. I have a G4 Ti4200 card made by PNY Tech. and it is good, though the fan does make a lot of noise when it warms up; it seems to be bug-free on my system. I was thinking of upgrading to the FX5600 Ultra and was wondering if the PNY cards are better or worse than the Creative ones. Or are the Creative 3D cards the best on the market?

I don't want footsteps following me...
CattleRustler
Retired Moderator
Joined: 8th Aug 2003
Location: case modding at overclock.net
Posted: 18th Aug 2003 04:09
I fried my nVidia GeForce4 Ti4600 a while back by punching my PC during a game of EA MVP Baseball 2003. The machine rebooted and then did the dreaded "beep, beep-beep-beep" - ARGH! After that the card wasn't working; I tried it in 3 different PCs and had the same result. So I went out and bought an FX 5200 Ultra 128, which specs about the same as the Ti4600. I loved my Ti4600 - I miss that damn card like a brother (weep weep)... oh wait, there it is over there on the table, collecting dust.

damn
Rust-

How do ya do there son

