
Geek Culture / Need Graphics Card Help!

GOD
Joined: 23rd Apr 2003
Location: right there
Posted: 9th Oct 2003 03:52
Well, I've decided I need a new graphics card, especially since mine can't run DX9 and most things coming out require it. I'm looking to spend between $100 and $200. I'm not sure what to get though, and I know nothing about cards. Also, I'm worried that if I make up my own mind I'll get a crappy one, and then a new DX will come out that it can't run and I'll have wasted my money. lol, well, any advice would be appreciated. Thanks.

~haXor
Preston C
Joined: 16th May 2003
Location: Penn State University Park
Posted: 9th Oct 2003 04:01
NVidia GeForce FX 5200 128MB

-or-

ATI Radeon 9600 128MB (not sure if it's the Pro that's the better one or not)

That's all I can suggest.


GOD
Joined: 23rd Apr 2003
Location: right there
Posted: 9th Oct 2003 04:11
Yeah, I had the Radeon 9600 in mind, so I think I'm going to go with that.
QuothTheRaven
Joined: 2nd Oct 2002
Location: United States
Posted: 9th Oct 2003 04:12
For that price you won't have to worry too much about a better one coming out, because there will already be many better ones when you buy it.

CattleRustler
Retired Moderator
Joined: 8th Aug 2003
Location: case modding at overclock.net
Posted: 9th Oct 2003 04:32
If you go with the nVidia card that NWC mentioned, make sure it's the ULTRA version.

-RUST-
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 9th Oct 2003 05:06
nVidia = eVil




The 9600 is not a good buy for its price. You can buy a 9800 (top of the line) for only $170:

http://www.gameve.com/store/gameve_viewitem.asp?idproduct=1373&showit=1

I highly recommend it!

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 9th Oct 2003 05:17
Quote: "nVidia = eVil"


:: coughtatiaretheonesbuyingthierpopularitywithdeveloperscough ::


DarkSin
Joined: 23rd Jul 2003
Location: Under your bed
Posted: 9th Oct 2003 05:23
Quote: ":: coughtatiaretheonesbuyingthierpopularitywithdeveloperscough ::"


well whatever works

When catapults are outlawed, only outlaws will have catapults. I have a catapult. Give me all the money, or I will fling an enormous rock at your head.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 9th Oct 2003 05:26
Problem is, it doesn't help the end consumer;
these petty tactics are beginning to screw a lot of people out of money. Personally I wonder if anyone at nVidia has seen the HL2 source ... shameful.

Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 9th Oct 2003 05:44
:coughnvidiaaretheoneswholeakedthehalflifetwobetaandpossiblysourcecodecough:

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Digital Awakening
AGK Developer
Joined: 27th Aug 2002
Location: Sweden
Posted: 9th Oct 2003 06:46
Don't buy an nVidia card. All DX9 tests so far, except for the Doom 3 beta, have shown that nVidia's 5000 series performs badly as soon as 2.0 shaders are used. Since that is the future, you really should stay away from them. The latest drivers from nVidia have shown increased speed, putting them neck and neck with ATI, but they are filled with cheats that display an incorrect image: less lighting, no shadows or fog, etc.

In fact, many of nVidia's biggest partners have started making, or are planning to make, ATI cards. nVidia have officially announced that graphics cards are no longer their primary market.

I have a 9800 and it's really a good buy. If you can find it for $170 then it's the best buy you can get.
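
For anyone unsure what the "2.0 shaders" above refer to: DirectX 9 games typically query the device caps for pixel shader model 2.0 support before choosing a rendering path, which is exactly where the FX cards were being criticised. A minimal sketch in C++ against the Direct3D 9 API (window and device creation omitted; this is an illustration, not anyone's actual test code):

// Minimal sketch: ask Direct3D 9 whether the primary adapter reports
// pixel shader 2.0 support. Assumes d3d9.h / d3d9.lib from the DX9 SDK.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("Direct3D 9 not available\n"); return 1; }

    D3DCAPS9 caps = {};
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        // PixelShaderVersion packs the major/minor version; compare to ps_2_0.
        bool ps20 = caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
        std::printf("Pixel shader 2.0 %s\n", ps20 ? "supported" : "NOT supported");
    }

    d3d->Release();
    return 0;
}

A card that fails this check can't run DX9 shader paths at all; the argument in this thread is about cards that pass it but run the 2.0 path slowly.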

Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 9th Oct 2003 06:47
It should be noted that that is the 128MB, non-Pro version, but it's still a darn good buy for the money.

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 9th Oct 2003 07:24
Quote: "nVidia have officially announced that graphic cards are no longer their primary market"


Really? Care to show me this announcement? Their site has nothing on it... and it would seem stupid to divert funding away from their graphics card line with the NV4x on the horizon and the NV38 being released over the Christmas period as a rival to the Radeon XT cards.
Add to this the fact that over the past 8 months nVidia have received nothing but high acclaim from everyone except Valve & Microsoft - Valve are in fact quite vocal in their claims about the poor performance of the nVidia NV3x line, yet wouldn't allow the new 51.75 drivers to be used for testing ... and with Microsoft it isn't surprising they're putting down nVidia, seeing as they wanted a GeForce GPU for their Xeon project; quite frankly it is obvious that they didn't want to take no for an answer.

nVidia have also, as of yesterday, become PERMANENT members of the SGI ARB for OpenGL, which means that unlike ATi they get a full vote on what new content goes into OpenGL 1.x and 2.x.
Combine this with the fact that Futuremark's 330 build of 3DMark03 shows without a doubt the overwhelming performance of the NV3x line - and this is the version specifically designed to combat the nVidia driver cheats - and add to this that nVidia, being part of Futuremark's beta testers once again, no longer have the right to alter their drivers.

Sorry, but from the professional tests I've seen, and my own home tests, quite frankly I don't see the GeForce faltering even a step.

Quote: "The latest drivers from nVidia have shown increased speed puting them neck to neck with ATI however they are filled with cheats that displays an incorect image"


Really? I take it you own one of these cards and can prove this?
The 45.23 drivers were buggy, and they are also the SLOWEST drivers of the series.
44.04 are currently the fastest on standard operations and the new 51.75 are the fastest with shader operations...

I've run multiple tests in 3DMark01 SE 440 and 3DMark03 330, and what I've found is that there isn't any of this so-called quality loss, nor hacks. Apart from a buffering problem between the loading screen & rendering, which stops the front buffer clearing itself properly and leaves an intermittent blur across the screen when rendering frame-by-frame, though it isn't present in realtime.

-- -- --

I'm getting sick of the anti-nVidia bull... all you people are going on are sites and statistics from them, a lot of which are biased towards a particular manufacturer.
At the end of the day, explain to me why gamers needed the slightly speed-bolstered Radeon XT if the GeForce FX is really as useless as everyone claims?

It would've been a waste of development research funds, testing and production casting ... a business WOULD NOT spend $2.1 billion on developing a slightly faster card with slightly enhanced shader pipelines IF their current card was already performing exactly how they planned.

Yeah, I'm sure that'd happen - and I'm some monkey fairy from planet Zi.
Jesus, wake up, these companies are in a press war ... and quite frankly nVidia has taken home every single industry award, from the CG to the gaming industries, for being the best graphics solution.
ATI picked up NONE!

Hell, the Alienware PCs are renowned for being beasts of gaming machines; tell me what cards they have packaged AS STANDARD since August 2002!

Face it, when it comes down to it, all ATI users and fans have are the online statistics.
nVidia fans have the industry behind them.

So the GeForce FX doesn't perform well in Half-Life 2 ... that is a SINGLE developer out of hundreds that has a problem with the FX line. They might seem like the biggest right now, but popularity-wise, with the whole deal over Half-Life 2 ... quite frankly their credibility is going down the pan.

-- -- --

And you know what - "Was it nVidia that leaked the source?"
To be perfectly frank, if it was, then Valve deserved it ... you don't go around trying to destroy a company's reputation the way Valve did. At the end of the day, if you sleep with the devil you're gonna get burnt!

las6
Joined: 2nd Sep 2002
Location: Finland
Posted: 9th Oct 2003 09:07
Quote: "It would've been a waste of development research funds, testing and production casting ... a business WOULD NOT spend $2.1billion on developing a slightly faster with slightly enhanced shader pipelines IF thier current card was already performing exactly how they planned"


Oh man, another one.
First of all, the XT is like a normal Radeon on steroids; they've tweaked it to the maximum, but it is still the same card. Just faster.
So there are no real development/research/testing/production costs. Actually I think it was pretty cheap for them.

AND of course they'd want to put new cards on the market. Don't you get it? If they just stopped here, nobody would buy a new gfx card. But if they keep coming out with new products, people start to think their cards aren't good enough, so they buy a new one. Duh. A simple and very common method in business everywhere. You see new models of cars being made every year, but that doesn't mean they are in any way much better than the previous models.

No matter what you say, HL2 is a big game. People want to play it, and if NVIDIA cards can't run it without specific optimizations to the shaders, well, consumers have an easy time picking out their gfx card. Would you want a gfx card that runs some games OK and others not, because there's no shader optimization for them, OR would you like a card that runs OK in any case? I know what my pick would be.

Oh, and the Radeon 9500 Pro might be a good choice too, if you could only find it. But a 9800 for $170, now that's cheap!

Digital Awakening
AGK Developer
Joined: 27th Aug 2002
Location: Sweden
Posted: 9th Oct 2003 18:22
Raven:
Since I know there's no point arguing with you, and I don't wanna bother reading your long boring posts, I'll just give you this:

"There was little mention about any new FX cards or the current ones. The NVIDIA CEO Jen-Hsun made it quite clear that the graphics industry would no longer be the primary focus point of the company. They then continued to demonstrate a wide range of new multimedia products."

http://www.legionhardware.com/html/doc.php?id=261&p=3

CattleRustler
Retired Moderator
Joined: 8th Aug 2003
Location: case modding at overclock.net
Posted: 9th Oct 2003 19:33
<puts on flame-proof coat>

I say someone should resurrect 3DFX!



-RUST-
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 9th Oct 2003 20:25
Quote: "First of all, XT is like a normal radeon on steroids, they've tweaked it to maximum, but it is still the same card. Just faster"


Check out the technology page on ATi about them... they've updated their features to be on par with the GeForce FX 5200/5600 cards, their speed has been increased from a top of 800MHz to 1,000MHz, coupled with an intelligent speed sensor which will prevent the dangerous overclocking that ATI card owners are used to (something the FX and 4 Ti have had for over 3 years now!).
Add to this that they're now using a .11 die rather than the .13 they were using, which means smaller but more heat-expensive VPUs.

If you think it's just a Radeon on steroids then be my guest and LOOK at its specifications.

The Radeons are currently by far top of the market according to most online sources, and as a company you don't spend stupid money ... the R480 is due out next year, so give me a REASON why they would release the Radeon XT if they seriously had nothing to fear from nVidia.
I'm perfectly serious here: out of ALL the companies around, ATI are certainly as hell not known for throwing money around when they don't need to. The only reason the 9500s were recalled and replaced with 9600s is that they were too expensive for budget cards, but the XT range isn't meant for the budget market; they're top-of-the-line cards, there to replace the Pro series, which is also not shifting to budget.

The industry runs on money - how much you can make against the cost of development ... ATi might be hailed as cream of the crop right now, but people are not going to be willing to spend out another $380 just to get this newer installment of the Radeon.

Quote: "HL2 is a big game. People want to play it and if NVIDIA cards can't run it without specific optimizations to the shaders, well consumers have easy time picking out their gfx card. Would you want a GFX card that runs some games ok, others not because there's no shader optimization for it OR would you like a card that runs ok in any case? "


Half-Life 2 has been optimised for ATI use, there are no two ways around it ... it's not that the GeForce FX is performing badly because it is unoptimised; the Radeon's shaders are optimised, whereas the GeForce FX, Quadro & Parhelia are all in the same boat.

But Half-Life 2 is just ONE game out of over 100 games released for the PC format every year ... it might be the most highly publicised, but just where the HELL is it??
I don't have it, you don't have it... NO ONE HAS IT!
The latest Doom 3 reports are that the GeForce FX using the 51.83 WHQL candidate is performing at breakneck speeds.
Jedi Academy also performs FAR FAR better on GeForce FX and 4 Ti cards than it does on the Radeon equivalents; even STALKER is reporting far greater speed on the FX range over the Radeons.

Even though these games aren't AS high profile as Half-Life 2, it's hard to ignore that this so-called speed loss between the cards is a fact created by two very pathetic companies.

Benchmarks show the FX cards are faster than the Radeon equivalents, and with each Detonator FX/Forceware driver they're getting better all the time, with the Radeons' speed now at a standstill as far as optimisation goes.
As an artist I personally prefer the GeForce FX over the Radeon equivalents because it has FAR FAR greater image quality; its interchangeable FSAA quality levels and depths of FSAA, along with AF, are unparalleled. I keep trying to see this so-called better quality from the Radeons, but all I constantly see are washed-out colours and very poor FSAA sampling ... considering the Radeons are supposed to be working from a 5x sample rate whereas the GeForce's standard sampling is 3x/5x/9x, with the XS mode being 9x9 sampling and much smoother graphically under DirectX.
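
For context, the FSAA "quality levels" being argued over here are values a Direct3D 9 application queries from the driver before it enables antialiasing, so they differ per card and per driver. A rough C++ sketch of that query (device creation omitted; the loop bounds are just illustrative):

// Rough sketch: ask the Direct3D 9 runtime which multisample (FSAA) modes the
// default adapter supports for a common back-buffer format, and how many
// driver-defined quality levels each mode exposes.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    for (int samples = 2; samples <= 16; ++samples)
    {
        DWORD levels = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
            FALSE /* full-screen */, (D3DMULTISAMPLE_TYPE)samples, &levels);
        if (SUCCEEDED(hr))
            std::printf("%dx FSAA supported, %lu quality level(s)\n",
                        samples, (unsigned long)levels);
    }

    d3d->Release();
    return 0;
}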

We should probably believe everything that is being said by Valve and ATi though; I mean, they didn't lie that Half-Life 2 supported FSAA, did they? Oh wait, no, there have been benchmarks with it.
But they didn't lie that their partnership wouldn't involve favouritism, did they? Oh wait, no, Gabe Newell at Alcatraz pretty much sunk that one.

I'm sorry, but I'm not going to buy my brand new graphics card based on a SINGLE TITLE... I don't give a damn if it holds the answer to life, the universe and everything - over $100 for a new graphics card and I WANT some damn proof that it is the best.
And from the real-life tests, this is just NOT what I'm seeing.

One game out of over 200 tested over the past years, that particular game being 'sponsored' by ATI - and coincidentally that is the ONE game that happens to show the FX in poor colours.

Quote: ""There was little mention about any new FX cards or the current ones. The NVIDIA CEO Jen-Hsun made it quite clear that the graphics industry would no longer be the primary focus point of the company. They then continued to demonstrate a wide range of new multimedia products.""


I followed the link; I'd strongly suggest you check out the links from nvidia.com about the conference where the GeForce FX line won 8 of the industry's top awards for excellence (wonder how many ATI went home with ::coughnonecough::).

There is a message in the words said, though: over the past 2 years nVidia has branched out into motherboards and multimedia for the home. nVidia's primary concern is the multimedia market... to say that they're no longer primarily just about graphics is true, however it doesn't mean that this isn't STILL their primary concern.

The GoForce for the mobile market and the GeForce FX 6000 / GeForce FX 2 are prime examples of this. ATI only has the graphics market ... nVidia are starting to offer the full package,

from graphics, sound, networking, motherboard solutions, etc...
And it is no secret that their graphics cards and Creative's Sound Blaster ranges perform quite stunningly on their own chipsets.
Their motherboards are now the industry's fastest, combined with the unlimited power of the new Athlon 64 FX and a GeForce FX - not to mention Crucial RAM, who they bought out earlier this year and who are one of the most well-known RAM developers.
nVidia haven't stepped back from giving end users anything; what they're doing is giving you the ENTIRE package...

Quote: "3DFX and their groovy voodoo cards where you could use 2 cards at the same time by connecting them via CLI, yeah man that's where the fun is at rofl! good in their day though. <- this is how an opinion *should* end.. IMO "


The term is SLI, and what the hell do you think the GeForce FX cards are?
The CLUE is in the name.

Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 10th Oct 2003 01:47
Quote: "If companies didn't release new products we'd all still be on 286's running at 10mhz with beastly 256kb graphics cards."


The speed of computers has come out of need & competition ... need drives new technology, but only when the purchasers can afford it.
Competition is what truly drives the innovation.

Intel were the ones who set the computer standards up until the 486 back in 1993... when the 6x86 Pentiums were created, the chip specification was suddenly made a trade secret. This meant AMD had to finally make their OWN version of the Pentium processor, and as such the processor war which is still going on today was born.

But what exactly has this done?
Well, it has given us MMX, 57 op-instructions that allow us to calculate 24-bit colour more accurately and faster.
It has given us 3,000MHz of raw processing power.
It has given us 3DNow! & SSE, special instruction sets designed to bridge the gap between multimedia and game hardware.
It has given us 64-bit processors capable of over 2x the processing power of their earlier equivalents.
It has given us a huge range of new types of processors.
With 3DNow! and MMX we gained the AGP socket to take advantage of them; we also gained faster RAM as the FSB jumped from 16MHz to 800MHz.
The pure raw power of processors has gone from 4,096 MIPS to 448 mega-MIPS...

Competition within hardware is a good thing; it pushes the opponents further, advancing technologies and giving the end user the best product ... but if there is no immediate requirement to create a new card, then like the Intel of the 80s they'll just sit back and develop at their own pace.
Slightly updating an old design isn't something you do when you don't have competition right on your tail - if you don't believe this, look at TGC.

It's quite obvious the reason they released the DarkMatter enhancements for DB was Blitz3D; I bet if Blitz wasn't around they wouldn't have bothered. Just like DarkBASIC Pro ... if there wasn't competition they'd develop at their own pace, and they'd take the time to fully learn and understand the DirectX API rather than trying to give you guys the best technology that is on offer right here and now.

... the point is, you don't spend money on something that isn't needed. The R480 is due out within the next 6 months, the market is already hyped about it, and the market is also hyped about the 9700/9800 Pro for Half-Life 2 ... even if they just supercharged the Radeon they'd still have to change the chip development line, which COSTS MONEY. Even the most minor changes can cost hundreds of thousands, because to change the chip you have to shut down production to create the new ones - several lines of chips not being produced for about a day, maybe more - which is A LOT of lost revenue.

Sorry, but no company will release to users a more expensive but slightly better version of what is already available on the market, over a year after the original was released... it just doesn't happen!

the_winch
Joined: 1st Feb 2003
Location: Oxford, UK
Posted: 10th Oct 2003 02:03
Quote: "Sorry but no company will release to users a more expensive but slightly better version of what is already available on the market over a year after the original was released... it just doesn't happen!"


Ever heard the expression "the customer is always right"? If the customer wants to buy a more expensive but only slightly better card, then you sell it to them before they buy from someone else, or before someone else sees the potential and starts making cards to sell to your customers.

A lot of people buy new cards every year; if you don't supply them with a better card, they won't wait - they will just buy one somewhere else.

A lot of the computer industry is based on selling what is a little better for a lot more money to people who know it will be obsolete in a short time.

Quote: "It's quite obvious the reason they released DarkMatter Enhancements for DB was because of Blitz3D, i bet if Blitz wasn't around they wouldn't have bothered. Just like DarkBASIC Pro ... if there wasn't competition they'd develop at thier own pace and they'd take the time to fully learn and understand the DirectX APi rather than trying to give you guys the best technology that is on offer right here and now."


There is the need to fight competition and there is the need to make money. You can't run a company on "now"; you need products to sell tomorrow/next week/in 5 years' time. (Potential) competition isn't the only reason for developing new products/technology. Just look at Microsoft.
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 10th Oct 2003 02:24
'you add to this the fact that over the past 8months nvidia have recieved nothing but high acclaim from everyone except valve & microsoft'

Untrue. Just about all the review sites I've seen hail the newest 9800 as The King and are annoyed with nVidia over the 5800 debacle. nVidia is out as far as I'm concerned... they've been behind too long, made too many big mistakes, and now they're pulling back from the whole games front. Just go to their website-- it's changed considerably, now aimed at multimedia buyers.

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Megaton Cat
Joined: 24th Aug 2003
Location: Toronto, Canada
Posted: 10th Oct 2003 20:51
OK, so since this is just another case of Raven and his video cards, I'll just say one thing:

Quote: "You can buy a 9800 (top of the line) for only $170:
"


How the heck can you get a 9800 for $170???? I bought my Radeon 9200 for $120 and am pretty happy with it; I mean, it can still run almost all of the latest games, but I mostly use it for game development so it's alright for me. I dunno, I just don't trust those GeForce 4 MX cards for some reason.

My site is delayed for the 250th time!
If life is just one big joke...
then I must have missed the punchline.
KARRIBU
Joined: 7th Oct 2003
Location: England
Posted: 11th Oct 2003 01:47
I would let you know how the ATI Radeon 9800 128MB is, because I've bought one!! You know the problem... the damn $%^&"'s at the shop haven't delivered it yet!!!!!!! I need it to play Homeworld 2 properly... I have the game, just not the damn card.

Now I have 2 questions for YOU!!!

1) If a wood chuck could chuck wood, then how much wood could a wood chuck chuck?

2) If I don't receive my card soon, what would be the best organically grown food to beat them severely with?

I wouldn't be so paranoid if people would just stop looking at me!
M00NSHiNE
Joined: 4th Aug 2003
Location: England, UK
Posted: 11th Oct 2003 02:12
1) I think the proper expression is - How much wood could a wood chuck chuck if a wood chuck could chuck wood - and the answer is "Why would a wood chuck not be able to chuck wood in the first place?"

2) Organically grown food? Use something a bit tougher, slap them about with a fish.

"It's amazin' what you can do with a computer and access to t'internet"
Evil Noodle
Joined: 28th Apr 2003
Location:
Posted: 11th Oct 2003 02:21
A wood chuck would chuck no amount of wood, since a wood chuck can't chuck wood.
Evil Noodle
Joined: 28th Apr 2003
Location:
Posted: 11th Oct 2003 02:21
2/tofu
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 11th Oct 2003 03:14
Mouse, you can't count review sites; they're always either pro one card or another ... and often they follow the trends, and right now the Radeons are the trend.

At the end of the day nVidia are the ones going home with the industry awards. And my point about a new card isn't that ATi are slowing or stopping innovation or anything...

It's the fact that come March at E3 they'll be unveiling the R480 - a card that boasts better performance and better shader support than any other currently on the market.

So explain to me: WHY RELEASE THE RADEON XT??
Fine, I'd understand if there was nothing planned...
but this is like Intel releasing a Pentium 2 at 800MHz when they're planning to release a Pentium 3 at 1.0GHz in a matter of 2 months' time.
This again would make sense IF the P2 were going directly to the budget market ... but again this isn't the case with the RXT; instead it is actually now the MOST expensive card out.

Yes, people are willing to spend $380 every year or so to get something that will soon be obsolete, but technically the Radeon XT is ALREADY obsolete. It makes NO SODDING SENSE!
The only time you do something like this is when you NEED to beat the competition. I don't care what you try to say ... anyone who has an inkling of business sense will know this all too well.
You do NOTHING to rock the financial boat, especially when you seem to be out on top.

Yian
Joined: 16th Jun 2003
Location: Nicosia, Cyprus(the Greek half)
Posted: 12th Oct 2003 22:28
My god! Raven, please try to compress your posts, man... do you write books?

Jeriko The Slyz,Yian The Craft,The Mechanist,The Lost One,Master Of Dots,Bambos O Bellos,Zolos O Kolos
MiR
Joined: 13th Jul 2003
Location: Spain
Posted: 13th Oct 2003 19:42
I need a new graphics card, and I also have a budget of 200 €/$. I was thinking of getting a GeForce FX 5600 Ultra. Any good?
I don't care about running HL2 or Doom 3 (I've got an Xbox that will run both perfectly well, and that only cost 200€). The thing that needs to run fast is DBP.
Raven: Another reason why Microsoft doesn't like nVidia is the price of the graphics chips in the Xbox. They should have done what Nintendo did with ATI: make the chips themselves and pay a fee for every chip used.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 13th Oct 2003 20:59
Quote: "Another reason why Microsoft doesn´t like Nvidia is the price of the grafics chips in the Xbox. They should have done what Nintendo did with ATI. Make the chips themselves and pay a fee for every chip used."


First I've heard that the GC runs on an ATI-based gfx chip; you might be thinking of their new machine ... from what I understand the GC runs on an updated version of their N64 gfx chip (hence the games' soft graphics style).
But that aside, that's exactly what Microsoft did do ... that is the whole reason for nVidia refusing outright to let them have the rights to the NV30 like Microsoft wanted.

However, Microsoft, unlike Nintendo, aren't a hardware company; they still required a 3rd party to produce the chips ... in fact, to produce ALL of the hardware. It would've been far cheaper for Microsoft if they had let nVidia produce the chips for them - but, typical Microsoft, they wanted everything to be THEIRS to alter however they like.

.. .. ..

As for gfx cards for the PC, the FX 5600 Ultra is a great little card if you can afford it; get either PNY or Creative - they've got the best support and speed.

MiR
Joined: 13th Jul 2003
Location: Spain
Posted: 13th Oct 2003 21:24
What's the difference between the GeForce FX 5600 and the GeForce FX 5600 Ultra? I thought it was the amount of RAM, but I've seen a GF FX 5600 Ultra with 128 megs.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 13th Oct 2003 22:00
The speed of the processors... the 5600 runs at 350MHz, the 5600 Ultra at 500MHz.

The new 51.75 drivers are going to see the welcome return of the overclocking abilities in the GeForce cards - something to look forward to.

empty
Joined: 26th Aug 2002
Location: 3 boats down from the candy
Posted: 13th Oct 2003 22:12
Quote: "first i heard that the GC runs on an ati based gfx chip, you might be thinking of thier new machine ... from what i understand the GC runs on an updated version of their N64 gfx chip (hense the game soft graphics style)"

GC uses "Flipper" by ATI.

I awoke in a fever. The bedclothes were all soaked in sweat.
She said "You've been having a nightmare and it's not over yet"
Preston C
Joined: 16th May 2003
Location: Penn State University Park
Posted: 13th Oct 2003 22:14
Quote: "GC uses "Flipper" by ATI.
"


Now I know why it was called project Dolphin in production (I think that was its project name)


Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 13th Oct 2003 22:26
Care to share how you know this, empty?
Because I've not seen a single source state this.

empty
Joined: 26th Aug 2002
Location: 3 boats down from the candy
Posted: 13th Oct 2003 23:30
http://www.nintendo.com

I awoke in a fever. The bedclothes were all soaked in sweat.
She said "You've been having a nightmare and it's not over yet"
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 14th Oct 2003 00:15
Quote: "System LSI Custom ATI/Nintendo "Flipper" "

I wonder about these tech specs, considering it used to say Nintendo FlipChip R4360 and it also used to say Graphics Chip next to the name.

the_winch
Joined: 1st Feb 2003
Location: Oxford, UK
Posted: 14th Oct 2003 01:25 Edited at: 14th Oct 2003 01:32
http://www.ati.com/companyinfo/press/2002/4559.html
http://cube.ign.com/articles/087/087830p1.html?fromint=1

Looks like ATI acquired the company that originally designed it, and NEC manufactures the chip.
empty
Joined: 26th Aug 2002
Location: 3 boats down from the candy
Posted: 14th Oct 2003 02:28 Edited at: 14th Oct 2003 02:28
Yup, it was ArtX (says so on old specs).
But the chip has always been called "Flipper"

I awoke in a fever. The bedclothes were all soaked in sweat.
She said "You've been having a nightmare and it's not over yet"
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 14th Oct 2003 06:52
OK, my world makes sense again.
