
Geek Culture / Oh Oh Oh... the Shiney new FX cards!

Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 27th Oct 2003 17:37
Las, check out the several other sites posted earlier, you idiot... but that makes a very clear point about how you just can't trust the online sites.

Maybe when their statistics actually corroborate each other, then they'll be worth listening to.

There were TWO cards on show, and it's amusing how, somehow, in the new FX5700/5600 tests they're the ONLY GeForce line that doesn't have far quicker multitexturing, eh?

Or didn't you bother doing your homework... in fact the FX5600 Ultra appears to have dropped around 600 points in a matter of three months.
Miracles do happen for ATI, don't they.

I would STRONGLY suggest you check the other sites posted and the other cards and reviews.
I'll start believing their scores when they start to be CONSISTENT.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
Eric T
Joined: 7th Apr 2003
Location: My location is where I am at this time.
Posted: 27th Oct 2003 17:40
Well, I have a question...

Do any of you know what the video card requirements for FFXI are? (I'm buying it tomorrow!!! and just want to know what the requirements are.)

A Dream is a Dream unless it is Real
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 27th Oct 2003 17:50
If you want it to be playable:

Min:
Pentium III 800MHz (Celeron 1.3GHz)
128MB RAM
GeForce3/4 Ti card or Radeon 9 series
DirectX 9
Broadband internet connection (56k possible but NOT recommended)
Windows 9x/2000/XP
Mouse & keyboard

Recommended:
Pentium 4 2.5GHz
512MB RAM
GeForce FX 5900 Ultra 256MB
DirectX
1.0Mbit broadband
Windows 9x/2000/XP
Mouse & keyboard

The site has them too.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
Eric T
Joined: 7th Apr 2003
Location: My location is where I am at this time.
Posted: 27th Oct 2003 17:52
Eeek... gonna have to really pump my 56k for it... luckily I have everything else...

Sometimes I hate this damned state.

A Dream is a Dream unless it is Real
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 27th Oct 2003 18:17
So much hate over ...

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 28th Oct 2003 00:14
Quote: "Which graphics card is the most future-proof ? ie directx 9 ? I think it's the radeon 9800XT... okay it's a $500 card...still whoops the geforcefx5950 ultra in directx9 as the fx's are missing a vital key dx9 feature which is lack of support for floating-point texture formats"


a) Radeon 9800XT: you'll be lucky to find one under $580; you can get a Pro for around $520, but they lack a lot of features.
The FX5950 is going to be no more than $500, and that's what only the top cards will cost; you'll be able to get OEMs for much less.

b) As for floating-point textures, I think you've misheard what has been said. They were the FIRST to support floating-point calculations for shaders, and this extends to images.

What they don't support, however, is the floating-point stencil, nor do they support floating-point images within the texture pipelines, which means they must be compiled directly rather than pre-processed. However, this is a driver issue and only for DirectX, not a card capability issue.

Quote: "The NV3x chipset uses 12bit integer shader operations in Doom 3, whereas the R3xx chipset uses 24bit floating point shader operations in Doom 3. When the NV3x is forced to use 24bit floating point operations, the R3xx exceeds the NV3x performance wise."


Really, try again... the NV30 uses 16-bit half-precision and 32-bit full-precision floating-point shader operations, and it is also capable of 16-bit full-precision integer operations (the NV35+ is capable of 32-bit double precision).
The R300, on the other hand, is only capable of 24-bit compressed floating point which is then widened to 32-bit; add to this that the texture pipelines only use 16-bit, allowing their drivers to compress two texture cycles into the so-called 32-bit operation.

The way the Radeons are set up, technically they ARE capable of double the geometry pipelines and half the texture pipelines.
This SHOULD make their vertex and rendering capabilities far quicker than the FX's whilst leaving their texturing abilities only half the speed.

However, when you look at the actual benchmark results, what you find is that in the polygon-for-polygon tests the Radeons actually fall short even at comparable speeds, while their texturing seems to show that they're really only comparable at the single level rather than at dual or more levels.

So if we take a game such as Quake 2 and run it at 1280x1024x8
on the FX5900 Ultra and the 9800 Pro...
what we see is the FPS stay at a very level 320fps for both cards.

Yet take Quake 3 and run it with its multitexturing shaders at 1024x1024x16, and we notice that the difference in texture speed hits the Radeon hard.
Now the FX5900 Ultra is doing 160fps whereas the 9800 Pro is only producing around 98fps.

The even more interesting thing is that if you then put the game into 32-bit mode, the speeds start to look more comparable... with the 9800 Pro still doing 98fps whereas the FX5900 Ultra is now down to 119fps.

That right there is a very, very interesting pointer to what the Radeon drivers are doing. Although 32-bit texture mode allows the APIs more precision to blend the textures... the textures themselves within the Radeons are being stripped of the additional 16-bit precision in order to push double the textures down the pipelines - this effectively turns one pipeline into two, making it equal to the FX.
The only major problem is that this causes a noticeable quality drop. Although reviewers seem not to notice this drop, gamers have, and so has the industry as a whole. When a reviewer refers to image quality, all they're referring to is the FSAA abilities of the card. Whilst it is admittedly true that the Radeons are capable of slightly better-looking FSAA at equal levels, the Radeons and FX do FSAA in totally different manners.

The Radeons use an onboard chip which has direct access to the framebuffer; what this chip does is simply supersample the ENTIRE framebuffer by some Nth level... then average it back down.

SMOOTHVISION 6x, for example, expands the image by 3.0, then averages it down with a 3x3 sample range.
IntelliSample 6xS, on the other hand, doesn't do anything to the actual image; what it does is use its 24-bit Z-buffer to create a depth image, and from that it calculates a 3x9 sample average. It then uses that image as a template and blurs the on-screen pixels by 3.0 that correspond to the template.
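To make the supersample-then-average idea concrete, here's a naive DBP-style sketch (purely illustrative - the hardware does this per frame in the driver, not in BASIC; assume bitmap 1 holds the scene rendered at 3x size):

rem assume bitmap 1 holds the scene rendered at 3x the screen width and height
rem average every 3x3 block of it down to a single screen pixel
sw = bitmap width(1) / 3
sh = bitmap height(1) / 3
for y = 0 to sh - 1
    for x = 0 to sw - 1
        r = 0 : g = 0 : b = 0
        set current bitmap 1
        for sy = 0 to 2
            for sx = 0 to 2
                c = point(x*3 + sx, y*3 + sy)
                r = r + rgbr(c) : g = g + rgbg(c) : b = b + rgbb(c)
            next sx
        next sy
        rem write the averaged colour back to the visible screen (bitmap 0)
        set current bitmap 0
        ink rgb(r/9, g/9, b/9), 0
        dot x, y
    next x
next y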

The difference in result is that the Radeons' FSAA blurs the ENTIRE screen, whereas the GeForce simply blurs the affected edges. This gives the Radeons a softer, more console-like look, whereas it gives the GeForce a much sharper and, to me personally, better overall image quality.
But then again, a lot of people don't agree with me about which games have good graphics and which have crap graphics.

Half-Life 2 generally has very bland and crappy graphics in my eyes, whereas Final Fantasy XI has truly breathtaking graphics.
Likewise, I think Half-Life had very little atmosphere and its graphics were too bland again, whereas Quake 2 has a superb atmosphere to it.

But then I know too many people who'd totally disagree, some because they don't like me and a good few because they honestly have a different taste and perspective.

Really, the more you look into what ATI are doing, what comes across is just stupid.
They're sacrificing the overall quality of their games and graphics for the sake of speed, and although you've probably been brainwashed into thinking that nvidia are doing the same... quite frankly you'd be a fool to think so.
These are the same people who think the Radeons' current graphics levels show no difference.

And the speed advantage that the Radeons have over the FX in DirectX 9 is not simply a case of "the FX technology is unoptimised".
It's because the Catalyst drivers actually replace DirectX 9 components on your system.

ATI haven't just optimised their drivers, they've optimised DirectX 9 for their own use - you want proof of this pudding, then check out the Dx9 libraries on your computer; the sizes differ from a normal installation. More proof is in the fact that during the Half-Life 2 scandal it was uncovered that ATI had been caught tampering with DirectX 9.0 and 8.0 information, which slipped underneath the media's attention because everyone was relieved that an ATI employee wasn't the main cause.
But the fact of the matter remains that ATI have done EVERYTHING they can... you want to know why they have to patch every single game that has a bug? It's because you're not running the Microsoft release of DirectX 9.

Sure, it has the MS front to it, but the actual libraries being used AREN'T Microsoft's.
You think it's all cool that you get your extra 5fps because of this? At the end of the day this just goes to show that the Radeons really aren't up to what everyone believes they are.

Effectively ATI have set your system up like a console and are maintaining it in such a way as to keep suspicion from them.
It is also why their graphics cards quite rightly suck under OpenGL. The FX in a lot of instances is often getting double the speed.
And for all the claims that the Radeon's new shader technology is supposed to be equal: R380 vs NV35 at id Software has shown that quite frankly there was NO competition at all. And the FX5950 just extends the gap.

ATI of course have been blaming this speed difference on CgFX, but quite frankly the speed difference has nothing to do with CgFX, because it has ZERO card-specific optimisations in it.

At the end of the day, the FX's mildly disappointing speed in OpenGL is down to unoptimised and not yet fully featured drivers; ATI however are just happy that most games being released are for DirectX, because as Jedi Academy recently showed, they're close to useless in OpenGL.

This also makes them pretty much bin fodder for anyone using Macintosh or Linux based systems.
Considering I know how many of you love Linux, I'd be interested to hear from anyone using it who has high praise for the ATI Radeon line.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 28th Oct 2003 03:10 Edited at: 28th Oct 2003 03:11
You wanted to get all technical about this; I was just explaining.
I'd suggest you save up your online time and take the time to read it - you might learn something.

[edit]
It was written in the five minutes left of my lunch break. I took the time to write it; you could at least read it.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 28th Oct 2003 04:24
'Half-Life 2 generally has very bland and crappy graphics in my eyes, whereas Final Fantasy XI has truly breathtaking graphics.'

Well hot damn, I'm sure all the respectable 3D sites will up and switch over to anime MMORPGs as their benchmarks instead of FPSs because of that!

'I took the time to write it; you could at least read it.'

And I took the time to carefully take various game snapshots at different levels of antialiasing detail... you could at least do what you said you would and make pictures of your own. Something tells me you're trying to avoid the results.

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 28th Oct 2003 04:39
Quote: "Well hot damn, I'm sure all the respectibigle 3d sites will up and switch over to anime MMORPGs as their benchmarks instead of FPSs because of that!"


how about FFXI is close to THE most graphcially demanding game on full graphics (yes even more so than HL2) ... also it currently has an Asian userbase of around 3million and the estimated US userbase is currently into to the projected 5million range due to site stats, FF sales and thier forums.

Sorry but MMORPGs make up far more of the gaming population and the fact that FFXI will require a 5th generation Shader card is going to just put proof in the pudding that it needs to be tested.

The only reason AoD isn't tested much is because there is no more official testing support and to be honest the engine is a mess.
And i'm not trying to avoid any results, you wanted me to snap a game that I don't have and didn't want to download and install.

unless your willing to come over to my place and give it to me because i'm not wasting 3hrs download time on something i'm never going to use again just to prove a point. I have better things to do.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
HardBoot
Joined: 10th Sep 2003
Location: Earth
Posted: 28th Oct 2003 04:46
Nvidia cheats on benchmarks and makes game/map-specific settings which override user settings.
Shame on you, Nvidia!
Saving up for a Radeon.
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 28th Oct 2003 05:30
'How about this: FFXI is close to THE most graphically demanding game on full settings (yes, even more so than HL2)... also, it currently has an Asian userbase of around 3 million, and the estimated US userbase is currently projected into the 5 million range based on site stats, FF sales and their forums.'

I'd like to see the source of those statistics (unless, that is, it's your own head, which I find quite likely).

'Sorry, but MMORPGs make up far more of the gaming population, and the fact that FFXI will require a 5th-generation shader card is proof enough that it needs to be tested.'

More of the gaming population than the supposed 45 million that own flight simulator, hmm? Show me ANY decent survey proving that MMORPGs make up more of the gaming population than FPS and I'll eat my bloody hat. That's ridiculous.

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 28th Oct 2003 07:00
Go to the Square and FFXI websites - not just the American ones but the Japanese too.
They've got 28 servers JUST for the States, plus another 20 spread across Asia.

Quote: "More of the gaming population than the supposed 45 million that own flight simulator, hmm? Show me ANY decent survey proving that MMORPGs make up more of the gaming population than FPS and I'll eat my bloody hat. That's ridiculous."


GameSpy are doing a special ten-week report on them; perhaps you should check it out... you might find it an interesting read.
It is why you're seeing so many launched in 2001 and 2002 - because it's becoming a truly massive market.

MMORPG servers are finding it hard to keep up. Sure, most people own an FPS and might play it online.
Most of them ALSO own and subscribe to an MMORPG or similar game, and plenty more don't own an FPS at all.

I bet you $50 that if you did a survey even on this site about the top three games people play online... a LOT more would say RPG rather than FPS.

Hell, even I would only have one FPS in my top three games online, because quite frankly FPS get boring quickly... there is only so much they can keep you entertained with before you have to buy a new one.
MMORPGs, though, are often constantly changing, and when you get bored of the actual RPG elements you often just end up using it as a fancy chatroom.

Are you willing to sit there and also argue FPS are bigger than chatrooms? That's what MMORPGs started out as, that's where they get more of their customers from... and they've been around online over three times longer than FPS, going from strength to strength.

In the last year alone, online revenues from Ultima Online - a nearly dead MMORPG, really - were 991 million! (That figure I got from the GameSpy report.)
Figure that out for yourself, as online registration costs $15/month, and that's only the legitimate users on EA's servers!

You can't argue with turnover figures like that... especially as you don't buy Ultima Online, you just download it and pay the monthly subs!


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
Eric T
Joined: 7th Apr 2003
Location: My location is where I am at this time.
Posted: 28th Oct 2003 16:30
Errrr... I usually don't like to do this, but I CAN testify to the fact that FFXI does require a GeForce 4/FX series... or Radeon 9xxx.

And it has graphically impressed me (although I only got to play it for about 15 minutes so far; it would have been 30, but 56k screwed me again).

As for it being better than HL2: from the screens and vids I have seen of HL2, it is better, but Valve has a few months to improve the graphics if they want to... so we'll have to see when it comes out.

A Dream is a Dream unless it is Real
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 28th Oct 2003 22:05
Quote: "A contradiction now..
Quote: "Whilst it is admitable that the Radeons are capable of slightly better looking FSAA at equal levels, the Radons and FX do FSAA in totally different mannors."
then you mention..
Quote: "whereas it gives the GeForce a much sharper and personally better overall image quality."
So, which one is it raven ? "

how is it a contradiction... i do prefer the sharper graphics of the GeForceFX, however the screen blend of the Radeons CAN look slightly better at times.

Note i first used the word CAPABLE, not IS

Quote: "Concerning your argument that one of ati's graphics cards stays at 98fps in 16 bit and 32 bit..... isn't it better to have consistant performance ? and who's gonna be using 16 bit anyway ?
I know the point you was trying to make, but even still you've inadvertantly pointed out an important factor.... in games the frame rates don't drop in performance as much as the geforce's do which overall contributes to 'smoother' gameplay on the ati's."


I used FPS games because they were the first that came to mind with these particular features... as I said, I'm a hardcore FPS fan, not to mention most other people understand the differences too.
And actually I do play in 16-bit when my system isn't capable of 32-bit, because on my GeForce cards I get ONE HELL of a speed increase.

In Jedi Academy, dropping from 32-bit at top graphics settings takes the speed from 35fps to almost 60fps - that's roughly a 70% speed increase!
It's all well and good unifying the speed, but the Radeons aren't capable of even coming close to the GeForce in Jedi Academy, and the GeForce performs pretty poorly; that game is close to unplayable on a Radeon. I've heard of frame rates of around 2-3 even on Radeon 9800XTs.

To you it may make more sense; to me it just goes to show that ATI's hardware is far worse.

Quote: "What about those people using nvidia's nForce2 chipsets on their mainboards ? (like me for one) Linux perform's very poorly with those. So, given that, then they would be no point whatsoever even having a geforce fx on such a system. See what I just did ? Cancelled each other out.. oh whoops ! rofl

Now something for you to ponder over raven.."

I have nForce2 400 and nForce3 Pro motherboards and I have Red Hat 9 on a dual boot. I've seen ZERO speed problems; nvidia create specific Linux/Unix/FreeBSD drivers, and Macintosh automatically updates the drivers in their OS for GeForce products.

So do you mind telling me what is wrong with yours? As I've never had a problem with nvidia hardware under Linux.
Perhaps next time you want to make up BS you might want to check the nvidia website and see what their drivers do and what their boards are capable of... over 50% of all of nvidia's customers are professionals who require nForce and Quadro FX boards for professional workstations running Linux/Unix/FreeBSD - and considering they're TOP of the field in this area, you must have the ONLY board in existence that performs badly under these OSs.

Quote: "Have a nice day with your FP16 on yer geforcefx"

Well, considering I've used FP32 and FP16 in a combined setup to some great success, I'd like to know what the hell you think you're talking about, eh?
And perhaps you might want to take a quote ACTUALLY from nvidia if you want to try and drum up stuff from them, eh.

And they're right, most users aren't going to notice the visual difference... this isn't like the difference between 16-bit int and 32-bit int - floating-point calculations work totally differently.
And the only optimisations not there yet ARE IN THE DRIVERS.
However, most good coders are actually capable of bypassing the drivers altogether, and that's what a lot of us have been doing, because what the driver layer exposes isn't even close to what the graphics card is actually capable of.

Quote: "Optimizing is good. However, when things are done with drivers that increase framerate (which optimizing is supposed to do) but also can not be used in games, clipping planes etc, it can not be classified as true optimizing, but can only be rendered a cheat to make your hardware look better. Although nVidia does this..they do not call it a cheat. they call it Optimizing. Thus when you see some of us use the word "optimize" surrounded by quotes it is just using nVidias words of what they are doing, and also a nicer word than saying CHEAT, which is what it really is!

ATI's re-arranging instruction did not effect IQ, it was a true Optimization and could be used in games. nVidia on the other hand did alot of stuff that although may not have change the IQ, would not work in a game envirionment, thus was an application specific "optimization" that should not be allowed, because it shows your hardware to be better than it actually is."


You're glossing over the reality... ATI aren't optimising, they're straight-out cheating. They've altered DirectX, causing a LOT of games to fail without constant driver updates; they've altered pipelines, which makes a lot of previously available graphical options obsolete; they've replaced a lot of shader lines with "optimised" shader lines... meaning for developers the result you've coded IS NOT what appears onscreen. They've cut a lot of the rendered triangles down to purely the visible planes.
In other words, BSP engines now run piss-poor because ATI's drivers do the triangle stripping rather than the game itself, and again this is causing aggravation.
Quite frankly ATI ARE CHEATING - they're rewriting the entire f**king API to suit their cards!
And for each game that breaks, they just wait until they've fixed enough and release a new patch. It seems like their compatibility is growing, but in reality it is THEIR fault it is so piss-poor in the first place.

Whatever you want to think about nvidia, ATI are nothing but sickening cheats whose cards' performance in older titles quite rightly shows their true speed and nature.
They're slow and useless without their optimisations; nvidia aren't.

I don't care what spin you feel like putting on this, but quite frankly the Radeons are useless without their driver 'optimisations'... and if the card was even remotely as good as they claim, they wouldn't need them.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
Magpie
Joined: 16th Jul 2003
Location: Otherland! Cos it rocks!
Posted: 28th Oct 2003 22:47
Welcome to the page for extremely long posts. In the lead is Raven, closely followed by Divide By Zero. Place your bets now.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 29th Oct 2003 00:16
Quote: "I applaud ati for being so innovative !"


INNOVATIVE?!... That's bullsh*t and you damn well know it.
If their cards cannot achieve their potential properly using the STANDARD DirectX specification, what right do they have tampering with it?
DirectX isn't OpenGL - you're NOT ALLOWED to just tinker with it; it's in the f**king EULA!
Any optimisations done within DirectX MUST be approved and then added to the next official release of DirectX... third-party improvements are BREAKING THE AGREEMENT!

Quote: "Might I recommend a radeon ? then you might be able to play in that all-elusive 32 bit mode
And don't try and tell me 16 bit is better than 32 bit again! it's plainly obvious with anyone that's played the same game in both bit depths which has the better graphics quality."


Radeon 32-bit mode DOES NOT look as good as GeForce 32-bit mode... they're sacrificing quality for speed. The only reason 32-bit mode looks better than 16-bit is because of DirectX's colour blending and alpha blending; past that, the difference comes down to colour variance, and this is why Radeon-powered games look WASHED OUT!

I don't give a damn about a few measly FPS if it means my game LOOKS and FEELS better. (Not to mention ACTUALLY DAMN WELL RUNS!)

Quote: "ATI cards are more efficient and simply ooze quality. (and you can use 32bit in games without any drop in performance) <- didn't think that needed to be said, but obviously it did "

Because they're running 16-bit quality with 32-bit blends... and as DirectX colour blends are done through MMX on the CPU, of course it means zero performance hit - it ain't clever, it's cheating, and it's the WHOLE goddamn reason they're not used for art.

Quote: "NVIDIA cards have ramped up clock speeds to 'compete' with radeons, yet you *still* using 16 bit raven !?
Geforcefx cards aren't bad, far from it....but the point I'm trying to make is ati is no.1 in the high-end graphics card market."


Even the nvidia cards have limits, and Jedi Academy takes them to it even on higher-powered systems.
But at least my FX5200 is pushing 50fps, whereas my mate's Radeon 9800XT can't even manage 5fps.

But I suppose you're right, the Radeons ARE obviously the better choice, eh - HAHAA

Quote: "The latest radeon's are for playing the latest games especially in directx 9...."

I take it someone missed the Halo and Max Payne 2 benchmark scores. Now I wonder which cards came out on top... hmmm, I wonder.

Not being funny, but you seem to have overlooked one major thing, and so have other people... nvidia make the GeForce series here; you guys have nothing bad to say about the 4 Titanium series, do you?
But when it comes to the FX, let's all chip in to kick them.

Face facts... ATI, despite ALL of their driver and API cheats, are still only beating nvidia by a small margin, and only within DirectX, and this gap appears to be shrinking every time a new Detonator/ForceWare is released.
Quite frankly it is obvious which is the better card; nvidia might seem behind at the moment... but just imagine what kind of speed they would also have if they decided to alter everything to suit their cards.

Sorry, but the Radeons are decrepit and ATI know it all too well.
They've broken every damned rule in the book and they're still only just beating the GeForce FX series.

And the GeForceGFX is in a brand new league of its own. I can bet you that within the next 12 months ForceWare will have levelled the playing field; combined with nvidia's cheaper prices, the GeForce FX ain't going anywhere, and no matter how much you want to put the card down... it is the better hardware.
ATI's new Radeon offering is gonna have to be something damn special, because honestly I can't see them lasting much longer, as they've been doing nothing but constant PR, driver updates and slight card tweaks just to keep their heads above the GeForce FX.

At every turn the FX has been there with lower prices, more stable drivers, better-featured cards and, quite frankly, a growing army of developers specifically supporting nvidia over ATI.

If ATI truly wanted to do more than just make people lose faith in the FX's abilities, then perhaps they should've thought about getting some REAL hardware to match up with nvidia's.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
AlecM
Joined: 26th Aug 2002
Location: Concord, MA
Posted: 29th Oct 2003 00:19 Edited at: 29th Oct 2003 00:22
Too bad a Radeon 9800 Pro still outperforms them. What a waste of money. 9800 Pros even outperform them in tests where FRAPS is used to benchmark.

http://www6.tomshardware.com/graphic/20031023/index.html

[P4 2.8C @ 3.03 with an 866mhz FSB:: MSI Neo-2LS running PAT:: 1gb Mushkin PC-3500 DDR High Perf level 2@ 2,2,2 :: ATI Radeon9800ProAIW :: 120Gb SeagateBarracuda 7,200RPM SATA HD :: Antec Plus1080AMG]
the_winch
Joined: 1st Feb 2003
Location: Oxford, UK
Posted: 29th Oct 2003 00:57
Quote: "INNOVATIVE?! ... Thats Bullsh*t and you damn well know it.
If they've cards cannot achieve thier potencial properly using the STANDARD DirectX Specification what the hell right do they have tampering with it.
DirectX isn't OpenGL - you NOT ALLOWED to just tinker with it, its in the f**king EULA about it!
Any optimisations done within DirectX MUST be approved and then added to the next official Release of DirectX... Third Party Improvements are BREAKING THE AGREEMENT!"


Do you have even a single link to back this up?
Microsoft don't exactly take kindly to people messing with their stuff and distributing it. Perhaps you could say which libraries they change and how they differ compared to a DirectX install on a computer with an nvidia card.

Must be hell for ATI users when there's a new version of DirectX and you have to wait for ATI to beam their version of DirectX onto your computer to get your card working at proper speed.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 29th Oct 2003 03:05
Quote: "Microsoft don't exactly take kindly to people messing with their stuff and destributing it. Perhaps you could say what libaries they change and how they differ compared to a directx install on a computer with a nvidia card."


No, Microsoft don't, which is why Windows XP Media Center 2004 is specifically designed for nvidia GeForce hardware.
It is also why DirectX 9.1 is being designed specifically with the help of nvidia; this cuts both ways, as nvidia are now also the main contributors to OpenGL as well.

When I find the link I'll post it up; it was on most of the Half-Life 2 sites when ATI were trying to clear their guy. It was further echoed by those who stole the source, and even companies have been complaining a lot more recently about such things.

Quote: "Really ? I thought it was the other way round"

Good for you, and I suppose when you've done art for close to 15 years, and five years professionally specifically in the area of 3D game art and even shader-based artwork and programming - then perhaps your opinion and reasons for it not looking washed out might mean a damn to me. At the moment I just put it down to the fact that you're blind... Hell, I'm slightly colourblind and EVEN I can see the colours are washed out, and usually it takes one hell of a lot of washout before I'll notice it without studying something.

Quote: "If the radeon's didn't run games properly then I'm sure ati wouldn't have released their cards... and if you are referring to jedi academy... well where's your proof of this ? and even if it is true that's just 1 game...one you keep mentioning to fight your argument... come up with some other BS, I'm sure you are creative enough..being an 'artiste' and all "


Why? If ATI did that they'd make zero profit. Instead they optimise their cards for games that people are likely to play and that will get media attention. Then, when people bitch that something isn't working or isn't working right, they ship out the fix in the next patch, but as it isn't an important title no one cares.
It's no different to nvidia's driver tricks to skip texture cycles in such games - well, actually it is, considering games can be patched to fix such things, whereas ATI are doing it via their own API edits.

And don't think Jedi Academy is the only game to be affected; it's simply the most accessible and publicised one. The forums were rife with close to 40 Radeon users screaming about why the hell the game was so unplayable when the demo was released.
It didn't make sense to anyone why, either; even my mate Andy at Raven Software, who programs the engine, was like "well, we've not edited much of the 3D engine except adding the additional glow and lighting routines - there shouldn't be anything wrong!"

Quote: "Notice how I can fight my argument *without* mentioning how half life 2 runs on geforce's ??? This is the first time I've mentioned it..and to be honest, it's petty and low... I prefer to talk about facts... and maybe throw in some humour along the way.. ~thumbs up~"

Perhaps because HL2 is actually the source where all of ATI's current conjecture is coming from?

If HL2 hadn't been leaked, no one would've been any the wiser about what ATI were truly doing, at least not publicly.
It has THE most optimised version of DirectX 8 & 9 packaged with it - we're talking over 1,400 additional lines of optimised code, and the new DLLs go specifically within the HL2 binary directory.
The game is sodding rigged to ONLY perform well on Radeons, nothing else!

Not the new SiS offering (which isn't that bad), not the new Parhelia, not the new GeForce FXs... it was designed SPECIFICALLY for the Radeon.
Not only was the API designed specifically for it, but the entire material system was as well!!
A system that was supposed to incorporate specifically optimised GeForce pipelines as well had ZERO such optimisations.

Quote: "Must be hell for ati users when there is a new version of directx and you have to wait for ati to beam their version of drectx onto your computer to get your card working at proper speed. "


Winch, you remember when Glide was dying and OpenGL was replacing it for the Voodoo cards? You remember how everyone had to have wrappers?
Well, check out a program called GLDirect from SciTech; it explains it one hell of a lot better. They don't need to change DX every time it updates, only for the major updates.
Think about it: the Radeon's drivers before Catalyst were very pathetic speed-wise, then when Catalyst came along they seemed to jump as far as Matrox's G400 cards did with their first drivers.
Matrox admitted they fudged up the drivers big time; ATI on the other hand were supposedly already creating the fastest drivers they could.

I'd love to hear their explanation.

Quote: "I hope nvidia's next offerings will be damn good, spanking even.. otherwise ati will lag without any competition... competition is good for us consumers "


The GeForceGFX is revolutionary - truly revolutionary. It's certainly going to change how we see notebooks as well.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 29th Oct 2003 03:57
Sorry, but this I just couldn't pass up posting:
http://www.firingsquad.com/news/newsarticle.asp?searchid=5578

Also look at this:
http://www.firingsquad.com/hardware/imagequality2/page1.asp

To me it is obvious that ATI's FSAA is better at the pissy little things, whereas the GeForce FX has better FSAA overall.
Obviously, throughout the entire article they don't point such things out, only that the FX seems to have improved - however there have been ZERO FSAA visual improvements since Detonator 38.xx, which is what makes this so amusing.

Also, take a look at the plane - look at the body, not just the turret guns... and note exactly what is said time and time again about ATI's overall image quality.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 29th Oct 2003 04:05
'To me it is obvious that ATI's FSAA is better at the pissy little things, whereas the GeForce FX has better FSAA overall.'


If you consider complete quality to be a pissy little thing, as you seem to, then yeah, I suppose that's how it works...

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 29th Oct 2003 04:06
Furthermore--

'One point to note is that we render the same image using our latest driver (CATALYST 3.8) as we do with a driver that pre-dates the release of Aquamark3 by almost six months (CATALYST 3.2)'

Explain that.

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 29th Oct 2003 05:21
Quote: "If you conisder complete quality to be a pissy little thing, as you seem to, then yeah, I suppose that's how it works..."


well if you look the ATi are better at the minor less prodominant scene Aliasing, whereas the nvidia are far better at the OVERALL picture.

Quote: "Furthermore--
'One point to note is that we render the same image using our latest driver (CATALYST 3.8) as we do with a driver that pre-dates the release of Aquamark3 by almost six months (CATALYST 3.2)'
Explain that."


I think the link to Epic's Tim Sweeney put it best:
"We don't remember talking to ATI. I don't remember them offering an explanation for their driver cheats, and we did find quite a few cheats going on within our game which we have since tried to put a stop to in the current patch."

If you want to know where the quote is from, follow the breadcrumbs from the users.
ATI stated that NO Halo cheats are there, yet Gearbox seemed to state otherwise on their own forums, hahaa.

They were caught out; they need to deal with it, man... hell, after the bashing they and Valve did at Alcatraz, quite frankly it's nice to finally see them getting some of their own treatment.

Also, I found a story on Futuremark, and this is the exact quote:
Quote: "Futuremark now has a deeper understanding of the situation and NVIDIA's optimization strategy. In the light of this, Futuremark now states that NVIDIA's driver design is an application specific optimization and not a cheat."


Interesting what just a little digging can do... and the postmark of the topic is Tue 2003-06-03, a mere 8 days after the original accusations. But it's something that seems to be overlooked whenever people claim nvidia are cheating.

There is, however, NO rescinded news about ATI's cheating. And I looked bloody hard for it.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
DMXtra
Joined: 28th Aug 2002
Location: United States
Posted: 29th Oct 2003 13:28
"sniffing white powder and Beats on Chest! Geforce FX 5950 Blows away everything! Its the greatest! Nothing can beat Nvidia... Beats on chest some more!"

Signed (GeforceFX 5950 less) Raven

Dark Basic Pro - The Bedroom Coder's Language of choice for the 21st Century.
DMXtra
Joined: 28th Aug 2002
Location: United States
Posted: 29th Oct 2003 13:30
"I'll take BUY RAVEN A CLUE for $500 ALEX"

Dark Basic Pro - The Bedroom Coder's Language of choice for the 21st Century.
Magpie
Joined: 16th Jul 2003
Location: Otherland! Cos it rocks!
Posted: 29th Oct 2003 13:52
Close behind Divide By Zero is the_winch; he's a favourite to win this race, but has he got the stamina to compete alongside the prolific Raven? Only time will tell!!!


kingius
Joined: 16th Oct 2002
Location:
Posted: 29th Oct 2003 17:43
Quote: "interesting what just a little digging can do... and the postmark of the topic is Tue 2003-06-03 a mere 8 days post the original accusations. But something that seems to be overlooked whenever people claim about nvidia's cheating."


This is indeed true and the sequence of events. However, it is pretty obvious that FutureMark simply backed down over this rather than changing their mind, and those of us who read the original points know the truth. Turning the camera around in the dev kit, which is on rails in the retail version of the benchmarker 3d mark, revealed that the nVidia cards were simply not drawing anything not normally visible - they had precalculated what triangles were present on every frame of the on rails demo and culled them at the driver level!
empty
Joined: 26th Aug 2002
Location: 3 boats down from the candy
Posted: 29th Oct 2003 18:28 Edited at: 29th Oct 2003 18:28
The statements
Quote: "my Duron processor doesn't do SIMD instructions,"

and
Quote: "the Duron only has MMX & 3DNow!2"


contradict each other unless your Duron is broken.

I awoke in a fever. The bedclothes were all soaked in sweat.
She said "You've been having a nightmare and it's not over yet"
kingius
Joined: 16th Oct 2002
Location:
Posted: 29th Oct 2003 18:50
lol

Someone doesn't know what they are talking about, yet again!
the_winch
Joined: 1st Feb 2003
Location: Oxford, UK
Posted: 29th Oct 2003 19:15 Edited at: 29th Oct 2003 19:16
Quote: "Winch you remember when Glide was dieing and OpenGL was replacing it for the Voodoo cards. You remember how everyone had to have wrappers?"


You said they changed the libaries. How exactly does a wrapper change libaries?

Quote: "Well checkout a program called GLDirect from SciTech, it explains it one hell of alot better. They don't need to change Dx everytime it updates, only the major updates."


GLDirect just translates opengl api calls to directx ones. What does this have to do with ati changing directx libaries to their own? are you saying that when you update directx on an ati comp directx doesn't update? Are all the people who play dx9 games on ati comps just using dx8 with an ati wrapper to translate all dx9 calls to dx8 ones?

Why don't you say what file are changed and let us compare between a computer with an ati card and a computer with a nvidia card and see if the files are different. If the directx libaries are different it should be obvious.
ReD_eYe
Joined: 9th Mar 2003
Location: United Kingdom
Posted: 29th Oct 2003 21:41
Seems like there are a few people here doing a lot of pointless research on old threads and other forums when they aren't ever gonna agree on anything. It's more pointless than me picking my nose clean with a cocktail stick I found under the sofa - which I'm actually doing right now, and I'm sure I'm having more fun than any of you people who are wasting their time arguing about graphics cards! Get a life or go and pick your nose, just do something other than this.
Have fun!


Go on, click on my signature image, you know you want to
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 30th Oct 2003 02:16
Quote: "The statements
Quote: "my Duron processor doesn't do SIMD instructions,"
and
Quote: "the Duron only has MMX & 3DNow!2"
contradict each other unless your Duron is broken."


How exactly are they contradictory?
AMD Duron Processor supports 3DNow!, 3DNow!Enhanced, MMX and a few other instructions that generally are used for runtime calcs.

3DNow! + 3DNow!Enhanced is 3DNow!2 ... or did someone miss their CPU lessons?

Quote: "You said they changed the libaries. How exactly does a wrapper change libaries
GLDirect just translates opengl api calls to directx ones."


THAT IS A WRAPPER, then you take the Call from one APi and the replace it with your OWN function. Both nvidia and ATi current do this already with thier FSAA routines, Quality Bars and AF routines.

This is a pretty common practise...
Effective it is like this, lets say for a section someone made a DBP program and used the #include command to include a dba and therefore needing to package it as something external to the main runtime.

in there you have this code
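(Something along these lines - just a sketch; the DrawBox name is made up for the example.)

rem box.dba - draws a filled box one pixel at a time
function DrawBox(x1, y1, x2, y2, col)
    ink col, 0
    for y = y1 to y2
        for x = x1 to x2
            dot x, y
        next x
    next y
endfunction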


Right... so what if nvidia came along and said, "hey, our cards support lock pixels, so we can make this much faster"?
So what they do is make their own version, which looks like this:
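(Again, just a sketch of the idea: the same routine, but wrapped in LOCK PIXELS / UNLOCK PIXELS so the card isn't asked to synchronise on every single dot.)

rem nvidia's swapped-in box.dba - same interface, faster path
function DrawBox(x1, y1, x2, y2, col)
    ink col, 0
    lock pixels
    for y = y1 to y2
        for x = x1 to x2
            dot x, y
        next x
    next y
    unlock pixels
endfunction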



And then they set something up in their driver along the lines of... if(program('..//thegame.exe')==WM_EXECUTE){ rename('box.dba','box1.dba'); copy('c:/nvidia/fix/box.dba','..//'); }
and when you exit the game it detects that and swaps them back.

The game player is getting a much faster game on that card; however, the changes are NOT part of the original program...
Although that's more an example of an optimisation - if, on the other hand, this optimisation were to alter the actual DirectX layer, then they're tampering at the DRIVER level.
DirectX is supposed to be left alone for compatibility reasons.

Add to this that it is neither company's right to tamper with someone else's game without permission... something that Gearbox and Epic have been VERY vocal about.

Quote: "This is indeed true and the sequence of events. However, it is pretty obvious that FutureMark simply backed down over this rather than changing their mind, and those of us who read the original points know the truth. Turning the camera around in the dev kit, which is on rails in the retail version of the benchmarker 3d mark, revealed that the nVidia cards were simply not drawing anything not normally visible - they had precalculated what triangles were present on every frame of the on rails demo and culled them at the driver level! "


Show me, because I NEVER saw any of these so-called cheats... and in fact none of my mates who also have GeForce FXs saw any of these problems either.
There is a visual difference between the 320 and 330 versions of 3DMark03; however, the scores on the demo actually went UP between the versions.

Quote: "Notice something there ? nvidia lowered the quality to boost speed....it's what I've been saying all along."

Quote the whole damn thing: the quality loss was because the texture sharpening bug was FIXED... and for 4-5fps, yes, I would bitch about loss of quality; for up to 30fps, THAT is a different story. That is the difference between a game being playable and unplayable!

The second quote is from the benchmarked speeds of Doom 3's Pixel Shader 2.0 routines, and considering the 9600 Pro beats the FX5600 Ultra in every standalone DirectX test, it doesn't prove a thing; in most cases the cards under OpenGL DO produce up to 2x the speed, and pixel shaders are a very small part of that.
Vertex shader wise they're actually still a LOT faster...

Just goes to show how you can take stuff out of context just to make someone look bad, eh.

As for the next comments, the HLSL shader structure was DEVELOPED primarily by nvidia... the comment wasn't to say that they'd have placed optimisations within it, simply that considering THEY made the specification, for their cards to be performing so badly with it was a mystery. (Not a mystery anymore, however!)

As for DirectX 9.1 - nvidia are the leaders in graphics technology breakthroughs, and thanks to Creative and IBM they're at the forefront of motherboard and media technologies as well.
After what ATI have been doing, they're lucky not to be banned from developing for DirectX 9.1.

Quite simply put... nvidia are THE most valuable graphics solution developers in the world. What actual optimisations there are towards their cards is pretty much unknown; it might have a lot more to do with the new structure of the FX-2 processor.

--- You want me to go point by point through what's wrong with your arguments?

Firstly, on the MMX use: ATI are a hardware developer, and IT IS NOT THEIR CALL to choose what the software developer SHOULD use. What the hell will they demand next, that everyone uses ONLY their ATI-approved engine?
If the hardware is incapable of even simple colour blending, then what the hell kind of visual processing unit IS IT??

On the point of HL2, yes... I still truly believe that the graphics ARE BLAND. Most shader-based games are very bland; a lot of developers are trading in textures for shader materials... which would be okay if they knew how to bloody use them.
But they don't - sure, they can create nice visuals, but shaders are capable of creating WOW-style visuals.

Put Half-Life 2's graphics up against the graphics of Mother Nature from 3DMark03 - which is better? I know they're totally different types of game, but c'mon, which is BETTER?

Mother Nature in 3DMark03 shows how shaders SHOULD be used: as a way to enhance the visuals within a game... not like Battle of Proxycon, which relies heavily on shaders, and not very good shaders at that.
Visually, artists and programmers have found a new toy and they don't have a bloody clue how to use it.

-- -- --

I'd recommend you actually go back and reread what you think you've read, because I've not contradicted myself ANYWHERE; you've simply tried to use what I've said against me, and to be honest you've done one hell of a piss-poor job.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
empty
Joined: 26th Aug 2002
Location: 3 boats down from the candy
Posted: 30th Oct 2003 13:30 Edited at: 30th Oct 2003 13:30
Quote: "How exactly are they contradictory?"

MMX = SIMD, 3DNow = SIMD, so if a CPU has MMX or 3DNow it can do SIMD.


Quote: "AMD Duron Processor supports 3DNow!, 3DNow!Enhanced, MMX and a few other instructions that generally are use for runtime calcs.
3DNow includes"

Durons support MMX and 3DNow and those with the Morgan core (>= 1GHz) also SSE.


Quote: "3DNow! + 3DNow!Enhanced is 3DNow!2"

Enhanced 3DNow includes all 3DNow instructions.


Quote: "or did someone miss their CPU lessons?"

Yup, you did.

I awoke in a fever. The bedclothes were all soaked in sweat.
She said "You've been having a nightmare and it's not over yet"
kingius
Joined: 16th Oct 2002
Location:
Posted: 30th Oct 2003 15:03
Quote: "Quote: "or did someone miss their CPU lessons?"
Yup, you did."


Pretty basic ones, at that

Quote: "THAT IS A WRAPPER, then you take the Call from one APi and the replace it with your OWN function"


That isnt what a wrapper does at all, a wrapper sits above the API and simplifies things so you dont have to work with the "bare metal" of the API. It is not a replacement, since it is using the underlying API, its a simplification.
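In DBP terms, the sort of wrapper I mean might look like this (just a sketch with a made-up helper name) - it sits on top of LOAD IMAGE and simplifies the call, but the real command underneath still does all the work:

rem a wrapper in this sense: simplifies the call, still uses the real command underneath
function LoadImageSafe(file$, id)
    if file exist(file$) = 1
        load image file$, id
    else
        print "missing image: " + file$
    endif
endfunction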
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 30th Oct 2003 16:31 Edited at: 30th Oct 2003 16:32
Quote: "MMX = SIMD, 3DNow = SIMD, so if a CPU has MMX or 3DNow it can do SIMD"


Wrong... SIMD, Streaming Instructions Multiplication & Division, are specific extensions found ONLY in Pentium III, Pentium 4 and Athlon XP (plus later variant) processors that allow far faster manipulation of data in multiplication and division, as normally these run at around 10% of the speed of all the other mathematical operations.

MMX are the 16 & 32bit Colour Registries and Ops
3DNow! are 32bit Colour, 3D Enhancements (Transform & Lighting Optimisations) and Multimedia Extensions for Streaming Video

Do your goddamn homework before you even think about trying to show me up again, empty, as your attitude and your information on how things work is getting very annoying and very old quickly.

Quote: "That isnt what a wrapper does at all, a wrapper sits above the API and simplifies things so you dont have to work with the "bare metal" of the API. It is not a replacement, since it is using the underlying API, its a simplification"


What the f**k do you think it is called a Wrapper for?
If what your saying were to be true then it would be called Simplifer or something stupid... it is called a WRAPPER, because it takes given instructions and uses its OWN version of them, ERGO WRAPPING/BENDING THEM TO THIER GRAPHICS SETUP.

I mean FFS man, you have Glide Wrappers to convert Glide to OpenGL and DirectX... GLIDE is the SIMPLEST VARIATION ON OPENGL THERE IS!
How in that tiney brain of yours could you possible think that you would be SIMPLIFYING a SIMPLE API call into a COMPLEX one for OpenGLs use??

Christ are all of you born stupid or is it something you've worked years at... Wrappers don't sit ON the API, they are there to REPLACE the runtime.
They have f**kall to do with the API, the API is just used to give you access (Application/Abstract Programming Interface... INTERFACE!) to the Runtime ... i swear you guy don't have a sodding clue, not even a little one.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 30th Oct 2003 17:33
Hahaa... oh well, I'm sure I'm the one hanging myself.
:: slaps his forehead ::
How stupid of me, I forgot. You are the bigger man, aren't you?
Damn, I should've known I'd lost with that truly awe[fill in the blank] quoting post.
I mean, how on earth could I ever have answered all of those attacks on me that were so painstakingly thought out and researched.

Oh yeah, not to mention that that was of course the FINAL word in this argument, because you did quit it after all.
And you're just here now to lend moral support, right?

:: grins :: OK, today is like shooting fish in a barrel with a 12-gauge... c'mon, if you truly want to get into some flame war with me and come away smelling like roses, then get some lessons from a master at it. Then come back to challenge me again.
Otherwise just don't bother, man, it's pathetic.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 30th Oct 2003 17:35
Quote: "MMX, SSE, 3DNow!, AltiVec, etc. are all acronyms for SIMD instruction sets

it kinda goes a little something like this..


Quote: "An increasing number of newer processors come with extensions to their instruction set commonly referred to as SIMD instructions. SIMD stands for Single Instruction Multiple Data meaning that a single instruction, such as "add", operates on a number of data items in parallel. A typical SIMD instruction for example will add 8 16 bit values in parallel. Obviously this can increase execution speed dramatically which is why these instruction set extensions were created."

Now raven, go actually learn something.... oh look! here's a link for you about SIMD instruction sets (MMX, SSE, 3DNow! etc)
http://www.hayestechnologies.com/en/techsimd.htm"


Check the AMD or Intel sites, Einstein.


To Survive You Must Evolve... This Time Van Will Not Escape His Fate!
the_winch
Joined: 1st Feb 2003
Location: Oxford, UK
Posted: 30th Oct 2003 18:21
Raven said

Quote: "ATI haven't just optimised thier drivers, they've optimised DirectX9 for thier own use - you want proof in this pudding then checkout your Dx9 library on your computer, the sizes from a normal installation are different. More proof is in the fact that during the Half-Life2 scandal it was uncovered that ATI were caught tampering with DirectX9.0 and 8.0 information which slipped out underneath the media's attention because they were all relieved that an ATI employee wasn't the main cause.
But the fact of the matter remains that ATI have done EVERYTHING they can... you want to know why they have to patch everysingle game that has a bug? It's because your not running the mircosoft release of DirectX9."


Say what directx files are differnt on an comp with an ATI card so people can check there systems against comps without ATI cards and verify this to be true.

I don't think what you are saying is true, I think it is stuff made up in your mind so you can bad mouth ati.

If you can provide evidence that you are right and I am wrong then I will admit I am wrong, I would presume if you can't provide evidence then you will do the same.
kingius
Joined: 16th Oct 2002
Location:
Posted: 30th Oct 2003 18:51
Quote: "What the f**k do you think it is called a Wrapper for?
If what your saying were to be true then it would be called Simplifer or something stupid... it is called a WRAPPER, because it takes given instructions and uses its OWN version of them, ERGO WRAPPING/BENDING THEM TO THIER GRAPHICS SETUP.

I mean FFS man, you have Glide Wrappers to convert Glide to OpenGL and DirectX... GLIDE is the SIMPLEST VARIATION ON OPENGL THERE IS!"


The glide wrapper in your example is sitting between the application and the driver and simply remapping calls to open gl or directX. It isnt doing anything other than this. This is the only type of wrapper where what you're definition applies and it is the exception, but you know that, youve been a games developer for 15 years, right?

Most wrappers are simplifiers and not replacing API calls with their own versions at all. They are seperate objects with an API of their own.

Also, about SIMD... MMX is SIMD. The more you say, the more apparant it is that you dont know what you are on about!
empty
Joined: 26th Aug 2002
Location: 3 boats down from the candy
Posted: 31st Oct 2003 00:35 Edited at: 31st Oct 2003 00:37
Indeed, as posted before by Divide By Zero, SIMD stands for Single Instruction Multiple Data.
There is also Streaming SIMD (like SSE), but that's a different story.


Quote: "MMX are the 16 & 32bit Colour Registries and Ops
3DNow! are 32bit Colour, 3D Enhancements (Transform & Lighting Optimisations) and Multimedia Extensions for Streaming Video"

This is absolute rubbish.
MMX is able to process multiple integer values (e.g. four words or eight bytes) with one instruction. That is done by utilising the floating-point stack (!). Therefore you can't mix FP and MMX operations, and you must call EMMS (which is pretty slow) to clean up the stack after MMX operations and before floating-point operations. 3DNow! additionally adds SIMD support for floating-point values. AMD processors also introduced a faster EMMS command (FEMMS).


Quote: "do you god damn homework before you even think about trying to show me up again empty as your attitude and information into how things work is getting very annoying and very old quickly."

here, here.
I though it was you who knows everything. At least you act as if you do. Unfortunately you don't but that's your problem not mine.


Quote: "check the AMD or Intel sites Einstien "

Although a quote from you that's my advice for you.

http://www.amd.com/us-en/assets/content_type/white_papers_and_tech_docs/22621.pdf
http://www.intel.com/cd/ids/developer/asmo-na/eng/catalog/codeoptimization/mmx/index.htm

I awoke in a fever. The bedclothes were all soaked in sweat.
She said "You've been having a nightmare and it's not over yet"

