Geek Culture / [LOCKED] Half-Life2 News [update]

Author
Message
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 13th Oct 2003 17:30
I've been keeping pretty close to this over the past few weeks ... I'm actually awaiting a lot of confirmation about stuff I've personally noticed in the source, but for obvious reasons I don't want to post it.

Quote: "
ATI Leak?
- Submitted by Nukem at 11:33:59 PM - Comments (0)


There is now speculation that the leak was potentially master-minded by ATi's Guennada Riguer, one of their software engineers from Canada due to his name appearing in the leaked Half Life 2 source code!

The folks over at The Inquirer have put together a report about this issue, which can be found here, that says the following:

Quote:

--------------------------------------------------------------------------------
Huddy said: "There's no ATI specific parts [in the code]," he said. "They're all just paths for DirectX9".

He said that ATI had taken a conscious decision just to write to the DirectX 9 specification, without adding anything else to the mix.

--------------------------------------------------------------------------------


ATI say there is no significance to this find as the firm has helped to optimize the code so that the game runs properly on hardware.

I'll keep you updated with updates regarding the recent source leak and hopefully get the chance to officially announce the Hacker who caused such havoc!
"


... Also, apparently as of 9:51pm PST on 12th October the RMF files from the source have been leaked by the hacker and are currently spreading like wildfire. There is also now a playable beta up and running, along with a test map.

I can't post links, and if you ask me for them I will not tell you where they are - again, for obvious reasons.
This beta is of one of the multiplayer levels, and it certainly does bring into question whether the E3 demo was faked for the community.
However, with each step a lot of people take into the dark side to find out the truth, all that happens is that more questions are uncovered.

But one HUGE question has been answered recently that has a lot of the Californian nVidia guys up in arms.

Eric T
21
Years of Service
User Offline
Joined: 7th Apr 2003
Location: My location is where I am at this time.
Posted: 13th Oct 2003 17:35 Edited at: 13th Oct 2003 17:36
Quote: "ATI Leak?
- Submitted by Nukem at 11:33:59 PM - Comments (0)


There is now speculation that the leak was potentially master-minded by ATi's Guennada Riguer, one of their software engineers from Canada due to his name appearing in the leaked Half Life 2 source code!

The folks over at The Inquirer have put together a report about this issue, which can be found here, that says the following:

Quote:

--------------------------------------------------------------------------------
Huddy said: "There's no ATI specific parts [in the code]," he said. "They're all just paths for DirectX9".

He said that ATI had taken a conscious decision just to write to the DirectX 9 specification, without adding anything else to the mix.

--------------------------------------------------------------------------------


ATI say there is no significance to this find as the firm has helped to optimize the code so that the game runs properly on hardware.

I'll keep you updated with updates regarding the recent source leak and hopefully get the chance to officially announce the Hacker who caused such havoc!
"





So lemme get this straight....

ATI masterminded the plot to thieve the source of HL2, then stole the RMF files, followed by a mass release of the game on the net. But what was their motive??? Because the game WASN'T SPECIFICALLY MADE FOR THEM?



Seems a lil far-fetched to me.

A Dream is a Dream unless it is Real
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 13th Oct 2003 17:44
Nope, that's what people want you guys to believe... what has the nVidia guys up in arms is the fact that ATI have inadvertently announced that they've "optimised" DirectX 9 for Valve.

Although this can help cast the blame away from Riguer, the fact remains that Half-Life 2 was planned to ship with an altered version of DirectX 9 as its reference.
When you look at the benchmark tests that were done, the cogs suddenly go **CLICK** ... and it might also explain why the latest DetonatorFX 51.75 drivers were not allowed to be tested instead of the 45.23s.

Quite frankly, that isn't just low - that is an unforgivable move, and it would also go a LONG way towards explaining why nVidia have been kept in the dark about HL2.
The jokes made by Gabe at Alcatraz are just sickening ... an even bigger insult is that the shaders also have Radeon-specific speed updates.

It was believed that Valve and ATi were trying to royally screw nVidia, especially after the benchmarks - and evidence like this just goes to help prove the case.

CattleRustler
Retired Moderator
21
Years of Service
User Offline
Joined: 8th Aug 2003
Location: case modding at overclock.net
Posted: 13th Oct 2003 19:47
I don't understand any of this. It seems silly that Valve would attempt (with the help of ATI) to screw over nVidia. In screwing over nVidia customers, Valve would be screwing themselves, considering the number of people using nVidia cards.

-RUST-
lagmaster
22
Years of Service
User Offline
Joined: 26th Aug 2002
Playing:
Posted: 13th Oct 2003 20:27
Yeah, I read that a few days ago. Quite shocking.

The card wars continue.

lagmasteruk - [url]www.lagmaster.net[/url] is alive! r.nash@ntlworld.com

Dark Snippet Pro V9 is out!!
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 13th Oct 2003 20:46
How, CattleRustler?
Their engine is designed to run on a reported
800MHz CPU / 48MB RAM / 8MB+ DX6 card / DirectX 6 / Win95

which means you'll be able to play it, just at a lower graphics level ... and most people will want to play it the way it looks on the box and as shown at E3. And so they'll upgrade to an ATI card.
Any way you look at this situation, Valve always come out on top financially.

Ian T
22
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 13th Oct 2003 21:44
Ludicrous... the idea that they would leak the game that was the main advertising for the 9800 Pro (they had ads for the September release date, a page about it, everything) is just insane.

Sounds to me like nVidia is trying to tag the blame on their main enemy... that, now, makes PERFECT sense.

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 13th Oct 2003 21:57
Quote: "Sounds to me like nVidia is trying to tag the blame on their main enemy... that, now, makes PERFECT sense."


This would be true if it hadn't first been brought to light as a question on a forum asking why this guy's name appeared in the DX9 source,
and if there weren't also an official ATI statement on their site to clear this man from the suspects list.

I mean, apart from all that, I suppose yeah, it must be an nVidia plot hahaa.
Even if nVidia are behind the leak (which I highly doubt), the fact of the matter is Valve and ATI have single-handedly destroyed nVidia's 3D reputation within a matter of 3 months. Although it was already shaky because of the controversy ... these guys have truly dealt them a blow, because of the sheer scale of how many people have seen this and now truly believe that nVidia make bad cards.

But nVidia have always been a pretty friendly corp, so I don't see them doing that.
Hell, they're sharing almost all of their technology with the entire industry, and they're the ones who created the shader standards being used today. They could quite easily have kept it an nVidia-only feature.

Eric T
21
Years of Service
User Offline
Joined: 7th Apr 2003
Location: My location is where I am at this time.
Posted: 13th Oct 2003 22:27
All I know is that the past few months have been a war between ATI and nVidia... trying to get the big score on the big games...

Anything after that, I am lost...


(P.S) at this point, it looks as if DN:F may come out first

A Dream is a Dream unless it is Real
Wik
21
Years of Service
User Offline
Joined: 21st May 2003
Location: CT, United States
Posted: 14th Oct 2003 04:53
Raven,
What do you have against ATI?
What did they ever do to you?

I would post my opinion on which card is better, and the reasons, but it would probably start a flame war.

*cough*at*cough*i

The rock has rolled!
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 14th Oct 2003 07:33
I have several reasons:
a) Their hardware support for developers is just appalling unless you have A LOT of money; unless you're a high-profile company they just don't give a damn about you.
b) Their hardware is one step behind; even IF the GeForce were a lot slower, as seems to be the current online claim ... ATi are about as innovative with their hardware as a slice of bread.
c) They have constantly attacked competitors' cards that have threatened their own. This has happened with Videologic's Kyro II card, Matrox's G400 & Parhelia cards and, most high profile, the GeForceFX. When they can't win technically, ATI go to town trying to destroy the competition in a slander match.
d) Their cards are INCAPABLE of winning on their own merits. Although the Radeon series is probably the most solid gaming card they've produced, they were very mediocre cards until the online press started praising them for some God-unknown reason. And quite frankly it pisses me off to see credit given where it is not due.

There are just countless reasons I dislike ATI and their current Radeon line ... and this latest little stunt actually goes to show just how damn far they are willing to go to discredit their competitors.
Not to mention it was ATI who raised the red flag about nVidia's so-called driver cheats, and everyone was like "ohh, nVidia have been naughty, but ATI seem to be playing by the rules".

Which again has recently been proven to be totally bull, because ATi's drivers are totally cheating the system...
Textures are loaded, stored and processed in 16bpp;
shader floating-point full precision runs using compressed 24-bit even though the card is capable of 32-bit...
They utilise the system RAM to compile and cache pixel programs; this so-called amazing F-Buffer is no more than a simple driver hack ... even their XT range can still only handle 230 lines of code buffered on-card, and only a mere 160 lines static, 65,320 dynamic vertex lines.

Compare that to the GeForceFXs, which can handle 1,024 pixel code lines and 255 static vertex, 65,535 dynamic... then add to this that the floating-point and integer architectures of both cards are different.
The FX is int128 / float128 / colour128, whereas the Radeons are int64 / float96 / colour128 (even the XTs).
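(Side note for anyone who would rather check figures like these than argue about them: the shader version and instruction-slot limits a card actually exposes can be read from the DirectX 9 caps structure. This is only a minimal sketch, assuming the DX9 SDK headers and the default adapter; the numbers it prints are whatever the installed driver reports, which may not match either side's marketing.)

// caps_check.cpp - print the shader limits the driver actually reports.
// Build (MSVC): cl caps_check.cpp d3d9.lib
#include <d3d9.h>
#include <stdio.h>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { printf("Direct3D 9 not available\n"); return 1; }

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        printf("GetDeviceCaps failed\n");
        d3d->Release();
        return 1;
    }

    printf("Pixel shader version : %u.%u\n",
           (unsigned)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
           (unsigned)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
    printf("Vertex shader version: %u.%u\n",
           (unsigned)D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
           (unsigned)D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion));

    // PS 2.0 extended caps: instruction slots and temp registers exposed by the driver.
    printf("PS 2.0 instruction slots: %d\n", caps.PS20Caps.NumInstructionSlots);
    printf("PS 2.0 temp registers   : %d\n", caps.PS20Caps.NumTemps);
    printf("Max vertex shader constants: %u\n", (unsigned)caps.MaxVertexShaderConst);

    d3d->Release();
    return 0;
}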

At the end of the day there is no conclusive evidence to truly show that the Radeons are actually any faster; they certainly as hell lack A LOT of the features, they're pretty much hated to develop for, and they're not recognised throughout the industry.
Quite frankly they're a mediocre card with bloody good PR.

If their technology was even half as good as they claim it to be, then it would have been able to show, performance-wise, that it is superior. The kicker is that on their website you read the top bar and they proclaim they're the BEST graphics hardware supplier.
When I've seen NO evidence that their cards are the best, and they certainly weren't recognised as such this year or last by any of the industry's top awards.
Their pure arrogance and tabloid-style PR quite frankly is totally uncalled for, disrespectful and just downright f**ked up!

nVidia have done nothing to ATI other than be their competitor...
All they've tried to do is show off their card as best they can; they haven't gone out of their way to damage their competitor - and considering they produce the ONLY shader compiler capable of OpenGL development that is used on the consoles, they could very, very easily do so for this next generation and for Linux games.
nVidia have given the graphics industry everything, and ATI are the only manufacturer to sit there, spit what was given right back in nVidia's face and demand more.

Makes me sick.

Preston C
21
Years of Service
User Offline
Joined: 16th May 2003
Location: Penn State University Park
Posted: 14th Oct 2003 14:13
Quote: "d) they're cards are INCAPABLE of winning on thier own merits. Although the Radeon series is probably the most solid gaming card they've produced, they were very mediocre cards until the online press started praising them for some god unknown reason. And quite frankly it pisses me off to see credit given where it is not due.
"


God unknown? They probably bribed the media with a whole lotta cash.


Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 14th Oct 2003 15:37
You know what I find amusing about it all:
FireGL vs QuadroFX

It is no secret which is the best card ... hell, the Quadro is currently the ONLY serious card for both home and business use.
Although the Wildcat VP and TridentGLs are good - they're not even close to the Quadro in performance at the top end.

But the actually amusing thing is that as a high-performance card the FireGL really isn't given a second glance, because it is a) WAAAY too expensive and b) it's slow... V-E-R-Y slow.
And the baffling thing is that the GL4 and GL5 are based on their Radeon processor, just like the Quadro is based on the FX processor.

So how come their performance isn't just a little different but quite frankly leagues apart, yet their gaming cards are only minorly different and the situation is reversed?
To me it makes no sense at all.

kingius
22
Years of Service
User Offline
Joined: 16th Oct 2002
Location:
Posted: 14th Oct 2003 15:55
Quote: "But the actual amusing thing is that as a high performance card the FireGL really isn't given a second glance because it is a)WAAAY to expensive and b)its slow... V-E-R-Y Slow"


Typical nonsense from someone who doesn't know anything about what he's talking about. FYI, FireGL cards are supplied with many serious 3D workstations and are highly respected in the CG business.
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 14th Oct 2003 16:41
And yet the most widely used cards are the Quadro XGL and QuadroFX ... the nForce3 chipset is quite rapidly replacing a lot of the older machines to allow for Opteron processors, over the slower and useless Itanium 2.
Also, if you bothered to check out the SolidWorks benchmarks, you'd see just HOW far behind the FireGL is.

The Wildcat VP is one of the next best cards currently available; the FireGL is quite outright an overpriced pile of crap ... and the only reason it is actually bundled with workstation machines is because ATI give deals which make the GLs cheaper in bulk.

Add to this that there are NO, ZERO, ZILCH renderfarms available with FireGL cards - which means that although a workstation might (and this is unlikely) use a FireGL for prototyping, when it comes to the actual rendering gruntwork you have something completely different doing the work.

Currently Industrial Light & Magic, Pixar, ForceLance, Visual Dreams, DreamWorks, Square Enix (both US and Japan), etc... ALL use Quadro-based graphics solutions.

Quite frankly the FireGL is the price it is because they just don't shift enough to sell it cheaper, and considering there is no FireGL 6 planned, it looks very much like ATI are set to drop, or at least heavily cut back on, their professional industry solutions.

kingius
22
Years of Service
User Offline
Joined: 16th Oct 2002
Location:
Posted: 14th Oct 2003 18:18
The more you say, the less you prove you actually know.

Quote: "Add to this that there are NO, ZERO, ZILCH renderfarms available with FireGL cards "


You don't even need a 3D graphics card on any of the machines in a renderfarm in order to render anything, because the graphics card isn't doing anything! Rendering is done with CPUs, not graphics cards!

Graphics cards are for workstations, which require real-time 3D rendering while building and tweaking a 3D scene.

Your problem is that you know a little (3D games require a 3D card) and now you think you know a lot. Renderfarms are NOT workstations!
Ian T
22
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 14th Oct 2003 19:29
Raven, your words would be far better spent writing a 5000-page article for your website on how ATI is run by the devil and nVidia is almighty. I'm sure you'd get it done in a week.

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 15th Oct 2003 00:02
Renderfarms work on a synced combination of graphics and processors... graphics processing units are specifically set up to calculate the complex surfaces that standard processors are not capable of.

Within your typical renderfarm box you will have 16 memory slots with 2 banks per slot, 4 processor slots and 7 graphics slots... this can then be added to another renderfarm box for additional power.
The design is typical of a server/workstation-style, hard-disk-independent system able to be controlled through single network connection points.

The cards used within a renderfarm must be capable of SLI technology, which most professional-solution cards are (the FireGL is one of the exceptions, which makes it useless for them) ... rather than using the DVI connection point for the multiple cards, the motherboards have the AAGP slots (which is where the extra 32 pins at the end of your AGP card go; only the professional-solution cards have them... home users will recognise the area though, as it is still incorporated so the cards can be used on such boards).

Kingius, if you want to sit there and claim you know this stuff then fine, try to - but I've worked with this hardware for over 5 years now, I'm fully qualified to actually build such machines, and I actually have a very basic model at home for my professional use.

I'd strongly suggest you find out the difference between a processor farm and a renderfarm - as they are VERY, VERY different indeed, especially in speed.

A 16x Itanium 2 2.8GHz HT processor farm w/32GB 400MHz RAM is only capable of rendering at around 10% of the speed of a 7x QuadroFX 3000G, 4x Opteron 820 renderfarm w/32GB 400MHz RAM.

The main difference being that the QuadroFX 3000G is capable of computing 330 million triangles/sec per GPU (and it contains 4), and that is including the shader pipeline loads ... whereas the Itanium 2 processors are only capable of 260 million triangles/sec, and that is not even including the shader/texel/pixel pipelines, which have to be software rendered - which means a lot of converting and hard processing - whereas on the GPU they're specifically set up for this task and it takes a single pipeline out of 64, tying up NO additional resources.
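(Back-of-envelope only: the quoted per-unit rates above can simply be multiplied out. The sketch below does that for the raw geometry figures alone - the post attributes the rest of the claimed gap to shader and texel work falling back to software on the CPUs, which is not modelled here, and every constant is taken from the post rather than from any measurement.)

// throughput_estimate.cpp - multiply out the quoted peak triangle rates.
#include <stdio.h>

int main()
{
    const double quadro_tris_per_gpu  = 330e6; // quoted: 330 million tris/sec per GPU
    const double gpus_per_card        = 4.0;   // quoted: 4 GPUs per QuadroFX 3000G
    const double cards_in_farm        = 7.0;   // quoted: 7 graphics slots per box

    const double itanium_tris_per_cpu = 260e6; // quoted: 260 million tris/sec per Itanium 2
    const double cpus_in_farm         = 16.0;  // quoted: 16-way Itanium 2 system

    double gpu_total = quadro_tris_per_gpu * gpus_per_card * cards_in_farm;
    double cpu_total = itanium_tris_per_cpu * cpus_in_farm;

    printf("GPU box peak geometry rate: %.2f billion tris/sec\n", gpu_total / 1e9);
    printf("CPU box peak geometry rate: %.2f billion tris/sec\n", cpu_total / 1e9);
    printf("CPU box as a fraction of the GPU box: %.0f%%\n", 100.0 * cpu_total / gpu_total);
    return 0;
}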
Although LightWave/3D Studio Max/trueSpace's primary renderers do not take advantage of the graphics card's increased rendering, Maya, Softimage and Houdini's Mental Ray CAN ... as well as Messiah, RenderMan & Brazil also being able to take FULL advantage.

Although I'm sure, being so knowledgeable, you already knew and understood all of this, right?

Ian T
22
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 15th Oct 2003 00:09
Raven... all of that is true... but his point is still completely valid. You don't need a renderfarm to render anything.

(That's not saying [you don't need a renderfarm to render] [anything], that's [you don't need a render farm to] [render anything])

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 15th Oct 2003 03:50
No, of course you don't... but I'm sure any of the 3D artists here will be quite happy to tell you that when scenes get complex, particularly multipass scenes such as Softimage|XSI and Maya produce, a single card within a single workstation just doesn't cut it.

A good example would be Pixar's "Finding Nemo": if all of that had been rendered, even in multiple scene passes, on a standard workstation just using the CPU without GPU speed enhancements - dual Itanium 2 or Opteron 8-series 3.0GHz ... you would be looking at a good 3-4hrs rendering time per pass.
As scenes like that are made up of several passes, such as models - lighting - background - depth - effects, etc...

you'd be talking in the region of a week just to render a 5-minute scene of the movie, whereas a renderfarm would have it done within a matter of 3-4hrs.
That doesn't just save money in man-hours, but also in power consumption.

Even setting the renderfarm issue aside, it still doesn't mean that the FireGL has anywhere close to the power the QuadroFX has... hell, the workstations might come with FireGLs as standard, but there are quite a few people here who can tell you for nowt that when they work on a machine, it's a sure-lock bet that it has a Quadro inside it.

The machine I work on has 2 Quadros inside it; the entire office I work in is fully Quadro-equipped ... the only ATi cards even in the building are on the testing/debug systems.

It isn't just pure power that makes the FireGLs a waste of time though, it's the Macintosh, Linux & Solaris support... or rather the lack of it. Yes, it can use standard OpenGL - however, standard VESA/Mesa-based OpenGL doesn't even come close to comparing to a specific OpenGL/MGL driver for that operating system.
Hell, you might as well throw in an SiS card without 100% support for the given OS... and honestly, I don't see many Windows-based systems in the workplace when it comes to graphical editing.

This area is mostly SGI, Mac and IBM workstations ... with good reason, too.

Wik
21
Years of Service
User Offline
Joined: 21st May 2003
Location: CT, United States
Posted: 15th Oct 2003 05:21
Oh jeez,
big long posts from Raven... not a good sign.

I started a flame war.



The rock has rolled!
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 15th Oct 2003 11:35
Nope, just a few users being totally bloody-minded and trying to show me up as usual ... kinda pathetic if you ask me.

kingius
22
Years of Service
User Offline
Joined: 16th Oct 2002
Location:
Posted: 15th Oct 2003 14:17
Raven, you are just showing yourself up with anti-ATI nonsense.
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 15th Oct 2003 15:46
Not really; it's a fact that the FireGLs are just not cut out for industry use ... a lot of companies own them, however I don't remember a single house that actually uses them.

Hell, I own a hedge trimmer; doesn't mean I'm ever going to do the gardening.
You want proof of what kind of a card it is... go to the SolidWorks website, SGI's, or even Softimage|XSI's.

Up until around 2 years ago, the Wildcat, and before that the 3DOxygen series, were THE BEST cards on the market. 3Dlabs is still producing outstanding cards ... but quite frankly the FireGL has been nothing more than overinflated features.

Wow, an amazing 500MHz running 8 geometry pipelines, which sounds fantastic against a 350MHz card running 4 geometry pipelines, right?
The difference being that the FireGL can only calculate triangles on those pipelines - everything has to be converted - whereas the 3DOxygen2 was capable of 32-point n-gons.
Just because something seems better on the surface doesn't make it so underneath all the PR bullcrap.

And at the end of the day this is EXACTLY why FireGLs are not even close to the top of an artist's wish list.

mog_squad
21
Years of Service
User Offline
Joined: 22nd Sep 2003
Location:
Posted: 16th Oct 2003 06:35
.....
Who's navida...
Ah, my brain just broke... I never knew games were this complicated...
I'm never gonna learn to program, I'm so sad...

*Runs back into a corner of the room and starts writing again...*

DONT @$#%#@$% MESS WITH ME! I'll kupo your @$$!
Eric T
21
Years of Service
User Offline
Joined: 7th Apr 2003
Location: My location is where I am at this time.
Posted: 16th Oct 2003 08:15
nVidia is a major player in the video card war (heheheh, I just noticed this is a PCI card, not AGP)

It is rare that I agree with Raven on anything, but when it comes to ATI vs. nVidia, I have to go with nVidia - only because of my personal experiences, and because nVidia is hella farther ahead in tech than ATI.

A Dream is a Dream unless it is Real
Martyn Pittuck
22
Years of Service
User Offline
Joined: 27th Aug 2002
Location: United Kingdom
Posted: 16th Oct 2003 10:44


Web Design Starting from $200. Special limited offer. MSN or Email me for more information.
Van B
Moderator
22
Years of Service
User Offline
Joined: 8th Oct 2002
Location: Sunnyvale
Posted: 16th Oct 2003 11:42
So I'm your average PC user and I want a new card - I'm not spending a great deal of money because my PC is only worth about £200. What graphics card would you guys recommend for a £50 budget?


Van-B

Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 16th Oct 2003 12:02
The Creative GeForceFX 5200 64MB is £46 from Watford Electronics, which is a powerful version of the FX5200.
There is also the MSI GeForceFX 5200 for around £45, which has 2x more RAM but never seems to be in stock...

At such a low price you're not really left with a lot of options tbh, but the FX5200 is pretty much the fastest, best supported & most fully featured in that class.
If you were to spend an extra £50 it would become a toss-up between the FX5600 Ultra and the 9600 Pro though, because you can get either of those for just under £100, and the speed jump between the brackets is just unthinkably huge.

The FX5200 is roughly 2x faster than your average GeForce2/4MX (no surprise there) & the 9200 Pro is as fast as the FX5200.
The FX5200 Ultra is around 1.25x faster than the non-Ultra, and the FX5600 Ultra and 9600 Pro are then around 2-3x faster than that.

In FPS terms I'll use Q3 as a base: 1024x768x32, full graphics, 4x FSAA, 8x AF (which makes it look pretty nice),
on my Duron 800MHz machine, so hardly high-end:

9200 - 15fps
5200 - 25fps
5200 Ultra - 48fps
5600 Ultra - 109fps
9600 Pro - 91fps

That's what this machine gets anyway, so really, as I said, in the £50 -> £100 range the speed difference really just takes the piss honestly; so although it's a harder choice to pick between the 5600U and the 9600P, it is definitely worth the extra power.

And with a full AthlonXP/Pentium 4 processor that speed just increases quite admirably.

las6
22
Years of Service
User Offline
Joined: 2nd Sep 2002
Location: Finland
Posted: 16th Oct 2003 12:02
Anything but nVidia. 'Cos I don't care about the press or the technical stuff; it's the benchmarks and the image quality that counts. And there's not much you can do about those if you have a crappy card.

Thank God I don't have to stick around here.

Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 16th Oct 2003 12:12
Quote: "it's the benchmarks and the image quality that counts"


The nVidia benchmarks show there isn't much in it between the cards, with nVidia having better overall pixel processing and the Radeons better overall geometry processing.

As for image quality, to be perfectly frank that is
a) a matter of taste and opinion (apparently)
b) nVidia specialise in pixel operations, so the idea that the Radeon is capable of beating them here is a joke.

Although I'll agree the new 51.75 drivers appear to take away some quality compared to the 45.23s, the fact is that on the old drivers texture sharpening was on ALL the time. And even with the minor quality drop from turning it off, you can turn it back on (they didn't take the option away), and the quality is still far better than the Radeons' - FSAA or even just standard rendering.



As you can quite plainly see, one side is much better quality than the other ... but I will leave you guys to decide which one is which.

AlecM
22
Years of Service
User Offline
Joined: 26th Aug 2002
Location: Concord, MA
Posted: 18th Oct 2003 07:25
The one on the right looks nicer to me. Anyway, those are 2 different shots from the game, so it's hardly a fair comparison. Lighting and surroundings are different in both.

[P4 2.8C @ 3.03 with an 866mhz FSB:: MSI Neo-2LS running PAT:: 1gb Mushkin PC-3500 DDR High Perf level 2@ 2,2,2 :: ATI Radeon9800ProAIW :: 120Gb SeagateBarracuda 7,200RPM SATA HD :: Antec Plus1080AMG]
Ian T
22
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 18th Oct 2003 07:35
I totally disagree that the GeForce line's image quality is sharper than the Radeon's. On top of that, the shots you're comparing are from two different parts of the game, with enough JPEG compression applied that I frankly can't really tell which one looks worse. How about this-- you turn all the settings up on an nVidia card of yours and supply an image of the 'way it's meant to be played' logo at the beginning, I'll supply a shot from my 9500, and we can compare those.

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
the_winch
21
Years of Service
User Offline
Joined: 1st Feb 2003
Location: Oxford, UK
Posted: 18th Oct 2003 07:54
Quote: "So I'm your average PC user, and I want a new card - I'm not spending a great deal of money because my PC is only worth about £200, what graphics card would you guys recommend for a £50 budget?."


Going for the most popular card at the time, to hopefully get the most compatibility with current games, is always something to look at.

I was in a similar situation and ended up with an MSI FX 5200.
The only problem I have had with it is the nVidia drivers; I had to install several versions until I found one that doesn't cause lockups in Media Player. The latest drivers are working OK, so perhaps they have fixed my little problem.
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 18th Oct 2003 17:41
Well, you guys are echoing my point exactly ... they are two different shots and you can't compare them properly; however, it is quite obvious which has the better FSAA.

But these shots are from Tom's Hardware, where after doing a close-up of a ridge (that wasn't in the Radeon shot) it was declared that the Radeon had far better visual quality, which was just enough to tip the scales in its favour - especially as the 3DMark '03 build 330 tests were dropped because they couldn't be "trusted".

The only time I've seen identical shots compared for visual quality was in the 45.23 vs 51.75 test. The whole net has something against the GeForceFX atm, and I don't care if everyone hates them - but there has been no call for them to lie to the public the way they have.

... and Mouse, you want to compare shots? Pick a game test and settings for 3DMark '03 and I'll take the image from my lowly FX5200.
nVidia won 3 frickin' awards for their graphics cards' quality from the game/cinematics/console industries - so even if you don't believe me, how can the peer bodies of each of those industries be wrong?

On the drivers... Winch, I've found that the 44.04s are the most stable drivers that seem to work with everything; 45.23 is just a waste of time imo - they're the slowest in the line, they can't run OpenGL properly, and they're buggy in DirectX 9.02 (not .01 or .00 for some reason). Also the Omega versions of the drivers are even more stable and faster, quite a fair bit faster ... I gained 100 points in 3DMark03 from using the Omega 44.04s.

Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 18th Oct 2003 17:44
Just a tip: although the lighting is different... the best place to look is the handle - notice the colour differences and edges.
The Radeon's colours cover wider areas and are a little more washed out, not being as visually sharp, and the edges have noticeable "jaggies".

As I said, it's obvious which one is from the Radeon and which is from the GeForce.

Ian T
22
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 18th Oct 2003 22:29
You're right, the whole 'net does disagree with that, and with very good reason. The GeForce line has very bad FSAA compared to the Radeons, in both speed and effectiveness.

I'll upload shots from a couple of common games, unaltered and at high quality, and you can capture the same from your FX5200. Quite frankly I can't consider this a definitive test, as I would put nothing past you to try and prove nVidia's superiority, but I will reconsider my words if your shots look better.

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Ian T
22
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 18th Oct 2003 23:05
Alright, Raven, here are the screenshots I'm taking.

They are all (except where mentioned) at 8xFSAA and 16xAS (quality).

Firstly, the UT2003 DEMO. Please make sure it's the demo as the demo has different texture detail than the full game (which I don't have). Resolution is 1024x768x32.

First, the 'way it's meant to be played' logo at its full brightness, before it gets dented-- I have captured this with the demo's visual settings maxed, and with them at lowest detail.

Next, the 'way it's meant to be played' logo just after it's blasted down, before Grunt looks around and sees you. I have also captured this at max and lowest detail levels.

Also, I have captured the logo just when Grunt sees you, before he shoots (slightly different frames, human error, but they're still comparable) with the demo's detail settings maxed, but at 8xFSAA, 16xAS, and then 0xFSAA and 0xAS.

Finally, please run the attached code in Dark Basic Professional patch 5.1b in fullscreen mode and the aforementioned visual settings. Capture the actual objects in an image.

All of these images are JPEGs with level 5 compression (out of 100). A bit large in file size, but I want minimal detail loss.

Tell me when you're done and we can compare the shots.
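(If anyone wanted a number to argue over rather than eyeballing compressed screenshots, one option - purely a suggestion, not part of Mouse's protocol above - is a crude per-pixel difference between two same-size captures of the same frame. A minimal sketch, assuming the shots are exported as plain 8-bit binary PPM files; the filenames are placeholders.)

// img_diff.cpp - crude per-pixel comparison of two same-size screenshots.
#include <cstdio>
#include <cstdlib>
#include <vector>

static std::vector<unsigned char> loadPPM(const char* path, int& w, int& h)
{
    FILE* f = std::fopen(path, "rb");
    if (!f) { std::fprintf(stderr, "cannot open %s\n", path); std::exit(1); }
    int maxval = 0;
    if (std::fscanf(f, "P6 %d %d %d", &w, &h, &maxval) != 3 || maxval != 255)
    { std::fprintf(stderr, "%s is not a simple 8-bit P6 PPM\n", path); std::exit(1); }
    std::fgetc(f); // consume the single whitespace byte after the header
    std::vector<unsigned char> pixels((size_t)w * (size_t)h * 3);
    if (std::fread(pixels.data(), 1, pixels.size(), f) != pixels.size())
    { std::fprintf(stderr, "%s is truncated\n", path); std::exit(1); }
    std::fclose(f);
    return pixels;
}

int main(int argc, char** argv)
{
    const char* fileA = (argc > 1) ? argv[1] : "shot_card_a.ppm"; // placeholder name
    const char* fileB = (argc > 2) ? argv[2] : "shot_card_b.ppm"; // placeholder name

    int wa, ha, wb, hb;
    std::vector<unsigned char> a = loadPPM(fileA, wa, ha);
    std::vector<unsigned char> b = loadPPM(fileB, wb, hb);
    if (wa != wb || ha != hb) { std::fprintf(stderr, "resolutions differ\n"); return 1; }

    // Mean absolute difference and worst single-channel difference across all pixels.
    double sum = 0.0;
    int peak = 0;
    for (size_t i = 0; i < a.size(); ++i)
    {
        int d = (a[i] > b[i]) ? a[i] - b[i] : b[i] - a[i];
        sum += d;
        if (d > peak) peak = d;
    }
    std::printf("mean abs difference: %.2f / 255\n", sum / a.size());
    std::printf("peak difference    : %d / 255\n", peak);
    return 0;
}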

Also keep in mind this is a lowly 9500 Pro-- I like it a lot, but the 9700 and 9800 have newer versions of Smartshader so their FSAA will look even better. The 9500 was meant to contest with the high-end GF4 cards, so even an FX 5200 will be a bit of a tough match for it-- obviously, even the lowest of the FX line carry full feature capability (which I find interesting compared to nVidia's previous penchant to slice off features but keep speed [MX line]).

...but I look forward to it anyways. I still think my Radeon shots will be better .

Cheers!

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 19th Oct 2003 04:34
Quote: "The GeForce line has very bad FSAA compared to the Radeons, in both speed and effectivness."


The screenshot above says differently,
and speed-wise, don't make me laugh: the GeForce isn't only much faster, it's capable of it at higher resolutions without much more of a performance hit than at the lower levels.

As for the test, I said 3DMark '03 ... I'm not downloading anything new, and you should already have it.

You know what, I'll make a sodding demo in DBP for the quality tests, then I'll have you and someone OTHER than me download it and test it on their FX - because no doubt you'll think I'm cheating or something when you're proven wrong.
That way neither person can cheat...

Preston C
21
Years of Service
User Offline
Joined: 16th May 2003
Location: Penn State University Park
Posted: 19th Oct 2003 04:58
I can do it, Nvidia GeForce FX 5200 128 MB


Ian T
22
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 19th Oct 2003 07:18
You're far too excitable, Raven.

I have excellent reasons not to use 3DMark. Last I checked, both nVidia and ATI cheated with it (although ATI less so, by a slight amount). DBP is something where there is no way cheating could be involved.

I've shown you some perfectly operational code already-- what, do you think I rigged it?

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 19th Oct 2003 19:41
Quote: "I have excellent reasons not to use 3dmark. Last I checked both nVidia and ATI cheated with it (although ATI less so by a slight amount). DBP is something that there is no way cheating could be involved."


Build 330 is devoid of ALL cheating; nVidia is now part of the beta tester scheme ... and build 330 sees the addition of some features which mean that if either company tries to cheat again, the program will simply bypass the drivers.

This is why the Radeons saw a shocking drop in speed with 330.
I have the UT2K3 full edition; I'm not downloading a 200MB demo just to prove a point - you can bloody get stuffed, because that'll take me more time than I care to be online atm.

Ian T
22
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 19th Oct 2003 22:17 Edited at: 19th Oct 2003 22:17
Far too excitable

Let's see-- nVidia involved in beta testing, new version released, ATI's cards drop in speed... wow, that's hard to figure out.

Conspiracy theories aside, I'm not downloading a 177-megabyte file just to prove a point when I've already got virtually all the major and minor review boards, technical groups, and discussion forums on the internet backing me-- so-- how did you put it-- 3DMark can bloody get stuffed.

--Mouse: Famous (Avatarless) Fighting Furball

A very nice %it, indeed.
Shadow Robert
22
Years of Service
User Offline
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 20th Oct 2003 03:00
Quote: "Conspiracy theories aside, I'm not downloading an 177 megabyte file just a prove a point when I've already got virtually all the major and minor review boards, technical groups, and discussion forums on the internet backing me-- so-- how did you put it-- 3dmark can bloody get stuffed"

Fine... then get another FX owner WITH the demo to prove it.
And all ATI has IS the internet backing it; games magazines/computer magazines/professional benchmark tools/3D package developers/gaming rig providers/gaming conventions/etc... most if not all back and provide nVidia solutions.

Just saying you have an internet full of teens who don't know their arse from their elbow going "yeah, I think Radeons are 733t35t" or some other crap doesn't mean ATI = best.

As for 3DMark, it's funny: when they investigated everyone's drivers because of nVidia and created the latest uncheatable 330 build, nVidia's marks, although they dropped a little, stayed within 100-200 points of the original scores.
ATI's, on the other hand, dropped by almost 1,000 points in ALL areas for the 9800 Pro cards, and their other cards equally dropped quite drastically in speed.

That's not a conspiracy, that's a company being found out for cheating and getting away with it ... and after their whole "oh, bad nVidia, you cheated!" crap.
And then they sit back and claim that the alterations were to "enhance DirectX specifically for their cards"; although nVidia made replacement shaders, they didn't actually change any of DirectX 8.1 or 9.0 to give their cards a performance boost.
The speed enhancements they gave were actually pretty superficial (quite literally), aimed at lightening the shader load.

ATI were caught tampering with shaders, tampering with the pipelines, tampering with the colour ratios, tampering with the texture sizes and tampering with the N-Patch depths.

Although their cheats were as blatant as nVidia's, they were far more serious ... giving users a false sense of what their cards were actually capable of.
And although these cheats still move into the DirectX realms by being called "optimisations", several companies, along with individuals such as myself, have recently been screaming at ATI to give us the register addresses of the changes so we can turn them off.

7th Day Project utilises an optimised Tri-Strip Local Portal GeoMipMap routine - the long and short of it being that it is capable of LOD tessellation and optimisation so that it is always running at peak efficiency for your graphics card.
The Radeon 9800 Pro SHOULD be able to push 288 million triangles per second ... that is what it is technically capable of according to 7DP's debug - however, what actually happens is that on the Radeon the Tri-Strip Local Portals are discarded in favour of their own routines.

What this means is that ~12 million polygons per scene that should be running at 60fps actually end up running at 14fps...
There are many other engines also using similar technology whose games just become 100% unplayable on ATI hardware, because these "optimisations" assume we are using the default pipelines and triangle stripping.
You combine this with the fact that 16-bit textures DO NOT give the same effect that a 32-bit texture does when using the alpha channel ... quite frankly the overall quality is poor, and their alterations are screwing up engines that, if they didn't touch them, would be performing JUST as well on both cards, if not slightly better on the Radeons because of the optimised shader routines for 2.0 32-bit shaders (even though technically they're running at 24-bit :: moan ::).
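(Purely to line the quoted figures up: the sketch below converts the per-frame polygon count and the two frame rates mentioned above into effective triangles per second and sets them against the quoted 288 million/sec peak. All inputs are the post's numbers, not measurements, and the engine's debug figures are taken on trust.)

// effective_rate.cpp - turn quoted per-frame polygon counts and frame rates
// into effective triangle throughput for comparison with the quoted peak.
#include <stdio.h>

int main()
{
    const double tris_per_frame = 12e6;   // quoted: ~12 million polygons per scene
    const double target_fps     = 60.0;   // quoted: the frame rate the engine aims for
    const double observed_fps   = 14.0;   // quoted: what it reportedly gets on the Radeon
    const double quoted_peak    = 288e6;  // quoted: Radeon 9800 Pro peak tris/sec

    double needed   = tris_per_frame * target_fps;   // throughput needed to hold 60fps
    double achieved = tris_per_frame * observed_fps; // throughput implied by 14fps

    printf("needed for %.0ffps : %.0f million tris/sec\n", target_fps, needed / 1e6);
    printf("implied by %.0ffps : %.0f million tris/sec\n", observed_fps, achieved / 1e6);
    printf("quoted peak        : %.0f million tris/sec (%.0f%% utilised at %.0ffps)\n",
           quoted_peak / 1e6, 100.0 * achieved / quoted_peak, observed_fps);
    return 0;
}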

You want the net to back you, then fine; the net is full of idiots - this board is proof enough of that - people who follow the crowd simply because.
But don't try to give me bull that developers and the industry prefer ATI ... because quite frankly the reason the ratio is 100:1 in the nVidia vs Radeon demos is because they are hated.
I've disliked their cards in the past, but after this past year with their crap against nVidia - quite frankly NOW I hate them.

Preston C
21
Years of Service
User Offline
Joined: 16th May 2003
Location: Penn State University Park
Posted: 20th Oct 2003 03:20
Quote: "you want the net to back you then fine, the net is full of idiots this board is proof enough of that - people who follow the crowd simply because.
but don't try to give me bull that developers and the industry prefer ATI "


HEY!!! I prefer nVidia, thank you very much!!!


Pincho Paxton
21
Years of Service
User Offline
Joined: 8th Dec 2002
Location:
Posted: 20th Oct 2003 04:08
Quote: "fine... then get another FX owner WITH the demo to prove it
and all ATI has IS the internet backing it, games magazines/computer magazines/professional benchmark tools/3d package developers/gaming rig providers/gaming conventions/etc... most if not all back and provide nvidia solutions.

just saying you have an internet full of teens who don't know thier arse from thier elbow going "yeah, i think radeons are 733t35t" or some other crap doesn't mean ATI = best

as for the 3DMark, its funny when they investigated everyones drivers because of nvidia and created a the latest uncheatable 330 build, nvidias marks although dropped a little stayed within 100-200 points of the original scores.
ATIs on the other hand dropped by almost 1,000 points in ALL areas for the 9800pro cards, and thier other cards equally dropped quite drastically in speed.

Thats not a conspiricy thats a company being found out for cheating and getting away with it ... and after thier whole "oh bad nvidia you cheated!" crap.
and then they sit back and claim that the alterations were to "enhance DirectX specifically for thier cards", although nvidia made replacementment shaders - they didn't actually change any of DirectX8.1 or 9.0 to give thier cards a faster performance boost.
The speed enhancements they gave were actually pretty superficial (quite literally), aimed at lightening the shader load.

ATI were caught, tampering with shaders, tampering with the pipelines, tampering with the colour ratios, tampering with the texture sizes and tampering with the N-Patch depths.

although thier cheats were as blatent as nvidia's they were far more serious ... giving users false senses of what thier cards were actually capable of.
and although these cheats still move into the DirectX realms being called "optimisations" several companies recently have been screaming at ATI, along with individuals suchas myself to give us the registered addresses of the changes to turn them off.

7th Day Project utilises an optimised Tri-Strip Local Portal GeoMipMap routine - long and short being that it is capable of LOD Tesselation and optimisation so that it is always running at peak effeciency for your graphics card.
Radeon 9800pro SHOULD be able to push 288million triangles per second ... that is what it is technically capable of according to 7DPs debug - however what actually happens is that on the Radeon the Tri-Strip Local Portals are disguarded for thier own routines.

What this means is that ~12million polygons per scene that should be running at 60fps actually ends up running at 14fps...
there are many other engines that are also using similar technology that our games just become 100% unplayable on ATI hardware because these "optimisations" are based on us using the default pipelines and triangle stripping.
you combine this with the fact that 16bit textures DO NOT give the same effect that a 32bit texture does when using the alpha channel for needs ... quite frankly the overall quality is poor and thier alterations are screwing up engines that if they didn't touch would be performning JUST as well on both cards, if not slightly better on the Radeons because of the optimised shader routines for 2.0 32bit shaders (even though technically they're running at 24bit :: moan :: )

you want the net to back you then fine, the net is full of idiots this board is proof enough of that - people who follow the crowd simply because.
but don't try to give me bull that developers and the industry prefer ATI ... because quite frankly the reason there is 100:1 in the nvidia vs radeon demos is because they are hated.
i've disliked their cards in the past but thier past year with thier crap against nvidia - quite frankly NOW i hate them."



Lol! I just wanted to have the TGC quote size record!

Pincho.
Rob K
Retired Moderator
22
Years of Service
User Offline
Joined: 10th Sep 2002
Location: Surrey, United Kingdom
Posted: 20th Oct 2003 06:19
Dear me, Raven, you are just flaming for the sake of it. Both brands of card will no doubt run HL2 just fine by the time it is released.
