Is your hardware up to date . . . (Article)

Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 25th Jul 2004 00:31
Recently I've found myself going very in-depth into which hardware works well with what, and just how much it affects your graphics performance.
I had a simple question:

How much does your processor really matter for your graphics speed?


If we cast our minds back for a second to before 'Graphics Processing Units', when 3D Accelerators were special-purpose chips, you'll remember that although they helped speed up 3D, they still relied on your Central Processing Unit to do basically everything else.
With this in mind, take a game like Quake2 running in OpenGL mode on a Pentium 133MHz: between a Voodoo card and a Voodoo2 card you would obviously notice a difference in speed. Upgrade to a 200MHz model, however, and your speed would jump to a point where your Voodoo could easily outperform your friend's machine, even with his awesome new Voodoo2.

As graphics has evolved though, rendering has moved further and further away from being a joint effort between your entire system, and more emphasis has been placed on having your graphics card do basically everything.
The reason for this is to free your processor up for all the other things, like realistic physics.

When new cards are benchmarked, reviewers always put together the most awesome hardware you can think of, so you can see how well these new cards perform on the ultimate settings.

Tests are always done to put pure stress on the computer and really show off the power of the cards. Although this is all cool, how do we know that the power we're seeing is actually coming from the graphics card and not the processor?

I'm sure a lot of people remember quite vividly when MMX was released. Intel were making a big deal over its 57 new instructions and its 8 new processor registers.
To those who never programmed at the processor level this meant nothing though, and honestly I didn't have a clue what it meant either.
The way Intel put it across, it just made things look more colourful, finally giving developers 16bit Colour (65,536 colours) instead of 8bit Colour (256 colours) to use in their game palettes. At this point in time 2D cards were still really mainstream, and running any more than 8bit Colour on them would make them choke.
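
As a rough illustration of what those numbers mean (a little sketch of my own in C, nothing official from Intel), here is the difference between an 8bit palette index and a 16bit pixel, and why a 64bit MMX register holding four 16bit pixels at once was such a handy thing for graphics work:

#include <stdio.h>
#include <stdint.h>

/* Pack 8-bit-per-channel RGB into one 16bit RGB565 pixel (5 red, 6 green, 5 blue bits). */
static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

int main(void)
{
    /* 8bit colour: each pixel is just an index into a 256-entry palette. */
    printf("8bit palette colours : %d\n", 1 << 8);    /* 256   */
    /* 16bit colour: each pixel carries its own RGB value. */
    printf("16bit direct colours : %d\n", 1 << 16);   /* 65536 */

    /* A 64bit MMX register can hold four of these 16bit pixels at once,
       so one packed instruction touches four pixels per operation. */
    uint16_t px = pack_rgb565(255, 128, 0);
    uint64_t four_pixels = ((uint64_t)px << 48) | ((uint64_t)px << 32)
                         | ((uint64_t)px << 16) |  (uint64_t)px;
    printf("one RGB565 pixel     : 0x%04X\n", (unsigned)px);
    printf("four packed pixels   : 0x%016llX\n", (unsigned long long)four_pixels);
    return 0;
}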

I actually remember rushing out to purchase a brand new Intel Pentium 200MHz MMX to replace my old Pentium 200MHz. In the box was a game called P.O.D., a futuristic racing game that ran under Windows 95 using the new DirectX 3.0 technology.
Before installing the new processor I tried the game; even with my Orchid Voodoo it was a struggle for the system to keep above 30fps in non-MMX mode. It was still fun to play, and impressive. After a few wasted hours I turned off my system and installed the MMX processor.
Thankfully it was just a swap job, because the processors were the same speed, and I hated having to take the motherboard out just to change some jumpers on it.
Turning on the machine, the difference was noticeable even just using Windows: everything opened faster than before and my 16bit backdrop didn't take two minutes of HDD access to show.
Running P.O.D. again the difference was amazing; not only was I now using beautiful 16bit colour, but the game itself was running at a more respectable 20fps.

The technicalities I didn't understand, but that sort of a speed boost I did!
I doubt most people even realised how much of a difference it made until they played Half-Life for the first time. Without 3D hardware and MMX the game was just unplayable on the minimum specifications.
I'd given my old hardware to my brother, who was using it for school work but also, obviously, games. However, he was still stuck with my old S3 ViRGE, which Half-Life didn't recognise as a 3D accelerator, so he could only run software mode.

The interesting thing about Half-Life, though, was what happened when you upped the resolution. On my Voodoo, going to 800x600 from the normal 640x480 made the game choke quite badly: the framerate would drop from a pleasant 30fps to an unplayable 8fps. I upgraded to a Creative Voodoo2 that winter and noticed that although my framerates went up, 800x600 was still an unplayable 18fps.
It wasn't until I upgraded to the recently released Pentium II 233MHz that I saw Half-Life running properly at 800x600.
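
Just to put some back-of-the-envelope numbers on that resolution jump (my own arithmetic, not anything I measured at the time): 800x600 is over 50% more pixels than 640x480, and every one of them has to be filled every frame.

#include <stdio.h>

/* Rough fill arithmetic: how many pixels per second a card has to push
   at a given resolution and framerate (single pass, ignoring overdraw). */
static double pixels_per_second(int w, int h, double fps)
{
    return (double)w * h * fps;
}

int main(void)
{
    double p640 = 640.0 * 480.0;   /* 307,200 pixels per frame */
    double p800 = 800.0 * 600.0;   /* 480,000 pixels per frame */

    printf("640x480 : %.0f pixels/frame\n", p640);
    printf("800x600 : %.0f pixels/frame (%.2fx the work)\n", p800, p800 / p640);

    printf("640x480 @ 30fps needs %.1f Mpixels/s\n", pixels_per_second(640, 480, 30) / 1e6);
    printf("800x600 @ 30fps needs %.1f Mpixels/s\n", pixels_per_second(800, 600, 30) / 1e6);
    return 0;
}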

So what is the point of all this useless history?
Well, years ago your processor speed and graphics chip really were a combination that worked together to give your games their speed.
What is interesting, though, is that it could be argued my processor was holding back the potential of my graphics card.

This is interesting because it's exactly what is believed about the current graphics card lines.

This Link shows the comparison of Graphics Cards and Processors.

So, is it true? Look at the benchmarks and it does appear that at low resolutions the processor is holding back the graphics card noticeably, and this appears to be even more the case with ATI's latest offering.
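
To picture what "the processor holding back the graphics card" would even look like in numbers, here is a toy model I knocked together (the millisecond figures are completely made up, it's only meant to show the shape of those benchmark curves): each frame costs a roughly fixed chunk of CPU time plus a chunk of GPU time that grows with resolution, and whichever is larger sets your framerate.

#include <stdio.h>

/* Toy model: frame time is set by whichever of the CPU or the GPU takes
   longer. CPU work per frame is treated as fixed; GPU work grows with the
   number of pixels. All the figures are invented purely for illustration. */
int main(void)
{
    const double cpu_ms = 10.0;            /* game logic, driver work, etc. */
    const double gpu_ms_per_mpixel = 8.0;  /* hypothetical fill cost        */

    const int res[][2] = { {640, 480}, {800, 600}, {1024, 768}, {1600, 1200} };

    for (int i = 0; i < 4; i++) {
        double mpix   = res[i][0] * res[i][1] / 1e6;
        double gpu_ms = mpix * gpu_ms_per_mpixel;
        double frame  = cpu_ms > gpu_ms ? cpu_ms : gpu_ms;
        printf("%4dx%-4d  gpu %5.1f ms  ->  %5.1f fps (%s-bound)\n",
               res[i][0], res[i][1], gpu_ms, 1000.0 / frame,
               cpu_ms > gpu_ms ? "CPU" : "GPU");
    }
    return 0;
}

In a setup like that, every low resolution reports roughly the same framerate whatever the card, which is exactly the pattern those charts seem to show.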

A simple question is: how? Well, technically speaking it can't, unless your graphics card is falling back on your processor.
In the days of old this was commonplace, but nowadays it shouldn't be going on. Another explanation is that all the data is going through your chipset, and you're losing speed that way because the chipset is running slower than your graphics card.
This was actually a major problem before the advent of AGP, because systems would be running on 66MHz Front Side Buses, which meant the graphics card could only receive data so fast. The faster the FSB, the faster your graphics card could receive and send data, and you instantly gained a lot of speed.
The Accelerated Graphics Port changed this, because data for the graphics card basically sidestepped the rest of the system and the card could access memory directly.

With this being the case, the only thing that would slow down a card would be the AGP bus speed. The original AGP 1.0 spec offered 2X mode, meaning two transfers per clock on the 66MHz bus, effectively 133MHz, so your graphics card on AGP could be fed data far faster than it ever could over 33MHz PCI.
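
For reference, the rough peak-bandwidth arithmetic behind those bus speeds (the usual nominal spec figures, worked out below rather than anything I've measured):

#include <stdio.h>

/* Peak bus bandwidth = bus width in bytes x clock in MHz x transfers per clock.
   Nominal spec figures; real-world throughput is of course lower. */
static double peak_mb_per_s(int width_bytes, double clock_mhz, int transfers_per_clock)
{
    return width_bytes * clock_mhz * transfers_per_clock;
}

int main(void)
{
    printf("PCI    (32bit @ 33.33MHz, 1 transfer/clock) : %.0f MB/s\n", peak_mb_per_s(4, 33.33, 1));
    printf("AGP 1X (32bit @ 66.67MHz, 1 transfer/clock) : %.0f MB/s\n", peak_mb_per_s(4, 66.67, 1));
    printf("AGP 2X (32bit @ 66.67MHz, 2 transfers/clock): %.0f MB/s\n", peak_mb_per_s(4, 66.67, 2));
    return 0;
}

So on paper an AGP 2X card can be fed around four times as fast as the same card sat in a PCI slot.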
Obviously this is cool, but pretty much everything is on AGP now, so how could that possibly be causing the slowdown problem?
Simple, it can't!

The question facing me right now is: 'Why does CPU speed affect low resolutions, and ATI's new card in particular?'
Next time I will try my best to find out why and answer this very issue.

End of part 1...
