Quote: "lol, the GC didn't have final fantasy yet.... it had a secret of mana game with final fantasy slapped on the title for marketing purposes.... not saying it was a bad game, it just wasn't final fantasy"
http://ffcc.nintendo.com ... hmm, really?
SSC... Processor MHz means nothing.
I have an Athlon64 FX-51 in my machine. It runs at 2.2 GHz, and it can outperform a 3.2 GHz Pentium 4 HyperThreading Extreme Edition with relative ease.
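To put a number on that: what matters is instructions retired per second, i.e. IPC times clock. A quick sketch in C, where the IPC figures are made-up illustrative values, NOT benchmarks of the actual chips:

#include <stdio.h>

/* Rough throughput model: instructions/sec = IPC x clock.
 * The IPC numbers below are illustrative assumptions only. */
int main(void) {
    double athlon_ghz = 2.2, athlon_ipc = 1.5; /* assumed IPC */
    double p4_ghz     = 3.2, p4_ipc     = 0.9; /* assumed IPC */

    printf("Athlon64 FX-51: %.2f G-instr/sec\n", athlon_ghz * athlon_ipc);
    printf("Pentium 4 EE:   %.2f G-instr/sec\n", p4_ghz * p4_ipc);
    return 0;
}

A chip doing 1.5 instructions per cycle at 2.2 GHz beats a chip doing 0.9 per cycle at 3.2 GHz, which is the whole point.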
The X-Box's processor is 32-bit (x86-32), and its graphics processor is 128-bit (the GeForce FX was the first 256-bit card from NVIDIA).
133 MHz Front Side Bus (64-bit, so roughly 1.06 GB/sec peak).
The Gamecube's processor is also 32-bit ('Gekko', a PowerPC 750CXe derivative), and its graphics processor is the 256-bit ArtX/ATI 'Flipper' LSI.
The Playstation2 is the only true 128-bit processor of the lot (the 'Emotion' Engine, built around Toshiba's 128-bit TX79 core). Its memory is Rambus RDRAM rather than DDR, and its main bus runs at roughly 147 MHz.
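For anyone wondering what "128-bit" actually buys you: it's the width of the data the chip crunches per instruction, not how much memory it can address. A little sketch using x86 SSE intrinsics (my choice of example, nothing console-specific) shows a 128-bit register holding four 32-bit floats:

#include <stdio.h>
#include <xmmintrin.h> /* SSE: 128-bit registers */

int main(void) {
    /* One 128-bit register holds four 32-bit floats... */
    __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
    __m128 b = _mm_set_ps(8.0f, 7.0f, 6.0f, 5.0f);

    /* ...so a single instruction adds all four pairs at once. */
    __m128 sum = _mm_add_ps(a, b);

    float out[4];
    _mm_storeu_ps(out, sum);
    printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
    return 0;
}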
The X-Box has 64 MB of 200 MHz DDR RAM (400 MHz effective), 10 ns response time.
The Gamecube has 40 MB of RAM (24 MB of 162 MHz 1T-SRAM plus 16 MB of auxiliary DRAM), 10 ns response time. The memory controller is cast into the graphics LSI, so where the X-Box has to bridge everything over its ~1.1 GB/sec FSB, the GC's main memory feeds the LSI directly at ~1.3 GB/sec.
The Playstation2 has 32 MB of RDRAM as system memory and 4 MB of eDRAM for graphics (10 ns response time, again cast into the 147 MHz graphics LSI).
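Quick sanity check on those bus figures, since peak bandwidth is just clock times transfers per clock times bus width. I'm assuming 64-bit (8-byte) data paths on both, and the numbers line up with the ones above:

#include <stdio.h>

/* Peak bandwidth = clock x transfers-per-clock x bytes-per-transfer.
 * The 64-bit (8-byte) bus widths are my assumption for illustration. */
int main(void) {
    double xb_fsb = 133e6 * 1 * 8; /* 133 MHz, 64-bit: ~1.06 GB/s */
    double gc_mem = 162e6 * 1 * 8; /* 162 MHz, 64-bit: ~1.30 GB/s */

    printf("X-Box FSB:  %.2f GB/sec\n", xb_fsb / 1e9);
    printf("GC 1T-SRAM: %.2f GB/sec\n", gc_mem / 1e9);
    return 0;
}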
Processor MHz and RAM sizes mean nothing.
The PS2 and GC use their RAM as buffers; the XB uses its RAM as a cache.
These are totally different uses: the XB *must* keep everything it is currently using cached in RAM, so you effectively get 64 MB per scene.
The GameCube and PS2 treat their RAM as streaming buffers, meaning you can push roughly 40 MB per second through them.
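Here's a toy sketch of the difference. It's purely conceptual (the buffer size and the read_from_disc stub are made up): the cache model loads a whole scene's assets into RAM up front, while the buffer model streams small chunks through a window every frame:

#include <stdio.h>

#define SCENE_RAM  (64u * 1024 * 1024) /* XB-style: whole scene resident in RAM */
#define STREAM_BUF (2u * 1024 * 1024)  /* GC/PS2-style rolling window (size is my assumption) */

static unsigned char scene[SCENE_RAM];   /* cache model: everything loaded up front */
static unsigned char window[STREAM_BUF]; /* buffer model: refilled continuously */

/* Stub standing in for a disc read; a real console would DMA this. */
static unsigned long read_from_disc(unsigned char *dst, unsigned long len) {
    for (unsigned long i = 0; i < len; i++) dst[i] = 0; /* pretend data */
    return len;
}

int main(void) {
    /* Cache model: one big load per scene, then everything is reused from RAM. */
    read_from_disc(scene, SCENE_RAM);

    /* Buffer model: stream a fresh chunk through the small window every frame,
     * so the data touched per second is not capped by the RAM size. */
    for (int frame = 0; frame < 60; frame++)
        read_from_disc(window, STREAM_BUF);

    printf("cache model: %u MB resident per scene\n", SCENE_RAM >> 20);
    printf("buffer model: %u MB streamed per second\n",
           (unsigned)(60u * (STREAM_BUF >> 20)));
    return 0;
}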