Geek Culture / So even NVidia makes mistakes ...

Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 23rd Aug 2003 01:58 Edited at: 23rd Aug 2003 02:00
While trying to get my GFX card to do something impressive and run faster than a snail with concrete feet, I happened upon some information. As I dug deeper and deeper I found even more information. Some of it was official, and some of it was from respected sites.

Anyway, as it turns out the Geforce FX 5800 and 5800 Ultra cards are a bag of sh*t. They employ technology that was buggy at the time of their manufacture, and their DDR-2 memory actually causes more problems than it solves. I went through a horde of benchmark test results and information, official and unofficial, including tables and charts, and basically the 5800 has been discontinued because it's crap. Originally NVidia's flagship FX card, and now apparently they admit it was a failure. Comparing it to the 5600 cards, which are over £100 cheaper (almost half the price), it runs at almost exactly the same speed across the board, only offering some subtle speed advantages in places, which aren't worth the extra ton!

Anyway, I'm just ranting because I'm pissed off that I bought that card and it's basically a waste of £100. Luckily I've been able to return it, and I'm getting it replaced with a 5600, which will be similar in performance and £120 cheaper. Apparently NVidia have got it right with the 5900 chipset, and it appears that that's streaking ahead in the performance stakes, but the 5800 ... if anyone can find one anywhere (they're not being produced anymore by the leading card manufacturers) DON'T BUY THE BASTARD! Get a 5600 instead.

Insiiiiiiiiiiiiiiiiiiiiiiiide!
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 23rd Aug 2003 02:10
what manufacturers seem to forget though is that from the 5200 to the 5600 there is only a performance increase.
the 5800 actually has a lot of NEW technology in it.

you've just downgraded all the shaders available to you, until you update to the 5900 that is ... and you keep saying 5800 and not 5800ti, and the non-ti versions are HALF the speed - not theoretically, literally - of the ti versions.

a 5200ti is the same price as a 5600 and it will outperform it too.
suggest before going off on a rant you actually read properly

Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 23rd Aug 2003 02:22
I know you've been banging on about shaders Raven but they're not really that impressive. They make surfaces look a bit better, which is nice, but not worth £100.

Also, you keep referring to all these FX cards as ti and non-ti versions, but nowhere I've been have they referred to cards as ti (except Geforce4 cards). All I've seen available are standard and Ultra versions, where the only difference is a bigger fan bolted on allowing higher mem and GPU clock speeds.

I must admit, I'm not a GFX card expert, but it's incredible how you seem to be an expert on everything. You were even an expert on sound until someone who actually knew what they were talking about (namely me) pointed out you were making it all up. Not trying to attack you here, cos I don't know exactly what you know, I just get the feeling you have a habit of knowing a bit and then "filling in the gaps" with a bit of guess work.

Insiiiiiiiiiiiiiiiiiiiiiiiide!
Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 23rd Aug 2003 02:29
Btw, here's a little test for you:
Go to google and type in "Geforce FX 5800 ti" and it returns no results. You just get results for a mixture of information about Geforce FX 5800 and Geforce 4 ti chipsets. There is no "TI" version of FX cards.

Now while I appreciate that's not a big deal, how much can you honestly know if you don't even get your terminology right?

I refuse to let you belittle my posts with comments like "suggest before going off on a rant you actually read properly" if you are actually talking about stuff you don't really know inside and out. Unfortunately, I don't know much about GFX cards like I do about sound, so I can't prove you right or wrong this time.

Insiiiiiiiiiiiiiiiiiiiiiiiide!
lcfcfan
Joined: 19th Feb 2003
Location: North East, UK
Posted: 23rd Aug 2003 02:52
there is the standard and the ultra, that is correct.

Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 23rd Aug 2003 02:52
think about what my primary job is Fallout ... and then think about who i have been working for these past 6months.
i know the FX series inside and out.

i call them Ti (Titanium) because this is what they were called in development, and it's only with the FX line that they actually took up the totally stupid Ultra name again for retail.
The GeForce2 was confusing because there were some Ti and some Ultra - the GeForce3 set the Ti name for the most advanced versions, and the GeForce4 came in 2 flavours, MX or Ti.
so i'm gonna keep referring to them like that because it's simpler and it doesn't sound so naff.

That aside... there is FAR more difference between the 5600 & 5800 than just pure shaders.

the 5800 will not get ANY performance hit from using FSAA up to 4x as it has HTC, which is different to IntelliSample as it is hardware not software based.
It can also handle a maximum of 16 Hardware Shadows as opposed to 6 in the 5200/5600, and it can also handle more Shaders.

and before you say not everything is about Shaders, there are 32 titles due next year which use them - and more are no doubt on the way as people are finally learning them. The more and better Shaders your card can handle, the better.

Not to mention the 5800 has access to the extended nv30/v3.0 shaders which cover Light & Shadow as well as having the 2.5 extensions.

and if you somehow feel that Shaders are just PURE graphical updates, you should understand that most games now use Vertex Shader 1.0 for their animations, which most of the GeForce line can do either in hardware or software - however the FX line is capable of enhancing the performance here even further, especially with the enhanced floating point values.

The 5800 has a 128bit Integer and 128bit Floating Point, which is different to the 5200/5600's 32bit Integer and 64bit Floating Point, and can add a great deal of speed dealing with shaders if they're programmed using Cg. i know a lot of cretins don't believe there is a difference between using DirectX HLSL and Cg, but there is, which will come in handy in titles like Half-Life2 ... and is the difference of almost 30fps.

Not to mention the particle routines to handle up to 63 million particle instances, which can be used by programmers for anything.
If you want to see what your card can do, that Time Machine demo is a damn good example - even more so when you realise you can put FSAA to 4x like the demo intends you to at a resolution of 1280x960x32 & it still runs as smoothly as if it wasn't using it.

you can't run that demo on a Radeon or 5600, you wanna know why? because it just doesn't have the Shader support ... to you they might not seem important.
But ask any programmer which they'd prefer to program for, an Intel Pentium4 with its 16 preset registers or the GC's StrongArm Gecko with its 16 free registers - what do you think they'll choose and why?

Shaders are not just graphical extras, they are registers for the Graphical Processing Units - sure on the surface it might seem like the ASM shaders only do standard tasks, but the GPUs are now TRUE processors. They are fantastic RISC processors which give PC developers the ability to finally have a Console-style standard to program towards... and they ARE doing so.

so the 5800ti can't physically push any more polygons than the Radeon 9800pro ... the Pentium200 and the Pentium200mmx had no difference between them other than 57 operation registers for 16bit multimedia operations, and you know what, you use that technology every day now - every single current processor for the past 8 years has used it ... and what performance does it yield?
none whatsoever for standard programs, but you activate the MMX extensions when you compile a Windows application and you instantly get a 10% speed increase when performing colour and desktop operations.

people are looking at shaders as technically just graphics, but they're not, not by far ... they're giving you an EXTRA processor inside your home system, with additional power and registers to use however you see fit.

the 5800 & 5900 FX are THE MOST ADVANCED shader cards on the market, technical power is nothing. Think about it, how can your 5800 standard actually match head-to-head with a Radeon 9700pro when it is 1/4 of the speed?

Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 23rd Aug 2003 03:13
'think about what my primary job is Fallout ... and then think about who i have been working for these past 6months.
i know the FX series inside and out.'

That's why you make so many mistakes about them? You seem to worship nVidia... I looked into this after I read Fallout's post, and everything he has said has been accurate.

' technical power is nothing'

Yeah, right. What use is a half-off sale if you don't want any of the items anyway? Regardless of how 'advanced' you think they are, they are a big failure.


Get a solid ATI card if you want good gaming!

--Mouse

Famous (Avatarless) Fighting Furball
Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 23rd Aug 2003 03:36
Thanks for keeping it civil Raven when I was getting pissed off.

Look, I don't know how it works. All I know is I've seen 3D Mark results that are almost exactly the same for the 5800 and 5600. It bothers me cos I thought I was buying one of the most powerful cards on the market, and so far I haven't seen it do what it's supposed to do.
It ran Unreal 2 very well, and the water effects and stuff looked a lot better.
It ran Morrowind slowly. I'm hoping that was just the Morrowind engine.
I ran the 3D Mark 03 demos and the highest FPS I saw was 150, peaking in the plane demo, and it dropped as low as 30 FPS. The nature demo ran at 15 FPS for a lot of it. Now, why does one of the market-leading cards run so slowly?

I've tried to overclock the bastard, but it doesn't seem to want to overclock, or the clocking isn't making any difference.

Do me a favour Raven, cos I have until Tuesday before this card is picked up and my 5600 is despatched. Point me in the direction of some sort of diagnostic tool - a really decent tool that'll look at my card, let me tweak it and let me find out once and for all if it's working properly, and if I really am running it on the right settings, cos I really don't know if I am. Then I will shut my mouth and apologise if it roars into life (like it seems it needs to). Come on - there must be a tool out there somewhere that'll really let me diagnose and tweak the card to perfection. I've searched all over google for something - NVmax doesn't work with the new drivers.

Insiiiiiiiiiiiiiiiiiiiiiiiide!
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 23rd Aug 2003 03:50
pfft! ATi are about as stable as my bowels after eating 5 bowls of chilli

this has nothing to do with how "advanced" i feel the FX are...
my cards perform admirably in the next generation games, Doom3/Half-Life2/Final Fantasy XI/Tomb Raider: AoD/World of Warcraft
the problem isn't the hardware, it's that it's made with the FUTURE in mind ... not the past.

and the GeForceFX demos show this - if you haven't experienced them then quite frankly you haven't much of a clue what your card can accomplish.

and if the FX are such bad cards, explain why Maya/SoftimageXSI/3DStudioMax/Lightwave ALL actively support GeForce4 & GeForceFX enhancements and yet they don't actively support the Radeons' enhancements?
Why do the top gaming rigs from the top computer rig companies all supply GeForceFX as standard over Radeons even though they're more expensive?
Why have Valve themselves recommended a GeForceFX and NOT a Radeon for Half-Life2?

as a developer i fail to see this so-called speed advantage that the Radeons have ... i also fail to see why the Development Industry chooses the FX over the Radeons, but the Gaming Community picks the Radeons over the FX.

try as you might, FSAA on a Radeon gets a HUGE performance drop, on an FX it doesn't. Shadows on the Radeons again are a HUGE performance drop, but the FX have built-in Shadows which are better quality and don't suffer from any speed loss.
The FX might only push 20% fewer polygons per second than the Radeons, but they can push 35% more Shaders & 6 passes to the Radeons' 4.

no, for older games the FX & Radeon are pretty much head to head really, and for the new games there is no contest - and no matter what reports you want to pull up about it, nVidia didn't admit that the 5800 was a failure because of speed - it was a failure because it didn't live up to expectations.

you know what that means? that means that the card just wasn't accepted by the community and those whose opinions matter.
funny though that a development company would still recommend a brand of card which has been dubbed "a failure".

you want to know something ironic? the actual speed of the FX5900 is no different to the 5800's ... the chip just has more pipelines, some extensions and added shaders, and hardware support for more standard features.

it's again weird that the 5800 in DX9 performs so "poorly" but in OpenGL it performs like a bat out of hell. (just something weird if you ask me)

Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 23rd Aug 2003 03:56
try http://www.nvmax.com they have a lot of cool things on there... one thing in particular is a benchmarking utility.

but really think about it, Morrowind is an old engine - Unreal2 is relatively new with an optimised engine.
C&C Generals will also run very admirably.

in fact a lot of games will perform a lot better, with the exception of games with early incorporation of Shaders... i mean heck, when i swapped my GeForce4 4200ti for my FX5200 i was expecting a performance hit in Max Payne and Jedi Knight2 - but i can run them even faster now at more extreme resolutions.

Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 23rd Aug 2003 04:01
Yeah, but I still need to find the be-all and end-all of diagnostic/tweak programs so I can properly find out what's going on with my card .... any ideas?

Insiiiiiiiiiiiiiiiiiiiiiiiide!
Richard Davey
Retired Moderator
Joined: 30th Apr 2002
Location: On the Jupiter Probe
Posted: 23rd Aug 2003 04:27
Fallout - check out this month's issue of Custom PC magazine (my copy arrived today, Issue 1). They put 5 Radeon 9800 cards against 5 GeForce FX 5900 Ultras in real-world benchmarks (none of this 3D Mark crap, but proper tests that count for the likes of me and you). The 9800s win in every case by a significant factor, it's quite stunning to see the results. Mag costs £3.50 and is the coolest PC tech mag I've seen in a looong time. http://www.custompc.co.uk.

There's a superb write-up on DX9 too and some game reviews where they don't review the game, but review how it copes on legacy systems vs. recommended systems. Way cool.

I've got a GeForce FX5900 sat here in a box and a Radeon 9700 Pro installed and I'm dubious about switching them over now. But we'll see, can't hurt to test it out

Cheers,

Rich

"Gentlemen, we have short-circuited the Universe!"
Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 23rd Aug 2003 05:04
Cheers for your help guys. I'm pretty sure I know what the problem with my 5800 is now. It's not changing the clock speed - going from the 2D setting to 3D setting is supposed to change the clock speed. Ignoring that, changing the sliders in the menu isn't working either.

The way I know this is:
-The fan is supposed to spin up when the chip works harder, but it never runs in anything but quiet mode.
-The temperature should go up at higher clock speeds, but it's staying the same.
-There should be a visible performance increase in benchmarking programs and software, but there isn't.

So what the hell is going on!!!?? The settings are saved, but nothing is changing. Hardware problem? Driver problem? Incompatibility?

This really needs to be resolved. I'm convinced my 5800 isn't clocked correctly.

Insiiiiiiiiiiiiiiiiiiiiiiiide!
lcfcfan
Joined: 19th Feb 2003
Location: North East, UK
Posted: 23rd Aug 2003 05:43
Well I have seen loads of comparisons on the web between the 9800 and the 5900 and there is very little difference in speed. Go to my web site - there is a link with a comparison of the two cards, not done by me but by someone else, but from the results on there both cards beat each other in different tests. And Fallout, try Aquamark 2003 - not sure if it is out yet, but I hear it is a very good benchmark.

Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 23rd Aug 2003 05:53 Edited at: 23rd Aug 2003 05:54
arrghh! Some random guy on another forum just told me there are issues with clocking the 5800 when using an Intel 850 chipset, which is exactly what I have. He mentioned he sorted it out as well, but didn't explain how! Now he's disappeared, and no amount of searching on google is throwing up any information on this problem.

Why is it that search engines are never useful when you really need them?

Insiiiiiiiiiiiiiiiiiiiiiiiide!
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 23rd Aug 2003 06:06
I repeat... get a nice stable ATI

The 5900 may be good but if you bought a 5800 I'm assuming it's a bit out of your budget (Hehe, I wish I could buy one). Below the 5900 ATI still reigns supreme (what I've read on tech sites at least) and my card was incredibly simple to install and gave me no problems. Well... it was incredibly easy *once it was installed* but that's just how video cards are. I don't know a single person who has been disappointed with their ATI.

--Mouse

Famous (Avatarless) Fighting Furball
Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 23rd Aug 2003 06:09
Oh, and just another comment-- this isn't nVidia's first big mistake. Disregarding their cheating at 3DMark not too long ago, I have heard very bad things about the higher end GF4 Ti models-- supposedly due to a manufacturing error, they were very loud (even for an nVidia card!) and the fan wasn't as effective. I'll turn up details if you want...

--Mouse

Famous (Avatarless) Fighting Furball
Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 23rd Aug 2003 06:14
The figures go both ways though, as I've discovered over the last week. You can find websites presenting the figures any way you like, depending on which card they're biased towards. Even the unbiased ones are throwing up different arguments. Every person I've spoken to on the net is throwing up different arguments too.

ATI? Geforce FX? Who really knows? I know I don't. I've had far too many people, sites, reviews and reports giving me conflicting opinions.

Insiiiiiiiiiiiiiiiiiiiiiiiide!
the_winch
Joined: 1st Feb 2003
Location: Oxford, UK
Posted: 23rd Aug 2003 06:27
My method has always been to try and get the most popular card for better hardware and software compatibility, and to stay away from the bleeding edge and hope any problems are fixed before the card reaches me.

I would say it's almost impossible to tell which card is better. Ignore all the people that say one make is the best while the other is crap and perhaps you have a chance of forming an opinion loosely related to reality.
Eric T
Joined: 7th Apr 2003
Location: My location is where I am at this time.
Posted: 23rd Aug 2003 06:42
i have an NVIDIA Quadro FX 3000G and i have to say.. i prefer my Radeon 9500. Although the Quadro does work a lot better for 3D Modeling and CAD, for gaming.. it sucks..

While on the subject of video cards.. i found this video card called an S3 Virge/GX2 and i was wondering if anyone knew anything about it?

It has 1 monitor output.. an S-Vid output.. and an RCA output.
So i was wondering if anyone knew anything about it.

Working on 4 projects 2 RPG(programming texturing and 3d map), 1 3rd person shooter (Programming), and a special project.
Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 23rd Aug 2003 14:20
Nope. Don't know anything about that.

Btw, that guy got back to me with the solution to my problem - buy a new motherboard.

Insiiiiiiiiiiiiiiiiiiiiiiiide!
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 23rd Aug 2003 16:35
i used to have a ViRGE DX 4mb... they were S3's contribution to the 3D Accelerator race, they didn't stand a chance against PowerVR or Voodoo but they were nice budget cards

the Quadro series isn't designed for game loads - they're designed specifically to enhance rendering & application speed within the industry's biggest 3D packages.

Supported Applications are Softimage3D, SoftimageXSI, Maya, 3D Studio Max, Lightwave, trueSpace, Houdini (5.2+), SolarisPoint, AutoCad ... and there might be a few others too like Apami but i can't remember.

I've done a lot of real benchmark tests between the major FX & Radeon lines - and really the Radeons only beat the FX in pure polygon pushing abilities. Polygon-rich environments like Quake3/Unreal2/etc... they're capable of pushing them quicker - the only problem is that as soon as you add Anisotropic Filtering at anything over 2x, plus Texture Sharp Sampling and FSAA and Shaders, that's when the Radeons start to struggle and struggle badly!

Even Transform & Lighting applications can perform better with the GeForce chipsets.
fact of the matter is if you want polygon pushing abilities the GeForce4 line is the best, and if you want Advanced Effects & future gaming abilities the FX line is the best.

Not just in singular areas but overall - add to this the Radeons are power hungry ... not just a little but A LOT.
a GeForceFX 5900ti will quite happily run on a 230watt system; a Radeon will end up rebooting your system every so often because it tries taking too much power and the APMI will just restart the computer to prevent damage (unfortunately a lot of HDDs get damaged when this happens)

which means either getting an APU for your system or a 300-320watt power supply, particularly if you have Pentium4 or AthlonXP processors, as they're also power hungry - and USB devices.

-- -- --

you want real-world tests of performance:
Jedi Knight 2, full options, 1024x768 w/Volumetric Shadows ...
on my Athlon 1.0Ghz w/512mb w/GeForceFX 5600 it will run at an average of 120fps - swap in the Radeon 9700pro and i only get an average of 95fps.

I turn the options down, like Volumetric Shadows & Detailed Shaders as well as Anisotropic Filtering ... my FX gets 170fps whilst the Radeon will get 210fps.
it's really just a case of the real world details you'll have in your games.

It's framerates like that which matter to me.
Also Fallout, if there is an issue with the 810 chipset, make sure your board has the onboard graphics fully disconnected (no jumper or bios activation) ... turn on palette snooping & then see if there are any Intel fixes on the site, usually this sort of thing is a driver issue.

Had it before with my Savage4 Pro on my Intel 440LX board, it wouldn't recognise it was a Creative card ... so i had to update everything.
Though to be honest, if you want to do some serious gaming you'd get the Northbridge chipset

Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 24th Aug 2003 06:14
There is no built-in GFX chipset on my board. It's external graphics only, but it's not the motherboard anyway. That guy was probably wrong. I tested the card on my mate's PC and it's the same problem.

The thing is, the Detonator drivers don't allow me to change the clock speed for my card. There is no facility to do that. It's set to run in slow ass 2D mode, for some reason. After downloading the registry update "coolbits" I had access to the clocking menu, but it doesn't work. Today I read an article about Nvidia preventing access to the clocking functions made available by coolbits - they no longer work. So basically all my clocking hasn't been working cos the new Detonator drivers don't allow it.

So, to put it in a nice polite way, how the f*ck does Nvidia expect me to make full use of my card if they won't supply any facilities to change the clock speed from 300/600 to the 400/800 (or beyond) that it's designed to run at?!?! There are no clocking options in the drivers once installed, unless you download a registry hack making them available, and then they've been disabled. What's going on?
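For anyone who wants to poke at the tweak anyway, here is a minimal sketch of setting that coolbits value programmatically (Python, Windows only). The registry path, DWORD name and value below are assumptions based on how the Detonator-era tweak is usually described, not anything confirmed in this thread, so treat them as hypothetical, back up the registry, and run it as Administrator at your own risk.

# Hypothetical sketch of the coolbits tweak; the key path and value name are assumptions.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"  # assumed location of the tweak

def enable_coolbits(value: int = 3) -> None:
    """Create or overwrite the CoolBits DWORD that is said to unhide the clock sliders."""
    key = winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                             winreg.KEY_SET_VALUE)
    try:
        winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, value)
    finally:
        winreg.CloseKey(key)

if __name__ == "__main__":
    enable_coolbits()
    print("CoolBits written - reopen the driver control panel and look for the clock settings.")

Whether the newer Detonators still honour the value once it is written is exactly the problem described above.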

Insiiiiiiiiiiiiiiiiiiiiiiiide!
Shadow Robert
Joined: 22nd Sep 2002
Location: Hertfordshire, England
Posted: 24th Aug 2003 17:05
well i posted the updated reg to allow access to the tweak...
that aside, did you check out that site i posted?

RivaTuner is really an updated version of Nvmax but for all cards, not just nVidia.
the clocking setting in the drivers really was taken out because it was dangerous for the cards... or at least how people were using it was anyway.

You should only ever overclock if you know what you're doing, otherwise best to leave it... why don't you email nVidia with your system details to their technical support and ask them what is wrong?

Ian T
Joined: 12th Sep 2002
Location: Around
Posted: 24th Aug 2003 23:23
'You should only ever overclock if you know what you're doing'

Oh yeah. It's dangerous if you're new to it. On top of that, no matter how good you are, it cuts down the lifespan of your card considerably, and can make it noisier (particularly nVidia sonic grenades), so if you're hoping to keep your card around for 3 years, it is not the optimal path to take.

Generally speaking, don't try it unless you can replace your video card, and perhaps your whole computer

--Mouse

Famous (Avatarless) Fighting Furball
Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 25th Aug 2003 06:33
Tried Riva Tuner, but that doesn't work either. In the evolution of this nice enjoyable problem, I've come to the final stage, which is probably the fact there is a hardware fault. Simple fact is, the card is never switching from 2D to 3D mode when it's supposed to. Driver error? Unlikely, cos I have the newest Detonators. The card came in a beat up box, so maybe some of that loving violence was passed onto the sensitive hardware?

Either way, it's too late now. I'm gonna stick with the 5600 replacement, and spend my £100 account credit on 2 cheap whores and a pre-order of Half-Life.

Insiiiiiiiiiiiiiiiiiiiiiiiide!
lcfcfan
Joined: 19th Feb 2003
Location: North East, UK
Posted: 25th Aug 2003 06:54
Sounds like a plan!

randi
Joined: 27th Aug 2002
Location: United States
Posted: 25th Aug 2003 09:27 Edited at: 25th Aug 2003 09:31
"spend my £100 account credit on 2 cheap whores"

GROSS!!!!


Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 25th Aug 2003 15:23
Quote: ""spend my £100 account credit on 2 cheap whores"

GROSS!!!! "


Randi, I thought you'd be pleased! You were asking for me to share the money with you before!

Insiiiiiiiiiiiiiiiiiiiiiiiide!
OSX Using Happy Dude
Joined: 21st Aug 2003
Location: At home
Posted: 25th Aug 2003 15:38
Quote: "You were asking for me to share the money with you before"

Interesting thought...


randi
Joined: 27th Aug 2002
Location: United States
Posted: 25th Aug 2003 18:42
Quote: "Randi, I thought you'd be pleased! You were asking for me to share the money with you before!"



Now that's just wrong!!!!

I'll get you for that one.
Just wait and see.

Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 25th Aug 2003 23:59


Notice, normally I use the cheeky smiley, but this time I'm stooping to a

hehehehe

Insiiiiiiiiiiiiiiiiiiiiiiiide!
