You need to keep track of the timer and scale the speed of all the game components when the game speeds up or slows down. For example, if the game is meant to run at 40 fps, that's 25 ms per frame. If the game slows down and you're only managing, say, 20 fps, then checking the timer difference each loop gives you 50 ms per frame, so all your objects need to move twice as far per frame to keep up the correct speed. If you divide the time one frame took by 25 and then multiply your player/object speeds by the result (use a float), then at 40 fps the timer returns 25, 25/25 = 1, speed*1 = speed, and your game runs at the correct speed. On a slower PC that drops to 20 fps, the timer returns 50, 50/25 = 2 (speed*2), so your game elements double their per-frame speed and appear to move at the same rate as on the faster computer. Sync rate 40 stops the code going too fast, and checking the timer speeds things up when required. You use a float so that small variations, like a couple of frames of slowdown, can be adjusted for. Hope that explains things.
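A minimal sketch of the idea in Python (the function name and values are just for illustration, assuming a 40 fps target, i.e. 25 ms per frame):

```python
TARGET_FRAME_MS = 25.0  # 40 fps -> 25 ms per frame

def update_position(x, speed, frame_ms):
    # Scale movement by how long the frame actually took,
    # relative to the target frame time. Keep this a float!
    factor = frame_ms / TARGET_FRAME_MS  # 1.0 at 40 fps, 2.0 at 20 fps
    return x + speed * factor

# At exactly 40 fps the factor is 1, so the object moves at its base speed:
print(update_position(0.0, 5.0, 25.0))  # 5.0
# At 20 fps (50 ms frames) it moves twice as far per frame,
# so it covers the same distance per second as on the faster machine:
print(update_position(0.0, 5.0, 50.0))  # 10.0
```

In a real loop you'd measure `frame_ms` as the difference between the timer reading this frame and last frame.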
Mentor.
PC1: P4 3GHz, 1GB mem, 3x160GB HDs, Radeon 9800 Pro w/ cooler (3rd gfx card), 6-way speakers.
PC2: AMD 2GHz, 512MB RAM, FX5200 Ultra, 16-bit SB.
Mini ATX cases suck.