Has anyone successfully implemented timer-based simulations? I'm having trouble setting mine up.
The basics of the code I have so far:
DYN SET TIMING 0, 0, 1
` kick off the first simulation step before entering the loop
DYN SIMULATE timestep#/10.0

do
    update_Time()
    DYN FETCH RESULTS
    DYN UPDATE
    runScript_parseData()
    runScript_updateGame()
    ai update timestep#/10.0
    UpdateKeyHold() : UpdateMouseHold()
    DYN SIMULATE timestep#/10.0
loop
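In case it's relevant: my understanding is that the usual alternative to feeding a variable timestep into the physics is a fixed-timestep accumulator loop. This is only a sketch of the idea, not my actual code (accum#, stepSize#, frameMS# and the 60 Hz step are names and values I've made up here, and I'm assuming hitimer() returns milliseconds):

stepSize# = 1.0 / 60.0                     ` fixed physics step in seconds (assumed units)
accum# = 0.0
lastMS = hitimer()

do
    nowMS = hitimer()
    frameMS# = nowMS - lastMS              ` real time elapsed this frame, in ms
    lastMS = nowMS
    accum# = accum# + (frameMS# / 1000.0)

    ` advance physics in fixed steps until simulated time catches up with real time
    while accum# >= stepSize#
        DYN SIMULATE stepSize#
        DYN FETCH RESULTS                  ` blocks until the step finishes
        DYN UPDATE
        accum# = accum# - stepSize#
    endwhile

    sync
loop

As I understand it, with that structure the forces wouldn't need scaling by the frame time at all, since every DYN SIMULATE call covers the same slice of time.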
I have a timer-based function from kistech, as follows:
fps# = 30.0
sys.diff = hitimer() - sys.lastTime
sys.lastTime = hitimer()
sys.speed = 0.0

` shift the 30-sample history down and total the older samples
for tx = 1 to 29
    factors(tx) = factors(tx + 1)
    sys.speed = sys.speed + factors(tx)
next tx

` append this frame's delta and finish the rolling average
factors(30) = sys.diff / 100.0
sys.speed = sys.speed + factors(30)
sys.speed = sys.speed / fps#
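As far as I can tell, this keeps a rolling window of the last 30 frame deltas (each divided by 100) and divides the total by fps# (30.0), which is just the average of the 30 samples. So at a steady 30 FPS, sys.diff is about 33 ms and sys.speed settles around 0.33; at 15 FPS it settles around 0.67. It's meant to be multiplied into anything moved per frame, e.g. (illustrative object number and speed):

` example only: frame-rate-independent movement using the smoothed factor
baseSpeed# = 5.0
move object 1, baseSpeed# * sys.speed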
It has always worked for me until now, so I can't help but think I'm setting up the physics wrong somehow. When the FPS drops, the objects move more slowly (they are moved using add force). If I instead multiply the force by the timestep, movement becomes erratic, and when the FPS goes down the objects move faster.
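For concreteness, the force-times-timestep attempt looked roughly like this (a sketch from memory, not verbatim: objID and the force values are placeholders, and the exact DYN ADD FORCE signature should be checked against the Dark Dynamix docs):

` what I tried (roughly): scaling the applied force by the variable timestep
fx# = 0.0 : fy# = 0.0 : fz# = 50.0         ` placeholder force values
DYN ADD FORCE objID, fx# * timestep#, fy# * timestep#, fz# * timestep#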
Any help on this would be appreciated; I've been scratching my head over it for days.