Hi! Here's my problem:
In my code, I use forced timing. This means the sync rate is set to zero. I store the duration of each loop, and the variation of this duration gives me a factor that allows me to "weight" each movement of the existing objects. This way, a fast computer will have a higher framerate than a slow one, but my objects won't move any faster; they'll just move more smoothly.
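To make sure we're talking about the same thing, here is a minimal sketch of what I mean by forced timing (my own illustrative example, not my actual code; the object number, base speed and 60 FPS reference are just placeholders):

sync on : sync rate 0

baseSpeed# = 2.0                ` distance the object should cover per "reference" frame
refFrameTime# = 1000.0 / 60.0   ` reference frame duration (60 FPS), in milliseconds

lastTime = timer()
do
   rem duration of the previous loop, in milliseconds
   currentTime = timer()
   frameTime# = currentTime - lastTime
   lastTime = currentTime

   rem weighting factor: 1.0 at the reference rate, above 1.0 on slow machines, below 1.0 on fast ones
   factor# = frameTime# / refFrameTime#

   rem every movement is multiplied by the factor, so displacement depends on time, not framerate
   move object 1, baseSpeed# * factor#

   sync
loop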
Then I noticed a problem with the "play object" command: it seems to be framerate-related. An animation sequence plays faster when the framerate is high. And, of course, this is bad with forced timing: animation speed should be time-related (its duration, in seconds or milliseconds, should never change).
Maybe I could use the "set object speed" command to modulate the animation speed, updating it every loop to accommodate the framerate variations?
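Here is the kind of thing I have in mind, sketched rather than tested: instead of letting "play object" advance one animation frame per rendered frame, drive the frame index directly from elapsed time with "set object frame". I'm assuming "set object frame" accepts a (possibly fractional) frame index and that "total object frames()" returns the animation length; object number 1 and the 30 frames-per-second playback rate are just example values.

totalFrames# = total object frames(1)
framesPerSecond# = 30.0      ` desired playback rate, in animation frames per second
animStart = timer()

do
   rem elapsed time since the animation started, in seconds
   elapsed# = (timer() - animStart) / 1000.0

   rem current frame = elapsed time * playback rate, wrapped around for looping
   frame# = elapsed# * framesPerSecond#
   frame# = frame# - int(frame# / totalFrames#) * totalFrames#

   set object frame 1, frame#

   sync
loop

The alternative would be to keep "play object" and rescale "set object speed" by the same per-loop factor I use for movement, but I'm not sure what units that command's speed parameter actually uses, so driving the frame index from the timer seems safer to me.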
Please, if anyone has already had this problem, could you tell me how you solved it?
Thanks a lot!
Ideas: memories of things which did not occur yet...