
AppGameKit Classic Chat / Timer-based events

Zwarteziel
Joined: 22nd Jan 2011
Location: Netherlands
Posted: 27th Jul 2015 22:23 Edited at: 27th Jul 2015 22:24
Hi all,

here's something I really should know by now (and I thought I did), but have questions about anyway: in my game, I use a lot of timer-based events, such as enforcing a mandatory delay between key presses, screen fades, and the movement of objects. For this, I use code that works like this:
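The original snippet wasn't preserved in this thread, but based on the description (a base time captured from timer() and reset after each event), the pattern might look like the following. This is a sketch in Python rather than AGK BASIC, and all names (`make_delay`, `ready`) are mine, not AGK commands:

```python
import time

def make_delay(pause):
    """Naive delay check, per the pattern described above: fires once
    `pause` seconds have passed, then resets the base time to *now*."""
    base_time = time.monotonic()  # stands in for AGK's timer()

    def ready():
        nonlocal base_time
        if time.monotonic() - base_time > pause:
            base_time = time.monotonic()  # reset; any overshoot is discarded
            return True
        return False

    return ready
```

Because the base time is reset to the current time on every firing, any overshoot past `pause` is silently lost, which is exactly what the replies below pick apart.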



All this was working rather well, I thought, until I started experimenting with sync rates on different devices. I noticed that for some events, there is quite a discrepancy between an application running at, say, 700 fps and the same application limited to 60 fps using SetSyncRate and SetVSync(1). Here's a log from my game to illustrate this. In it, the same events are run twice: once with no sync and once with it turned on:



I would have thought that the use of timer() and a pre-defined 'pause' would always result in the same delay, no matter the frame-rate. What am I missing?
BatVink
Moderator
Joined: 4th Apr 2003
Location: Gods own County, UK
Posted: 27th Jul 2015 23:15
60 FPS is one frame every 0.016 seconds

At this frame rate the delay had ended 11 milliseconds before you caught it (at 0.005). You have to perform ~3.5 delays' worth of stuff every cycle.

700FPS is one frame every 0.00143 seconds

At this frame rate you catch the delay 0.0007 of a second after every 4th frame. You are cycling so fast that it only updates every 4 frames.

So in essence there is 10 milliseconds difference between each capture when you compare scenarios.

Why is this? Because at 60FPS you are trying to capture an event that is 3 times faster than your smallest timeslice.
In addition, you reset your baseTime# after each event, so the system has no knowledge that you missed the delay by such a large margin (you "throw away" the 11 milliseconds). In a scenario like this, you need to retain the baseTime# and factor in the number of cycles, so you can make a more accurate assessment of the time elapsed.
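To illustrate the suggestion (keep baseTime# fixed and count whole delay intervals instead of resetting it), here is a small deterministic sketch in Python; the function name and signature are mine, not AGK's:

```python
def elapsed_steps(now, base_time, pause):
    """Number of whole `pause` intervals elapsed since base_time.
    base_time stays fixed, so overshoot is never thrown away."""
    return int((now - base_time) // pause)

# At 60 FPS (one frame every ~0.0167 s) with a 0.005 s delay, a single
# frame accounts for 3 whole delays instead of just one.
steps = elapsed_steps(0.0167, 0.0, 0.005)
```

Each frame, the game would then apply `steps` updates' worth of work (or scale the update by it) rather than assuming exactly one delay elapsed.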

Quidquid latine dictum sit, altum sonatur
TutCity is being rebuilt
Markus
Valued Member
Joined: 10th Apr 2004
Location: Germany
Posted: 27th Jul 2015 23:23
0.005 is a very short time.
1 second / 60 frames = 0.0167 seconds, or roughly 16 ms minimum

AGK 108 (B)19 + AppGameKit (Steam) V2 Beta .. : Windows 8.1 Pro 64 Bit : AMD Radeon R7 265 : Mac mini OS X 10.10 (Yosemite)
Zwarteziel
Joined: 22nd Jan 2011
Location: Netherlands
Posted: 28th Jul 2015 00:09 Edited at: 28th Jul 2015 00:12
Thank you BatVink and Markus,

so if I understand correctly, the timer is inaccurate at 60 fps because the delay I set is simply too short, and I'm calculating the new target time at the wrong moment, after the 'endif' in my example.

I re-wrote the function so it works like this:
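The rewritten snippet isn't preserved in this thread either; one way it could look, following BatVink's advice, is to advance the base time by the interval instead of resetting it to the current time. Again a Python sketch with hypothetical names, not the poster's actual AGK code:

```python
import time

def make_repeating_delay(pause):
    """Fires once per elapsed `pause` interval; the base time advances
    by exactly `pause` on each firing, so the remainder carries over."""
    base_time = time.monotonic()

    def ready():
        nonlocal base_time
        if time.monotonic() - base_time > pause:
            base_time += pause  # keep the overshoot instead of discarding it
            return True
        return False

    return ready
```

Called in a fast loop, this fires back-to-back until it has caught up, so a 700 fps loop and a 60 fps loop end up performing the same number of delays per second.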



Sure enough, the routine now works at both 700fps and 60fps. The log displays the following delta:



I was wondering, though, how I would be able to pause for smaller, but still consistent, amounts of time at different frame rates. For instance, if I try an interval of 0.008, the 700 fps version of the game fades twice as fast as the previous example (as it should), but the 60 fps version 'misses' the check because of the 16 ms minimum. Is there a way I can achieve this or work around it?

BatVink
Moderator
Joined: 4th Apr 2003
Location: Gods own County, UK
Posted: 28th Jul 2015 00:17
You can do it. BaseTime# has to stay at the same value all the way through the fade.

Then calculate the % fade from the beginning, instead of the % fade since the last delay.
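A sketch of "% fade from the beginning" might look like this (Python for illustration; `fade_percent` is a hypothetical helper, not an AGK command):

```python
def fade_percent(now, start_time, duration):
    """Fade progress measured from the start of the whole fade rather
    than from the last delay tick, so frame rate no longer matters."""
    t = (now - start_time) / duration
    return min(max(t, 0.0), 1.0) * 100.0

# Halfway through a 1-second fade the object is 50% faded,
# no matter how many frames have rendered so far.
progress = fade_percent(0.5, 0.0, 1.0)
```

Because the progress depends only on wall-clock time since the fade started, a 60 fps run just samples the same curve less often than a 700 fps run.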

Or use a Tween. In the past I wrote all sorts of fades, moves, resizes etc. into a big library. I really didn't want to swap them for Tweens - all that hard work!
But I am now a convert. The code is so much neater and works perfectly.

Markus
Valued Member
Joined: 10th Apr 2004
Location: Germany
Posted: 28th Jul 2015 02:22 Edited at: 28th Jul 2015 02:26
You can also set the delay based on the frames per second, using ScreenFPS().
For example, the result is that you do something 700 times at 700 fps
and 60 times at 60 fps,
with correspondingly smaller and larger steps.

A step is 1.0 * delta (difference) time.
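The delta-time idea can be sketched like this (Python for illustration; the function name is mine):

```python
def step_per_frame(speed_per_second, frame_time):
    """Scale a per-second speed by the frame's delta time, so the same
    distance is covered per second at any frame rate."""
    return speed_per_second * frame_time

# 1/60 s frames take 60 big steps per second; 1/700 s frames take
# 700 small ones. Either way, 100 units of movement per second result.
step = step_per_frame(100.0, 1.0 / 60.0)
```

More steps per second means each step is proportionally smaller, so the total movement per second is constant across frame rates.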

Zwarteziel
Joined: 22nd Jan 2011
Location: Netherlands
Posted: 28th Jul 2015 07:29
Thanks for the input again, both of you! I am using Markus's method for the 3D movement of objects and will try this and BatVink's suggestions later today. I only recently used the tweening commands for the first time, so I will need to wrap my head around how to use them for non-animation stuff.
