It turns out my guess was correct. The Timer() command behaves differently on different Windows versions, but it shouldn't, so I can only conclude that the Kernel32 library ships with a bug, at least on Windows 8.1 Pro and Single Language.
If I remember correctly, the Timer() command simply calls the GetTickCount() function in the Kernel32 library. Someone more familiar with the actual code might want to correct me if I'm wrong. However, I "verified" this by calling Timer() and GetTickCount() side by side, and also "confirmed" that variables are initialized to zero by DBPro.
On Windows XP and Vista, Timer() returns a positive number equivalent to the time elapsed between the computer being turned on and the function call, starting from zero. This matches the documented behavior of the function on MSDN.
On Windows 8.1 Single Language, Timer() returns a number equivalent to the time elapsed between the computer being turned on and the function call, but starting from the most negative value of a 32-bit signed variable. This differs from the documented behavior.
On Windows 8.1 Pro, Timer() returns a very high positive value, outside the range of a 32-bit unsigned variable, that cannot possibly be equivalent to the time elapsed between the computer being turned on and the function call. This also differs from the documented behavior.
Here's the code I used:
global Ts1 as integer
global Ts2 as integer

` load the Windows Kernel32 library so GetTickCount() can be called directly
load dll "Kernel32", 1

do
    cls
    if dll exist(1)
        if dll call exist(1, "GetTickCount")
            ` read both timers back to back and display them side by side
            Ts1 = timer()
            text 10, 5, "Timer(): " + str$(Ts1)
            Ts2 = call dll(1, "GetTickCount")
            text 10, 25, "GetTickCount(): " + str$(Ts2)
            text 10, 45, "Difference: " + str$(abs(Ts2 - Ts1))
        endif
    endif
    wait 125
loop

delete dll 1
end
Both Windows 8.1 behaviors are unexpected, but things only get wacky when the Timer() command returns a negative number and you are, intentionally or not, relying on DBPro's default initialization of variables.
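To make the pitfall concrete, here's a minimal sketch of the pattern that breaks; the variable name is made up for the example:

` last_tick is never explicitly assigned, so DBPro zero-initializes it,
` which is harmless when Timer() starts near zero
global last_tick as integer

do
    ` on XP/Vista this fires roughly once a second as intended, but on a
    ` machine where Timer() starts at -2147483648 the difference stays
    ` hugely negative and the event won't fire for about 24.8 days
    if timer() - last_tick > 1000
        last_tick = timer()
        ` periodic work goes here
    endif
loop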
Windows API Kernel32 GetTickCount() function, MSDN documentation:
https://msdn.microsoft.com/en-us/library/windows/desktop/ms724408%28v=vs.85%29.aspx
Most likely the odd behavior in my other games relates to this problem, but in case it's something else I'll post an update about it.
@Cescano
The language favors, actually enforces, the use of global variables, as you cannot pass them to functions by reference or have functions return structured types. Anyway, propagating the timestamp as a parameter somehow makes more sense to me in this case. It was the ability to (unintentionally) use an undeclared variable on the fly that made me trip over this.
The math is very simple:
if current_time - recorded_time > latency
    ` time to do something here
endif
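For illustration, here's a rough sketch of what I mean by propagating the timestamp as a parameter; the function and names are made up for the example:

` the caller owns the timestamp and feeds it in explicitly, so there's
` no way to accidentally read a never-initialized global
function fire_when_due(current_time as integer, recorded_time as integer, latency as integer)
    if current_time - recorded_time > latency
        ` time to do something here
        exitfunction current_time
    endif
endfunction recorded_time

Used as recorded_time = fire_when_due(timer(), recorded_time, 500), the new timestamp flows back to the caller instead of living in a global.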
Thanks for the tips and help.
@Chris Tate
Yes, that's documented behavior for GetTickCount(): after approximately 49.7 days the counter overflows and rolls back to zero. But as long as the values are unsigned (DWORDs), you can safely compute the elapsed time by subtracting one from the other directly; the subtraction wraps correctly across the rollover, with at most an error of one unit, which is negligible for a counter that has a resolution of 10 to 16 milliseconds.
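A quick sketch of that, assuming DBPro's dword type wraps modulo 2^32 like the Windows DWORD it mirrors (names are mine, and I haven't tested this on every version):

global T1 as dword
global T2 as dword
global Elapsed as dword

load dll "Kernel32", 1
T1 = call dll(1, "GetTickCount")
wait 500
T2 = call dll(1, "GetTickCount")
` even if the counter rolled over between the two reads, the unsigned
` subtraction still yields the true elapsed milliseconds (mod 2^32)
Elapsed = T2 - T1
print "Elapsed: " + str$(Elapsed) + " ms"
delete dll 1
wait key
end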
I have that library sitting around somewhere; the author is IanM, if I remember correctly. I haven't used it because my games are too simple to need a high-resolution timer. Same goes for Lee's PerfTimer() command.
Thanks for the tips and help.