
Dark GDK / Timing (oh no! not again!)

bjadams (AGK Backer) · Joined: 29th Mar 2008 · Posted: 15th Oct 2009 16:48
In my game, the introduction is done in real-time 3D instead of an AVI animation, and everything is timed to the soundtrack.

What is the best way to keep my 3D camera movements in sync with the music on every different CPU & GPU configuration?

How can my camera move from position A to position B taking ALWAYS the exact same seconds and milliseconds, so it stays perfectly synced?

I read many threads about timers, but I can't seem to find a foolproof way to do it like in pro games.

thanks a lot.
Paynterboi TicTacToe · Joined: 23rd Dec 2007 · Location: That one place · Posted: 15th Oct 2009 22:17
Try something like:
if ( Song_has_been_playing_for_this_long )
{
    PositionCamera( here );
    ManipulateCamera( as so );
    ActivateAnimations( on these objects );
}

I don't know if that makes sense, but try to plan it around the specific parts of the music, as opposed to using a timer separate from the song.

Hope This Helps

EYE R T3H +ick +ack +oe mester
How is it going · Joined: 18th Sep 2007 · Location: California · Posted: 16th Oct 2009 06:18
Is the music repetitive enough to just put it in a for loop?
bjadams (AGK Backer) · Joined: 29th Mar 2008 · Posted: 16th Oct 2009 09:56
Unfortunately for me, it has to be timer based, because there are different scenes and they have to be connected together seamlessly.

So again my question is...

Is there a 100% reliable way to move a simple cube object and a camera from point A to point B so that it always plays at the same speed on any CPU & GPU?

Or is this impossible under GDK?

Thanks
entomophobiac · Joined: 1st Nov 2002 · Location: United States · Posted: 16th Oct 2009 10:38
This is the methodology you're looking for: http://unity3d.com/support/documentation/ScriptReference/Time-deltaTime.html

To find deltaTime, you must calculate how long the last frame took to render. I guess a simple division of the timer difference between each frame and dbScreenFPS() would do the trick.

You then need to use this as a multiplier in all parts of your code where it's needed.
bjadams (AGK Backer) · Joined: 29th Mar 2008 · Posted: 18th Oct 2009 10:49
Thanks a lot for the info.

I tried different variants of this method in the past, but it always produced "jittering" movement.
entomophobiac · Joined: 1st Nov 2002 · Location: United States · Posted: 18th Oct 2009 12:06
I'm sorry, I was speaking nonsense before. Dividing by dbScreenFPS() is unreliable. You should of course subtract the old timer value from the new one.

Note that the order in which these commands are called is very important for this to actually work.

In the small test app I had, with some minor 3D operations, the delta time was 17 milliseconds, i.e. 0.017 seconds, which would be the multiplier used every frame, as g_Delta.

bjadams (AGK Backer) · Joined: 29th Mar 2008 · Posted: 18th Oct 2009 20:36
Thanks a lot once again.

Most probably my biggest problem is that I am missing the proper order for it all to function correctly.

I have tried so many examples and looked up so many threads, both from GDK and DBPro, and always got unsatisfactory results; it's getting very annoying.

Timing is a CRUCIAL part of a game project and I want to get it sorted out asap, as basically anything that moves has to be "tagged" by the timer multiplier...

