
Dark GDK / problem with dbSyncRate()

Author
Message
study
15
Years of Service
User Offline
Joined: 12th Dec 2008
Location:
Posted: 12th Dec 2008 17:29
Hi friends
I'm new to DarkGDK. I'm using the 3D object sample, and when I write dbSyncRate(30) it works, and dbSyncRate(60) works too, but when I ask for anything more than 60 it behaves like 60. I.e. when I write dbSyncRate(0) it runs at 60, and when I write dbSyncRate(100) it runs at 60 again; my maximum FPS is 60 no matter what. Do you know what my problem is?
regards
RanQor
17
Years of Service
User Offline
Joined: 8th Jun 2007
Location:
Posted: 12th Dec 2008 18:24
I just came across this bug as well. I've been told they know about the issue, but we still don't know when they are going to fix it.
jason p sage
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: Ellington, CT USA
Posted: 13th Dec 2008 00:09
true... true... but that doesn't have to "throttle" your game, meaning you can keep your own timer and do other things until the right amount of time has elapsed, then render.

Basically, 60 frames per second is 166 milliseconds per frame, right? So if you set a timer when you render a frame, and then your loop comes around and 166 milliseconds haven't elapsed yet, you could go call an AI function, or a series of functions to update things behind the scenes, including moving objects etc.

This should in theory allow your program to utilize the CPU horsepower you have, without the 60fps "speed" throttling your code.

Additionally, I haven't delved into multi-threading too much in DarkGDK or DarkGDK.Net, but multi-threading might allow you to accomplish even more game processing than a single thread and the timer technique I mention alone.
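That polling-timer loop can be sketched in plain C++ (an illustrative stand-in only: std::chrono replaces DarkGDK's dbTimer(), a counter replaces the dbSync() render call, and the names runLoop/FrameStats are made up for this sketch):

```cpp
#include <chrono>

// Count how many renders and background updates happen while waiting
// for the frame interval to elapse between renders.
struct FrameStats { int renders; int updates; };

FrameStats runLoop(int targetRenders, std::chrono::milliseconds frameInterval) {
    using clock = std::chrono::steady_clock;
    auto lastRender = clock::now();
    FrameStats s{0, 0};
    while (s.renders < targetRenders) {
        ++s.updates;  // background work runs every pass: AI, movement, etc.
        if (clock::now() - lastRender >= frameInterval) {
            ++s.renders;  // in DarkGDK this is where dbSync() would be called
            lastRender = clock::now();
        }
    }
    return s;
}
```

Because the update counter keeps climbing between renders, the spare CPU time does useful work instead of idling inside the capped sync call.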

--Jason

sydbod
16
Years of Service
User Offline
Joined: 14th Jun 2008
Location: Just look at the picture
Posted: 13th Dec 2008 00:14
Quote: "I just came across this bug as well,"


It is not a bug. It has to do with how you actually set up your computer system (video driver actually)

In your video driver there should be a setting called something like "Wait For Vertical Sync".
If this is turned ON, then the video driver will not let the game run any faster than the refresh rate you have set for your monitor.

You will have to turn "Wait For Vertical Sync" to "OFF".

The only problem with doing that is that you will suffer from "screen image tearing", where the image on the screen gets updated partway through, before a full frame has been rendered to the display.

This mode should only be used under code development conditions, when a person is optimising their code for speed.
RanQor
17
Years of Service
User Offline
Joined: 8th Jun 2007
Location:
Posted: 13th Dec 2008 00:34
It may be a graphics card option, but when you have it set to Application Controlled (which is the default), the application can (and should) change settings like VSync for you.
sydbod
16
Years of Service
User Offline
Joined: 14th Jun 2008
Location: Just look at the picture
Posted: 13th Dec 2008 01:01
Quote: "the application can (and should) change settings like VSync for you"


You are correct, the application should do the VSync change for you..... and it does; people just do not use the correct command.

dbSyncRate() is just that: it adjusts the sync rate (within the limits of what the computer will allow), no more, no less.

The VSync control gets set at the instant that the game window is created, when the game starts.

Most people use "dbSetDisplayMode()", but that one does not access the VSync control.

In the new beta for DarkGDK you will find the replacement command:
"dbSetDisplayModeAntialias ( int iWidth, int iHeight, int iDepth, int iVSyncOn, int iMultisamplingFactor, int iMultimonitorMode )"

This is the command people should be using.
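Based on the signature quoted above, a call like the following would open a 1024x768, 32-bit display with VSync off and no antialiasing (the resolution values are placeholders, and this fragment only builds against a DarkGDK installation):

```cpp
// width, height, depth, VSync off (0), no multisampling (0), single monitor (0)
dbSetDisplayModeAntialias ( 1024, 768, 32, 0, 0, 0 );
```

Passing 1 for iVSyncOn instead would deliberately cap the frame rate at the monitor refresh rate.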
study
15
Years of Service
User Offline
Joined: 12th Dec 2008
Location:
Posted: 13th Dec 2008 09:16
Thanks a lot Friends
Quote: "
It is not a bug. It has to do with how you actually set up your computer system (video driver actually)
"

If it's because of my video card settings, then why can I play other games on my computer at a higher FPS? E.g. I play Live for Speed at 92 fps without changing my video card settings. Maybe the game changes the video settings when I run it?
Thanks again
prasoc
16
Years of Service
User Offline
Joined: 8th Oct 2008
Location:
Posted: 13th Dec 2008 16:44
beware though, at the moment " dbSetDisplayModeAntialias " only does vsync, not anti aliasing
sydbod
16
Years of Service
User Offline
Joined: 14th Jun 2008
Location: Just look at the picture
Posted: 14th Dec 2008 02:06
Quote: "If it's because of my video card settings, then why can I play other games on my computer at a higher FPS? E.g. I play Live for Speed at 92 fps............."


Please read my post directly above yours.
You should be using " dbSetDisplayModeAntialias ( int iWidth, int iHeight, int iDepth, int iVSyncOn, int iMultisamplingFactor, int iMultimonitorMode)" if you want to control the VSyncOn control of your video card driver.


Quote: "beware though, at the moment " dbSetDisplayModeAntialias " only does vsync, not anti aliasing"


The AA mode does work in that function when one only uses 3D objects in the game. Somehow, 2D images seem to throw it out of AA mode. You are correct: that function looks to be buggy for AA at the moment.
Lilith
16
Years of Service
User Offline
Joined: 12th Feb 2008
Location: Dallas, TX
Posted: 14th Dec 2008 05:08
I'm not seeing this function as available to me. Is this in a certain version?

Lilith, Night Butterfly
I'm not a programmer but I play one in the office
sydbod
16
Years of Service
User Offline
Joined: 14th Jun 2008
Location: Just look at the picture
Posted: 14th Dec 2008 09:43
Look into the "Dark GDK Upgrade - November 2008 " thread for the updated DarkGDK file set that has that new function.

I suspect that once the DBPro 7.1 file set is complete and released, hopefully in a few days' time, the Dark GDK and .NET versions should be upgraded very shortly after that.
jason p sage
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: Ellington, CT USA
Posted: 14th Dec 2008 22:16
LOL - ugh.. had my decimal off .. thanx for clarifying

Lilith
16
Years of Service
User Offline
Joined: 12th Feb 2008
Location: Dallas, TX
Posted: 16th Dec 2008 05:40
Quote: "Look into the "Dark GDK Upgrade - November 2008 " thread for the updated DarkGDK file set that has that new function."


I installed it, but the function still doesn't appear to be available. The DarkSDKDisplay.h header is dated properly, but the function isn't amongst the ones listed.

If I do find it I would assume that you use 1 or 0 to turn the feature on, ditto for the multi-monitor mode. But what does one use for the multi-sampling factor?

Lilith, Night Butterfly
I'm not a programmer but I play one in the office
sydbod
16
Years of Service
User Offline
Joined: 14th Jun 2008
Location: Just look at the picture
Posted: 16th Dec 2008 09:10
Did you install the latest version from near the bottom of the first page? Lee provided a link.
I don't think the first version of the same update had the new function.

These new functions are not listed as such; from what I know, they are just integrated and work.

The multi-sampling value is the AA setting. A value of 4 sets the card to use 4x AA .... etc
Lilith
16
Years of Service
User Offline
Joined: 12th Feb 2008
Location: Dallas, TX
Posted: 17th Dec 2008 03:35
Grabbed that, installed it and I'm still not finding the function.

Lilith, Night Butterfly
I'm not a programmer but I play one in the office
sydbod
16
Years of Service
User Offline
Joined: 14th Jun 2008
Location: Just look at the picture
Posted: 17th Dec 2008 03:49 Edited at: 17th Dec 2008 03:50
At the very bottom of DarkSDKDisplay.h you should see the following entries:

Quote: "
void dbMinimizeWindow ( void );
void dbMaximizeWindow ( void );
int dbDesktopWidth ( void );
int dbDesktopHeight ( void );
bool dbSetDisplayModeVSync ( int iWidth, int iHeight, int iDepth, int iVSyncOn );
bool dbSetDisplayModeAntialias ( int iWidth, int iHeight, int iDepth, int iVSyncOn, int iMultisamplingFactor, int iMultimonitorMode );
"


Are you sure you updated the correct DarkGDK installation?
Lilith
16
Years of Service
User Offline
Joined: 12th Feb 2008
Location: Dallas, TX
Posted: 17th Dec 2008 04:12
Ack!! There it is. I did a copy/paste and it works. I think I may have capitalized the 'a' in alias.

Now, what's the iMultisamplingFactor used for and how?

Thanks,

Lilith, Night Butterfly
I'm not a programmer but I play one in the office
sydbod
16
Years of Service
User Offline
Joined: 14th Jun 2008
Location: Just look at the picture
Posted: 17th Dec 2008 04:44 Edited at: 17th Dec 2008 04:47
Quote: "what's the iMultisamplingFactor used for and how?"


That is the Antialias setting that the video card should use.

When you have a 3D image that is tilted slightly from the horizontal or vertical, the tilted edge will have a stepped appearance as the edge goes from one pixel row/column to the next.

The antialiasing value determines how many transitional pixels of partial color will be used to blend out that stepped appearance.

To show the difference, look at the outline of the aircraft in each picture.

With no AA selected (a value of 0), look at how jagged the aircraft outline is.

http://i16.photobucket.com/albums/b49/sydbod/1-1.jpg

Now a similar picture with a value of 4 .

http://i16.photobucket.com/albums/b49/sydbod/2-1.jpg
Much smoother outline.
Lilith
16
Years of Service
User Offline
Joined: 12th Feb 2008
Location: Dallas, TX
Posted: 17th Dec 2008 05:56
Kewl. Unfortunately I'm a 2D kinda girl. But turning off the vsync on my home PC upped the frame rate by a factor of almost 4.

Lilith, Night Butterfly
I'm not a programmer but I play one in the office
AlexI
19
Years of Service
User Offline
Joined: 31st Dec 2004
Location: UK
Posted: 18th Dec 2008 18:07
Does using this function stop the 60 fps limit?

Lilith
16
Years of Service
User Offline
Joined: 12th Feb 2008
Location: Dallas, TX
Posted: 18th Dec 2008 18:11
It did for me. But for my program I still only got about 110 fps, both at home (2.3 GHz) and at the office (3.2 GHz). Still not shabby, and it gets me around the issue of the hardware controlling my frame rate. If I'm going to have to deal with timer issues, I'd rather not have the end user's refresh rate dictate how I program the timing.

Lilith, Night Butterfly
I'm not a programmer but I play one in the office
chanchan
15
Years of Service
User Offline
Joined: 17th Dec 2008
Location:
Posted: 19th Dec 2008 06:55
I read on some site that the human eye won't see any difference at frame rates higher than 60 fps.

So except for benchmarking, there is no need to make the frame rate higher than 60 fps.

Always learning
Lilith
16
Years of Service
User Offline
Joined: 12th Feb 2008
Location: Dallas, TX
Posted: 19th Dec 2008 07:03
Quote: "
I read on some site that the human eye won't see any difference at frame rates higher than 60 fps.

So except for benchmarking, there is no need to make the frame rate higher than 60 fps.
"


You probably can't see any difference from 24 frames per second, since that's the rate film usually runs at. However, if I can run at a higher rate, the motion of the objects is less coarse, and collisions are less likely to penetrate deep into the objects.

Lilith, Night Butterfly
I'm not a programmer but I play one in the office
chanchan
15
Years of Service
User Offline
Joined: 17th Dec 2008
Location:
Posted: 19th Dec 2008 07:12
Quote: "You probably can't see any difference from 24 frames per second, since that's the rate film usually runs at."


Not quite; I can still see the flicker in a movie if I concentrate hard. But I won't enjoy the movie that way.
And below 30 fps, games are sluggish; it's quite different from 60 fps games.

I use an ATI Radeon X1050... does ATI default "Wait For Vertical Sync" to on?
I didn't install the ATI control panel (and my installer CD is gone), so I can't configure it, can I?

Always learning
jason p sage
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: Ellington, CT USA
Posted: 19th Dec 2008 07:28
Note though that film has motion blur, and that can make a difference... just a thought.

Also, I read that the human eye's frame rate varies from person to person, anywhere from 30 to perhaps a bit higher than 60... but I can't imagine it's much more than that, if the eye can even interpret that. The brain has a visual "memory" (that's why we see blurs) that sort of melds multiple images together at once; images leave an impression on our cones and rods that lingers a little.

In fact, if you read about how a motion blur shader is usually written, I mean how it does its thing, it's not all that different from what I'm suggesting the human eye does.

--Jason

FIGHTEX
15
Years of Service
User Offline
Joined: 30th Nov 2008
Location:
Posted: 19th Dec 2008 19:34
I have this problem too.

Game.Love
jason p sage
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: Ellington, CT USA
Posted: 19th Dec 2008 20:33
ATI likely has that control panel available. As for this whole dbSyncRate issue, flawed or not, I have trouble seeing frame rates as a real problem.

At one point I saw this as a major issue, but it's not. Now, just to make sure I cover the bases: if it's flashing you're seeing with ATI, well... I don't know about that...

However, I am familiar with frame rates and screen tearing. I think if some control is "missing" for the sync rate, we can at least be thankful that it is "stuck at 60" rather than not being capped at all.

For performance concerns, you set up your own timer and only call the sync when you get the 16.6 millisecond "tick" from your home-made timer. Why? You can run code faster than the refresh rate/sync rate using this technique.

Now I'm working on a DirectX application that needs the absolute fastest performance one can get from DirectX, and tearing is not an issue because I'm dealing mostly with points and wireframes. Frankly, it's not a video game, so tearing, even with textures, is acceptable, because the only time the screen changes is when the user whips the mouse around to change the view.

I've read a few articles that say basically the same thing about this issue in regard to 3D in general, which has helped change my reasoning about tearing and refresh rates.

I used to think the faster the better... but I'm now convinced that this is only true during benchmark testing, and while optimizing your code to run as smoothly and as fast as possible, because the FPS can be a great gauge.

Tearing images in your final product just look messed up... and 60 FPS is actually a pretty darn fast render speed.

I'm now convinced this is why TGC doesn't comment all that much about it. For testing, OK, you might want that unthrottled FPS for optimization and benchmarking, like I said... and OK, I'll agree this is an issue for many of you. But I also know that each video card has different programmable parameters, and as such some may exclude this "monitor refresh rate" VSync lock and not make it easy, or possible, to change. I'm also aware that DBPro/DarkGDK etc. might not honor your preferences for some video cards, forcing the lock for reasons they have for making things just work right with that brand of video card and/or chipset.

I have an nVidia 9600 GT, and it runs even my DBPro Iron Infantry at 65-70 FPS, which is much faster than my previous system could do and is obviously not capped at 60. I also have my nVidia set to software controlled, meaning the software "is allowed" to change this setting if it wants to.

Now, I'm not yelling or meaning to sound authoritative; this post just has my views on the matter I interpret this discussion as mostly being about.

Now I want to know how this issue is causing major problems in developing your game/editor.

I have read many people mention game timing, and how their game doesn't run right if the FPS sync rate can't be controlled. I took the advice of one of the newsletters and wrote all my code to use my own flavor of a home-grown timer system. I already knew the importance of writing code so my program performs as well as possible regardless of whether it's running on an old clunker or a turbo-jet computer system, and after reading that newsletter I decided not to use the refresh rate as a "cheap shortcut" to get around this extra labor.

Once I forced myself to write my own timing routines to handle object movement etc. based on this home-grown timer code, I was quite pleased! Furthermore, I have early builds from Iron Infantry's development without it, and newer stuff you may have seen on Google that does have it, and you know, the code that has it runs identically regardless of the video card or computer specs... unless, of course, the PC is so slow the software is chugging along very slowly.

--Jason

Lilith
16
Years of Service
User Offline
Joined: 12th Feb 2008
Location: Dallas, TX
Posted: 19th Dec 2008 22:26
The reason I get so concerned about frame rate is because I have to think about the other guy if I'm ever going to release my game(s). My home machine tops off at 30 fps with vertical sync active. My work machine tops at 60 fps. If I write a game at the office and find myself more restricted at home then it's more work that I need to do to make my game run smoothly. I have no guarantee that it will run as well on someone else's workstation and I don't have the resources to test it very easily.

Lilith, Night Butterfly
I'm not a programmer but I play one in the office
AlexI
19
Years of Service
User Offline
Joined: 31st Dec 2004
Location: UK
Posted: 20th Dec 2008 00:35 Edited at: 20th Dec 2008 00:37
I made a large level in irrLitch which ran 1000fps+ ! However a level half the size using the same objects was running about 30fps using DarkGDK.

roka
16
Years of Service
User Offline
Joined: 2nd Apr 2008
Location:
Posted: 20th Dec 2008 00:46
I have the same problem with Dark Basic Pro: when I play another game and afterwards work on my project, the FPS won't go over 60. If I restart my computer, I can work over 60 again. Strange.

Roka The French Boy
jason p sage
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: Ellington, CT USA
Posted: 20th Dec 2008 02:10
@Lilith - Sorry your home PC is only getting 30fps... that seems like its maxed out regardless of the vert sync cap.

@AlexI - I don't understand the point you're making. Irrlicht doesn't lock FPS and TGC does? TGC stuff is just slow? Irrlicht is super fast? I can understand all these points; I just don't know which one you're making.

@Roka - that is kind of bizarre... I have no idea what would be causing that, except a simple finger-point at TGC possibly altering a setting in your video card that makes it happen, perhaps not setting certain registers, or something, back to how they were. Typically in DX9 you clean up and "release/destroy" the adapter/device, and it should all be swell.

--Jason

Lilith
16
Years of Service
User Offline
Joined: 12th Feb 2008
Location: Dallas, TX
Posted: 20th Dec 2008 02:48
Quote: "Sorry your home PC is only getting 30fps... that seems like its maxed out regardless of the vert sync cap."


Not so. With the vsync turned off I get roughly 110 fps. I just find it hard to believe that any modern video card would only refresh 30 times a second.

Lilith, Night Butterfly
I'm not a programmer but I play one in the office
jason p sage
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: Ellington, CT USA
Posted: 20th Dec 2008 04:27
Quote: "I just find it hard to believe that any modern video card would only refresh 30 times a second"
I did too.. but for me the solution was a PC upgrade (I had a pretty slow PC before the one I'm using now)

sydbod
16
Years of Service
User Offline
Joined: 14th Jun 2008
Location: Just look at the picture
Posted: 20th Dec 2008 05:02
Quote: "Not so. With the vsync turned off I get roughly 110 fps. I just find it hard to believe that any modern video card would only refresh 30 times a second.
"


Your own statement says that the same video card will also run the game at 110 FPS, so that should tell you that the problem is not the video card, but something else.

The video card will set its vertical scan rate to whatever is required for the video device ( LCD screen or CRT).

I would assume, since you are seeing 30 FPS when using "Wait for VSync" mode, that you are using an LCD screen that has a 30 Hz refresh rate, probably a very cheap or early-generation LCD, or that you have a 30 Hz refresh rate selected in the video driver.
sydbod
16
Years of Service
User Offline
Joined: 14th Jun 2008
Location: Just look at the picture
Posted: 20th Dec 2008 05:13
Quote: "I made a large level in irrLitch which ran 1000fps+ ! However a level half the size using the same objects was running about 30fps using DarkGDK."


I suspect you may be comparing apples with oranges.

Are you sure you are doing the same in both environments?

If I had to take a guess, I would bet that "irrLitch" was automatically instancing multiple copies of the same object, while you are still loading and displaying individual objects for identical objects within DGDK.

Who knows how many other features you are running in different modes between the two.
Lilith
16
Years of Service
User Offline
Joined: 12th Feb 2008
Location: Dallas, TX
Posted: 20th Dec 2008 05:21
The monitor has three settings: 60, 70 and 75 Hz. I've tried bumping it up to the higher levels and it continued to do 30 fps.

Lilith, Night Butterfly
I'm not a programmer but I play one in the office
sydbod
16
Years of Service
User Offline
Joined: 14th Jun 2008
Location: Just look at the picture
Posted: 20th Dec 2008 05:29
Is it an LCD screen or a CRT display?
Lilith
16
Years of Service
User Offline
Joined: 12th Feb 2008
Location: Dallas, TX
Posted: 20th Dec 2008 05:31
LCD screen

Lilith, Night Butterfly
I'm not a programmer but I play one in the office
sydbod
16
Years of Service
User Offline
Joined: 14th Jun 2008
Location: Just look at the picture
Posted: 20th Dec 2008 05:37
I get the impression that your screen is forcing the video card to switch to 60Hz "Interlaced" mode.
This produces a full rendering at effectively 30Hz.

Do you have an old CRT screen that you can try instead of the LCD, to see if the FPS jumps up to 60 with it?
Lilith
16
Years of Service
User Offline
Joined: 12th Feb 2008
Location: Dallas, TX
Posted: 20th Dec 2008 06:08
I used to have one as my second screen. It's around here somewhere but I'll have to dig for it.

Lilith, Night Butterfly
I'm not a programmer but I play one in the office
AlexI
19
Years of Service
User Offline
Joined: 31st Dec 2004
Location: UK
Posted: 21st Dec 2008 14:44
Quote: "@AlexI - I don't understand the point you're making. Irrlicht doesn't lock FPS and TGC does? TGC stuff is just slow? Irrlicht is super fast? I can understand all these points; I just don't know which one you're making."


I was making the point that DarkGDK has capped the FPS and Irrlicht has not. How can I uncap DarkGDK?

jason p sage
17
Years of Service
User Offline
Joined: 10th Jun 2007
Location: Ellington, CT USA
Posted: 22nd Dec 2008 08:43
@Alexi - No idea how to beat the cap you're experiencing, but hopefully you can tweak your code a bit to still utilize as much CPU as you can rather than being throttled. Yes, I understand you should have the option of running capped versus uncapped, but I think it varies machine to machine, and video card to video card, as I haven't hit this with my nVidia 9600 GT yet.
