
Dark GDK / 3D scene not rendering to some video cards

Author
Message
Phoenix73
Joined: 27th May 2008
Location: Australia
Posted: 11th Jun 2008 06:32
I am experimenting with creating my own dynamic landscapes and found that using the dbMakeObjectTriangle function was very slow for building up a whole landscape, triangle by triangle.

I found some code in the forums that lets me create objects much faster using the SetupStandardVertex call. This runs very fast on two out of three of my computers - these two have discrete ATI graphics cards.

The third machine has an Intel integrated graphics card that could run my previous dbMakeObjectTriangle program and also runs all the tutorial programs that came with DarkGDK.

For some reason my new code that uses SetupStandardVertex doesn't generate any output on the integrated graphics card. The card is configured to use 128 MB of memory, so I am not sure what feature this card is missing, or what is wrong with my code.

I have attached my source code to this message just in case anyone has a few minutes to see where I might be going wrong.

Thanks in advance!

