
Geek Culture / The Incredible Inefficiency of DBPro

Author
Message
Dark Java Dude 64
Community Leader
14
Years of Service
User Offline
Joined: 21st Sep 2010
Location: Neither here nor there nor anywhere
Posted: 27th Jul 2012 07:05
Randomness 128 and I did a little test comparing the speeds of Java and DBPro. Well, he did. He made two programs, one in DBPro and one in Java. They both filled 100,000,000 array elements. The DBPro program took 1930 milliseconds. That's pretty much two seconds. Java, on the other hand, took 170 milliseconds. .17 seconds that is.
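
For reference, the shape of such a benchmark can be sketched in C++ (the language choice, function name, and timing method here are illustrative assumptions, not the actual test code described above):

```cpp
#include <chrono>
#include <cstddef>
#include <vector>

// Fill `n` array elements and return the elapsed milliseconds.
// Mirrors the structure of the test described above: one tight
// loop writing sequential values into a preallocated array.
long long fill_benchmark_ms(std::size_t n) {
    std::vector<int> a(n);
    auto start = std::chrono::steady_clock::now();
    for (std::size_t i = 0; i < n; ++i)
        a[i] = static_cast<int>(i);
    auto end = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count();
}
```

One caveat with cross-language benchmarks like this: allocation strategy and bounds checking differ between runtimes, so the loop body may not be doing identical work in both languages.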

He did this after we saw some posts regarding the ASM output of the DBPro compiler. We did some of our own tests. A simple program that creates an integer named number and stores a value of 0 in it outputs somewhere around 40-50 ASM instructions. Another thing: I made a program that defines 3 functions and calls only 2. The compiler still included the uncalled function in the ASM, even though it wasn't needed.

Just some things I thought I'd say. DBPro is an excellent language, there is no arguing that. It's just not at all something you will want to use for really math- or logic-intensive applications. I think TheComet and someone else also had some things they posted about the ASM outputs...
Jeku
Moderator
21
Years of Service
User Offline
Joined: 4th Jul 2003
Location: Vancouver, British Columbia, Canada
Posted: 27th Jul 2012 08:31
There's no point comparing Java to DBP. There are worlds of difference between those types of languages. It would be akin to benchmarking C++ against Visual Basic.


Senior Developer - CBS Interactive Music Group
TheComet
17
Years of Service
User Offline
Joined: 18th Oct 2007
Location: I`m under ur bridge eating ur goatz.
Posted: 27th Jul 2012 08:37
Even so, the DBP compiler is one of the most inefficient compilers out there. There's no optimisation whatsoever.

TheComet

nonZero
13
Years of Service
User Offline
Joined: 10th Jul 2011
Location: Dark Empire HQ, Otherworld, Silent Hill
Posted: 27th Jul 2012 09:07 Edited at: 27th Jul 2012 09:10
Yes, DBPro is not the most efficient, but with assembly it is not always the number of lines that matters. In fact, fewer lines can mean worse performance in some cases.

As far as including uncalled functions goes, I'm on the fence. DBP may have future uses for me that will require uncalled functions to be present in the output.
I think there need to be more preprocessor commands to address this and many other issues. Say, "#ALL_INCLUDE" for situations where it's necessary to keep uncalled functions.
The main thing I'd like to see is a C-style include system whereby one need only include the necessary DBPro features. For example, in C++ I can make an executable that outputs a word to the console at the cost of a few KB. The DBPro equivalent is going to cost me a few MB.

Remember too that DBPro is a games design package. Its purpose is to interface with DX9 without the need for complex programming skills. Of all the BASICs I have encountered, DBPro is still the highest-performing and best-maintained without losing too much in simplicity or features. If you want better performance, use Unity3D or something C-oriented (I believe U3D is C#, but I've yet to try it out due to being so darn busy (and procrastinations like this)).

I think, though, your speed discrepancy might be due to memory management. DBPro is actually reasonably fast with basic math.

Anyway, just my opinion as a cat and by no means an expert like all the humans out there.

Dark Java Dude 64
Community Leader
14
Years of Service
User Offline
Joined: 21st Sep 2010
Location: Neither here nor there nor anywhere
Posted: 27th Jul 2012 09:33
True points mentioned above.
BMacZero
19
Years of Service
User Offline
Joined: 30th Dec 2005
Location: E:/ NA / USA
Posted: 27th Jul 2012 09:43
Quote: "The main thing I'd like to see is a C-style include system whereby one need only include the necessary DBPro features. For example, in C++ I can make an executable that outputs a word to the console at the cost of a few KB. The DBPro equivalent is going to cost me a few MB."

DBP actually does do this (automatically). Every DLL, including core DLLs, is only included if you actually call a command from it. Filesize is still quite high of course, probably because the basic DLLs such a program uses are themselves quite large.

http://brianmacintosh.com
"'Who's that primitive entity accessing sliding collision data from my bridge?!', the troll roared." - TheComet
Jeku
Moderator
21
Years of Service
User Offline
Joined: 4th Jul 2003
Location: Vancouver, British Columbia, Canada
Posted: 27th Jul 2012 23:37 Edited at: 27th Jul 2012 23:38
I'm still of the opinion that it's not the tools that make a game great, but the creator's skills. I've seen amazing games made in BASIC variants like PlayBasic and Visual Basic, and I've played muddy messes made with Java and OpenGL. If you're a hobbyist and you need the power of something like Unreal Engine 4, then you're in the minority and you should use UE4. But take a look at the famous indie games over the past 5 years. Many of them could technically have been made with a tool like DBP.


Senior Developer - CBS Interactive Music Group
swissolo
15
Years of Service
User Offline
Joined: 9th Jan 2010
Location:
Posted: 27th Jul 2012 23:51
Quote: "Even so, the DBP compiler is one of the most inefficient compilers out there. There's no optimisation whatsoever."

This made me happy.
Reminded me of compiling evolved's adv lighting after fully applying it to 10,000 lines of code

swis
Joined: Tue Dec 16th 2008
Interstellar
Dark Java Dude 64
Community Leader
14
Years of Service
User Offline
Joined: 21st Sep 2010
Location: Neither here nor there nor anywhere
Posted: 28th Jul 2012 00:01
That doesn't sound fun :/
nonZero
13
Years of Service
User Offline
Joined: 10th Jul 2011
Location: Dark Empire HQ, Otherworld, Silent Hill
Posted: 28th Jul 2012 00:36
Quote: "DBP actually does do this (automatically). Every DLL, including core DLLs, is only included if you actually call a command from it. Filesize is still quite high of course, probably because the basic DLLs such a program uses are themselves quite large."

Hmmm... Never noticed. Just assumed all were included coz of filesize, lol! Well that's good to know. At least there's some trimming.

Quote: "
I'm still of the opinion that it's not the tools that make a game great, but the creator's skills."

Totally, 100% agree. When it comes to games, you play what you're dealt.

Quote: "
Anyway, just my opinion as a cat and by no means an expert like all the humans out there."

Darn it that cat's been at my PC again. Oh well, I learned something from it so thanks cat. But please at least tell me you're gonna post OR get your own account

Aaron Miller
19
Years of Service
User Offline
Joined: 25th Feb 2006
Playing: osu!
Posted: 28th Jul 2012 00:58
I'm writing this from my phone right now. I'll post back later with more details and examples.

The original assembly outputs were in the posting competition. I'll link later.

@nonZero
Your statement annoys me greatly because you don't elaborate or explain which class of functions. You don't provide examples either. And, that's irrelevant given the context of the discussion: Dark Basic Professional does none of these optimizations.

----------------------------------

Assembly Line Count
More lines = more cache usage. The clock cycles of some instructions are much shorter than others. For example, IMUL is faster than IDIV, and there's a trick to convert constant divides into constant multiplies with a few extra lines of assembly, making the IMUL version faster overall. In the case of DBPro, no optimization is performed prior to the code generation phase. This is evident when "x = 1 + 2 + 3 + 4 + 5" produces a bunch of memory/register operations, which can cost more than the instructions themselves. (I read somewhere that memory fetches can cost about 500 cycles.) This is of course more inefficient than "mov dword [x], 15". Also, "exitfunction 0" should just be "xor eax, eax; retn" (where ';' is an instruction separator, not a comment). It isn't, IIRC. Each line also does error checking, which is not even remotely smart.
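
For comparison, the same expression written in C++ is folded at compile time by any optimizing compiler. This toy function (illustrative only) demonstrates the constant folding DBPro skips:

```cpp
// An optimizing C++ compiler evaluates the whole expression at
// compile time and emits the equivalent of `mov eax, 15; ret`.
// Per the ASM dumps discussed in this thread, DBPro instead
// emits a chain of memory/register operations for the same line.
int folded() {
    int x = 1 + 2 + 3 + 4 + 5;
    return x;
}
```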

Dead Functions
When a function is not used, it should be removed. This is universally considered a good thing. Not removing dead functions has NO UTILITY. A dead function is by definition a function that is not used. Keeping it because "it might be used eventually" is not logical, because it won't be used eventually. If at some point you need to call the function, just recompile, which you'd have to do anyway because you changed the code to require the function. If you need the function's pointer to call it at some other time, the issue is the language, not the optimization. Many BASICs have a way to get the function pointer.
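
Dead-function elimination is just a reachability pass over the call graph: anything not reachable from the entry point is dead. A toy sketch in C++ (hypothetical structure, not DBPro's actual compiler internals):

```cpp
#include <map>
#include <set>
#include <string>
#include <vector>

// Toy call graph: function name -> names of functions it calls.
using CallGraph = std::map<std::string, std::vector<std::string>>;

// Return the set of functions reachable from `entry`. Anything
// not in this set is dead code and can be dropped from the output.
std::set<std::string> reachable(const CallGraph& g, const std::string& entry) {
    std::set<std::string> seen;
    std::vector<std::string> work{entry};
    while (!work.empty()) {
        std::string f = work.back();
        work.pop_back();
        if (!seen.insert(f).second) continue;  // already visited
        auto it = g.find(f);
        if (it != g.end())
            for (const auto& callee : it->second)
                work.push_back(callee);
    }
    return seen;
}
```

As Aaron notes later in the thread, exported DLL functions would simply count as extra entry points here, so they are never considered dead.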

Tools Versus Creativity
Tools aid creativity. They're meant to improve your efficiency. In many ways games are an art precisely because of the engineering effort involved in making them interactive at acceptable frame rates, among the other arts. The tool is only usable for what it does accomplish. If your code is as optimized as is practical with that tool and you still don't have acceptable performance, well... it's not a very "professional" tool, is it? It's a prototyping tool and a learner's aid. Nothing more. Things are sold with it, yes, but that doesn't mean they're "professionally made." It doesn't mean they're innovative or pushing the art forward. That's what every game or engine programmer should be aiming for... not how to make a simplistic product work for their code. If it works for you, though, use it. If not, don't. When efficiency improvements come with a price tag ("Elite") where they should never have been an issue in the first place, just judge for yourself. It seems pretty silly.

How to Improve the DBPro Compiler
DBPro to C; compile with clang. Output will be more optimal. Everything will still work. It'd be more cross-platform (e.g., 64-bit Intel, ARM, etc) ready. All of the DLL loading and stuff could still work because it's all automated by the source language.
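
The translate-to-C idea boils down to emitting C source from the already-parsed DBPro statements. A deliberately tiny sketch of the emission stage (the function name and the single supported statement form are assumptions for illustration; a real backend would sit behind the full DBPro parser):

```cpp
#include <sstream>
#include <string>

// Translate a single toy-BASIC assignment like `x = 1 + 2` into
// a C statement. This only illustrates the output stage; the
// expression is passed through verbatim, and clang's optimizer
// would then do the constant folding DBPro's codegen skips.
std::string emit_c_assignment(const std::string& var, const std::string& expr) {
    std::ostringstream out;
    out << "int " << var << " = " << expr << ";";
    return out.str();
}
```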

Miscellaneous
I'm not saying don't use DBPro. I'm saying it should be more efficient. That's all.

Cheers,
Aaron

Diggsey
19
Years of Service
User Offline
Joined: 24th Apr 2006
Location: On this web page.
Posted: 28th Jul 2012 03:31
The DBPro compiler itself was never about efficiency, it focuses on ease of use. The commands themselves are all written in C++ so should be quite fast.

If you really need speed, the upcoming release of DarkGDK 2.0 should be of some use! You will even be able to use it from both VB.net and PureBasic in case you want to stick with a BASIC language.

Aaron Miller
19
Years of Service
User Offline
Joined: 25th Feb 2006
Playing: osu!
Posted: 28th Jul 2012 03:47
The commands themselves are usually not fast at all either. Consider that images, objects, etc, are all given IDs. 1, 2, 3, etc. The underlying implementation for this is a std::map iirc. This is rather inefficient considering better algorithms and methods are available. For example, actual handles. Many people end up using ID managers just so they don't have to manage the IDs manually, which is quite sensible considering that's how it works with memory. The language itself is unique among BASICs, and I like certain portions of the syntax. However, the compiler associated with that syntax is immensely inefficient. Though efficiency is not a concern of the developers behind the language, it becomes a concern for most developers using it (the serious ones, anyway), and that inefficiency does not reflect well on the language. This matters because there's a large misconception that the language used (the syntax) has a direct effect on the efficiency of the code. It doesn't: a language (if it can be compiled and does not rely on runtime operations) is just a syntax. For example, if Lee Bamber made a C compiler, it would probably be just as inefficient as the DBPro compiler.
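
The cost difference being described is an O(log n) ordered-map walk per command versus an O(1) index dereference. A minimal sketch of the two styles (hypothetical types and globals, not TGC's actual implementation):

```cpp
#include <map>
#include <vector>

struct Object { int data = 0; };

// DBPro-style: user-chosen integer IDs, looked up in an ordered
// map. Every access is an O(log n) tree walk.
std::map<int, Object> by_id;

// Handle-style: the engine hands back an index into a pool.
// Every access is an O(1) array index.
std::vector<Object> pool;

Object& lookup_by_id(int id)            { return by_id.at(id); }
Object& lookup_by_handle(std::size_t h) { return pool.at(h); }
```

The trade-off: with IDs the user picks the number (convenient for beginners), while with handles the engine picks it, which is why DBPro users often end up writing ID managers that reinvent handles anyway.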

Random thought: DBPro should adopt the object/linking paradigm commonly found in C/C++ compilers. (I use Makefiles extensively.)

Anyway, even if efficiency isn't (and never was) the concern, it should be.

Cheers,
Aaron

Kevin Picone
22
Years of Service
User Offline
Joined: 27th Aug 2002
Location: Australia
Posted: 28th Jul 2012 04:00 Edited at: 28th Jul 2012 04:03
Dark Java Dude 64,

Quote: "
He made two programs, one in DBPro and one in Java. They both filled 100,000,000 array elements
"


Benchmarking can be interesting, but one pitfall that is easily made is assuming we're comparing equal operations when they might not be. I suspect arrays aren't implemented the same way in both dialects.


Quote: "
It's just not at all something you will want to use for really math- or logic-intensive applications.
"


But filling an array isn't really a test of that, either.


It's well known that DBPRO doesn't produce the most optimal machine code. It was clear during beta testing that they were opting for the function sets to make up for any shortfall.

Phaelax
DBPro Master
22
Years of Service
User Offline
Joined: 16th Apr 2003
Location: Metropia
Posted: 28th Jul 2012 04:17
Quote: " took 170 milliseconds. .17 seconds that is"


huh?

"You're not going crazy. You're going sane in a crazy world!" ~Tick
Aaron Miller
19
Years of Service
User Offline
Joined: 25th Feb 2006
Playing: osu!
Posted: 28th Jul 2012 04:24
170/1000 = 0.17

1000 milliseconds per second.

Cheers,
Aaron

Phaelax
DBPro Master
22
Years of Service
User Offline
Joined: 16th Apr 2003
Location: Metropia
Posted: 28th Jul 2012 04:25
I thought he was saying it was 17 seconds.

"You're not going crazy. You're going sane in a crazy world!" ~Tick
Virtual Nomad
Moderator
19
Years of Service
User Offline
Joined: 14th Dec 2005
Location: SF Bay Area, USA
Posted: 28th Jul 2012 04:30
Quote: "I thought he was saying it was 17 seconds"

go calibrate your monitor already!

Virtual Nomad @ California, USA . DBPro V7.5
AMD Phenomâ„¢ X4 9750 Quad-Core @ 2.4 GHz . 8 GB PC2-6400 RAM
ATI Radeon HD 3650 @ 512 MB . Vista Home Premium 64 Bit
Aaron Miller
19
Years of Service
User Offline
Joined: 25th Feb 2006
Playing: osu!
Posted: 28th Jul 2012 05:03
Hahahaha

Yeah, I agree with Virtual Nomad. ;)

Cheers,
Aaron

Phaelax
DBPro Master
22
Years of Service
User Offline
Joined: 16th Apr 2003
Location: Metropia
Posted: 28th Jul 2012 06:54
I thought it was a period to end the other sentence! Jeez, get off my back mom!

And I'm working on that!

"You're not going crazy. You're going sane in a crazy world!" ~Tick
BMacZero
19
Years of Service
User Offline
Joined: 30th Dec 2005
Location: E:/ NA / USA
Posted: 28th Jul 2012 06:56
Yeah, I thought it was an ellipsis for suspense.

http://brianmacintosh.com
"'Who's that primitive entity accessing sliding collision data from my bridge?!', the troll roared." - TheComet
Aaron Miller
19
Years of Service
User Offline
Joined: 25th Feb 2006
Playing: osu!
Posted: 28th Jul 2012 07:55
Trade on ForEx ... Profit!

The real suspense is in the trading though. xP

Cheers,
Aaron

Diggsey
19
Years of Service
User Offline
Joined: 24th Apr 2006
Location: On this web page.
Posted: 28th Jul 2012 19:04 Edited at: 28th Jul 2012 19:05
Quote: "The commands themselves are usually not fast at all either. Consider that images, objects, etc, are all given IDs. 1, 2, 3, etc. The underlying implementation for this is a std::map iirc. This is rather inefficient considering better algorithms and methods are available. For example, actual handles."


However, the implementation caches the most recently accessed ID, so that if you perform sequential commands on the same resource of a given type (a very common thing to do) it will only perform the lookup once. (Thanks, IanM)

Even when using handles, to be completely safe from crashing it will still have to lookup each handle to check that it's valid, so there is no performance gain. Admittedly it does mean that ID managers can be avoided though!
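
The caching described above can be sketched as a one-entry memo in front of the map lookup (names are illustrative; the real core DLLs differ):

```cpp
#include <map>

struct Resource { int value = 0; };

std::map<int, Resource> resources;

// Cache the last (id, pointer) pair so repeated commands on the
// same resource skip the map walk -- the optimisation credited
// to IanM above. Only a miss on a *different* id pays O(log n).
Resource* find_resource(int id) {
    static int last_id = -1;
    static Resource* last_ptr = nullptr;
    if (id == last_id && last_ptr) return last_ptr;  // cache hit
    auto it = resources.find(id);
    if (it == resources.end()) return nullptr;       // invalid id
    last_id = id;
    last_ptr = &it->second;
    return last_ptr;
}
```

Note this sketch omits invalidation: if the cached resource is deleted, `last_ptr` would dangle, so a real implementation must also clear the cache on deletion.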

nonZero
13
Years of Service
User Offline
Joined: 10th Jul 2011
Location: Dark Empire HQ, Otherworld, Silent Hill
Posted: 28th Jul 2012 19:14 Edited at: 28th Jul 2012 19:20
@Aaron:
Quote: "Your statement annoys me greatly"

You mean cat's statement.
(That comment was 'sposed to make you laugh - hope it didn't annoy you further :s )

Seriously though, I saw your post in my email this morning and went "Huh???" because it basically implied that one of the earliest things I learned was wrong (implying I shouldn't trust stuff I learn on the internet, as if the internet would lie). So I decided to actually investigate further. (If someone challenges something I learned or was taught, I will always re-evaluate it and do my best to find out if there is something wrong with the information, because dogmatism only gets in the way of learning.) Btw, I do agree about the general lack of optimization in DBPro. I was just saying that I was under the impression that optimization didn't necessarily always mean compact code. You see, I didn't really take cache into account, because modern processors boast relatively large cache sizes, but I do see your point there and it's a very valid one. I was only thinking of the processing of instructions. So, for example, jumping back and forth may slow things down as opposed to just repeating the process in a linear fashion.

Repeating the same code block:
DBPro:

ASM Output:



Using a Function called multiple times:
DBPro:

ASM Output:


The code containing fewer lines actually takes longer, while the greater number of lines processes quicker. If I missed something, then I'm sorry. Please excuse my lack of full knowledge here; I started learning ASM many moons ago, got sidetracked (the tutorial isn't even on my machine anymore, lol), and never picked up where I left off. It's possible (more than 'just' possible) that I'm missing something (like maybe one of the dbprocore DLL functions being called is slowing it down?). If that's the case then please lemme know I've messed up, as I do need to know if something I think I know isn't what it appears to be.

Quote: "How to Improve the DBPro Compiler
DBPro to C; compile with clang. Output will be more optimal. Everything will still work. It'd be more cross-platform (e.g., 64-bit Intel, ARM, etc) ready. All of the DLL loading and stuff could still work because it's all automated by the source language."

I fully agree. In fact, when you mentioned the idea the first time around (I'm sure it was you who mentioned it on another thread I read recently), I was really excited about the prospects of such a compiler, as it would negate the need for a 64-bit version of DBPro (a pie-in-the-sky thought I had a while back) and allow infinite flexibility. This is actually WHY I WANTED THE OPTION TO KEEP UNCALLED FUNCTIONS in the output source - TO BUILD DLLs IN DBPRO. Can you imagine how fun that would be? *grins maniacally* (For the record, I have an obsession with trying to make things do what they shouldn't. I'm strange that way. Example: vegetable peeler = cheese slicer - for real.) Well, this is all based on the hypothetical compiler of course, but still. So basically that's what I meant by the comment (I did say "future uses"). Sorry, I perhaps should have clarified myself - or rather the cat should have.

On the note of tools vs creativity, I'm afraid I have to stick with Jeku on that one. I still believe that it's about 30% tools, 70% developer - within reason of course. I mean somebody without a creative bone in their body can't produce anything but generic rubbish, regardless of their software. Okay, if they're using professional tools it'll be pretty polished generic rubbish but generic rubbish nonetheless.

Hopefully nothing in my post annoys you this time and I promise I'll have a word with the cat about making posts on my behalf. In fact I'll even noob slap the cat as I have given him a warning about posting on my other accounts already.

Anyway, here's to hoping Lee gives the go-ahead on this new compiler idea of yours as I think it's an excellent idea.

ionstream
20
Years of Service
User Offline
Joined: 4th Jul 2004
Location: Overweb
Posted: 28th Jul 2012 20:47
There's no reason to have DBPro convert to C to compile in clang when they could just make a DBPro frontend for LLVM (since clang is a C/C++/ObjC frontend for LLVM).

Dar13
17
Years of Service
User Offline
Joined: 12th May 2008
Location: Microsoft VisualStudio 2010 Professional
Posted: 29th Jul 2012 01:04
Quote: "There's no reason to have DBPro convert to C to compile in clang when they could just make a DBPro frontend for LLVM (since clang is a C/C++/ObjC frontend for LLVM)."

That would be really cool.

tiresius
22
Years of Service
User Offline
Joined: 13th Nov 2002
Location: MA USA
Posted: 29th Jul 2012 02:56
The least efficient way to make a game is to complain/obsess about the speed/implementation of the language... and then not make the game!

In other words, optimize your organic algorithm and go make some games.


A 3D marble platformer using Newton physics.
Aaron Miller
19
Years of Service
User Offline
Joined: 25th Feb 2006
Playing: osu!
Posted: 29th Jul 2012 07:45
Okay, I can be a bit more detailed now that it's easy to type again. (Was not near my net connection; only my phone worked for net.)

nonZero
Sorry, I wasn't saying you annoy me, just statements like that. Regardless, my bad.

Quote: "Even when using handles, to be completely safe from crashing it will still have to lookup each handle to check that it's valid, so there is no performance gain."

Your logic there interests me. You are aware that Windows has structured exception handling, which does not suffer from the performance drawbacks of C++'s exception handling because it's "hardware accelerated," so to speak, right? Additionally, had DBPro been written with G++ or Clang in mind (which obviously wasn't possible back then, but today is) they could use __builtin_expect to manage branch prediction. For error detection, this amounts to a "zero cost," (theoretically) according to Intel. (Optimization manual.) In my tests error handling with the branch prediction added in has been faster than error handling with the prediction left out. Regardless, SEH can still be used.
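
The branch-hint pattern being referred to looks roughly like this with GCC/Clang's __builtin_expect (a sketch; the macro names and the toy error check are assumptions, and the near-zero-cost claim holds only when the error path really is rare):

```cpp
// GCC/Clang builtin: tell the compiler which way the branch
// usually goes, so the hot path is laid out fall-through and
// the error path is moved out of line.
#define LIKELY(x)   __builtin_expect(!!(x), 1)
#define UNLIKELY(x) __builtin_expect(!!(x), 0)

// Error check on every operation, but hinted as the cold path.
int checked_add(int a, int b, bool* error) {
    if (UNLIKELY(error == nullptr)) return 0;  // cold: bad argument
    *error = false;
    return a + b;                              // hot: normal case
}
```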

Quote: "The code containing fewer lines actually takes longer, while the greater number of lines processes quicker."

That's loop unrolling for that particular set of operations. It reduces the number of memory fetches and cache may not become invalid as quickly. In general, loop unrolling is considered a good thing for operations like that.
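
For readers unfamiliar with the term, loop unrolling trades code size for fewer loop-control (compare/branch) instructions. A minimal C++ illustration:

```cpp
// Rolled: one compare-and-branch per element.
int sum_rolled(const int* a, int n) {
    int s = 0;
    for (int i = 0; i < n; ++i) s += a[i];
    return s;
}

// Unrolled by 4: same result, one branch per four elements.
// (Assumes n is a multiple of 4, for brevity; a real unroller
// adds a remainder loop for the leftover elements.)
int sum_unrolled4(const int* a, int n) {
    int s = 0;
    for (int i = 0; i < n; i += 4)
        s += a[i] + a[i + 1] + a[i + 2] + a[i + 3];
    return s;
}
```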

However, for a single operation, generating more code does not mean that it's better. Consider that each instruction costs a certain number of cycles. Adding an integer to another integer should not be as expensive as DBPro makes it. It used to be a function call to a DLL library! That's still the case for adding floats and casting and such, which should not be the case. I'm sure you agree there though.

It's important to note that memory fetches do take quite a while, and DBPro makes a few on each line (mov dword [$_SLN_], <line number>; mov eax, [$_ERR_]) even when it shouldn't be. (I was looking for a paper I read yesterday, but there appears to be no trace of it. I got Google down to 27 results using terms I saw directly in the paper, but no luck. That paper would've really put some things in perspective too.)
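
The per-line bookkeeping described above amounts to roughly this pattern wrapped around every statement (variable names are illustrative, echoing the $_SLN_ and $_ERR_ labels seen in the ASM dumps; this is a C++ rendering of the idea, not DBPro's actual runtime):

```cpp
// Globals the runtime reportedly maintains for error reporting.
int current_source_line = 0;
int runtime_error = 0;

// Every statement is wrapped roughly like this: record the line
// number, do the work, then check the error flag -- two extra
// memory operations per line, even for operations that cannot fail.
int traced_add(int line, int a, int b) {
    current_source_line = line;   // mov dword [$_SLN_], <line>
    int result = a + b;
    if (runtime_error) return 0;  // mov eax, [$_ERR_]; test; jnz
    return result;
}
```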

Quote: "This is actually WHY I WANTED THE OPTION TO KEEP UNCALLED FUNCTIONS in the output source - TO BUILD DLLs IN DBPRO"

Yes, but then the functions would be referenced by the export table of the DLL and therefore not dead.

Quote: "On the note of tools vs creativity, I'm afraid I have to stick with Jeku on that one. I still believe that it's about 30% tools, 70% developer - within reason of course. I mean somebody without a creative bone in their body can't produce anything but generic rubbish, regardless of their software. Okay, if they're using professional tools it'll be pretty polished generic rubbish but generic rubbish nonetheless."

I'm actually being a bit technical on my side. Tools are at every step of the process and are unavoidable. There's the compiler, the modelling software, the audio recording or synthesis, voice over work, animation rigging, etc. Then there's your computer, which serves as the tool for running all the other tools, and the monitor which is meant to display it, and the keyboard, mouse, speakers, etc. Outside of computers there's the pencil and the pen. Sure, you can be really creative with stories and not use any tools at all to tell them... But then how effective is it? You can be creative, but if you're making a computer game, there are tools everywhere.

For programming though, it doesn't matter much what you use as long as it works well enough for your art. Being creative isn't just about the story you tell or the game you make, but also about the engineering ingenuity that goes into it, and how you work with your tools. It's an art-form there, and it can be one of the most creative parts of a game. For me, all of id Software's games are creative because each generation introduces a fundamental new technology that the world then implements. (Even FrostBite 2 uses virtual texturing in places.) Some tech isn't original (DOOM rendering, for example), some was thought up without knowledge of prior existence ("Carmack's Reverse" for stencil shadowing, DOOM 3), and some is completely original (virtual texturing, as far as I can tell). No matter what, their incorporation into the picture is, at least to me, an art-form and creative.

Quote: "Anyway, here's to hoping Lee gives the go-ahead on this new compiler idea of yours"

lol! If I were to make it I just wouldn't be able to call it "Dark Basic" unless he approved. I think it's trademarked, but I don't care to find out. But, all of the same syntax and DLLs can be used.

Quote: "There's no reason to have DBPro convert to C to compile in clang when they could just make a DBPro frontend for LLVM (since clang is a C/C++/ObjC frontend for LLVM)."

Although LLVM is a good option, there are reasons to use clang, and they are logical. First, outputting C gives users a chance to learn from the generated source. It's also easier to output C for clang than to generate the ASM. It might also be faster, because DBP would act mostly as a preprocessor and could pass data almost verbatim into clang. (Of course, error checking and such still has to be performed, but you get my point, I think.) If it exports C, users gain the ability to include C code for any area where DBPro might be lacking. (Or C++, since clang drives both.) It may also be easier to interact with Mac OS X using Objective-C as opposed to assembly. (Perhaps DBPro may offer intrinsics for some operations.)

So yes, LLVM is a good option if none of the above matters to any of the users and they care to learn how to interact with it. Otherwise, it'd be easier to just target clang, and it offers some fringe benefits.

Quote: "The least efficient way to make a game is to complain/obsess about the speed/implementation of the language... and then not make the game!"

A less efficient way than that is to use said language for years and then have to resort to migrating your game away from it because it's not efficient enough to run it at full capacity. That's after, of course, optimizing all your algorithms and dealing with work-arounds and sacrificing convenience for efficiency where both should be available. Still, if it works for you then use it. I'm not saying "go use this instead." This thread is about inefficiency, and it's in geek culture. It's not saying that DBPro can't be used to make a game. It's about expressing that it could be better. Way better.

Cheers,
Aaron

nonZero
13
Years of Service
User Offline
Joined: 10th Jul 2011
Location: Dark Empire HQ, Otherworld, Silent Hill
Posted: 29th Jul 2012 11:40
Quote: "Adding an integer to another integer should not be as expensive as DBPro makes it. It used to be a function call to a DLL library!"

WUUUT?

Quote: "Yes, but then the functions would be referenced by the export table of the DLL and therefore not dead"

Didn't think of it that way. I had this idea in my mind of the precompiler culling the uncalled functions before they got output as C code, because of the previous post. Of course, logically, the precompiler would include exported functions. My mind's somewhere else lately and very much out of context.

Quote: "You can be creative, but if you're making a computer game, there are tools everywhere."

I didn't mean that there were fewer tools involved than creators; I meant that the quality of the tools was of lesser importance (that ratio) than the quality of the artists and coders. Tools help us realise our dreams, just as the tools used to create those tools helped somebody else reach their dreams too. I in no way meant tools weren't necessary. It would be kinda hard to justify my being here if I did.

Quote: "The least efficient way to make a game is to complain/obsess about the speed/implementation of the language... and then not make the game"

Nope, that's the second-least. The least efficient way to make a game is to procrastinate on the TGC forums...and then not get around to making the game.

Diggsey
19
Years of Service
User Offline
Joined: 24th Apr 2006
Location: On this web page.
Posted: 29th Jul 2012 13:11
Quote: "Your logic there interests me. You are aware that Windows has structured exception handling, which does not suffer from the performance drawbacks of C++'s exception handling because it's "hardware accelerated," so to speak, right? Additionally, had DBPro been written with G++ or Clang in mind (which obviously wasn't possible back then, but today is) they could use __builtin_expect to manage branch prediction. For error detection, this amounts to a "zero cost," (theoretically) according to Intel. (Optimization manual.) In my tests error handling with the branch prediction added in has been faster than error handling with the prediction left out. Regardless, SEH can still be used."


Memory protection only works on a per-page basis, as we both know, so an exception is never guaranteed to be thrown when using an invalid handle. It could instead simply overwrite some other piece of memory and cause a crash at some later point.

Aaron Miller
19
Years of Service
User Offline
Joined: 25th Feb 2006
Playing: osu!
Posted: 29th Jul 2012 13:49
Diggsey
There are methods in play that work around this. Feel free to email or IM me if you'd like details.

My main point, however, is that the handle checking wouldn't be more expensive at all. A proper handle/object system could've been used, or even memory could've been passed around directly. This works for other systems with error handling. (I don't want to turn this into a versus thread though, but I think everyone knows what I'm talking about here.)

Of course, checks still have to be performed, but if that's the case anyway, why bother with the ID system? Though it is unique (as far as I can tell), it doesn't really have any advantage over a normal handle system, IMO.
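
For what it's worth, one common way real engines make handles safe without any page-protection tricks (a general technique, not something DBPro actually does) is a generation-counted handle: freeing a slot bumps a counter, so stale handles fail a cheap two-compare check instead of dereferencing dead memory:

```cpp
#include <cstdint>
#include <vector>

struct Slot {
    uint32_t generation = 0;
    int value = 0;
    bool alive = false;
};

// Handle = slot index + the generation it was issued under.
struct Handle { uint32_t index; uint32_t generation; };

std::vector<Slot> slots;

Handle create(int value) {
    // (A real pool would reuse freed slots via a free list;
    // omitted here for brevity.)
    slots.push_back({0, value, true});
    return {static_cast<uint32_t>(slots.size() - 1), 0};
}

void destroy(Handle h) {
    if (h.index < slots.size() && slots[h.index].generation == h.generation) {
        slots[h.index].alive = false;
        slots[h.index].generation++;  // invalidates outstanding handles
    }
}

// Validation is just two compares -- the "lookup" Diggsey
// mentions, but O(1) and crash-proof against stale handles.
int* resolve(Handle h) {
    if (h.index >= slots.size()) return nullptr;
    Slot& s = slots[h.index];
    if (!s.alive || s.generation != h.generation) return nullptr;
    return &s.value;
}
```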

Cheers,
Aaron

Diggsey
19
Years of Service
User Offline
Joined: 24th Apr 2006
Location: On this web page.
Posted: 29th Jul 2012 14:06
Oh, I wasn't suggesting it was the best way, merely pointing out that it wasn't the performance destroyer it was being made out to be.

Kevin Picone
22
Years of Service
User Offline
Joined: 27th Aug 2002
Location: Australia
Posted: 29th Jul 2012 15:14
The DBPRO to C model was a suggested topic before the compiler ever existed. One concern back then, apparently, was getting a free, small, fast and optimal C compiler for the back end. Today that's not much of an issue.

Interestingly, the initial editions of the compiler produced slower runtime code than DB Classic, I suspect from way too much function-call overhead. But anyway.

Login to post a reply

Server time is: 2025-05-21 14:52:47
Your offset time is: 2025-05-21 14:52:47