DarkBASIC / DarkBASIC Pro Optimizer (Expressions Of Interest)
Kevin Picone
22 Years of Service
Joined: 27th Aug 2002
Location: Australia
Posted: 19th Jul 2013 13:22 Edited at: 19th Jul 2013 13:25
DarkBASIC / DarkBASIC Pro Optimizer (Expressions Of Interest)

Ok... Long... long... story short: some five (or more) years back I wrote some source-level optimization tools for in-house use. The project wasn't DB / DBPro related, but the same core principles could be applied; in fact, it'd work with AppGameKit Tier 1 code also. Unfortunately most BASIC compilers have limited optimization support (if any), so they tend to produce a warts-and-all representation of your source code. The redundancies that you might think are nicely pre-computed away end up in your object code. How much impact this makes on your end program really comes down to how iteration-intensive your program is. A general rule of thumb: the longer code gets, the more potential opportunities for improvement.

Now, before anybody gets overly excited (which I highly doubt), what we're talking about here is simply a pre-processor that parses your source code (in multiple passes) and generates a more streamlined version of it. The programmer can control the region and mode the optimizer is applied to using control codes hidden behind remarks, so the code should compile and run as normal without the optimizer, providing it's not dependent upon the directives.
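As a rough sketch of the idea (the directive names here are hypothetical, not a final syntax), the controls hide inside ordinary remarks, so the plain DBPro compiler never sees them:

`#Optimize On
For lp = 1 To 1000
   Total = Total + lp
Next lp
`#Optimize Off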

The optimizer could be wedged into the build process, or run stand-alone (the old version is stand-alone). Not all fragments of code can be streamlined, however, as the focus is mainly on reducing variable / array, memory, string, math and logical redundancies in user code. Routines that are command-set dependent would largely stay the same; some known operations could be substituted with alternative 'faster' methods without altering the original behaviour, but it makes no attempt to understand individual commands from the target language's command set. It simply assumes what you give it makes sense to begin with.


Possible Features:

* Pre-solution of constant expressions (all operators, including a good selection of standard functions; see the sketch after this list).

* Redundant calculation removal / expression caching (within expressions and across lines & scopes)

* Operational replacement (replace / rewrite known fragments with alternative versions, or substitute native commands with 3rd-party library alternatives).

* Pre-processor directives: #Unroll, #Inline, #Macro.

* Obfuscation of output source code (optional).

* Full syntax evaluation. (Explicit mode is possible, providing there's an easy way to create command-set declarations. Quickly catch typos before passing to the compiler and improve your productivity!)
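For instance, pre-solving a constant expression through a standard function (an illustrative sketch; exact rounding would depend on the implementation), this:

Dist# = Cos(45.0) * 100.0

could be emitted as:

Dist# = 70.7107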


That should give you some idea of what's possible without going into too much detail. The obvious question is going to be "how much faster will it make my program?" Sorry, I can't answer that without taking a much closer look at your coding style. What I can say is that most of these techniques are built into PlayBASIC; the redundancy trapping alone can double or triple the performance of some loops, while in others it won't gain anything.

Much of this stuff you could do yourself, though the result is far less intuitive to read. One way I like to visualize execution performance is by imagining every operation in a loop / routine / function costs $1 to execute. If that were the case, we'd all be a lot more attentive to what needs to be constantly computed and what doesn't.

Anyway, let us know if it's worth the effort.

TheComet
17 Years of Service
Joined: 18th Oct 2007
Location: I`m under ur bridge eating ur goatz.
Posted: 19th Jul 2013 14:05
OK, here are my thoughts.

Pre-compiling DBP code is (in my opinion) going to be not far from completely useless. The only case where it might be worth doing is if the programmer is retarded. In such a case, front-end code (the code being passed to the DBP compiler) could be made more optimized.

Have you seen the abomination of assembly output of a DBP executable? http://forums.nuclearglory.com/index.php?tid=2079

Here's an example:

Function GetSum(a, b)
   result = a + b
EndFunction result


In GCC:
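(a sketch of typical gcc -O2 output for that function on x86-64, Intel syntax; not the exact listing from the original post:)

GetSum:
   lea eax, [rdi+rsi]   ; return a + b in a single instruction
   ret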


In DBP: far too long to paste in full; the same three-line function turns into screens of stack housekeeping and calls out to the runtime DLLs (see the link above).
You'd be far better off trying to optimize the generated assembly code with a post-compiler of some sort. Pre-compiling DBP code is like trying to optimize the workflow of a production line at a company without attacking the problem at its roots: HR and upper management.

TheComet

Chris Tate
DBPro Master
16 Years of Service
Joined: 29th Aug 2008
Location: London, England
Posted: 19th Jul 2013 16:42
That all sounds good to me. Anything from catching typos before compilation to boosting the compilation speed is most certainly beneficial. My available time and expertise do not permit me to optimize as much as I would like. I am now having to compile more than one executable for my projects due to an unexpected function count limit, compiling 90,000 lines 20 to 30 times a day.

Kevin Picone
22 Years of Service
Joined: 27th Aug 2002
Location: Australia
Posted: 19th Jul 2013 22:35 Edited at: 19th Jul 2013 22:39
@TheComet

Quote: "Pre-compiling DBP code is (in my opinion) going to be not far from completely useless. The only case where it might be worth doing is if the programmer is retarded. In such a case, front-end code (the code being passed to the DBP compiler) could be made more optimized.
"


Yeah.. I was expecting that reaction.


Quote: "Have you seen the abomination of assembly output of a DBP executable?"


That's actually why it's necessary: the DBPro compiler simply spits out every simple operation in your program, even when they're completely redundant.

ie.
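(an illustrative stand-in for the original snippet:)

X = 100 * 5 + 50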



produces,
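(conceptually, with hypothetical helper names rather than DBPro's literal output; every operation survives to runtime:)

push 100
push 5
call _Multiply        ; 100 * 5 evaluated at runtime
push 50
call _Add             ; + 50 evaluated at runtime
mov [X], eax          ; only now does X get its value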



Normally the optimizer would catch / precompute such calculations so they never make it to the code generation stage, but not here.

If we did, we get,
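(the same illustrative line again:)

X = 100 * 5 + 50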



becomes,
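(pre-solved at compile time:)

X = 550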



I know, I know.. how useless.. But redundancies don't just occur in constant expressions; BASIC programs are full of them. Take this solution for drawing a grid.
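(a sketch of the kind of grid loop meant; sizes and bounds are illustrative:)

For Ylp = 0 To 10
   For Xlp = 0 To 10
      Box Xlp * 32, Ylp * 32, Xlp * 32 + 32, Ylp * 32 + 32
   Next Xlp
Next Ylp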



There are at least three operations on the BOX line that are dead weight.
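(hand-optimized equivalent; the repeated multiplies are computed once and reused:)

For Ylp = 0 To 10
   Ty = Ylp * 32
   For Xlp = 0 To 10
      Tx = Xlp * 32
      Box Tx, Ty, Tx + 32, Ty + 32
   Next Xlp
Next Ylp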



This is really what the solver does: it tries to only compute values when they're needed, not just within a single expression, but across the current scope.



@Chris,

Quote: "
Anything from catching typos before compilation and boosting the compilation speed is most certainly beneficial
"


Now... I haven't tried this targeting DBPro, but I'd be surprised if the end compilation speed from DBPro was dramatically quicker. Obviously, if the pre-processor was able to strip a lot of material from the input sources, say by toggling a debug mode, or switching off the intro/menus or whatever, then I'd expect a quicker build time from DBPro.

ie.
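(illustrative; DebugMode is a hypothetical define, and the directives hide behind remarks:)

`#If DebugMode
Print "Player X : " + Str$(PlayerX)
Print "FPS : " + Str$(Screen FPS())
`#EndIf
Print "Ready"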



So, after processing the source, the output that goes to DBPro would be,
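(with DebugMode set to 0, everything between the directives is stripped:)

Print "Ready"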



The function limit thing sounds an awful lot like the variable limits (see the Test Dark Basic Compilers Variable Limits thread) that people were hitting in DB Classic back in the day. I can think of a way to get around it, but it's not pretty.

Chris Tate
DBPro Master
16 Years of Service
Joined: 29th Aug 2008
Location: London, England
Posted: 20th Jul 2013 04:44
Interesting

Kevin Picone
22 Years of Service
Joined: 27th Aug 2002
Location: Australia
Posted: 21st Jul 2013 18:17
@Chris,

It's possible to strip most, if not all, functions from a program by converting them into subs. The scoping rules would appear the same, as it'd protect locals from variable/array name collisions; function calls effectively become a Gosub/Return. The DBPro compiler's assembly does seem somewhat cleaner when everything is in the one scope, but the output code becomes a huge rat's nest.
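A sketch of the transformation (the name-mangling scheme here is hypothetical):

rem before
Function GetSum(a, b)
   result = a + b
EndFunction result

rem after: locals renamed to dodge collisions, the call becomes a Gosub
GetSum_a = 1 : GetSum_b = 2
Gosub _GetSum
Total = GetSum_result

_GetSum:
   GetSum_result = GetSum_a + GetSum_b
Return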

Do you use a lot of GET/SET-styled abstraction functions in your libraries?

Chris Tate
DBPro Master
16
Years of Service
User Offline
Joined: 29th Aug 2008
Location: London, England
Posted: 21st Jul 2013 20:25 Edited at: 21st Jul 2013 20:28
I do use a lot of GET/SET function calls for dynamic classes; stuff defined outside of DBP or by the user. I also use lots of function-pointer algorithms; functions picked outside of DBP or by the user.

ShellfishGames
12 Years of Service
Joined: 6th Feb 2013
Posted: 22nd Jul 2013 00:35 Edited at: 22nd Jul 2013 00:38
Quote: "* Pre processor directives. #Unroll, #Inline, #Macro"


Oh, that would be so great. Especially inline functions.. I hate the idea that extremely low-level operations implemented as functions (such as the get/set functions you just mentioned) get that huge, unnecessary overhead. I often go out of my way to implement things more efficiently in a far less intuitive/easy/flexible way just to avoid that.

I remember having read a thread on the topic a while ago (this one), but it didn't yield any actual results.

Quote: "Pre-compiling DBP code is (in my opinion) going to be not far from completely useless. The only case where it might be worth doing is if the programmer is retarded."


I think that's a little harsh... Pre-compiling DBP code could be used to reduce overhead of all sorts, so for certain kinds of code it would probably make quite a difference. Also, a pre-compiler could make programming a lot more comfortable: such things as getting the date of compilation or line numbers, or converting characters to ASCII codes at compile time, can be really handy.
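A sketch of such compile-time substitutions (the directive names are hypothetical; a pre-compiler sitting in the build chain would replace them with literals before DBP ever sees the file):

BuildDate$ = "#Date"   : rem expands to a literal such as "22 Jul 2013"
KeyA = #Asc("A")       : rem expands to the literal 65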


Edit:

Quote: "It's possible to strip most, if not all functions from a program by converting them into subs. The scoping rules would be the same as it'd protect locals from variable/array name collisions."


Wouldn't that be problematic when dealing with recursion? You would need a stack for each local variable.. right?

Kevin Picone
22 Years of Service
Joined: 27th Aug 2002
Location: Australia
Posted: 23rd Jul 2013 05:23 Edited at: 23rd Jul 2013 05:36
@ShellfishGames,

Quote: "Oh, that would be so great. Especially Inline Functions.. I hate the idea that extremely low level operations implemented as functions (such as the get/set functions you just mentioned) get that huge, unnecessary overhead. I often go out of my way to implement things more efficiently in a far less intuitive/easy/flexible way just to avoid that. "


Yeah, it's most suited to small functions, since they're basically treated as macros, effectively dumping a localized block into the calling code.

here's an example for those that might not follow what we're talking about.
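(a reconstruction in the spirit of the original; the `#Inline directive, array and variable names are all illustrative:)

Dim ObjectX#(100)
Global SizeOfObjectArray
SizeOfObjectArray = 100
Object = 5

For lp = 1 To 360
   Dot 100 + GetObjectX(Object), lp
Next lp
Wait Key
End

`#Inline
Function GetObjectX(Index)
   Result# = 0.0
   If Index > 0 And Index <= SizeOfObjectArray
      Result# = ObjectX#(Index)
   EndIf
EndFunction Result#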



So the inlined result isn't particularly pretty in this case.
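(the inlined sketch, continuing the illustration above:)

For lp = 1 To 360
   GetObjectX_Result# = 0.0
   If Object > 0 And Object <= SizeOfObjectArray
      GetObjectX_Result# = ObjectX#(Object)
   EndIf
   Dot 100 + GetObjectX_Result#, lp
Next lp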




As you can see, this actually creates redundant calculations within the object-processing For/Next loop, but these can also be pre-solved, since the 'Object' and 'SizeOfObjectArray' variables aren't changing inside the loop.
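(pre-solved sketch; the loop-invariant compare and array fetch are hoisted out of the loop:)

GetObjectX_Result# = 0.0
If Object > 0 And Object <= SizeOfObjectArray
   GetObjectX_Result# = ObjectX#(Object)
EndIf
For lp = 1 To 360
   Dot 100 + GetObjectX_Result#, lp
Next lp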




Yep... A human could easily do better. But just by tagging the compares inside the functions with #IF / #ENDIF directives, this would help the pre-processor out more.
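(illustrative; RemoveSafeCode is a hypothetical define, and the directives hide behind remarks, so the function still compiles, with the safety compare intact, when no pre-processor is involved:)

Function GetObjectX(Index)
   Result# = 0.0
   `#If RemoveSafeCode = 0
   If Index > 0 And Index <= SizeOfObjectArray
   `#EndIf
      Result# = ObjectX#(Index)
   `#If RemoveSafeCode = 0
   EndIf
   `#EndIf
EndFunction Result#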





So when our RemoveSafeCode define is set to 1, the inner loop becomes:
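(continuing the sketch; with the compare stripped and the call inlined, only the array fetch remains:)

For lp = 1 To 360
   GetObjectX_Result# = ObjectX#(Object)
   Dot 100 + GetObjectX_Result#, lp
Next lp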





Still not particularly pretty, since the code needs to compile and run without the pre-processor. If it could be made to sit between the IDE and the DBPro compiler, then really the sky is the limit.



Quote: "Wouldn't that be problematic when dealing with recursion? You would need a stack for each local variable.. right?
"


Yep, it'd need to classify each function up front: detecting when/if a function is used, what it's dependent upon, the number of exit points, and whether the function looks recursive. A recursive function that calls some other user functions, for example, would turn off any call short-cutting. I'm unsure how useful that would be in DB / DBPro at this point, since it just occurs to me as a way around the function limit.
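To make the recursion point concrete, here's a sketch of what converted code would have to do (names illustrative): each local that must survive a recursive call gets an explicit stack.

rem recursive factorial as a sub, with a manual stack for the local 'n'
Dim Stack_n(100)
StackPtr = 0
n = 5
Gosub _Factorial
Print Result
Wait Key
End

_Factorial:
   If n <= 1
      Result = 1
   Else
      Stack_n(StackPtr) = n : Inc StackPtr
      n = n - 1
      Gosub _Factorial
      Dec StackPtr : n = Stack_n(StackPtr)
      Result = Result * n
   EndIf
Return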
