
Work in Progress / Et Sulium Sao Eterniae

Author
Message
Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 29th Aug 2014 10:11 Edited at: 30th Aug 2014 08:34
Intro

I’ve been working on a single player RPG for a while now (actually two years off and on, with some fairly lengthy breaks at times), and although it’s still in very early stages, I’m at a point now where I am confident in my ability to pull it off given time.

The game itself is planned to be deeply story driven with a strong focus on characters. There will of course be engaging action and combat, exploration and decision making, but I truly want you to be invested in the characters, who they are, what they are doing, and why.

Most of the work so far has been preparation and technical coding, so there isn’t much in the way of gameplay yet, but I promise there is much more to come!

This thread will be part WIP, part dev log, and part design theory.

Currently I’ve just finished integrating Evolved’s Advanced Lighting into the project. Yes, just like everyone else… I know a bunch of people are jumping in with this lately, and projects using it will look sort of same-y until people start replacing his stock media with custom textures and shader tweaks. Honestly though, I don’t mind seeing a bunch of people using it, it has clearly helped to draw some interest back to DBpro and I’m confident that projects can and will begin to differentiate themselves visually as things progress.

I’ll post up some current screens and a video to show where the project stands now, but then I’m going to backtrack a bit and cover what has already been done to reach this point.

-----------------------------------------------------------------

So here is the present:

Video quality is a bit laggy, sorry; it will take some experimentation to get a good balance of visual quality, framerate, and filesize.

This video presents the current state of the engine. Note the character and camera control, collision detection and physics, lighting and shaders, and the flexible, responsive UI. It also has pause functionality and much more. Over the next several posts I will back up and go into each of these components in more detail.



Here is a short FRAPS capture. It doesn't show as much, but it is a better quality recording as far as lag and color banding go.



-----------------------------------------------------------------

And here is the past:

Part 1 – Concept and Pre-Development

My Thoughts:

For any large-scale project, I think that it’s good to start with some preliminary work on overall concept and design of the content. This step is entirely research and creative writing/thinking, and while it requires an understanding of what you are capable of successfully producing in terms of code and media, it shouldn’t involve any actual work on either.

You don’t need to completely flesh out and finish a story, define every character with a complete background, or work out every plot line and twist, but you should put together a basic overview: a vision of what you want the game to be, how it should feel, who it is for, and what sort of mechanics, themes and styles it will use.

It doesn’t need to be a full, formal game design document, though it certainly can be if you want. The important thing is that you feel comfortable in the understanding of what the game is and where it is going.

Personally, I like maps, probably too much. I draw them for everything. It’s one of the first things I do when it comes to starting a lengthy story. Once you have a world pinned down, its people start to emerge. They are shaped by their geography and neighbors. The history of nations and peoples begins to build and the world as a whole starts to come together, feeling ever more solid and real.

It’s important that everything fits together and isn’t just thrown in at random because you suddenly realize something needs to be present or needs to be a certain way for something else you’ve just decided to add. Changes and additions can always be made, and will be, but they need to be cohesive and have some kind of plausible root or link to the overall whole.

Once I have built a general culture, I can begin to identify and flesh out individual characters within that culture, their personal stories and interactions with other individuals, who they are as a person and how they fit into their greater society. This is all important for determining the motivations behind their actions, particularly for villains.

A character should not do evil things because they are ‘evil’ or because they are the ‘villain’; that’s simplistic and creates shallow characters. A character should be the evil villain because of the things that they do. The question, then, is why do they do these things?

Often it’s far more complex than any one thing. To humanize a character, their actions and beliefs need to be the result of identifiable human motivations. The villain doesn’t need to conform to any standard concept of ‘evil’, at times they can simply stand in opposition or disagreement with the player for any number of reasons. This can allow for interesting changes in factions and alliances as things progress, which is not always possible when things are simply black and white.

The Results:

While there will certainly be action elements and hopefully fun and engaging combat, this game will be largely story and character driven, and it will be left up to the player to decide who is in the right, who is in the wrong, and why. There is rarely a clear cut good and evil. Players will face complex choices between loyalty and love, duty and honor, power and morality in which there isn’t a single ‘right’ answer, it will be subjective to the player.

This will be a branching story in which your choices can drastically alter your experience. There are entire levels a player may never even see, depending on their actions. And the allies of one playthrough may be the enemies of another.

All actions and choices have consequence. Time is always passing, and events in a location will continue to progress with or without you. Your presence or absence in a given place at a given time will also alter the story you experience.

Progress:

The overarching plot and its lines of possibility have been largely written and mapped. I have clear written concepts and descriptions of each level and its events, of the characters present and their possible interactions and options.

Most of the major characters have been identified. Backgrounds, personalities and motivations have been mostly fleshed out, though much of the player’s own character has been left open for the player to determine and individualize as play progresses.

Visual concepting of the characters, mostly for modeling, still needs a lot of work. I’ve only really begun work on one or two, and these are mostly incomplete. I’m also not much of a traditional artist and my sketching/painting leaves a lot to be desired :/

I put together a name generator utility to assist in character development as well as for more minor NPC generation, discussed here:

http://forum.thegamecreators.com/?m=forum_view&t=203419&b=1&msg=2433559#m2433559

I had begun writing out the detailed dialogue script for the initial level, which may also serve as a demo level, but I have decided to set that aside until I put together a specialized editor to handle the creation and management of story elements: things like dialogue and responses, events, triggers, and cinematic camera handling for the story bits which fall outside of normal gameplay control. This conversation mode will end up being an important part of the game experience, and will be fully immersive and interactive. The use of cut scenes will be tightly limited.

OK, so back to the general pre-development discussion: preparation and ideas only go so far before it is time to start getting things going in game. With a solid concept base to refer to, it’s time to start laying down some code.

If there are any mechanics or components you’re not sure about at this stage, it can be helpful to work up a small, isolated proof of concept for that particular bit before starting on the larger project as a whole. It’s important to understand, or at least have a solid idea of, how each piece will eventually fit together, so that when you do begin the main project everything can more easily slip into place without having to rework the core foundations again and again as new pieces are added.

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 29th Aug 2014 10:47 Edited at: 29th Aug 2014 10:59












MrValentine
AGK Backer
13
Years of Service
User Offline
Joined: 5th Dec 2010
Playing: FFVII
Posted: 29th Aug 2014 15:20
Can you elaborate on the various components you are using?

Physics? Etc...

Nice work so far!

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 29th Aug 2014 18:34
MrV: Thanks, appreciate it!

Yep, I plan to go in depth on each piece, at least one solid update per week. They take a good bit of time to write up and prepare though so I don't want to commit to more than that just yet.

As to the physics specifically, I am not currently using a physics engine or package like DarkPhysics or Havok; I am, however, applying specific physics principles and calculations where appropriate.

Movement has inertia: speed ramps up and down over a short curve before hitting a maximum or stopping. When coming to a stop from a sprint, you first slow to a jog then to a walk.

Movement slows as slope grades get steeper, and you will slide and fall when it gets too steep.

Jumping and projectiles are handled through arc trajectory calculations and are affected by velocity, weight, gravity, etc.
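
To give a rough idea of the kind of calculation I mean (a simplified sketch with names made up for this post, not the actual engine code), the vertical part of a jump or projectile update basically boils down to:

rem rough sketch only: one vertical arc step per frame
global GRAVITY#
global ARC_VELY#
global ARC_POSY#
GRAVITY# = 9.8

function arc_update(dt#)
    rem gravity pulls the vertical velocity down, velocity moves the position
    ARC_VELY# = ARC_VELY# - GRAVITY# * dt#
    ARC_POSY# = ARC_POSY# + ARC_VELY# * dt#
    rem a real version would also run a collision check here to detect landing
endfunction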

The combat system in particular will be more rooted in physics principles than most RPGs, but that is a story for another day

I do intend for players to be able to interact with the environment in some ways, but closer to Ocarina of Time than to HL2 if that makes sense.

I may eventually implement a true physics engine, but so far, a few specific calculations have been enough to cover my needs and performance remains high. Fortunately, the project is highly modular in design and it is pretty easy to swap components in and out.

Chris Tate
DBPro Master
15
Years of Service
Recently Online
Joined: 29th Aug 2008
Location: London, England
Posted: 29th Aug 2014 18:42 Edited at: 29th Aug 2014 18:49
Congratulations on posting up your project; and my, what a great one. Wow, it is so good to see a new RPG made with DBP.

I am looking forward to reading more, seeing how the game progresses, and in particular your artwork and the character system you end up designing.

Keep at it. Do not underestimate your potential.

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 29th Aug 2014 19:03
Thanks Chris.

You know, I've known for a while that it was probably past time to share, but I have always held back because there was one more thing I felt needed to be done, then one more thing after that. I wanted to get the cottage complete, at least the walls textured and a roof added in, and this will indeed be at the top of the list, but waiting any longer to post is really just procrastination at this point.

This character isn't complete, texturing and hair and animation all need more work, but she is far enough along to perform basic game functions and that's a good thing! Perhaps more importantly though, she has helped establish a solid workflow for asset production and helped me work out a bunch of the tricky little caveats involved in getting a pipeline going from Blender to DBPro and working with both Sparky's Collision and Evolved's Advanced Lighting. Each of those three has things you need to be mindful of when creating and exporting the model, or something just isn't going to work properly.

This is also not intended to be the main player character, that one is still in the concept phase and needs more design work before I begin to model.

wattywatts
14
Years of Service
User Offline
Joined: 25th May 2009
Location: Michigan
Posted: 29th Aug 2014 20:30
Well first off, it looks pretty nice indeed. I'm having a hard time believing that character render is under 7k tris. I'm pretty sure I spy some additional geometry not shown in the wireframe; is that a render without diffuse, but with a normal map?
Or just a low poly wireframe on top of a high poly model? Or am I totally off?
Chris Tate
DBPro Master
15
Years of Service
Recently Online
Joined: 29th Aug 2008
Location: London, England
Posted: 29th Aug 2014 20:40 Edited at: 29th Aug 2014 20:45
I see your point.

It can be frustrating to present a work in progress inferior to what you intend to release; even when the game is released there may be features that won't quite meet your expectations; as was the case with George Lucas and Star Wars, which took 30 years after its release to satisfy him with the remastered sound effects.

Trust that the majority of people in these kinds of communities are aware of why certain things are unfinished and take time to complete; only on a few occasions will some non-developers question why an object has no textures or why a certain 3D model does not look right, perhaps the eyes are too big or the wood is too orange; but you know this! You look at it 100 times a day, so surely you'd be aware of the obvious. There are just 10,000 other more urgent things you are busy with.

The reality is that it is a long process to make things look perfect and work nicely; as you have no doubt experienced, it can take days to rectify a small problem, and not everyone can understand that. But for the many who do, this is well past the WIP start date, IMO; I look forward to the next one.

PS

If you struggle to get video capturing to run smoothly, providing it is the video capturing tool that is lagging the simulation, use Matrix1 NICE SLEEP to give the capturing tool more of the processing time; if you have a multi-core PC, set the affinity of the capture tool to run on a separate core using the task manager.

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 29th Aug 2014 21:09 Edited at: 29th Aug 2014 21:25
Watty:

In the 3rd picture down, the top row is the straight high poly; the bottom row is the low poly with normal mapping baked down from the high poly.

The 6766 tri count covers only what is seen in the wireframe shot: it does not include hair, eyes, teeth, the straps and lacing between the sleeves and shirt, or the weapons, which are completely separate assets usable by multiple characters.

The wireframe is laid over the normal mapped low poly mesh, which is identical to the wireframe mesh itself. In this same pic, the fully textured bottom right is the low poly with diffuse, normal, and specular maps applied, to give an idea of how the high poly above it translated down. Most of these shots are older; the entire progress of this model can be seen in the 3d board WAYWO threads for 2013 and 2014. Pretty much everything I've worked on for the last few years has been with the intention of contributing to this project in one way or another:

http://forum.thegamecreators.com/?m=forum_view&t=202743&b=3&msg=2423751#m2423751

http://forum.thegamecreators.com/?m=forum_view&t=209369&b=3&msg=2502454#m2502454

Now including hair, eyes, teeth, and straps, the final tri count when loaded into game as seen in the vid is just under 11000 which does exceed my original target of 10k, but it is within an acceptable margin for a major character. I'll do at least one more even lower poly LOD version targeted at 2k - 2.5k

The other images (1, 2, and 4) are all high poly renders.

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 29th Aug 2014 21:16
"use Matrix1 NICE SLEEP" - This is already being utilized it is not currently set to give much back though as it is easier to detect when some new change gives an unusual hit to frame rates when the frame rates are kept higher. a change that kills 30-40fps can go unnoticed when frames are capped or kept at a standard 60, but are immediately spotted when you suddenly drop from 130 to 80. After the bulk of engine developement gets completed and work moves into just content, this will smooth out and more unused time will be given back to the system.

I'll play around with pushing it higher for video capture sessions though, and "set the affinity of the capture tool to run on a separate core using the task manager" is a really good idea, thanks.

wattywatts
14
Years of Service
User Offline
Joined: 25th May 2009
Location: Michigan
Posted: 29th Aug 2014 21:49
Quote: "Now including hair, eyes, teeth, and straps, the final tri count when loaded into game as seen in the vid is just under 11000 which does exceed my original target of 10k, but it is within an acceptable margin for a major character. I'll do at least one more even lower poly LOD version targeted at 2k - 2.5k
"

Well that's some pretty impressive work! I would have never guessed the model was that low from the video, and I pay close attention to such things.
Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 29th Aug 2014 22:53
Quote: "Quote: "Now including hair, eyes, teeth, and straps, the final tri count when loaded into game as seen in the vid is just under 11000 which does exceed my original target of 10k, but it is within an acceptable margin for a major character. I'll do at least one more even lower poly LOD version targeted at 2k - 2.5k
"
Well that's some pretty impressive work! I would have never guessed the model was that low from the video, and I pay close attention to such things. "


Thanks man, that's what I was hoping for. It really is amazing what a decent normal map can do combined with good modeling fundamentals in the silhouette. My skill pales in comparison to the professionals in places like Polycount, and I have been learning and improving on the fundamentals for 5 or 6 years to even get to this point, but it's starting to pay off

Quel
15
Years of Service
User Offline
Joined: 13th Mar 2009
Location:
Posted: 30th Aug 2014 01:20
Make her eyes a bit more relaxed or something, she's creepy now; otherwise a very good character!

The video doesn't show off that much programming skill, but I see some nice building interaction and character management, so it's finally not the average "here's Evolved's shader pack, I'm king of the world!" kind of stuff!

I like it so far, just please remember not to throw in one performance-killing feature after another because your machine makes it look like it runs okay... I'm so happy to see new stuff pop up here and there written in DBPro, but I'm kind of tired of promising shots which then lead to a demo that runs for 5 seconds at 10 fps and then freezes.
wattywatts
14
Years of Service
User Offline
Joined: 25th May 2009
Location: Michigan
Posted: 30th Aug 2014 06:14
I don't think the game is choppy, I think it's the recording software. The only way I've found to record gameplay without lag is using FRAPS. Pretty much all the recording software that captures the desktop is like that, though I never personally tried Chris's example of changing the processor core.
Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 30th Aug 2014 06:59 Edited at: 30th Aug 2014 07:13
Quote: "I don't think the game is choppy, I think it's the recording software. The only way I've found to record gameplay without lag is using FRAPS. Pretty much all the recording software that captures the desktop is like that, though I never personally tried Chris's example of changing the processor core. "


Yeah, I use FRAPS as well, the only problem with it is that it doesn't capture the desktop or anything outside the game.

Quote: "Make her eyes a bit relaxed or something, she's creepy now, otherwise a very good character!

The video doesn't show off that much of a programming skill, but i see some nice building interaction and character management so it's finally not the average "here's Evolved's shader pack i'm king of the world!" kinda stuff!

I like it so far, just please remember not to throw in one performance killer feature after an other because your machine makes it look running okay-ly... i'm so happy to see new stuff pop up here and there written in DBPro, but i'm kinda tired of promising shots, which then lead to a demo which run for 5 seconds at 10 fps then freezes. "


Yeah, I'm definitely not claiming to be anywhere near as skilled in programming as many on this forum and I have no interest in making the coolest ground-breaking tech. What I want is to make a fun game that looks decent for last-gen/dx9 standards while maintaining a solid 60+ fps. Believe me when I say that performance is among my absolute top priorities.

As to the eyes, you're probably right, they are still in the rigging pose position. The lids, eyeballs and mouth are all skinned for fully articulate animation, but I have not yet finished all of the gross animation and so have also not yet started on the fine.

Thanks for the interest, and I think you'll like next week's update. It will go into more detail on the structural design of my core framework, modules, and module loading.

Chris Tate
DBPro Master
15
Years of Service
Recently Online
Joined: 29th Aug 2008
Location: London, England
Posted: 30th Aug 2014 20:59 Edited at: 30th Aug 2014 21:01
There is hardly anything wrong with her eyes; and there are a multitude of other, less trivial things to be worrying about at this present moment.

Rudolpho
18
Years of Service
User Offline
Joined: 28th Dec 2005
Location: Sweden
Posted: 31st Aug 2014 14:01
It's looking / sounding really nice and it seems you have been giving it lots of thought; nice work. Good to see someone paying attention to game design at an early stage, it sure helps out later on. Also, that's a great looking character IMO! Sure, there will always be small things to add or improve upon, but still
I'm looking forward to seeing the future development of this, and like others have said before, it's really nice to see a larger game being developed in DBPro again.

Quote: "What I want is to make a fun game, that looks decent"

That's probably a really wise attitude for actually having a chance at completing something you'll be "content enough" with in the end.

Quote: "Thanks for the interest, and I think you'll like next weeks update. It will go into more detail on the structural design of my core framework, modules, and module loading."

Looking forward to it!


"Why do programmers get Halloween and Christmas mixed up?"
Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 4th Sep 2014 04:02 Edited at: 4th Sep 2014 18:30
Part 2 – Code Foundation and Groundwork

My thoughts:

This is not the first ‘large’ project that I have started, and with each of the past projects I would eventually reach a point where I realized that I could have started it off better. Learning from these past endeavors, keeping in mind what worked and what didn’t, and most importantly why, the first thing I decided when thinking about how to lay out the code on this new project was that I wanted a modular framework that would be easily extensible as the project progressed. I also wanted it to be generic and flexible enough that it could easily be reused as the groundwork for future projects.

Some modules like character data handling would by nature be largely game specific and need to be heavily modified from game to game, but the framework itself, and certain common utilities like math or xml function libraries should be more universal.

For extensibility and flexibility, the core system shouldn’t need to have anything hard coded about what is loaded, or what update functions to call, or what it is handling specifically. All program actions should be directed dynamically by what has been loaded into the framework.

As each module is loaded, it will add its update handlers to an automated update queue for the core system to work with.

Each module is responsible for constructing its own data handlers, configuring and adding its update handlers to the queue, and loading any other modules it depends on if these required modules are not already loaded.

This allows me to break up a project into small, basic components to build, test, and work with individually. I no longer need to tackle an entire massive project to feel that I’ve completed something, or to see concrete results; I only need to knock out the next module.
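
Conceptually, the registration side is tiny. A stripped-down sketch of the idea (not the actual framework code; it assumes a call-by-name command such as the one provided by Matrix1 – check its docs for the exact name and syntax):

rem stripped-down sketch of the update queue idea
dim FRAMEWORK_UPDATE_QUEUE$(64)
global FRAMEWORK_UPDATE_COUNT

rem each module calls this once as it loads
function framework_addUpdateHandler(handler$)
    inc FRAMEWORK_UPDATE_COUNT
    FRAMEWORK_UPDATE_QUEUE$(FRAMEWORK_UPDATE_COUNT) = handler$
endfunction

rem the core runs every registered handler once per frame, with nothing hard coded
function framework_update()
    local i
    for i = 1 to FRAMEWORK_UPDATE_COUNT
        rem assumes a Matrix1-style call-by-name command
        call function name FRAMEWORK_UPDATE_QUEUE$(i)
    next i
endfunction

rem a module registers itself like so:
rem framework_addUpdateHandler("framework_update_gui")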

The results:

I have a very simple and short main loop, indeed main source file, which becomes a basic template that can be carried to any project I start in the future. This is accompanied by the core framework file, with some basic configuration that can easily be modified to adapt to any given new project.
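
In spirit, the main source file ends up being little more than this (a sketch with made-up function and module names, not the actual Main.dba):

rem spirit of the main source file: configure, load modules, hand everything to the framework
sync on : sync rate 0

#include "framework.dba"

framework_init()
framework_loadModule("gui")
framework_loadModule("world")

do
    framework_update()
    sync
loop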

Progress:

The framework core is complete and able to incorporate new modules as they are completed.

Completed and/or functional modules include:

-Math Library –
This won’t ever really be ‘complete’, as it is more a collection of useful functions than an actual engine component and new useful functions can always be added, but it is in place and fully functional.

-Player Controls –
This handles player input control of the camera and of the player’s character, as well as setting character state/status as a result of the input.

-Character Management –
This module handles the data, state, and status of the characters present, including updating all character positions, rotations, etc. (the player’s character included). It also works with the 3d animation module to control and direct the current animation based on the current character state.

-3d Animation –
This manages the loading and playing of animations for 3d objects.

-Combat –
This handles the checks and calculations involved in combat. It also loads and handles base data for weapons, armor, combat abilities/styles, style-chains, etc.

-User Interface –
This handles GUI elements and events, as well as input controls which don’t directly involve controlling the character. This module implements a GUI system based on HTML/CSS layout and styling specifications and is built around Model-View-Controller design principles.

-System –
This manages screen and file output of debugging and program information and metrics. Each component has its own debugging stream (like debug_animation or debug_framework) which can be directed, enabled, or disabled as needed.

-XML Parser –
This is a lightweight function library for loading xml data into various components. It is primarily used by the GUI system for defining and loading views, but is also used for a variety of configuration and data files.

-Evolved’s Advanced Lighting –
I am using his newer version of the system, with a few modifications.

Modules yet to be implemented include management utilities for things like levels, AI, special effects, story dialogue and events and so on.

Pseudo OOP:

While DBpro is not OOP, it can still make use of some basic OOP and namespace concepts to help organize and manage code.
Due to the procedural nature of DBpro, any grouping of variables and functions into a 'class' will *always and only ever* have a single instantiated 'instance' of that class, with effectively only static 'members, properties and methods'.
Any reference to a 'class' refers to this singular static instance.
These are 'partial' classes which may be extended by other modules. Such extension primarily occurs when a module adds update handlers to the framework.
Any sort of class, namespace or object grouping is by *intent and convention only*: anything 'belonging' to a class is not truly linked in any internal or formal manner, and anything notated as private is not truly protected or restricted in any way.

Function Naming Conventions:

Public functions are prefixed with a module or group ('class') identifier, followed by an underscore, then the function name in camelCase.
Public functions are those which are intended to be callable from any arbitrary point within the program, and are typically 'access points' to a class.
Public functions often call additional private functions internally.

Private functions are prefixed with an underscore, followed by a module or class identifier, then an underscore, then the function name in camelCase.
Private functions are those which are intended to be callable only internally within the module, class or namespace they belong to.

For example:

gui_alert(text$) is a 'public' function belonging to the gui 'class' and is intended to be called from anywhere at any time.

_gui_closeAlert(alertID) is a 'private' function belonging to the gui 'class' and is intended to only ever be called by the alert 'object' itself.

_gui_makeElement(tag$, id$, name$, parent$, class$) is a 'private' function belonging to the gui 'class', typically called by public functions like gui_alert(text$) or gui_loadDocument(doc$).
While you might want to create an element at some arbitrary point outside of the module (publicly), elements are intended to be managed (privately) through the definition of gui documents and the use of controlled public interfaces.

Update handlers intended to be used by the auto update queue extend the update subclass of the framework class with new public functions.

For example:

framework_update_gui()
framework_update_arcs()

Variable Naming Conventions:

GLOBAL variables and CONSTANTS are in all capitals and spaced with underscore.

For example:

INPUT_READY = TRUE
MY_CHAR = 1
MATH_MODULE_LOADED = FALSE

They may be prefixed for public/private class use.

GUI_ACTIVE_DOC could be a global public variable 'ACTIVE_DOC' belonging to the gui 'class' and refer to a data representation of the current gui layout.
IO_ACTIVE_DOC could be a global public variable 'ACTIVE_DOC' belonging to the file IO 'class' and refer to an actual file open for read/write.

This creates an effective namespace, allowing variable (and function) names to be reused across modules/classes without conflict or confusion.

Non-global variables follow camelCase conventions.

For example:

mousePosX = 256
backgroundImage = "My Pic.jpg"

These are effectively local, transient throwaways.

Array Naming Conventions:

To distinguish an array() from a function(), arrays will use PascalCase.
They may be noted as public or _private as normal.

For example:

dim math_Arcs() as arcData
dim _math_SomePrivateArray(100)


Lastly, here is a preview of some of the code:

(edit: the tab formatting got wonky in the copy/paste sorry.)

Main.dba



framework.dba



animation_3d.dba



New progress this week:

Not a lot tbh. I was out of town with family for the long holiday weekend, and most of my work is done on weekends.

I have started modeling out the roof and added window holes to the cottage walls. I'll stick up a screenshot once I have these textured.

Wolf
16
Years of Service
User Offline
Joined: 8th Nov 2007
Location: Luxemburg
Posted: 5th Sep 2014 01:14
I find that this has way too few comments for how well made and documented this is! I find that the model of your protagonist is just lovely! Well done.



-Wolf

"When I contradict myself, I am telling the truth"
"absurdity has become necessity"
Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 5th Sep 2014 07:28
Thanks man! Glad you like the model. I will get her more polished one of these days, but I am getting excited to start work on the next one, her brother.

Sometimes I feel like I comment more than I probably need to, but I mostly tend to use comments to describe the thought process directing the code rather than the specific code itself. I use them to sort of isolate an idea, or as a kind of label to locate a section or process quickly.

The log strings being written also help serve as comments describing what is happening, and I tend to push a lot of the detail info into the header and type definitions to keep the code body cleaner.

I try to keep to the idea that with consistent and descriptive naming of functions and variables, with proper indentation, and some comments to describe the train of thought, it should be readily apparent what most code blocks are doing.

The detailed headers have been incredibly helpful, particularly when coming back to a module that I haven't worked with in a while, and it's one of those things where spending a little bit of extra time up front can save a lot of time later on.

Dimis
12
Years of Service
User Offline
Joined: 12th Jun 2011
Location: Athens, Hellas
Posted: 5th Sep 2014 17:14
Nice looking project Ortu! And I really like that model. Can you tell me how many bones you are using to animate it? I noticed in the videos that the palms are always in the relaxed pose. Are you going to add more detail to the hands in the future (more bones)? Or do you have another solution for it?

I am just asking because in my game my models have a large number of bones (the fingers specifically need a lot of bones for fully detailed animation), and that is a problem when I try to use Evolved's bone shaders; I am exceeding the limits.

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 5th Sep 2014 19:26 Edited at: 11th Nov 2014 05:24
Yeah, Evolved's shaders have a bone limit of 50; I had to cut some out and get creative in places.

It is currently at 44 bones, which leaves me a few to add around the eyes for lid and brow articulation.

On the hands, the bottom three fingers all share bones, as these rarely need to move independently; you can get the illusion of fully articulated hands with far fewer bones.

Another thing to watch out for with Evolved's shaders: for each vertex, the total weight must be normalized to 1.0, and weights cannot be split across more than 4 bones.



Dimis
12
Years of Service
User Offline
Joined: 12th Jun 2011
Location: Athens, Hellas
Posted: 5th Sep 2014 23:59
Thanks for the reply Ortu.

I have narrowed the number down to 56 but I cannot remove more bones. I have tried removing some finger bones already, and I managed to make the bone shaders work, but some animations don't look correct, so I skipped that solution.
I wonder if there is a way to increase the limit to 60, but I have no idea if that is possible; I don't really know how shaders work.

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 6th Sep 2014 00:39
There is an array in the shader sized to 50; I tried at one point to increase this without success. I am not sure if other changes need to be made in this or another shader, or in the DBPro code, or if there is some sort of limitation in DX9 or the shader model being used.

Dimis
12
Years of Service
User Offline
Joined: 12th Jun 2011
Location: Athens, Hellas
Posted: 6th Sep 2014 14:36
I think the limit for DBPro is 60 for bone shaders, so I can't guess why Evolved uses a maximum of 50. That array is in the un-tweaks part; I tried changing it too, but no luck.

Rudolpho
18
Years of Service
User Offline
Joined: 28th Dec 2005
Location: Sweden
Posted: 6th Sep 2014 16:34
Probably because you can only have so many bytes at most and his other shaders use up all the allotted memory. I don't know that for a fact, but it seems reasonable.
If it uses shader model 1 or 2 you can try setting it to use SM3 instead, which should increase those limits.


"Why do programmers get Halloween and Christmas mixed up?"
Chris Tate
DBPro Master
15
Years of Service
Recently Online
Joined: 29th Aug 2008
Location: London, England
Posted: 8th Sep 2014 13:51
Quote: "I find that this has way too few comments for how well made and documented this is! I find that the model of your protagonist is just lovely! Well done.
"



I have been a bit ill so have not commented here much; will comment when I hopefully get well soon

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 9th Sep 2014 05:11
Oh hmm, I assumed he meant code commenting. The thread comment rate is about as expected, or even a little more. Thanks guys

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 12th Sep 2014 01:18 Edited at: 11th Nov 2014 05:25
Current progress in game:



It still looks too modern, I'm not too happy with the roof and this will likely be changing. I will also be working to make it more rustic and weathered as things progress.

Part 3 – Character and Camera Control

My thoughts:

Not a lot to go into here; most basic player controls have become fairly standardized: WASD for character movement, mouse looks around and/or steers, space to jump, shift to sprint, etc. I see no need to innovate in this, I just want to implement a smooth and comfortable interface. This isn’t meant to be a game built around neat new control mechanics, with the possible exception of combat.

Key and control assignments should default to the common standards, but be modifiable by the user through in game options and settings.

The results:

I’ve implemented a control system which results in movement similar to many third-person perspective games. Keyboard controls allow for directional movement, strafing, sprinting, crouching, rolling/evasion, and jumping, as well as activating various game actions/interactions. Mouse buttons, combined with keyboard use, allow for attacking and defending. While stationary and not in combat, mouse movement will orbit the camera around the character or other focal point; while moving, mouse movement will steer the character side to side. In combat, the current plan is to have the character and camera locked, with strafing and backsteps instead of the normal turning controls.
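
For the curious, the input side really is as plain as it sounds. Something along these lines (illustrative only; the real module feeds its results into character state rather than acting on anything directly):

rem illustrative only: read WASD + mouse once per frame
rem DirectInput scancodes: W=17, A=30, S=31, D=32
function controls_read()
    local moveX#
    local moveZ#
    if keystate(17) = 1 then moveZ# = 1.0
    if keystate(31) = 1 then moveZ# = -1.0
    if keystate(30) = 1 then moveX# = -1.0
    if keystate(32) = 1 then moveX# = 1.0
    sprint = shiftkey()
    jump = spacekey()
    rem mouse orbits the camera while stationary, steers the character while moving
    turn# = mousemovex() * 0.1
    pitch# = mousemovey() * 0.1
    rem the real module sets character state/status from these values
endfunction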

Progress:

All movement, camera, and out of combat controls are complete. Attacking and defending mechanics, although controlled by this player control module, will be talked about in more detail when I get to the combat module.

I told you there wasn’t much to this one. And so, on to…

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 12th Sep 2014 01:20 Edited at: 12th Sep 2014 01:28
Part 4 – Animation Handling

My thoughts:

Central to a good animation system is the ability to define a range of keyframes as belonging to a distinct animation sequence such as ‘walk’, ‘jump’, ‘idle’. These sequences can then easily be linked up to character states and actions. As with everything, having these ranges hard coded is best avoided. Animations can change, new ones added or inserted, made longer or shorter and the code shouldn’t have to be recompiled every time you tweak a model.

Sequences will generally be played in one of two ways: play once and end, or play and loop. With proper idle sequences, even a ‘play once’ sort of animation will never really just end; the object should never be in a static state, so ‘play once and end’ really becomes ‘play once then play something else’, and the handler should give you the ability to set up a follow-up action. Follow-ups can then be chained into complex sequences which could be considered a single animation by the player, even if internally they are made up of smaller components. This creates a sort of modular animation system where the same concepts of any modular design can be applied.

Attack and jump sequences are a good example of sequences which are made of smaller components. A simple jump is actually rather complex once it is broken down into component stages:

-Anticipation of movement:
the character will crouch a bit preparing to forcibly push up into the jump, then transition into the jump ascent itself. This will take a fixed amount of time and arc calculations should not yet be started.

-Jump ascent:
character is fully stretched and rising towards the peak of the jump; this can take a variable amount of time depending on variable jump force, collision with the environment, etc. Being variable, it must be able to loop until the peak or a collision is reached. Arc calculation should be handling the object positioning at this point.

-Peak:
character has hit the height of the jump and transitions from a jump pose to a falling pose over a fixed period of time. Arc calculation continues.

-Free fall:
character is descending for a variable amount of time. A jump on level ground will have a much shorter free fall than a jump off a wall or cliff. The animation must loop until character lands or otherwise collides with something. Arc calculation continues.

-Landing:
character reaches the ground and transitions from falling pose to a landing sequence where it crouches as it hits to absorb the force of the landing, then straightens back into a standing idle pose. This is a fixed time animation and arc calculation has ended at this point and normal control can be passed back to the player.

As these components are modular, variations can be applied. Simply walking off a cliff without jumping will start at the free fall stage and go from there; or, if a character free falls past a certain threshold, they will tuck into a roll upon landing instead of a basic land-on-feet landing. In that case you can just swap out the landing sequence for a roll sequence at the end without having to work up a completely separate jump sequence.

It’s also important to allow random variation to certain sequences, idles in particular. A character that will sometimes look around or shift their weight etc. has far more, well, character than one that just stands there all the time.

The results:

Animated objects will include a configuration file which links ranges of keyframes to animation sequence key names, and flags the key if it can cycle variations. So if you have a model like ‘character.x’ there will be a partner ‘character.anim’ file, which will be loaded when the .x is and the keys will be added to the animation handler.
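
Purely to illustrate the idea (these are not the real frame numbers or column names), a character.anim ends up being rows along the lines of:

sequence,startFrame,endFrame,canVary
idle,1,120,1
walk,121,150,0
run,151,175,0
jump_anticipate,176,185,0
jump_ascend,186,200,0
jump_peak,201,210,0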

This handler manages timing of playback and the looping of sequence keys, as well as transitioning into the follow up sequence if one was set. Due to things like timing and performance management, it takes direct control of setting keyframes and avoids commands like play object and loop object.

It does no evaluation of external states; it simply updates the current interpolated keyframes of the objects being managed in the manner directed when the animation was added to its scope. The character module will evaluate the state of the character and decide which animation sequence should be playing, if that animation is not playing, it will instruct the animation module to begin playing say ‘animation x, follow up with animation y’ or begin playing ‘animation z and loop, allow random variation’ and so on.
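
In code terms, the character module ends up making calls along these lines (the function names and constants here are invented for this post, not the actual module's API):

rem illustrative only: the character module decides what should play, the animation module obeys
#constant STATE_IDLE    0
#constant STATE_JUMPING 1

function character_directAnimation(charObj, characterState)
    if characterState = STATE_JUMPING
        rem play once, then chain into the follow-up sequence
        anim_play(charObj, "jump_ascend", "jump_peak")
    endif
    if characterState = STATE_IDLE
        rem loop, allowing random idle variation
        anim_loop(charObj, "idle", 1)
    endif
endfunction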

Progress:

This module is mostly complete. It is fully functional, though I would eventually like to add smoother interpolation between sequence transitions, and possibly the ability to blend different animations on a per limb basis. This is more of a final polish kind of thing, and low on my priority list at the moment.

The code for this module can be found in Part 2 above.

Wolf
16
Years of Service
User Offline
Joined: 8th Nov 2007
Location: Luxemburg
Posted: 12th Sep 2014 09:55
Quote: "It still looks too modern, I'm not too happy with the roof and this will likely be changing. I will also be working to make it more rustic and weathered as things progress."


If you need a hand with that, please ask!

"When I contradict myself, I am telling the truth"
"absurdity has become necessity"
Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 13th Sep 2014 01:38 Edited at: 11th Nov 2014 05:26
Hey Wolf, thanks man! I've been a fan of your art for some time and I would welcome the help.

For the cottage structure itself I have a few ideas that I would like to keep working on for a bit, but if you are interested, I had planned to make a lighthouse to go up on a hill nearby, a small market and village up the road a bit, and the cottage needs all the internal furnishings like tables, chairs, beds, chests, wardrobes, pottery, wall hangings/tapestries and the like.

The game will feature characters from several different nations and cultures. This character and her brother are from the 'Principality of Duccal', an area which takes inspiration from renaissance-era Italian countryside. The primary color palette for this culture is creams, browns, and blues.

I realize that currently the structure is somewhat monotone and drab, the roof tiles will be blue, and I intend to liven things up with vegetation and cloth to add splashes of color.

Here is today's progress:



Chris Tate
DBPro Master
15
Years of Service
Recently Online
Joined: 29th Aug 2008
Location: London, England
Posted: 14th Sep 2014 15:22
Quote: "Animated objects will include a configuration file which links ranges of keyframes to animation sequence key names, and flags the key if it can cycle variations. So if you have a model like ‘character.x’ there will be a partner ‘character.anim’ file, which will be loaded when the .x is and the keys will be added to the animation handler."


Cool. I remember a year or so ago I found a script (via Google) for exporting Blender animation markers into text files. Just something useful to know, if you did not already.

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 14th Sep 2014 19:25
Cool, I will have a look. The .anim file is basically just a simple .csv; I use Blender's internal text editor to write it as I animate.

Wolf
16
Years of Service
User Offline
Joined: 8th Nov 2007
Location: Luxemburg
Posted: 19th Sep 2014 14:22
Hello!

Quote: "Hey Wolf, thanks man! I've been a fan of your art for some time and I would welcome the help."


Oh! Thanks. It's nice to know my work has reached people beyond the FPSC community.

Quote: " but if you are interested, I had planned to make a lighthouse to go up on a hill nearby, a small market and village up the road a bit, and the cottage needs all the internal furnishings like tables, chairs, beds, chests, wardrobes, pottery, wall hangings/tapestries and the like."


This sounds doable! A lighthouse sounds challenging to model; the indoor props are easy to do too. I should have some of the stuff you'd need in storage anyway.

I'd have to decline when it comes to making entire markets or villages, as that would blow my timeframe. Market and village props are in abundance on the net anyway



-Wolf

"When I contradict myself, I am telling the truth"
"absurdity has become necessity"
Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 19th Sep 2014 21:02 Edited at: 19th Sep 2014 22:31
@Wolf no worries man, I in no way expected you to want to tackle much of the list; I just wanted to provide a wide range of things to pick from in case anything was of interest. Appreciate it!

Well folks, it has been a fairly productive week, with significant progress on a couple of new modules... To be honest, these are modules I had been putting off for some time, just forcing bits and pieces of their content, hard coded, into other places to get things running. Now this is all cleaned up, and the new modules replace a quick and dirty hack job of level loading and setup with proper framework loaders and updaters.

Specifically, these new modules are 'world', 'fx', and 'database'.

Unfortunately, this is entirely back-end data handling work, so there is nothing new visually to show for it, but it presents a good opportunity to talk about this week's update:

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 19th Sep 2014 21:25 Edited at: 19th Sep 2014 21:56
Part 5 – Data: Storage, Access and Handling

My thoughts:

There are many different ways and formats for storing data to use later. Each has certain advantages and disadvantages, and I don't know that any one form can objectively be considered 'better' than another. Certainly for any given task some forms will be better suited than others, but I think it is generally a case of every Job has a Tool and every Tool has a Job. Our task is to match the proper tool to the proper job as needed.

In my early projects, my primary method was to save out arrays to .dat files from within DBpro (well, DB Classic actually). It had the advantage of being easy to access by the game: load array filename, arrayname(0) – it doesn't get any easier than that, and poof, you have an array in memory ready to use.

Unfortunately this method generally comes with more bad than good.

The data is not easily accessed outside of the program itself, often requiring a specialized editor or data loading program. This means that to modify it requires extra coding and much recompiling.

The data format is also locked to match the formatting of DBpro arrays. Of course, most loaded data will need to get pushed into an array at some point anyway, but it is often useful to keep it in a higher level format which is easier for humans to read and modify until the program actually needs to use it, and thus we enter the world of text parsing.

The ability to take text, parse it, and break it up into meaningful data chunks is both elementary and invaluable. Whether it is loading data from a flat text file, processing user input, or handling network packets, at some point every programmer is going to need to process text data.

Now, I hope that I don't even need to mention Matrix1; I should expect that anyone reading this who uses DBpro is already using Matrix1, but just in case: get it. Use it. Get familiar and comfortable with the command set it provides. Many of its commands improve existing core language commands, and many others should really be core language commands themselves. Anyway, I bring it up here because it includes commands for better/faster text processing as well as extra commands for sorting arrays and much more.

Ok, back to data handling. .dat files are quick and easy to load, but they are not very flexible, and are not easy to work with externally. The next step in our data handling journey moves our data into easily editable text files which our program can process and load into arrays or otherwise use as needed.

This has several really big advantages straight off: the files are very easy to edit with any general text editing software. We can make changes and then see those changes applied to the game without needing to recompile anything. The data is portable: flat text files can be processed and used by any language, allowing the data to be common and shareable across many applications. The downsides are that they require more time to load and process, and they are not secure. The ease of access and editability for the developer applies equally to anyone with access to the files themselves. Of course, this may or may not matter depending on what exactly you are storing in these files and what you intend to use them for.

Now, it is sometimes necessary to establish your own data format within a file, but there are so many established and commonly accepted formats that one of them can almost always fill your needs. Probably the two most commonly used text file formats for loading data into a program are .ini and .csv

.ini files are useful for defining key-value pairs:

level=1
posX=0.0
posY=10.0
posZ=0.0
playerHealth=100

These are good for defining an initial setup of singular things. While this could easily work to set up the player's own character, you probably wouldn't want to write it out hundreds of times, one after another, in one giant list for every object or NPC in the level.

.csv files are useful for working with larger datasets of many similar things:

level,posX,posY,posZ,playerHealth
1,0.0,10.0,0.0,100
1,10.0,10.0,5.0,100
1,53.0,12.0,8.0,50


They are much like a flat, text-based spreadsheet or database in many ways, where cells are separated (delimited) by commas. They are easily usable by any spreadsheet software like Excel or OpenOffice, and most databases can import/export directly to .csv

While csv is an incredibly common, popular, and useful format, it does have a few shortcomings for certain types of data. Chiefly, it lacks any sort of hierarchical structure. It also lacks the ability to use metadata to further define and describe the data it contains, and it is generally intended to be loaded in bulk, lacking the ability to easily select or load only a portion of its dataset.

Sometimes, it is useful to manage data in a hierarchy of parents and children such as: data items X, Y, and Z are sub-items belonging to the data item A. Data items U, V, and W are sub-items belonging to data item B.

While you could convey this with a csv format by adding extra columns, there are other formats which are by nature better suited to managing such data. HTML, XML, and JSON are all common hierarchical text formats.

Of course, DBpro knows nothing about any of these formats, and the ability to work with them has to be coded in ourselves, so in practice we could use any of them. I have chosen to focus on xml because it has stricter formatting rules than html and is designed to hold more flexible data content.

Generally, I prefer json over xml, but I have used xml longer and already had code for loading xml data into DBpro which suits my needs well enough for this project so I went ahead and wrapped that into my framework to proceed with. Eventually, I'd like to write a json loader for DBpro, but I probably won't bother to convert this project to use it.

Visually, xml looks somewhat similar to html using a structure of nested tags with attribute properties and inner content:
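
(using the same level data from the csv example above, purely for illustration)

<level id="1">
    <object posX="0.0" posY="10.0" posZ="0.0" playerHealth="100" />
    <object posX="10.0" posY="10.0" posZ="5.0" playerHealth="100" />
    <object posX="53.0" posY="12.0" posZ="8.0" playerHealth="50" />
</level>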



The same in json:
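
(the same data again, just for comparison)

{
    "level": 1,
    "objects": [
        { "posX": 0.0,  "posY": 10.0, "posZ": 0.0, "playerHealth": 100 },
        { "posX": 10.0, "posY": 10.0, "posZ": 5.0, "playerHealth": 100 },
        { "posX": 53.0, "posY": 12.0, "posZ": 8.0, "playerHealth": 50 }
    ]
}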



Not a lot of difference between them, I feel that json is a bit less wordy and more concise, but the nice thing about both xml and json, is that the tag and attribute names make it very human readable and easy to understand what the data actually is.

Naming provides metadata which makes it easier to locate specific data items within the larger data set, and it allows us to treat various data items differently during loading based on the type of data being loaded, without having to store different types of data in many different csv files, or to add fields which are used by some lines but not really used by others even though they are present.

In some ways, these hierarchical formats are a structured blending of ini and csv, providing complex key-value data for many similar items while also creating associations between the items in relation to each other.

Each of these formats (ini, csv, and xml) is similar to the others, and honestly any of them could be used in place of another. xml holds the most complex data but requires the most processing; csv can load large amounts of data quickly but is somewhat blind, mostly loading in bulk; ini provides quick values for specific data items but lacks some of the structure which makes it easier to manage many discrete entities. Each has its uses, and I indeed use all three.

Though not yet implemented, I intend to load system config settings and user preferences from .ini. These settings are singular and specific data items needing a singular and specific value; ini is well suited.
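
Something along these lines (the actual keys aren't decided yet, these are just for illustration):

fullscreen=1
resolutionX=1920
resolutionY=1080
musicVolume=80
invertMouseY=0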

I currently load animation data from .csv. Each animated object requires many animation sequences, each of these sequences is defined by common data, and all of this data needs to be loaded in bulk for each object. A csv for each animated object is well suited.

Lastly, I load gui views from .xml. User interface elements are hierarchical by nature: this button needs to be centered in that box, the box needs to be forced to the right side of the screen, and when the button is clicked it needs to execute a specific action. The nested structure and complex attribute values of xml are well suited for this.
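
To make that concrete, a gui view document looks roughly like this (the tag, attribute, and handler names here are illustrative, not the real ones):

<document id="pauseMenu">
    <box id="menuPanel" align="right" width="30%">
        <button id="btnResume" align="center" onclick="gui_closeDocument">Resume</button>
        <button id="btnQuit" align="center" onclick="game_quit">Quit</button>
    </box>
</document>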



Ok, so I know this is a lot of reading so far, I've written more than I expected I would, but we are almost done I promise!

There is still one thing that each of these text formats lacks: while they are great for loading things up at the start of the program's execution, they are not well suited to accessing data quickly on the fly as needed. The data is not indexed, and is not searchable without processing the entire file all over again. You can load the data into arrays to work with as needed, and some of it absolutely should be, but that still lacks proper indexing and still requires iterating through what could potentially become a very sizeable and complex array or group of arrays. Not all of this data needs to be kept in active memory at all times during the program's run time.

This is, of course, where databases come in. Particularly for online games, a good database on the server is essential. This project is single player, and for it I have chosen to use local sqlite3 databases via Duffer's plugin. Databases can store vast quantities of data and are ideally suited to searching through that data to return only the specific subset required at any given time.

The xml and json examples above used level loading data to illustrate the formats, but in reality, a database is usually better suited to this sort of data. A level can contain hundreds if not thousands of objects, and a full game many times more. It requires information about what textures to load and which shaders to use, plus lights and effects; each object needs data about collision and player interaction; you need trigger zones, dialogue content; the list goes on and on.

Some of this data is suited to loading in bulk through csv, but it comes down to an issue of manageability. It is just frankly easier to manage data with tables, columns, types, constraints and all the other tools a database provides than it is to work with a flat csv text file when you are dealing with massive quantities of data.
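For illustration, level object data might live in a table something like this (the schema and names are made up, not my actual tables):

CREATE TABLE level_objects (
    id INTEGER PRIMARY KEY,
    level_id INTEGER NOT NULL,
    model TEXT NOT NULL,
    texture TEXT,
    pos_x REAL, pos_y REAL, pos_z REAL,
    has_collision INTEGER DEFAULT 1
);

-- pull only the objects needed for the level being loaded
SELECT model, texture, pos_x, pos_y, pos_z, has_collision
FROM level_objects
WHERE level_id = 1;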

Databases also provide an advantage of being user/password protected if that is a concern.

And whew! We're done!

The results:

As mentioned, I use a variety of text based data files and sqlite databases to manage my game's data.


Progress:

Processing of csv and xml files is complete, really since before this specific project began.

My xml loader is read-only as I have no real need to load an xml file, modify it, and save it back out again from within the game. This would require a good deal of added complexity that just isn't needed for my xml usage. My original code did require some modification and a good bit of clean up to get it incorporated with this framework and able to fully handle my gui document files, but this is done.

This week I got the database up and running, and moved all of my level and object data into it. I now have a proper level loading and management module which will make it much easier to make changes to where things are, how they look, what is present or not and so on, without needing to change a value, recompile, change a value, recompile, change a value, recompile…

This also lays the foundation for beginning work on the dreaded level editor. I know, someone somewhere is going "oh no not another level editor!" These things seem to be the doldrums into which many promising projects sink, never to emerge.

They can often end up as entire projects unto themselves and it is easy to get lost in them. I think keeping a tight lid on scope is really the key here to maintaining focus on the end goal of the game itself. It is a necessary evil which I will not be tackling for some time yet, but I will need to get to it eventually, and sooner rather than later I think.

To keep forward progress, I plan to implement only enough bare-bones functionality in the editor to get the job of world building done efficiently, without adding in every little cool or nice-to-have feature that might otherwise, you know, be nice to have. We will see how it goes.

Jimpo
18
Years of Service
User Offline
Joined: 9th Apr 2005
Location:
Posted: 20th Sep 2014 04:34
Nice read on data storage

For my project, Dego, everything was plain text files with key-value pairs.

Like this:
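(made-up values, but the idea is the same)

name=Goblin
hp=25
attack=4
sprite=goblin.png
drops=gold,small_potion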


And I mean everything was done in plain text files: monster data, item data, levels, even all the scripted game events. Every NPC interaction and puzzle in the game came from a text file with key-value pairs.

This had a lot of advantages. It was easy to make huge changes to the game's content without compiling. It was easy to read and edit the game's data without bothering to load up the editors. It was easy to have the game's editors create the files. It was easy to expand and add new features. It was easy to add a new item property that only a few items had, since there was no need to go back and update every item data file.

The game was pretty big in the end (450+ items, 100+ enemies), but there was never a problem with the scalability and performance of the approach. Everything was loaded as needed. An item file wouldn't be loaded and parsed until a player picked it up and looked at it in their inventory. An NPC script wouldn't be loaded until the player talked to them. Even the graphics worked this way. An enemy image wouldn't load until that monster first spawned. There was never a performance hit from loading these files mid gameplay.

You seem like you are taking a similar approach with your game, and I think you will benefit a lot from it

Good luck with your level editor! It's such a crucial part to finishing a game like this, and will be well worth the effort

Rudolpho
18
Years of Service
User Offline
Joined: 28th Dec 2005
Location: Sweden
Posted: 20th Sep 2014 19:53
I agree with Jimpo, an interesting read.
Being a do-it-yourself kind of guy I usually devise my own storage formats that are based on how the data is stored within my applications; this has the advantage of being a lot faster to read, as you can basically just copy the data straight from a file into memory without having to do any parsing. Of course this makes the editing process harder, but if you write another small tool program for doing this it all works out quite fine (you don't need to recompile the editor program each time you want to use it to change some file contents by the way, as you're doubtlessly aware; still, your post seemed to suggest that so I just thought I'd point it out).
Of course that causes some additional work that you may not want to do and in those cases I find XML quite excellent. I never quite got the feel of JSON though, maybe it reminds me too much of Lisp with all those brackets...
Of course it does seem less verbose and should be quite easily readable when properly indented, but the occasional label wouldn't hurt. It's like "here's an array that contains 4 sub-arrays, which in turn contain 12 sub-sub-arrays, which in turn..."
I used a combination of ini files holding simple settings (sometimes with, sometimes without associated keys), custom data files with home-made editors and an SQL database for storing information in Ageing Wrath. Given the chance to have another go at it I might just have chosen XML to replace the client-side ini and custom files (modern computers are fast enough to do some parsing once as data is loaded in), however the database was a very nice tool to have on the server side.


"Why do programmers get Halloween and Christmas mixed up?"
Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 20th Sep 2014 22:35 Edited at: 21st Sep 2014 00:58
Glad you guys liked it

Quote: "Being a do-it-yourself kind of guy I usually device my own storage formats that are based on how the data is stored within my applications; this has the advantage of it being a lot faster to read as you can basically just copy the data straight from a file into memory without having to do any parsing."


I do agree that creating your own format, optimized for your specific project makes it easier and faster for your game to work with and this is worth doing at times. However, in *most* cases I would still argue for the use of an existing, standardized format.

Commonly accepted and known formats are well documented, and there are many existing tools and applications that can already work with them. This makes them more portable, and more usable by others if you ever need to collaborate or pass the work on to someone else. And by creating tools that allow your program to handle standard formats instead of tools to handle your own format, you give your program the ability to work directly with data generated by others without any additional special handling.

In most cases, the processing and loading time between an optimized format and a standardized format will be little enough to not matter too much.

A great example of an exception to this is using the DBpro optimized format .dbo over the standardized format .x. By the same token, though, if you were to create that .dbo model entirely from within your dbpro application instead of an external modeling package, it would be difficult to take that .dbo object and use it anywhere other than dbpro, because nothing else knows anything about .dbo, while many other modeling tools and DirectX applications are already capable of working with .x.

Quote: "(you don't need to recompile the editor program for each time you want to use it to change some file contents by the way as you're doubtlessly aware, still your post seemed to suggest that so I just thought I'd point it out)."


Right, reading back over what I had written, what I meant was maybe not as clear as I intended. An editor program would not need to be recompiled to make changes to the text file data it exports; it would just be executed and used. When I mentioned recompiles to make changes, I was speaking of cases where you have all of your data hard coded into the game program itself, or, in the case of my early projects where I would save arrays to .dat files, I was not using a true interactive level editor but just a short little program like this which would update such files via a recompile (don't laugh, this was long ago!):
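It was nothing fancier than something along these lines (the array, values, and filename are placeholders just to show the idea):

rem hard coded "editor": change the values, recompile, run once to rewrite the .dat file
dim levelTiles(99)
for i = 0 to 99
    levelTiles(i) = 0
next i
levelTiles(34) = 1
levelTiles(52) = 2

rem overwrite the data file that the game loads at startup
if file exist("level01.dat") = 1 then delete file "level01.dat"
save array "level01.dat", levelTiles(0)
end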



Chris Tate
DBPro Master
15
Years of Service
Recently Online
Joined: 29th Aug 2008
Location: London, England
Posted: 23rd Sep 2014 20:59 Edited at: 23rd Sep 2014 21:03
Interesting information you supplied there.

So far I have a fusion of data formats, with XML as the basis and SQL as the data-source, given its ability to provide simultaneous access, transactions, discrepancy management, admin software, server side scripting, so on and so forth...

Most of my entities so far use XML, CSV and INI formats.

When hierarchical structure is overkill, links to INI files containing long linear properties are used, which in turn can link to XML to arrange hierarchical properties; either of which can chunk small structures in CSV (or space separated where desirable).

Nothing is set in stone over here; I am also toying with limb names and object memblocks for my DBOs and DirectX objects.

Again, XML seems like a popular choice for all; for me it is the popularity and software compatibility of it that appealed to me; so many potential colleagues and modders need not learn something new.

The best way to kill your project is to pick a format which nobody wants to learn.

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 25th Sep 2014 06:08
Quote: "So far I have a fusion of data formats, with XML as the basis and SQL as the data-source"


Interesting, so are you passing query results into your program through dynamically generated XML? Is this to maintain a separation of concerns, or for another purpose? Well, really I guess it depends somewhat on what flavor of SQL you are using and whether it is a direct local connection or served through a host.

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 26th Sep 2014 05:33
Another week, another update.

I began work on new functionality this week: player interaction with world/environment objects. Such interactions generally take the form of examine, use, take, and/or converse.

As is often the case when working on new functionality, forward progress was frequently slowed as errors were created, tracked down and corrected both in the implementation of the functionality itself, and in its integration with the larger existing system.

Yesterday in particular was frustrating as a tricky new crash bug came up which was rather time consuming to track down. At this point, each individual component was working correctly in isolation, but failed when put together. Complicating things further, the crash occurred in older UI code which was not touched during this expansion and had no previous issues and so had little expectation of failure.

Ultimately it was a combination of things which threw a monkey wrench into the works, but basically it came down to a timing issue in the (unexpected) order in which two updates ended up being processed after the addition of the new code, plus a failure to verify that the value of a parameter being handled at a certain point was not empty, due to an assumption that it would always have a value based on an expected order of processed updates. Oops.

Correcting the update order resolved the bug, and I also added handling for the unexpected null value to be sure it is stamped out for good.

I believe I should have a new video ready to showcase the interactables in next week's update. For now, I will leave you with the next discussion segment:

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 26th Sep 2014 05:33 Edited at: 26th Sep 2014 05:59
Part 6 – Debugging, Testing, and Performance

My thoughts:

Another core skill for successful coding is the ability to identify, locate, and resolve errors, bugs, and unexpected or incorrect results. There are many tools and methods to help with this, from IDE utilities, to design principles and best practices, to plain common sense, and sometimes even intuition or a gut feeling. Oh sure, this is programming! This is a realm of logic: careful, methodical, cause and effect! We don't make guesses, right? We test and get hard empirical evidence.

Well, to be honest, sometimes you just have a feeling that a section of code is contributing to a problem, even if you can't pin down exactly how or why. When you do get such a feeling, explore it. Apply that methodical logic to break the section down and see what is really going on. What may start off as a hunch or a guess at least gives you a place to start searching.

Failure to Compile:

The most obvious problems are compiler errors which usually spit out a line number and a detailed enough error to know what went wrong. I guess I should say rather that these are the most obvious in alerting you that there actually is a problem, as the program won't run if it won't compile. The exact problem can be much less obvious to track down, but usually once you have found them, they are not too difficult to resolve. These are mostly careless syntax, closure, type, and typo issues, although they can sometimes be difficult to track down if the line is unknown or if the message is unclear.

DBpro's compiler errors regarding type mismatches of UDTs aren't great, looking something like:

"Parameter mismatch in expression '_world_getInteractableID' inside modules\world.dba." - no line number

"Cannot perform integer cast on type "WORLD"inside modules\world.dba" - no line number and no real details on the source data. In this specific case, "WORLD"is a variable not a type, and is of type "worldData"the specific data being used is actually "WORLD.interactableTarget.name"which is of type string. Very little of this is conveyed through the error message which does little to help track it down.

"Subscript must be Integer or DWORD when referencing an array inside modules\world.dba."– Which array? What line?

And lastly, here is one straight from the DBpro help files which I have run across a few times:

7. Error messages can sometimes be confusing if read out of context. Here is an explanation of an error that might cause confusion;

CODE: print "X: " + camera position x() + " Y:"
ERROR: Types "$$1" and "@$F0" are incompatible at line X.
EXPLANATION: This means the string and value returned from the command cannot be added. The $$ symbol refers to a string, the @ refers to a temporary variable internally required by the compiler and the $F0 is a temporary float value also generated internally by the compiler.

As mentioned "if read out of context"the real problem is that the error does nothing to provide context, and often does not give a specific line number but rather just reads as the other errors above 'inside "filename.dba"'

So in all of these cases, the biggest issue is often simply locating where the offending line actually is. If you can't compile, then you can't step through or write to logs. Give the code a read through a time or two and see if any issues stand out.

I mostly use declared type variables (usually within UDTs) like 'someThing as string', *except* when using temporary throwaway variables like 'txt$ = someThing : print txt$', and I will often get compile errors due to a missing $ or #. These aren't too bad to catch most of the time, and are something that I know to look for straight off when dealing with type errors.

Now, if nothing stands out, it is time to start commenting out sections until you get a good compile. When building a new module that won't compile, I like to comment out all the functions, verify a good compile, then start adding the functions back in one at a time until the compile fails. Once you have narrowed it down to a specific function, you can start commenting out portions of the function until the problem is located. Fix it, then continue until the full module compiles.

Now that we have good syntax and code that runs, we can start some basic testing to verify that it is actually running as intended. As we are testing one small bit at a time, if it fails you have a good idea where to go looking, otherwise if it all looks good, it's time to integrate and retest the larger system as a whole.


Testing:

Now certainly the best way to find problems is to go looking for them. When creating new sections of code, if you test them out specifically, in as much isolation as possible, then you have already narrowed down the range of code you need to dig through to locate an issue which causes the test to fail. If the problem catches you by surprise in the middle of a larger system, you have to work through the entire system to determine where the problem is occurring.

This sort of small isolated testing is called a Unit Test and it attempts to work with the smallest segments of code which can run as a distinct unit. For DBpro, this can generally refer to a single function, or a component source file. In other languages, they could test a class or an interface and so on.

Regardless of what specifically is being tested or the exact methods used to conduct the test, the purpose is simply to determine if a specific portion of code is producing the expected and intended results.
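As a trivial sketch of the idea in DBpro terms (clampInt() here is a made-up helper, not something from the project):

rem unit-style check: feed known inputs, compare against expected outputs
test_clampInt()
wait key
end

function test_clampInt()
    failed = 0
    if clampInt(5, 0, 10) <> 5 then inc failed
    if clampInt(-3, 0, 10) <> 0 then inc failed
    if clampInt(99, 0, 10) <> 10 then inc failed
    if failed = 0
        print "test_clampInt: PASS"
    else
        print "test_clampInt: FAIL, " + str$(failed) + " case(s)"
    endif
endfunction

function clampInt(value, low, high)
    if value < low then value = low
    if value > high then value = high
endfunction value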

Now these tests are generally set up to reproduce or simulate the expected conditions and input which the code being tested is likely to encounter in production, and they really excel when testing against known or expected values. But we can't ever anticipate every possible value or path of execution that might flow through the code once it is released into the wild, so just because a function passes its unit testing doesn't mean it can't have errors, or cause errors, when integrated with a larger system.

The more variable the input handled and the more variable/complex the situations in which a section of code can be run, the harder it is to accurately and sufficiently test in isolation. Eventually it must be integrated and the larger system must again be tested to ensure that there are no conflicts between the new component and the old.

The fewer components you add at a time, the easier it is to pin down where, when, and what started causing a problem.

Debugging:

Designed and engineered testing is not going to catch everything. Period. Unexpected errors, unexpected results are going to catch you by surprise at some point, and before you can even begin to fix them, you first need to realize that they are occurring, and then you need to narrow down and locate where and when they occurred.

This can be something obvious like a crash, hopefully with a system error message; it can be less obvious, like taking X specific combination of actions makes your character do Y incorrectly; or it can even go quietly unnoticed for some time, like working with mixed up or incorrect data, data that is close enough to work, but not quite as it should be.

The longer these things go unnoticed, the harder they are to track down. Regular hands-on use-testing can expose the obvious issues much of the time. Use the system, put it through its paces. Take any action that can be taken, attempt to take actions that should not be possible, and, most importantly, observe what happens. If anything stands out, repeat it, add some logging to grab values, and break down the flow of what is actually happening in the back end.

Performance:

Performance issues are never fun to deal with. They can be caused by actual bugs, by poor design or lack of optimization, by conflicts with hardware or other external software, and sometimes a feature or process is just going to cost what it costs and you will have to determine if you can handle that performance hit or if it would be better to scrap whatever it is you are attempting to add. When facing this sort of situation, it is important to keep in mind that if it runs "pretty good" on your machine, it may well run "poorly" somewhere else. If you are able, it is helpful to test on a variety of hardware and OS versions, from old and low end to shiny and new, to get a good idea of the average performance.

To identify performance issues, you want to keep track of your average fps over time; when you add something new, compare it to your previous values and determine whether the new component caused an unusual drop. As mentioned in a previous post, during development, uncapping the framerate, leaving vsync off, and skipping the nice sleep cycle in order to push maximum fps can help identify large drops in fps that may go unnoticed when maintaining a flat 60. Of course you will want to regularly run both uncapped and with all the hardware- and system-friendly measures turned back on, to get a comprehensive idea of both development and production performance levels.

It is also helpful to calculate and track detailed timing of each major component or update process individually to help identify where bottlenecks are occurring and to get a better idea of how long other comparable processes 'should' take.
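In code this really just comes down to grabbing timer() around each major update and putting the numbers somewhere visible; a rough sketch (the update functions are stand-ins, not my real modules):

sync on : sync rate 0
do
    t1 = timer()
    update_world()
    t2 = timer()
    update_ui()
    t3 = timer()

    text 10, 10, "fps: " + str$(screen fps())
    text 10, 30, "world: " + str$(t2 - t1) + "ms   ui: " + str$(t3 - t2) + "ms"
    sync
loop

rem dummy stand-ins so the sketch runs on its own
function update_world()
    wait 5
endfunction

function update_ui()
    wait 2
endfunction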

Logs and Monitoring:

Sometimes you need detailed logs, sometimes you just need to stick something up on the screen. A debugging utility should be able to handle both. Additionally, as this framework is modular in its design, and often I am working on a single module at a time, I may only want to write out logs for a specific module, or I may want to write a general log for the overall program flow and write out specific info from a specific module at the same time to different places.

This creates a need for 'debugging streams', which can be directed individually or together, and turned on or off as needed. Once a module is complete and tested, the module-specific debugging code isn't usually needed anymore and can probably be removed, but you may need to go back to the module later and change something or add to it. You don't want to have gone through and deleted all your debugging code and now have to add it back in again. It's better to leave the code in place and just shut off the output stream so that it doesn't actually write to anything. Turning it back on again is then as simple as uncommenting a line.
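A minimal sketch of what I mean (the flag, file number, and filename are placeholders, not my actual module):

rem one toggleable output stream for a single module
global logWorldEnabled as integer : logWorldEnabled = 1
global logWorldFile as integer : logWorldFile = 99

function log_world(msg as string)
    if logWorldEnabled = 0 then exitfunction
    rem open the stream on first use and keep it open for the rest of the run
    if file open(logWorldFile) = 0
        if file exist("world.log") = 1 then delete file "world.log"
        open to write logWorldFile, "world.log"
    endif
    write string logWorldFile, msg
endfunction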

Version Control:

While not directly involved in the process of testing and troubleshooting, version control is related enough to be worth mentioning here. It is nice to have the safety net provided by the ability to restore back to an earlier version. Every now and then things just get completely borked and it is just easier and faster to revert to a clean revision and try again than to track the problem down. I've had to do this after something as simple as cleaning up some basic code formatting and apparently something got deleted or moved by accident.

Version Control also provides excellent tools for comparing differences between a modified file, and its previous clean state before the modifications. Very handy for reviewing exactly what changed when a problem comes up while working on an existing file.
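With svn, for example, it only takes a couple of commands to see what changed in a file and, if needed, throw those changes away:

svn diff modules\world.dba      (show how the working copy differs from the last committed revision)
svn revert modules\world.dba    (discard the local changes and restore the clean copy)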

The results:

Test early, test often. Record and review what is happening behind the scenes.

Avoid assumptions that what is happening actually matches what you think is happening or what should be happening.

Try to avoid implementing sweeping, system wide changes all at once and break it down into smaller verifiable chunks if possible.

Use issue tracking utilities, or at least keep a log of known issues, even ones that were quickly resolved; it can be helpful for reference when dealing with similar issues in the future.

Use version control, you will be glad you did.

Progress:

My current needs don't require anything as complex as an interactive console, and so this module mostly just manages test results, screen output, and the various log files.

It also monitors and reports performance metrics such as framerates, loop and update times, resource usage and the like.

Initially, the onscreen output stream was just simple text printing, but with the implementation of the User Interface module it was later integrated with the GUI allowing much greater control, flexibility, and interactivity. This will provide the basis for making an interactive console if/when I end up needing one later.

Output streaming and log file writing is fully complete. Configuring which streams are enabled or disabled is unfortunately still hard coded into the module and requires a recompile to change, but I have plans to move this to an external configuration file as well as to add support for command line switches to override the config on a per-run basis.

Working alone, a simple text file is sufficient for my current needs as far as issue tracking goes, and I use svn for version control. Why not git? I don't know. I like git, I use git, I just haven't moved this particular project over to it, though there is really no specific reason why not.

Chris Tate
DBPro Master
15
Years of Service
Recently Online
Joined: 29th Aug 2008
Location: London, England
Posted: 26th Sep 2014 19:03 Edited at: 26th Sep 2014 19:07
Quote: "Interesting, so are you passing query results in to your program through dynamically generated XML?"


XML is the easiest way to get any complex analytical data from a database server into my engine, whether generated or not, be it an analysis or an asset.

Quote: "Is this to maintain a separation of concerns or another purpose?"


I would say the answer to that relates to a topic which is a project of its own right. Although separation of concerns is being applied in all data formats, there are a number of responsibilities on my end which I require an SQL server for; responsibilities such as handling multiple queries from various locations at the same time, validating data input, forming relationships between the data, security and account management, backing up the data, preparing the data for websites, business and service software, so on and so forth. An SQL database handles these concerns for you, with administration.

Quote: "Now, if nothing stands out, it is time to start commenting out sections until you get a good compile. When building a new module that won't compile, I like to comment out all the functions, verify a good compile, then start adding the functions back in one at a time until the compile fails."


It would be nice if this were not necessary given the presence of a debugging system, but the reality is that such techniques are needed when there is no other way to find faults. Interesting to see other people using this trick.


Quote: "Version Control also provides excellent tools for comparing differences between a modified file"


Thanks for this advice; I could do with a more comprehensive version control procedure; something to add to my TODO list.

If you do not mind me asking, what procedure are you using to generate your static and dynamic shadows?

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 26th Sep 2014 22:22
Quote: "

Quote: "Interesting, so are you passing query results in to your program through dynamically generated XML?"

XML is the easiest way to obtain any complex analytical data into my engine from a database server; whether generated or not, be it an analysis or an asset.

"


Hmm, I guess my real question was: is your engine making a direct connection to a database, or is it making requests to a server? And if it's making requests, is the server intended to be remote or localhost for end users?

It sounds like it goes the request route, and yeah, XML (or json) is a good choice for passing back a response.


Quote: "
If you do not mind me asking, what procedure are you using to generate your static and dynamic shadows?
"


All of the lighting, shadows, particles, and shaders are Evolved's Advanced Lighting. I did modify the bloom, scattering, and shadow shaders somewhat, as I find his default bloom blows out the horizon too much, and the default shadows make things far too dark under midday lighting. Or rather, the contrast between the top of an object and the self-shadowed underside was too great. Kind of hard to explain, I'll grab some screens tonight. But these are just minor value tweaks for personal aesthetics.

Chris Tate
DBPro Master
15
Years of Service
Recently Online
Joined: 29th Aug 2008
Location: London, England
Posted: 26th Sep 2014 22:32 Edited at: 26th Sep 2014 22:33
The database is most definitely intended to be remote; it is established to store game data. Locally I intend to use raw UDT and TCP connections for running the game. It is not an MMO, but there is a secret gaming feature which requires a remote service.

How is the speed of Advanced Lighting these days? It must have improved since the last time I checked a few years ago. I chose not to use it in favour of learning how to create my own lighting system; however, this has proven to take much longer than expected.

Ortu
DBPro Master
16
Years of Service
User Offline
Joined: 21st Nov 2007
Location: Austin, TX
Posted: 27th Sep 2014 00:02
It is quite good performance wise on hardware which can reasonably be expected to handle gaming with last gen graphics. Of course a 10 year old laptop with integrated graphics is going to have problems, but it would have problems with any 3d game using more than the most basic shaders from the last 5-8 years. I'd consider my machine to be on the low end of average for desktop gaming: dual core i3, 4gb ram, radeon hd6870, and his demos, which have most features enabled, easily run at 150+ to 200+ fps uncapped at 1920x1080, depending on the demo.

Aside from the .fx shaders, it is all native dbp code, and could likely be further optimized through matrix1 and perhaps some other plugins if anyone really wanted to go to the trouble.

It does require a bit of configuration to get the right balance of features vs performance, and it can be quite picky in its requirements and formatting of models and textures.

I had a few problems with his older system and had planned to look into writing my own until he released this new overhaul. It is worth looking over even just out of general curiosity to see how things work.

Chris Tate
DBPro Master
15
Years of Service
Recently Online
Joined: 29th Aug 2008
Location: London, England
Posted: 28th Sep 2014 01:00
I will have a look at it when I get a chance. He also has a newer engine under development.
