I've been working on AI navigation for some time now, but can't seem to get it right. I can do sonar/radar-based navigation for robots in the real world just fine, but not in a virtual world. So I decided to write a radar program that very roughly mimics real-world equipment.
I knew going into it that it would be slow and unusable for games, but I thought it would be neat to do anyway.
The program basically takes the following parameters...
Radar Resolution
Radar Size
Radar Range
Radar Emission Angles
...and uses them to cast rays, producing distance-to-obstacle data each frame.
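As a rough sketch of how those four parameters could drive the ray casting (this is Python, not my actual code, and every name here is a placeholder I made up rather than the real program's), one frame might look like this: build a spherical fan of ray directions inside the emission cone, then intersect each ray against the scene and record a distance, capped at the radar's range.

```python
import math

# Placeholder parameters mirroring the four inputs listed above
# (names and values are my own, not the real program's):
AZIMUTH_STEPS   = 16                  # "radar resolution", horizontal
ELEVATION_STEPS = 8                   # "radar resolution", vertical
EMISSION_ANGLE  = math.radians(60)    # "radar emission angles" (cone half-angle)
RADAR_RANGE     = 50.0                # "radar range": max distance a ray reports

def ray_directions():
    """Spherical fan of unit direction vectors within the emission cone."""
    dirs = []
    for i in range(ELEVATION_STEPS):
        # sweep elevation from -cone to +cone
        el = -EMISSION_ANGLE + 2 * EMISSION_ANGLE * i / (ELEVATION_STEPS - 1)
        for j in range(AZIMUTH_STEPS):
            az = -EMISSION_ANGLE + 2 * EMISSION_ANGLE * j / (AZIMUTH_STEPS - 1)
            dirs.append((math.cos(el) * math.sin(az),
                         math.sin(el),
                         math.cos(el) * math.cos(az)))
    return dirs

def cast_ray(origin, direction, spheres):
    """Distance to the nearest sphere hit, or RADAR_RANGE on a miss.
    Standard ray-sphere intersection stands in for whatever the real
    collision library does."""
    best = RADAR_RANGE
    ox, oy, oz = origin
    dx, dy, dz = direction
    for (cx, cy, cz, r) in spheres:
        lx, ly, lz = cx - ox, cy - oy, cz - oz
        tca = lx * dx + ly * dy + lz * dz     # projection onto the ray
        if tca < 0:
            continue                           # sphere is behind the radar
        d2 = lx * lx + ly * ly + lz * lz - tca * tca
        if d2 > r * r:
            continue                           # ray passes outside the sphere
        t = tca - math.sqrt(r * r - d2)        # distance to the near surface
        if 0 < t < best:
            best = t
    return best

# One frame of distance data: a stationary radar at the origin scanning
# a toy scene with a single sphere 20 units straight ahead.
scene = [(0.0, 0.0, 20.0, 5.0)]
frame = [cast_ray((0.0, 0.0, 0.0), d, scene) for d in ray_directions()]
```

Since the ray fan is fixed relative to the radar, moving or rotating the radar would invalidate all of these precomputed directions, which is exactly why a stationary radar is so much cheaper.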
I haven't looked at the raw data and don't plan to, but the program displays a graphical representation of the scene as seen by the radar. The radar also must remain stationary, because I didn't implement movement and rotation functions (the ray paths would need to be recalculated every time the radar moves or rotates). Anywho, the way the graphical representation is *supposed* to work is that the brighter red an area is, the closer it is to the radar. In practice it doesn't quite work like that; there is the occasional glitch where an object appears brighter than it should. You may also notice that the objects in the radar display sometimes fade in and out in a circular pattern - that's due to the radar's raycast pattern being spherical.
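For what it's worth, the "brighter = closer" display can be sketched as a simple linear mapping from hit distance to red intensity (again just my own illustration, not the program's code). Clamping the distance before mapping is one cheap guard against the over-bright glitch, in case any ray ever reports a distance outside the expected 0-to-range interval:

```python
def distance_to_red(distance, max_range=50.0):
    """Map a ray's hit distance to a red channel value in 0..255,
    with closer hits drawn brighter. The clamp keeps a distance
    outside [0, max_range] from producing an out-of-range (and
    thus wrongly bright) pixel value."""
    t = max(0.0, min(1.0, distance / max_range))
    return int(round(255 * (1.0 - t)))
```

A hit right at the radar maps to full red (255), a miss at max range maps to black (0), and anything past max range is clamped rather than wrapping or overflowing.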
To run the program, you'll need Sparky's collision library.
I welcome any constructive comments. Hope you enjoy!
The Great Nateholio
<img src="http://ixeelectronics.com/Nateholio/Pictures/Sigblock.PNG">