I have worked with a little bot from Parallax called the "Boe-Bot"; it was a pretty cool programmable autonomous robot. It would help if you analyzed the different sensory systems the real robot uses to navigate.
For example, the Boe-Bot had an ultrasonic PING sensor that used acoustic pulses to detect walls. You could even have the program play a series of audible "clicks" to let the user know the sonar is functioning.
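If you want the simulated sonar to behave like the real sensor, the underlying idea is time-of-flight: the PING sensor emits a pulse and times the echo, and the distance is half the round-trip time multiplied by the speed of sound. Here is a minimal sketch of that conversion in plain Python (the function name and values are illustrative, not from any Parallax or DarkAI API):

```python
# Time-of-flight distance calculation, as a real ultrasonic sensor does it.
# Speed of sound in air at room temperature is roughly 343 m/s.
SPEED_OF_SOUND_M_S = 343.0

def echo_time_to_distance_m(echo_time_s: float) -> float:
    """Convert a round-trip echo time (seconds) to a one-way distance (metres)."""
    # Divide by 2 because the pulse travels to the wall and back.
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0

# A 5 ms round trip puts the wall roughly 0.86 m away.
print(echo_time_to_distance_m(0.005))
```

You could tie the click playback rate to this distance so the clicks speed up as the virtual robot approaches a wall, like a parking sensor.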
As for actually making the robot navigate, use the DarkAI plug-in and set the entity to avoid the walls. Just make him move forward; when he gets close to a wall, stop, back up, turn 90 degrees, and proceed.
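That forward / stop / back-up / turn-90 behavior can be sketched as a tiny decision function. This is plain Python pseudocode for the logic, not the DarkAI API, and the function name and threshold are made up for illustration:

```python
# Sketch of the avoidance logic: keep driving until the sonar reading
# drops below a threshold, then back up and rotate 90 degrees.

def avoid_step(distance_to_wall: float, heading_deg: float,
               threshold: float = 0.3) -> tuple[str, float]:
    """Return (action, new_heading) for one tick of the control loop."""
    if distance_to_wall > threshold:
        # Clear ahead: keep moving on the current heading.
        return ("forward", heading_deg)
    # Too close: stop, back up, and turn 90 degrees before proceeding.
    return ("back_up_and_turn", (heading_deg + 90.0) % 360.0)

print(avoid_step(1.5, 0.0))   # clear path: keep going forward
print(avoid_step(0.1, 270.0)) # wall ahead: back up, new heading wraps to 0.0
```

In the actual simulation you would feed this each frame with the distance from your virtual PING sensor and apply the returned action to the DarkAI entity.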
That seems like a pretty realistic way to simulate an autonomous robot. Although it seems silly to say, a virtual autonomous robot simulation sounds way too easy to cheat on, haha.
Post a sample if you get anything going! Sounds nifty.
www.Helios-Online.net