
DarkBASIC Professional Discussion / Neural Network - Back Propagation problem (I hope)

Author: Sedit
Joined: 3rd Sep 2017
Location: Ghetto of NJ
Posted: 19th Oct 2017 08:15 (edited 19th Oct 2017 08:17)
Hello everyone. I need help debugging my neural network. It's a simple system by design, currently set up to approximate the XOR function. I started to conceptualize this about a week ago because I was going to use it for my creature simulator, but I ended up going with my own design on that project instead, since I was having trouble understanding how to turn the output into useful data. The model I settled on there turned out to be more akin to an HTM neuron model, not a standard neural network like this one.
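To make "simple by design" concrete, here is a rough Python sketch of the forward pass as I picture it: 3 inputs feeding one hidden layer of sigmoid neurons, then a single sigmoid output. This is not NORMAN's actual code (the real thing lives in DBPro), and the layer size, weight layout and names are all made up for the example.

import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hidden, w_output):
    # Hidden layer: each neuron sums its 3 weighted inputs plus a bias (the last
    # weight in its list), then squashes the result through the sigmoid.
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + ws[-1])
              for ws in w_hidden]
    # Single output neuron over the hidden activations, plus its own bias.
    output = sigmoid(sum(w * h for w, h in zip(w_output, hidden)) + w_output[-1])
    return hidden, output

# Made-up starting point: 3 hidden neurons, each with 3 input weights + 1 bias,
# and an output neuron with 3 weights + 1 bias.
w_hidden = [[random.uniform(-1.0, 1.0) for _ in range(4)] for _ in range(3)]
w_output = [random.uniform(-1.0, 1.0) for _ in range(4)]

print(forward([1, 0, 1], w_hidden, w_output))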

However, the idea that came to me last night of using a neural network to approximate the functionality of an 8-bit computer not only sounds well within reach, it's awesome and must be done. Plus, I will have no trouble turning my input and output data into something useful.

The plan is to get this working well, train a bunch of these little systems to approximate AND, OR, NOT, NAND, XOR and so on, and string them all together to create what should amount to a working 8-bit computer.


If Google DeepMind can look at a picture and tell me it's a cat sitting on a shelf, like in the video I saw earlier, or write pretty nice song lyrics, then it should have very little trouble being shown how a bunch of computers work. Basically, show it how to be an emulator and it could turn itself into a universal emulator. Like I said last night about this topic, the implications of an artificial neuron emulating a CPU are far-reaching, and until I at least finish this simple little project we will never know how well these work for emulating anything.

This is where you guys come in.

Meet NORMAN. He is a Network Of Recursively Modeled Neurons.

I "finished" NORMAN last night and everything seemed to be going well: the trainer appeared to work and the error rate was going down. However, it can only approximate the last pattern I put in, which is for the most part worthless.

When it asks for your input, enter a 1 or a 0. It is running a 3-input XOR problem. The training data can be found in its own function and should be self-explanatory to tinker with; just give it a value to pick which data set to use.
Place the VISUALIZE_NEURAL_NETWORK command anywhere you want in the system if you wish to view the current state and see how the network is shaping up.
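For the curious, the training data is just the 3-input XOR truth table. I am treating 3-input XOR as odd parity, i.e. the result of XORing the three bits together, which is an assumption on my part about the "right" reading. Sketched in Python rather than DBPro it looks like this:

# 3-input XOR training data, assuming the odd-parity reading (a XOR b XOR c).
# Each row: (input1, input2, input3, expected_output)
XOR3_PATTERNS = [(a, b, c, a ^ b ^ c)
                 for a in (0, 1) for b in (0, 1) for c in (0, 1)]

for row in XOR3_PATTERNS:
    print(row)   # (0, 0, 0, 0), (0, 0, 1, 1), ... (1, 1, 1, 1)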

Any questions, feel free to ask. I can't wait to make this larger, but until I get the basics working there is no point. It can run almost 1 MILLION forward and backward passes through the system in around 2 seconds, which on my utterly horrible old computer amazed the hell out of me.

I am not sure why, but I am almost 100% sure it's because I am not correctly calculating ERROR#, or perhaps I am not using it properly. I have been trying to get it working and studying up since about 10:30 this morning, and the way they write the mathematics is confusing the hell out of me, since I am not really good at calculus and never had any formal training in it. I believe I may understand it, but I would like some input from anyone who can help, because I'm getting burnt out and have started randomly changing shit for reasons even I'm not sure of at this point. I almost broke it a little bit ago, so I think it's best to leave it here for the night and pray someone has information that can help.
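For reference while I bang my head against this, here is the textbook version of the error math as I currently understand it, written as a small self-contained Python sketch rather than my actual DBPro code; the layer size, names and learning rate are all made up for the example, so please correct me if the idea itself is wrong. The points I think matter: the output delta is (target - output) * output * (1 - output); each hidden delta is that output delta pushed back through the connecting weight, multiplied by the hidden neuron's own derivative hidden * (1 - hidden); and every training pass cycles through all eight truth-table rows instead of grinding on one row until it converges, which is my best guess at why mine only remembers the last pattern.

import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# 3-input XOR (odd parity) truth table: (in1, in2, in3, target)
PATTERNS = [(a, b, c, a ^ b ^ c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]

N_HIDDEN = 6       # made-up size, just needs to be big enough for 3-bit parity
LEARN = 0.5        # made-up learning rate
# Each hidden neuron: 3 input weights + 1 bias. Output neuron: N_HIDDEN weights + 1 bias.
w_hid = [[random.uniform(-1.0, 1.0) for _ in range(4)] for _ in range(N_HIDDEN)]
w_out = [random.uniform(-1.0, 1.0) for _ in range(N_HIDDEN + 1)]

for epoch in range(20000):
    total_error = 0.0
    # Every epoch visits ALL eight patterns; never train one row to convergence
    # before moving on, or the later rows overwrite what the earlier ones taught.
    for a, b, c, target in PATTERNS:
        inputs = [a, b, c]

        # forward pass
        hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + ws[3])
                  for ws in w_hid]
        output = sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + w_out[N_HIDDEN])

        # error and deltas (squared error; sigmoid derivative is out * (1 - out))
        error = target - output
        total_error += error * error
        delta_out = error * output * (1.0 - output)
        delta_hid = [hidden[j] * (1.0 - hidden[j]) * w_out[j] * delta_out
                     for j in range(N_HIDDEN)]

        # weight updates (the hidden deltas above use the OLD output weights)
        for j in range(N_HIDDEN):
            w_out[j] += LEARN * delta_out * hidden[j]
        w_out[N_HIDDEN] += LEARN * delta_out              # output bias
        for j in range(N_HIDDEN):
            for i in range(3):
                w_hid[j][i] += LEARN * delta_hid[j] * inputs[i]
            w_hid[j][3] += LEARN * delta_hid[j]           # hidden bias

# Plain gradient descent can occasionally get stuck on parity problems; if the
# outputs below are nowhere near the targets, rerun with different random weights.
print("sum of squared errors on last epoch:", total_error)
for a, b, c, target in PATTERNS:
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, (a, b, c))) + ws[3]) for ws in w_hid]
    output = sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + w_out[N_HIDDEN])
    print(a, b, c, "->", round(output, 3), "target", target)

If this matches what ERROR# is supposed to be, then my reading of the math was OK and the bug is in how I apply it; if not, that is probably exactly where NORMAN is going wrong.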


Thanks in advance,
~Sedit

