Thursday, July 15, 2010


Today I finished the majority of the work on my attempt at making a neural network. I made things much clearer than in my previous attempts and avoided many of the pitfalls I had encountered earlier. I still have yet to test my work in its entirety, but I feel that when I encounter problems I will now be able to identify and correct my errors. The only part I still don't completely grasp is the backpropagation algorithm used to "recalibrate" the network after each pass of inputs and outputs. Since many neural networks have several layers between the input and output layers (hence the term "hidden layers"), it is hard to tell at exactly what point in the process something went wrong. To solve this problem, the error between the expected and actual output is distributed backwards through the whole network. There is a lot of math on the subject that I don't fully understand yet, but I understand the goal of the method well enough to proceed with my work.
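To make the "distributing the error" idea concrete, here is a minimal sketch of backpropagation on a tiny 2-2-1 network learning XOR. This is my own illustration, not the code from the project: the network shape, the learning rate, and the function names (`train_xor`, `sigmoid`) are all assumptions. The key step is that each hidden neuron receives a share of the output error in proportion to its outgoing weight.

```python
import math
import random

def sigmoid(x):
    # Standard logistic activation; its derivative is s * (1 - s).
    return 1.0 / (1.0 + math.exp(-x))

def train_xor(epochs=10000, lr=0.5, seed=42):
    """Illustrative 2-2-1 network trained on XOR.
    Returns (predict, error_before, error_after)."""
    rng = random.Random(seed)
    w_ih = [[rng.uniform(-1, 1), rng.uniform(-1, 1)] for _ in range(2)]  # input -> hidden
    b_h = [rng.uniform(-1, 1), rng.uniform(-1, 1)]
    w_ho = [rng.uniform(-1, 1), rng.uniform(-1, 1)]                      # hidden -> output
    b_o = rng.uniform(-1, 1)
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

    def forward(x):
        h = [sigmoid(w_ih[j][0] * x[0] + w_ih[j][1] * x[1] + b_h[j]) for j in range(2)]
        o = sigmoid(w_ho[0] * h[0] + w_ho[1] * h[1] + b_o)
        return h, o

    def total_error():
        # Sum of squared errors over the whole training set.
        return sum((t - forward(x)[1]) ** 2 for x, t in data)

    error_before = total_error()
    for _ in range(epochs):
        for x, target in data:
            h, o = forward(x)
            # Output delta: how wrong the final answer was, scaled by the
            # slope of the sigmoid at the output.
            delta_o = (target - o) * o * (1 - o)
            # Hidden deltas: the output error is shared out to each hidden
            # neuron through its outgoing weight -- this is the step that
            # "distributes the error throughout the whole network".
            delta_h = [delta_o * w_ho[j] * h[j] * (1 - h[j]) for j in range(2)]
            # Nudge every weight in the direction that reduces the error.
            for j in range(2):
                w_ho[j] += lr * delta_o * h[j]
                w_ih[j][0] += lr * delta_h[j] * x[0]
                w_ih[j][1] += lr * delta_h[j] * x[1]
                b_h[j] += lr * delta_h[j]
            b_o += lr * delta_o
    return (lambda x: forward(x)[1]), error_before, total_error()

predict, err_before, err_after = train_xor()
```

With a hidden layer there is no direct "right answer" for each hidden neuron, so blaming each one in proportion to its connection to the output is the whole trick; everything else is the same gradient-descent nudge a single-layer network would use.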

On a side note, I bought a burger from the cafeteria/dining area today. While much of the food on campus is expensive, I have to say this burger was worth the money I paid for it.
