Wednesday, July 28, 2010

Day XVII;

Today I was going to help the Remote Sensing Group with another data collection session, but after we got most of the targets set up the plan was postponed. I wasn't too mad because it is fun being outside helping out, at least when it isn't raining.

When I got back to my work I started reading some of the papers posted on the RIT twiki site. One paper caught me completely off guard. At this point I feel as though I have a pretty firm grasp on the Neural Network concept, but I guess people have come up with even crazier ideas. This paper introduced something like a composition of Neural Networks, in which one overarching network directs the training and learning of smaller "minion" networks. The outputs from these subnetworks are then weighted to produce a final output. It is like a Neural Network electoral college, with certain helper networks getting more of a say in the final output than others. I think the overall point of making a Neural Network that is composed of smaller networks (as opposed to just making a bigger network) is that new information can be added without having to retrain the whole network. It is very cool but very confusing.
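To make the idea concrete for myself, here is a tiny sketch in Python of how I picture the "electoral college" part working. This is just my guess at the structure, not the paper's actual method: the names (expert, gate, Wg) and the use of NumPy with one-layer networks are all my own simplifications.

```python
import numpy as np

rng = np.random.default_rng(0)

def expert(x, W):
    """One small 'minion' network: here just a single tanh layer."""
    return np.tanh(x @ W)

def gate(x, Wg):
    """The overarching network: produces a vote share for each expert."""
    scores = x @ Wg
    e = np.exp(scores - scores.max())  # softmax so the shares sum to 1
    return e / e.sum()

# Three experts, each mapping a 4-dim input to a 2-dim output
experts = [rng.normal(size=(4, 2)) for _ in range(3)]
Wg = rng.normal(size=(4, 3))  # gating weights: one score per expert

x = rng.normal(size=4)        # a single input vector
w = gate(x, Wg)               # the electoral-college vote shares
outputs = np.stack([expert(x, W) for W in experts])
final = (w[:, None] * outputs).sum(axis=0)  # weighted combination

print("gate weights:", w)
print("final output:", final)
```

If I understand it right, the payoff is that adding new information could mean training one new expert (and updating the gate) instead of retraining everything from scratch.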
