
Abstract:

In recent years, research on neural networks that incorporate a chaotic element has become active. Incremental Learning is one learning method for such chaotic neural networks. In Incremental Learning, each connection weight is increased or decreased by a fixed amount of weight change (hereinafter, this amount is called $\Delta w$). In this study, I investigate how $\Delta w$ influences learning.
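The weight-update idea above can be illustrated with a minimal sketch. Note that this is only an assumption-laden illustration, not the actual learning rule of the chaotic neural network studied here: the sign of each $\pm\Delta w$ step is chosen by a simple Hebbian-style criterion, and the value of $\Delta w$ is arbitrary.

```python
import numpy as np

DELTA_W = 0.01  # the fixed step size Δw studied in this work (this value is illustrative)

def incremental_update(w, x):
    """Hypothetical sketch of an incremental weight update.

    Every connection weight moves by exactly ±DELTA_W per step.
    Here the direction is Hebbian (x_i * x_j > 0 strengthens the
    connection, otherwise weakens it); the real rule of the chaotic
    network's Incremental Learning may differ.
    """
    direction = np.sign(np.outer(x, x))  # ±1 for bipolar patterns
    return w + DELTA_W * direction

# Usage: a 5-neuron network learning one random bipolar pattern.
rng = np.random.default_rng(0)
w = np.zeros((5, 5))
x = rng.choice([-1.0, 1.0], size=5)
w = incremental_update(w, x)
```

After one step, every weight has changed by exactly $\Delta w$ in magnitude; only the direction of each change depends on the pattern.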

In the first experiment, the progress of learning was examined while $\Delta w$ was varied from 0 to 0.1 in networks of 50, 100, 200, 300, and 400 neurons. The number of patterns to be learned was also varied, to determine how many patterns each network can learn completely. In the second experiment, how the appropriate $\Delta w$ varies was examined by changing the number of learning iterations or the set of learning patterns.

The result of the first experiment showed that, for each network, there exist a maximum number of learnable patterns and an appropriate $\Delta w$. A network can learn more patterns when given its optimum $\Delta w$. In particular, the appropriate $\Delta w$ depended on the size of the network, and the two are inversely proportional. Moreover, the range of usable $\Delta w$ becomes narrower as the network grows, so it is important to give a value close to the optimum. The result of the second experiment showed that the number of patterns the network can learn completely increases slightly when the number of learning iterations is increased; there is also an inversely proportional relationship between them.




Deguchi Lab. March 5, 2008