In recent years, the study of neural networks that incorporate a chaotic element has become active.
Incremental Learning is one of the learning methods for such chaotic neural networks. In Incremental Learning, each connection weight is increased or decreased by a fixed amount of weight change. (Hereinafter, this quantity is referred to as the weight-change amount.)
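To make this update concrete, the following is a minimal sketch of one such step in Python; the ±1 pattern coding, the function name, and the agreement rule used to choose the sign of the change are illustrative assumptions, not the exact rule of the chaotic network studied here.

```python
import numpy as np

def incremental_update(w, pattern, dw=0.01):
    """One illustrative incremental-learning step.

    w       : square array of connection weights
    pattern : vector of +1/-1 values to be learned
    dw      : the fixed weight-change amount studied in this work

    Every weight is raised or lowered by exactly dw, the sign being
    chosen by whether the connected pair of units agrees in the pattern.
    """
    direction = np.outer(pattern, pattern)   # +1 where units agree, -1 otherwise
    w = w + dw * direction                   # each weight changes by +dw or -dw
    np.fill_diagonal(w, 0.0)                 # no self-connections in this sketch
    return w
```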
In this study, I investigate how the weight-change amount influences learning.
In the first experiment, the behavior of learning was examined while the weight-change amount was varied from 0 to 0.1 in networks of 50, 100, 200, 300, and 400 neurons. The number of patterns to be learned was also varied in order to find how many patterns each network can learn completely.
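The outer loop of this first experiment can be pictured as the sweep skeleton below; the network sizes and the 0 to 0.1 range come from the text, while `patterns_learned_completely` is a hypothetical placeholder for the actual chaotic-network simulation.

```python
import numpy as np

# Network sizes and weight-change range taken from the experiment description.
NETWORK_SIZES = (50, 100, 200, 300, 400)
DW_VALUES = np.linspace(0.0, 0.1, 21)

def patterns_learned_completely(n_neurons, dw):
    """Hypothetical hook into the real simulation: train a network of the
    given size with weight-change amount dw, raising the number of stored
    patterns until recall is no longer perfect, and return the largest
    count that was still learned completely."""
    raise NotImplementedError("replace with the actual chaotic-network simulation")

def sweep():
    """First-experiment skeleton: test every (network size, dw) pair."""
    return {
        (n, float(dw)): patterns_learned_completely(n, dw)
        for n in NETWORK_SIZES
        for dw in DW_VALUES
    }
```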
In the second experiment, the variation of the weight-change amount was examined while the number of learning iterations or the learning set was changed.
The result of the first experiment showed that, for each network, there exist a maximum number of learnable patterns and a corresponding appropriate weight-change amount. The networks can learn more patterns when they are given the optimum weight-change amount corresponding to them.
In particular, the appropriate weight-change amount varies with the size of the network, and the two are inversely proportional. Moreover, the usable range of the weight-change amount becomes narrower as the network grows larger, so it is important to give a value close to the appropriate one for learning.
The result of the second experiment showed that the number of patterns the network can learn completely increases slightly as the number of learning iterations increases. There is also an inversely proportional relationship between them.