03-26-2025, 09:11 PM
Hi all. 
An interesting effect turned up while I was studying the training process of a multilayer SB neural network.
It turns out that only one layer should be trained per pass.
Training proceeds in order, from the first layer to the last.
This is quite reasonable.
It makes sense to train the second layer only after the SB neurons of the first layer have already been trained:
the SB neurons of the second layer have to learn to respond correctly to the behavior of the SB neurons of the first layer,
but we cannot know HOW the first-layer SB neurons will behave until their training is complete.
That is why training a multilayer SB neural network requires as many passes as the network has layers.
Friends, do you have any news?

