07-04-2025, 07:40 AM
Hi all! 
All these days I have been thinking about how to eliminate the shortcomings of classical neural networks.
One very serious drawback is that EVERY input value of the network has to be multiplied by one weight of EVERY neuron in the first hidden layer.
In other words, we are forced to make as many "passes" over the same input data as there are neurons in that layer.
It seems to me that SB neurons let us get rid of this drawback.
If nothing stops me, I will soon publish an example of an SB neural network that needs ONLY ONE "PASS" to recognize a character on a 5x7 pixel matrix.
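To make the cost concrete, here is a minimal sketch of what a classical first hidden layer does with a 5x7 pixel input: a plain dense (fully connected) layer, where the multiply count grows with the number of neurons. The layer size of 10 is a hypothetical example, not taken from the post.

```python
import numpy as np

inputs = 5 * 7   # 35 pixel values from the 5x7 matrix
neurons = 10     # hypothetical hidden-layer size

x = np.random.rand(inputs)           # flattened pixel values
W = np.random.rand(neurons, inputs)  # one weight row per neuron

# Each neuron's weighted sum: effectively one "pass" over x per neuron.
y = W @ x

# Total multiplications in this single layer:
mults = neurons * inputs
print(mults)  # 350 multiplications for 10 neurons on 35 inputs
```

Doubling the neuron count doubles the multiplications on the same 35 inputs, which is the redundancy the post is pointing at.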
