09-23-2024, 09:46 AM
(translated by Google translator)
Here's some unscientific theory from a true amateur and newbie.
Essentially, the job of a neuron is to set the output to the Y value of the activation function that corresponds to the specified X coordinate.
The value of this X coordinate is equal to the "Sum" which is obtained after adding up all the weighted inputs of the neuron.
As far as I know, a classical neuron learns a set of weights for its inputs so that, after the weighting operation, the "Sum" lands on the desired value.
That is, "Sum" is the digital code (address on the X-axis) of the desired Y-point on the activation function graph.
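A minimal sketch of that classical picture (the activation function here is a sigmoid, chosen just for illustration; the post doesn't name one):

```python
import math

def classical_neuron(inputs, weights, bias=0.0):
    # Weighted sum of the inputs: this is the X coordinate
    # on the activation function graph
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    # The neuron outputs the Y value of the activation function at X = s
    return 1.0 / (1.0 + math.exp(-s))

print(classical_neuron([0.5, 0.8], [0.4, -0.2]))
```

Training a classical neuron then means adjusting `weights` until the sum points at the right spot on that curve.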
Our branded SB-Neuron will use different technology.
Depending on the nature of the input data, the weights on the neuron's inputs are fixed from the start and never change.
In my example, I used 100 array cells to store the key points of the future activation function.
To ensure that any combination of inputs to the two inputs of my neuron would correspond to only one point on the X-axis, without repetition, the weights of the neuron's inputs were set to "1" and "10".
Thus, during training, any combination of input data in the range 0 ≤ X < 100 points exactly to its own array cell, and not to any other.
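Here is how I understand that addressing scheme: with two digit inputs (0-9) and fixed weights of 1 and 10, the weighted sum reads the pair as a two-digit number, so every input combination gets its own cell in the 100-cell table. A small sketch to check that there are no collisions:

```python
def sb_index(a, b):
    # Fixed weights 1 and 10: each pair of digits (a, b) in 0..9
    # maps to a unique address 0..99 in the activation table
    return a * 1 + b * 10

# Every combination of two digit inputs lands in its own cell
addresses = {sb_index(a, b) for a in range(10) for b in range(10)}
print(len(addresses))  # 100 distinct cells for 100 input combinations
```

So the "Sum" really is a collision-free address, not a learned quantity.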
If the input data has logical values (0 or 1), then 128 cells of the activation function array will be enough for the operation of a neuron with 7 inputs.
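The post doesn't spell out the weights for the binary case, but the natural choice (an assumption on my part) is powers of two, which reads the 7 inputs as a 7-bit binary number, giving exactly 128 distinct addresses:

```python
from itertools import product

def sb_index_binary(bits):
    # Powers-of-two weights 1, 2, 4, ... read the input bits
    # as a binary number, so each pattern gets a unique address
    return sum(bit * (2 ** i) for i, bit in enumerate(bits))

# All 2^7 = 128 patterns of 7 logical inputs map to distinct cells
addresses = {sb_index_binary(bits) for bits in product([0, 1], repeat=7)}
print(len(addresses))  # 128 cells cover every 7-bit input combination
```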
(I'm sorry, life forces me to interrupt my story. See you later)