03-22-2025, 10:16 AM
(translated with Google Translate)
Hi all.
Continuing my study of classical neurons and neural networks, I have finally become convinced that once a neuron has been "trained" and is doing its job, it is just an ordinary small virtual storage device in which information is kept on the "key-value" principle.
The "key" (the number of the "memory cell") is the weighted sum of the values applied to the neuron's inputs.
The "value" is the data that the training program WRITES into the neuron at the required "memory cell addresses" during the "training operation".
I still disagree with people who try to "train" a neuron by searching for input weight values that point to the right X coordinate on the graph of the activation function, when that activation function itself DOES NOT MATCH THE TRAINING DATA (!).
The training program should write the ACTUAL TRAINING DATA into the neuron, and the weights on the neuron's inputs should only ensure that the "memory cell address" is formed correctly.
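Under the same illustrative assumptions, and reusing the MemoryNeuron sketch above, "training" then becomes plain writes instead of a search over weights:

    # "training as writing": the weights stay fixed and only form the
    # address; the target values are written into the cells directly
    neuron = MemoryNeuron(weights=[0.5, 1.5])
    training_data = [([1.0, 2.0], 0.9),
                     ([0.2, 0.4], 0.1)]
    for inputs, target in training_data:
        neuron.write(inputs, target)   # store the actual training data

    print(neuron.read([1.0, 2.0]))    # 0.9, read back exactly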
After analyzing the training data, the training program can apply OPTIMIZATION methods so that the data takes up as little space as possible in the neuron's memory. For example, the data can be compressed by describing it with equations. The activation function can also provide useful extra behavior: for example, it can smooth or filter the data.
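Here is a rough sketch of that compression idea, with made-up numbers: the stored (address, value) pairs are replaced by the coefficients of a fitted polynomial, so the "memory" shrinks to a few numbers, and reading through the fitted curve also smooths the data. The polynomial degree is an arbitrary illustrative choice:

    import numpy as np

    # the cells dict stands in for a trained neuron's memory
    cells = {0: 0.05, 1: 0.40, 2: 0.62, 3: 0.71, 4: 0.90, 5: 1.02}

    addresses = np.array(sorted(cells))
    values = np.array([cells[a] for a in addresses])

    # compress: describe the stored data with an equation
    coeffs = np.polyfit(addresses, values, deg=2)

    # read through the fitted curve instead of the raw table;
    # this reproduces the data approximately and filters noise
    smoothed = np.polyval(coeffs, addresses)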