09-27-2024, 06:44 PM
To LitDev,
It seems to me that I understand you correctly.
But I see one specific place in the chain of steps a neuron performs.
This place can be clearly seen in this image:
In this image, you can see that "Sum" (e) is simply a coordinate on the x-axis of the activation function graph.
You can call this "Sum" whatever you like.
You can give "Sum" any meaning and perform any operations on it. But what you always want is this: every time a particular combination of values, for example "ABCDEF", appears at the inputs of a neuron, the neuron sets its output to, for example, 0.8, EVERY TIME.
You don't care how this is achieved; you only need that if input = "ABCDEF", then output = 0.8.
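For reference, here is a minimal Python sketch of that standard pipeline as I understand it: the weighted sum is nothing more than the x-coordinate handed to the sigmoid. The inputs, weights, and the target of 0.8 are made-up illustration values, not anything from a real network.

```python
import math

def sigmoid(e):
    # The activation: maps the x-axis coordinate "e" (the Sum) into (0, 1).
    return 1.0 / (1.0 + math.exp(-e))

def neuron(inputs, weights):
    # Standard neuron: weighted sum first, then the activation function.
    e = sum(x * w for x, w in zip(inputs, weights))  # "Sum" = x-coordinate
    return sigmoid(e)

# Arbitrary example values; training would adjust the weights until
# this particular input pattern produces an output close to 0.8.
inputs  = [1.0, 0.0, 1.0, 1.0, 0.0, 1.0]
weights = [0.3, -0.2, 0.5, 0.1, 0.4, 0.2]
print(neuron(inputs, weights))
```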
Then WHY should I use weights to turn "ABCDEF" into "OPQRST" just to get 0.8 out of the sigmoid, if I can write into my neuron directly that "ABCDEF" = 0.8?
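Here is a rough sketch of what I mean by writing the answer straight into the neuron: a plain table from input patterns to outputs, with no weights and no sigmoid at all. The class and method names are only placeholders invented for this example.

```python
class TableNeuron:
    # A hypothetical "memorizing" neuron: it simply stores the required
    # output for each input pattern and returns it on demand.
    def __init__(self):
        self.table = {}

    def remember(self, pattern, output):
        self.table[tuple(pattern)] = output

    def fire(self, pattern):
        # No weights, no sigmoid: the stored value comes back directly
        # (0.0 here stands for "pattern never seen").
        return self.table.get(tuple(pattern), 0.0)

n = TableNeuron()
n.remember("ABCDEF", 0.8)
print(n.fire("ABCDEF"))  # 0.8, every single time
```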
If I need a continuous activation function, then I must construct it in exactly the form that corresponds to the truth, that is, to the required input-to-output mapping.
This is my goal right now.
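To show one possible reading of that goal (this is only an assumption about the form such a function could take, and the truth points below are invented), the continuous function can be built by interpolating between the stored truth values instead of forcing them through a fixed sigmoid:

```python
def make_activation(points):
    # Build a continuous function by linear interpolation over known
    # (sum, output) truth points; outside the range it stays flat.
    pts = sorted(points)

    def f(x):
        if x <= pts[0][0]:
            return pts[0][1]
        if x >= pts[-1][0]:
            return pts[-1][1]
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if x0 <= x <= x1:
                t = (x - x0) / (x1 - x0)
                return y0 + t * (y1 - y0)

    return f

truth = [(-2.0, 0.1), (0.0, 0.5), (1.5, 0.8), (3.0, 0.95)]
act = make_activation(truth)
print(act(1.5))   # exactly 0.8 at a stored truth point
print(act(0.75))  # a continuous in-between value (0.65)
```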
