

Creation of SB-Neuron. Ours. Branded.(v2)
#23
The way I see it is that we want to find a correlation between input and output.  We think of the ANN as an analogue of a brain, where neurons are connected and the strength of a connection increases as it is used more often.  The strength of a connection is its weight in our model.
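To make the "weight as connection strength" idea concrete, here is a minimal sketch of a single neuron as a weighted sum (the input and weight values are hypothetical, just for illustration):

```python
# A single artificial neuron: each input is scaled by the strength
# (weight) of its connection, and the scaled values are summed.
inputs  = [0.5, 0.2, 0.9]
weights = [0.8, -0.3, 0.5]   # connection strengths; negative = inhibitory

output = sum(x * w for x, w in zip(inputs, weights))
# 0.5*0.8 + 0.2*(-0.3) + 0.9*0.5 = 0.4 - 0.06 + 0.45 = 0.79
```

Training an ANN is essentially the process of adjusting those weight values so the summed outputs match the desired targets.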

If we have only one set of connections between input and output, then the output is a linear function of the input.  Stacking layers alone doesn't change that (a composition of linear maps is still linear); it is the hidden layers together with non-linear activation functions between them that make the ANN capable of a non-linear response.
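A quick sketch of why stacking alone isn't enough (plain Python, made-up weights): two linear layers applied one after the other collapse into a single equivalent linear layer.

```python
# A "layer" here is just y = w*x + b in one dimension.
def linear(w, b, x):
    return w * x + b

# Layer 1: y = 2x + 1, then Layer 2: z = 3y - 4, with no activation between.
def two_layers(x):
    return linear(3, -4, linear(2, 1, x))

# The composition is itself one linear layer: z = 3*(2x + 1) - 4 = 6x - 1.
def collapsed(x):
    return linear(6, -1, x)

for x in [0.0, 1.5, -2.0]:
    assert two_layers(x) == collapsed(x)
```

Inserting a non-linear function (such as the sigmoid discussed below) between the two layers breaks this collapse, which is what lets depth add expressive power.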

We want each successive layer to add increasing non-linearity (this is what enables an ANN to 'see' complex patterns, just like our brains do).  However, we would like each layer not to be over-dominated by the previous layers.  For example, a very large weight shouldn't over-saturate a node so that all we see from then on is the one dominant weighted node - that would be analogous to one node out-shining all those around it, and we lose the detail coming from the other nodes.  The activation functions have the role of 'normalising' all node values to between 0 and 1 so that each layer of the network can add more subtlety to the system.

In my understanding, the activation function is analogous to controlling the size of the signal reaching a neuron, which determines whether it fires itself.  We don't want to overwhelm it if we want to capture complexity.  The activation function does not itself carry or hold information; rather, it acts to keep the contrast nice in each layer.

Again, I am probably not clear in my explanation, especially across a change in language, but at least it is an interesting discussion.


Messages In This Thread
RE: Creation of SB-Neuron. Ours. Branded.(v2) - by litdev - 09-27-2024, 04:57 PM
