

Creation of SB-Neuron. Ours. Branded. (v2)
#51
(translated by Google translator)

Hi all.  Shy

Here is what I currently understand about the essence of an individual neuron:
Once trained, the neuron is effectively an information storage device.
The neuron's inputs act as the lines of an address bus.
The SUM of the values presented at the neuron's inputs is, in effect, the number of a memory cell inside the neuron.
In a classical neuron, the role of the physical memory is played by the sigmoid equation.

When a code is placed on the neuron's inputs (the "address bus"), the summation operation transforms this code into a specific number.
This number is the NUMBER of the neuron's "memory cell" (the X coordinate on the sigmoid graph).
When the activation function receives this number, it "opens the specified cell and retrieves" (i.e., calculates) the value stored there.

That's all... The usual "key -> value" pair.
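
To make the analogy concrete, here is a minimal Small Basic sketch (the inputs and weights below are made-up values, not taken from my program): the weighted sum of the inputs forms the "cell address" X, and the sigmoid acts as the "memory" that returns the value stored at X.

Code:
' Hypothetical inputs and weights (made-up values, for illustration only)
input[1] = 0.5
input[2] = 1.0
input[3] = -0.2
weight[1] = 2.0
weight[2] = -1.0
weight[3] = 0.5

' Summation: the "address bus" turns the input code into a cell number X
x = 0
For i = 1 To 3
  x = x + input[i] * weight[i]
EndFor

' Activation: the sigmoid "opens cell X and retrieves" the stored value
value = 1 / (1 + Math.Power(2.718281828, -x))

TextWindow.WriteLine("Address X = " + x + ", stored value = " + value)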

All the magic happens WHILE THE TRAINING PROGRAM IS RUNNING (!!!).
It is the training program that does all the important work on information.

But that's a different conversation.  Shy
Reply
#52
You are not right. Write to me in PM and we will discuss it.
Reply
#53
(02-21-2025, 08:51 AM)Alexeydef Wrote: You are not right. Write to me in PM and we will discuss it.

Hello.  Shy 

I replied in a private message.
Reply
#54
(translated by Google translator)

Hi all.  Shy

Continuing to study classical neurons and neural networks, I have become convinced that once a neuron is trained and doing its job, it is an ordinary small virtual information store, in which information is kept according to the "key-value" principle.

The "key" (the "memory cell" number) is a weighted sum of the values that are set at the inputs of the neuron.

"Value" is the data that is WRITTEN into the neuron by the training program at the required "memory cell addresses" during the "training operation".

I still disagree with people who "train" a neuron by trying to pick input weights that point to the correct X coordinate on the activation-function graph, while the activation function itself DOES NOT MATCH THE TRAINING DATA (!).

The training program must write the ACTUAL TRAINING DATA into the neuron, and the weights of the neuron's inputs must ensure the correct formation of the "memory cell address".

After analyzing the training data, the training program can apply OPTIMIZATION methods so that the data takes up as little space as possible in the neuron's memory. For example, the data can be compressed by describing it with equations. Moreover, the activation function can provide useful extras, such as smoothing or filtering the data.
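
As a rough illustration of this principle (the training pairs, the weight, and the rounding rule below are my own simplifying assumptions, not the real SB-Neuron code): training writes the answers straight into an array, and recall is a plain lookup.

Code:
' --- "Training": write the ACTUAL TRAINING DATA into the neuron's memory ---
' Hypothetical training pairs (made-up values)
trainIn[1] = 1
trainOut[1] = 0.2
trainIn[2] = 2
trainOut[2] = 0.9

w = 10   ' assumed input weight; its only job is to form a good address

For k = 1 To 2
  address = Math.Round(trainIn[k] * w)   ' the "key": a memory cell number
  memory[address] = trainOut[k]          ' the "value": stored as-is
EndFor

' --- Recall: the trained neuron is just a "key -> value" lookup ---
query = 2
address = Math.Round(query * w)
TextWindow.WriteLine("Cell " + address + " holds " + memory[address])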
Reply
#55
Hi all.  Shy

I discovered an interesting effect while studying the training process of a multilayer SB neural network.
It turns out that only one layer should be trained per pass.
Training proceeds in order, from the first layer to the last.

This is quite reasonable.
It makes sense to train the second layer only when the SB neurons of the first layer have already been trained.
After all, the SB neurons of the second layer will have to learn to respond correctly to the behavior of the SB neurons of the first layer.
But we cannot know HOW the first-layer SB neurons will behave until their training is complete.

That is why training a multilayer SB neural network requires as many passes as the network has layers, as outlined in the sketch below.
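
In outline (the layer count and the TrainLayer subroutine here are placeholders, not code from the real project), the schedule looks like this:

Code:
layerCount = 3   ' assumed depth, for illustration

' One pass per layer, in order from the first layer to the last
For layer = 1 To layerCount
  TrainLayer()   ' train ONLY layer number "layer"; earlier layers are fixed
EndFor

Sub TrainLayer
  ' Placeholder: fit the SB neurons of layer "layer" to the outputs
  ' of the already-trained layer "layer - 1".
  TextWindow.WriteLine("Pass " + layer + ": training layer " + layer)
EndSub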

Friends, do you have any news?  Wink
Reply
#56
Only 3 exams are left Smile
ZS
Reply
#57
Hi all.  Shy

I think I am ready to challenge regular neural networks, created in Small Basic with the "SmallBasicANN" extension, to a competition against our proprietary SB neural networks.  Cool

If any of you wish, suggest a task.
You can solve the task using the ANN extension, and I will solve it using SB neurons.

In this case I won't need to normalize the original data, and very few training passes will be required. Yet I am sure that the accuracy of the SB neural network's answers will be much higher than that of the "classical" neural network.
Reply
#58
Do you have an idea for a task that you already have in mind?

To be a reasonable test of AI, the test data should NOT be used as the training data.  So there would be 2 independent sets of data, training and test.
Reply
#59
(03-27-2025, 07:08 PM)litdev Wrote: Do you have an idea for a task that you already have in mind?
...

I thought that maybe someone would suggest a task that would be interesting not only to me, but to them too.  Shy


Quote:litdev

...
To be a reasonable test of AI, the test data should NOT be used as the training data.  So there would be 2 independent sets of data, training and test.

I agree with you.
But in order for the trained neural network to find, in the test data, the "object" that it was trained to find, both data sets must contain this "object".
Therefore, if the test data set contains the same "object" that was in the training data set, the two data sets will be essentially the same.
They will differ only in the "noise" around the "object" that needs to be found.

Isn't that so?  Blush
Reply
#60
Maybe, but it's not just noise around the object; it's a bit subtle. If we get a test case we can discuss its specifics further.

If we are using a neural net AI, then we are really doing pattern recognition in some form.  So maybe if we have 1000 different data points from some unknown function, and we divide them into two sets of 500, train on one, then test on the other - this is not about randomness, it's about an unknown function - not quite the same thing.  Finding a good 'unknown function' for this is tricky.  If we use a known function with noise, then that is not the same thing.
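
For illustration only (Math.Sin below is just a stand-in; the whole point is that the real function would be unknown), a split like that could be set up as:

Code:
' Split 1000 points into two independent sets of 500:
' odd-numbered points for training, even-numbered points for testing.
For i = 1 To 1000
  x = i / 100
  y = Math.Sin(x)   ' stand-in only - in a real test the function is unknown
  If Math.Remainder(i, 2) = 1 Then
    trainX[(i + 1) / 2] = x
    trainY[(i + 1) / 2] = y
  Else
    testX[i / 2] = x
    testY[i / 2] = y
  EndIf
EndFor
TextWindow.WriteLine("500 training points and 500 test points are ready")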

Consider the function that determines whether a picture is a cat or a dog - we don't know the function, but the difference is not random; the picture is clearly a cat or a dog, we just don't know the function that determines which.
Reply

