Neural Network (ANN) Extension
#21
(translated by Google translator)

Hi all.
Today I managed to play a little more with the SB neural network.
I was most interested in the topic of constructing a large neural network from small neural networks (let's call it "SB neuro-LEGO").
Therefore, today I tried to create a small neural network that should “calculate” the value of the “H” parameter for a color in the HSL format, using the original data specified in the RGB format.

Naturally, such work is easier than creating one large neural network that converts the entire RGB color code into HSL at once.
But it turned out that converting even the single parameter “H” well is quite difficult for one small neural network. ( I deliberately do not change the original training data. I want to learn how to easily CREATE A NEURAL NETWORK STRUCTURE that can solve the problem I need. )

The range of the “H” parameter has sections where the parameter changes quickly, while in adjacent sections of the range it changes very slowly.
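For reference, the standard HSL hue formula is piecewise, which is where those fast- and slow-changing sections come from. A minimal sketch using Python's colorsys (the SB extension's training data may have been generated differently):

```python
import colorsys

# Standard RGB -> HSL hue conversion; illustrative only.
def rgb_to_hue(r, g, b):
    """r, g, b in 0..255; returns the HSL hue in degrees, 0..360."""
    h, _lightness, _saturation = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    return h * 360

print(round(rgb_to_hue(255, 0, 0), 1))   # 0.0   (red)
print(round(rgb_to_hue(0, 255, 0), 1))   # 120.0 (green)
print(round(rgb_to_hue(0, 0, 255), 1))   # 240.0 (blue)
```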
I'm interested to see whether this difficulty can be overcome by creating several even smaller neural networks, each well trained on its own small part of the “H” value range.
If this is easy to do, then, theoretically, I could solve any complex problem by dividing it into many simple tasks for the small, simple neural networks of our “SB neuro-LEGO”.
#22
AB,

The main issue with training Hue from RGB is that hue is not continuous at 0/360 degrees.  In general, ANNs are usually combined into a single model for a task, using and capturing the inter-dependencies between inputs and outputs, rather than broken down into smaller models.  Additional work is often done pre-processing the input, for example filters (high/low pass, edge detection...) for image recognition.  Even though your approach is unusual, it is interesting and fun to experiment with.
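One common workaround for that 0/360 wrap-around (sketched in Python as an illustration, not something the SB extension provides) is to train the network on (sin H, cos H) instead of H itself, so that 359° and 1° become nearby targets:

```python
import math

# Encode a circular hue as two continuous targets, then recover it.
def hue_to_targets(h_deg):
    rad = math.radians(h_deg)
    return math.sin(rad), math.cos(rad)

def targets_to_hue(s, c):
    return math.degrees(math.atan2(s, c)) % 360

s, c = hue_to_targets(359.0)
print(round(targets_to_hue(s, c), 6))  # 359.0 - and 359 vs 1 degree now map
                                       # to nearby (sin, cos) pairs
```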

If you are having issues getting the HSL->RGB or RGB->HSL models to work, I'm not sure the solution is to break the problem down, although this is usually an excellent approach.  Perhaps we could share code to see if there are other issues.
#23
(translated by Google translator)

Hi all.

LitDev,
I am ready to provide any of my code that you need for verification.
But I especially agree with what you said: it would be VERY INTERESTING to see what happens if I explore a way to CREATE A NETWORK that consists of other small, pre-trained neural networks.

At the moment, my neural network quite accurately determines the values of the “H” parameter.
I can just feel how hard it is for my neural network to work under such difficult conditions.
If I want its error to decrease by a factor of 100, I will try to make the network's operating conditions easier.

A person needs good mathematical training to transform training data well.
You'll agree that not every Small Basic fan has such training.
But it may turn out to be easier for a person to divide the training data into parts that are more convenient for the neural network to work on.

And finally, I’m interested in the idea of assembling networks from trained parts, like in a LEGO set.
#24
(translated by Google translator)

Hi all.

Today, while preparing my morning cup of coffee, I suddenly heard the Voices of Our Ancestors (as the North American Indians would say in a Hollywood action movie).
The Ancestor Souls told me that my neural network is having a hard time because it uses too small a percentage of all the information contained in the three RGB color values.
Indeed, in addition to the distances from the origin to the points given by each parameter (R, G or B), there is information about the distances between these points and the proportions of those distances. At the moment I do not see any other information stored in the values of R, G and B.

That is, I first want to give my neural network ADDITIONAL INFORMATION about the distance between the coordinates R, G and B in order to see what impact this information will have on the accuracy of the neural network.
If the accuracy of the neural network increases significantly and becomes 100 times better than now, then my goal will be achieved.
If the accuracy increases, but is less than what I need, then I will add new information and conduct new tests.

The important thing is that special small neural networks, trained for this job, will be used to extract the additional information.
The main neural network will receive at its inputs the R, G, B data plus the values from the outputs of the auxiliary neural networks that determine the distances between R, G and B.

This is my plan.
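The augmented input described in this plan could be sketched like this (a hypothetical helper in Python; the function name is mine, and whether absolute or signed distances work better is an open question):

```python
# Append pairwise channel distances to the original three RGB inputs,
# as proposed above for the main network's input layer.
def augment_rgb(r, g, b):
    return [r, g, b, abs(r - g), abs(g - b), abs(b - r)]

print(augment_rgb(200, 50, 50))  # [200, 50, 50, 150, 0, 150]
```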
#25
Hi AB,

My gut tells me that the distance between R, G and B is a linear function of R, G and B and therefore contains no more information, BUT it may help the ANN to find a good transform - fun to play with!

Intuitively I would think orthogonal (linearly independent) input would be best, but I may well be wrong.
#26
When I was playing with different colors on the site "https://colordesigner.io/convert/rgbtohsl", I noticed that if the distance between two parameters (for example, R and G) is constant, then the angle "H" also remains constant when the values of R and G increase or decrease together.
This made me think that the distances between the values could help the neural network determine the angle better.
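That observation checks out numerically (using Python's colorsys here as a stand-in for the website's converter): shifting all three channels together leaves the pairwise differences, and hence the hue, unchanged.

```python
import colorsys

# HSL hue in degrees from 0..255 RGB values.
def hue(r, g, b):
    return colorsys.rgb_to_hls(r / 255, g / 255, b / 255)[0] * 360

print(round(hue(200, 100, 50), 4))  # 20.0
print(round(hue(220, 120, 70), 4))  # 20.0 - same pairwise differences,
                                    # same hue
```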

#27
If I load the following program (ZWWL934.000) and just step through the first 11 lines, it writes the file XORANN.txt. Looking at this file, it has the following data in it:

Name: XOrANN
Structure: 2,3,1
Trained: 0
Binary Output: False
Epochs: 100000
Learning Rate: 0.7
Momentum: 0.3
Sigmoid Response: 1
Error Required: 0.0001

Synapses:
-0.389115423145292
0.234463460386947
-0.00269707804671353
0.0434920950995256
0.472775567543123
0.861893068003418
0.957047623562183
-0.204678184448126
0.890773540777514
-0.70394352576879
-0.985567580901816
-0.354131126475581
0.237446215114298

My question is: "How did all of that data get there?" I tried to find out using ILSpy, but I don't see it. Also, it shows 13 synapses; I would have thought there would only be 9. I am wondering how they were calculated? Any help explaining how this works would be appreciated!

JR
#28
An amazing thing happened when I was testing your program: my PC is extremely slow, so I think it changed the logo of the TextWindow.
Everyone already knows that I have a very slow 2GB RAM laptop with Windows 8.1 - nothing is possible on it.
But I have got one more machine with 4GB RAM and Windows 8.1, so I can enjoy Small Basic pleasures on it, and most of all the pleasures of SharpDevelop.
ZS
#29
JR,

This is the result of the Save command - the current state of the ANN, which initially has default values for everything.  The synapse values are initially set to random values prior to training.

This implementation of ANN adds a Bias node for each layer apart from the output, hence 13 weights output. (2+1)*3 + (3+1)*1.
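That arithmetic can be sketched in a few lines (Python, illustrative only; layer sizes get one bias node added to every layer except the output, as described above):

```python
# Synapse (weight) count for a fully connected net with one bias node
# per non-output layer: sum over layers of (size + 1) * next_size.
def synapse_count(layers):
    return sum((layers[i] + 1) * layers[i + 1] for i in range(len(layers) - 1))

print(synapse_count([2, 3, 1]))  # 13, i.e. (2+1)*3 + (3+1)*1
```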

PS, This is a nice discussion of ANN bias, https://stackoverflow.com/questions/2480...ks#2499936.

To debug an exe with extension, you can use dnSpy.
#30
LitDev,

If I take your program, comment out the following lines, and try to load the neural network that I previously created, I get the error below:

name = "XOrANN"
'NeuralNetwork.New(name,inputNode+","+hiddenNode+","+outputNode)
'NeuralNetwork.BinaryOutput(name,0,"False")
'NeuralNetwork.Save(name,Program.Directory+"\"+name+".txt")
name = NeuralNetwork.Load(Program.Directory+"\"+name+".txt")

Error in Small Basic program
Index was outside the bounds of the array.
at SmallBasicANN.ANN..ctor(String fn)
at SmallBasicANN.NeuralNetwork.Load(Primitive Filename)
at _SmallBasicProgram._Main()

Any idea on what causes this? I would think that I should be able to load a previously created neural file.

