Small Basic Forum
Neural Network (ANN) Extension - Printable Version



RE: Neural Network (ANN) Extension - AbsoluteBeginner - 07-19-2024

(07-19-2024, 10:46 AM)litdev Wrote: AB,

My understanding is that training is usually done with data in a random order, but good for you to try different approaches.

Also, it is easier to train HSL->RGB rather than RGB->HSL.  This is because Hue 0 is the same as 360, and 1 is very close to 359, etc.  I had better results training RGB to Cos(H), Sin(H), S, L (all of course normalised appropriately between 0 and 1).

These are interesting tips.  Smile

Thank you.
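
If I understand the Cos(H)/Sin(H) tip correctly, in plain Small Basic the encoding would look something like this rough sketch (the values and variable names are just for illustration):

Code:
' Encode a Hue angle (0-360 degrees) as two values in [0,1],
' so that H = 0 and H = 360 map to the same point.
H = 250                              ' example Hue in degrees
S = 0.4                              ' Saturation, already in [0,1]
L = 0.7                              ' Lightness, already in [0,1]

radians = H * Math.Pi / 180          ' Small Basic trig works in radians
cosH = (Math.Cos(radians) + 1) / 2   ' rescale from [-1,1] to [0,1]
sinH = (Math.Sin(radians) + 1) / 2   ' rescale from [-1,1] to [0,1]

' The four network values would then be cosH, sinH, S, L.
TextWindow.WriteLine("Cos(H)=" + cosH + "  Sin(H)=" + sinH + "  S=" + S + "  L=" + L)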


RE: Neural Network (ANN) Extension - AbsoluteBeginner - 07-19-2024

At this point, the main question for me is: which path should I choose to study?

The first way is to try to create one large neural network that solves the entire problem.

The second way is to try to divide the large task into several small, simple tasks and then create small, simple neural networks for those tasks.
When all the small neural networks work well, I will then try to create a main neural network whose inputs receive data from the outputs of the small neural networks.

I wonder if one of these ways of solving problems will be significantly better than the other.  Rolleyes


RE: Neural Network (ANN) Extension - litdev - 07-19-2024

If you are talking about the HSL/RGB, then 1 ANN is certainly best - do you have working code for this?

Generally, I would advise using a single ANN when the data is correlated (i.e. has some interdependence).  The point of an ANN is to find relationships between input and output, so if there is a relationship between the data (interdependence), then it should all be included in the same ANN.

How to set up the input/output as numbers in the [0,1] range may need some thought (often scaling with other functions first - the objective is to have values that are roughly evenly distributed between 0 and 1).  If 90% of your input values are between 0.95 and 1, for example, then they should be rescaled somehow.  Careful choice of these parameters will be very important.  Basically you want them to be as independent as possible (use Red, Green, Blue, not Red, Yellow and Magenta), and evenly scaled between 0 and 1.  Also, as I mentioned, a cyclic variable like Hue is better represented by 2 values, Cos(H) and Sin(H).

You are trying to give the ANN the best chance to 'see' relationships by forming the system with your best understanding of its characteristics (mathematically: linearise, normalise and orthogonalise the problem as much as you can), then use the smallest number of hidden layers and nodes that works, and train for as few epochs as possible to avoid over-training.
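
For example, one crude fix for values squashed into a narrow band is a simple min-max rescale - a rough sketch in plain Small Basic with made-up data:

Code:
' Hypothetical raw input values, all squashed between 0.95 and 1.
raw[1] = 0.952
raw[2] = 0.967
raw[3] = 0.981
raw[4] = 0.990
raw[5] = 0.999
n = Array.GetItemCount(raw)

' Find the minimum and maximum of the raw values.
minV = raw[1]
maxV = raw[1]
For i = 2 To n
  minV = Math.Min(minV, raw[i])
  maxV = Math.Max(maxV, raw[i])
EndFor

' Min-max rescale so the values spread across the full [0,1] range.
For i = 1 To n
  scaled[i] = (raw[i] - minV) / (maxV - minV)
  TextWindow.WriteLine(raw[i] + " -> " + scaled[i])
EndFor

This is just the simplest option; any transform that spreads the numbers the ANN actually sees roughly evenly over [0,1] does the job.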

The choice of hidden layers (number of layers and nodes in them), other parameters in the ANN (generally use the defaults), being sure not to 'over-train', etc. - these are all a bit of an art that comes with experience, which is why I suggested working on the RGB/HSL problem before trying to do it for your rabbit AI program.


RE: Neural Network (ANN) Extension - z-s - 07-19-2024

I don't know, but if we can directly convert RGB to HSL and vice versa, then why do you need an ANN?


RE: Neural Network (ANN) Extension - AbsoluteBeginner - 07-19-2024

(07-19-2024, 03:57 PM)z-s Wrote: I don't know, but if we can directly convert RGB to HSL and vice versa, then why do you need an ANN?

Hello, Z-S.  Shy
We're just practising. We are studying ANNs using this problem.
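
Yes, a direct formula does exist - in fact it is handy for generating the training and checking data. For reference, here is a rough sketch of the standard textbook HSL -> RGB conversion in plain Small Basic (the names and example values are just illustrative):

Code:
' The standard textbook HSL -> RGB conversion, written as a Small Basic
' subroutine.  Inputs: H in degrees (0-360), S and L in [0,1].
' Outputs: R, G, B in 0-255.  Example values below are arbitrary.
H = 210
S = 0.5
L = 0.6
HSLToRGB()
TextWindow.WriteLine("R=" + R + " G=" + G + " B=" + B)

Sub HSLToRGB
  C = (1 - Math.Abs(2 * L - 1)) * S    ' chroma
  Hp = H / 60                          ' which 60-degree sector the Hue is in
  hmod = Hp - 2 * Math.Floor(Hp / 2)   ' Hp modulo 2
  X = C * (1 - Math.Abs(hmod - 1))
  m = L - C / 2
  If Hp < 1 Then
    r1 = C
    g1 = X
    b1 = 0
  ElseIf Hp < 2 Then
    r1 = X
    g1 = C
    b1 = 0
  ElseIf Hp < 3 Then
    r1 = 0
    g1 = C
    b1 = X
  ElseIf Hp < 4 Then
    r1 = 0
    g1 = X
    b1 = C
  ElseIf Hp < 5 Then
    r1 = X
    g1 = 0
    b1 = C
  Else
    r1 = C
    g1 = 0
    b1 = X
  EndIf
  R = Math.Round((r1 + m) * 255)
  G = Math.Round((g1 + m) * 255)
  B = Math.Round((b1 + m) * 255)
EndSub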


RE: Neural Network (ANN) Extension - AbsoluteBeginner - 07-19-2024

(07-19-2024, 12:31 PM)litdev Wrote: If you are talking about the HSL/RGB, then 1 ANN is certainly best - do you have working code for this?
...

Yes, I've already written some code, but I don't think it's finished yet.

The topic of how to select the structure of a neural network interests me regardless of the specific task - it interests me in principle.

I really liked your tip "(mathematically linearise the problem as much as you can)".
I realized what I should do to help the neural network work well (of course, that does not mean I already know everything about neural networks  Smile ).

After the first attempt to train my neural network on data arranged in ascending order in a file, I received all zeros at the neural network outputs.
I think this means that the neural network failed to learn.
Now I want to CHANGE the SEQUENCE of the same data in the file. ( only the SEQUENCE - I will not change the data itself )

It will be interesting to see what the result will be.  Rolleyes
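
One simple way to reorder the rows without touching the data itself is to read them all in, shuffle the order, and write them back out - a rough sketch in Small Basic (the file paths and row count are made up):

Code:
' Reorder the rows of the training-data file without changing the rows
' themselves (a Fisher-Yates shuffle).  Paths and row count are examples.
inFile = "C:\ANN Tests\training.txt"
outFile = "C:\ANN Tests\training_shuffled.txt"
rows = 100                               ' number of data rows in the file

' Read every row into an array.
For i = 1 To rows
  line[i] = File.ReadLine(inFile, i)
EndFor

' Shuffle: swap each row with a randomly chosen row at or before it.
For i = rows To 2 Step -1
  j = Math.GetRandomNumber(i)            ' random integer from 1 to i
  temp = line[i]
  line[i] = line[j]
  line[j] = temp
EndFor

' Write the rows back out in the new order.
For i = 1 To rows
  File.WriteLine(outFile, i, line[i])
EndFor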


RE: Neural Network (ANN) Extension - litdev - 07-19-2024

All zeros in the output makes me think something else is wrong - maybe the binary output flag is set, or something is wrong with the input data format?


RE: Neural Network (ANN) Extension - AbsoluteBeginner - 07-19-2024

(07-19-2024, 05:18 PM)litdev Wrote: All zeros in the output makes me think something else is wrong - maybe the binary output flag is set, or something is wrong with the input data format?

I'll definitely check everything.

Additional Information:
I wrote code and created a training data file using a tablet running Windows 10.
I conducted the first training tests on a tablet, setting the “Number of Epochs” parameter to 9000.
The tests ran without any glitches. There were no zeros in the output. But out of ten lines of control results, several lines contained numbers with huge values (in the billions).

Then I decided to transfer testing to a desktop computer.
This computer has Windows 11 installed.
I copied the entire "ANN Tests" folder from the tablet to the desktop computer. This folder already contained the compiled executable, source code, and training data file.

When I tried to run the executable file, I received a message that the training data was in the wrong format.
I first recompiled the source code using the desktop computer, but the message about the data being in an invalid format appeared again.
Then I created a new file with exactly the same training data using the desktop computer.
After this, the neural network training program began to work.
But after 100,000 epochs of training, I received all zeros at the outputs of the neural network.

What an amazing story.  Smile


RE: Neural Network (ANN) Extension - litdev - 07-19-2024

Each epoch just runs through the training data again - all it does is 'have another go' at fine-tuning the ANN.  You should get reasonable results with 10 to 100 epochs max; more than that will not improve anything.  I have around 10000 training points - all randomly generated - and they are the most important factor for the training.  Repeating the training with the same set 1000s of times isn't as good as having more 'different' training points.  The purpose of epochs is to get the most out of the data you have and to reduce any biases due to the order of the data - to prevent early training from dominating in some way.

If you are getting output that is not numbers between 0 and 1, there is a problem - make sure all your training data, input and output, is scaled between 0 and 1.
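
One quick way to check this is to scan the training file and flag anything outside the range - a rough sketch in plain Small Basic (the file path, comma separator and row count are just assumptions about your data file):

Code:
' Scan a training-data file and report any value outside the [0,1] range.
' The file path, comma separator and row count are only assumptions.
dataFile = "C:\ANN Tests\training.txt"
rows = 100
bad = 0

For i = 1 To rows
  row = File.ReadLine(dataFile, i) + ","   ' trailing comma simplifies the parsing
  While Text.GetLength(row) > 0
    pos = Text.GetIndexOf(row, ",")
    value = Text.GetSubText(row, 1, pos - 1)
    row = Text.GetSubTextToEnd(row, pos + 1)
    If value < 0 Or value > 1 Then
      bad = bad + 1
      TextWindow.WriteLine("Row " + i + " has an out-of-range value: " + value)
    EndIf
  EndWhile
EndFor
TextWindow.WriteLine(bad + " out-of-range values found")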

I get good answers training on a decent laptop in a few seconds, so the training shouldn't take too long or need fancy hardware for this task.


RE: Neural Network (ANN) Extension - AbsoluteBeginner - 07-19-2024

Thank you.  Shy

I understand everything now.
Tomorrow I will continue my fun with the neural network.  Rolleyes