Hi,
Your ideas are interesting and you are doing great things working on this.
Mathematically you are iterating (training) to obtain the solution function F(G(X,Y)) = X*Y/10, where G(X,Y) = X + 10Y + 1 and X, Y are integers in the range [0,9]. If you do enough training runs you will cover the space completely and fully describe F(G) (this is what you plot as F vs G).
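To make that concrete, here is a minimal sketch (Python, purely illustrative, not your program) of the full input space and the target:

```python
# All 100 (X, Y) combinations with X, Y integers in [0, 9].
pairs = [(x, y) for x in range(10) for y in range(10)]
G = [x + 10 * y + 1 for x, y in pairs]   # encoded input, runs from 1 to 100
F = [x * y / 10 for x, y in pairs]       # target output F(G(X, Y)) = X*Y/10
# Plotting F against G gives the curve the net is being asked to fit.
```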
In a simple case like this you can do enough training to cover every case and therefore get the answer exactly (fully fit F(G)). This is also why I once described neural nets as fancy curve fitting, and why the training may interpolate well (test data within the training range) but extrapolate poorly (test data outside the training range): train in the [0,9] range but test on the [50,60] range, for example.
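As a rough illustration of the interpolation/extrapolation point, here is a sketch assuming scikit-learn is available; the network size and iteration count are arbitrary choices, not tuned:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Training set: every integer (X, Y) pair in [0, 9], target F = X*Y/10.
X_train = np.array([(x, y) for x in range(10) for y in range(10)], dtype=float)
y_train = X_train[:, 0] * X_train[:, 1] / 10

net = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
net.fit(X_train, y_train)

# Interpolation: a point inside the training range, usually predicted
# reasonably close to the true value of 3.24.
print(net.predict([[4.5, 7.2]]))

# Extrapolation: a point in [50, 60], far outside the training range; the
# prediction typically bears little resemblance to the true value of 319.0.
print(net.predict([[55.0, 58.0]]))
```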
The power of neural nets shows when it is impossible to train against all possible inputs, yet the interpolation still works well.
Imagine using a neural net for face recognition: it is impossible to train against every possible face image, although they do use an enormous number. They also play carefully with the construction of the net, the choice of training set and the preprocessing of the input data, as well as the activation functions, though the activations are usually kept constant and fairly simple (cutoff, sigmoid, tanh). This is analogous to training F(G) without using every possible test point, e.g. train with say 100 random inputs (points_X_max = 100) to see the effects, then consider selecting the 100 randomly or systematically in some way to maximise coverage and smoothness of the input X, Y space (with 100 points you could in fact train with every possible combination).
https://machinelearningmastery.com/choos...-learning/
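On the point about selecting the 100 training points, a small sketch (again illustrative Python; points_X_max is your program's name, everything else here is made up) of random versus systematic coverage of the grid:

```python
import random

all_points = [(x, y) for x in range(10) for y in range(10)]   # the full 10x10 space

# Random selection: 100 draws with replacement will usually repeat some
# (X, Y) combinations and miss others, so coverage of the space is uneven.
random_points = [random.choice(all_points) for _ in range(100)]

# Systematic selection: with 100 points you can train on every possible
# combination exactly once, giving complete and even coverage.
systematic_points = list(all_points)
```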
Also, is there a reason you do not use publish/import for your programs? It's much easier at this end, and casual users would probably be more likely to look at them (one click rather than several to download, cut and paste the program somewhere, then open it).