Neural Network (ANN) Extension
#41
Use this link for downloading: http://litdev.uk/extensions/SmallBasicANN_v1.2.zip
ZS
#42
OK, JR - I can reproduce and think I know the issue!

EDIT

Please try the new version 1.2.0.5, and thanks for the testing.

When I first decompiled it to recompile for SB version 1.2, the decompiled code was a bit difficult to read, so I have refactored it, fixed a couple of bugs, added some bits like the reporting event, and added some intellisense (English only).  If you want, I can share the C# project I now have so you can run it directly from there and see how it works.

I did use dnSpy to find the issue, but I sort of knew what I was looking for.
#43
LitDev,

Great! It works. Yes, I would like to see what you used to figure this out. I tried using dnSpy, but I don't know what I am doing. I think it would be something valuable to learn if you are willing to teach.

Thanks for your help on this. I was getting nowhere.

JR

Z-S,

Thanks for your response. I was able to successfully run your program. I will be looking at it later. Looks like a good program to learn from. 

LitDev was able to reproduce the problem that I was having. He updated the DLL (1.2.0.5). He also fixed some other issues that he found while he was at it.

JR

LitDev,

With the new .dll (1.2.0.5) the output doesn't look right. Here is what I am getting. It looks like the output has changed to 100 or 0.

JR

Training started
epoch=100000
Trained=100000
BinaryOutput=True
Epoch=100000
LearningRate=0.7
Momentum=0.3
SigmoidResponse=1
ErrorRequired=0.0001
Mean(48,79) = 63.5 (100)
Mean(1,14) = 7.5 (0)
Mean(31,39) = 35 (0)
Mean(85,36) = 60.5 (100)
Mean(20,70) = 45 (0)
Mean(75,21) = 48 (0)
Mean(34,23) = 28.5 (0)
Mean(25,26) = 25.5 (0)
Mean(49,81) = 65 (100)
Mean(13,44) = 28.5 (0)
Press any key to continue...
#44
I put the source code on git, https://github.com/litdev1/SmallBasicANN.  I recommend getting Visual Studio 2022 Community edition (all free) and cloning this repo from git.  If you get that going so you can compile it, we can then see what the next steps may be.  We could add a small project (exe) to use it directly in C#, for example (it's a DLL, so it needs a calling program).
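Just to give a flavour of what that calling program could look like, here is a minimal C# console sketch. It only uses the NeuralNetwork calls that appear in the example program in this thread (New, BinaryOutput, Train, Save, Use); the extension's namespace, the file paths and the exact Primitive-based signatures are assumptions here, so check them against the repo.

Code:
// Minimal sketch only - not the actual test project from the repo.
// Assumes references to SmallBasicLibrary.dll and the SmallBasicANN DLL,
// plus a using directive for the extension's namespace (check the repo).
using System;
using Microsoft.SmallBasic.Library;

class AnnDemo
{
    static void Main()
    {
        Primitive name = "Add";
        Primitive savePath = @"C:\temp\Add_ANN.txt";     // assumed path
        Primitive trainingData = @"C:\temp\Add.txt";     // assumed path

        NeuralNetwork.New(name, "2,3,1");                // 2 input, 3 hidden, 1 output nodes
        NeuralNetwork.BinaryOutput(name, 0, false);      // same call as in the SB program
        Primitive epochs = NeuralNetwork.Train(name, trainingData, false);
        Console.WriteLine($"Trained in {epochs} epochs");

        NeuralNetwork.Save(name, savePath);              // persist the trained weights

        Primitive result = NeuralNetwork.Use(name, "0.48,0.79");
        Console.WriteLine($"Mean estimate = {result}");  // roughly 0.635 if trained well
    }
}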

To use dnSpy:

1] Compile the program you want to debug in Small Basic.
2] Start dnSpy
3] Close any previous projects (File->Close All)
4] Open the compiled Small Basic program exe (File->Open...)
5] Open the tabs in Assembly Explorer (View->Assembly Explorer) until you get _SmallBasicProgram and select that
6] Put a break point in this code in the code window (click left of the line number, or F9)
7] Press Start (then OK)
8] Then step into the extension code as required, using break points, looking at variables etc.

#45
Can it be used in SharpDevelop?

I will install dnSpy today.
ZS
#46
JR, on the error you are seeing I need to be super clear about what you are doing. My guess is that you didn't save the results after training, but I could be wrong - maybe a good test for using dnSpy.

I have created a simplified case, FTWL857.000: set train="True" to train, or train = "" to use the previously trained data.
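The core of that pattern looks roughly like the sketch below (C# again, with the same assumptions as the sketch in post #44; in the Small Basic program the If/Else structure is the same):

Code:
// Sketch of the train-once / reuse pattern (same assumptions as the earlier sketch).
using System;
using Microsoft.SmallBasic.Library;

class TrainOrLoad
{
    static void Main()
    {
        bool train = true;                            // set to false to reuse saved weights
        Primitive savePath = @"C:\temp\Add_ANN.txt";  // assumed path for the saved network
        Primitive name;

        if (train)
        {
            name = "Add";
            NeuralNetwork.New(name, "2,3,1");
            NeuralNetwork.Train(name, @"C:\temp\Add.txt", false);
            NeuralNetwork.Save(name, savePath);       // without this, nothing persists between runs
        }
        else
        {
            name = NeuralNetwork.Load(savePath);      // Load returns the network name
        }

        Primitive result = NeuralNetwork.Use(name, "0.48,0.79");
        Console.WriteLine($"Mean estimate = {result}");
    }
}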
#47
ZS, I would think so - SharpDevelop is pretty good. I checked out the original developers and I actually know one of them.  While SharpDevelop is no longer active for the reasons they state, much of it lives on in the Avalon projects, like the docking/windowing used in SB-Prime (AvalonDock) and the cross-platform windowing framework Avalonia.

PS, you don't need to ask me if it works, have a go and see!
#48
LitDev,

I used your new program and it worked OK. Then I went back to your old program, set it to the original, and it worked OK. Then I commented out the 3 lines, removed the ' from the load line, and it worked OK. I liked your new program setup - easier to use. So it looks like everything is working now.

Thanks,

JR

LitDev,

With the programs running successfully, shouldn't I be able to graduate them to Visual Basic? I tried your original program in SB-Prime and it says "Flow chart failed probably because of a code error. Check program compiles." Which it just did!

Your thoughts?

JR
#49
I also tried to export first using standard SB:
It showed a message saying Visual Basic is needed on the machine??
Then I tried using SB-Prime:

[screenshot attached]

I clicked Continue and got this:

[screenshot attached]

Then I visited the folder which I chose and found two files there:
1) Untitled.vbproj
2) UntitledModule.vb
The contents of both files follow:

Untitled.vbproj Content:
Code:
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="3.5" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
    <Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
    <ProjectGuid>$$(PROJECT_GUID)</ProjectGuid>
    <OutputType>WinExe</OutputType>
    <StartupObject>Untitled.UntitledModule</StartupObject>
    <AssemblyName>Untitled</AssemblyName>
    <RootNamespace>$(AssemblyName)</RootNamespace>
    <FileAlignment>512</FileAlignment>
    <MyType>WindowsFormsWithCustomSubMain</MyType>
    <TargetFrameworkVersion>v3.5</TargetFrameworkVersion>
    <OptionExplicit>On</OptionExplicit>
    <OptionCompare>Binary</OptionCompare>
    <OptionStrict>Off</OptionStrict>
    <OptionInfer>On</OptionInfer>
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
    <DebugSymbols>true</DebugSymbols>
    <DebugType>full</DebugType>
    <DefineDebug>true</DefineDebug>
    <DefineTrace>true</DefineTrace>
    <OutputPath>bin\Debug\</OutputPath>
    <NoWarn>42016,41999,42017,42018,42019,42032,42036,42020,42021,42022</NoWarn>
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
    <DebugType>pdbonly</DebugType>
    <DefineDebug>false</DefineDebug>
    <DefineTrace>true</DefineTrace>
    <Optimize>true</Optimize>
    <OutputPath>bin\Release\</OutputPath>
    <NoWarn>42016,41999,42017,42018,42019,42032,42036,42020,42021,42022</NoWarn>
  </PropertyGroup>
  <ItemGroup>
    <Reference Include="SmallBasicLibrary">
      <SpecificVersion>False</SpecificVersion>
      <HintPath>$(programfiles)\\Microsoft\Small Basic\SmallBasicLibrary.dll</HintPath>
    </Reference>
    <Reference Include="System" />
    <Reference Include="System.Drawing" />
    <Reference Include="System.Windows.Forms" />
    <Reference Include="System.Xml" />
    <Reference Include="System.Core" />
    <Reference Include="System.Xml.Linq" />
  </ItemGroup>
  <ItemGroup>
    <Import Include="Microsoft.VisualBasic" />
    <Import Include="System" />
    <Import Include="System.Diagnostics" />
    <Import Include="System.Linq" />
    <Import Include="System.Xml.Linq" />
    <Import Include="Microsoft.SmallBasic.Library" />
  </ItemGroup>
  <ItemGroup>
    <Compile Include="UntitledModule.vb" />
  </ItemGroup>
  <Import Project="$(MSBuildToolsPath)\Microsoft.VisualBasic.targets" />
</Project>

UntitledModule.vb Content:
Code:
Module UntitledModule
    Dim scale, train, path, name, inputNode, hiddenNode, outputNode, trainingData, data, i, A, B, C, epoch, input, output As Primitive
    Sub Main()
        scale = 100
        train = true
        path = Program.Directory + "\Add_ANN.txt"

        If train Then
            name = "Add"
            inputNode = 2
            hiddenNode = 3
            outputNode = 1
            trainingData = Program.Directory + "\" + name + ".txt"
            NeuralNetwork.New(name, inputNode + "," + hiddenNode + "," + outputNode)
            NeuralNetwork.BinaryOutput(name, 0, false)
            data = "X"
            For i = 1 To 1000
                A = Microsoft.SmallBasic.Library.Math.GetRandomNumber(scale)
                B = Microsoft.SmallBasic.Library.Math.GetRandomNumber(scale)
                C = (A + B) / 2
                data = data + (A / scale) + LDText.LF + (B / scale) + LDText.LF + (C / scale) + LDText.LF
            Next
            data = Text.GetSubTextToEnd(data, 2)
            ' The following line could be harmful and has been automatically commented.
            '  File.WriteContents(trainingData,data)
            TextWindow.WriteLine("Training started")
            epoch = NeuralNetwork.Train(name, trainingData, false)
            TextWindow.WriteLine("epoch=" + epoch)
            TextWindow.WriteLine("Trained=" + NeuralNetwork.Trained(name))
            TextWindow.WriteLine("BinaryOutput=" + NeuralNetwork.BinaryOutput(name, 0, true))
            TextWindow.WriteLine("Epoch=" + NeuralNetwork.Epochs(name, 0, true))
            TextWindow.WriteLine("LearningRate=" + NeuralNetwork.LearningRate(name, 0, true))
            TextWindow.WriteLine("Momentum=" + NeuralNetwork.Momentum(name, 0, true))
            TextWindow.WriteLine("SigmoidResponse=" + NeuralNetwork.SigmoidResponse(name, 0, true))
            TextWindow.WriteLine("ErrorRequired=" + NeuralNetwork.ErrorRequired(name, 0, true))

            NeuralNetwork.Save(name, path)
        Else
            name = NeuralNetwork.Load(path)
        End If

        For i = 1 To 10
            A = Microsoft.SmallBasic.Library.Math.GetRandomNumber(scale)
            B = Microsoft.SmallBasic.Library.Math.GetRandomNumber(scale)
            C = (A + B) / 2
            input = (A / scale) + "," + (B / scale)
            output = NeuralNetwork.Use(name, input)
            TextWindow.WriteLine("Mean(" + A + "," + B + ") = " + C + " (" + (output * scale) + ")")
        Next
    End Sub
End Module

I have no knowledge about VB, maybe you can help.
ZS
#50
JR,

Original program ZWWL934.000.  I import this, uncomment File commands, save and run.  Then use Tools->FlowChart in SB-Prime - no issue for me, so I guess the issue is with Graduate?

Graduate (to VB.Net) or Decompile in SB-Prime (to C#) should work. The Graduate tries to use the original SB code and has a number of 'fixes on top of fixes' since it is quite out of date for recent versions of Visual Studio. It is pretty flaky, so I recommend using the Decompile option, which is similar to what you get with ILSpy.  Maybe I should get rid of the Graduate option.

You will probably have to target .Net 4.8 rather than 4.5 when using Decompile (I will fix this bit, but the Graduate is probably too far gone!).

I have also updated the git repo to include a simple test program to go with the DLL, which you can compile and debug through the original ANN source.  Using reflection to decompile, you won't get as pleasant an experience - hence why I refactored it.


If you have access to the original source code and project, always use it. However, I looked at the Graduate issue. The error about FlowChart was spurious - it should have been an error that the selected folder is not empty. I also fixed another Graduate error and uploaded an updated SB-Prime - but it is still best to use the original source if available. I still had some issues compiling the VB from Graduation (you need to add the LitDev and SmallBasicANN references and Imports), and New is a reserved VB keyword, so NeuralNetwork.New(name, layers) fails.