- Neural Network! [Complete/Almost Finished!]
- 26 Mar 2017 06:28:04 pm
- Last edited by iPhoenix on 12 Apr 2017 02:58:09 pm; edited 20 times in total
I made probably the worst decision of my life. I am making a TI-Basic neural network.
Here's what's working:
-Perceptron/single neuron
-Input
-Weights
Here's what's not working:
-Outputs
-More than one neuron (network part of 'neural network')
-My brain
-TI-Basic
I have some code done, but it's really sloppy, and uses a lot of subprograms to keep it organized. (I'm porting from Java, ok?!) In the end, I can just paste the code from the subprograms into the main program (dang, OOP is getting to me).
I will post progress updates here and potentially code snippets, but I'm going to take a small break from it. I spent way too much time getting to this point.
Also, this is definitely the first time anyone has done this, so I'm pretty proud of my code.
I will post code when I can, as the code is in a very volatile state: I'm still working on solving some of its problems and do not have a working version yet.
I expect to be done in a week or two, but I am not sure. [edit: not happening]
Additionally, it is surprisingly fast with one neuron per layer. From experience, though, I expect the time required to grow roughly exponentially with each added layer.
Single Perceptron
Code/Download (version 1; I used a super simple activation function to give me time to re-learn basic calculus.)
[dropbox download here]
Source (optimized, credits down below, may not work 100%):
prgmPERCEPT:
Code:
Ans->|LARGS
"Args arrive in Ans; ARGS(1) picks the mode: 1=init, 2=classify, 3=train
0.1->C
"C is the learning rate
"Mode 1: seed ARGS(2) random weights in (-1,1)
If 1=|LARGS(1
seq(2rand-1,I,1,|LARGS(2->|LWTS
"Mode 2: weighted sum of inputs ARGS(3)..., sign activation, result in theta
If 2=|LARGS(1
Then
DelVar S
For(I,3,dim(|LARGS)-1
S+|LARGS(I)*|LWTS(I-2->S
End
1-2(S<0->theta
End
"Mode 3: one training step; ARGS(2) is the target, ARGS(3)... are the inputs
If 3=|LARGS(1
Then
|LARGS(2->D
dim(|LARGS)-2->dim(|LB
DelVar S
~1+2(5>2sum(seq(|LARGS(I)*|LWTS(I-2),I,3,dim(|LARGS)-1->S
D-Ans->E
"Perceptron rule: each weight moves by C*error*input (deltas saved in LB)
For(I,1,dim(|LWTS
C*E*|LARGS(I+2->|LB(I
|LWTS(I)+C*E*|LARGS(I+2->|LWTS(I
End
{E,S,|LB(1),|LB(2
End
Last updated March 30, 2017
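In case the argument passing isn't obvious: everything goes into prgmPERCEPT through Ans, and the first element picks the mode. Here's a quick (untested) sketch of all three calls:

Code:
"Mode 1: init a perceptron with 2 random weights
{1,2:prgmPERCEPT
"Mode 2: classify the point (30,~40); the result (1 or ~1) lands in theta
{2,30,~40:prgmPERCEPT
"Mode 3: one training step with target 1; Ans gets {error,output,delta1,delta2}
{3,1,30,~40:prgmPERCEPT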
prgmTRAIN:
Code:
Menu("RESUME TRAINING?","YES",02,"NO",03
Lbl 03
"Fresh start: reset the trial/correct counters and init a 2-input perceptron
3->dim(|LZYZYZ
Fill(0,|LZYZYZ
1->|LZYZYZ(1
ClrDraw
{1,2:prgmPERCEPT
Lbl 02
Input "NUMBER OF TRIALS: ",A
|LZYZYZ(2->I%
A->|N
"Each pass: show the counters, train on one random point, plot it small in a
"color by predicted class; misclassified points also get a big black dot
Repeat not(|N
|N-1->|N
Text(0,1,|LZYZYZ(1
Text(12,1,|LZYZYZ(2
Text(24,1,|LZYZYZ(1)-|LZYZYZ(2
"Random point in [-100,100]^2; its class (1 or ~1) is the side of Y=X it's on
randInt(~100,100,2->L1
1-2(L1(1)>L1(2
augment({3,Ans},L1->|LZ
prgmPERCEPT
Ans->L1
If Ans(1
Pt-On(|LZ(3),|LZ(4),Black,3
|LZYZYZ(2)+not(Ans(1->|LZYZYZ(2
Pt-On(|LZ(3),|LZ(4),Black+L1(2),1
1+|LZYZYZ(1->|LZYZYZ(1
End
Last updated April 2, 2017
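To put numbers on one training pass: say randInt( picks the point (30,-40). Since X>Y there, the target is 1-2(1) = -1. If the perceptron currently outputs 1, the error is E = -1-1 = -2, so the weight on X moves by 0.1*-2*30 = -6 and the weight on Y moves by 0.1*-2*-40 = 8. If the point was already classified correctly, E = 0 and the weights don't budge.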
And here's the README file (text, have fun putting this on your calc), for archival purposes (and for help):
Code:
Drag both onto CE/CSE (monochrome may work)
Launch prgmTRAIN
Do not change the name of prgmPERCEPT.
Bugs/Modifications are welcome.
If there is any code that doesn’t run (I’m talking about stuff in prgmPERCEPT), it is there for later versions.
Thanks!
~iPhoenix
Credits for optimizations not included in download, but in source:
mr womp womp
PT_ or P_T (can't tell anymore!)
(potentially you?)
Here's a really good image on perceptrons I found.
It explained a lot (to me, at least, a long time ago).
Here's a video about the speed (and I leaked some stuff)
here!
Here's a great explanation (including pseudocode) of what a neural network is and how to make one.
1-layer (3-neuron) neural network
prgmXOR (solves the classic XOR problem)
It takes literally days to run the 20k iterations needed to be somewhat accurate.
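For reference, the three |LIN lists in the code are just the XOR truth table:

Code:
0 XOR 0 = 0
0 XOR 1 = 1
1 XOR 0 = 1
1 XOR 1 = 0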
Code:
"Change to {0 (instead of {1}) if you do not want to see the nerd stuff :P (may or may not run faster)
{1->|LDEBUG
rand(6->|LN1
rand(3->|LN2
DelVar theta
{0,1,0,1->|LIN1
{0,0,1,1->|LIN2
{0,1,1,0->|LIN3
Repeat 0
theta+1->theta
Disp "---","Trial Num: "+toString(theta
1+remainder(theta-1,4->I
|LIN1(I->|LINPUT(1
|LIN2(I->|LINPUT(2
|LIN3(I->Z
Disp "Expected Output: "+toString(Z
For(A,1,2
Disp "Input "+toString(A)+": "+toString(|LINPUT(A
End
Disp "---",""
DelVar P
For(A,1,3
DelVar S
For(I,0,3,3
S+|LINPUT(I/3+1)*|LN1(A+I->S
End
S->L2(A
1/(1+e^(~S->|LRES(A
End
|LRES*|LN2->|LRES
sum(|LRES->A
1/(1+e^(~A))->A
For(N,1,3
If sum(|LDEBUG:Then
Disp "Synapse "+toString(N
Disp "Expected: "+toString(Z),"Result: "+toString(A),"Error: "+toString(Z-A
End
Z-A->E
nDeriv(1/(1+e^(~B)),B,A)*E->C
If sum(|LDEBUG
Disp "Change: "+toString(C
|LN2(N->L3(N
C+|LN2(N->|LN2(N
If sum(|LDEBUG:Then
Disp "old:"+toString(L3(N
Disp "new:"+toString(|LN2(N
Disp ""
End
End
If sum(|LDEBUG
Disp "","Input-Hidden:"
For(A,1,3
L2(A->O
C*L3(A)*nDeriv(1/(1+e^(~X)),X,O)->L4(A
If sum(|LDEBUG
Disp "Change "+toString(A)+": "+toString(Ans
End
For(A,1,3
|LINPUT(1->L5(A
|LINPUT(2->L5(A+3
End
3->dim(L4
augment(L4,L4->L4
L4*L5->L5
|LN1+L5->|LN1
If sum(|LDEBUG:Then
For(A,1,12
If A<6
Disp "old: "+toString(|LN1(A)-L5(A
If A=6
Disp ""
If A>6
Disp "new: "+toString(|LN1(A-6
End
End
Disp "",""
End
End
^^ The code looks weird here; just paste it into SourceCoder 3 (SC3).
I literally copied and pasted the code, so there may be a few bugs.
I am also not 100% sure the code works (it should); I need a day or two to run the 20 thousand trials.
I also will not (as of right now) be explaining what it writes on the homescreen; it's a huge mess.
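If you want to sanity-check the trained weights without wading through all the debug output, something like this should work (untested, and it assumes |LN1, |LN2, and the lists from prgmXOR are still in RAM). It just runs the forward pass and prints the network's output for all four cases:

Code:
"Hypothetical check: forward pass only, no training
{0,1,0,1->|LIN1
{0,0,1,1->|LIN2
For(K,1,4
|LIN1(K->|LINPUT(1
|LIN2(K->|LINPUT(2
"Hidden layer: sigmoid of each neuron's weighted sum
For(A,1,3
|LINPUT(1)|LN1(A)+|LINPUT(2)|LN1(A+3
1/(1+e^(~Ans->|LRES(A
End
"Output neuron: should end up near LIN3's values after training
1/(1+e^(~sum(|LRES*|LN2
Disp toString(|LIN1(K))+" XOR "+toString(|LIN2(K))+" = "+toString(Ans
End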