erm, can you expand the snips please?
Pieman7373 wrote:
erm, can you expand the snips please?


Basically, each subprogram is like a class: the first part of the input is a function call, and the rest are arguments.

Yes.

Code:
<s  n  i  p>


One could say I'm making Ti-BASIC OOP again! (and OOP is great!)
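
To make that concrete (illustrative values only; this just mirrors how prgmTRAIN ends up calling prgmPERCEPT further down the thread), creating a perceptron with two random weights and then training it on one sample would look something like:

Code:
{1,2:prgmPERCEPT
{3,1,X,Y:prgmPERCEPT

The first list says "call function 1 (initialize) with 2 weights"; the second says "call function 3 (train) with expected output 1 and inputs X and Y".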

If this is what I have to deal with: no code Evil or Very Mad

Quote:
10:30:57 PM ***prgmTrouble got back from picketing
10:30:32 PM [Cemetech] prgmTrouble entered the room
10:30:14 PM ***prgmTrouble pickets outside _iPhoenix_'s residence
10:29:01 PM [prgmTrouble] It's a conspiracy to deprive us of the code
10:24:29 PM [mr womp womp] _iPhoenix_ we demand the code
10:23:11 PM [Cemetech] calcnerd_CEP_D entered the room
10:22:45 PM ***Battlesquid facepalms
10:22:26 PM [mr womp womp] CODE ME DADDY
10:21:46 PM [oldmud0] CODE PLS
10:21:02 PM [Cemetech] Luxen entered the room
10:15:32 PM [Cemetech] _iPhoenix_ edited a post in [Neural Network!]
Evil or Very Mad
*bump* Code released on Wednesday, March 29, 2017

See first post.
Here, have some optimizations.

Code:
1
If 2S<5
~1
Ans->theta

Could be

Code:
1-2(2S<5->theta
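
(In case that first one looks like magic: a comparison in TI-BASIC returns 1 when true and 0 when false, so 1-2(condition) turns that into ~1 or 1, which is exactly what the original If/Ans block stored into theta:)

Code:
1-2(2S<5
  = 1-2(1) = ~1   when 2S<5 is true
  = 1-2(0) = 1    when 2S<5 is false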


Code:
{randInt(~100,100),randInt(~100,100->|LZYZYZ

Could be

Code:
randInt(~100,100,2->|LZYZYZ


Code:
1
If (2|LZYZYZ(1)-5)>|LZYZYZ(2
~1
augment({3,Ans},|LZYZYZ->|LZYZYZ

Could be

Code:
augment({3,1-2((2Ans(1)-5)>Ans(2))},Ans->|LZYZYZ

Didn't test this, but I think this should work...

Code:
DelVar S
For(I,3,dim(|LARGS)-1
   S+|LARGS(I)*|LWTS(I-2->S
End

Could be

Code:
sum(seq(|LARGS(I)*|LWTS(I-2),I,3,dim(|LARGS)-1->S


Code:
If 1=|LARGS(1
Then
   |LARGS(2->dim(|LWTS
   For(I,1,Ans
      2rand-1->|LWTS(I
   End
End

Could be

Code:
If 1=|LARGS(1
seq(2rand-1,I,1,|LARGS(2->|LWTS
mr womp womp wrote:
Here, have some optimizations.
<snip>


Much thanks!

It already ran fast, but now it's much better. I'll add those revisions when I am not on mobile.

Thanks!

Here's the code with Womp's optimizations:

prgmPERCEPT:

Code:
Ans->|LARGS

0.1->C
If 1=|LARGS(1
seq(2rand-1,I,1,|LARGS(2->|LWTS

If 2=|LARGS(1
Then
   DelVar S
   For(I,3,dim(|LARGS)-1
      S+|LARGS(I)*|LWTS(I-2->S
   End
   1-2(2S<5->theta
End

If 3=|LARGS(1
Then
   |LARGS(2->D
   dim(|LARGS)-2->dim(|LA
   DelVar S
   sum(seq(|LARGS(I)*|LWTS(I-2),I,3,dim(|LARGS)-1->S
   1-2(2S<5
   Ans->G:Ans->S
   D-G->E
   For(I,1,dim(|LWTS
      C*E*|LARGS(I+2->|LB(I
      |LWTS(I)+C*E*|LARGS(I+2->|LWTS(I
   End
   {E,S,|LB(1),|LB(2
End
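
(For anyone following the math: the For( loop in mode 3 is the standard perceptron learning rule, with learning rate C=0.1 and error E = expected minus predicted. In plainer notation:)

Code:
weight(I) <- weight(I) + C*E*input(I)

|LB just keeps a copy of each individual adjustment C*E*input(I) so the deltas can be returned at the end.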

prgmTRAIN:

Code:
ClrDraw
{1,2:prgmPERCEPT
Input "NUMBER OF TRIALS: ",A
A->|N
Ans->PV
0->I%
1
Repeat not(|N
   1->dim(L1
   randInt(~100,100,2->|LZYZYZ
   augment({3,1-2((2Ans(1)-5)>Ans(2))},Ans->|LZYZYZ
   prgmPERCEPT
   Ans->L1
   Pt-On(|LZYZYZ(3),|LZYZYZ(4),Black+L1(2
   
   Disp Ans,|LZYZYZ
   "--------------------------
   If not(L1(1
   Then
      I%+1->I%
      "++++++++++++++++++++++++++
   End
   Disp Ans
   |N-1->|N
End
Disp I%n/dPV
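
(If you're wondering what it's actually learning: the augment( line builds each training target from the random point, so the perceptron is being taught to classify points against the line Y=2X-5, roughly:)

Code:
target =  1   if Y >= 2X-5   (point on or above the line)
target = ~1   if Y <  2X-5   (point below the line)

And, if I'm reading the counter right, the I%/PV displayed at the end is the fraction of points it classified correctly.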


And here's the README file (text, have fun putting this on your calc), for archival purposes (and for help):

Code:
Drag both onto CE/CSE (monochrome may work)

Launch prgmTRAIN

Do not change the name of prgmPERCEPT.

Bugs/Modifications are welcome.

If there is any code that doesn’t run (I’m talking about stuff in prgmPERCEPT), it is there for later versions.

Thanks!

~iPhoenix
Quote:
Bugs<snip>are welcome.

What a generous policy!
Most developers I know try to eliminate them immediately.

I'm not too experienced with using neural networks, but this still looks like an interesting project.
Have neural networks been created with ez80 assembly before? Obviously, the program will be very slow once multiple neurons are introduced, and I wonder what the "maximum" speed on a calculator would be using assembly.
It would, but assembly is hard and causes a crash when a mistake happens. Basic is used here mostly for conceptual purposes.
prgmTrouble wrote:
It would, but assembly is hard and causes a crash when a mistake happens.

Only if you don't know what you are doing...
MateoConLechuga wrote:
prgmTrouble wrote:
It would, but assembly is hard and causes a crash when a mistake happens.

Only if you don't know what you are doing...

So pretty much everyone except you and a handful of others. Evil or Very Mad
commandblockguy wrote:
I'm not too experienced with using neural networks, but this still looks like an interesting project.
Have neural networks been created with ez80 assembly before? Obviously, the program will be very slow once multiple neurons are introduced, and I wonder what the "maximum" speed on a calculator would be using assembly.


Part 1: Thanks!
Part 2a: I don't know assembly, and that'd take away the fun! (This would be easily accomplished in C, IIRC, but non-OOP is more challenging.)
Part 2b: It's actually not going to be that slow. Here's my (simplified) logic:

We only need to use 2 programs: one that has the perceptrons, and one that organizes the network (and one that trains, but whatever).

We can have different weights/inputs by specifying different parts of the lists. I could modify my "syntax" ("making TI-BASIC OOP again, and OOP is great!"), maybe by making it so that you specify a single perceptron in the network. Currently, the syntax for training a perceptron is:

Code:
{3,[expected result],[input1],[input2]...[input n]

but this can easily be turned into

Code:
[number of inputs per perceptron]->n
{3,n,[expected result1],[perceptron number],[input1],[input2]...[input n],[expected result2],[perceptron number],[input1],[input2]...[input n]
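
For example, with two inputs per perceptron, one packet that trains perceptron 1 and perceptron 2 on one sample each might look like this (made-up numbers, and purely hypothetical until the new syntax is actually implemented):

Code:
2->n
{3,n,1,1,30,~20,~1,2,15,5

(read as: expect 1 from perceptron 1 on inputs 30 and ~20, then expect ~1 from perceptron 2 on inputs 15 and 5)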


I should probably get started, eh?
When will you use ICE, for heaven's sake? Speed is key for all of these projects!
CalcMeister wrote:
When will you use ICE, for heaven's sake? Speed is key for all of these projects!


Yes, I have started learning ICE; yes, speed is key; no, there aren't any decimals in ICE.

That last part is killing it for me Sad

<edit>
Also, code for a rudimentary ANN is working, but freaking back-propagation.

Crud!

prgmTrouble wrote:
Basic is used here mostly for conceptual purposes.


You just posted that in the TI-BASIC forum...

Code:
   For(I,3,dim(|LARGS)-1
      S+|LARGS(I)*|LWTS(I-2->S
   End

can be

Code:
sum(seq(|LARGS(I)*|LWTS(I-2),I,3,dim(|LARGS)-1->S
as mr womp womp already said.
That, in combination with

Code:
   1-2(2S<5
   Ans->G:Ans->S
   D-G->E

can be

Code:
~1+2(5>2sum(seq(...->S
Ans->G


You can also remove G, and use D-Ans->E instead.

EDIT:

Code:
   If not(L1(1
   Then
      If not(L1(1


PT_ wrote:

<snip>
You can also remove G, and use D-Ans->E instead.


Thanks!

[edit]

No idea how THAT managed to get in the code.

I am currently unable to post the new version of the code (#mobile), but it does away with that section and improves it.

Also, backpropagation hype!!
I would like to say that this project is not dead and I am still trying to get my algorithm to work.

I'd say I'm 70% done, but my calculator keeps becoming sentient. (jk about the last part)
Quote:
Calc: Wassup!!!
IPhoenix: Shock
Calc: Hellooooo? I'm talking to you!!!!
IPhoenix: Shock Uhhhhhh.... Hi?
Calc: Good. Human life is sentient. It would be a shame to destroy a planet filled with dum-dums.
*Randomguest smashes calc*
Randomguest: Wink You're welcome.
IPhoenix: Neutral
*two minutes later*
IPhoenix: Mad ALL.....MY....WORK....RUINED
Randomguest: Crying Hey, I was just trying to help! Let go of me!
*Randomguest gets thrown out a window*

Wink Wink Wink Wink Wink Wink Wink
Nice.


I back up my calc twice daily (#lessonlearned): once to my computer, then to my friend's computer (one backup is NEVER enough).


Most people at my school who know me know I back it up to two locations, but the only one they know about is my computer, so it's a common joke that the other backup is my washing machine.

Stupid idiots, I'm not that smart.
*bump*

I did it. I made it a neural network!

Although it only has one layer, it should be able to solve the XOR problem easily (in a day or two). I recommend running the code in CEmu, with the throttle set to 500%, and perhaps removing some of the
Code:
Disp
tokens, to speed it up.

Have fun!
  