Implementing a Neural Network (BPN) on a Prop — Parallax Forums

Implementing a Neural Network (BPN) on a Prop

MightorMightor Posts: 338
edited 2008-02-12 19:30 in Propeller 1
Hey peeps,

I was wondering if anyone had ever tried to implement a Back Propagation Network (BPN) in SPIN (or ASM, if you're really hardcore). As far as I can tell, it will require a lot of floating point operations, as most of the weights and error calculations involve numbers much smaller than 1. I will also need to use 2-dimensional arrays. I've tried implementing a Spiking Neural Network in PyRo (a Python-based virtual robotics environment), but the end result was a robot with less intelligence than a squished cockroach.

What I'd like to do is implement a robot that can learn to navigate around a room without running into things. I know these things can be implemented with standard programming methods, but I am trying to create an army of semi-intelligent minions to take over the world. I already have the secret dormant volcano base and a white cat.

All silliness aside, if someone has already made a start or has already implemented a BPN, I'd be very grateful if they could share their code or experiences trying to implement such a beast.

Thanks,
Mightor

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
| To know recursion, you must first know recursion.
| I reject your reality and substitute my own!
| - Adam Savage

Comments

  • RaymanRayman Posts: 14,162
    edited 2008-02-12 02:17
    I would lean toward a "fuzzy logic" approach toward a robot-steering app... It's easier to implement and would probably work better...
  • Shane De CataniaShane De Catania Posts: 67
    edited 2008-02-12 12:23
    G'Day Mightor,
    I like the sound of where you are going with this (the neural network and taking over the world!)... Definitely keep us informed of your progress. There was a post ages ago from Marcus Ekerhult where he submitted his character recognition program, which used a neural network. As usual, searching the forum for that post yielded no relevant results, but I do still have a copy of the code - see attachment. Hopefully that will provide you with some inspiration at least. Maybe someone else can provide a link to the original post.
    Cheers,
    Shane.
  • crgwbrcrgwbr Posts: 614
    edited 2008-02-12 17:46
    A couple months ago there was a series of articles in Servo magazine about implementing a neural network on a PIC. If they were able to accomplish that on a (lowly) PIC, I'd say that it's definitely possible on the Prop.

    Send me a PM if you'd like a copy of the article(s).

    Regards,
    Craig

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    My system: 1.6 GHz AMD Turion64 X2, 4GB DDR2, 256MB ATI Radeon Graphics card, 15.4" Widescreen HD Screen

    I have a dual boot of Ubuntu Linux and Windows Vista. Vista, because it came with the PC, Ubuntu because I like software that works.

    "Failure is not an option -- it comes bundled with Windows."

    Use The Best...
    Linux for Servers
    Mac for Graphics
    Palm for Mobility
    Windows for Solitaire
  • OwenSOwenS Posts: 173
    edited 2008-02-12 18:33
    Could the calculations not be done in fixed point? Fixed point can be made much faster than floating point, since it uses only integer arithmetic. Would the fidelity of floating point really be needed?
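[Editor's note: to make the fixed-point suggestion concrete, here is a small sketch of 16.16 fixed-point arithmetic. It is written in Python for readability; the format and shift choices are illustrative, and on a 32-bit chip the intermediate product in `fmul` would need 64 bits or a narrower format.]

```python
# 16.16 fixed point: a value v is stored as the integer round(v * 2**16)
FRAC_BITS = 16
ONE = 1 << FRAC_BITS

def to_fix(x):
    # Convert a float to 16.16 fixed point
    return int(round(x * ONE))

def from_fix(f):
    # Convert 16.16 fixed point back to a float
    return f / ONE

def fmul(a, b):
    # The raw product has 32 fractional bits; shift back down to 16.
    # (Python ints are arbitrary precision; on a 32-bit MCU this
    # intermediate needs a 64-bit product or smaller operands.)
    return (a * b) >> FRAC_BITS

def fdiv(a, b):
    # Pre-shift the numerator so the quotient keeps 16 fractional bits
    return (a << FRAC_BITS) // b

# Example: 0.75 * 0.5 done entirely in integers
p = fmul(to_fix(0.75), to_fix(0.5))
```

Here `from_fix(p)` recovers 0.375 exactly, since it is representable in 16 fractional bits; values like 0.1 would carry a small rounding error instead.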
  • VIRANDVIRAND Posts: 656
    edited 2008-02-12 18:38
    This is what I use as a simple reference and what I would translate if I made a neural net demo. I played with it before and found it rather disappointing: it can only learn about 3 letters, and if you try to teach it more it makes stuff up. My only idea for making it smarter would be to add more neurons, but I don't think it would ever be usefully smart. I could be wrong.

    www.fourmilab.ch/documents/c64neural.html

    If it WERE usefully smart, it could be used for machine vision: the bottom line of the image would be memorized as binary data, such as a string, creating an associative memory. When you show it an object, it would recall the archetype of that object along with its "name" (in binary ASCII). The recalled name could then be read, and thus the seen object recognized, by the application. In other words, it could be taught from the equivalent of a book containing pictures of objects with their names printed under them. Alternatively, the alphabet could be taught as:
    A (bitmap) 01000001 (eight pixels in the corner of the bitmap)
    B (bitmap) 01000010
    C (bitmap) 01000011...
    Thus each image would be associated with its name.
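[Editor's note: the recall-by-association scheme described above (probe with a bitmap, read back the stored pattern plus its "name" bits) is essentially what a Hopfield network does. Whether the linked C64 program works this way is not confirmed here; the following is just a minimal Python sketch of the idea, with hand-picked ±1 patterns.]

```python
def train_hopfield(patterns):
    # Hebbian outer-product rule over +/-1 patterns, zero diagonal
    n = len(patterns[0])
    w = [[0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, state, steps=10):
    # Synchronous threshold updates until the state stops changing
    n = len(state)
    s = list(state)
    for _ in range(steps):
        new = [1 if sum(w[i][j] * s[j] for j in range(n)) >= 0 else -1
               for i in range(n)]
        if new == s:
            break
        s = new
    return s

# Each stored pattern is 8 "image" bits followed by a 4-bit "name" field
a_pat = [1, 1, 1, 1, -1, -1, -1, -1] + [1, -1, -1, -1]
b_pat = [1, -1, 1, -1, 1, -1, 1, -1] + [-1, 1, -1, -1]
w = train_hopfield([a_pat, b_pat])

# Probe with A's image and a blank name field; recall fills in the name
probe = a_pat[:8] + [-1, -1, -1, -1]
```

The two stored patterns here are orthogonal, so recall is clean; with more or correlated patterns a net this small (capacity roughly 0.14 neurons' worth) starts "making stuff up", much as described above.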
  • VIRANDVIRAND Posts: 656
    edited 2008-02-12 18:43
    @OwenS:
    (sorry I don't know how to edit my last post)
    I find fixed point (scaled integer arithmetic) far superior to floating point for ALL my needs.
  • MightorMightor Posts: 338
    edited 2008-02-12 19:30
    Thanks for the many responses. I am not quite familiar with the term fixed point. Do you mean numbers that can have a decimal point in them but are of a fixed length, i.e. just four digits behind the decimal point, or whole numbers? I don't think I could implement a tanh(x) function, or the alternative 1/(1+e^-x), with just ints. Those functions are used to calculate the activation values of the neurons. If you have ideas on how I could accomplish this with whole numbers, I'd really love to see them; I'm keen to make this as efficient as possible. However, maybe just to get things started, I'll use the standard float method, and once I get it all working in a PoC I'll go about making it more efficient :)

    Edit: I just Googled for fixed point math and it looks easy enough. I'll have to figure out how to make it work with the maths I need it to do. New stuff to learn!

    Regards,
    Mightor

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    | To know recursion, you must first know recursion.

    Post Edited (Mightor) : 2/12/2008 7:45:04 PM GMT
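[Editor's note: on the 1/(1+e^-x) worry raised above, a common embedded trick is to pre-compute the activation into a lookup table indexed by a fixed-point argument, so the inner loop needs only an add, a shift, and a load. Below is an illustrative Python sketch using an 8.8 format and a 256-entry table; all constants are example choices, not anything from the thread.]

```python
import math

FRAC_BITS = 8            # 8.8 fixed point keeps the table small
ONE = 1 << FRAC_BITS

# Pre-compute the sigmoid over [-8, 8) in steps of 1/16; outside that
# range 1/(1+e^-x) has already saturated to ~0 or ~1.
STEP_BITS = 4            # 16 table entries per unit of x
TABLE = [int(round(ONE / (1.0 + math.exp(-(i / 16.0 - 8.0)))))
         for i in range(256)]

def fix_sigmoid(x_fix):
    """Approximate 1/(1+e^-x) for an 8.8 fixed-point x, returning 8.8."""
    # Map x in [-8, 8) onto a table index 0..255
    idx = (x_fix + (8 << FRAC_BITS)) >> (FRAC_BITS - STEP_BITS)
    if idx < 0:
        return 0           # deep negative: sigmoid ~ 0
    if idx >= len(TABLE):
        return ONE         # deep positive: sigmoid ~ 1
    return TABLE[idx]
```

For example, `fix_sigmoid(0)` returns 128, i.e. 0.5 in 8.8 format, and mid-range values come out within about 1% of the true sigmoid; linear interpolation between entries, or a bigger table, would tighten that further at the cost of memory.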