MathGroup Archive 2008


simple neural network with mathematica HELP

  • To: mathgroup at smc.vnet.net
  • Subject: [mg90306] simple neural network with mathematica HELP
  • From: Dino <dinodeblasio at yahoo.it>
  • Date: Sat, 5 Jul 2008 04:50:02 -0400 (EDT)

Hello everybody,

I have the following program:
-----------------------------------------
sigmoid[x_] := 1/(1 + E^(-x));

bpnStandard[inNumber_, hidNumber_, outNumber_, ioPairs_, eta_, numIters_] :=
  Module[{errors, hidWts, outWts, ioP, inputs, outDesired, hidOuts,
    outputs, outErrors, outDelta, hidDelta},
   (* initialize hidden and output weights with small random values *)
   hidWts = Table[Table[Random[Real, {-0.1, 0.1}], {inNumber}], {hidNumber}];
   outWts = Table[Table[Random[Real, {-0.1, 0.1}], {hidNumber}], {outNumber}];
   errors = Table[
     (* pick a random training pair *)
     ioP = ioPairs[[Random[Integer, {1, Length[ioPairs]}]]];
     inputs = ioP[[1]];
     outDesired = ioP[[2]];
     (* forward pass *)
     hidOuts = sigmoid[hidWts.inputs];
     outputs = sigmoid[outWts.hidOuts];
     (* backpropagate the output error *)
     outErrors = outDesired - outputs;
     outDelta = outErrors (outputs (1 - outputs));
     hidDelta = (hidOuts (1 - hidOuts)) Transpose[outWts].outDelta;
     (* gradient-descent weight updates *)
     outWts += eta Outer[Times, outDelta, hidOuts];
     hidWts += eta Outer[Times, hidDelta, inputs];
     (* record the squared error for this iteration *)
     outErrors.outErrors, {numIters}];
   Return[{hidWts, outWts, errors}];];

ioPairs = {
   {{0.9, 0.9, 0.9, 0.9, 0.1, 0.1, 0.9, 0.9, 0.9}, {0.1}},
   {{0.9, 0.9, 0.9, 0.1, 0.9, 0.1, 0.1, 0.9, 0.1}, {0.9}},
   {{0.9, 0.9, 0.9, 0.9, 0.1, 0.9, 0.9, 0.1, 0.9}, {0.1}},
   {{0.1, 0.1, 0.9, 0.9, 0.9, 0.9, 0.1, 0.1, 0.9}, {0.9}},
   {{0.9, 0.9, 0.9, 0.1, 0.1, 0.9, 0.9, 0.9, 0.9}, {0.1}},
   {{0.1, 0.9, 0.1, 0.1, 0.9, 0.1, 0.9, 0.9, 0.9}, {0.9}},
   {{0.9, 0.1, 0.9, 0.9, 0.1, 0.9, 0.9, 0.9, 0.9}, {0.1}},
   {{0.9, 0.1, 0.1, 0.9, 0.9, 0.9, 0.9, 0.1, 0.1}, {0.9}}};

eta = 0.5
--------------------------------------------------
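
(Each 9-element input vector in ioPairs encodes a 3x3 pixel pattern; for example, the first one is the letter C. A quick way to look at one as a grid, not from the book's code, is:)
------------------------
Partition[ioPairs[[1, 1]], 3] // MatrixForm
ArrayPlot[Partition[ioPairs[[1, 1]], 3]]
------------------------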
The above program trains a network on the ioPairs given as input; each ioPair provides an input vector and the desired output.
When I run the following:
------------------------
outs = {0, 0, 0};
outs = bpnStandard[9, 3, 1, ioPairs, 3, 250];
ListPlot[outs[[3]], PlotJoined -> True]
------------------------------
I train the network and the "errors" are plotted; it is possible to see that the errors go to 0 when the number of iterations "numIters" is large and the eta factor is adjusted.
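
For instance, one can also look directly at the last few squared errors returned in outs[[3]] (just a quick check, not from the book):
------------------------
Take[outs[[3]], -5]
------------------------
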
The question is: after I train the network, how can I show that the program also recognizes an input vector slightly different from the original?
For example, the first vector in "ioPairs" is:
{{0.9, 0.9, 0.9, 
  0.9, 0.1, 0.1, 
  0.9, 0.9, 0.9}, {0.1}}

The first part represents the letter C and the desired output is 0.1.
How can I verify that the network also recognizes the following vector as a C:
{0.1, 0.9, 0.9, 
 0.9, 0.1, 0.1, 
 0.9, 0.9, 0.9}  ??
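
Here is what I think might work (only my sketch based on the definitions above; the name testVector is mine, it is not from the book). Is this the right way to do it?
--------------------------------------------------
{hidWts, outWts, errors} = bpnStandard[9, 3, 1, ioPairs, 3, 250];

(* forward pass with the trained weights on the modified vector *)
testVector = {0.1, 0.9, 0.9, 0.9, 0.1, 0.1, 0.9, 0.9, 0.9};
netOutput = sigmoid[outWts.sigmoid[hidWts.testVector]]

(* if the network still recognizes it as C, netOutput should be close to {0.1} *)
--------------------------------------------------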

Please, if you are interested in helping me, contact me at dinodeblasio at yahoo.it and I will also send you the PDF file from the book where I found the code, so you will understand the problem exactly.

Thanks for your help.
Dino.

