MathGroup Archive 2008


Re: simple neural network with mathematica HELP

  • To: mathgroup at smc.vnet.net
  • Subject: [mg90335] Re: [mg90306] simple neural network with mathematica HELP
  • From: DrMajorBob <drmajorbob at att.net>
  • Date: Sun, 6 Jul 2008 07:19:26 -0400 (EDT)
  • References: <13218198.1215251547827.JavaMail.root@m08>
  • Reply-to: drmajorbob at longhorns.com

Something like this:

sigmoid[x_] := 1/(1 + E^(-x));

(* forward pass: returns the hidden-layer and output-layer activations *)
bpn[inputs_, hidWts_, outWts_] :=
  Module[{hidOuts = sigmoid[hidWts.inputs]},
   {hidOuts, sigmoid[outWts.hidOuts]}]

(* backpropagation training: random weight initialization, then numIters
   updates on randomly chosen ioPairs; returns the weights and the
   per-iteration squared output error *)
bpnStandard[inNumber_, hidNumber_, outNumber_, ioPairs_, eta_,
   numIters_] :=
  Module[{errors, hidWts, outWts, inputs, outDesired, hidOuts, outputs,
     outErrors, outDelta, hidDelta},
   hidWts = RandomReal[{-.1, .1}, {hidNumber, inNumber}];
   outWts = RandomReal[{-.1, .1}, {outNumber, hidNumber}];
   errors = Table[{inputs, outDesired} = RandomChoice@ioPairs;
     {hidOuts, outputs} = bpn[inputs, hidWts, outWts];
     outErrors = outDesired - outputs;
     outDelta = outErrors (outputs (1 - outputs));
     hidDelta = (hidOuts (1 - hidOuts)) Transpose[outWts].outDelta;
     outWts += eta Outer[Times, outDelta, hidOuts];
     hidWts += eta Outer[Times, hidDelta, inputs];
     outErrors.outErrors, {numIters}];
   {hidWts, outWts, errors}]
ioPairs = {{{0.9, 0.9, 0.9, 0.9, 0.1, 0.1, 0.9, 0.9,
      0.9}, {0.1}}, {{0.9, 0.9, 0.9, 0.1, 0.9, 0.1, 0.1, 0.9,
      0.1}, {0.9}}, {{0.9, 0.9, 0.9, 0.9, 0.1, 0.9, 0.9, 0.1,
      0.9}, {0.1}}, {{0.1, 0.1, 0.9, 0.9, 0.9, 0.9, 0.1, 0.1,
      0.9}, {0.9}}, {{0.9, 0.9, 0.9, 0.1, 0.1, 0.9, 0.9, 0.9,
      0.9}, {0.1}}, {{0.1, 0.9, 0.1, 0.1, 0.9, 0.1, 0.9, 0.9,
      0.9}, {0.9}}, {{0.9, 0.1, 0.9, 0.9, 0.1, 0.9, 0.9, 0.9,
      0.9}, {0.1}}, {{0.9, 0.1, 0.1, 0.9, 0.9, 0.9, 0.9, 0.1,
      0.1}, {0.9}}};
outs = bpnStandard[9, 3, 1, ioPairs, 3, 250];
{hidWts, outWts, errors} = outs;
ListPlot[errors, PlotJoined -> True]

newInput = {0.1, 0.9, 0.9, 0.9, 0.1, 0.1, 0.9, 0.9, 0.9};
Last@bpn[newInput, hidWts, outWts]

{0.240182}

Of course, 0 error can't be expected (or even defined) for a new input.
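As a further check (just a sketch, reusing hidWts, outWts, and ioPairs from the run above), one can map the trained net over every training input and compare against the targets. If training has converged, the outputs on the training set should sit near their 0.1/0.9 targets, and the roughly 0.24 obtained above for the perturbed vector can then be read as closer to the "C" target of 0.1 than to 0.9:

(* sketch: trained-net output vs. target for each training pair *)
TableForm[
 Table[{First@Last@bpn[First@pair, hidWts, outWts],
    First@Last@pair}, {pair, ioPairs}],
 TableHeadings -> {None, {"net output", "target"}}]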

Bobby

On Sat, 05 Jul 2008 03:50:02 -0500, Dino <dinodeblasio at yahoo.it> wrote:

> Hello everybody,
>
> I have the following program:
> -----------------------------------------
> sigmoid[x_] := 1/(1 + E^(-x));
>
> bpnStandard[inNumber_, hidNumber_, outNumber_, ioPairs_, eta_,
>    numIters_] :=
>   Module[{errors, hidWts, outWts, ioP, inputs, outDesired, hidOuts,
>     outputs, outErrors, outDelta, hidDelta},
>     hidWts =
>     Table[Table[Random[Real, {-0.1, 0.1}], {inNumber}], {hidNumber}];
>    outWts =
>     Table[Table[
>       Random[Real, {-0.1, 0.1}], {hidNumber}], {outNumber}];
>    errors =
>     Table[ioP = ioPairs[[Random[Integer, {1, Length[ioPairs]}]]];
>      inputs = ioP[[1]];
>      outDesired = ioP[[2]];
>      hidOuts = sigmoid[hidWts. inputs];
>      outputs = sigmoid[outWts. hidOuts];
>      outErrors = outDesired - outputs;
>      outDelta = outErrors (outputs (1 - outputs));
>      hidDelta = (hidOuts (1 - hidOuts)) Transpose[outWts].outDelta;
>      outWts += eta Outer[Times, outDelta, hidOuts];
>      hidWts += eta Outer[Times, hidDelta, inputs];
>      outErrors.outErrors, {numIters}];
>    Return[{hidWts, outWts, errors}];];
>
> ioPairs = {{{0.9, 0.9, 0.9, 0.9, 0.1, 0.1, 0.9, 0.9,
>      0.9}, {0.1}}, {{0.9, 0.9, 0.9, 0.1, 0.9, 0.1, 0.1, 0.9,
>      0.1}, {0.9}}, {{0.9, 0.9, 0.9, 0.9, 0.1, 0.9, 0.9, 0.1,
>      0.9}, {0.1}}, {{0.1, 0.1, 0.9, 0.9, 0.9, 0.9, 0.1, 0.1,
>      0.9}, {0.9}}, {{0.9, 0.9, 0.9, 0.1, 0.1, 0.9, 0.9, 0.9,
>      0.9}, {0.1}}, {{0.1, 0.9, 0.1, 0.1, 0.9, 0.1, 0.9, 0.9,
>      0.9}, {0.9}}, {{0.9, 0.1, 0.9, 0.9, 0.1, 0.9, 0.9, 0.9,
>      0.9}, {0.1}}, {{0.9, 0.1, 0.1, 0.9, 0.9, 0.9, 0.9, 0.1,
>      0.1}, {0.9}}};
>
> eta = 0.5
> --------------------------------------------------
> The above program trains a network with ioPairs as input; each ioPair  
> gives an input vector and the desired output.
> When I run the following:
> ------------------------
> outs = {0, 0, 0};
> outs = bpnStandard[9, 3, 1, ioPairs, 3, 250];
> ListPlot[outs[[3]], PlotJoined -> True]
> ------------------------------
> I train the network and the "errors" are plotted; it is possible to see  
> that the errors go to 0 when the number of iterations "numIters" is  
> large and by adjusting the eta factor.
> The question is: after I train the network, how is it possible to show  
> that the program also recognizes an input vector slightly different  
> from the original?
> For example, the first vector in "ioPairs" is:
> {{0.9, 0.9, 0.9,
>   0.9, 0.1, 0.1,
>   0.9, 0.9, 0.9}, {0.1}}
>
> The first part represents the letter C and the desired output is 0.1.
> How is it possible to verify that the network also recognizes the  
> following vector as C:
> {0.1, 0.9, 0.9,
>  0.9, 0.1, 0.1,
>  0.9, 0.9, 0.9}  ??
>
> Please, if you are interested in helping me, contact me at  
> dinodeblasio at yahoo.it and I will also send you the PDF file from the  
> book where I found the code, so you will understand the problem exactly.
>
> Thanks for your collaboration.
> Dino.
>
>



-- 
DrMajorBob at longhorns.com

