MathGroup Archive 2008


programming a simple neural network

  • To: mathgroup at smc.vnet.net
  • Subject: [mg90263] programming a simple neural network
  • From: dinodeblasio at gmail.com
  • Date: Fri, 4 Jul 2008 03:55:19 -0400 (EDT)

Hello, I have found the code for a backpropagation neural network in the
book "Simulating Neural Networks with Mathematica". The network learns to
distinguish the letters T and C from a series of input vectors paired
with the desired output pattern. The code is given below:

"we write the sigmoid function"

sigmoid[x_] := 1/(1 + E^(-x));
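
Since the arithmetic used here is Listable, sigmoid applied to a vector or
matrix works elementwise, which is what the training code below relies on;
for example:

sigmoid[{-2., 0., 2.}]
(* approximately {0.119, 0.5, 0.881} *)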

"this is the input to the neural network"

ioPairs = {
   {{0.9, 0.9, 0.9,  0.9, 0.1, 0.1,  0.9, 0.9, 0.9}, {0.1}},   (* letter C *)
   {{0.9, 0.9, 0.9,  0.1, 0.9, 0.1,  0.1, 0.9, 0.1}, {0.9}},   (* letter T *)
   {{0.9, 0.9, 0.9,  0.9, 0.1, 0.9,  0.9, 0.1, 0.9}, {0.1}},   (* letter C *)
   {{0.1, 0.1, 0.9,  0.9, 0.9, 0.9,  0.1, 0.1, 0.9}, {0.9}},   (* letter T *)
   {{0.9, 0.9, 0.9,  0.1, 0.1, 0.9,  0.9, 0.9, 0.9}, {0.1}},   (* letter C *)
   {{0.1, 0.9, 0.1,  0.1, 0.9, 0.1,  0.9, 0.9, 0.9}, {0.9}},   (* letter T *)
   {{0.9, 0.1, 0.9,  0.9, 0.1, 0.9,  0.9, 0.9, 0.9}, {0.1}},   (* letter C *)
   {{0.9, 0.1, 0.1,  0.9, 0.9, 0.9,  0.9, 0.1, 0.1}, {0.9}}};  (* letter T *)
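
Each input vector is a flattened 3 x 3 grid of pixel intensities
(0.9 = on, 0.1 = off). As a quick check, not from the book, a pattern can
be viewed with ArrayPlot:

(* display the first training pattern as a 3 x 3 grid *)
ArrayPlot[Partition[ioPairs[[1, 1]], 3]]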

"here is the bpnStandard function"
------------------------------------------------------------------------------------------------------------------------
bpnStandard[inNumber_, hidNumber_, outNumber_, ioPairs_, eta_,
   numIters_] :=
  Module[{errors, hidWts, outWts, ioP, inputs, outDesired, hidOuts,
    outputs, outErrors, outDelta, hidDelta},
   (* initialize both weight matrices with small random values *)
   hidWts =
    Table[Table[Random[Real, {-0.1, 0.1}], {inNumber}], {hidNumber}];
   outWts =
    Table[Table[Random[Real, {-0.1, 0.1}], {hidNumber}], {outNumber}];
   errors =
    Table[
     (* pick a training pair at random *)
     ioP = ioPairs[[Random[Integer, {1, Length[ioPairs]}]]];
     inputs = ioP[[1]];
     outDesired = ioP[[2]];
     (* forward pass *)
     hidOuts = sigmoid[hidWts.inputs];
     outputs = sigmoid[outWts.hidOuts];
     (* backward pass: output error and delta terms *)
     outErrors = outDesired - outputs;
     outDelta = outErrors (outputs (1 - outputs));
     hidDelta = (hidOuts (1 - hidOuts)) Transpose[outWts].outDelta;
     (* gradient-descent weight updates *)
     outWts += eta Outer[Times, outDelta, hidOuts];
     hidWts += eta Outer[Times, hidDelta, inputs];
     (* record the squared output error for this iteration *)
     outErrors.outErrors, {numIters}];
   {hidWts, outWts, errors}];
-------------------------------------------------------------------------------------------------------------------------------
"here are the inputs to the network"

outs = {0, 0, 0};
outs = bpnStandard[9, 3, 1, ioPairs, 7, 250];
ListPlot[outs[[3]], PlotJoined -> True]
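
Once trained, the returned weight matrices outs[[1]] and outs[[2]] can be
fed through the same forward pass by hand. The helper below is my own
sketch (bpnRecall is not a name from the book):

(* forward pass of an input vector through the trained network *)
bpnRecall[hidWts_, outWts_, input_] :=
  sigmoid[outWts.sigmoid[hidWts.input]];

(* for example, recall the first training pattern (a C, desired output 0.1) *)
bpnRecall[outs[[1]], outs[[2]], ioPairs[[1, 1]]]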

Now what I would like to do is train the network and then test it again
with some noise, for example by giving it a vector that is similar to the
letter C but slightly different. Where the bpnStandard function has
outDesired = ioP[[2]], I would like the code to do the matching against
both possible target values, 0.1 and 0.9. In other words, the program
should take the input (inputs = ioP[[1]]), match it against all possible
outputs one by one, and then choose which output belongs to the inserted
input, roughly along the lines of the sketch below.
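
Here is the kind of thing I have in mind, using the bpnRecall helper
sketched above; the noisy vector is just an example I made up, and the
0.1 -> C, 0.9 -> T labelling follows ioPairs:

(* a C-like pattern with one pixel perturbed *)
noisyC = {0.9, 0.9, 0.9,  0.9, 0.3, 0.1,  0.9, 0.9, 0.9};

out = First[bpnRecall[outs[[1]], outs[[2]], noisyC]];

(* compare the network output with each possible target and keep the closer one *)
If[Abs[out - 0.1] < Abs[out - 0.9], "letter C", "letter T"]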

Do you have any idea how to modify the code to do this properly?
Thanks.
Dino.

You can also write to me directly at dinodeblasio at yahoo.it
or dinodeblasio at gmail.com

