MathGroup Archive 2006


Re: Joint Entropy

  • To: mathgroup at smc.vnet.net
  • Subject: [mg65629] Re: [mg65575] Joint Entropy
  • From: bsyehuda at gmail.com
  • Date: Tue, 11 Apr 2006 04:04:45 -0400 (EDT)
  • References: <200604090832.EAA00800@smc.vnet.net>
  • Sender: owner-wri-mathgroup at wolfram.com

Hi Sensei,
I noticed that you determined the dictionary from the appearances in the
series. Don't you know it in advance? Determining the dictionary from a
sample path is an approximation; for low-probability events it is correct
only for LONG series.
Tuples, a new function in version 5.2, saves all the trouble with
flattening, and is faster than an implementation based on Outer (or
Distribute, which can also be used in this case).
Tuples[{{a,b,c},{1,2,3}}]
returns
{{a, 1}, {a, 2}, {a, 3}, {b, 1}, {b, 2}, {b, 3}, {c, 1}, {c, 2}, {c, 3}}
and this is the set of all pairs from both dictionaries.
Now you can use the approach you presented for the pairs, etc.
regards
yehuda
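
[Editor's sketch] Putting the two posts together, here is one minimal way the
pair-counting approach could look, built on Tuples. The names
JointProbabilityList and JointEntropy are hypothetical (not from either post),
the two input sequences are assumed to have equal length, and pairs that never
occur in the sample are skipped so that 0*Log[2, 0] does not evaluate to
Indeterminate.

```mathematica
(* Hypothetical helper: probability of each symbol pair, over the
   dictionary that Tuples builds from the two sample-based alphabets *)
JointProbabilityList[x_, y_] := Module[{pairs = Transpose[{x, y}]},
  Map[{#, Count[pairs, #]/Length[pairs]} &,
    Tuples[{Union[x], Union[y]}]]]

(* Joint entropy H(X,Y) = -Sum p Log2 p, skipping zero-probability pairs *)
JointEntropy[x_, y_] := -Total[
  If[Last[#] > 0, Last[#]*Log[2, Last[#]], 0] & /@
    JointProbabilityList[x, y]]
```

For instance, JointEntropy[{1, 1, 2, 2}, {1, 2, 1, 2}] gives 2, since all
four pairs occur with probability 1/4.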


On 4/9/06, Sensei <senseiwa at mac.com> wrote:
>
> Hi! I'm writing some functions to analyze the informative content of
> sequences, and I've stopped trying to produce the joint entropy.
>
> These are my auxiliary functions:
>
> (* Generates a sequence of random numbers *)
> In[2]:=
> RandomSequence[nsamples_,min_,max_]:=Table[
>      Random[Integer,{min,max}], {nsamples}
>      ]
>
> (* Alphabet of a sequence *)
> In[3]:=
> SignalAlphabet[signal_]:=Union[signal]
>
> (* Gives the probability of a symbol *)
> In[13]:=
> SymbolProbability[symbol_,signal_]:=Count[signal,symbol]/Length[signal]
>
> (* Gives the list of all symbols and their probabilities *)
> In[20]:=
> SignalProbabilityList[signal_]:=Map[
>      {#,SymbolProbability[#,signal]}&,
>      SignalAlphabet[signal]]
>
> (* Calculates the entropy *)
> In[24]:=
> SignalEntropy[signal_]:=-Fold[Plus, 0,
>        Map[Last[#]*Log[2,Last[#]]&,SignalProbabilityList[signal]]]
>
>
> Now, my question is, how to produce the joint probability of two
> sequences ``mathematica style''? So, given X and Y, I can produce the
> alphabet of XY, that is the cartesian product of the two alphabets
> (using CartesianProduct), but... well, I don't know how to write
> good code! As I said previously, I'm new to Mathematica... How should
> I proceed?
>
> Thanks for any hints!
>
> PS. If the code above is not so good, please let me know! :)
>
> --
> Sensei <senseiwa at mac.com>
>
> The optimist thinks this is the best of all possible worlds.
> The pessimist fears it is true.      [J. Robert Oppenheimer]
>
>
>


