MathGroup Archive 2006


Joint Entropy


Hi! I'm writing some functions to analyze the information content of
sequences, and I've gotten stuck trying to compute the joint entropy.

These are my auxiliary functions:

(* Generates a sequence of nsamples random integers between min and max *)
In[2]:=
RandomSequence[nsamples_,min_,max_]:=Table[
     Random[Integer,{min,max}], {nsamples}
     ]

(* Alphabet of a sequence *)
In[3]:=
SignalAlphabet[signal_]:=Union[signal]

(* Gives the probability of a symbol *)
In[13]:=
SymbolProbability[symbol_,signal_]:=Count[signal,symbol]/Length[signal]

(* Gives the list of all symbols and their probabilities *)
In[20]:=
SignalProbabilityList[signal_]:=Map[
     {#,SymbolProbability[#,signal]}&,
     SignalAlphabet[signal]]
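
For example, on a short binary list it gives each symbol with its
relative frequency:

In[21]:=
SignalProbabilityList[{0,1,1,0,1}]

Out[21]=
{{0, 2/5}, {1, 3/5}}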

(* Calculates the Shannon entropy, -Sum[p*Log[2,p]] over all symbols *)
In[24]:=
SignalEntropy[signal_]:=-Fold[Plus, 0,
       Map[Last[#]*Log[2,Last[#]]&, SignalProbabilityList[signal]]]
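
On the same short list this gives the expected value, about 0.97 bits:

In[25]:=
SignalEntropy[{0,1,1,0,1}]//N

Out[25]=
0.970951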


Now, my question is: how do I produce the joint probability of two
sequences, ``Mathematica style''? Given X and Y, I can produce the
alphabet of XY, that is, the Cartesian product of the two alphabets
(using CartesianProduct), but beyond that I don't know how to write
good code for it. As I said, I'm new to Mathematica. How should I
proceed? My naive attempt is sketched below.
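
For what it's worth, here is a minimal sketch of what I have in mind
(assuming the two sequences have the same length and are paired
element-wise; the names JointSignal, JointProbabilityList, and
JointEntropy are just my own placeholders):

(* Pairs the two sequences element-wise, so that each symbol of the
   joint signal is an {x, y} pair; Union and Count accept pairs, so
   the functions above apply unchanged. Assumes equal lengths. *)
In[30]:=
JointSignal[x_,y_]:=Transpose[{x,y}]

(* Joint probabilities and joint entropy, reusing the code above *)
In[31]:=
JointProbabilityList[x_,y_]:=SignalProbabilityList[JointSignal[x,y]]

In[32]:=
JointEntropy[x_,y_]:=SignalEntropy[JointSignal[x,y]]

This seems to work, but it never actually uses the Cartesian product:
pairs that never occur together simply get no entry, rather than an
explicit zero probability. Is there a more idiomatic way?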

Thanks for any hints!

PS. If the code above is not so good, please let me know! :)

--
Sensei <senseiwa at mac.com>

The optimist thinks this is the best of all possible worlds.
The pessimist fears it is true.      [J. Robert Oppenheimer]


