 
 
 
 
 
 
Joint Entropy
- To: mathgroup at smc.vnet.net
- Subject: [mg65575] Joint Entropy
- From: Sensei <senseiwa at mac.com>
- Date: Sun, 9 Apr 2006 04:32:02 -0400 (EDT)
- Sender: owner-wri-mathgroup at wolfram.com
Hi! I'm writing some functions to analyze the information content of
sequences, and I've gotten stuck trying to compute the joint entropy.
These are my auxiliary functions:
(* Generates a sequence of random numbers *)
In[2]:=
RandomSequence[nsamples_,min_,max_]:=Table[
     Random[Integer,{min,max}], {nsamples}
     ]
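For instance, RandomSequence[8, 0, 2] should give a list of eight
random integers between 0 and 2 (inclusive).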
(* Alphabet of a sequence *)
In[3]:=
SignalAlphabet[signal_]:=Union[signal]
(* Gives the probability of a symbol *)
In[13]:=
SymbolProbability[symbol_,signal_]:=Count[signal,symbol]/Length[signal]
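For instance, SymbolProbability[1, {1,0,1,1}] gives 3/4, since the
symbol 1 occurs in three of the four samples.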
(* Gives the list of all symbols and their probabilities *)
In[20]:=
SignalProbabilityList[signal_]:=Map[
     {#,SymbolProbability[#,signal]}&,
     SignalAlphabet[signal]]
(* Calculates the entropy, H = -Sum[p Log[2,p]] over the alphabet *)
In[24]:=
SignalEntropy[signal_]:=-Fold[Plus, 0,
       Map[Last[#]*Log[2,Last[#]]&, SignalProbabilityList[signal]]]
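As a quick sanity check, SignalEntropy[{0,1,0,1}] should give exactly
1, the expected one bit per symbol for a fair binary source, since
both symbols have probability 1/2.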
Now, my question is: how do I produce the joint probability of two
sequences ``Mathematica style''? Given X and Y, I can produce the
alphabet of XY, that is, the Cartesian product of the two alphabets
(using CartesianProduct), but I don't know how to turn that into good
code! As I said, I'm new to Mathematica... how should I proceed?
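The best I've managed so far (assuming the two sequences have equal
length, so Transpose can pair them up) is to treat each aligned pair
of samples as a single symbol and reuse the functions above; pairs
that never occur together simply don't show up in the alphabet, and
since 0*Log[2,0] -> 0 by convention they wouldn't change the entropy
anyway:
(* Joint probabilities: pair up aligned samples, then apply the
   single-sequence machinery to the sequence of pairs *)
JointProbabilityList[x_,y_]:=SignalProbabilityList[Transpose[{x,y}]]
(* Joint entropy of two equal-length sequences *)
JointEntropy[x_,y_]:=SignalEntropy[Transpose[{x,y}]]
Is this reasonable ``Mathematica style'', or should I build the XY
alphabet explicitly with CartesianProduct?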
Thanks for any hints!
PS. If the code above is not so good, please let me know! :)
--
Sensei <senseiwa at mac.com>
The optimist thinks this is the best of all possible worlds.
The pessimist fears it is true.      [J. Robert Oppenheimer]