MathGroup Archive 2006


Joint Entropy

  • To: mathgroup at
  • Subject: [mg65575] Joint Entropy
  • From: Sensei <senseiwa at>
  • Date: Sun, 9 Apr 2006 04:32:02 -0400 (EDT)
  • Sender: owner-wri-mathgroup at

Hi! I'm writing some functions to analyze the informative content of
sequences, and I've gotten stuck trying to compute the joint entropy.

These are my auxiliary functions:

(* Generates a sequence of random numbers *)
RandomSequence[min_, max_, nsamples_] :=
  Table[Random[Integer, {min, max}], {nsamples}]

(* Alphabet of a sequence *)
Alphabet[signal_] := Union[signal]

(* Gives the probability of a symbol *)
SymbolProbability[signal_, symbol_] :=
  Count[signal, symbol]/Length[signal]

(* Gives the list of all symbols and their probabilities *)
SymbolProbabilities[signal_] :=
  SymbolProbability[signal, #] & /@ Alphabet[signal]

(* Calculates the entropy *)
SignalEntropy[signal_] := -1*Fold[Plus, 0,
  # Log[2, #] & /@ SymbolProbabilities[signal]]
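As a sanity check (a sketch, assuming the helper definitions above), the entropy of a long uniform binary sequence should come out close to 1 bit per symbol:

```mathematica
signal = Table[Random[Integer, {0, 1}], {10000}];
N[SignalEntropy[signal]]  (* should be close to 1 for a fair binary source *)
```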

Now, my question is: how do I compute the joint probability of two
sequences, ``Mathematica style''? Given X and Y, I can build the
alphabet of XY, that is, the Cartesian product of the two alphabets
(using CartesianProduct), but I don't know how to turn that into
good code. As I said, I'm new to Mathematica. How should I proceed?
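One possible approach, sketched here as an assumption rather than a definitive answer: estimate the joint distribution by pairing the two sequences position by position, so that p(x, y) is the relative frequency of each observed pair (JointEntropy and its local names are hypothetical, not built-ins):

```mathematica
(* Joint entropy of two equal-length sequences: pair samples
   positionally, estimate p(x, y) from pair frequencies, and
   sum -p Log2[p] over the observed joint alphabet. *)
JointEntropy[x_, y_] := Module[{pairs, probs},
  pairs = Transpose[{x, y}];
  probs = Count[pairs, #]/Length[pairs] & /@ Union[pairs];
  -Fold[Plus, 0, # Log[2, #] & /@ probs]]
```

Note that Union[pairs] gives only the observed subset of the Cartesian product; unobserved pairs have probability zero and contribute nothing to the sum, so iterating over the full product is unnecessary.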

Thanks for any hints!

PS. If the code above is not so good, please let me know! :)

Sensei <senseiwa at>

The optimist thinks this is the best of all possible worlds.
The pessimist fears it is true.      [J. Robert Oppenheimer]
