Re: Simulating Correlated non-Normal Random Variables
- To: mathgroup at smc.vnet.net
- Subject: [mg32476] Re: Simulating Correlated non-Normal Random Variables
- From: Erich Neuwirth <erich.neuwirth at univie.ac.at>
- Date: Tue, 22 Jan 2002 03:20:09 -0500 (EST)
- References: <a2b4ge$lj2$1@smc.vnet.net>
- Sender: owner-wri-mathgroup at wolfram.com
You can apply the same method to any set of independent variables:
if x1, ..., xn are independent variables with unit variance and A is a
matrix, then A.(x1, ..., xn) has covariance matrix A.A' (where A'
denotes the transpose of A). No normality assumption is needed for
that. (A short Mathematica sketch follows at the end of this message.)

"Coleman, Mark" wrote:
>
> Greetings,
>
> This is not a Mathematica question per se, but I'm hoping members of
> this list might be able to point me in the right direction.
>
> I'm constructing a Monte Carlo simulation procedure, where I need to
> draw an n-vector of correlated random variables, given that I know
> the univariate distribution of each element of the vector and the
> corresponding covariance matrix. This problem is straightforward for
> normally distributed random variables, where one can create N(0,1)
> variates and multiply by the matrix square root of the covariance
> matrix to create the correlated sample. I am dealing with non-normally
> distributed random variables, however. In my case they have the Beta
> distribution.
>
> Can someone point me to a way to draw an n-vector of correlated beta
> random variables?
>
> Thanks,
>
> -Mark

--
Erich Neuwirth, Computer Supported Didactics Working Group
Visit our SunSITE at http://sunsite.univie.ac.at
Phone: +43-1-4277-38624   Fax: +43-1-4277-9386
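For concreteness, here is a minimal sketch of the recipe in current
Mathematica syntax. It assumes the built-in CholeskyDecomposition,
RandomVariate, and BetaDistribution; the target covariance cov, the
Beta(2,5) marginals, and the standardization step that gives the
independent inputs unit variance are illustrative choices, not part of
the original recipe.

  (* target covariance matrix -- illustrative values *)
  cov = {{1.0, 0.6}, {0.6, 1.0}};

  (* CholeskyDecomposition returns upper-triangular u with
     Transpose[u].u == cov, so a = Transpose[u] satisfies
     a.Transpose[a] == cov *)
  a = Transpose[CholeskyDecomposition[cov]];

  (* independent Beta(2,5) variates, standardized to mean 0 and
     unit variance so their covariance matrix is the identity *)
  dist = BetaDistribution[2, 5];
  x := (RandomVariate[dist, 2] - Mean[dist])/StandardDeviation[dist];

  (* one correlated draw: a.x has covariance a.Transpose[a] == cov *)
  a.x

  (* check: the sample covariance of many draws approaches cov *)
  Covariance[Table[a.x, {10000}]]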