variance of product of 2 independent variables
- To: mathgroup at smc.vnet.net
- Subject: [mg87030] variance of product of 2 independent variables
- From: "Dankwort, Rudolf C" <Rudolf.Dankwort at Honeywell.com>
- Date: Sat, 29 Mar 2008 04:25:10 -0500 (EST)
Hello Ben - I have a question about the subject matter. To review, you sent Frank Brand the following:

<< If a and b were completely uncorrelated (not even non-linear correlations between them), then you can compute the variance of their product quite easily:

   v(ab) := <a^2 b^2> - <ab>^2 = <a^2><b^2> - <a>^2<b>^2
          = v(a)<b> + v(b)<a> + v(a)v(b),

where v(a) = <a^2> - <a>^2 and v(b) = <b^2> - <b>^2; here v(.) denotes variance and <.> denotes mean. Note that we do not have to assume normal distributions for a and b. What is essential is that they are uncorrelated, hence means of products factor into products of means. >>

If <a> = 1000 and <b> = 0.001, and v(a) = 100 and v(b) = 1e-10 (in other words, both a and b have 1% standard deviations), then I compute

   v(ab) = 100*0.001 + 1e-10*1000 + 100*1e-10 ~ 0.1,

which is obviously wrong: <ab> = 1.000, so the standard deviation would be sqrt(0.1) ~ 0.32, i.e. about 32%.

Help! Something is not right here!

Rudy Dankwort
Phoenix, AZ, USA
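
P.S. For reference, a minimal Monte Carlo sketch to check these numbers empirically. It assumes, purely for convenience of the simulation, that a and b are independent and normally distributed with the means and variances above; the quoted derivation does not require normality.

(* Monte Carlo check: independent a and b with <a> = 1000, v(a) = 100,
   <b> = 0.001, v(b) = 1*10^-10, i.e. 1% standard deviations each. *)
n = 10^6;
a = RandomReal[NormalDistribution[1000., 10.], n];
b = RandomReal[NormalDistribution[0.001, 10.^-5], n];
{Mean[a b], Variance[a b]}  (* compare the variance against the hand-computed 0.1 *)

Comparing the simulated Variance[a b] with the 0.1 obtained by hand should show whether the formula is being applied correctly.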