Testing for a non-constant variance
- To: mathgroup at smc.vnet.net
- Subject: [mg66100] Testing for a non-constant variance
- From: "john.hawkin at gmail.com" <john.hawkin at gmail.com>
- Date: Sat, 29 Apr 2006 03:40:59 -0400 (EDT)
- Sender: owner-wri-mathgroup at wolfram.com
Hello all,

I made a post a while back asking about testing a set of data for stationarity. While there are statistical tests that can detect a trend in a set of data, I am not aware of any that can detect a non-constant variance.

One approach would be to calculate the local standard deviation at a series of points throughout the time series and compare these either to the standard deviation of the whole data set or to the average of the local standard deviations. I am not sure how to implement this optimally, i.e., what kind of cut-offs to use.

Does anyone know of any standard statistical tests that determine whether a data set has a non-constant variance, preferably ones that are implemented in Mathematica? Or does anyone have ideas on how to quantify my method above in the general case? Any thoughts would be greatly appreciated.
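
As a starting point, here is a minimal sketch of the windowed comparison described above. It partitions the series into non-overlapping blocks, computes each block's sample variance, and applies Bartlett's test (a standard test for equality of variances across groups, swapped in here as one way to formalize the "compare local spreads" idea) to judge whether the block variances differ by more than chance would allow. The function name bartlettBlockTest and the block size are hypothetical choices, not anything from the original post; Variance, Partition, ChiSquareDistribution, and RandomVariate are built into recent Mathematica versions (in version 5.x, load the statistics packages with <<Statistics` and use Random[dist] instead of RandomVariate).

    (* A sketch, not a definitive implementation: Bartlett's test for
       equal variances across equal-size blocks of a series.  Under the
       null hypothesis of constant variance, the statistic is
       approximately chi-square with k - 1 degrees of freedom. *)

    bartlettBlockTest[data_List, blockSize_Integer] :=
      Module[{blocks, k, n, nTot, vars, sp2, stat, corr, pValue},
        blocks = Partition[data, blockSize];   (* non-overlapping windows;
                                                  any leftover points at the
                                                  end are dropped *)
        k = Length[blocks];                    (* number of groups *)
        n = blockSize; nTot = k n;
        vars = Variance /@ blocks;             (* local sample variances *)
        sp2 = Total[(n - 1) vars]/(nTot - k);  (* pooled variance *)
        stat = (nTot - k) Log[sp2] - (n - 1) Total[Log[vars]];
        corr = 1 + (k/(n - 1) - 1/(nTot - k))/(3 (k - 1));
        pValue = 1 - CDF[ChiSquareDistribution[k - 1], stat/corr];
        {stat/corr, pValue}]

    (* Example with a deliberately growing variance: the test should
       reject the null of constant variance (small p-value). *)
    data = Table[RandomVariate[NormalDistribution[0, 1 + t/500]], {t, 500}];
    bartlettBlockTest[data, 50]

A small p-value rejects constant variance. One caveat: Bartlett's test assumes approximately normal data within each block and is sensitive to heavy tails; Levene's test is a more robust alternative in that case.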