Testing for a non-constant variance
Hello all,

I made a post a while back asking about testing a set of data for stationarity. While there are statistical tests that can detect a trend in a data set, I am not aware of any that can detect a non-constant variance. One approach would be to calculate the local standard deviation at a series of points throughout the time series and compare these either to the standard deviation of the whole data set or to the average of the local standard deviations. I am not sure how to implement this in an optimal way, i.e., what kind of cut-offs to use.

Does anyone know of standard statistical tests that determine whether a data set has a non-constant variance, ideally ones implemented in Mathematica? Or does anyone have ideas on how to quantify the method above in the general case?

Any thoughts would be greatly appreciated.
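As a rough illustration of the local-standard-deviation idea described above, here is a minimal sketch (in Python/NumPy rather than Mathematica, and with arbitrarily chosen window sizes): split the series into non-overlapping windows, compute each window's standard deviation, and compare the spread of these local values against each other. The max/min ratio used here is just one ad hoc summary statistic, not a calibrated test; the cut-off question the post raises remains open.

```python
import numpy as np

def local_std_profile(x, window=50):
    """Return the sample standard deviation of each non-overlapping
    window of length `window` (the 'local' standard deviations)."""
    x = np.asarray(x, dtype=float)
    n_windows = len(x) // window
    blocks = x[:n_windows * window].reshape(n_windows, window)
    return blocks.std(axis=1, ddof=1)

# Compare a series with constant variance to one whose variance grows.
rng = np.random.default_rng(0)
stable = rng.normal(0.0, 1.0, 1000)
growing = rng.normal(0.0, np.linspace(1.0, 5.0, 1000))

ratio_stable = local_std_profile(stable).max() / local_std_profile(stable).min()
ratio_growing = local_std_profile(growing).max() / local_std_profile(growing).min()
print("stable max/min ratio: ", round(ratio_stable, 2))
print("growing max/min ratio:", round(ratio_growing, 2))
```

For a series with constant variance the local standard deviations cluster near the overall value, so the ratio stays close to 1; a growing variance pushes it well above that. Turning this into a proper test would require a null distribution for the chosen statistic, which is exactly the calibration problem asked about here.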