Box's M test

Box's M test is a multivariate statistical test used to check the equality of multiple variance-covariance matrices.[1] It is commonly used to test the assumption of homogeneity of variances and covariances in MANOVA and linear discriminant analysis, and is named after George E. P. Box, who first discussed the test in 1949. Significance is assessed by comparing the statistic against a chi-squared approximation to its null distribution.
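
In the usual textbook formulation (a sketch; Box (1949) derives the statistic in a more general likelihood-ratio setting, and notation varies across sources), for k groups of sizes n_i on p variables, with sample covariance matrices S_i and N = \sum_i n_i, the statistic is

\[
M = (N - k)\,\ln\lvert S_p \rvert \;-\; \sum_{i=1}^{k} (n_i - 1)\,\ln\lvert S_i \rvert,
\qquad
S_p = \frac{1}{N - k} \sum_{i=1}^{k} (n_i - 1)\, S_i .
\]

Under the null hypothesis of equal covariance matrices, M(1 - c) is approximately chi-squared distributed with p(p+1)(k-1)/2 degrees of freedom, where

\[
c = \left( \sum_{i=1}^{k} \frac{1}{n_i - 1} \;-\; \frac{1}{N - k} \right)
\frac{2p^{2} + 3p - 1}{6(p + 1)(k - 1)} .
\]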

Box's M test is susceptible to error if the data do not meet model assumptions or if the sample size is too large or small.[2] It is especially prone to error if the data do not meet the assumption of multivariate normality.[3]
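
For illustration, here is a minimal sketch of the computation in Python using NumPy and SciPy (the function box_m_test and its interface are hypothetical, written for this example rather than taken from any library):

```python
import numpy as np
from scipy import stats

def box_m_test(groups):
    """Box's M test for equality of group covariance matrices.

    groups : list of (n_i, p) data arrays, one per group.
    Returns (M, chi-squared statistic, p-value) using Box's
    chi-squared approximation.
    """
    k = len(groups)                       # number of groups
    p = groups[0].shape[1]                # number of variables
    ns = np.array([g.shape[0] for g in groups], dtype=float)
    N = ns.sum()
    covs = [np.cov(g, rowvar=False) for g in groups]  # unbiased S_i
    # Pooled covariance matrix, weighted by degrees of freedom n_i - 1.
    S_pool = sum((n - 1) * S for n, S in zip(ns, covs)) / (N - k)
    # M: pooled log-determinant minus the weighted group log-determinants.
    M = (N - k) * np.linalg.slogdet(S_pool)[1] - sum(
        (n - 1) * np.linalg.slogdet(S)[1] for n, S in zip(ns, covs)
    )
    # Box's scaling factor for the chi-squared approximation.
    c = (np.sum(1.0 / (ns - 1)) - 1.0 / (N - k)) * (
        (2 * p**2 + 3 * p - 1) / (6.0 * (p + 1) * (k - 1))
    )
    chi2_stat = M * (1.0 - c)
    df = p * (p + 1) * (k - 1) / 2.0
    return M, chi2_stat, stats.chi2.sf(chi2_stat, df)
```

Under equal covariance matrices and multivariate normality the returned p-value is approximately uniform; with very large samples even trivial differences between the S_i can yield small p-values, illustrating the sensitivity noted above.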

References

  1. Box, G. E. P. (1949). "A General Distribution Theory for a Class of Likelihood Criteria". Biometrika. 36 (3–4): 317–346. doi:10.1093/biomet/36.3-4.317.
  2. Warner, Rebecca M. (2013). Applied Statistics: From Bivariate Through Multivariate Techniques. SAGE. p. 778. ISBN 978-1-4129-9134-6.
  3. Manly, Bryan F. J. (2004). Multivariate Statistical Methods: A Primer (3rd ed.). CRC Press. p. 54. ISBN 978-1-58488-414-9.