In statistics, the widely applicable information criterion (WAIC), also known as the Watanabe–Akaike information criterion, is a generalization of the Akaike information criterion (AIC) to singular statistical models.[1]
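In Watanabe's formulation, WAIC is the Bayes training loss plus the functional variance divided by the sample size n. A common per-sample form (the notation below is one convention, not taken verbatim from the cited paper) is:

```latex
\mathrm{WAIC} = T_n + \frac{V_n}{n}, \qquad
T_n = -\frac{1}{n}\sum_{i=1}^{n} \log \mathbb{E}_w\!\bigl[p(X_i \mid w)\bigr], \qquad
V_n = \sum_{i=1}^{n} \Bigl( \mathbb{E}_w\!\bigl[(\log p(X_i \mid w))^{2}\bigr] - \mathbb{E}_w\!\bigl[\log p(X_i \mid w)\bigr]^{2} \Bigr)
```

where E_w[·] denotes the expectation over the posterior distribution of the parameter w given the sample X_1, …, X_n.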

The widely applicable Bayesian information criterion (WBIC) is the corresponding generalization of the Bayesian information criterion (BIC) to singular statistical models.[2]

WBIC is defined as the average of the log likelihood function over the posterior distribution taken at inverse temperature β = 1/log n, where n is the sample size.[2]
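Concretely, writing E_w^β[·] for the expectation over the tempered posterior proportional to φ(w) ∏ p(X_i | w)^β, with φ the prior, one common way to state the definition (sign conventions vary between presentations) is:

```latex
\mathrm{WBIC} = \mathbb{E}_w^{\beta}\!\left[ -\sum_{i=1}^{n} \log p(X_i \mid w) \right], \qquad \beta = \frac{1}{\log n}
```

With this sign convention, WBIC has the same leading asymptotic behaviour as the Bayes free energy −log Z, the quantity that BIC is designed to estimate in regular models (up to a conventional factor of 2).[2]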

Both WAIC and WBIC can be computed numerically without any knowledge of the true data-generating distribution.
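For example, WAIC can be estimated from MCMC output. The following is a minimal sketch, assuming a NumPy array log_lik of shape (S, n) holding the pointwise log likelihoods log p(x_i | θ_s) for S posterior draws; the function name and array layout are illustrative, not taken from the cited papers.

```python
import numpy as np
from scipy.special import logsumexp


def waic(log_lik):
    """Estimate WAIC (per-sample, natural-log scale) from pointwise posterior log likelihoods.

    log_lik[s, i] = log p(x_i | theta_s) for posterior draw theta_s.
    """
    S, n = log_lik.shape
    # Training loss T_n: -(1/n) * sum_i log( (1/S) * sum_s p(x_i | theta_s) )
    log_pointwise_pred = logsumexp(log_lik, axis=0) - np.log(S)
    training_loss = -np.mean(log_pointwise_pred)
    # Functional variance V_n: sum_i Var_s[ log p(x_i | theta_s) ]
    functional_variance = np.sum(np.var(log_lik, axis=0, ddof=1))
    return training_loss + functional_variance / n
```

On this scale, smaller values indicate better estimated predictive performance; multiplying by 2n gives a deviance-scale value comparable to how AIC is usually reported.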

See also

  - Akaike information criterion
  - Bayesian information criterion

References

  1. Watanabe, Sumio (2010). "Asymptotic Equivalence of Bayes Cross Validation and Widely Applicable Information Criterion in Singular Learning Theory". Journal of Machine Learning Research. 11: 3571–3594.
  2. Watanabe, Sumio (2013). "A Widely Applicable Bayesian Information Criterion" (PDF). Journal of Machine Learning Research. 14: 867–897.

