
Information Criteria and Statistical Modeling

About Information Criteria and Statistical Modeling

The Akaike information criterion (AIC), derived as an estimator of the Kullback-Leibler information discrepancy, provides a useful tool for evaluating statistical models, and numerous successful applications of the AIC have been reported in various fields of the natural sciences, social sciences and engineering.

One of the main objectives of this book is to provide comprehensive explanations of the concepts and derivations of the AIC and related criteria, including Schwarz's Bayesian information criterion (BIC), together with a wide range of practical examples of model selection and evaluation criteria. A secondary objective is to provide a theoretical basis for the analysis and extension of information criteria via a statistical functional approach. A generalized information criterion (GIC) and a bootstrap information criterion are presented, which provide unified tools for modeling and model evaluation for a diverse range of models, including various types of nonlinear models and model estimation procedures such as robust estimation, the maximum penalized likelihood method and a Bayesian approach.
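
For orientation, the standard textbook forms of the two criteria named above (common usage, not excerpted from this book) are, for a model with $k$ estimated parameters, maximized likelihood $L(\hat{\theta})$ and sample size $n$:

$$\mathrm{AIC} = -2 \log L(\hat{\theta}) + 2k, \qquad \mathrm{BIC} = -2 \log L(\hat{\theta}) + k \log n.$$

Smaller values are preferred; the penalty terms $2k$ and $k \log n$ correct the maximized log-likelihood for model complexity, with the AIC penalty arising as an estimate of the bias incurred when the Kullback-Leibler discrepancy is evaluated on the same data used for estimation.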

  • Language: English
  • ISBN: 9781441924568
  • Binding: Paperback
  • Pages: 288
  • Published: 23 November 2010
  • Dimensions: 155 x 16 x 235 mm
  • Weight: 441 g
