Difference Between AIC and BIC

AIC vs BIC

AIC and BIC are widely used model selection criteria. AIC stands for Akaike's Information Criterion and BIC for the Bayesian Information Criterion. Though both address model selection, they are not the same, and one can find many differences between the two approaches.

Akaike's Information Criterion was introduced in 1973 and the Bayesian Information Criterion in 1978. Hirotugu Akaike developed Akaike's Information Criterion, whereas Gideon E. Schwarz developed the Bayesian Information Criterion.

The AIC can be described as a measure of the goodness of fit of an estimated statistical model. The BIC is a criterion for model selection among a class of parametric models with different numbers of parameters.
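In their standard formulations, AIC = 2k − 2 ln L̂ and BIC = k ln(n) − 2 ln L̂, where k is the number of parameters, n the sample size, and ln L̂ the maximized log-likelihood. The sketch below computes both for two hypothetical fitted models; the log-likelihood values are made up purely for illustration:

```python
import math

def aic(log_likelihood, k):
    # AIC = 2k - 2 ln(L): a fixed penalty of 2 per parameter
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    # BIC = k ln(n) - 2 ln(L): the per-parameter penalty grows with n
    return k * math.log(n) - 2 * log_likelihood

# Hypothetical fits: model A (3 parameters) vs model B (5 parameters), n = 100
print(aic(-120.0, 3), bic(-120.0, 3, 100))   # model A
print(aic(-118.0, 5), bic(-118.0, 5, 100))   # model B
```

With these made-up numbers the two models tie on AIC, but BIC favors the smaller model A, illustrating BIC's heavier penalty on extra parameters. Lower values are better under both criteria.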

When comparing the Bayesian Information Criterion and Akaike's Information Criterion, the penalty for additional parameters is larger in BIC than in AIC: unlike the AIC, the BIC penalizes free parameters more strongly.
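The difference in penalties follows directly from the formulas: each extra parameter costs a constant 2 under AIC but ln(n) under BIC, so BIC's penalty is the heavier one whenever ln(n) > 2, i.e. n > e² ≈ 7.4. A quick illustration with a few sample sizes:

```python
import math

# Per-parameter penalty: AIC adds a constant 2, BIC adds ln(n).
# BIC penalizes more heavily once n exceeds e^2 (about 7.4).
for n in (5, 8, 100, 10_000):
    print(f"n={n:>6}  AIC penalty: 2.0  BIC penalty: {math.log(n):.3f}")
```

For virtually all realistic sample sizes, BIC's penalty dominates, and the gap widens as n grows.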

Akaike's Information Criterion generally aims to find the best approximation to an unknown, high-dimensional reality; that is, AIC does not assume any of the candidate models is the true model. The Bayesian Information Criterion, by contrast, assumes that the true model is among the candidates. Accordingly, the BIC is consistent (it selects the true model with probability approaching one as the sample size grows), whereas the AIC is not.

Akaike's Information Criterion presents the danger of overfitting, while the Bayesian Information Criterion presents the danger of underfitting. The BIC is less tolerant of extra parameters than the AIC, and because its penalty scales with the logarithm of the sample size, it becomes even less tolerant as the sample grows larger.

Akaike's Information Criterion is asymptotically equivalent to leave-one-out cross-validation. The Bayesian Information Criterion, on the other hand, is suited to consistent estimation of the true model.

Summary

1. AIC stands for Akaike's Information Criterion and BIC for the Bayesian Information Criterion.

2. AIC was introduced in 1973 and BIC in 1978.

3. The penalty for additional parameters is larger in BIC than in AIC.

4. AIC aims to find the best approximation to an unknown, high-dimensional reality. BIC, on the other hand, assumes the true model is among the candidates.

5. BIC is consistent, whereas AIC is not.

6. AIC is asymptotically equivalent to leave-one-out cross-validation. BIC, by contrast, is suited to consistent estimation of the true model.

7. BIC is less tolerant of extra parameters than AIC, and its tolerance decreases further as the sample size grows.

8. Unlike the AIC, the BIC penalizes free parameters more strongly.


