Statistics
Journal volumes: 17
Journal issues: 34
Article views: 703,517
Article downloads: 365,151

Total authors: 581
Unique authors: 422
Repeated authors: 159 (27%)

Submitted articles: 369
Accepted articles: 266
Rejected articles: 25
Published articles: 219

Acceptance rate: 72.09%
Rejection rate: 6.78%

Average time to accept: 282 days
Average time to first review: 27.2 days
Average time to publish: 26.1 days

Last 3 years:
Submitted articles: 36
Accepted articles: 23
Rejected articles: 2
Published articles: 10

Acceptance rate: 63.89%
Rejection rate: 5.56%

Average time to accept: 145 days
Average time to first review: 6.9 days
Average time to publish: 154 days
Volume 13, Issue 1 (September 2016)
JSRI 2016, 13(1): 59-88
Admissible Set of Rival Models based on the Mixture of Kullback-Leibler Risks
Abdolreza Sayyareh, asayyareh@kntu.ac.ir
Abstract:   (4006 Views)

Model selection aims to find the optimum model, and a good model will generally yield good results; herein lies the importance of model evaluation criteria for assessing the goodness of a candidate model. In this work we address the question of how the infinite set of all possible models that could have given rise to the data can be narrowed down to a reasonable set of statistical models. To answer this question, the paper considers a finite mixture of known criteria for the model selection problem. The aim of this kind of criterion is to select a reasonable set of models based on a measure of closeness. We demonstrate that a very general class of statistical criteria, which we call the finite-mixture Kullback-Leibler criterion, provides a way to select among rival models. We propose two types of coefficients for the mixture criterion, one based on the density and the other based on the risk function. A simulation study and a real data analysis confirm the proposed criteria.

Keywords: Akaike information criterion, Kullback-Leibler divergence, non-nested model selection.
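
The abstract describes narrowing an infinite family of candidate models to an admissible set of rivals by comparing estimated Kullback-Leibler risks. The paper's finite-mixture criterion and its two coefficient schemes are not reproduced here; what follows is a minimal Python sketch of the general idea under standard assumptions, using AIC (which estimates twice the expected Kullback-Leibler risk up to an additive constant) and keeping every rival model whose estimated risk is close to the best. The candidate distributions, the simulated data, and the cutoff of 2 AIC units are illustrative choices, not taken from the paper.

# Hypothetical sketch only -- not the paper's finite-mixture Kullback-Leibler
# criterion. AIC is used as a plug-in estimate of Kullback-Leibler risk to
# narrow rival parametric models down to an admissible subset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.gamma(shape=2.0, scale=1.5, size=200)   # illustrative data; the true model is gamma

# Rival models: each candidate distribution is fitted by maximum likelihood.
# Positive-support models get a fixed location of zero so that only genuine
# parameters are counted as free.
candidates = {
    "gamma":       (stats.gamma,   {"floc": 0}),
    "lognormal":   (stats.lognorm, {"floc": 0}),
    "exponential": (stats.expon,   {"floc": 0}),
    "normal":      (stats.norm,    {}),
}

aic = {}
for name, (dist, fixed) in candidates.items():
    params = dist.fit(data, **fixed)               # maximum-likelihood estimates
    loglik = dist.logpdf(data, *params).sum()      # maximized log-likelihood
    k = len(params) - len(fixed)                   # number of free parameters
    aic[name] = 2 * k - 2 * loglik                 # AIC = 2k - 2 log L

best = min(aic.values())
# Keep every rival whose estimated Kullback-Leibler risk is close to the best;
# the cutoff of 2 AIC units is a conventional, purely illustrative choice.
admissible = [m for m, a in aic.items() if a - best <= 2.0]

for name in sorted(aic, key=aic.get):
    print(f"{name:12s} AIC = {aic[name]:8.2f}   delta = {aic[name] - best:6.2f}")
print("Admissible set:", admissible)

In this sketch the admissible set is simply the set of models within a small AIC distance of the minimum; the paper instead combines Kullback-Leibler risks through a finite mixture with density-based or risk-based coefficients, which the example above does not attempt to reproduce.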
Full-Text [PDF 279 kb]   (2257 Downloads)    
Type of Study: Research | Subject: General
Received: 2016/11/23 | Accepted: 2016/11/23 | Published: 2016/11/23
Sayyareh A. Admissible Set of Rival Models based on the Mixture of Kullback-Leibler Risks. JSRI 2016; 13(1): 59-88.
URL: http://jsri.srtc.ac.ir/article-1-217-en.html


Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Journal of Statistical Research of Iran (JSRI)