%0 Journal Article
%A Sayyareh, Abdolreza
%T Admissible Set of Rival Models based on the Mixture of Kullback-Leibler Risks
%J Journal of Statistical Research of Iran
%V 13
%N 1
%U http://jsri.srtc.ac.ir/article-1-217-en.html
%R 10.18869/acadpub.jsri.13.1.4
%D 2016
%K Akaike information criterion, Kullback-Leibler divergence, non-nested model selection
%X Abstract: Model selection aims to find the optimum model. A good model will generally yield good results; herein lies the importance of model evaluation criteria for assessing the goodness of a candidate model. In this work we address the question of how the infinite set of all possible models that could have given rise to the data can be narrowed down to a reasonable set of statistical models. This paper considers a finite mixture of known criteria for the model selection problem to answer this question. The aim of this kind of criterion is to select a reasonable set of models based on a measure of closeness. We demonstrate that a very general class of statistical criteria, which we call the finite mixture Kullback-Leibler criterion, provides a way of selecting among rival models. We propose two types of coefficients for the mixture criterion: one based on the density and one based on the risk function. A simulation study and a real data analysis confirm the proposed criteria.
%> http://jsri.srtc.ac.ir/article-1-217-en.pdf
%P 59-88
%& 59
%! Admissible Set of Rival Models based on the...
%9 Research
%L A-10-1-165
%+
%G eng
%@ 2538-5771
%[ 2016