Abstract: Model selection aims to identify the best model. A good model generally yields good results; herein lies the importance of model evaluation criteria for assessing the quality of a given model. In this work we address the following question: how can the infinite set of all possible models that could have given rise to the data be narrowed down to a reasonable set of statistical models? To answer this question, this paper considers a finite mixture of known model selection criteria. The aim of such criteria is to select a reasonable set of models based on a measure of closeness. We demonstrate that a very general class of statistical criteria, which we call the finite mixture Kullback-Leibler criterion, provides a way of selecting among rival models. We propose two types of coefficients for the mixture criterion: one based on the density and the other based on the risk function. A simulation study and a real data analysis confirm the performance of the proposed criteria.
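To fix ideas, one illustrative form a finite mixture Kullback-Leibler criterion could take is sketched below; the notation ($K$ candidate models $g_k$, true density $f$, weights $w_k$) is an assumption for exposition and is not taken from the paper itself:

\[
C_{\mathrm{mix}} \;=\; \sum_{k=1}^{K} w_k \,\widehat{\mathrm{KL}}\!\left(f \,\middle\|\, g_k\right),
\qquad w_k \ge 0,\quad \sum_{k=1}^{K} w_k = 1,
\]

where $\widehat{\mathrm{KL}}$ denotes an estimate of the Kullback-Leibler divergence, and the mixing coefficients $w_k$ would be derived either from the fitted densities or from a risk function, corresponding to the two types of coefficients proposed in the paper.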