Committed to: Attribution-NonCommercial licensing (CC BY-NC), Open Access Publishing, and Plagiarism Prevention.
Statistics

Journal volumes: 17
Journal issues: 34
Article views: 703,517
Article downloads: 365,151

Total authors: 581
Unique authors: 422
Repeated authors: 159 (27%)

Submitted articles: 369
Accepted articles: 266
Rejected articles: 25
Published articles: 219

Acceptance rate: 72.09%
Rejection rate: 6.78%

Average time to accept: 282 days
Average time to first review: 27.2 days
Average time to publish: 26.1 days

Last 3 years:

Submitted articles: 36
Accepted articles: 23
Rejected articles: 2
Published articles: 10

Acceptance rate: 63.89%
Rejection rate: 5.56%

Average time to accept: 145 days
Average time to first review: 6.9 days
Average time to publish: 154 days
Search published articles
Showing 5 results for Mohtashami Borzadaran

Mohammad Khorashadizadeh, Gholam Reza Mohtashami Borzadaran,
Volume 4, Issue 1 (9-2007)
Abstract

In most situations a best estimator of a function of the parameter exists, but it sometimes has a complex form whose variance cannot be computed explicitly. A lower bound for the variance of an estimator is therefore one of the fundamentals of estimation theory, because it gives an idea of the accuracy of the estimator.

It is well known in statistical inference that the Cramér-Rao inequality establishes a lower bound for the variance of an unbiased estimator: under regularity conditions, the variance of any unbiased estimator cannot be smaller than a certain quantity. However, the inequality itself gives no indication of how sharp it is, i.e., how close the variance comes to the lower bound.

An important successor to the Cramér-Rao inequality is that of Bhattacharyya (1946, 1947).

We introduce Bhattacharyya lower bounds for the variance of an estimator and show that the Bhattacharyya inequality yields a greater lower bound for the variance of an unbiased estimator of a parametric function, which becomes sharper and sharper as the order of the Bhattacharyya matrix…
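As a concrete illustration of the gap between the two bounds (this example is a standard one, not taken from the paper): for X ~ N(θ, 1) and the parametric function g(θ) = θ², the unbiased estimator T = X² − 1 has variance 4θ² + 2, which strictly exceeds the first-order Cramér-Rao bound 4θ² but exactly attains the second-order Bhattacharyya bound. A minimal Monte Carlo check:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 1.5
n_rep = 200_000

# X ~ N(theta, 1); an unbiased estimator of g(theta) = theta^2 is T = X^2 - 1
x = rng.normal(theta, 1.0, size=n_rep)
t = x**2 - 1.0
var_t = t.var()

# First-order Cramér-Rao bound: (g'(theta))^2 / I(theta) = (2*theta)^2 / 1
cr_bound = 4.0 * theta**2

# Second-order Bhattacharyya bound for this model: 4*theta^2 + 2,
# which Var(T) = 4*theta^2 + 2 attains exactly
bhatt_bound = 4.0 * theta**2 + 2.0

print(cr_bound, bhatt_bound, var_t)  # empirical variance sits at the sharper bound
```

The simulated variance hovers around 11 for θ = 1.5, above the Cramér-Rao value 9 and matching the second-order Bhattacharyya value, illustrating how the higher-order bound is sharper.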


A. Mojiri, G. R. Mohtashami Borzadaran, Y. Waghei,
Volume 5, Issue 1 (9-2008)
Abstract

In modern society, and especially in our country, poverty, wealth, and social justice are among the most important topics of public and private discussion. The most important graphical tool used to describe the concentration of a quantity such as wealth in a society is the Lorenz curve, which econometricians commonly use to measure economic inequality. In the discrete case, the Lorenz curve is therefore defined as …
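In the discrete case, the Lorenz curve and the associated Gini index can be sketched as follows (the toy income data and the trapezoid-rule Gini are illustrative assumptions, not taken from the paper):

```python
import numpy as np

incomes = np.array([12.0, 18.0, 25.0, 40.0, 105.0])  # toy income data
x = np.sort(incomes)
n = x.size

# Discrete Lorenz curve: L(k/n) is the share of total income held by
# the poorest k units, with L(0) = 0 and L(1) = 1.
p = np.arange(n + 1) / n
L = np.concatenate(([0.0], np.cumsum(x) / x.sum()))

# Gini index as twice the area between the line of equality and the
# Lorenz curve, with the area under L computed by the trapezoid rule.
area = np.sum((p[1:] - p[:-1]) * (L[1:] + L[:-1]) / 2.0)
gini = 1.0 - 2.0 * area
print(round(gini, 3))  # 0.416 for this toy sample
```

Perfect equality gives a diagonal Lorenz curve and a Gini index of 0; the more the curve sags below the diagonal, the larger the index.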


G. R. Mohtashami Borzadaran , M. Amini,
Volume 7, Issue 1 (9-2010)
Abstract

In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. In this paper we examine measures such as the Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, and so on. Properties and results related to distances between probability distributions are derived via copula functions, and some inequalities are obtained in view of the dependence and information measures.
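As a small numerical illustration (the two Gaussian densities are an assumption of this sketch, not the paper's setting), two of the divergences named above can be computed by quadrature and checked against their closed forms for normal distributions:

```python
import numpy as np

def norm_pdf(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

mu1, s1, mu2, s2 = 0.0, 1.0, 1.0, 2.0
x = np.linspace(-12.0, 12.0, 200_001)
dx = x[1] - x[0]
p, q = norm_pdf(x, mu1, s1), norm_pdf(x, mu2, s2)

# Kullback-Leibler information KL(p || q) = integral of p * log(p/q)
kl = np.sum(p * np.log(p / q)) * dx
# Squared Hellinger distance H^2 = 1 - integral of sqrt(p * q)
h2 = 1.0 - np.sum(np.sqrt(p * q)) * dx

# Closed forms for two normal densities
kl_exact = np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2.0 * s2**2) - 0.5
h2_exact = 1.0 - np.sqrt(2.0 * s1 * s2 / (s1**2 + s2**2)) \
    * np.exp(-(mu1 - mu2)**2 / (4.0 * (s1**2 + s2**2)))

print(kl, kl_exact)  # both about 0.4431
print(h2, h2_exact)  # both about 0.1492
```

Note that KL is asymmetric in its arguments (swapping p and q changes the value), whereas the Hellinger distance is a symmetric, bounded measure.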


A. Arefi, M. Razmkhah, G. R. Mohtashami Borzadaran,
Volume 8, Issue 2 (3-2012)
Abstract

This paper focuses on a Bayes inference model for a simple step-stress life test using a Type-I censored sample in a discrete set-up. Assuming the failure times at each stress level are geometrically distributed, the Bayes estimation problem for the parameters of interest is investigated in both the point and interval approaches. To derive the Bayesian point estimators, various balanced loss functions are used. Furthermore, a simulation study and sensitivity analysis are performed to evaluate the performance of the results, and an example illustrates the proposed procedure. Finally, some conclusions are stated.
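A hypothetical sketch of this discrete set-up (the parameter values, the uniform beta prior, and all variable names are illustrative assumptions, not the paper's model or estimators): geometric failure times whose failure probability jumps when the stress changes at time tau, Type-I censoring at time c, and a conjugate beta posterior for the first-level parameter:

```python
import numpy as np

rng = np.random.default_rng(1)
p1, p2 = 0.05, 0.20       # failure probabilities at stress levels 1 and 2
tau, c, n = 10, 30, 1000  # stress-change time, censoring time, sample size

def failure_time():
    # In each discrete period the unit fails with probability p1 (t <= tau)
    # or p2 (t > tau): a geometric model with a change-point at tau.
    t = 0
    while True:
        t += 1
        if rng.random() < (p1 if t <= tau else p2):
            return t

times = np.array([failure_time() for _ in range(n)])
obs = np.minimum(times, c)               # Type-I censoring at time c
fail1 = np.sum(times <= tau)             # failures observed at stress level 1
trials1 = np.sum(np.minimum(obs, tau))   # Bernoulli trials spent at level 1

# Conjugate Beta(a, b) prior for p1: the posterior is
# Beta(a + fail1, b + trials1 - fail1), with posterior mean:
a, b = 1.0, 1.0
p1_post_mean = (a + fail1) / (a + b + trials1)
print(p1_post_mean)  # close to the true p1 = 0.05
```

The same conjugate update applies to p2 using the trials spent beyond tau; interval estimates would come from beta posterior quantiles.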


M. Yaghoobi Avval Riabi, G. R. Mohtashami Borzadaran, G. H. Yari,
Volume 11, Issue 1 (9-2014)
Abstract

In this paper, we first derive a family of maximum Tsallis entropy distributions under optional side conditions on the mean income and the Gini index. Corresponding to these distributions, a family of Lorenz curves compatible with the optional side conditions is then generated. We also show that our results reduce to the Shannon entropy case as $\beta$ tends to one. Finally, using actual data, we compare the maximum Tsallis entropy Lorenz curve with some parametric Lorenz curves.
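The limiting behavior mentioned above can be checked numerically for a discrete distribution (the probabilities below are illustrative): the Tsallis entropy $S_\beta(p) = (1 - \sum_i p_i^\beta)/(\beta - 1)$ approaches the Shannon entropy $-\sum_i p_i \log p_i$ as $\beta \to 1$:

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])  # an illustrative discrete distribution

def tsallis_entropy(p, beta):
    # S_beta(p) = (1 - sum_i p_i**beta) / (beta - 1), defined for beta != 1
    return (1.0 - np.sum(p ** beta)) / (beta - 1.0)

shannon = -np.sum(p * np.log(p))  # about 1.0297 for this p
for beta in (2.0, 1.5, 1.1, 1.001):
    print(beta, tsallis_entropy(p, beta))
# The printed values increase toward the Shannon entropy as beta -> 1.
```

This is the discrete analogue of the reduction noted in the abstract; the continuous (differential) version behaves the same way.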




Journal of Statistical Research of Iran (JSRI)