Volume 4, Issue 1 (9-2007)
JSRI 2007, 4(1): 29-46
The Structure of the Bhattacharyya Matrix in the Natural Exponential Family and Its Role in Approximating the Variance of a Statistic
Mohammad Khorashadizadeh, Reza Mohtashami Borzadaran 1
1- gmb1334@yahoo.com
Abstract:

In most situations the best estimator of a function of the parameter exists, but it sometimes has a complex form whose variance cannot be computed explicitly. A lower bound for the variance of an estimator is therefore fundamental in estimation theory, because it indicates how accurate an estimator can be.

It is well known in statistical inference that the Cramér-Rao inequality establishes a lower bound for the variance of an unbiased estimator: under regularity conditions, the variance of any unbiased estimator cannot be smaller than the reciprocal of the Fisher information. The inequality alone, however, gives no indication of how sharp it is, i.e., how close the variance actually comes to the bound.
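As a small illustrative sketch (not taken from the paper), the Cramér-Rao bound can be checked by simulation in a case where it is attained. For X_1, ..., X_n i.i.d. Poisson(λ), the Fisher information of the sample is n/λ, so any unbiased estimator of λ has variance at least λ/n; the sample mean attains this bound exactly. The values of λ, n, and the number of replications below are arbitrary choices for the demonstration.

```python
import numpy as np

# Illustrative sketch: Cramer-Rao bound for the Poisson mean.
# For X_1,...,X_n iid Poisson(lam), the Fisher information is n/lam,
# so Var(unbiased estimator of lam) >= lam/n. The sample mean is
# unbiased and attains this bound, so its empirical variance over
# many replications should sit at the bound (up to simulation noise).
rng = np.random.default_rng(0)
lam, n, reps = 3.0, 50, 20000

samples = rng.poisson(lam, size=(reps, n))
xbar = samples.mean(axis=1)          # unbiased estimator of lam

empirical_var = xbar.var()
cramer_rao_bound = lam / n           # reciprocal of the Fisher information

print(f"empirical Var(xbar) = {empirical_var:.5f}")
print(f"Cramer-Rao bound    = {cramer_rao_bound:.5f}")
```

Because the bound is attained here, the two printed numbers agree up to Monte Carlo error; for an estimator that does not attain the bound, the empirical variance would sit strictly above it.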

An important successor to the Cramér-Rao inequality is that of Bhattacharyya (1946, 1947).

We introduce Bhattacharyya lower bounds for the variance of an estimator and show that the Bhattacharyya inequality achieves a greater lower bound for the variance of an unbiased estimator of a parametric function, one that becomes sharper and sharper as the order of the Bhattacharyya matrix increases.
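The sharpening with increasing order can be seen in a classical worked example (our own hedged sketch, not the paper's computation). For a single observation X ~ Poisson(λ), the only unbiased estimator of g(λ) = e^{-λ} = P(X = 0) is the indicator 1{X = 0}, whose exact variance is e^{-λ}(1 - e^{-λ}). For the Poisson family the Bhattacharyya matrix is diagonal with entries J_ii = i!/λ^i, and since g^{(i)}(λ) = (-1)^i e^{-λ}, the order-k bound is Σ_{i=1}^{k} e^{-2λ} λ^i / i!, which increases with k and converges to the exact variance; k = 1 recovers the Cramér-Rao bound.

```python
import math

# Illustrative sketch: Bhattacharyya bounds of increasing order for
# estimating g(lam) = exp(-lam) from one Poisson(lam) observation.
# Exact variance of the (unique) unbiased estimator 1{X = 0}:
lam = 2.0
exact_var = math.exp(-lam) * (1 - math.exp(-lam))

def bhattacharyya_bound(k, lam):
    """Order-k Bhattacharyya lower bound for the variance of an
    unbiased estimator of exp(-lam); k = 1 is the Cramer-Rao bound."""
    return sum(math.exp(-2 * lam) * lam**i / math.factorial(i)
               for i in range(1, k + 1))

for k in (1, 2, 5, 10):
    print(f"order {k:2d}: bound = {bhattacharyya_bound(k, lam):.6f}")
print(f"exact variance  = {exact_var:.6f}")
```

The bounds increase monotonically in k and converge to e^{-λ} - e^{-2λ}, the exact variance, illustrating how higher-order Bhattacharyya bounds improve on the Cramér-Rao bound for nonlinear parametric functions.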

Keywords: natural exponential distributions, Bhattacharyya matrix, Bhattacharyya lower bound, Cramér-Rao lower bound, Fisher information.
Full-Text [PDF 1427 kb]
Type of Study: Research | Subject: General
Received: 2016/02/21 | Accepted: 2016/02/21 | Published: 2016/02/21

Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.