Volume 7, Issue 1 (9-2010)
JSRI 2010, 7(1): 47-60
Information Measures via Copula Functions
R. Mohtashami Borzadaran *1, M. Amini
1- gmb1334@yahoo.com
Abstract:

In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. In this paper we examine such measures, including the Kullback-Leibler information, J-divergence, Hellinger distance, and α-divergence. Properties and results concerning distances between probability distributions are derived via copula functions, and some inequalities relating dependence and information measures are obtained.
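The abstract names several standard divergence measures. As a minimal numerical sketch (my own illustration, not code from the paper, which works with densities coupled via copulas), the textbook definitions for two discrete probability vectors are:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler information: sum_i p_i * log(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p_i = 0 contribute zero
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def j_divergence(p, q):
    """J-divergence: the symmetrized KL, KL(p||q) + KL(q||p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

def hellinger(p, q):
    """Hellinger distance: (1/sqrt(2)) * || sqrt(p) - sqrt(q) ||_2."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

def alpha_divergence(p, q, alpha):
    """One common parameterization of the alpha-divergence
    (alpha not in {0, 1}): (sum p^a q^(1-a) - 1) / (a (a - 1))."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float((np.sum(p**alpha * q**(1 - alpha)) - 1)
                 / (alpha * (alpha - 1)))

p = [0.2, 0.3, 0.5]
q = [0.3, 0.3, 0.4]
print(kl_divergence(p, q), hellinger(p, q))
```

Note the α-divergence admits several parameterizations in the literature; the one above reduces to (half) the chi-square divergence at α = 2.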

Keywords: Information measures, Fisher information, Kullback-Leibler information, Hellinger distance, α-divergence
Full-Text [PDF 185 kb]
Type of Study: Research | Subject: General
Received: 2016/01/10 | Accepted: 2016/01/10 | Published: 2016/01/10
Mohtashami Borzadaran R, Amini M. Information Measures via Copula Functions. JSRI. 2010; 7 (1) :47-60
URL: http://jsri.srtc.ac.ir/article-1-97-en.html

Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Journal of Statistical Research of Iran (JSRI)