Volume 7, Issue 1 (9-2010)
JSRI 2010, 7(1): 47-60
Information Measures via Copula Functions
R. Mohtashami Borzadaran G.*, M. Amini
* Corresponding author: gmb1334@yahoo.com
Abstract:

In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. In this paper, we examine measures such as the Kullback-Leibler information, J-divergence, Hellinger distance, and α-divergence. Properties and results related to distances between probability distributions are derived via copula functions, and some inequalities relating the dependence and information measures are obtained.
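As a minimal numerical sketch of the idea of deriving an information measure via a copula (not taken from the paper itself), one can compute the Kullback-Leibler information of a copula density relative to the independence copula, whose density is identically 1 on the unit square. The Farlie-Gumbel-Morgenstern (FGM) copula used here is an illustrative choice; the grid size and parameter values are assumptions for the example.

```python
import math

def fgm_copula_density(u, v, theta):
    """Farlie-Gumbel-Morgenstern copula density:
    c(u, v) = 1 + theta * (1 - 2u) * (1 - 2v), with |theta| <= 1."""
    return 1.0 + theta * (1.0 - 2.0 * u) * (1.0 - 2.0 * v)

def kl_from_independence(density, n=200, **kw):
    """Kullback-Leibler information of a copula density relative to the
    independence copula (density 1), approximated by a midpoint rule on
    an n x n grid over the unit square:  KL = integral c * log(c) du dv."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        u = (i + 0.5) * h
        for j in range(n):
            v = (j + 0.5) * h
            c = density(u, v, **kw)
            if c > 0.0:
                total += c * math.log(c) * h * h
    return total

# Independence (theta = 0) gives zero KL information;
# any dependence (theta != 0) gives a strictly positive value.
print(kl_from_independence(fgm_copula_density, theta=0.0))  # 0.0
print(kl_from_independence(fgm_copula_density, theta=0.5))  # positive
```

Because the KL information vanishes exactly at independence and grows with the copula's departure from it, it acts as a dependence measure — the kind of link between dependence and information measures that the abstract describes.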

Keywords: Information measures, Fisher information, Kullback-Leibler information, Hellinger distance, α-divergence
Full-Text [PDF 185 kb]
Type of Study: Research | Subject: General
Received: 2016/01/10 | Accepted: 2016/01/10 | Published: 2016/01/10
Mohtashami Borzadaran R, Amini M. Information Measures via Copula Functions. JSRI. 2010; 7 (1) :47-60
URL: http://jsri.srtc.ac.ir/article-1-97-en.html


Journal of Statistical Research of Iran (Scientific-Research) (JSRI)