Volume 15, Issue 1 (9-2018)
JSRI 2018, 15(1): 83-98
On Tsallis Relative Entropy Rate of Hidden Markov Models
Zohre Nikooravesh *
Birjand University of Technology , nikooravesh@birjand.ac.ir
Abstract:
In this paper, we study the Tsallis relative entropy rate between a homogeneous Markov chain and a hidden Markov chain, the latter defined by observing the output of a discrete stochastic channel whose input is a finite-state-space homogeneous stationary Markov chain. To this end, we first obtain the Tsallis relative entropy between two finite subsequences of the above-mentioned chains, using the definition of the Tsallis relative entropy between two random variables; we then define the Tsallis relative entropy rate between these stochastic processes. Finally, we calculate the Tsallis relative entropy rate for some hidden Markov models.
 
Keywords: Tsallis relative entropy rate, stochastic channel, hidden Markov models.
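The quantity studied in the abstract can be illustrated numerically. The sketch below is a minimal brute-force illustration, not the paper's analytic method: it uses the standard Tsallis relative entropy D_q(p‖r) = (Σᵢ pᵢ^q rᵢ^(1−q) − 1)/(q − 1), which reduces to the Kullback–Leibler divergence as q → 1, and approximates the rate by computing D_q between the length-n laws of a Markov chain X and its channel output Y, divided by n. The 2-state transition matrix, the binary symmetric channel with flip probability eps, and the choice of q are all illustrative assumptions, not taken from the paper.

```python
from itertools import product

def tsallis_relative_entropy(p, r, q):
    """D_q(p || r) = (sum_i p_i^q r_i^(1-q) - 1) / (q - 1).

    Reduces to the Kullback-Leibler divergence as q -> 1.
    Assumes r_i > 0 wherever p_i > 0.
    """
    return (sum(pi ** q * ri ** (1 - q) for pi, ri in zip(p, r) if pi > 0) - 1) / (q - 1)

# --- All model parameters below are illustrative assumptions, not from the paper ---
P = [[0.9, 0.1], [0.2, 0.8]]          # transition matrix of a 2-state Markov chain X
pi0 = [2 / 3, 1 / 3]                  # its stationary distribution (solves pi = pi P)
eps = 0.05                            # binary symmetric channel flips each state w.p. eps
C = [[1 - eps, eps], [eps, 1 - eps]]  # C[x][y] = P(Y = y | X = x)

def chain_prob(x):
    """Probability that the stationary chain emits the state sequence x."""
    p = pi0[x[0]]
    for a, b in zip(x, x[1:]):
        p *= P[a][b]
    return p

def hidden_prob(y):
    """Probability of observing y at the channel output, summed over hidden paths."""
    total = 0.0
    for x in product([0, 1], repeat=len(y)):
        p = chain_prob(x)
        for a, b in zip(x, y):
            p *= C[a][b]
        total += p
    return total

def tsallis_rate_estimate(n, q):
    """D_q between the length-n laws of X and Y, divided by n."""
    seqs = list(product([0, 1], repeat=n))
    p_joint = [chain_prob(s) for s in seqs]
    r_joint = [hidden_prob(s) for s in seqs]
    return tsallis_relative_entropy(p_joint, r_joint, q) / n
```

Evaluating `tsallis_rate_estimate(n, 0.8)` for increasing n (enumeration limits this to small n) gives a crude empirical picture of the limit; the paper instead defines and computes this rate analytically.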
Full-Text [PDF 236 kb]
Type of Study: Research | Subject: General
Received: 2017/10/11 | Accepted: 2018/09/29 | Published: 2019/03/03
Nikooravesh Z. On Tsallis Relative Entropy Rate of Hidden Markov Models. JSRI. 2018; 15(1): 83-98
URL: http://jsri.srtc.ac.ir/article-1-291-en.html
Journal of Statistical Research of Iran (JSRI)