Volume 14, Issue 2 (3-2018)
JSRI 2018, 14(2): 247-266
Rank-based Least-squares Independent Component Analysis
Jafar Rahmani Shamsi, Ali Dolati* (adolati@yazd.ac.ir)
Abstract:
In this paper, we propose a nonparametric rank-based alternative to the least-squares independent component analysis algorithm developed by Suzuki and Sugiyama (2011). The basic idea is to estimate the squared-loss mutual information, which is used as the objective function of the algorithm, through its copula-density version; consequently, no marginal densities have to be estimated. We provide an empirical evaluation of the proposed algorithm through simulation and real-data analysis. Since the proposed algorithm uses rank values rather than the actual values of the observations, it is extremely robust to outliers and suffers less from the presence of noise than the other algorithms.
Keywords: Copula, independent component analysis, squared-loss mutual information.
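The idea summarized in the abstract — rank-transform the demixed components to copula pseudo-observations and minimize an estimate of the squared-loss mutual information, which vanishes exactly when the components are independent — can be sketched on a toy two-source problem. This is an illustrative reconstruction, not the authors' implementation: the histogram copula-density estimator, the bin count, the whitening step, and the grid search over a single rotation angle are all simplifying assumptions made here.

```python
import numpy as np

def pseudo_obs(x):
    """Rank-transform a sample to (0, 1): empirical-copula pseudo-observations."""
    n = len(x)
    return (np.argsort(np.argsort(x)) + 1.0) / (n + 1.0)

def smi_copula(u, v, bins=10):
    """Histogram estimate of squared-loss mutual information on the copula scale:
    (1/2) * integral over the unit square of (c(u, v) - 1)^2, where c is the
    copula density. On the rank scale the marginals are uniform, so no
    marginal densities need to be estimated."""
    n = len(u)
    H, _, _ = np.histogram2d(u, v, bins=bins, range=[[0, 1], [0, 1]])
    c = H * bins * bins / n                 # copula-density estimate per cell
    return 0.5 * np.mean((c - 1.0) ** 2)    # Riemann sum, cell area 1/bins^2

# Toy 2-source demo: mix, whiten, then grid-search the demixing rotation.
rng = np.random.default_rng(0)
s = rng.uniform(-1.0, 1.0, (2, 2000))           # independent non-Gaussian sources
x = np.array([[1.0, 0.6], [0.4, 1.0]]) @ s      # observed mixtures
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(d ** -0.5) @ E.T @ x            # whitened mixtures

def objective(t):
    """Copula-based SMI of the components demixed by a rotation of angle t."""
    y1 = np.cos(t) * z[0] + np.sin(t) * z[1]
    y2 = -np.sin(t) * z[0] + np.cos(t) * z[1]
    return smi_copula(pseudo_obs(y1), pseudo_obs(y2))

angles = np.linspace(0.0, np.pi / 2, 91)
best_t = min(angles, key=objective)
best_smi = objective(best_t)
mixed_smi = smi_copula(pseudo_obs(x[0]), pseudo_obs(x[1]))
```

Because the objective depends on the data only through ranks, replacing a few observations by gross outliers leaves the pseudo-observations almost unchanged, which is the source of the robustness claimed in the abstract.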
Type of Study: Research | Subject: General
Received: 2017/02/22 | Accepted: 2018/02/07 | Published: 2018/03/17
1. Amari, S., Cichocki, A. and Yang, H. (1996). A New Learning Algorithm for Blind Signal Separation. Advances in Neural Information Processing Systems. 757-763, MIT Press.
2. Ameri, M.R., Shokripour, M., Mohammadpour, A., and Nassiri, V. (2013). Parametric Independent Component Analysis for Stable Distributions. Artificial Intelligence Research, 2, 27-34. [DOI:10.5430/air.v2n3p27]
3. Bach, F.R. and Jordan, M.I. (2002). Kernel Independent Component Analysis. JMLR, 3, 1-48.
4. Bell, A.J. and Sejnowski, T.J. (1995). An Information Maximization Approach to Blind source Separation and Blind Deconvolution. Neural Comput. 7, 1129-1159. [DOI:10.1162/neco.1995.7.6.1129]
5. Bouezmarni, T. and Rolin, J.M. (2003). Consistency of the Beta Kernel Density Function Estimator. The Canadian Journal of Statistics/La Revue Canadienne de Statistique, 31, 89-98. [DOI:10.2307/3315905]
6. Calsaverini, R.S. and Vicente, R. (2009). An Information-theoretic Approach to Statistical dependence: Copula Information. EPL (Europhysics Letters), 88, 68003. [DOI:10.1209/0295-5075/88/68003]
7. Cardoso, J.F. and Souloumiac, A. (1993). Blind Beamforming for Non Gaussian Signals. IEE Proceedings-F, 140, 362-370. [DOI:10.1049/ip-f-2.1993.0054]
8. Charpentier, A., Fermanian, J.D. and Scaillet, O. (2006). The Estimation of Copulas: Theory and Practice, in Copulas: from Theory to Application in Finance, J. Rank, ed., Risk Book, London, pp. 35-60.
9. Chen, S.X. (1999). Beta Kernel Estimators for Density Functions, Comput. Statist. Data Anal., 31, 131-145. [DOI:10.1016/S0167-9473(99)00010-9]
10. Comon, P. (1994). Independent Component Analysis, a New Concept? Signal Proc., 36, 287-314. [DOI:10.1016/0165-1684(94)90029-9]
11. Gretton, A., Bousquet, O., Smola, A. and Scholkopf, B. (2005). Measuring Statistical Dependence with Hilbert-Schmidt Norms. In Algorithmic learning theory (pp. 63-77). Springer Berlin Heidelberg. [DOI:10.1007/11564089_7]
12. Hyvarinen, A. (1999). Fast and Robust Fixed-point Algorithms for Independent Component Analysis. IEEE Transactions on Neural Networks, 10, 626-634. [DOI:10.1109/72.761722]
13. Hyvarinen, A. and Oja, E. (2000). Independent Component Analysis: Algorithms and Applications. Neural Networks, 13, 411-430. [DOI:10.1016/S0893-6080(00)00026-5]
14. Hyvarinen, A., Karhunen, J. and Oja, E. (2004). Independent Component Analysis. John Wiley & Sons, New York.
15. Jabari, N.H. (2009). Almost Sure Convergence of Kernel Bivariate Distribution Function Estimator under Negative Association. J. Statist. Res. Iran, 6, 243-255.
16. Joe, H. (1989). Relative Entropy Measures of Multivariate Dependence. JASA, 84, 157-164. [DOI:10.1080/01621459.1989.10478751]
17. Jones, M.C. (1993). Simple Boundary Correction for Kernel Density Estimation. Statistics and Computing, 3,135-146. [DOI:10.1007/BF00147776]
18. Jutten, C. and Herault, J. (1991). Blind Separation of Sources, Part I: An Adaptive Algorithm based on Neuromimetic Architecture. Signal Processing, 24, 1-10. [DOI:10.1016/0165-1684(91)90079-X]
19. Kirshner, S. and Poczos, B. (2008). ICA and ISA using Schweizer-Wolff Measure of Dependence. In Proceedings of the 25th International Conference on Machine Learning (pp. 464-471). ACM. [DOI:10.1145/1390156.1390215]
20. Learned-Miller, E.G. and Fisher, J.W. (2003). ICA using Spacings Estimates of Entropy. JMLR, 4, 1271-1295.
21. Mohtashami Borzadaran, R. and Amini, M. (2010). Information Measures via Copula Functions. J. Statist. Res. Iran, 7, 47-60. [DOI:10.18869/acadpub.jsri.7.1.47]
22. Muller, H.G. (1991). Smooth Optimum Kernel Estimators near Endpoints. Biometrika, 78, 521-530. [DOI:10.1093/biomet/78.3.521]
23. Nelsen, R.B. (2006). An Introduction to Copulas. Springer, New York.
24. Peng, H. and Siming Z. (2007). Handling of Incomplete Data Sets using ICA and SOM in Data Mining. Neural Computing and Applications, 16, 167-172. [DOI:10.1007/s00521-006-0058-6]
25. Shen, H., Jegelka, S. and Gretton, A. (2009). Fast Kernel-based Independent Component Analysis. IEEE Transactions on Signal Processing, 57, 3498-3511. [DOI:10.1109/TSP.2009.2022857]
26. Sklar, A. (1959). Fonctions de Répartition à n Dimensions et Leurs Marges. Publ. Inst. Statist. Univ. Paris, 8, 229-231.
27. Sun, Z., Liu, J., Sun, J., Sun, X. and Ling, J. (2009). A Motion Location based Video Watermarking Scheme using ICA to Extract Dynamic Frames. Neural Computing and Applications, 18, 507-514. [DOI:10.1007/s00521-009-0253-3]
28. Suzuki, T. and Sugiyama, M. (2011). Least-squares Independent Component Analysis. Neural Computation, 23, 284-301. [DOI:10.1162/NECO_a_00062]
29. Wand, M.P. and Jones, M.C. (1995). Kernel Smoothing. Chapman and Hall, London. [DOI:10.1007/978-1-4899-4493-1]
30. He, T., Clifford, G. and Tarassenko, L. (2006). Application of Independent Component Analysis in Removing Artefacts from the Electrocardiogram. Neural Computing and Applications, 15, 105-116. [DOI:10.1007/s00521-005-0013-y]
