Abstract
In this paper, we analyze the performance of a semiparametric principal component analysis method named Copula Component Analysis (COCA) (Han & Liu, 2012) when the data are dependent. The semiparametric model assumes that, after unspecified marginally monotone transformations, the distributions are multivariate Gaussian. We study the scenario in which the observations are drawn from non-i.i.d. processes (m-dependence or the more general φ-mixing case). We show that COCA can tolerate weak dependence. In particular, we provide generalization bounds for both support recovery and parameter estimation of COCA with dependent data. We also give explicit sufficient conditions on the degree of dependence under which the parametric rate can be maintained. To our knowledge, this is the first work to analyze the theoretical performance of PCA for dependent data in high-dimensional settings. Our results strictly generalize the analysis in Han & Liu (2012), and the techniques we use are of independent interest for analyzing a variety of other multivariate statistical methods.
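To make the semiparametric model concrete, the following is a minimal Python sketch of the rank-based idea behind a COCA-style estimator, assuming the standard sin(πτ/2) bridge between Kendall's tau and the latent Pearson correlation under a Gaussian copula. It is not the authors' implementation: it omits the sparsity constraint that COCA places on the leading eigenvector, and the helper names `latent_correlation` and `leading_eigenvector` are hypothetical.

```python
# Sketch only: estimate the latent Gaussian-copula correlation matrix from
# Kendall's tau rank correlations, then take its leading eigenvector.
# The sparsity-constrained optimization of the actual COCA method is omitted.
import numpy as np
from scipy.stats import kendalltau

def latent_correlation(X):
    """Estimate the latent correlation matrix via sin(pi/2 * Kendall's tau)."""
    n, d = X.shape
    R = np.eye(d)
    for j in range(d):
        for k in range(j + 1, d):
            tau, _ = kendalltau(X[:, j], X[:, k])
            R[j, k] = R[k, j] = np.sin(np.pi * tau / 2.0)
    return R

def leading_eigenvector(R):
    """Return the eigenvector of R associated with its largest eigenvalue."""
    _, eigvecs = np.linalg.eigh(R)
    return eigvecs[:, -1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Latent Gaussian data pushed through monotone marginal transforms
    # (exp and cube), so the observations are non-Gaussian but follow
    # a Gaussian copula, matching the semiparametric model above.
    Z = rng.multivariate_normal(np.zeros(3),
                                [[1.0, 0.6, 0.3],
                                 [0.6, 1.0, 0.2],
                                 [0.3, 0.2, 1.0]],
                                size=500)
    X = np.column_stack([np.exp(Z[:, 0]), Z[:, 1] ** 3, Z[:, 2]])
    print("Estimated leading eigenvector:",
          leading_eigenvector(latent_correlation(X)))
```

Because the estimator depends on the data only through ranks, it is invariant to the unknown monotone marginal transformations, which is what makes the rank-based route natural for this copula model.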
| Original language | English (US) |
|---|---|
| Pages | 240-248 |
| Number of pages | 9 |
| State | Published - 2013 |
| Externally published | Yes |
| Event | 30th International Conference on Machine Learning, ICML 2013 - Atlanta, GA, United States |
| Duration | Jun 16 2013 → Jun 21 2013 |
Other

| Other | 30th International Conference on Machine Learning, ICML 2013 |
|---|---|
| Country/Territory | United States |
| City | Atlanta, GA |
| Period | 6/16/13 → 6/21/13 |
ASJC Scopus subject areas
- Human-Computer Interaction
- Sociology and Political Science