Publication:
Sparse Sliced Inverse Regression via Lasso

Date

2019-03-09

Publisher

Informa UK Limited

Citation

Lin, Qian, Zhigen Zhao, and Jun Liu. "Sparse Sliced Inverse Regression via Lasso." Journal of the American Statistical Association 114, no. 528 (2019): 1726-1739. DOI: 10.1080/01621459.2018.1520115

Abstract

For multiple index models, it has recently been shown that the sliced inverse regression (SIR) is consistent for estimating the sufficient dimension reduction (SDR) space if and only if ρ = lim p/n = 0, where p is the dimension and n is the sample size. Thus, when p is of the same or a higher order than n, additional assumptions such as sparsity must be imposed to ensure consistency of SIR. By constructing artificial response variables from the top eigenvectors of the estimated conditional covariance matrix, we introduce a simple Lasso regression method to obtain an estimate of the SDR space. The resulting algorithm, Lasso-SIR, is shown to be consistent and to achieve the optimal convergence rate under certain sparsity conditions when p is of order o(n^2 λ^2), where λ is the generalized signal-to-noise ratio. We also demonstrate the superior performance of Lasso-SIR compared with existing approaches via extensive numerical studies and several real data examples.
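
The sketch below illustrates the idea summarized in the abstract: slice the response, estimate the conditional covariance of E[X | y] from the slice means, take its top eigenvectors, build artificial responses from them, and recover sparse SDR directions with a Lasso regression. It is a minimal illustration under stated assumptions, not the authors' reference implementation; the slice count H, the inverse-eigenvalue scaling of the artificial responses, and the use of scikit-learn's LassoCV for the penalized regression are choices made here for illustration.

```python
# Minimal sketch of a Lasso-SIR-style procedure (illustrative, not the paper's exact construction).
import numpy as np
from sklearn.linear_model import LassoCV


def lasso_sir(X, y, d=1, H=10):
    """Estimate d sparse SDR directions from (X, y) via Lasso-SIR-style steps."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)                      # center the predictors

    # 1. Slice the response into H slices of (roughly) equal size.
    order = np.argsort(y)
    slices = np.array_split(order, H)

    # 2. Slice means and the estimated conditional covariance of E[X | y].
    slice_means = np.vstack([Xc[idx].mean(axis=0) for idx in slices])
    weights = np.array([len(idx) / n for idx in slices])
    Lambda_hat = slice_means.T @ (weights[:, None] * slice_means)

    # 3. Top-d eigenvalues/eigenvectors of the estimated SIR matrix.
    evals, evecs = np.linalg.eigh(Lambda_hat)
    top = np.argsort(evals)[::-1][:d]
    evals, evecs = evals[top], evecs[:, top]

    # Map each observation back to its slice.
    slice_of = np.empty(n, dtype=int)
    for h, idx in enumerate(slices):
        slice_of[idx] = h

    # 4. Artificial responses: project each observation's slice mean onto an
    #    eigenvector and rescale by the corresponding eigenvalue (assumed scaling).
    betas = []
    for k in range(d):
        y_tilde = slice_means[slice_of] @ evecs[:, k] / evals[k]
        # 5. Lasso regression of the artificial response on X yields a sparse
        #    estimate of one SDR direction.
        betas.append(LassoCV(cv=5).fit(Xc, y_tilde).coef_)
    return np.column_stack(betas)                # p x d estimate of the SDR basis
```

The column space of the returned p x d matrix serves as the estimate of the SDR space; the sparsity pattern comes entirely from the Lasso penalty, which is the point of the construction when p is comparable to or larger than n.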

Keywords

Statistics, Probability and Uncertainty, Statistics and Probability

Terms of Use

This article is made available under the terms and conditions applicable to Open Access Policy Articles (OAP), as set forth in the Terms of Service.
