Abstract

We consider a multivariate finite mixture of Gaussian regression models for high-dimensional data, where the number of covariates and the size of the response may be much larger than the sample size. We provide an ℓ1-oracle inequality satisfied by the Lasso estimator with respect to the Kullback-Leibler loss. This result extends to the multivariate case the ℓ1-oracle inequality established by Meynet in [ESAIM: PS 17 (2013) 650-671]. We focus on the Lasso for its ℓ1-regularization properties rather than as a variable selection procedure.
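
For orientation, here is a minimal sketch of the kind of model and ℓ1-penalized criterion the abstract refers to: a K-component mixture of multivariate Gaussian regressions and a Lasso-type estimator that penalizes the regression coefficients. The notation (π_k, β_k, Σ_k, λ) and the exact form of the penalty are our own illustrative assumptions, not taken from the paper.

```latex
% Minimal sketch (illustrative notation, not the paper's own) of a
% mixture-of-Gaussian-regressions model and a Lasso-type criterion.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% K-component mixture of multivariate Gaussian regressions: for x in R^p and
% y in R^q, with weights pi_k, regression matrices beta_k (q x p), and
% covariance matrices Sigma_k.
\[
  s_\xi(y \mid x) \;=\; \sum_{k=1}^{K} \pi_k \,
  \frac{1}{(2\pi)^{q/2} (\det \Sigma_k)^{1/2}}
  \exp\!\Big( -\tfrac{1}{2} (y - \beta_k x)^{\top} \Sigma_k^{-1} (y - \beta_k x) \Big).
\]

% Lasso-type estimator: penalized negative conditional log-likelihood with an
% l1 penalty of level lambda on the regression coefficients.
\[
  \hat{s}^{\mathrm{Lasso}}(\lambda) \;\in\;
  \operatorname*{arg\,min}_{\xi}
  \Big\{ -\frac{1}{n} \sum_{i=1}^{n} \log s_\xi(y_i \mid x_i)
         \;+\; \lambda \sum_{k=1}^{K} \sum_{j,l} \big| [\beta_k]_{j,l} \big| \Big\}.
\]

\end{document}
```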

  • Publication date: 2015