Lecture "SLOPE and Lasso estimators: improved oracle bounds and optimality" by Professor Alexander Tsybakov
On April 18, Alexander Tsybakov, Head of the CREST Laboratory (Center for Research in Economics and Statistics) and Professor at the University Paris 6 (France), delivered the lecture "SLOPE and Lasso estimators: improved oracle bounds and optimality".
Overview of the lecture:
We show that two polynomial-time methods, namely the Lasso estimator with an adaptively chosen tuning parameter and the Slope estimator, achieve the exact minimax rate of convergence (s/n) log(p/s) over the class of s-sparse target vectors in R^p for the problem of l2-estimation in the high-dimensional linear regression model. This is proved under the Restricted Eigenvalue (RE) condition on the design for the Lasso, and under a slightly more restrictive condition on the design for the Slope. The main results take the form of sharp oracle inequalities that account for the model misspecification error. Minimax optimal bounds are also obtained for the estimation error in the lq norm, 1 ≤ q ≤ 2, when the model is correctly specified. The results are non-asymptotic and hold both in probability and in expectation. The assumptions we make on the design are satisfied with high probability for a large class of random matrices with independent and possibly anisotropically distributed rows. We give a comparative analysis of the conditions under which oracle bounds can be obtained for the Lasso and Slope estimators. In particular, we show that several known conditions, such as the RE condition and the sparse eigenvalue condition, are equivalent if the l2-norms of the regressors are uniformly bounded. This is joint work with Pierre C. Bellec and Guillaume Lecué.
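For readers unfamiliar with the Slope estimator, it penalizes the coefficients through a sorted-l1 norm with a nonincreasing weight sequence. The following is a rough numerical sketch, not the adaptive tuning analyzed in the lecture: a plain proximal-gradient solver using the stack-based prox of the sorted-l1 penalty (Bogdan et al., 2015). The weight sequence, step size, and problem sizes below are illustrative assumptions only.

```python
import numpy as np

def prox_sorted_l1(v, lam):
    """Prox of b -> sum_i lam_i |b|_(i) with lam nonincreasing,
    via the stack-based pool-adjacent-violators scheme."""
    sign = np.sign(v)
    order = np.argsort(np.abs(v))[::-1]          # sort |v| in decreasing order
    z = np.abs(v)[order] - lam                   # shifted, sorted magnitudes
    blocks = []                                  # each block: [start, end, total]
    for i, zi in enumerate(z):
        blocks.append([i, i, zi])
        # merge while block means violate the nonincreasing constraint
        while len(blocks) > 1 and (
            blocks[-2][2] / (blocks[-2][1] - blocks[-2][0] + 1)
            <= blocks[-1][2] / (blocks[-1][1] - blocks[-1][0] + 1)
        ):
            s, e, t = blocks.pop()
            blocks[-1][1] = e
            blocks[-1][2] += t
    x = np.zeros_like(z)
    for s, e, t in blocks:
        x[s:e + 1] = max(t / (e - s + 1), 0.0)   # clip negative block means at zero
    out = np.zeros_like(x)
    out[order] = x                               # undo the sort
    return sign * out

def slope(X, y, lam, n_iter=300):
    """Proximal-gradient (ISTA) solver for
    min_b ||y - X b||^2 / (2 n) + sum_i lam_i |b|_(i)."""
    n, p = X.shape
    step = n / np.linalg.norm(X, ord=2) ** 2     # 1/L for the smooth part
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        b = prox_sorted_l1(b - step * grad, step * lam)
    return b

# toy s-sparse recovery run (sizes and weights are hypothetical choices)
rng = np.random.default_rng(0)
n, p, s = 100, 50, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 1.0
y = X @ beta + 0.1 * rng.standard_normal(n)
lam = 0.05 * np.sqrt(np.log(2 * p / np.arange(1, p + 1)))  # nonincreasing weights
beta_hat = slope(X, y, lam)
print("l2 error:", np.linalg.norm(beta_hat - beta))
```

With equal weights the sorted-l1 prox reduces to ordinary soft thresholding, so the same code also runs the Lasso as a special case.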