**Date & Time:** Monday, January 18, 2010 12:30-13:15.

**Venue:** Ramanujan Hall

**Title:** Shrinkage estimation in the Lasso setup

**Speaker:** Rajendran Narayanan, Cornell University

**Abstract:** It is well known that James-Stein type estimators dominate the MLE
of the mean of a multivariate normal distribution when the dimension
is at least three. We consider the problem of improving the Lasso, or l1
penalised, regression estimator. The Lasso solution can be shown to be the
projection of the OLS estimator onto a closed convex set. Treating the Lasso
estimator as the restricted MLE of the regression parameters, we shrink the
OLS vector towards this set, namely a p-dimensional cross-polytope whose size
is parameterised by a tuning parameter. Considering shrinkage estimation in
this geometric framework, we construct a class of estimators exhibiting risk
gains over the Lasso with comparable prediction risk estimates. This in turn
also enables a data-based method of choosing the tuning parameter. Borrowing
from principles of convexity theory, we propose a theoretical basis for
ensuring model sparsity.
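The geometric picture in the abstract — the Lasso constraint set as a cross-polytope, i.e. an ℓ1 ball — can be illustrated with the standard sort-based Euclidean projection onto that set. This is a minimal sketch, not the speaker's construction: the function name and the use of NumPy are my own, and the connection to the Lasso (projecting the OLS vector onto the ball) holds exactly only in special cases such as an orthonormal design.

```python
import numpy as np

def project_l1_ball(v, t):
    """Euclidean projection of v onto the l1 ball {x : ||x||_1 <= t},
    a p-dimensional cross-polytope whose size is set by t.

    Uses the classical construction: find the soft-threshold theta such
    that soft-thresholding v at theta lands exactly on the ball's boundary.
    """
    if np.abs(v).sum() <= t:
        return v.copy()                      # already inside the polytope
    u = np.sort(np.abs(v))[::-1]             # magnitudes, descending
    css = np.cumsum(u)
    # Largest index rho with u[rho] * (rho + 1) > css[rho] - t
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - t)[0][-1]
    theta = (css[rho] - t) / (rho + 1.0)     # optimal threshold
    # Soft-threshold: shrink every coordinate towards zero by theta
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

# Projecting a (hypothetical) OLS vector onto a cross-polytope of size t = 3
# zeroes out the smallest coordinate, illustrating how the projection
# induces sparsity.
beta_ols = np.array([3.0, -2.0, 1.0])
beta_proj = project_l1_ball(beta_ols, 3.0)
```

Note how the tuning parameter t directly controls the size of the polytope and hence how many coordinates survive the projection, which is the knob the abstract's data-based selection method would choose.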