Lasso


Lasso.jl is a pure Julia implementation of the glmnet coordinate descent algorithm for fitting linear and generalized linear Lasso and Elastic Net models, as described in:

Friedman, J., Hastie, T., & Tibshirani, R. (2010). Regularization paths for generalized linear models via coordinate descent. Journal of Statistical Software, 33(1), 1. http://www.jstatsoft.org/v33/i01/

Lasso.jl also includes an implementation of the O(n) fused Lasso algorithm described in:

Johnson, N. A. (2013). A dynamic programming algorithm for the fused lasso and L0-segmentation. Journal of Computational and Graphical Statistics, 22(2), 246–260. doi:10.1080/10618600.2012.681238

It also includes an implementation of polynomial trend filtering based on:

Ramdas, A., & Tibshirani, R. J. (2014). Fast and flexible ADMM algorithms for trend filtering. arXiv Preprint arXiv:1406.2082. Retrieved from http://arxiv.org/abs/1406.2082

Quick start

To fit a Lasso path with default parameters:

    fit(LassoPath, X, y, dist, link)

dist is any distribution supported by GLM.jl, and link defaults to the canonical link for that distribution.
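
For example, here is a minimal sketch of fitting a Gaussian and a logistic Lasso path. The simulated X, y, and coefficient values are purely illustrative, and coef(path) is assumed to return the matrix of coefficients along the regularization path:

    using Lasso, GLM, Distributions, Random

    # Simulate a small design matrix with a sparse true coefficient vector (illustrative)
    Random.seed!(1)
    X = randn(100, 10)
    β = [2.0; -1.5; zeros(8)]
    y = X * β .+ 0.1 .* randn(100)

    # Gaussian Lasso path (Normal distribution with its canonical identity link)
    path = fit(LassoPath, X, y, Normal(), IdentityLink())
    coef(path)              # coefficients, one column per λ on the path

    # Logistic Lasso path on a binary response
    ybin = Float64.(X * β .> 0)
    logitpath = fit(LassoPath, X, ybin, Binomial(), LogitLink())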

To fit a fused Lasso model:

    fit(FusedLasso, y, λ)
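
As a sketch, the fused Lasso can be used to denoise a piecewise-constant signal. The signal and the choice λ = 2.0 below are made up for illustration, and coef is assumed to return the fitted values, as for other StatsBase models:

    using Lasso, Random

    # Noisy two-level piecewise-constant signal (illustrative)
    Random.seed!(2)
    y = [fill(1.0, 50); fill(3.0, 50)] .+ 0.3 .* randn(100)

    # Larger λ penalizes differences between adjacent fitted values more strongly
    flsa = fit(FusedLasso, y, 2.0)
    coef(flsa)              # denoised, piecewise-constant estimate of y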

To fit a polynomial trend filtering model:

    fit(TrendFilter, y, order, λ)
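
And a similar sketch for trend filtering a smooth signal; the signal, order = 2, and λ = 1.0 are illustrative assumptions:

    using Lasso, Random

    # Noisy smooth trend (illustrative)
    Random.seed!(3)
    t = range(0, 2π; length = 200)
    y = sin.(t) .+ 0.2 .* randn(200)

    # order sets the polynomial order of the fitted trend; λ controls its smoothness
    tf = fit(TrendFilter, y, 2, 1.0)
    coef(tf)                # smoothed estimate of the underlying trend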

More documentation is available at ReadTheDocs.

TODO

  • User-specified weights are untested
  • Support unpenalized variables besides the intercept
  • Maybe integrate LARS.jl

See also

  • GLMNet.jl, a wrapper for the glmnet Fortran code.
  • LARS.jl, an implementation of least angle regression for fitting entire linear (but not generalized linear) Lasso and Elastic Net coordinate paths.