# LassoNet: Neural Networks with Feature Sparsity

Ismael Lemhadri, Feng Ruan, Louis Abraham, Rob Tibshirani

LassoNet is a method for feature selection in neural networks that enhances the interpretability of the final network.

• It uses a novel objective function and learning algorithm that encourage the network to use only a subset of the available input features; that is, the resulting network is "feature sparse".
• This is achieved not by post-hoc analysis of a standard neural network, but is built into the objective function itself:
  • Input-to-output (skip-layer) connections are added to the network, with an L1 penalty on their weights.
  • The weight for each feature in this skip layer acts as an upper bound on all hidden-layer weights involving that feature.
• The result is an entire path of network solutions, with varying amounts of feature sparsity. This is analogous to the lasso solution path for linear regression
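To make the hierarchy constraint concrete, here is a minimal numpy sketch (not the package's actual implementation): the skip-layer weight for feature $j$ caps the magnitude of every first-hidden-layer weight attached to that feature, so a zero skip weight removes the feature entirely. The function name and the hierarchy multiplier `M` are illustrative.

```python
import numpy as np

def enforce_hierarchy(theta, W, M=1.0):
    """Clip first-layer weights W (features x hidden units) so that
    max_k |W[j, k]| <= M * |theta[j]| for every feature j."""
    bound = M * np.abs(theta)[:, None]  # per-feature cap, broadcast over hidden units
    return np.clip(W, -bound, bound)

theta = np.array([0.5, 0.0, 1.0])      # skip-layer (linear) weights
W = np.array([[2.0, -8.0],
              [0.3, -0.1],
              [4.0,  9.0]])            # first-hidden-layer weights

W_constrained = enforce_hierarchy(theta, W, M=1.0)
# Feature 1 has theta = 0, so its entire row is zeroed: the feature is dropped.
```

Because the skip weights carry the L1 penalty, shrinking a skip weight to zero forces the whole feature out of the network, which is what makes the solutions feature sparse.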

# Installation

```bash
pip install lassonet
```

# Tips

LassoNet sometimes requires fine-tuning. For optimal performance, consider:
• standardizing the inputs
• making sure that the initial dense model (with $\lambda = 0$) has trained well before starting the LassoNet regularization path. This may involve hyper-parameter tuning, choosing the right optimizer, and so on. If the dense model underperforms, the sparser models along the path are likely to underperform as well.
• making sure the stepsize over the $\lambda$ path is not too large. By default, $\lambda$ increases in geometric increments until no features are left.
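A geometric $\lambda$ schedule like the one described above can be sketched in a few lines of numpy. The names `lambda_start`, `path_multiplier`, and `n_steps` are illustrative, not necessarily the package's actual parameters.

```python
import numpy as np

def lambda_path(lambda_start=1e-2, path_multiplier=1.02, n_steps=100):
    """Geometric grid: each lambda is path_multiplier times the previous one."""
    return lambda_start * path_multiplier ** np.arange(n_steps)

grid = lambda_path()
# Consecutive ratios are constant (path_multiplier), so the grid is geometric;
# a smaller multiplier gives a finer, slower-moving path.
```

Choosing a multiplier close to 1 trades compute for a denser path, which makes it easier to locate the sparsity level where validation performance peaks.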

# Citation

The algorithms and method used in this package came primarily out of research in Rob Tibshirani's lab at Stanford University. If you use LassoNet in your research we would appreciate a citation to the paper:
```bibtex
@article{JMLR:v22:20-848,
  author  = {Ismael Lemhadri and Feng Ruan and Louis Abraham and Robert Tibshirani},
  title   = {LassoNet: A Neural Network with Feature Sparsity},
  journal = {Journal of Machine Learning Research},
  year    = {2021},
  volume  = {22},
  number  = {127},
  pages   = {1--29},
  url     = {http://jmlr.org/papers/v22/20-848.html}
}
```