Sparse group lasso and high dimensional multinomial classification

Research output: Contribution to journal › Journal article › peer-review

Standard

Sparse group lasso and high dimensional multinomial classification. / Vincent, Martin; Hansen, N.R.

In: Computational Statistics & Data Analysis, Vol. 71, 01.03.2014, p. 771-786.
Harvard

Vincent, M & Hansen, NR 2014, 'Sparse group lasso and high dimensional multinomial classification', Computational Statistics & Data Analysis, vol. 71, pp. 771-786. https://doi.org/10.1016/j.csda.2013.06.004

APA

Vincent, M., & Hansen, N. R. (2014). Sparse group lasso and high dimensional multinomial classification. Computational Statistics & Data Analysis, 71, 771-786. https://doi.org/10.1016/j.csda.2013.06.004

Vancouver

Vincent M, Hansen NR. Sparse group lasso and high dimensional multinomial classification. Computational Statistics & Data Analysis. 2014 Mar 1;71:771-786. https://doi.org/10.1016/j.csda.2013.06.004

Author

Vincent, Martin ; Hansen, N.R. / Sparse group lasso and high dimensional multinomial classification. In: Computational Statistics & Data Analysis. 2014 ; Vol. 71. pp. 771-786.

BibTeX

@article{ae7391a10a0f499185a71828b03f7642,
title = "Sparse group lasso and high dimensional multinomial classification",
abstract = "The sparse group lasso optimization problem is solved using a coordinate gradient descent algorithm. The algorithm is applicable to a broad class of convex loss functions. Convergence of the algorithm is established, and the algorithm is used to investigate the performance of the multinomial sparse group lasso classifier. On three different real data examples the multinomial group lasso clearly outperforms multinomial lasso in terms of achieved classification error rate and in terms of including fewer features for the classification. An implementation of the multinomial sparse group lasso algorithm is available in the R package msgl. Its performance scales well with the problem size as illustrated by one of the examples considered - a 50 class classification problem with 10 k features, which amounts to estimating 500 k parameters.",
author = "Martin Vincent and N.R. Hansen",
year = "2014",
month = mar,
day = "1",
doi = "10.1016/j.csda.2013.06.004",
language = "English",
volume = "71",
pages = "771--786",
journal = "Computational Statistics and Data Analysis",
issn = "0167-9473",
publisher = "Elsevier",
}

RIS

TY - JOUR

T1 - Sparse group lasso and high dimensional multinomial classification

AU - Vincent, Martin

AU - Hansen, N.R.

PY - 2014/3/1

Y1 - 2014/3/1

N2 - The sparse group lasso optimization problem is solved using a coordinate gradient descent algorithm. The algorithm is applicable to a broad class of convex loss functions. Convergence of the algorithm is established, and the algorithm is used to investigate the performance of the multinomial sparse group lasso classifier. On three different real data examples the multinomial group lasso clearly outperforms multinomial lasso in terms of achieved classification error rate and in terms of including fewer features for the classification. An implementation of the multinomial sparse group lasso algorithm is available in the R package msgl. Its performance scales well with the problem size as illustrated by one of the examples considered - a 50 class classification problem with 10 k features, which amounts to estimating 500 k parameters.

AB - The sparse group lasso optimization problem is solved using a coordinate gradient descent algorithm. The algorithm is applicable to a broad class of convex loss functions. Convergence of the algorithm is established, and the algorithm is used to investigate the performance of the multinomial sparse group lasso classifier. On three different real data examples the multinomial group lasso clearly outperforms multinomial lasso in terms of achieved classification error rate and in terms of including fewer features for the classification. An implementation of the multinomial sparse group lasso algorithm is available in the R package msgl. Its performance scales well with the problem size as illustrated by one of the examples considered - a 50 class classification problem with 10 k features, which amounts to estimating 500 k parameters.

UR - http://www.scopus.com/inward/record.url?scp=84889083914&partnerID=8YFLogxK

U2 - 10.1016/j.csda.2013.06.004

DO - 10.1016/j.csda.2013.06.004

M3 - Journal article

AN - SCOPUS:84889083914

VL - 71

SP - 771

EP - 786

JO - Computational Statistics and Data Analysis

JF - Computational Statistics and Data Analysis

SN - 0167-9473

ER -
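The abstract above describes the sparse group lasso penalty, a convex combination of the lasso (L1) and group lasso (grouped L2) penalties, minimized by coordinate gradient descent. As a quick orientation, the penalty itself can be sketched as below. This is an illustrative computation only, not code from the msgl package; the function name and the square-root group-size weighting are assumptions (the latter follows the common convention of weighting each group by the square root of its size).

```python
import numpy as np

def sparse_group_lasso_penalty(beta, groups, lam=1.0, alpha=0.5):
    """Sparse group lasso penalty (illustrative sketch, not msgl code).

    beta   : 1-D coefficient vector
    groups : list of index arrays partitioning the coefficients into groups
    lam    : overall regularization strength
    alpha  : mixing weight; alpha=1 recovers the lasso, alpha=0 the group lasso
    """
    beta = np.asarray(beta, dtype=float)
    l1 = np.sum(np.abs(beta))  # lasso (L1) part: encourages elementwise sparsity
    # group lasso part: sqrt(group size) * L2 norm of each group's coefficients,
    # encouraging whole groups of coefficients to be zero together
    l2 = sum(np.sqrt(len(g)) * np.linalg.norm(beta[g]) for g in groups)
    return lam * (alpha * l1 + (1.0 - alpha) * l2)
```

For a multinomial classifier with many classes, grouping all class-specific coefficients of one feature lets the group term drop entire features while the L1 term keeps the surviving groups sparse, which matches the feature-selection behavior reported in the abstract.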