Robust and sparse multinomial regression in high dimensions

Abstract

A robust and sparse estimator for multinomial regression is proposed for high-dimensional data. Robustness is achieved by trimming the observations, and sparsity is obtained through the elastic net penalty. In contrast to multi-group classifiers based on dimension reduction, this model is appealing in terms of interpretation: estimated coefficients are obtained individually for every group, and the sparsity of the coefficients is group specific. Simulation studies compare the performance of the estimator with its non-robust counterpart, and real data examples underline the usefulness of the robust estimator, particularly for result interpretation and model diagnostics.
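To make the construction concrete, the following is a schematic sketch of the objective, with notation assumed here rather than taken verbatim from the paper: the estimator maximizes the multinomial log-likelihood over a trimmed subset of the observations, minus an elastic net penalty applied to each group's coefficient vector,

$$
(\hat{\beta}_{0}, \hat{B}) \;=\; \arg\max_{\beta_{0},\, B}\; \max_{H \subset \{1,\dots,n\},\; |H| = h}\; \sum_{i \in H} \log \pi_{y_i}(\mathbf{x}_i;\, \beta_{0}, B) \;-\; h\,\lambda \sum_{g=1}^{G} \left( \frac{1-\alpha}{2}\,\lVert \boldsymbol{\beta}_g \rVert_2^2 \;+\; \alpha\,\lVert \boldsymbol{\beta}_g \rVert_1 \right),
$$

where $\pi_g(\mathbf{x}_i)$ is the class-$g$ probability under the multinomial logit model, $H$ is the subset of $h \le n$ retained (non-trimmed) observations, $\lambda$ controls the penalty strength, and $\alpha \in [0,1]$ mixes the lasso and ridge parts of the elastic net. Because the penalty acts on each $\boldsymbol{\beta}_g$ separately, variable selection is group specific.

For orientation, the non-robust counterpart against which the proposal is compared can be fitted with the glmnet R package; the snippet below is a minimal sketch on simulated data (the data layout, dimensions, and the value of alpha are illustrative assumptions, not taken from the paper). The robust trimmed estimator itself is not part of glmnet; the authors' enetLTS R package provides robust and sparse estimation, and its current documentation should be consulted for the multinomial option.

```r
## Minimal sketch: non-robust elastic-net multinomial regression with glmnet.
## All data below are simulated for illustration only.
library(glmnet)

set.seed(1)
n <- 150; p <- 50; G <- 3                         # observations, predictors, groups
x <- matrix(rnorm(n * p), n, p)                   # high-dimensional predictors
b1 <- c(rep( 2, 5), rep(0, p - 5))                # group 1: first 5 variables relevant
b2 <- c(rep(0, 5), rep(-2, 5), rep(0, p - 10))    # group 2: next 5 variables relevant
eta  <- cbind(x %*% b1, x %*% b2, 0)              # linear predictors (group 3 = reference)
prob <- exp(eta) / rowSums(exp(eta))              # multinomial class probabilities
y <- factor(apply(prob, 1, function(pr) sample(1:G, 1, prob = pr)))

## Elastic net: alpha = 0.5 mixes lasso and ridge; cv.glmnet selects lambda.
cv_fit <- cv.glmnet(x, y, family = "multinomial", alpha = 0.5)

## One sparse coefficient vector per group, so the selected variables
## can differ from group to group.
coef(cv_fit, s = "lambda.min")

## Predicted class labels for (here) the first five observations.
predict(cv_fit, newx = x[1:5, ], s = "lambda.min", type = "class")
```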

Acknowledgements

This work was supported by grant 2219 from the Scientific and Technological Research Council of Turkey (TUBITAK).

Author information

Corresponding author

Correspondence to Fatma Sevinç Kurnaz.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

Ethics approval was not required.

Informed consent

Not applicable.

Additional information

Responsible editor: Johannes Fürnkranz.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Kurnaz, F.S., Filzmoser, P. Robust and sparse multinomial regression in high dimensions. Data Min Knowl Disc 37, 1609–1629 (2023). https://doi.org/10.1007/s10618-023-00936-6
