Assessing the Usability of Statistical Software Using a Discrete Choice Experiment | SpringerLink

Assessing the Usability of Statistical Software Using a Discrete Choice Experiment

  • Conference paper in HCI International 2024 Posters (HCII 2024)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 2114)

Abstract

The usability [2, 3] of modern statistical software is increasingly important as users with limited statistical training become more reliant on such tools to address complex real-world problems [1, 13]. This paper presents a case study illustrating how a discrete choice experiment [16] can be used to assess the usability of the complex controls that are becoming more prevalent in such software. The case study focuses on an interactive plot intended for novice users of a commercially available statistical application who need to interpret a linear model used to analyze the results of an experiment. Effective selection of model terms is critical both for identifying important experimental factors and for the accuracy of model predictions. The ability to easily determine whether terms should be entered into a model, and the reason for their inclusion, is essential for interpretability [4, 15] and for user satisfaction with the application.
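The core of a discrete choice experiment is the multinomial logit model [10, 11]: each profile shown to a respondent is assigned a utility from part-worths of its attribute levels, and the probability of choosing a profile is the softmax of the utilities. The sketch below illustrates that computation only; the attributes, levels, and part-worth values are hypothetical and are not taken from the paper.

```python
import math

def choice_probabilities(utilities):
    """Multinomial logit: P(choose i) = exp(u_i) / sum_j exp(u_j)."""
    m = max(utilities)  # subtract the max for numerical stability
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical part-worths for two interface attributes (illustration only).
# A profile's utility is the sum of the part-worths of its attribute levels.
part_worths = {
    ("layout", "interactive"): 0.8,
    ("layout", "static"): 0.0,
    ("labels", "plain"): 0.3,
    ("labels", "technical"): 0.0,
}

profile_a = [("layout", "interactive"), ("labels", "technical")]
profile_b = [("layout", "static"), ("labels", "plain")]

u_a = sum(part_worths[attr] for attr in profile_a)  # 0.8
u_b = sum(part_worths[attr] for attr in profile_b)  # 0.3

p_a, p_b = choice_probabilities([u_a, u_b])
print(f"P(A) = {p_a:.3f}, P(B) = {p_b:.3f}")
```

In practice the part-worths are unknown and are estimated by maximum likelihood from the observed choices; the design of the choice sets themselves is what references [5, 6, 16] address.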


References

  1. Abbasnasab Sardareh, S., Brown, G.T., Denny, P.: Comparing four contemporary statistical software tools for introductory data science and statistics in the social sciences. Teach. Stat. 43, S157–S172 (2021)

  2. Bevan, N., Carter, J., Harker, S.: ISO 9241-11 revised: what have we learnt about usability since 1998? In: Kurosu, M. (ed.) HCI 2015. LNCS, vol. 9169, pp. 143–151. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-20901-2_13

  3. Frøkjær, E., Hertzum, M., Hornbæk, K.: Measuring usability: are effectiveness, efficiency, and satisfaction really correlated? In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 345–352 (2000)

  4. Guidotti, R., Monreale, A., Ruggieri, S., Turini, F., Giannotti, F., Pedreschi, D.: A survey of methods for explaining black box models. ACM Comput. Surv. 51(5), 1–42 (2018)

  5. Kessels, R., Jones, B., Goos, P.: Bayesian optimal designs for discrete choice experiments with partial profiles. J. Choice Model. 4, 52–74 (2011)

  6. Huber, J., Zwerina, K.: The importance of utility balance in efficient choice designs. J. Mark. Res. 33, 307–317 (1996)

  7. JMP Statistical Discovery LLC: JMP® 17 Design of Experiments Guide. JMP Statistical Discovery LLC, Cary, NC (2022–2023)

  8. Kim, S.-H., et al.: Ergonomic design of target symbols for fighter aircraft cockpit displays based on usability evaluation. In: Stephanidis, C. (ed.) HCI 2018. CCIS, vol. 850, pp. 176–182. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-92270-6_24

  9. Li, X., Sudarsanam, N., Frey, D.D.: Regularities in data from factorial experiments. Complexity 11(5), 32–45 (2006)

  10. Louviere, J.J., Flynn, T.N., Carson, R.T.: Discrete choice experiments are not conjoint analysis. J. Choice Model. 3(3), 57–72 (2010)

  11. McFadden, D.: The choice theory approach to market research. Mark. Sci. 5(4), 275–297 (1986)

  12. Michalski, R.: Examining users’ preferences towards vertical graphical toolbars in simple search and point tasks. Comput. Hum. Behav. 27(6), 2308–2321 (2011)

  13. Rhyne, J., Bailey, M., Morgan, J., Lekivetz, R.: Assessing the usability of statistical software using designed experiments. In: Stephanidis, C., Antona, M., Ntoa, S., Salvendy, G. (eds.) HCII 2023. CCIS, vol. 1832, pp. 681–688. Springer, Cham (2023). https://doi.org/10.1007/978-3-031-35989-7_87

  14. SAS Institute Inc.: SAS/STAT® User’s Guide: The QUANTSELECT Procedure: Effect Selection Methods. SAS Institute Inc., Cary, NC (2024)

  15. Silva, A., Schrum, M., Hedlund-Botti, E., Gopalan, N., Gombolay, M.: Explainable artificial intelligence: evaluating the objective and subjective impacts of XAI on human-agent interaction. Int. J. Hum.-Comput. Interact. 39(7), 1390–1404 (2023)

  16. Street, D.J., Burgess, L.: The Construction of Optimal Stated Choice Experiments: Theory and Methods. Wiley, New York (2007)


Author information

Correspondence to Jacob Rhyne.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Rhyne, J., Bailey, M., Morgan, J., Lekivetz, R. (2024). Assessing the Usability of Statistical Software Using a Discrete Choice Experiment. In: Stephanidis, C., Antona, M., Ntoa, S., Salvendy, G. (eds) HCI International 2024 Posters. HCII 2024. Communications in Computer and Information Science, vol 2114. Springer, Cham. https://doi.org/10.1007/978-3-031-61932-8_24

  • DOI: https://doi.org/10.1007/978-3-031-61932-8_24

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-61931-1

  • Online ISBN: 978-3-031-61932-8

  • eBook Packages: Computer Science, Computer Science (R0)
