Abstract
The usability [2, 3] of modern statistical software is increasingly important as users with limited statistical training become more reliant on such tools to help them address complex real-world problems [1, 13]. This paper presents a case study illustrating how a discrete choice experiment [16] may be used to assess the usability of the complex controls that are becoming more prevalent in such software. The case study focuses on an interactive plot intended for novice users of a commercially available statistical application who need to interpret a linear model used to analyze the results of an experiment. Effective selection of model terms is critical for identifying important experimental factors and for the accuracy of model predictions. The ability to easily determine whether terms should be entered into a model, or the reason for their inclusion, is essential for interpretability [4, 15] and for user satisfaction with the application.
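To make the analysis concrete, the following is a minimal sketch, not the authors' code or their actual experimental design, of how responses from a two-alternative discrete choice experiment are commonly analyzed with a conditional logit model in the tradition of McFadden [11]: attribute part-worths are estimated by maximum likelihood from respondents' observed choices. The attribute count, coding, and "true" utilities below are illustrative assumptions only.

# Minimal sketch: conditional (McFadden-style) logit for a paired-comparison
# discrete choice experiment. All attribute definitions and values are
# hypothetical and for illustration only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

n_sets = 200                          # choice sets shown to respondents
n_attrs = 3                           # hypothetical UI attributes of the control
true_beta = np.array([1.0, -0.5, 0.8])  # assumed "true" part-worth utilities

# Each choice set pairs two profiles; attributes are coded 0/1.
X_a = rng.integers(0, 2, size=(n_sets, n_attrs)).astype(float)
X_b = rng.integers(0, 2, size=(n_sets, n_attrs)).astype(float)

# Simulate choices under the logit model: P(choose A) = 1 / (1 + exp(-(X_a - X_b) @ beta))
p_a = 1.0 / (1.0 + np.exp(-(X_a - X_b) @ true_beta))
chose_a = rng.random(n_sets) < p_a

def neg_log_lik(beta):
    # Negative log-likelihood of the paired-comparison conditional logit,
    # written with logaddexp for numerical stability.
    diff = (X_a - X_b) @ beta
    ll = np.where(chose_a, -np.logaddexp(0.0, -diff), -np.logaddexp(0.0, diff))
    return -ll.sum()

fit = minimize(neg_log_lik, x0=np.zeros(n_attrs), method="BFGS")
print("estimated part-worths:", fit.x)  # should be close to true_beta

With enough choice sets, the recovered part-worths indicate which interface attributes most influence users' preferences, which is the quantity a usability-focused discrete choice experiment is designed to estimate.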
References
Abbasnasab Sardareh, S., Brown, G.T., Denny, P.: Comparing four contemporary statistical software tools for introductory data science and statistics in the social sciences. Teach. Stat. 43, S157–S172 (2021)
Bevan, N., Carter, J., Harker, S.: ISO 9241-11 revised: what have we learnt about usability since 1998? In: Kurosu, M. (ed.) HCI 2015. LNCS, vol. 9169, pp. 143–151. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-20901-2_13
Frøkjær, E., Hertzum, M., Hornbæk, K.: Measuring usability: are effectiveness, efficiency, and satisfaction really correlated? In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 345–352 (2000)
Guidotti, R., Monreale, A., Ruggieri, S., Turini, F., Giannotti, F., Pedreschi, D.: A survey of methods for explaining black box models. ACM Comput. Surv. (CSUR) 51(5), 1–42 (2018)
Kessels, R., Jones, B., Goos, P.: Bayesian optimal designs for discrete choice experiments with partial profiles. J. Choice Model. 4, 52–74 (2011)
Huber, J., Zwerina, K.: The importance of utility balance in efficient choice designs. J. Mark. Res. 33, 307–317 (1996)
JMP Statistical Discovery LLC: JMP® 17 Design of Experiments Guide. JMP Statistical Discovery LLC, Cary, NC (2022–2023)
Kim, S.-H., et al.: Ergonomic design of target symbols for fighter aircraft cockpit displays based on usability evaluation. In: Stephanidis, C. (ed.) HCI 2018. CCIS, vol. 850, pp. 176–182. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-92270-6_24
Li, X., Sudarsanam, N., Frey, D.D.: Regularities in data from factorial experiments. Complexity 11(5), 32–45 (2006)
Louviere, J.J., Flynn, T.N., Carson, R.T.: Discrete choice experiments are not conjoint analysis. J. Choice Model. 3(3), 57–72 (2010)
McFadden, D.: The choice theory approach to market research. Mark. Sci. 5(4), 275–297 (1986)
Michalski, R.: Examining users’ preferences towards vertical graphical toolbars in simple search and point tasks. Comput. Hum. Behav. 27(6), 2308–2321 (2011)
Rhyne, J., Bailey, M., Morgan, J., Lekivetz, R.: Assessing the usability of statistical software using designed experiments. In: Stephanidis, C., Antona, M., Ntoa, S., Salvendy, G. (eds.) HCII 2023. CCIS, vol. 1832, pp. 681–688. Springer, Cham (2023). https://doi.org/10.1007/978-3-031-35989-7_87
SAS Institute Inc.: SAS/STAT® User’s Guide. The QUANTSELECT Procedure: Effect Selection Methods. SAS Institute Inc., Cary, NC (2024)
Silva, A., Schrum, M., Hedlund-Botti, E., Gopalan, N., Gombolay, M.: Explainable artificial intelligence: evaluating the objective and subjective impacts of XAI on human-agent interaction. Int. J. Hum.-Comput. Interact. 39(7), 1390–1404 (2023)
Street, D.J., Burgess, L.: The Construction of Optimal Stated Choice Experiments: Theory and Methods. Wiley, New York (2007)
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Rhyne, J., Bailey, M., Morgan, J., Lekivetz, R. (2024). Assessing the Usability of Statistical Software Using a Discrete Choice Experiment. In: Stephanidis, C., Antona, M., Ntoa, S., Salvendy, G. (eds) HCI International 2024 Posters. HCII 2024. Communications in Computer and Information Science, vol 2114. Springer, Cham. https://doi.org/10.1007/978-3-031-61932-8_24
DOI: https://doi.org/10.1007/978-3-031-61932-8_24
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-61931-1
Online ISBN: 978-3-031-61932-8