Abstract
Digital degree programs have grown in importance in recent years, due in part to their ability to provide a personalized learning experience for students. However, degree programs in this format have higher dropout rates than traditional degree programs. Following a user-centered design approach, a dashboard for the online degree programs of a university network is developed that provides information and recommendations on the learning process based on descriptive analysis and machine learning (ML) methods. For this purpose, ML models are developed, trained, and evaluated. The dashboard aims to promote self-regulation among students and to reduce dropout rates. It will be provided as a plug-in within the learning management system (LMS) Moodle, exclusively for students. To understand which aspects matter to users with respect to the cognitive processes involved in interacting with the dashboard, an eye-tracking study was conducted in combination with the thinking-aloud technique. The study investigated which cognitive demands interacting with the prototype places on users and how the automatically generated information is perceived. When integrating the learning dashboard (LD) into the LMS, care should be taken that all content is presented in an understandable and easy-to-follow manner; otherwise, the effort required to focus on the content elements of the LD increases, and with it the cognitive demands.
Acknowledgements
This work was funded by the German Federal Ministry of Education, grant No. 01PX21001B.
Appendix
Procedure and questions for the usability test.

START

- Task: Open the different higher-level views of the study program.
  - Which views are there?
  - How clear do you find the different views?
  - Do you feel informed about your learning progress and your current status in the study program?
  - Are you missing any information? If so, which?
- Task: Add a new card.
  - How self-explanatory/intuitive do you find the handling of this functionality?
- Task: Open the help page.
  - Is the content understandable to you?
  - Is all essential information evident to you?
- Task: Open the page with the recommendations on your learning progress.
  - Would you trust the information/recommendations displayed?

END

GENERAL QUESTIONS ABOUT THE LOW-FIDELITY PROTOTYPE

- How did you perceive the individual content elements? Rather nested, or intuitive/quick to find?
- How easy was it to get back to the beginning?
- Did you find the interaction elements well labeled?
- Did you find the navigation elements easy to understand?
- Did you know your position within the learning dashboard at all times?
- Did you miss the drop-down menu for the overall, semester, and module views in the subviews?
- Do you need more help with the dashboard?
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Drzyzga, G., Harder, T., Janneck, M. (2023). Cognitive Effort in Interaction with Software Systems for Self-regulation - An Eye-Tracking Study. In: Harris, D., Li, WC. (eds) Engineering Psychology and Cognitive Ergonomics. HCII 2023. Lecture Notes in Computer Science(), vol 14017. Springer, Cham. https://doi.org/10.1007/978-3-031-35392-5_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-35391-8
Online ISBN: 978-3-031-35392-5