Cognitive Effort in Interaction with Software Systems for Self-regulation - An Eye-Tracking Study | SpringerLink
Cognitive Effort in Interaction with Software Systems for Self-regulation - An Eye-Tracking Study

  • Conference paper
  • First Online:
Engineering Psychology and Cognitive Ergonomics (HCII 2023)

Abstract

The importance of digital degree programs has grown considerably in recent years, due in part to their ability to provide a personalized learning experience for students. However, degree programs in this format have higher dropout rates than traditional ones. As part of a user-centered design approach, a dashboard for the online degree programs of a university network is being developed to provide information and recommendations about the learning process based on descriptive analysis and machine learning (ML) methods. For this purpose, ML models are developed, trained, and evaluated. The goal of the dashboard is to promote self-regulation among students and to reduce dropout rates. It will be integrated as a plug-in into the learning management system (LMS) Moodle and made available exclusively to students. To understand which aspects matter to users with respect to the cognitive processes involved in interacting with the dashboard, an eye-tracking study was conducted using the thinking-aloud technique. The study investigated which cognitive demands interaction with the prototype places on users and how the automatically generated information is perceived. When integrating the learning dashboard (LD) into the LMS, care should be taken to present all content in an understandable, easy-to-follow manner; otherwise, the effort required to focus on the LD's content elements grows, and with it the cognitive demands on users.
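The paper itself does not publish its analysis code, but the kind of gaze metric an eye-tracking study of dashboard elements typically reports (dwell time per area of interest, or AOI, as a proxy for visual attention and effort) can be sketched in a few lines of Python. Everything below, including the function name, AOI labels, and sample data, is illustrative and not taken from the paper:

```python
from collections import defaultdict

def dwell_time_per_aoi(samples):
    """Sum the time between consecutive gaze samples that stay on the
    same area of interest (AOI), giving a rough dwell-time estimate.

    samples: list of (timestamp_seconds, aoi_label) tuples, time-ordered.
    Returns a dict mapping AOI label -> total dwell time in seconds.
    """
    totals = defaultdict(float)
    for (t0, aoi0), (t1, aoi1) in zip(samples, samples[1:]):
        # Only accumulate time while the gaze stays within one AOI;
        # transitions between AOIs are not counted toward either.
        if aoi0 == aoi1:
            totals[aoi0] += t1 - t0
    return dict(totals)

# Hypothetical gaze samples at 10 Hz over two dashboard elements.
samples = [
    (0.00, "chart"), (0.10, "chart"), (0.20, "chart"),
    (0.30, "recommendation"), (0.40, "recommendation"),
    (0.50, "chart"), (0.60, "chart"),
]
print(dwell_time_per_aoi(samples))
```

Real studies would add a fixation filter (e.g. velocity- or dispersion-based) before aggregating; this sketch only illustrates the aggregation step.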


Notes

  1.

    https://www.tobii.com/products/eye-trackers/wearables/tobii-pro-glasses-3.


Acknowledgements

This work was funded by the German Federal Ministry of Education, grant No. 01PX21001B.

Author information


Corresponding author

Correspondence to Thorleif Harder .


Appendix

Procedure and questions for the usability test.

  • START

    • Task: Open the different higher-level views of the study program.

      • Which ones are there?

      • How clear do you find the different views?

      • Do you feel informed about your learning progress and your current status in the study program?

      • Are you missing any information? If so, what?

    • Task: Add a new card.

      • How self-explanatory/intuitive do you find the handling of this functionality?

    • Task: Open the help page.

      • Is the content understandable for you?

      • Is all essential information evident to you?

    • Task: Open the page with the recommendations on your learning progress.

      • Would you trust the information/recommendations displayed?

  • END

GENERAL QUESTIONS ABOUT THE LOW-FIDELITY PROTOTYPE

  • How did you perceive the individual content elements? Rather nested, or more intuitive and quicker to find?

  • How easy was it to go back to the beginning?

  • Did you find the interaction elements well labeled?

  • Did you find the navigation elements easy to understand?

  • Did you know your position within the Learning Dashboard at all times?

  • Did you miss the drop-down menu for the overall, semester, and module views in the subviews?

  • Do you need more help with the dashboard?


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Drzyzga, G., Harder, T., Janneck, M. (2023). Cognitive Effort in Interaction with Software Systems for Self-regulation - An Eye-Tracking Study. In: Harris, D., Li, WC. (eds) Engineering Psychology and Cognitive Ergonomics. HCII 2023. Lecture Notes in Computer Science(), vol 14017. Springer, Cham. https://doi.org/10.1007/978-3-031-35392-5_3


  • DOI: https://doi.org/10.1007/978-3-031-35392-5_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35391-8

  • Online ISBN: 978-3-031-35392-5
