Abstract
This study investigated the relations between students’ self-reported perceptions of the blended learning environment, their observed online learning strategies, and their academic learning outcomes. The participants were 310 undergraduates enrolled in an introductory course on computer systems at an Australian metropolitan university. A Likert-scale questionnaire was used to examine students’ perceptions. The digital traces recorded in a bespoke learning management system were used to detect students’ observed online learning strategies. Using data mining algorithms, including a Hidden Markov Model and agglomerative hierarchical sequence clustering, four types of online learning strategies were found. The four strategies not only differed in the number of online learning sessions but also in the proportional distribution of different online learning behaviors. A one-way ANOVA revealed that students adopting different online learning strategies differed significantly in their final course marks. Students who employed the intensive theory application strategy achieved the highest marks, whereas those who used the weak reading and weak theory application strategy scored the lowest. The results of a cross-tabulation showed that the four types of observed online learning strategies were significantly associated with better and poorer perceptions of the blended learning environment. Specifically, amongst students who adopted the intensive theory application strategy, the proportion of students who self-reported better perceptions was significantly higher than that of those reporting poorer perceptions. In contrast, amongst students using the weak reading and weak theory application strategy, the proportion of students having poorer perceptions was significantly higher than that of those holding better perceptions.
Introduction
The coronavirus (COVID-19) pandemic has required higher education learning and teaching around the world to respond rapidly, in particular by redeploying more learning and teaching activities to virtual learning spaces to promote physical distancing. As a result, face-to-face courses have been delivered either as blended courses or as purely online courses (Tang et al., 2021). In such an urgent transformation, it is important to examine the relations between how learners perceive their learning environment (their perceptions) and how they approach learning (their learning strategies). While past research has indicated the importance of students’ perceptions of the learning environment and has examined the relations between perceptions, learning strategies, and academic learning outcomes, much of this research is based on a single source of evidence, typically self-reports (Guo, 2018; Guo et al., 2017; Lizzio et al., 2002; Wilson & Fowler, 2005). To improve the robustness of the findings of such research, it is valuable to triangulate it with observational measures of learning strategies, such as the detailed digital traces recorded in a learning management system (LMS). The combined sources of evidence provide a more holistic understanding of what students actually do (reflected by the observational measures) and why they do it (reflected by the self-reported measures) (Ellis et al., 2016; Han & Ellis, 2020a). The current study addresses this purpose by investigating students’ self-reported perceptions of the blended learning environment and their observed online learning strategies, drawing on Student Approaches to Learning (SAL) research and Learning Analytics research. The following section reviews the relevant literature from the two areas.
Literature review
Relevant student approaches to learning research
Student Approaches to Learning (SAL) research is a recognised guiding framework for the enhancement and assessment of the quality of learning in higher education (Biggs & Tang, 2011; Trigwell & Prosser, 2020). This area of research has shown that students’ prior experiences of learning, the departmental context, and students’ perceptions of the current learning context are all closely related to their learning processes and the quality of their learning outcomes (Biggs & Tang, 2011; Trigwell & Prosser, 2020). Research in this area mostly uses self-reported measures, such as surveys and interviews, to examine the key aspects of students’ experiences of learning (Ramsden, 2003). To describe the relations between these key aspects, Biggs (1989) proposed a Presage-Process-Product model (known as the 3P model), which was later refined by Prosser and Trigwell (1999) and is visually represented in Fig. 1.
The elements of the 3P model represent a relational system: the elements are neither linear nor bound by chains of causality, but rather coexist simultaneously. The Presage factors can be closely related to the learning outcomes in some contexts, and they can also have indirect relations with the learning outcomes via the elements of the Process, which serve as mediators (Prosser & Trigwell, 2017; Trigwell & Prosser, 2020).
Previous SAL research on students’ perceptions has demonstrated that when students perceive the teaching to be of high quality and well organised, and the assessment tasks to fit the learning objectives, they are more likely to use deep approaches and strategies in learning. In contrast, when students see teaching goals as unclear and unfocused, the workload as too heavy, the assessment tasks as inappropriate, and teacher-student interactions as lacking, they tend to adopt surface approaches and strategies (Crawford et al., 1998; Lizzio et al., 2002; Wilson & Fowler, 2005).
In blended course designs, which require students to shift back and forth between face-to-face and online modes, research has reported that students who perceive the face-to-face and online components of learning and teaching to be well integrated adopt more deep and fewer surface approaches to learning. In contrast, those who see the face-to-face and online learning as fragmented and unaligned are more likely to approach learning at a surface level (Ellis & Bliuc, 2019; Han & Ellis, 2019). Furthermore, logical associations between perceptions, learning approaches and strategies, and learning outcomes have also been found across a number of academic disciplines, such as business (Han & Ellis, 2019), sciences (Ellis & Bliuc, 2019), social sciences (Ellis et al., 2020), and engineering (Ellis et al., 2016). Students having relatively better perceptions and adopting deep approaches are often found to have higher academic achievement than their peers holding poorer perceptions and using surface approaches.
These previous SAL investigations have predominantly employed self-reported instruments and data to examine these relations (Ellis & Bliuc, 2019; Ellis et al., 2017; Han & Ellis, 2019). While self-reporting has the merit of capturing students’ perceptions and intentions, and of helping explicate the reasons behind their decisions on learning actions, behaviors, and strategies (Zhou & Winne, 2012), the capacity of self-reported evidence to objectively represent what and how students learn in reality has been questioned (Hadwin et al., 2007). In addition, compared with observational measures of students’ online learning strategies, it is relatively more difficult for self-reported data to capture the complex and dynamic nature of students’ online learning behaviors. In the current study, observational measures are used to represent students’ online learning strategies.
Relevant learning analytics research
In the past decade, the development of educational technology has produced prolific learning analytics studies, which emphasize the capacity to collect detailed digital traces of students’ interactions with a variety of online learning resources and activities. Digital trace data, also known as observational data, have the advantage of describing students’ learning behaviors and strategies relatively more objectively, and in more granular detail, than self-reported methods (Siemens, 2013). Observational analytic data, combined with students’ demographic information, have been increasingly used in various domains in the higher education sector, such as advising students’ career choices (Bettinger & Baker, 2014); detecting at-risk students to improve retention (Krumm et al., 2014); providing personalised feedback (Gibson et al., 2017); identifying patterns of learning tactics and strategies (Chen et al., 2017); facilitating collaborative learning (Kaendler et al., 2015); monitoring students’ affect in learning (Ocumpaugh et al., 2014); and predicting academic learning outcomes (Romero et al., 2013).
However, sole dependence on observational digital traces and over-reliance on sets of quantitative numbers risk producing meaningless results and reduced insights in interpretation, due to a lack of proper guidance from theories. This limits the usefulness of analytic data for locating barriers to learning, offering ideas for pedagogical reform, and guiding learning designs (Buckingham Shum & Crick, 2012).
To address the drawbacks of an overly empirical approach to learning analytics research, proposals have been put forward to use a more holistic approach to designing research and to guiding data analysis and modelling, in order to improve the interpretability of the quantitative results derived from observational digital traces (Gašević et al., 2015; Toetenel & Rienties, 2016). As a result, an increasing number of studies have combined observational and self-reported measures to examine students’ learning (Lockyer et al., 2013). This combined approach allows students’ learning behaviors and strategies to be interpreted through a more holistic assessment (Reimann et al., 2014).
In adopting a combined approach comprising self-reported and observational measures and data, research has been conducted for two main purposes. The first is to increase the explanatory power of predictions of learning outcomes by using different types of data. The majority of the existing research for this purpose has demonstrated that including observational measures of students’ learning behaviors significantly improves the prediction of learning outcomes compared with using self-reporting alone (Han & Ellis, 2020a; Rodríguez-Triana et al., 2015; Tempelaar et al., 2018). For instance, Pardo et al. (2017) reported that adding the frequency of students’ interactions with online learning activities explained an extra 25% of the variance in students’ course marks beyond their reported use of self-regulated learning strategies alone. Similarly, Ellis et al. (2017) found that adding the quantity of students’ online participation to the regression model significantly increased the variance explained in students’ academic performance compared with merely using students’ reported learning approaches.
Another aim of studies adopting a combined approach is to investigate the extent to which the self-reported and observational measures of students’ learning are consistent and aligned with each other (Rodríguez-Triana et al., 2015). Research in this category has examined the relations between observed online learning behaviors and various self-reported measures involved in students’ learning processes, such as self-efficacy and anxiety (Pardo et al., 2017); learning orientations (Han & Ellis, 2021; Han et al., 2020); learning motives (Gašević et al., 2017); learning engagement (Ober et al., 2021); achievement goal orientations (Sun & Xie, 2020); and effort (Li et al., 2020). However, the research evidence relating the self-reported and observational measures has not always been coherent.
For instance, drawing on a self-regulated learning perspective, Pardo et al. (2017) found that Australian university students who self-reported higher intrinsic motivation also viewed the video course contents more frequently than their peers who reported a lower level of intrinsic motivation. In another study, with 320 American high school students, Ober et al. (2021) found that students’ online learning behaviors, measured by a number of indicators including the frequencies of their assignment completion and results checking and the average duration of their computer sessions, were largely uncorrelated with their responses to a learning engagement questionnaire. Clearly, further research is required to investigate the extent of consistency between self-reported and observational measures of students’ learning.
The current study and research questions
The current study investigates the relation between students’ self-reported perceptions of the blended learning environment and their academic learning outcomes on the one hand, and the relation between students’ observed online learning strategies and their academic learning outcomes on the other. It then examines the relation between students’ self-reported perceptions of the blended learning environment and their observed online learning strategies. Specifically, the study addressed three research questions:
1. What is the relation between students’ self-reported perceptions of the blended learning environment and their academic learning outcomes?
2. What is the relation between students’ observed online learning strategies and their academic learning outcomes?
3. What is the relation between students’ self-reported perceptions of the blended learning environment and their observed online learning strategies?
Method
Participants and the research context
The participants of the study were 310 undergraduates (aged between 17 and 31, M = 19.67, SD = 2.05). They were all enrolled in a first-year introductory course on computer systems, a blended course that required students to attend face-to-face lectures and tutorials and to interact online. The online learning, which took place in a bespoke LMS, consisted of five major online learning resources: printed course contents, video course contents, problem-solving sequences, multiple-choice questions testing the key concepts, and a dashboard for feedback and online learning progression. The bespoke LMS was designed by the course coordinator and had been used in this course for many years. The reason for using a bespoke LMS rather than a commercial LMS was that it had more advanced learning analytic functions, such as recording students’ exact logon and logoff times and the timestamps of sequences of students’ online learning behaviors.
Data and instruments
Self-reported perceptions collected by a questionnaire. Students’ self-reported perceptions of the blended learning environment were collected using a 5-point Likert-scale questionnaire, which consisted of two scales: (1) perceptions of the integration between face-to-face and online learning, which assessed the extent to which students perceived the face-to-face and online learning in the course to be integrated (7 items, α = 0.86); and (2) perceptions of online contributions, which examined the extent to which students valued the online learning (6 items, α = 0.87). The questionnaire was used in previous SAL research (Ellis & Bliuc, 2019; Han et al., 2020), and its validity and reliability have been reported in Han and Ellis (2020b).
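As an illustration of how such scale reliabilities can be computed from raw item responses, a minimal sketch in Python (pandas and NumPy assumed; the simulated 5-point responses and column names are hypothetical, not the study’s data):

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for one scale (rows = students, columns = items)."""
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the scale totals
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses for the 7-item integration scale.
rng = np.random.default_rng(0)
integration = pd.DataFrame(rng.integers(1, 6, size=(310, 7)),
                           columns=[f"integration_{i}" for i in range(1, 8)])
print(f"alpha = {cronbach_alpha(integration):.2f}")
```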
Observed online learning behaviors recorded by the bespoke LMS. The observed online learning behaviors were extracted from the LMS using its analytic functions. The LMS recorded students’ identifiers (unique identification numbers used to anonymize students’ names), the types of online learning behaviors, and the timestamps of sequences of online learning behaviors. An online learning behavior was defined as a click on a type of online learning resource. Hence, students’ clicks on the five different types of online learning resources represented five different types of online learning behaviors, namely: reading behaviors (reading the printed course contents); watching behaviors (watching the video course contents); theoretical testing behaviors (doing multiple-choice questions testing the key concepts); theory application behaviors (applying theories in problem-solving sequences); and study monitoring behaviors (viewing the dashboard for feedback and online learning progression).
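The mapping from raw clicks to the five behavior types can be illustrated with a short sketch; the resource labels, column names, and sample events below are assumptions for illustration, not the LMS’s actual schema:

```python
import pandas as pd

# Assumed mapping from LMS resource types to the five behavior labels.
BEHAVIOR_MAP = {
    "printed_content": "reading",
    "video_content": "watching",
    "mcq_test": "theoretical_testing",
    "problem_sequence": "theory_application",
    "dashboard": "study_monitoring",
}

# Hypothetical click events as recorded by the LMS.
events = pd.DataFrame({
    "student_id": [101, 101, 102],
    "resource": ["printed_content", "problem_sequence", "dashboard"],
    "timestamp": pd.to_datetime(
        ["2020-03-02 09:00", "2020-03-02 09:10", "2020-03-02 20:30"]),
})
events["behavior"] = events["resource"].map(BEHAVIOR_MAP)
print(events.sort_values(["student_id", "timestamp"]))
```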
The academic learning outcome. The academic learning outcome was students’ course mark, which consisted of lecture and tutorial attendance and a closed-book examination in multiple-choice format. The examination assessed students’ understanding of key theoretical points and their ability to utilise theories to solve practical problems.
Ethics considerations of the data collection
Ethics guidelines were strictly followed in recruiting the participants and collecting the data. Before the study, all potential participants were informed about the purposes of the study. The Participant Information Statement clearly explained to students that their participation was entirely voluntary and that their decision on whether to participate would by no means affect their course marks, as the teaching staff in the course had no access to the data. They were also assured that their identities would be anonymized and that all the information collected would be used solely for research purposes. Students were required to sign a written consent form should they wish to participate.
Data analysis
To answer the first research question, on the relation between students’ self-reported perceptions of the blended learning environment and the academic learning outcome, the mean scores of the two perception scales were used to divide students into two groups with better or poorer perceptions. A one-way ANOVA comparing the course marks of students with better and poorer perceptions was then performed.
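A minimal sketch of this analysis, assuming Python with SciPy (the marks are simulated from the group means and SDs reported later in the Results, and the group sizes are hypothetical; this is not the study’s data):

```python
import numpy as np
from scipy import stats

# Simulated course marks for the two perception groups.
rng = np.random.default_rng(1)
better = rng.normal(88.6, 17.2, size=170)
poorer = rng.normal(83.1, 16.1, size=140)

f_stat, p_value = stats.f_oneway(better, poorer)

# Effect size: eta squared = SS_between / SS_total.
all_marks = np.concatenate([better, poorer])
grand_mean = all_marks.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in (better, poorer))
ss_total = ((all_marks - grand_mean) ** 2).sum()
print(f"F = {f_stat:.2f}, p = {p_value:.4f}, eta^2 = {ss_between / ss_total:.2f}")
```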
To answer the second research question, on the relation between students’ observed online learning strategies and their academic learning outcome, a Hidden Markov Model (HMM) was applied to the sequences of students’ online learning sessions. An online learning session was defined as a series of continuous online learning behaviors with no break longer than 30 minutes; a session may consist of a varying number of timestamped online learning behaviors. The HMM transformed each online learning session into an online learning state, which was represented by a predominant online learning behavior (but might comprise more than one type of online learning behavior). After the HMM transformation, the chains of transformed online learning states were subjected to an agglomerative hierarchical sequence clustering analysis to derive distinct patterns of students’ online learning strategies. Using the online learning strategies as a between-subjects variable, a one-way ANOVA on students’ course marks was conducted.
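The paper does not specify its HMM implementation; the sketch below illustrates the pipeline under stated assumptions, using hmmlearn’s CategoricalHMM (hmmlearn >= 0.3) with randomly generated behavior codes standing in for the real traces:

```python
import numpy as np
import pandas as pd
from hmmlearn import hmm

# 1. Sessionize: start a new session whenever a student's gap between
#    consecutive clicks exceeds 30 minutes.
def sessionize(events: pd.DataFrame) -> pd.DataFrame:
    events = events.sort_values(["student_id", "timestamp"]).copy()
    gap = events.groupby("student_id")["timestamp"].diff() > pd.Timedelta("30min")
    events["session_id"] = gap.groupby(events["student_id"]).cumsum()
    return events

# 2. Fit a 3-state HMM over behavior codes (integers 0..4 for the five
#    behaviors). All sessions are concatenated; 'lengths' marks boundaries.
rng = np.random.default_rng(2)
lengths = rng.integers(5, 40, size=200)           # hypothetical session sizes
codes = rng.integers(0, 5, size=lengths.sum()).reshape(-1, 1)

model = hmm.CategoricalHMM(n_components=3, n_iter=100, random_state=0)
model.fit(codes, lengths)

# 3. Label each session by its predominant hidden state.
states = model.predict(codes, lengths)
session_states = [np.bincount(s).argmax()
                  for s in np.split(states, np.cumsum(lengths)[:-1])]
```

The per-student chains of session states would then feed into the sequence clustering step described above.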
For the last research question, on the relation between students’ self-reported perceptions of the blended learning environment and their observed online learning strategies, a cross-tabulation was conducted between the perception groups and the online learning strategy groups.
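A minimal sketch of the cross-tabulation and the accompanying chi-square test, assuming pandas and SciPy (the perception split is hypothetical; the strategy group sizes reuse those reported later in the Results):

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical group labels for the 310 students.
df = pd.DataFrame({
    "perception": ["better"] * 170 + ["poorer"] * 140,
    "strategy": ["s1"] * 97 + ["s2"] * 138 + ["s3"] * 57 + ["s4"] * 18,
})
table = pd.crosstab(df["perception"], df["strategy"])
chi2, p, dof, expected = chi2_contingency(table)
print(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")
```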
Results
The relations between self-reported perceptions of the blended learning environment and the academic learning outcome
The result of the one-way ANOVA shows that students reporting better and poorer perceptions differed significantly in their academic performance in the course: F(1, 308) = 8.33, p < .01, η² = 0.02. Students who had higher ratings of the blended learning environment (M = 88.58, SD = 17.24) obtained significantly higher course marks than those with lower ratings (M = 83.11, SD = 16.14).
The relations between the observed online learning strategies and the academic learning outcome
Using the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC) to select the number of hidden states (a model-selection sketch follows the list below), the HMM identified three states of online learning sessions, described below:
- reading states: predominantly reading, with a small amount of study monitoring;
- theory application states: predominantly applying theories to solve practical problems, with a small amount of reading and watching;
- theoretical testing states: predominantly theoretical testing, with a small amount of reading and study monitoring.
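A sketch of how AIC and BIC can be compared across candidate numbers of hidden states, reusing the codes and lengths arrays from the earlier HMM sketch (the free-parameter count assumes a categorical HMM; recent hmmlearn versions also expose aic() and bic() helpers that would serve the same purpose):

```python
import numpy as np
from hmmlearn import hmm

def information_criteria(codes, lengths, n_states):
    """Fit a categorical HMM and return (AIC, BIC)."""
    model = hmm.CategoricalHMM(n_components=n_states, n_iter=100,
                               random_state=0).fit(codes, lengths)
    log_l = model.score(codes, lengths)
    n_symbols = model.emissionprob_.shape[1]
    # Free parameters: initial probs + transition rows + emission rows.
    p = (n_states - 1) + n_states * (n_states - 1) + n_states * (n_symbols - 1)
    n = len(codes)
    return -2 * log_l + 2 * p, -2 * log_l + p * np.log(n)

for k in (2, 3, 4, 5):
    aic, bic = information_criteria(codes, lengths, k)
    print(f"{k} states: AIC = {aic:.1f}, BIC = {bic:.1f}")
```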
Using the above three states, an agglomerative hierarchical sequence clustering analysis was performed. To select the optimal number of clusters, dendrograms were used to identify the most plausible segmentations of the tree structure (Kassambara, 2017). Four clusters were retained, with each cluster representing a distinct observed online learning strategy. The four online learning strategies are visually presented in Fig. 2.
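The study’s clustering operates on the chains of HMM states; as SciPy has no built-in sequence-distance measure, the sketch below is a simplified stand-in that applies Ward-linkage agglomerative clustering to hypothetical per-student summaries (state proportions plus session counts) and cuts the dendrogram into four clusters:

```python
import numpy as np
from scipy.cluster.hierarchy import dendrogram, fcluster, linkage

# Hypothetical per-student features: proportions of the three HMM states
# plus the (log) number of sessions. A faithful replication would instead
# compute pairwise distances between the full state sequences.
rng = np.random.default_rng(3)
state_props = rng.dirichlet(alpha=[2.0, 2.0, 2.0], size=310)
n_sessions = np.log1p(rng.poisson(30, size=310))[:, None]
features = np.hstack([state_props, n_sessions])

tree = linkage(features, method="ward")    # agglomerative hierarchy
# dendrogram(tree) would plot the tree to eyeball plausible segmentations.
clusters = fcluster(tree, t=4, criterion="maxclust")
print(np.bincount(clusters)[1:])           # cluster sizes
```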
In Fig. 2, each point on the X axis is a transformed HMM state of the corresponding online learning session, and the Y axis shows the proportional distribution of the HMM states. As shown in Fig. 2, the students in the four clusters differed not only in the number of learning states but also in the proportional distribution of the types of states. In general, the proportions of the reading states were similar amongst the first three observed online learning strategies, and all were higher than that in observed online learning strategy 4. The differences lay mainly in the proportional distribution of the theory application states and the theoretical testing states.
- Observed online learning strategy 1 (n = 97) – intensive theory application: Students adopting strategy 1 had a high percentage of theory application states but a low percentage of theoretical testing states. These students also had the most online learning sessions.
- Observed online learning strategy 2 (n = 138) – moderate theory application: Students adopting strategy 2 had a moderate percentage of theory application states but a low percentage of theoretical testing states. These students had the second most online learning sessions.
- Observed online learning strategy 3 (n = 57) – weak theory application and moderate theoretical testing: Students adopting strategy 3 had a low percentage of theory application states but a moderate percentage of theoretical testing states. This group ranked third in the number of online learning sessions.
- Observed online learning strategy 4 (n = 18) – weak reading and weak theory application: Students adopting strategy 4 had low percentages of both the reading states and the theory application states. The students in this cluster had the fewest online learning sessions.
The results of the one-way ANOVA by students’ observed online learning strategies showed that their learning outcomes differed significantly: F(3, 306) = 36.75, p < .01, η² = 0.27. Post-hoc tests for pairwise comparisons were then conducted. Due to the unequal sample sizes between groups, Gabriel’s post-hoc test was selected, and the results of the pairwise comparisons are displayed in Table 1.
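Gabriel’s procedure is not available in SciPy or statsmodels; as an illustrative stand-in, the sketch below runs Tukey’s HSD over marks simulated from the group sizes, means, and SDs reported in Table 1:

```python
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Marks simulated from the reported group statistics (not the raw data).
rng = np.random.default_rng(4)
sizes = [97, 138, 57, 18]
params = [(94.49, 11.56), (87.18, 15.50), (75.71, 16.83), (62.31, 13.70)]
marks = np.concatenate([rng.normal(m, sd, n)
                        for (m, sd), n in zip(params, sizes)])
groups = np.repeat(["s1", "s2", "s3", "s4"], sizes)

# Tukey's HSD used here as a stand-in for Gabriel's test (which adapts
# the critical values to unequal group sizes).
print(pairwise_tukeyhsd(marks, groups))
```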
Table 1 shows that students who adopted the intensive theory application strategy (strategy 1) (M = 94.49, SD = 11.56) obtained higher course marks than students using the other three types of online learning strategies. Students using the moderate theory application strategy (strategy 2) had the second highest marks (M = 87.18, SD = 15.50), followed by those with the weak theory application and moderate theoretical testing strategy (strategy 3) (M = 75.71, SD = 16.83). Students employing the weak reading and weak theory application strategy (strategy 4) obtained the lowest marks (M = 62.31, SD = 13.70).
The relation between the self-reported perceptions of the blended learning environment and the observed online learning strategies
The result of the 2 (students with better vs. poorer perceptions) x 4 (students using the four online learning strategies) cross-tabulation was significant: χ²(3) = 8.76, p < .05, φ = 0.03. The two-proportion z-tests displayed in Table 2 show that amongst the 97 students who adopted the intensive theory application strategy (strategy 1), the proportion of students self-reporting better perceptions of the blended learning environment (59.80%) was significantly higher than the proportion reporting poorer perceptions (40.20%). In contrast, amongst the 18 students who used the weak reading and weak theory application strategy (strategy 4), the proportion of students having poorer perceptions of the blended learning environment (72.20%) was significantly higher than the proportion holding better perceptions (27.80%).
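The exact formulation of the z-tests is not given in the paper; one plausible reading, testing each strategy group’s better-perception proportion against 0.5, can be sketched with statsmodels (counts reconstructed from the reported percentages):

```python
from statsmodels.stats.proportion import proportions_ztest

# Strategy 1: 58 of 97 students reported better perceptions (59.80%).
z1, p1 = proportions_ztest(count=58, nobs=97, value=0.5)
# Strategy 4: 5 of 18 students reported better perceptions (27.80%).
z4, p4 = proportions_ztest(count=5, nobs=18, value=0.5)
print(f"strategy 1: z = {z1:.2f}, p = {p1:.3f}")
print(f"strategy 4: z = {z4:.2f}, p = {p4:.3f}")
```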
Discussion
This study examined the relations between students’ self-reported perceptions of the blended learning environment, their observed online learning strategies, and their academic learning outcome. Similar to previous research findings (Guo, 2018; Guo et al., 2017; Ellis & Bliuc, 2019; Han & Ellis, 2020a), our study found that students who had better perceptions of the learning environment (in this context, those who perceived that the online part of the course was well blended with the face-to-face part, and who valued the online contributions) tended to achieve better academic performance in the course.
Unlike previous studies, which used self-reports to measure students’ learning strategies and/or approaches (Ellis & Bliuc, 2019; Ellis et al., 2017), we employed the digital traces left in the LMS, a more objective measure, to represent students’ online learning strategies. The data mining techniques detected four types of online learning strategies, which not only differed in the number of online learning sessions (how much students learned online), but also varied in the proportional distribution of the different online learning behaviors (how they learned online). Similar to the results reported in Han and Ellis (2017), our results indicated that the more students participated in online learning, the higher the course marks they obtained. The results also suggested that students who interacted more with the theory application resource tended to achieve better academic learning outcomes, as students adopting online learning strategies 1 and 2 had significantly higher course marks than those using online learning strategies 3 and 4. One possible interpretation is that engagement with theory application might represent a deeper level of learning than merely testing theoretical concepts, as solving sequences of problems required not only a thorough understanding of theories, but also the ability to apply theories in tackling problems and issues in real life. This meant that students might need to draw on relevant theories, apply formulas, use mathematical methods, and build models in order to complete the theory application tasks successfully. Such findings seem to align with previous SAL findings, which have consistently reported an association between the deep strategies reported by students and better learning outcomes (Trigwell & Prosser, 2020). The observed results from our study added more objective evidence and offered some triangulation of the previous self-reported research evidence.
Our research results also share some similarities with learning analytics studies on detecting students’ online learning tactics and strategies. These studies reported that students differed in how much they engaged with different types of online learning activities, and that such differences in approaching a certain type, or a combination of certain types, of learning activities also tended to relate to their academic performance (Fincham et al., 2018; Jovanović et al., 2017). However, neither the current study nor the existing studies provide a clear answer to the question of whether how much students learned online (e.g., the total number of online learning sessions), how they learned online (e.g., the proportional distribution of the different online learning behaviors), or both, is related to students’ academic performance. This question needs to be answered by clustering students using either the quantity of online learning or the proportional distribution of the different types of online learning behaviors as the criterion.
With regard to the relation between the self-reported perceptions of the blended learning environment and the observed online learning strategies, the study found a significant association between the better and poorer perceptions and the patterns of the observed strategies. In particular, of the students using the intensive theory application strategy (who also had the highest course marks), a higher proportion perceived the blended learning environment more positively; whereas of those employing the weak reading and weak theory application strategy (who also had the lowest course marks), a higher proportion had more negative perceptions of the blended learning environment.
These findings seem to be consistent with studies employing self-reported methods to examine the relations between perceptions of the learning environment and learning strategies/approaches (Ellis & Bliuc, 2019; Guo, 2018; Han & Ellis, 2020a). The results of our study not only confirm and triangulate previous self-reported findings; the digital trace measures also offer much more detailed descriptions of online learning behaviors than can be captured by questionnaires. Notwithstanding such merits, caution still needs to be taken when comparing results from self-reporting methods and observational methods, as the learning strategies measured by self-reports often include information about students’ motives and intentions in adopting certain types of strategies (the ‘why’ question). Therefore, the online learning strategies measured by observation can only approximate the strategies and approaches measured by self-reports.
Limitations and future research direction
A number of limitations of the study need to be pointed out in order to inform future research. First, as mentioned, the clustering of students’ observed online learning strategies did not distinguish clearly between the number of online learning sessions and the proportional distribution of the online learning states. Future research should purposely address some of the unanswered questions raised by this limitation, such as whether the number of online learning sessions or the proportional distribution of online learning activities is related to students’ perceptions of the learning environment and their academic performance. Second, the research was conducted with students from only one academic discipline, computer science. To examine whether there are disciplinary variations in the results, similar research designs should be applied with students from other academic disciplines in the future. Furthermore, the self-reported data only measured students’ perceptions of the blended learning environment. SAL research has indicated that students’ personal attributes, their prior knowledge, and their motivation in the learning context are all related to their perceptions, strategies, and academic learning outcomes (Trigwell et al., 2013). New studies addressing these issues will help push the field onwards.
References
Bettinger, E. P., & Baker, R. B. (2014). The effects of student coaching: An evaluation of a randomized experiment in student advising. Educational Evaluation and Policy Analysis, 36(1), 3–19. https://doi.org/10.3102/0162373713500523
Biggs, J. B. (1989). Approaches to the enhancement of tertiary teaching. Higher Education Research and Development, 8(1), 7–25. https://doi.org/10.1080/0729436890080102
Biggs, J., & Tang, C. (2011). Teaching for quality learning at university. London: McGraw-Hill Education
Buckingham Shum, S., & Crick, R. D. (2012). Learning dispositions and transferable competencies: Pedagogy, modelling and learning analytics. Proceedings of the 2nd International Conference on Learning Analytics & Knowledge (pp. 92–101). https://doi.org/10.1145/2330601.2330629
Chen, B., Resendes, M., Chai, C. S., & Hong, H. Y. (2017). Two tales of time: Uncovering the significance of sequential patterns among contribution types in knowledge-building discourse. Interactive Learning Environments, 25(2), 162–175. https://doi.org/10.1080/10494820.2016.1276081
Crawford, K., Gordon, S., Nicholas, J., & Prosser, M. (1998). Qualitatively different experiences of learning mathematics at university. Learning & Instruction, 8(5), 455–468. https://doi.org/10.1016/S0959-4752(98)00005-X
Ellis, R. A., & Bliuc, A. M. (2019). Exploring new elements of the student approaches to learning framework: The role of online learning technologies in student learning. Active Learning in Higher Education, 20(1), 11–24. https://doi.org/10.1177/1469787417721384
Ellis, R., Bliuc, A., & Han, F. (2020). Challenges in assessing the nature of effective group work in blended university courses. Australasian Journal of Educational Technology, 37(1), 1–14. https://doi.org/10.14742/ajet.5576
Ellis, R. A., Han, F., & Pardo, A. (2017). Improving learning analytics – Combining observational and self-report data on student learning. Journal of Educational Technology & Society, 20(3), 158–169. https://www.jstor.org/stable/26196127?seq=1
Ellis, R. A., Pardo, A., & Han, F. (2016). Quality in blended learning environments – Significant differences in how students approach learning collaborations. Computers & Education, 102, 90–102. https://doi.org/10.1016/j.compedu.2016.07.006
Fincham, E., Gašević, D., Jovanović, J., & Pardo, A. (2018). From study tactics to learning strategies: An analytical method for extracting interpretable representations. IEEE Transactions on Learning Technologies, 12(1), 59–72. https://doi.org/10.1109/TLT.2018.2823317
Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64–71. https://doi.org/10.1007/s11528-014-0822-x
Gašević, D., Jovanović, J., Pardo, A., & Dawson, S. (2017). Detecting learning strategies with analytics: Links with self-reported measures and academic performance. Journal of Learning Analytics, 4(2), 113–128. https://doi.org/10.18608/jla.2017.42.10
Gibson, A., Aitken, A., Sándor, Á., Buckingham Shum, S., Tsingos-Lucas, C., & Knight, S. (2017). Reflective writing analytics for actionable feedback. Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 153–162). https://doi.org/10.1145/3027385.3027436
Guo, J. (2018). Building bridges to student learning: Perceptions of the learning environment, engagement, and learning outcomes among Chinese undergraduates. Studies in Educational Evaluation, 59, 195–208. https://doi.org/10.1016/j.stueduc.2018.08.002
Guo, J., Yang, L., & Shi, Q. (2017). Effects of perceptions of the learning environment and approaches to learning on Chinese undergraduates’ learning. Studies in Educational Evaluation, 55, 125–134. https://doi.org/10.1016/j.stueduc.2017.09.002
Hadwin, A. F., Nesbit, J. C., Jamieson-Noel, D., Code, J., & Winne, P. H. (2007). Examining trace data to explore self-regulated learning. Metacognition and Learning, 2(2–3), 107–124. https://doi.org/10.1007/s11409-007-9016-7
Han, F., & Ellis, R. A. (2017). Variations in coherence and engagement in students’ experience of blended learning. In H. Partridge, K. Davis, & J. Thomas. (Eds.), Proceedings of the 34th International Conference on Innovation, Practice and Research in the Use of Educational Technologies in Tertiary Education (pp. 268–275). Toowoomba: University of Southern Queensland. https://2017conference.ascilite.org/wp-content/uploads/2017/11/Full-HAN_Feifei3.pdf
Han, F., & Ellis, R. A. (2019). Identifying consistent patterns of quality learning discussions in blended learning. The Internet & Higher Education, 40, 12–19. https://doi.org/10.1016/j.iheduc.2018.09.002
Han, F., & Ellis, R. A. (2020a). Combining self-reported and observational measures to assess university student academic performance in blended course designs. Australasian Journal of Educational Technology, 36(6), 1–14. https://doi.org/10.14742/ajet.6369
Han, F., & Ellis, R. A. (2020b). Initial development and validation of the Perceptions of the Blended Learning Environment Questionnaire. Journal of Psychoeducational Assessment, 38(2), 168–181. https://doi.org/10.1177/0734282919834091
Han, F., Pardo, A., & Ellis, R. A. (2020). Students’ self-report and observed learning orientations in blended university course design: How are they related to each other and to academic performance? Journal of Computer Assisted Learning, 36(6), 969–980. https://doi.org/10.1111/jcal.12453
Han, F., & Ellis, R. A. (2021). Predicting students’ academic performance by their online learning patterns in a blended course: To what extent is a theory-driven approach and a data-driven approach consistent? Educational Technology & Society, 24(1), 191–204. https://www.jstor.org/stable/26977867?seq=1
Jovanović, J., Gašević, D., Dawson, S., Pardo, A., & Mirriahi, N. (2017). Learning analytics to unveil learning strategies in a flipped classroom. The Internet & Higher Education, 33(4), 74–85. https://doi.org/10.1016/j.iheduc.2017.02.001
Kaendler, C., Wiedmann, M., Rummel, N., & Spada, H. (2015). Teacher competencies for the implementation of collaborative learning in the classroom: A framework and research review. Educational Psychology Review, 27(3), 505–536. https://doi.org/10.1007/s10648-014-9288-9
Kassambara, A. (2017). Practical guide to cluster analysis in R: Unsupervised machine learning. Sthda
Krumm, A. E., Waddington, R. J., Teasley, S. D., & Lonn, S. (2014). A learning management system-based early warning system for academic advising in undergraduate engineering. In J. A. Larusson & B. White (Eds.), Learning analytics: From research to practice (pp. 103–119). New York, NY: Springer. https://doi.org/10.1007/978-1-4614-3305-7_6
Li, Q., Baker, R., & Warschauer, M. (2020). Using clickstream data to measure, understand, and support self-regulated learning in online courses. The Internet & Higher Education, 45, 100727. https://doi.org/10.1016/j.iheduc.2020.100727
Lizzio, A., Wilson, K., & Simons, R. (2002). University students’ perceptions of the learning environment and academic outcomes: implications for theory and practice. Studies in Higher Education, 27(1), 27–52. https://doi.org/10.1080/03075070120099359
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459. https://doi.org/10.1177/0002764213479367
Ober, T. M., Hong, M. R., Rebouças-Ju, D. A., Carter, M. F., Liu, C., & Cheng, Y. (2021). Linking self-report and process data to performance as measured by different assessment types. Computers & Education, 167, 104188. https://doi.org/10.1016/j.compedu.2021.104188
Ocumpaugh, J., Baker, R., Gowda, S., Heffernan, N., & Heffernan, C. (2014). Population validity for educational data mining models: A case study in affect detection. British Journal of Educational Technology, 45(3), 487–501. https://doi.org/10.1111/bjet.12156
Pardo, A., Han, F., & Ellis, R. A. (2017). Combining university student self-regulated learning indicators and engagement with online learning events to predict academic performance. IEEE Transactions on Learning Technologies, 10(1), 82–92. https://doi.org/10.1109/TLT.2016.2639508
Prosser, M., & Trigwell, K. (1999). Understanding learning and teaching: The experience in higher education. London: McGraw-Hill Education
Prosser, M., & Trigwell, K. (2017). Student learning and the experience of teaching. HERDSA Review of Higher Education, 4, 5–27
Ramsden, P. (2003). Learning to teach in higher education. London: Routledge
Reimann, P., Markauskaite, L., & Bannert, M. (2014). e-Research and learning theory: What do sequence and process mining methods contribute? British Journal of Educational Technology, 45(3), 528–540. https://doi.org/10.1111/bjet.12146
Rodríguez-Triana, M. J., Martínez‐Monés, A., Asensio‐Pérez, J. I., & Dimitriadis, Y. (2015). Scripting and monitoring meet each other: Aligning learning analytics and learning design to support teachers in orchestrating CSCL situations. British Journal of Educational Technology, 46(2), 330–343. https://doi.org/10.1111/bjet.12198
Romero, C., López, M. I., Luna, J. M., & Ventura, S. (2013). Predicting students’ final performance from participation in on-line discussion forums. Computers & Education, 68, 458–472. https://doi.org/10.1016/j.compedu.2013.06.009
Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380–1400. https://doi.org/10.1177/0002764213498851
Sun, Z., & Xie, K. (2020). How do students prepare in the pre-class setting of a flipped undergraduate math course? A latent profile analysis of learning behavior and the impact of achievement goals. The Internet and Higher Education, 46, 100731. https://doi.org/10.1016/j.iheduc.2020.100731
Tang, Y. M., Chen, P. C., Law, K. M., Wu, C. H., Lau, Y. Y., Guan, J., & Ho, G. T. (2021). Comparative analysis of student’s live online learning readiness during the coronavirus (COVID-19) pandemic in the higher education sector. Computers & Education, 168, 104211. https://doi.org/10.1016/j.compedu.2021.104211
Tempelaar, D., Rienties, B., Mittelmeier, J., & Nguyen, Q. (2018). Student profiling in a dispositional learning analytics application using formative assessment. Computers in Human Behavior, 78, 408–420. https://doi.org/10.1016/j.chb.2017.08.010
Toetenel, L., & Rienties, B. (2016). Analysing 157 learning designs using learning analytic approaches as a means to evaluate the impact of pedagogical decision making. British Journal of Educational Technology, 47(5), 981–992. https://doi.org/10.1111/bjet.12423
Trigwell, K., Ashwin, P., & Millan, E. S. (2013). Evoked prior learning experience and approach to learning as predictors of academic achievement. British Journal of Educational Psychology, 83(3), 363–378. https://doi.org/10.1111/j.2044-8279.2012.02066.x
Trigwell, K., & Prosser, M. (2020). Exploring university teaching and learning: Experience and context. London: Springer Nature
Wilson, K., & Fowler, J. (2005). Assessing the impact of learning environments on students’ approaches to learning: Comparing conventional and action learning designs. Assessment & Evaluation in Higher Education, 30(1), 87–101. https://doi.org/10.1080/0260293042003251770
Zhou, M., & Winne, P. H. (2012). Modeling academic achievement by self-reported versus traced goal orientation. Learning and Instruction, 22(6), 413–419. https://doi.org/10.1016/j.learninstruc.2012.03.004
Funding
Open Access funding enabled and organized by CAUL and its Member Institutions. This work was supported by the Australian Research Council [grant number DP150104163]