Teaching Matters

Measuring student engagement and participation using learning analytics


Teaching Matters 2017 | Presentation Details | 28 November 2017

Title

Measuring student engagement and participation using learning analytics: A case study at the University of Tasmania


Author(s)

Si Fan, School of Education
Soonja Yeom, School of Engineering and ICT
Saurabh Garg, School of Engineering and ICT
Gerry Kregor, Tasmanian Institute of Learning & Teaching


Subtheme

Advancing the Scholarship of Learning and Teaching


Presentation Type

Showcase Presentation


Room

Social Sciences 209


Time

12.55-13.15


Abstract

In recent years there has been an increase in the use of Big Data analytics for educational purposes, driven by the pressure educational institutions face around performance management and student retention (Beer, 2010), and facilitated by the growing use of online environments, methodological developments and system capabilities. One common form of Big Data analytics in higher education is learning analytics. Learning analytics applications are often used to build predictive models of student outcomes, to measure student engagement, or to identify at-risk students at an earlier stage (Rienties et al., 2016). This project is a case study exploring the use of Big Data and learning analytics to identify such patterns in Australian higher education, with the University of Tasmania as a starting point.

This project examines correlations between student engagement and participation and factors such as 1) the intended pedagogical design and ‘blend’ of the unit; 2) lecturers’ and tutors’ expectations and support; 3) student interactions with the lecturer, tutors and peers; 4) the types of teaching materials, resources and activities available online; and 5) student results and evaluations. The project involved 18 sets of data from the online components of 10 units (two education units and eight ICT units, taught between 2013 and 2015). The collected data include: the percentage of content completed, quizzes attempted, quizzes completed, discussion posts read, discussion posts created, discussion posts replied to, the length of time students spent in the system, and students’ grades.

To date, we have conducted two sets of preliminary analyses using Excel’s statistical tools. The first set focused on understanding how student engagement changed over the three-year period. In this analysis, the number of messages posted or replied to in discussion boards was used as the dominant indicator of engagement. The analysis shows that the high-achieving students were the most active participants, although in the last two years this trend shifted slightly: students who failed the units became more active participants. This may indicate that at-risk students were having more difficulty understanding the content, and that changes in teaching patterns or support are required. The recently introduced practice of early detection of student engagement via Student Engaging Activity (SEA) will support teaching teams directly with early notification.

The second set of analyses focused on two of the ICT units, whose enrolled cohorts overlap to a large extent. This analysis confirms that features such as the number of visits to learning modules and users’ interactions are important indicators of student learning performance and achievement. Two key findings emerged from this preliminary analysis. First, student interaction patterns remain broadly the same across the two units. Second, students access the teaching material more often when a unit is difficult or requires higher-level thinking or understanding. These findings will be further tested and validated with improved methods and larger data sets in later stages of the study.
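As an illustrative sketch only (the UTAS data are not public, and all names and values below are invented for illustration), the kind of relationship examined here, between per-student engagement counts and final grades, can be checked with a simple Pearson correlation:

```python
# Hypothetical sketch: correlating an engagement metric (visits to
# learning modules) with final grades. All data below are invented.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented example: module visits per student vs. final grade (%).
visits = [12, 45, 30, 8, 50, 22]
grades = [55, 82, 70, 48, 88, 63]

print(round(pearson_r(visits, grades), 3))  # strongly positive on this toy data
```

A value near 1 on real data would support the finding that visit counts track achievement; the study's later stages describe using improved methods and larger data sets rather than a single coefficient.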

The presentation will report on our exploratory study. We will present selected student and teacher interaction data and the analyses that led to our interim findings. We will also discuss some limitations inherent in the data available at UTAS, and then invite discussion about others’ experiences and thoughts on learning analytics.

References

Beer, C. (2010). Online student engagement: New measures for new methods. Unpublished Masters dissertation, CQUniversity, Rockhampton, Qld, Australia.

Rienties, B., Boroowa, A., Cross, S., Kubiak, C., Mayles, K., & Murphy, S. (2016). Analytics4Action evaluation framework: A review of evidence-based learning analytics interventions at the Open University UK. Journal of Interactive Media in Education, (1), 1-11.
