Session: Learning Analytics


Speakers

Rafael Ferreira Mello
Federal Rural University of Pernambuco, Brazil
Daniela Rotelli
University of Pisa, Italy

Start

08/09/2022 - 10:30

End

15/09/2022 - 12:30

Address

Room B308


Session: Learning Analytics

Chair: Olga Viberg

10:30-11:00 CET
When and How to Update Online Analytical Models for Predicting Students Performance?

Chahrazed Labba and Anne Boyer
University of Lorraine, France

Abstract: One of the main concerns in online learning environments is the identification of students with learning difficulties. Conventionally, analytical models trained offline on pre-prepared datasets are used to predict student performance. However, as learning data become progressively available over time, this learning method is no longer sufficient in real-world applications. Nowadays, incremental learning strategies are increasingly applied to update online analytical models by re-training them on newly received data. Various online incremental learning approaches have been proposed to overcome issues such as catastrophic forgetting and concept drift. However, no approach addresses the question of when to update the model and how to determine whether the new data provide important information that the model should learn. In this paper, we propose a method for determining when an online classifier that predicts student performance from a real-time data stream should be updated. In addition, we use a typical approach that maintains balanced old and new data examples to re-train the model when necessary. As a proof of concept, we applied our method to real data from K-12 learners enrolled in an online physics-chemistry module.

📄 Read More: https://link.springer.com/chapter/10.1007/978-3-031-16290-9_13
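The abstract does not give the paper's actual update criterion, so the following is only a minimal illustrative sketch of the general idea it describes: monitor the streaming classifier's recent error rate, flag an update when it drifts above the rate seen at the last training, and rebuild the training set from a balanced mix of old and new examples. All names, window sizes, and thresholds here are hypothetical.

```python
import random
from collections import deque

class UpdateMonitor:
    """Decide when a streaming classifier should be re-trained.

    Flags an update when the rolling error rate over the last `window`
    predictions exceeds the baseline error rate by more than `tolerance`.
    """
    def __init__(self, window=50, tolerance=0.10):
        self.recent = deque(maxlen=window)  # 1 = misclassified, 0 = correct
        self.baseline = None                # error rate at last (re)training
        self.tolerance = tolerance

    def observe(self, was_error: bool) -> bool:
        self.recent.append(1 if was_error else 0)
        if len(self.recent) < self.recent.maxlen:
            return False                    # not enough evidence yet
        rate = sum(self.recent) / len(self.recent)
        if self.baseline is None:
            self.baseline = rate            # first full window sets the baseline
            return False
        return rate > self.baseline + self.tolerance

def balanced_retrain_set(old, new, size):
    """Keep old and new examples balanced when rebuilding the training set."""
    half = size // 2
    return (random.sample(old, min(half, len(old)))
            + random.sample(new, min(half, len(new))))
```

In this sketch, a caller would feed each prediction outcome to `observe()` and, whenever it returns `True`, re-train on `balanced_retrain_set(...)` and reset the baseline.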


11:00-11:30 CET
Enhancing Instructors’ Capability to Assess Open-Response Using Natural Language Processing and Learning Analytics

Rafael Ferreira Mello[1,2], Rodrigues Neto[1], Giuseppe Fiorentino[1], Gabriel Alves[1], Verenna Arédes[1], João Victor Galdino Ferreira Silva[1], Taciana Pontual Falcão[1] and Dragan Gasevic[2]
[1] Universidade Federal Rural de Pernambuco, Brazil
[2] Monash University, Australia

Abstract: Assessments are crucial to measuring student progress and providing constructive feedback. However, instructors have a heavy workload, which leads to more superficial assessments that sometimes do not include the questions and activities needed to evaluate students adequately. For instance, it is well known that open-ended questions and textual productions can stimulate students to develop critical thinking and knowledge construction skills, but this type of question requires much effort and time in the evaluation process. Previous works have focused on automatically scoring open-ended responses based on the similarity of the students’ answers to a reference solution provided by the instructor. This approach has benefits but also several drawbacks, such as the failure to provide quality feedback for students and the possible introduction of negative bias into the assessment of activities. To address these challenges, this paper presents a new approach that combines learning analytics and natural language processing methods to support instructors in assessing open-ended questions. The main novelty of this paper is the replacement of the similarity analysis with a tag recommendation algorithm that automatically assigns already-known correct statements and errors to the responses, along with an explanation for each tag.

📄 Read More: https://link.springer.com/chapter/10.1007/978-3-031-16290-9_8
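The paper's actual tag recommendation algorithm is not described in the abstract; as a rough sketch of the idea (suggest instructor-defined tags, each with its explanation, by matching a new answer against previously tagged answers rather than a single reference solution), using a deliberately simple bag-of-words similarity and hypothetical names throughout:

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term counts for a response."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def recommend_tags(response, tagged_examples, threshold=0.3):
    """Suggest (tag, explanation) pairs for a new answer.

    `tagged_examples` is a list of (answer_text, tag, explanation) triples
    previously labelled by the instructor; each tag is recommended with the
    explanation of its best-matching example above `threshold`.
    """
    vec = vectorize(response)
    best = {}
    for text, tag, explanation in tagged_examples:
        s = cosine(vec, vectorize(text))
        if s >= threshold and s > best.get(tag, (0.0, ""))[0]:
            best[tag] = (s, explanation)
    return [(tag, expl) for tag, (s, expl) in sorted(best.items())]
```

A real system would use stronger NLP representations; the point of the sketch is the shift from scoring against one reference answer to recommending known tags with explanations.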


11:30-12:00 CET
[Online] Uncovering Student Temporal Learning Patterns

Daniela Rotelli, Anna Monreale and Riccardo Guidotti
University of Pisa, Italy

Abstract: Because of the flexibility of online learning courses, students organise and manage their own learning time by choosing where, what, how, and for how long they study. Each individual has unique learning habits that characterise their behaviour and distinguish them from others. Nonetheless, to the best of our knowledge, the temporal dimension of student learning has received little attention on its own. Typically, when modelling trends, a chosen configuration is set to capture various habits, and a cluster analysis is undertaken. However, the selection of the variables to observe and of the algorithm used to conduct the analysis are subjective choices that reflect the researcher’s thoughts and ideas. To explore how students behave over time, we present alternative ways of modelling student temporal behaviour. Our experiments on real-world data reveal that the generated clusters may or may not differ based on the selected profile, and unveil different student learning patterns.

📄 Read More: https://link.springer.com/chapter/10.1007/978-3-031-16290-9_25
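The abstract's point that the chosen temporal profile shapes the resulting clusters can be made concrete with one simple profile choice. This is not the paper's modelling; it is a minimal hypothetical example of turning a raw click log into per-student hour-of-day activity profiles, which a clustering algorithm would then consume:

```python
from collections import defaultdict

def hourly_profile(hours):
    """24-bin profile: fraction of a student's activity falling in each hour."""
    counts = [0] * 24
    for hour in hours:
        counts[hour % 24] += 1
    total = sum(counts) or 1
    return [c / total for c in counts]

def profiles_by_student(log):
    """Build one temporal profile per student.

    `log` is an iterable of (student_id, hour_of_day) event pairs; the
    result maps each student to their normalised hourly profile.
    """
    per_student = defaultdict(list)
    for sid, hour in log:
        per_student[sid].append(hour)
    return {sid: hourly_profile(hrs) for sid, hrs in per_student.items()}
```

Swapping this profile for another (say, session lengths or weekday patterns) before clustering is exactly the kind of subjective modelling choice the paper examines.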


12:00-12:30 CET
Privacy-Preserving Synthetic Educational Data Generation

Jill-Jênn Vie[1], Tomas Rigaux[1] and Sein Minn[2]
[1] SODA, Inria Saclay, France
[2] CEDAR, Inria Saclay, France

Abstract: Institutions collect massive learning traces, but they may not disclose them due to privacy concerns. Synthetic data generation opens new opportunities for research in education. In this paper we present a generative model for educational data that can preserve the privacy of participants, and an evaluation framework for comparing synthetic data generators. We show how naive pseudonymization can lead to re-identification threats and suggest techniques to guarantee privacy. We evaluate our method on existing massive educational open datasets.

📄 Read More: https://link.springer.com/chapter/10.1007/978-3-031-16290-9_29
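The re-identification threat the abstract mentions can be illustrated without any of the paper's machinery. In this hypothetical sketch, hashing the student identifier (naive pseudonymization) does nothing to stop linkage: a record that is unique on quasi-identifiers such as age and postcode can still be matched to a named person in an auxiliary dataset.

```python
import hashlib

def pseudonymize(records):
    """Naive pseudonymization: hash the student id, keep all other fields."""
    return [{**r, "id": hashlib.sha256(r["id"].encode()).hexdigest()[:8]}
            for r in records]

def reidentify(released, auxiliary, quasi=("age", "zip")):
    """Link released records back to named individuals.

    A released record is re-identified when its quasi-identifier
    combination matches exactly one person in the auxiliary data.
    """
    def key(r):
        return tuple(r[q] for q in quasi)

    aux_by_key = {}
    for person in auxiliary:
        aux_by_key.setdefault(key(person), []).append(person)

    matches = {}
    for rec in released:
        candidates = aux_by_key.get(key(rec), [])
        if len(candidates) == 1:  # unique link => re-identified
            matches[rec["id"]] = candidates[0]["name"]
    return matches
```

This is why the paper moves beyond pseudonymization to synthetic data generation with privacy guarantees: the sensitive attributes themselves, not the identifier column, are what enable linkage.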