Learning Analytics and Task Design in Science Education
Journal article, Peer reviewed
Original version: EDULEARN Proceedings, 2017, 8021-8024. DOI: 10.21125/edulearn.2017.0476
Learning analytics has become a buzzword, in practice often accompanied by comprehensive use of digital technologies. This study takes a qualitative approach, analysing performance data within the bounds of a single module. High failure rates prompted an action research study to look for performance patterns at the exam. In general, students were successful at carrying out calculations but struggled when asked conceptual and theoretical questions. Further analysis revealed a lack of alignment between the tasks posed in exercises and those posed at the exam. Particularly in the second half of the course, students reported being asked theoretical and conceptual questions that had not previously been addressed in the exercises. The project therefore set out to align conceptual and algorithmic tasks across the various parts of the curriculum. Furthermore, a theoretical mid-term test was introduced to focus exclusively on conceptual issues, some of which were re-addressed at the final exam. Following the interventions, failure rates were greatly reduced, and student feedback was generally supportive.