Evaluation of learning analytics metrics and dashboard in a software engineering project course

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Learning analytics is a technique for monitoring learning activities using metrics. The aim of this research was to determine the impact of data filtering on metric quality, to evaluate the usability of a learning analytics dashboard in a software development project course, and to compare a correlation-based dashboard with a randomly arranged dashboard. A quantitative method was applied: metrics were correlated against lecturer scores, the system usability scale (SUS) was used to evaluate the dashboards, and the Fisher-Irwin test and t-test were applied to compare them. A qualitative method, usability testing, was also applied to evaluate dashboard usability. A simple data filtering technique improved metric quality for all metrics except the code review metrics. Regarding usability, the learning analytics dashboard achieved a relatively good and acceptable SUS score of 70.75. The findings reveal a significant difference between the correlation-based and randomly arranged dashboards on one indicator, whereas two other indicators show no significant differences.
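The SUS score of 70.75 reported above is on the standard 0-100 SUS scale. As a minimal sketch (not the authors' code), the conventional SUS scoring rule for a single respondent's ten 1-5 Likert answers can be written as:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Per the standard SUS scoring rule: odd-numbered items (positively
    worded) contribute (response - 1); even-numbered items (negatively
    worded) contribute (5 - response). The sum is scaled by 2.5 to 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# A neutral respondent (all 3s) scores exactly 50.
print(sus_score([3] * 10))  # 50.0
```

A study-level score such as 70.75 would then be the mean of the per-respondent scores.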

Original language: English
Pages (from-to): 171-180
Number of pages: 10
Journal: Global Journal of Engineering Education
Volume: 20
Issue number: 3
Publication status: Published - 1 Jan 2018

Keywords

  • Dashboard
  • Learning analytics
  • Metric
