The goal of this research is to provide a real-time lecture assistance system. The study focuses on on-site classrooms where teachers give lectures and a large number of students listen to the teachers' explanations, do exercises, and so on.

Design/methodology/approach

The proposed system collects teaching and learning activities from a teacher and students in real time via an e-learning system and an e-book system. The collected data are processed immediately to provide feedback to the teacher both before and during the lecture. For example, the preview achievement graph lets the teacher see which pages were previewed well and which were not. During the lecture, real-time analytics graphs are displayed on the teacher's PC, so the teacher can readily grasp the students' status and whether they are following the explanation.
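As an illustration of how a preview achievement graph could be computed, the sketch below aggregates per-page reading time from e-book logs and reports the fraction of students who previewed each page. The log schema and the 10-second "previewed" threshold are assumptions for this example; the paper does not specify either.

```python
from collections import defaultdict

# Assumed log record: (student_id, page_number, seconds_spent).
# A page counts as "previewed" by a student if the student spent at
# least PREVIEW_THRESHOLD_SEC seconds on it in total (assumption).
PREVIEW_THRESHOLD_SEC = 10

def preview_achievement(logs, n_students):
    """Return, per page, the fraction of students who previewed it."""
    seconds = defaultdict(lambda: defaultdict(float))
    for student, page, sec in logs:
        seconds[page][student] += sec
    return {
        page: sum(1 for s in per_student.values()
                  if s >= PREVIEW_THRESHOLD_SEC) / n_students
        for page, per_student in seconds.items()
    }

logs = [("s1", 1, 12), ("s2", 1, 3), ("s1", 2, 30), ("s2", 2, 15)]
print(preview_achievement(logs, n_students=2))  # {1: 0.5, 2: 1.0}
```

Plotting these per-page fractions as a bar chart yields the kind of graph the teacher can scan just before the lecture to spot poorly previewed pages.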

Findings

The authors first validated the effectiveness of each tool developed in this research through a case study. They then carried out a large-scale experiment using the real-time analytics graph to determine whether the proposed method could improve teaching and learning in on-site classes. The findings show that teachers could adjust the tempo of their lectures based on the real-time feedback, which in turn encouraged students to bookmark and highlight keywords and phrases.

Originality/value

Real-time learning analytics helps teachers and students improve teaching and learning in lectures. Teachers are encouraged to start exploring this new approach to enhance their lectures.

Citation

Shimada, A., Konomi, S. and Ogata, H. (2018), “Real-time learning analytics system for improving on-site lectures”, Interactive Technology and Smart Education, Vol. 15 No. 4, pp. 314-331. https://doi.org/10.1108/ITSE-05-2018-0026

Publisher: Emerald Publishing Limited. Copyright © Atsushi Shimada, Shin’ichi Konomi and Hiroaki Ogata, 2018.

License

This article is licensed under the Creative Commons Attribution 4.0 International License. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), provided the original publication and authors are properly credited. The complete terms of this license are available at http://creativecommons.org/licences/by/4.0/legalcode.

Introduction

Learning analytics has received a great deal of attention in the technology-enhanced learning research domain. The Society for Learning Analytics Research defines learning analytics as “the measurement, collection, analysis, and reporting of data about learners and their context for the purpose of understanding and optimizing learning and the environments in which it occurs.” In the early phases of learning analytics, researchers discussed strategies for monitoring a learning environment and collecting data. Virtual learning environments and learning management systems such as Blackboard (Bradford et al., 2007) and Moodle (Dougiamas and Taylor, 2003) have more recently made it possible to gather massive amounts of educational data (educational big data) in a simple manner. The most recent research has concentrated on methods for analyzing educational big data and on reporting, that is, how to deliver feedback to teachers and students on analysis findings.

Khalil et al. (2016) conducted a survey on learning analytics and classified the methods into seven categories:

Data mining techniques – predicting students’ academic achievement (Asif et al., 2017), identifying students at risk using clicker responses (Choi et al., 2018), and forecasting the relationship between studying time and learning performance (Jo et al., 2014);

Statistics and mathematics – building a grading system (Vogelsang and Ruppertz, 2015) and temporal discourse analysis of an online discussion (Lee and Tan, 2017);

Text mining, semantics, and linguistic analysis – summarization of students’ learning journals (Taniguchi et al., 2017) and understanding students’ self-reflections (Kovanović, 2018);

Visualization – a comprehensive overview of students’ learning from a learning management system (Poon et al., 2017), an awareness tool for teachers and learners (Martinez-Maldonado et al., 2015), and a learning analytics dashboard (Aljohani et al., 2018);

Network analysis – relationship analysis between technology use and cognitive presence (Kovanović, 2017), classification of student patterns into engagement levels (Khalil and Ebner, 2016), and a network analysis of LAK (Learning Analytics and Knowledge) conference papers (Dawson et al., 2014);

Qualitative analysis – an assessment of MOOC discussion forums (Ezen-Can et al., 2015) and an examination of instructor comments (Gardner et al., 2016); and

Gamification – a gamified e-assessment platform (Gañán et al., 2017), a gamified dashboard (Freitas et al., 2017), and a competency map (Grann and Bushway, 2014).

Learning analytics results can help teachers and students improve their teaching and learning. Obtaining feedback for optimizing the learning environment and the learners themselves is therefore an important issue in learning analytics. In terms of frequency, there are roughly three types of feedback loops: yearly, weekly, and real-time. Because analysis results are not immediately fed back to the on-site teachers and students who provide the educational/learning logs, the above-mentioned studies fall into the yearly or weekly feedback types. The reason is obvious: learning analytics is primarily carried out after classes, school terms, or school years, so feedback is delayed accordingly. However, real-time feedback can be very useful and beneficial for teachers and students in on-site classrooms.

Our research centers on how to provide useful information as feedback to on-site classrooms, even during lectures. The goal of this research is to implement real-time feedback, which has not been widely discussed for on-site educational environments. Our target is on-site classrooms where teachers give lectures and a large number of students listen to the teachers’ explanations, do exercises, and so on. In such large classrooms, it is difficult for teachers to grasp students’ situations and activities. We use both an e-learning system and an e-book system to collect learning activities in real time during lectures. We have created two major feedback systems. The first is useful to a teacher just before a lecture: the system generates summary reports of previews of the given materials and of quiz results. Using the preview achievement graph, the teacher can determine which pages were well previewed and which were not; furthermore, the teacher can see which quizzes were difficult for students and which pages should be covered in the lecture to help them. The second is a set of real-time analytics graphs that allow the teacher to control the lecture speed during the lecture. The system sequentially collects e-book logs generated by students’ operations and performs real-time analytics to determine how many students are following the teacher’s explanation. The remainder of this paper describes the details of our real-time feedback system and reports the experimental results.
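A minimal sketch of one way such a "following" metric could be computed, assuming the e-book logs expose each student's currently open page (the paper does not specify the exact computation, and the page tolerance here is an assumption):

```python
def following_ratio(student_pages, teacher_page, tolerance=1):
    """Fraction of students whose open page is within `tolerance`
    pages of the teacher's current page (assumed definition of
    "following the explanation")."""
    if not student_pages:
        return 0.0
    near = sum(1 for p in student_pages.values()
               if abs(p - teacher_page) <= tolerance)
    return near / len(student_pages)

# s1 and s2 are on or near the teacher's page; s3 has fallen behind.
pages = {"s1": 10, "s2": 9, "s3": 4}
print(following_ratio(pages, teacher_page=10))  # 2 of 3 students
```

Recomputing this ratio each time a page-turn event arrives, and plotting it over time, gives the teacher a live signal for when to slow down or switch to exercises.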

Review of the literature

In terms of frequency, there are roughly three types of feedback loops: yearly, weekly, and real-time. The assessment and improvement of education is a common example of a yearly (or term-by-term) feedback loop. Typically, students’ grades, examination results, class questionnaires, and so on are analyzed and evaluated. On the e-book system, the relationship between self-efficacy and learning behaviors was investigated; the findings indicate that students’ marker and annotation behaviors are related to their self-efficacy and to the intrinsic value of the learning materials. A yearly (or term-by-term) feedback loop can also include an analysis of students’ performance (Okubo et al., 2016) and prediction of students’ final grades. In the yearly loop, feedback results are delivered the following year; in other words, students and teachers do not receive feedback derived from analyzing their own learning logs. Feedback can also draw on learning logs from previous years.

A weekly feedback loop can recommend related materials based on students’ status, which can be determined by predicting academic performance using learning logs such as attendance reports and quiz results. For example, analytics of preview and review patterns (Oi et al., 2015) or learning behavior analytics (Yin et al., 2015) can be used to better understand students’ weekly performance. Text analytics technology provides summarized materials for preview and review (Shimada et al., 2015). Unlike a yearly feedback loop, the results of the analysis are fed directly back to the students and teachers who provide the learning logs.
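To make the weekly loop concrete, the sketch below shows one illustrative way a student's weekly status could be derived from attendance and quiz logs. This is not the authors' model; the status labels and threshold values are assumptions chosen for the example.

```python
def weekly_status(attendance_rate, quiz_avg,
                  att_min=0.8, quiz_min=0.6):
    """Coarse weekly status label for one student.

    attendance_rate and quiz_avg are fractions in [0, 1];
    the thresholds are illustrative assumptions."""
    if attendance_rate < att_min and quiz_avg < quiz_min:
        return "at risk"      # both signals low: intervene
    if attendance_rate < att_min or quiz_avg < quiz_min:
        return "watch"        # one signal low: monitor
    return "on track"

print(weekly_status(0.9, 0.75))  # on track
print(weekly_status(0.7, 0.5))   # at risk
```

A weekly feedback system could then recommend review materials to students in the "watch" and "at risk" groups, closing the loop with the same students whose logs were analyzed.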

Several works deal with real-time learning analytics. Minovic et al. proposed a visualization tool that lets teachers track students’ learning progress in real time while the students play an educational game. Piech et al. gathered tens of thousands of program codes and used a machine-learning approach to identify students’ “sink” states; students were given feedback just before they were about to enter such dangerous states. Freitas et al. discuss the efficacy of immediate feedback to students, particularly the impact of gamification in university education. Fu et al. also proposed real-time program-code analysis, providing a learning dashboard that captures student behavior in the classroom and identifies the various difficulties students encounter. Although these studies achieved real-time feedback, the analytics and feedback targeted activities in virtual learning environments. In contrast, the goal of our research is to create real-time feedback loops in which analysis results can be fed back to on-site students and teachers even during a lecture. A teacher can monitor what students are doing, such as whether they are following the explanation or doing something unrelated to the lecture, and, rather than talking nonstop, can control the speed of the lecture and/or devote more time to exercises.