*Correspondence: [email protected]
Higher Education Development Centre, University of Otago, Dunedin 9016, New Zealand

Abstract
Restrictions on physical gathering due to COVID-19 have compelled higher education
institutions to rapidly embrace digital technologies to support teaching and learning.
While logistically, the use of digital technologies offers an obvious solution, attention
must be given to these methods’ pedagogical appropriateness, mainly how students
engage and learn in the spaces supported by these technologies. In this context, we
explored the degree to which digital technologies have contributed to teaching and
learning practices over the past decade. The study employed a systematic review
using a newly developed tripartite model for conducting and presenting literature
review reports. The model approaches the literature review process systematically and
employs three phases for the critical examination of literature: description, synthesis,
and critique. The current review focused on student engagement across technolo-
gies that encompass social media, video, and collaborative learning technologies.
Relevant articles were obtained from the Scopus and Web of Science databases. Three
core themes were identified: there was no shared understanding of what constitutes
student engagement with learning technologies, there was a lack of explanation
concerning the contextual variation and modalities of student engagement across the
digital technologies, and self-reporting was the primary method of measuring student
engagement, rendering results as perceptual rather than behavioural. We argue that
using multiple datasets and different methodological approaches can provide further
insights into student engagement with digital technologies. This approach to measur-
ing engagement can substantiate findings and most likely provide additional insights
into students’ engagement with digital technologies.
Keywords: Student engagement, Digital technologies, Student learning
Introduction
The contemporary higher education sector faces many challenges, including the ability to
meet the learning needs of diverse students and to improve student retention (Kahu & Nelson, 2018;
Macfarlane & Tomlinson, 2017; Waldrop et al., 2019). These challenges are often linked to
how institutions design their learning environments and engage students in their learning
(Klem & Connell, 2004; Waldrop et al., 2019). Learning environments that support student
engagement can influence the learning process (Kahu, 2013) and lead to the development
Nkomo et al. Int J Educ Technol High Educ (2021) 18:34 Page 2 of 26
of student critical thinking skills (Carini et al., 2006) and support retention (Waldrop et al.,
2019; Wyatt, 2011).
Student engagement is a multifaceted and complex phenomenon to understand; however,
it is considered a critical factor in supporting student learning and development (Kahu,
2013). With higher education rapidly deploying various forms of digital technologies into
their learning environments, understanding how students engage with these technologies is
critical to the design of flexible and highly adaptive learning environments that can cater to
diverse student learning preferences. Also, understanding how students engage with digital
technologies can enable educators to train students with various digital literacy skills and
knowledge to support their learning.
Though the current generation of students entering university has a certain level of digital
literacy, such literacy might be limited to engaging with entertainment technologies and
games rather than using such skills to acquire vital knowledge and skills (Prior et al., 2016).
Since engagement is associated with academic achievement, researchers have identified
various strategies to support better engagement (Barnacle & Dall'Alba, 2017; Kahu & Nelson, 2018; Koranteng et al., 2019). However, student engagement means different things to different people (Kahu, 2013). Also, there is limited understanding of how
students engage with learning technologies and the extent to which engagement with such
technologies fosters enhanced learning outcomes.
This article surveys a wide range of studies published on student engagement with vari-
ous forms of learning technologies in the last decade (2010–2020). We conducted an in-
depth analysis of the conception, meaning and nature of student engagement with digital
technologies and how researchers measure, analyse, and present student engagement. The
review focused on student engagement with three digital technologies (LMS, Social Media,
and Lecture Capture). We believe this article will provide readers with an important refer-
ence point that provides insights into how students engage with digital technologies, and
ways to design learning environments that are agile and cater to diverse student learning
preferences. Four guiding questions were utilised to frame the research area for review,
addressing the following aspects:

a. Tools
b. Algorithms
c. Scales
d. Methods
Methodology
This systematic review employed the systematic and tripartite model, which comprises a systematic approach to analysing and presenting the literature (Daniel & Harland,
2017). The model incorporates practical tools and strategies on how to write credible and
critical reports. The model consists of three essential phases. The first phase of the model is
deciding on articles to read, compiling summary abstracts and validating these with a men-
tor or peer. This stage is very similar to the procedure used in systematic literature reviews
(Higgins & Green, 2008; Liberati et al., 2009).
In the first stage of the model, an investigation area is identified, and the researcher estab-
lishes the context and purpose of the review. The researcher further frames a research area
for review and develops a search strategy, with explicit inclusion and exclusion criteria for
selecting materials. This process should yield all the published material on a topic based on
the criteria of interest. Once the purpose of the review is established, a search strategy is
developed. The strategy involves the formulation of concrete search terms. It is essential to
formulate relevant terms since this will determine the quality of resources identified (Boell
& Cecez-Kecmanovic, 2015). The second part of the model is referred to as the ’tripartite
approach’; it consists of three parts (description, synthesis & critique) and presents a model
that combines the two stages as a structured and systematic guide. Tripartite I (description):
In this stage, the systematic review presents a descriptive summary of the critical issues
identified in the literature. This process provides the reader with an overview of develop-
ments in the field, the main areas of debate and the outstanding research questions. This is
followed by the presentation of identified themes that have been carefully justified.
Tripartite II (synthesis): In the synthesis stage, the literature review goes beyond a
description of what is published; it includes the synthesis and articulation of relationships
between various published literature bodies. In this stage, the core focus is to synthesise
ideas. This involves the extraction of the most important ideas or themes and a process of
comparing and contrasting these to identify areas of similarity, difference and any contro-
versies. This allows the researcher to clarify and resolve inconsistencies in thinking in the
literature, thereby providing the best chance to make an original contribution to knowl-
edge. Through synthesis, the researcher ensures that the particular problem of interest can
be contextualised within the subject’s historical context.
Tripartite III (critique): In the third part, the researcher reflects on the synthesis of the
main ideas identified at the second stage to develop a critical view of the work reviewed in
light of claims and evidence available. After a thorough description and summary, a critical
thinking and judgment level can be applied in the review and presentation. Critical engage-
ment requires the development of particular skills and strategies, and it mainly implies
having the ability to examine claims against alternative evidence or views. It also requires
a questioning mind and an openness to alternative views or evidence from other sources.
The critique includes a positive dimension as the researcher aims to provide new ideas and
alternatives.
Fig. 1 The systematic and tripartite model (Daniel & Harland, 2017)
Scopus: 772
Web of Science: 463
Total: 1,235

The Scopus database was used since it indexes a wide range of peer-reviewed journals,
books, and conference proceedings. The Web of Science database was also used since it
provides an extensive set of world-class research literature from a rigorously selected set
of academic journals that allows for the in-depth exploration of specialized sub-fields
within an academic or scientific discipline (Li et al., 2018).
Search strings
The final search strategy was refined to include three search strings (Search String 1–3
(SS1-SS3)). These were established to obtain the relevant articles to review. The strings
were customized to meet the syntax of each database:
SS1: ("student engagement" OR "learner engagement") AND (LMS OR "Learning man-
agement systems") AND ("higher education" OR "tertiary education" OR university).
SS2: ("student engagement" OR "learner engagement") AND ("social media") AND
("higher education" OR "tertiary education" OR university).
SS3: ("student engagement" OR "learner engagement") AND ("lecture capture" OR
"recorded lecture*") AND ("higher education" OR "tertiary education" OR university).
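As a rough illustration, search strings like these can be composed programmatically so that the shared clauses stay consistent while only the technology clause and database syntax vary. The field tags below (e.g. `TITLE-ABS-KEY` in the Scopus style) are illustrative assumptions, not necessarily the exact syntax used in the review:

```python
# Sketch: composing the three search strings (SS1-SS3) from shared clauses.
# Field tags such as TITLE-ABS-KEY are illustrative assumptions for how a
# database-specific wrapper might be applied.

ENGAGEMENT = '("student engagement" OR "learner engagement")'
CONTEXT = '("higher education" OR "tertiary education" OR university)'

TECHNOLOGIES = {
    "SS1": '(LMS OR "learning management systems")',
    "SS2": '("social media")',
    "SS3": '("lecture capture" OR "recorded lecture*")',
}

def build_string(tech_clause: str, field_tag: str = "") -> str:
    """Combine the three clauses; optionally wrap the query in a field tag."""
    query = f"{ENGAGEMENT} AND {tech_clause} AND {CONTEXT}"
    return f"{field_tag}({query})" if field_tag else query

# Plain form (as listed above) and a Scopus-flavoured form:
for key, tech in TECHNOLOGIES.items():
    print(key, build_string(tech))
    print(key, build_string(tech, field_tag="TITLE-ABS-KEY"))
```

Keeping the engagement and context clauses in one place ensures the three strings differ only in the technology of interest, which mirrors how the strings above are structured.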
After running the search strings, the papers’ abstracts were identified, read, and vali-
dated against the inclusion and exclusion criteria. The next step was to compile the full
list of the articles, which were then systematically reviewed following the systematic and
the tripartite model. In applying the systematic and tripartite model, the study utilised
the within-study analysis, which involves the analysis of an entire article as well as the
between-study analysis, which consists of identifying the similarities and dissimilarities in the key findings from other literature (Daniel & Harland, 2017; Onwuegbuzie &
Weinbaum, 2017).
The results of the search strings, preliminary selection and final selection are sum-
marised in Table 2. The table shows the number of articles each string retrieved from
the respective databases before validating them against the inclusion and exclusion
criteria (N = 567). After validating against the search criteria and reading abstracts,
a preliminary selection of (N = 189) was obtained. The next step was to then read
through the articles for further validation and verification. This was done until
the studies’ findings became repetitive, which occurred after 30 articles had been
reviewed. Therefore 30 articles made the final selection.
Findings
Findings from this review showed substantial research on student engagement. How-
ever, there is no consensus on what constitutes student engagement (Baron & Corbin,
2012; Harris, 2008; Kahu, 2013; Kahu & Nelson, 2018). The lack of consensus makes
it difficult to ascertain the utility of engagement and its value in enhancing students’
learning experience and learning outcomes. The variation in the conceptions of student engagement has led to various discourses on different dimensions of student engagement (e.g., behavioural, social, and cognitive); though distinct from each other, these dimensions of engagement are often used interchangeably (Burch et al., 2015; Christenson et al., 2012; Fredricks et al., 2016), leading to inconsistency in measuring student engagement. Also, the lack of a shared conceptualisation of engagement makes it difficult to identify the semantic proximity between engagement and
related concepts such as motivation. Alexander (2017) states, "when researchers do
not explain their definitions of key constructs, they introduce a degree of concep-
tual ambiguity. And when the process of communicating theory or research starts
with conceptual ambiguity, theory integration is far less likely to result." (p. 347).
On the contrary, Christenson et al. (2012) view the lack of consensus as an opportu-
nity to view engagement from different perspectives, enriching the concept’s schol-
arly nature. A summary of the various conceptualisations of engagement is shown in
Table 3.
From the various definitions in Table 3, we extracted the fifty most frequently
used terms and presented them visually using a Word Cloud. The Word Cloud in
Fig. 2 shows the various terms used to conceptualise the different dimensions of
student engagement.
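The term-frequency step behind such a Word Cloud can be sketched with a few lines of Python. The `definitions` list below is a small illustrative sample drawn from Table 3, and the stopword set is an assumption, not the review's actual corpus or preprocessing:

```python
# Sketch: counting the most frequent terms across engagement definitions,
# the kind of tally a word cloud visualises. Sample data and stopwords are
# illustrative assumptions.
from collections import Counter
import re

definitions = [
    "the amount of physical and psychological energy that the student "
    "devotes to the academic experience",
    "students' involvement with activities and conditions likely to "
    "generate high-quality learning",
    "interaction between the time, effort and other relevant resources "
    "invested by both students and their institutions",
]

STOPWORDS = {"the", "and", "of", "that", "to", "with", "by", "their", "other", "both"}

def top_terms(texts, n=50):
    """Return the n most frequent non-stopword terms across the texts."""
    words = re.findall(r"[a-z]+(?:-[a-z]+)*", " ".join(texts).lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return counts.most_common(n)

print(top_terms(definitions, n=10))
```

The resulting (term, count) pairs would then be passed to a plotting or word-cloud library, with font size proportional to frequency.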
Table 2 Search results per search string: articles retrieved, preliminary selection, and final selection

String   Scopus retrieved   Scopus preliminary   WoS retrieved   WoS preliminary   Final
SS1      95                 31                   36              14                15
SS2      106                49                   73              18                8
SS3      123                42                   134             35                7
Total    324                122                  243             67                30
Table 3 Summary: conceptualisation of engagement

(Astin, 1984)
Conceptualisation: The amount of physical and psychological energy that the student devotes to the academic experience.
Key constructs: Physical energy, psychological energy, and academic experience.
Key assumptions: Engagement refers to the investment of physical and psychological energy; engagement occurs along a continuum.
Examples: Devotes considerable energy to studying; spends much time on campus; participates actively.

(ACER, 2019)
Conceptualisation: Students' involvement with activities and conditions likely to generate high-quality learning.
Key constructs: Student self-initiated involvement.
Key assumptions: Academic challenge; active learning; student and staff interactions; enriching educational experiences; supportive learning environment; work-integrated learning.
Examples: Students actively interact with their learning environments in a manner that leads to learning.

(Trowler, 2010)
Conceptualisation: Interaction between the time, effort and other relevant resources invested by both students and their institutions.
Key constructs: Student and institution; time, effort and resources.
Key assumptions: Student and the institution are interrelated components.
Examples: How the institution deploys its resources and other learning opportunities to get students to utilise their time and effort in these activities actively.

(Maguire et al., 2017)
Conceptualisation: Engagement is a product of the broader social and cultural context and not just the student's attribute.
Key constructs: Social and cultural context.
Key assumptions: Students react to surroundings.
Examples: Students react to the environment that is provided by the institution, based on how the institution adapts its resources and teaching strategies to encourage participation.

(Fredricks et al., 2004)
Conceptualisation: Malleable meta-construct that is presumed to be based on the individual and the context.
Key constructs: Individual and context-based.
Key assumptions: It can be changed; results from a variety of antecedents in the context, both social and academic, at both the school and classroom levels.
Examples: Students engage based on what is availed to them.

(Kahu, 2013)
Conceptualisation: Four views: behavioural, psychological, sociocultural, and holistic.
Key constructs: Interrelations between student and institution.
Key assumptions: Students and the university play a critical role.
Examples: The university and the student play a crucial role in student engagement; the characteristics of both, and the relationship between the two, are fundamental.

(Filsecker & Kerres, 2014)
Conceptualisation: The sustained effort invested by an individual to manage and implement his or her intention of pursuing a previously chosen goal; entails cognitive, emotional, and behavioural components that reflect the individual's volitional/post-decisional state.
Key constructs: Sustained effort; self-imposed objectives.
Key assumptions: The student is crucial in their engagement.
Examples: The students decide what to engage in.

(Paulsen & McCormick, 2020)
Conceptualisation: Student learning relates to the time and effort students invest in their studies; students benefit from environments that are collegiate and support their success; institutions and faculty can facilitate effective educational practices in and out of the classroom.
Key constructs: Time and effort from students; institutions set up effective educational practices to promote student success.
Key assumptions: The student invests time and effort into their studies in a conducive environment set up by the institution.
Examples: Students are responsible for investing in their engagement; the institution plays a part in providing the necessary environment for students to engage in.

(Lawson & Lawson, 2013)
Conceptualisation: Engagement is conceptualised as a dynamic system of social and psychological constructs as well as a synergistic process.
Key constructs: Dynamic phenomenon; psychological and social; synergetic process.
Key assumptions: Requires social and psychological investment of energy.
Examples: Collaboration; motivation to take part in the learning process.
Student engagement is commonly conceptualised along three dimensions: behavioural, cognitive, and emotional (Fredricks et al., 2016; Harris, 2008; Schmidt et al., 2018). The three dimensions are shown in Fig. 3 below.
These dimensions are interrelated and contribute to a student's engagement.
Although each of the three aspects of engagement can be considered distinct, there
is considerable overlap. For example, Filsecker and Kerres (2014) indicated that the
behavioural part of the engagement that includes exerting effort and attention could be
regarded as cognitive engagement. There are other engagement dimensions identified in
the literature. Harris (2008) discussed academic engagement, specific to learning tasks,
to move away from the general behavioural engagement that covers non-academic activ-
ities. Linnenbrink-Garcia et al. (2011) added social-behavioural engagement as a con-
struct related to students affect and behaviour in collaborative group work (Fredricks
et al., 2016). Reeve and Tseng (2011) propose the addition of agentic engagement to
account for how students actively and constructively contribute to the learning environ-
ment. Agentic engagement factors in the student’s ability to purposefully and proactively
enhance the learning and teaching process. However, instead of a new dimension, it can
be viewed as the union between the cognitive and behavioural dimensions.
Combining the three dimensions can provide a more in-depth description of students'
engagement (Fredricks et al., 2004). Therefore, it is important to measure all the
dimensions when measuring student engagement, because focusing on only one
dimension can limit our understanding: behavioural, cognitive and emotional
engagement interrelate in a volatile manner within individual students (Fredricks
et al., 2004).
Critiques of behavioural engagement question whether participation in tasks can nec-
essarily lead to desirable learning outcomes. For example, students in the class can focus
on the instructor, which would be noted as engagement; however, the student’s attention
could be elsewhere (Linnenbrink & Pintrich, 2003). In other words, a student can be
behaviourally engaged but not cognitively. Harris (2008) asserted that cognitive engage-
ment seems to be the most linked to learning and that a student’s physical participation
does not necessarily assure cognitive participation. This is echoed by Linnenbrink and
Pintrich (2003), who suggested that teachers need to engage students cognitively, not
just behaviourally. This entails that instructors need to ensure that students deeply, criti-
cally and creatively think about the content being learned and reflect on what they know
and do not know and utilise different learning strategies to help their understanding of
the content.
Emotional engagement has also been contested: it is questionable whether students
who "feel good" about school necessarily learn (Skinner & Belmont, 1993). For example,
students being enthusiastic in class does not necessarily translate to better learning outcomes.
Further, although research has claimed cognitive engagement to be the most impor-
tant type of engagement, emotional and behavioural dimensions are seen as dimen-
sions that may be required to enable cognitive engagement (Harris, 2008). For example,
students need to be involved in the learning activity and, based on how they feel, then
decide to engage cognitively. This goes further in underlining the importance and rela-
tionship between these three dimensions of engagement.
Students may also engage cognitively in LMS through problem-based learning activities
and through self-regulated learning, accessing resources at their own pace.
However, effective students’ engagement with LMS is dependent on how students and
instructors utilise LMS. Klobas and McGill (2010) investigated the role of student and
instructor involvement in LMS success. They found that when instructors provide regular guidance to students in LMSs, students are likely to gain improved effectiveness and
productivity when studying.
Little-Wiles and Naimi (2011) looked at what educators can do to ensure students are
fully engaged when interacting in LMS. They found that students use LMSs to create
self-awareness of learning, e.g., checking one’s progress and requirements of a course
and communicating with their peers.
A behaviourist perspective tends to better explain the various forms of student engagement in LMSs. For instance, this can be seen from students' navigational
pathways in LMSs (observable change in behaviour), depending on how the LMS is set
up by instructors (the stimuli). Corrigan et al. (2015) explored the impact of present-
ing students with their engagement data in a VLE to determine trends linked to student
attainment. The study found students who received notifications on their engagement
with the VLE, compared to non-participants in the various courses, showed improve-
ment in their grades.
Though studies look at engagement in LMS from the behavioural perspective, Henrie
et al. (2018) scrutinized the relationship between student activity log data in LMS and
self-reported student engagement survey scores, intending to understand whether or
not LMS log data could be used as a proxy measure for students' emotional and cognitive
engagement. The study did not find any significant relationship between the log
data and students' self-reported emotional and cognitive engagement. This underscores
the gap between observed and self-reported states of engagement.
LMSs provide built-in tools to track engagement, such as the Moodle Engagement
Analytics Plugin (MEAP) and Blackboard Analytics. The MEAP monitors student
behaviour on three tasks: forum activity (whether students are participating in the forum),
login activity (the duration, frequency, and time of login) and assessment activity (whether
submissions are made on time) (Liu et al., 2015; Luna et al., 2017; Yassine et al., 2016).
However, a drawback of MEAP is that lecturers need to enter thresholds for each of the
three entities, which some lecturers might see as an issue. Similarly, Blackboard Analytics
monitors students' engagement patterns, evaluates learning outcomes, and assesses the use
and adoption of online learning tools (Jones, 2012). Unlike MEAP, Blackboard Analytics
provides a more holistic approach to analysing students' data by including data such
as demographic data and previous course data (Whitmer, 2015). However, this requires
integration with the institution's student information system, which may raise privacy
concerns.
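The threshold-based logic described above can be illustrated with a minimal sketch. The activity fields, threshold values, and flagging rule here are assumptions for illustration, not MEAP's actual schema or configuration:

```python
# Minimal sketch of MEAP-style threshold flagging. Field names and threshold
# values are illustrative assumptions, not the plugin's real schema.
from dataclasses import dataclass

@dataclass
class WeekActivity:
    forum_posts: int          # forum activity
    logins: int               # login activity
    on_time_submissions: int  # assessment activity
    total_submissions: int

# Instructor-supplied thresholds: the manual step the text identifies as a drawback.
THRESHOLDS = {"forum_posts": 2, "logins": 3, "submission_ratio": 1.0}

def at_risk(week: WeekActivity) -> bool:
    """Flag a student if any of the three indicators falls below its threshold."""
    ratio = (week.on_time_submissions / week.total_submissions
             if week.total_submissions else 1.0)
    return (week.forum_posts < THRESHOLDS["forum_posts"]
            or week.logins < THRESHOLDS["logins"]
            or ratio < THRESHOLDS["submission_ratio"])

# A student who logs in often but never posts would still be flagged:
print(at_risk(WeekActivity(forum_posts=0, logins=5,
                           on_time_submissions=1, total_submissions=1)))
```

The sketch makes the drawback concrete: the `THRESHOLDS` values are arbitrary and must be chosen per course by the lecturer, so two lecturers can flag very different students from the same activity data.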
The instructor's role in facilitating engagement and learning within an LMS is critical to student success (Baragash & Al-Samarraie, 2018; Barua et al., 2018; Klobas & McGill, 2010; Little-Wiles & Naimi, 2011). In order for LMS implementation to be successful, both students
and instructors need to play their part in the process. For instance, the instructor’s
involvement in LMS course design can benefit students by integrating interactive course
designs that allow collaboration and communication (Swart, 2015; Wang, 2017).
Findings from this review suggest that engagement in LMS is predominantly behav-
ioural oriented, where students are expected to respond to a stimulus (the LMS envi-
ronment) set up by an instructor (Barua et al., 2018; Little-Wiles & Naimi, 2011;
Venugopal-Wairagade, 2016). Most of the LMS actions, such as logging on, posting on
forums, accessing learning resources, and assignments, are behavioural traits and would
mostly favour the behavioural dimension. Several studies have revealed that students’
engagement in LMS can be influenced by demographic characteristics such as age,
digital literacy and educational background (Baragash & Al-Samarraie, 2018; Klobas &
McGill, 2010; Swart, 2017; Venugopal-Wairagade, 2016). For example, although most
students are technologically astute, their digital literacy may not extend to digital learning environments (Prior et al., 2016). Therefore, students' different experiences can lead
them to engage differently with LMS.
Most studies have approached the measurement of engagement with LMS through
student self-report measures, such as questionnaires (Barua et al., 2018; Klobas &
McGill, 2010; Little-Wiles et al., 2010; Venugopal & Jain, 2015). However, most of these
questionnaires span disciplines and are generally produced as one-size-fits-all
instruments. They are unlikely to genuinely capture engagement, as teaching and learning
practices are likely to differ across disciplines.
LMSs generate data based on user actions. The use of analytics for analysing these
data may be more insightful (see Liu et al., 2015; Luna et al., 2017; Messias et al., 2015;
Yassine et al., 2016). Some studies have utilised questionnaires and system logs to meas-
ure engagement (see Baragash & Al-Samarraie, 2018; Henrie et al., 2018; Wang, 2017).
These studies found a positive relationship between engagement with LMS and achievement (see Baragash & Al-Samarraie, 2018; Wang, 2017). However, Henrie et al. (2018),
focusing on emotional and cognitive engagement, found logs insufficient as a proxy for
engagement. Studies utilising logs independently have also found them indicative of
how engagement affects achievement (see Swart, 2015; Umer et al., 2018). Research sug-
gests discussion forums, frequency of logins as well as submission activities are the most
common data for analysing engagement in LMS (Henrie et al., 2018; Liu et al., 2015;
Luna et al., 2017; Messias et al., 2015; Swart, 2015; Venugopal & Jain, 2015; Yassine et al.,
2016). Wang (2017) indicates that behavioural engagement data are relatively easy to
measure and collect in LMSs. Moreover, students' initial engagement with an LMS is highly
dependent on how the instructor sets it up; therefore, measuring behaviours alone is
most likely the best way to understand engagement with LMS.
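The log-only analyses cited above can be sketched as a simple correlation between activity counts and achievement. The data and variable names below are made up for illustration; real studies would control for course, cohort, and prior attainment:

```python
# Sketch: relating simple LMS log counts to final grades with a Pearson
# correlation, the kind of log-only analysis the studies above rely on.
# The sample data are fabricated for illustration only.
import math

logins = [12, 30, 25, 8, 40, 18]   # per-student login counts (illustrative)
grades = [55, 78, 70, 50, 85, 62]  # corresponding final grades (illustrative)

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(logins, grades)
print(f"r = {r:.2f}")  # strong positive association in this toy sample
```

A high r on such data shows association only; as the review notes elsewhere, it cannot establish that logging in causes achievement, nor does it capture emotional or cognitive engagement.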
Findings also revealed inconsistencies in the measurement of engagement dimensions. The factors used in some studies do not necessarily represent well the dimensions of engagement they claim to measure. Furthermore, studies can
define and measure the same dimensions differently, thus creating conceptual overlap. In some cases, the methods used to measure student engagement may not genuinely
capture it. For example, observation studies
where the students know they are being observed can suffer from the Hawthorne effect,
as students change their behaviour because they know they are being watched. The use of
survey instruments alone also limits our understanding of student engagement, as such
instruments are limited to collecting perception data.
Furthermore, the use of analytics without context can further limit our understanding
of student engagement. For example, measuring clickstream data can be inaccurate as a
click does not necessarily equate to engagement. A click to download a document may
not mean the same engagement as posting a discussion on the forum. Factors such as
prior knowledge, technical ability, and students' motivation to learn, among others, can
influence the level of student engagement within LMS, as well as students' intentions to
engage with an LMS; however, as LMS engagement is mostly measured in behaviours,
most studies do not include these factors. Moreover, though there is a general belief that
most students are digitally literate, it is crucial to assess the level of digital literacy of stu-
dents as students may not engage with LMS due to digital illiteracy. The failure, in some
instances, to account for students across various disciplines when measuring engage-
ment can further limit our understanding. For example, students from some fields can
be more inclined to engage in specific ways with LMS technologies than other students
based on their domains of study. This can be due to students having different task-based
interactions with LMS, making their engagement vary. In terms of datasets, most studies
utilise convenience samples, typically small, ranging from single courses with
a few students and one instructor to multiple courses (see Corrigan
et al., 2015; Klobas & McGill, 2010; Little-Wiles & Naimi, 2011; Swart, 2017; Umer et al.,
2018). Therefore, it can be difficult to infer causation from cross-sectional data. The
results of some studies are therefore applicable only to specific cohorts.
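The point that a click does not equate to engagement can be made concrete by weighting clickstream events by action type. The event types and weights below are illustrative assumptions, not a validated scheme:

```python
# Sketch: weighting clickstream events by action type, since a download click
# and a forum post do not represent the same engagement. Event types and
# weights are illustrative assumptions.
WEIGHTS = {
    "login": 1,
    "resource_download": 1,
    "forum_view": 2,
    "forum_post": 5,        # active contribution weighted above passive clicks
    "assignment_submit": 5,
}

def engagement_score(events):
    """Sum weighted events; unknown event types count as plain clicks (weight 1)."""
    return sum(WEIGHTS.get(e, 1) for e in events)

passive = ["login", "resource_download", "resource_download", "forum_view"]
active = ["login", "forum_post", "assignment_submit"]

print(engagement_score(passive), engagement_score(active))  # 5 11
```

Even this simple weighting separates a passively clicking student from an actively contributing one, whereas a raw click count would rate the two sessions almost identically; the weights themselves, however, still require contextual justification.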
Further, studies that make comparisons without utilising control and treatment groups
can have less reliable results. In general, most studies in LMS
measure the behaviours of students to infer student engagement. This, however, raises
the question of whether these measurements are accurate reflections of engagement, as
they do not include the other dimensions of student engagement.
Similarly, Williams and Whiting (2016) explored students’ use of Twitter in enhancing
engagement. The study noted, "some define engagement as the frequency with which
students participate in activities that represent effective educational practices and con-
ceive of it as a pattern of involvement in a variety of activities and interactions both in
and out of the classroom and throughout a student’s college career. Additionally, the
phrase "student engagement" has come to refer to how involved or interested students
appear to be in learning and how connected they are to their classes, institutions, and
each other" (Williams & Whiting, 2016, p. 312). The study indicated students felt more
engaged when Twitter and the LMS were used. The use of Twitter also had a positive
relationship with students' perceptions of engagement in the marketing course. Senior
students were found to use the LMS more frequently than their junior counterparts,
with no difference in the use of Twitter. Furthermore, there was no difference between
junior and senior students' engagement levels.
Alshuaibi et al. (2018) stated that social media could enhance students' cognitive
engagement in learning, as they found the cognitive dimension had a mediating role in the
relationship between social media and academic performance. Fagioli et al. (2015)
analysed the use of a social media site and learning outcomes in community colleges.
The study found a relationship between social media use and academic outcomes. Stu-
dents who are actively engaged in social media tend to perform better in their learning
outcomes than inactive students. Furthermore, the posted comments and discussion’s
quality and relevance was a significant factor in sustaining the application’s continued
use. Suggesting students find value in meaningful peer discussions.
Saunders and Gale (2012) noted that students are less engaged in large lecture halls
and hardly ask questions. Ellis (2015) suggested that the use of Padlet, a social
media tool that allows students to post comments on an online wall, can enhance the
learning experience of students as they engage with materials posted by other
students. Tiernan (2014) found Twitter enabled students to contribute to discussions
in a less intimidating manner and enabled engagement with peers and course content
(Soluk & Buddle, 2015). Ally (2012) further indicated the ability of social media to
promote engagement through collaboration and communication, similar to Ellis (2015),
Soluk and Buddle (2015), and Tiernan (2014). Ally (2012) found most participants
embraced Twitter as an enhancer of collaboration and communication in the classroom.
The study further noted increased class participation levels, attentiveness, and
engagement compared to previous years, where traditional means were used to encourage
interaction. This suggests students find the use of social media for interaction
fulfilling.
When discussions inherent in social media are of good quality and relevant, this
leads to sustained interest in social media use, which can positively influence
student learning outcomes (Fagioli et al., 2015). Lack of confidence and boredom are
issues students face in traditional lectures. Social media has been used to avert
this by allowing students to post questions for discussion with their peers in a
non-intimidating manner (Ellis, 2015; Tiernan, 2014).
As with LMS studies, most of the studies used quantitative approaches relating to
usage, such as the number and frequency of posts, mostly associated with behaviour
(Fagioli et al., 2015; Junco, 2012; Tiernan, 2014; Williams & Whiting, 2016). The use
of self-reported measures such as questionnaires to examine whether social media
enhances student engagement and learning experiences is prevalent (see Junco, 2012;
Tiernan, 2014; Williams & Whiting, 2016). Further, statistical methods such as
t-tests, regression and descriptive statistics were used for analysis (Alshuaibi
et al., 2018; Ellis, 2015; Junco, 2012). Some educators have made social media
voluntary in their classes (Tiernan, 2014; Williams & Whiting, 2016), while others
have required graded activities to be conducted on social media (Soluk & Buddle,
2015). Similar to studies measuring engagement in LMS, behaviours from self-reported
measures and quantitative analysis are utilised. However, as a tool that supports
interaction, social media can facilitate other dimensions of engagement, such as
emotional and cognitive engagement. Therefore, other measures such as content,
thematic and social network analysis can provide more insights into student
engagement with social media.
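To illustrate the kind of social network analysis the reviewed studies largely omit, a minimal sketch might derive a behavioural engagement proxy from forum interactions. The reply records, student names, and record format below are entirely hypothetical assumptions for illustration, not data from any reviewed study:

```python
from collections import defaultdict

# Hypothetical reply records from a course discussion forum:
# (author, replied_to) pairs. Names and data are illustrative only.
replies = [
    ("ana", "ben"), ("ben", "ana"), ("carl", "ana"),
    ("dina", "ana"), ("carl", "ben"), ("ana", "dina"),
]

# Build an undirected interaction graph as adjacency sets.
graph = defaultdict(set)
for src, dst in replies:
    graph[src].add(dst)
    graph[dst].add(src)

# Degree centrality: the fraction of peers each student interacts with.
# On this behavioural proxy, students with higher centrality are more
# socially engaged in the discussion network.
n = len(graph)
centrality = {s: len(peers) / (n - 1) for s, peers in graph.items()}

for student, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{student}: {score:.2f}")
```

Such structural measures could complement, rather than replace, self-reported engagement scores.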
other obligations. Most lecturers were undecided about the value of recorded
lectures, and differences in lecturer engagement were noted across disciplines:
lecturers in business and social science were more positive towards the recorded
lecture system than lecturers in the engineering and science disciplines. The
findings from Dona et al. (2017) question the idea of a "one-size-fits-all" lecture
recording system, as differences in lecturers’ styles and approaches to teaching
across disciplines are noted, indicating that not all students may engage with
lecture recordings in the same way.
Further, Trenholm et al. (2019) investigated undergraduate mathematics students’
cognitive engagement with recorded lectures. The study approached cognitive
engagement via two scales measuring learning approaches: surface and deep. The study
found that the combination of a decline in lecture attendance and reliance on
recorded lecture videos was associated with an increase in surface approaches to
learning. Edwards and Clinton (2019) examined the usage and impact of introducing
lecture capture in a Bachelor of Science programme course. The study found the impact
of lecture recordings was negative, as students’ live lecture attendance dropped.
They illustrated the drawbacks of over-reliance on lecture capture as a replacement
for attending lectures, as attendance is seen as an engagement indicator. However,
viewing the lecture recordings had no significant association with attainment.
Moreover, the study indicates lecture capture availability will most likely
negatively affect less engaged students, who might utilise more of a surface learning
approach.
Studies suggest a need to address best practices in using recorded lecture videos,
not only in mathematics but possibly in other fields as well (Edwards & Clinton,
2019; Trenholm et al., 2019). The instructional design of lecture recordings can
influence best practices for utilising them. Costley et al. (2017) examined the
relationship between instructional design and student engagement in video lectures.
The study outlines five instructional design indicators that can lead to the watching
and completion of videos: utilising the medium, establishing netiquette, establishing
time parameters, setting the curriculum, and design methods. These five elements aim
to provide students with a clear pathway to success in the online learning
environment. Instructional design is present when students view these elements as
enhancing their engagement and learning. Findings suggested that the videos’ design
does influence students’ engagement, and therefore instructors should pay attention
to how their courses are designed. Moreover, Seifert (2019) aimed to identify
students’ learning preferences and their attitudes towards recorded lectures, and how
these affected lecture attendance. The findings indicated students had a positive
experience that aided them in understanding the learning materials, as the lecture
recordings met students’ various needs.
Ebbert and Dutke (2020) identified five clusters of students based on how they
utilised lecture recordings. The study outlines behaviour variables representing
lecture capture usage frequency, selective and repetitive watching, live lecture
attendance, social context and location. These variables can represent the
behavioural and cognitive dimensions of engagement. The clusters, in descending order
of size, were frequent repetition, selective repetition, frequent consultation,
selective consultation and increased absenteeism. These clusters indicate students
engage differently with lecture recordings; therefore, strategies should be generated
to support the different ways students engage with lecture
Nkomo et al. Int J Educ Technol High Educ (2021) 18:34 Page 20 of 26
recordings. Lecture recordings are a flexible platform concerning how students engage
with them, enabling students to utilise them according to their preferences. Although
most studies address the behavioural dimension, lecture recordings can facilitate
other dimensions of engagement. The common use of self-reported measures alone also
limits understanding of how students engage with lecture recordings, as these capture
only student perceptions.
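To illustrate the style of cluster analysis Ebbert and Dutke (2020) describe, a minimal k-means sketch can group students by recording-use behaviour. The feature choices and per-student values below are invented for illustration and are not their data or method:

```python
import random

# Hypothetical per-student usage features (illustrative values only):
# (views per week, proportion of repeated watching, live attendance rate)
students = {
    "s1": (5.0, 0.8, 0.2),   # frequent, repetitive, rarely attends live
    "s2": (4.5, 0.7, 0.3),
    "s3": (1.0, 0.1, 0.9),   # occasional consultation, attends live
    "s4": (0.8, 0.2, 0.95),
    "s5": (0.2, 0.0, 0.1),   # low use and low attendance
}

def dist(a, b):
    """Squared Euclidean distance between two feature tuples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's algorithm: assign points to nearest centroid,
    then recompute centroids as cluster means."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: dist(p, centroids[i]))].append(p)
        centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids

centroids = kmeans(list(students.values()), k=3)
for name, feats in students.items():
    label = min(range(3), key=lambda i: dist(feats, centroids[i]))
    print(name, "-> cluster", label)
```

With well-separated behaviour profiles, the frequent repetitive watchers land in a different cluster from the selective live attenders, mirroring the idea of distinct engagement styles.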
most studies do not consider design. Studies that utilised different approaches, such
as analytics, only analyse behavioural data such as the total number of views.
Although lecture capture views go through peaks and declines, there is no general
understanding of why this is the case.
Furthermore, the methods used to infer that videos have been watched rely somewhat on
assumptions, as it is not guaranteed that students watch the recordings they play.
The use of courses from single instructors yields results that leave pending
questions and are challenging to generalise. The emphasis on measuring the impact on
students’ live lecture attendance has left a limited understanding of students’
engagement with lecture recordings. Furthermore, more insights can be obtained by
utilising different datasets, including trace data. This can help identify other
dimensions of student engagement and thereby better understand student engagement
with lecture recordings.
digital learning environments. We argue that a more holistic approach that
incorporates participants from more diverse domains may yield a better understanding
of how students with different demographic characteristics studying different
subjects engage with digital technologies.
The cohort of students entering higher education is digitally savvy, with different
levels of technology literacy; it is therefore essential to incorporate demographics
for richer insights into how students engage with learning technologies.
In the studies reviewed, most approaches to the measurement of student engagement
rely on self-reported measures. This is a concern, as some students may not
accurately recall their actions before self-reporting. Further, the use of a single
source, typically questionnaires, makes much of the literature liable to
single-source bias. Some studies have used learning analytics approaches that are
less intrusive than self-reported measures; however, these have mostly been conducted
over short timeframes, making durable patterns difficult to establish.
Studies can look to use larger samples and account for more variables. As noted by
Helal et al. (2018), institutions may look to undertake the complex task of
understanding predictors of student academic performance, which may be affected by
numerous factors such as economic, social, demographic, cultural and academic
background. Student engagement is similar in the complexity of variables that may
affect it. One may even argue that addressing those multiple factors at the
engagement level can help understand student outcomes. In particular, the use of
perception data alone for students’ engagement with digital technologies limits our
understanding of student engagement in these environments. With most studies skewed
towards a perceived behavioural dimension of engagement, it can be fruitful for
researchers to supplement the more traditional data sets with different data sets,
such as trace data, as most digital technologies generate data in their system logs.
In conclusion, the following outstanding issues need to be addressed:
• Current research on student engagement has not covered typologies of the various
forms of learning theories and how these can guide different forms of engagement in
a technology-enhanced learning environment.
• Most studies reviewed measure student engagement through perception data; how-
ever, different data types such as trace data can provide further insights.
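As an example of the second point, a behavioural engagement indicator can be computed directly from system logs. The log format, timestamps, and student identifiers below are a hypothetical illustration, not any particular LMS’s schema:

```python
from collections import Counter
from datetime import datetime

# Hypothetical LMS trace-log lines (format is illustrative only):
# timestamp, student id, action
log = """\
2021-03-01T09:15:00,s1,view_resource
2021-03-01T09:20:00,s1,post_forum
2021-03-02T11:05:00,s2,view_resource
2021-03-08T10:00:00,s1,view_resource
2021-03-08T10:30:00,s2,submit_quiz
2021-03-09T14:00:00,s2,view_resource
"""

# Count actions per student per ISO week: a simple behavioural
# engagement indicator derived from trace data rather than self-report.
weekly = Counter()
for line in log.strip().splitlines():
    ts, student, action = line.split(",")
    week = datetime.fromisoformat(ts).isocalendar()[1]
    weekly[(student, week)] += 1

for (student, week), count in sorted(weekly.items()):
    print(f"{student} week {week}: {count} actions")
```

Indicators of this kind could then be triangulated against questionnaire responses to test whether perceived and logged engagement agree.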
Abbreviations
LMS: Learning management system; Moodle: Modular object-oriented dynamic learning environment.
Acknowledgements
Not applicable.
Authors’ contributions
LMN led the development of the manuscript under the guidance of BKD. BKD co-developed the tripartite approach
utilised in the study. RJB provided critical feedback and support on structure. All authors contributed to the final manu-
script. All authors read and approved the final manuscript.
Funding
The authors received no funding for this research.
Declarations
Competing interests
The authors declare that they have no competing interests.
References
ACER. (2019). Australasian survey of student engagement. https://www.acer.org/au/ausse/background. Accessed 11 May
2019.
Alexander, P. A. (2017). Issues of constructs, contexts, and continuity: Commentary on learning in higher education.
Educational Psychology Review, 29(2), 345–351. https://doi.org/10.1007/s10648-017-9409-3
Ally, M. (2012). Student attention, engagement and participation in a twitter-friendly classroom. Paper presented at the 23rd
Australasian Conference on Information Systems, ACIS 2012, Geelong, VIC.
Alshuaibi, M. S. I., Alshuaibi, A. S. I., Shamsudin, F. M., & Arshad, D. A. (2018). Use of social media, student engagement, and
academic performance of business students in Malaysia. International Journal of Educational Management, 32(4),
625–640. https://doi.org/10.1108/ijem-08-2016-0182
Altunoglu, A. (2017). Initial perceptions of open higher education students with learner management systems. Turkish
Online Journal of Distance Education, 18(3), 96–104
Astin, A. W. (1984). Student involvement—A developmental theory for higher-education. Journal of College Student
Development, 25(4), 297–308
Avcı, Ü., & Ergün, E. (2019). Online students’ LMS activities and their effect on engagement, information literacy and
academic performance. Interactive Learning Environments, 1–14.
Baragash, R. S., & Al-Samarraie, H. (2018). Blended learning: Investigating the influence of engagement in multiple
learning delivery modes on students’ performance. Telematics and Informatics, 35(7), 2082–2098. https://doi.org/10.
1016/j.tele.2018.07.010
Barnacle, R., & Dall’Alba, G. (2017). Committed to learn: Student engagement and care in higher education. Higher Educa-
tion Research & Development, 36(7), 1326–1338. https://doi.org/10.1080/07294360.2017.1326879
Baron, P., & Corbin, L. (2012). Student engagement: Rhetoric and reality. Higher Education Research & Development, 31(6),
759–772. https://doi.org/10.1080/07294360.2012.655711
Barua, P. D., Zhou, X., Gururajan, R., & Chan, K. C. (2018). Determination of factors influencing student engagement using a
learning management system in a tertiary setting. Paper presented at the IEEE/WIC/ACM International Conference on
Web Intelligence (WI).
Beer, C., Clark, K., & Jones, D. (2010). Indicators of engagement. Paper presented at the Curriculum, technology transforma-
tion for an unknown future. Proceedings ascilite Sydney.
Bervell, B., & Umar, I. N. (2017). A decade of LMS acceptance and adoption research in Sub-Sahara African higher educa-
tion: A systematic review of models, methodologies, milestones and main challenges. Eurasia Journal of Mathemat-
ics, Science and Technology Education, 13(11), 7269–7286. https://doi.org/10.12973/ejmste/79444
Boell, S. K., & Cecez-Kecmanovic, D. (2015). On being ‘systematic’ in literature reviews. Formulating research methods for
information systems. (pp. 48–78). Springer.
Burch, G. F., Heller, N. A., Burch, J. J., Freed, R., & Steed, S. A. (2015). Student engagement: developing a conceptual frame-
work and survey instrument. Journal of Education for Business, 90(4), 224–229. https://doi.org/10.1080/08832323.
2015.1019821.
Cabero-Almenara, J., Arancibia, M., & del Prete, A. (2019). Technical and didactic knowledge of the moodle LMS in higher
education. Beyond functional use. Journal of New Approaches in Educational Research (NAER Journal), 8(1), 25–33
Carini, R. M., Kuh, G. D., & Klein, S. P. (2006). Student engagement and student learning: Testing the linkages. Research in
Higher Education, 47(1), 1–32
Chapin, L. A. (2018). Australian university students’ access to web-based lecture recordings and the relationship with
lecture attendance and academic performance. Australasian Journal of Educational Technology, 34(5), 1–12. https://
doi.org/10.14742/ajet.2989
Christenson, S. L., Reschly, A. L., & Wylie, C. (2012). Handbook of research on student engagement. Springer Science & Busi-
ness Media.
Coates, H. (2007). A model of online and general campus-based student engagement. Assessment & Evaluation in Higher
Education, 32(2), 121–141. https://doi.org/10.1080/02602930600801878
Corrigan, O., Glynn, M., McKenna, A., Smeaton, A., & Smyth, S. (2015). Student data: Data is knowledge: Putting the knowl-
edge back in the students’ hands. Paper presented at the 14th European Conference on e-Learning, ECEL 2015.
Costley, J., Hughes, C., & Lange, C. (2017). The effects of instructional design on student engagement with video lectures
at cyber universities. Journal of Information Technology Education-Research, 16, 189–207
Daniel, B. K., & Harland, T. (2017). Higher education research methodology: A step-by-step guide to the research process.
Routledge.
Dona, K. L., Gregory, J., & Pechenkina, E. (2017). Lecture-recording technology in higher education: Exploring lecturer and
student views across the disciplines. Australasian Journal of Educational Technology. https://doi.org/10.14742/ajet.
3068
Draper, M. J., Gibbon, S., & Thomas, J. (2018). Lecture recording: A new norm. Law Teacher, 52(3), 316–334. https://doi.org/
10.1080/03069400.2018.1450598
Ebbert, D., & Dutke, S. (2020). Patterns in students’ usage of lecture recordings: A cluster analysis of self-report data.
Research in Learning Technology. https://doi.org/10.25304/rlt.v28.2258
Edwards, M. R., & Clinton, M. E. (2019). A study exploring the impact of lecture capture availability and lecture cap-
ture usage on student attendance and attainment. Higher Education, 77(3), 403–421. https://doi.org/10.1007/
s10734-018-0275-9
Ellis, D. (2015). Using padlet to increase student engagement in lectures. Paper presented at the 14th European Conference
on e-Learning, ECEL 2015.
Esteve Del Valle, M., Gruzd, A., Haythornthwaite, C., Paulin, D., & Gilbert, S. (2017). Social media in educational practice:
Faculty present and future use of social media in teaching. Paper presented at the Proceedings of the 50th Hawaii
International Conference on System Sciences.
Fagioli, L., Deil-Amen, R., & Rios-Aguilar, C. (2015). Changing the context of student
engagement: Using Facebook to increase Community College student persistence and success. Teachers College
Record, 117(12), 1–42
Filsecker, M., & Kerres, M. (2014). Engagement as a volitional construct: A framework for evidence-based research on
educational games. Simulation & Gaming, 45(4–5), 450–470. https://doi.org/10.1177/1046878114553569
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence.
Review of Educational Research, 74(1), 59–109. https://doi.org/10.3102/00346543074001059
Fredricks, J. A., Filsecker, M., & Lawson, M. A. (2016). Student engagement, context, and adjustment: Addressing defini-
tional, measurement, and methodological issues. Learning and Instruction, 43, 1–4. https://doi.org/10.1016/j.learn
instruc.2016.02.002
Graham, C. R., Woodfield, W., & Harrison, J. B. (2013). A framework for institutional adoption and implementation of
blended learning in higher education. The Internet and Higher Education, 18, 4–14. https://doi.org/10.1016/j.iheduc.
2012.09.003
Greener, S. (2020). Attendance and attention. Interactive Learning Environments, 28(1), 1–2. https://doi.org/10.1080/10494
820.2020.1712105
Harper, S. R., & Quaye, S. J. (2008). Student engagement in higher education: Theoretical perspectives and practical approaches
for diverse populations. Taylor & Francis Group.
Harris, L. R. (2008). A phenomenographic investigation of teacher conceptions of student engagement in learning. The
Australian Educational Researcher, 35(1), 57–79. https://doi.org/10.1007/BF03216875
Helal, S., Li, J., Liu, L., Ebrahimie, E., Dawson, S., Murray, D. J., & Long, Q. (2018). Predicting academic performance by
considering student heterogeneity. Knowledge-Based Systems, 161, 134–146. https://doi.org/10.1016/j.knosys.2018.
07.042
Henrie, C. R., Bodily, R., Larsen, R., & Graham, C. R. (2018). Exploring the potential of LMS log data as a proxy meas-
ure of student engagement. Journal of Computing in Higher Education, 30(2), 344–362. https://doi.org/10.1007/
s12528-017-9161-1
Higgins, J. P., & Green, S. (2008). Cochrane handbook for systematic reviews of interventions. Wiley.
Jones, S. J. (2012). Technology review: The possibilities of learning analytics to improve learner-centered decision-making.
The Community College Enterprise, 18(1), 89–92
Junco, R. (2012). The relationship between frequency of Facebook use, participation in Facebook activities, and student
engagement. Computers & Education, 58(1), 162–171. https://doi.org/10.1016/j.compedu.2011.08.004
Junco, R., Heiberger, G., & Loken, E. (2011). The effect of Twitter on college student engagement and grades. Journal of
Computer Assisted Learning, 27(2), 119–132. https://doi.org/10.1111/j.1365-2729.2010.00387.x
Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education, 38(5), 758–773. https://
doi.org/10.1080/03075079.2011.598505
Kahu, E. R., & Nelson, K. (2018). Student engagement in the educational interface: Understanding the mechanisms of
student success. Higher Education Research & Development, 37(1), 58–71. https://doi.org/10.1080/07294360.2017.
1344197
Klem, A. M., & Connell, J. P. (2004). Relationships matter: Linking teacher support to student engagement and achieve-
ment. Journal of School Health, 74(7), 262–273
Klobas, J. E., & McGill, T. J. (2010). The role of involvement in learning management system success. Journal of Computing
in Higher Education, 22(2), 114–134. https://doi.org/10.1007/s12528-010-9032-5
Koranteng, F. N., Wiafe, I., & Kuada, E. (2019). An empirical study of the relationship between social networking sites and
students’ engagement in higher education. Journal of Educational Computing Research. https://doi.org/10.1177/
0735633118787528
Kuh, G. D. (2016). Making learning meaningful: Engaging students in ways that matter to them. New Directions for Teach-
ing and Learning, 2016(145), 49–56. https://doi.org/10.1002/tl.20174
Lawson, M. A., & Lawson, H. A. (2013). New conceptual frameworks for student engagement research, policy, and prac-
tice. Review of Educational Research, 83(3), 432–479
Li, K., Rollins, J., & Yan, E. (2018). Web of Science use in published research and review papers 1997–2017: A selec-
tive, dynamic, cross-domain, content-based analysis. Scientometrics, 115(1), 1–20. https://doi.org/10.1007/
s11192-017-2622-5
Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gøtzsche, P. C., Ioannidis, J. P., et al. (2009). The PRISMA statement for
reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and
elaboration. Annals of Internal Medicine, 151(4), W65–W94
Linnenbrink, E. A., & Pintrich, P. R. (2003). The role of self-efficacy beliefs in student engagement and learning in the class-
room. Reading & Writing Quarterly, 19(2), 119–137
Linnenbrink-Garcia, L., Rogat, T. K., & Koskey, K. L. K. (2011). Affect and engagement during small group instruction. Con-
temporary Educational Psychology, 36(1), 13–24. https://doi.org/10.1016/j.cedpsych.2010.09.001
Little-Wiles, J. M., Hundley, S. P., & Koehler, A. (2010). Work in progress—Maximizing student engagement in a learning
management system. Paper presented at the 40th Annual Frontiers in Education Conference: Celebrating Forty Years
of Innovation, FIE 2010, Arlington, VA.
Little-Wiles, J. M., & Naimi, L. L. (2011). A study of traditional undergraduate student engagement in blackboard learning
management system. Paper presented at the ASEE Annual Conference and Exposition, Conference Proceedings.
Liu, D., Froissard, C., Richards, D., & Atif, A. (2015). Validating the effectiveness of the moodle engagement analytics plugin to
predict student academic performance. Paper presented at the 2015 Americas Conference on Information Systems,
Puerto Rico.
Luna, J. M., Castro, C., & Romero, C. (2017). MDM tool: A data mining framework integrated into Moodle. Computer Appli-
cations in Engineering Education, 25(1), 90–102. https://doi.org/10.1002/cae.21782
Macfarlane, B., & Tomlinson, M. (2017). Critiques of student engagement. Higher Education Policy, 30(1), 5–21. https://doi.
org/10.1057/s41307-016-0027-3
Maguire, R., Egan, A., Hyland, P., & Maguire, P. (2017). Engaging students emotionally: The role of emotional intelligence in
predicting cognitive and affective engagement in higher education. Higher Education Research & Development, 36(2),
343–357. https://doi.org/10.1080/07294360.2016.1185396
McGowan, A., & Hanna, P. (2015). How video lecture capture affects student engagement in a higher education computer
programming course: A study of attendance, video viewing behaviours and student attitude. Paper presented at the
eChallenges e-2015 Conference.
Messias, I., Morgado, L., & Barbas, M. (2015). Students’ engagement in distance learning: Creating a scenario with LMS and
Social Network aggregation. Paper presented at the 17th International Symposium on Computers in Education, SIIE
2015.
Newton, G., Tucker, T., Dawson, J., & Currie, E. (2014). Use of lecture capture in higher education—Lessons from the
trenches. TechTrends, 58(2), 32–45. https://doi.org/10.1007/s11528-014-0735-8
Nkomo, L., & Nat, M. (2017). Comparison between students’ and instructors’ perceived use and effectiveness of online social
technologies. Paper presented at the 6th Cyprus International Conference on Educational Research (CYICER-2017).
https://doi.org/10.18844/prosoc.v4i4
Onwuegbuzie, A. J., & Weinbaum, R. (2017). A framework for using qualitative comparative analysis for the review of the
literature. Qualitative Report. https://doi.org/10.46743/2160-3715/2017.2175
Palmer, S., Chu, Y., & Persky, A. M. (2019). Comparison of re-watching class recordings and retrieval practice as post-class
learning strategies. American Journal of Pharmaceutical Education, 83, 7217
Paulsen, J., & McCormick, A. C. (2020). Reassessing disparities in online learner student engagement in higher education.
Educational Researcher, 49(1), 20–29. https://doi.org/10.3102/0013189x19898690
Pekrun, R., & Linnenbrink-Garcia, L. (2012). Academic emotions and student engagement. In S. L. Christenson, A. L.
Reschly, & C. Wylie (Eds.), Handbook of research on student engagement. (pp. 259–282). Springer, US.
Prior, D. D., Mazanov, J., Meacheam, D., Heaslip, G., & Hanson, J. (2016). Attitude, digital literacy and self efficacy: Flow-on
effects for online learning behavior. The Internet and Higher Education, 29, 91–97. https://doi.org/10.1016/j.iheduc.
2016.01.001
Reeve, J., & Tseng, C.-M. (2011). Agency as a fourth aspect of students’ engagement during learning activities. Contempo-
rary Educational Psychology, 36(4), 257–267. https://doi.org/10.1016/j.cedpsych.2011.05.002
Rhode, J., Richter, S., Gowen, P., Miller, T., & Wills, C. J. O. L. (2017). Understanding faculty use of the learning management
system. Online Learning, 21(3), 68–86
Sánchez, R. A., & Hueros, A. D. (2010). Motivational factors that influence the acceptance of Moodle using TAM. Computers
in Human Behavior, 26(6), 1632–1640. https://doi.org/10.1016/j.chb.2010.06.011
Saunders, F. C., & Gale, A. W. (2012). Digital or didactic: Using learning technology to confront the challenge of large
cohort teaching. British Journal of Educational Technology, 43(6), 847–858
Schmidt, J. A., Rosenberg, J. M., & Beymer, P. N. (2018). A person-in-context approach to student engagement in science:
Examining learning activities and choice. Journal of Research in Science Teaching, 55(1), 19–43. https://doi.org/10.
1002/tea.21409
Seifert, T. (2019). Two pedagogical models of video integration in multiparticipant courses. Journal of Educators Online,
16(1), n1. https://doi.org/10.9743/jeo.2019.16.1.12
Skinner, E. A., & Belmont, M. J. (1993). Motivation in the classroom: Reciprocal effects of teacher behavior and student
engagement across the school year. Journal of Educational Psychology, 85(4), 571
Soluk, L., & Buddle, C. M. (2015). Tweets from the forest: Using Twitter to increase student engagement in an undergradu-
ate field biology course. F1000Research. https://doi.org/10.12688/f1000research.6272.1
Swart, A. J. (2015). Student usage of a learning management system at an open distance learning institute: A case study
in electrical engineering. International Journal of Electrical Engineering Education, 52(2), 142–154. https://doi.org/10.
1177/0020720915575925
Swart, A. J. (2017). Using reflective self-assessments in a learning management system to promote student engagement and
academic success. Paper presented at the 8th IEEE Global Engineering Education Conference, EDUCON 2017.
Tiernan, P. (2014). A study of the use of Twitter by students for lecture engagement and discussion. Education and Infor-
mation Technologies, 19(4), 673–690. https://doi.org/10.1007/s10639-012-9246-4
Trenholm, S., Hajek, B., Robinson, C. L., Chinnappan, M., Albrecht, A., & Ashman, H. (2019). Investigating undergraduate
mathematics learners’ cognitive engagement with recorded lecture videos. International Journal of Mathematical
Education in Science and Technology, 50(1), 3–24. https://doi.org/10.1080/0020739x.2018.1458339
Trowler, V. (2010). Student engagement literature review. The Higher Education Academy, 11(1), 1–15
Umer, R., Susnjak, T., Mathrani, A., & Suriadi, S. (2018). A learning analytics approach: Using online weekly student engage-
ment data to make predictions on student performance. In 2018 International conference on computing, electronic
and electrical engineering (ICE Cube). https://doi.org/10.1109/icecube.2018.8610959.
Venugopal, G., & Jain, R. (2015). Influence of learning management system on student engagement. Paper presented at the
2015 IEEE 3rd International Conference on MOOCs, Innovation and Technology in Education (MITE).
Venugopal-Wairagade, G. (2016). Creating a supportive learning environment for better student engagement. Paper pre-
sented at the 4th International Conference on Learning and Teaching in Computing and Engineering, LaTiCE 2016.
Waldrop, D., Reschly, A. L., Fraysier, K., & Appleton, J. J. (2019). Measuring the engagement of college students: Administra-
tion format, structure, and validity of the student engagement instrument-college. Measurement and Evaluation in
Counseling and Development, 52(2), 90–107
Wang, F. H. (2017). An exploration of online behaviour engagement and achievement in flipped classroom supported by
learning management system. Computers & Education, 114, 79–91. https://doi.org/10.1016/j.compedu.2017.06.012
Whitmer, J. (2015). How blackboard analytics improves learner Engagement. www.blackboard.com/Images/Analytics_
Ebook_Section3_learner%20engagement_tcm21-26911.pdf
Williams, D., & Whiting, A. (2016). Exploring the relationship between student engagement, Twitter, and a learning
management system: A study of undergraduate marketing students. International Journal of Teaching & Learning in
Higher Education, 28(3), 302–313
Wyatt, L. G. (2011). Nontraditional student engagement: Increasing adult student success and retention. The Journal of
Continuing Higher Education, 59(1), 10–20
Yassine, S., Kadry, S., & Sicilia, M. (2016). A framework for learning analytics in moodle for assessing course outcomes. Paper
presented at the 2016 IEEE Global Engineering Education Conference (EDUCON).
Zheng, Y., Wang, J., Doll, W., Deng, X., & Williams, M. (2018). The impact of organisational support, technical support, and
self-efficacy on faculty perceived benefits of using learning management system. Behaviour & Information Technol-
ogy, 37(4), 311–319. https://doi.org/10.1080/0144929X.2018.1436590
Zhoc, K. C., Webster, B. J., King, R. B., Li, J. C., & Chung, T. S. (2019). Higher education student engagement scale (HESES):
Development and psychometric evidence. Research in Higher Education, 60(2), 219–244
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.