Susnjak et al. Int J Educ Technol High Educ (2022) 19:12
https://doi.org/10.1186/s41239-021-00313-7

RESEARCH ARTICLE  Open Access

Learning analytics dashboard: a tool for providing actionable insights to learners

Teo Susnjak*, Gomathy Suganya Ramaswami and Anuradha Mathrani

*Correspondence: [email protected]
School of Mathematical and Computational Sciences, Massey University, Auckland, New Zealand

Abstract
This study investigates current approaches to learning analytics (LA) dashboarding while highlighting challenges faced by education providers in their operationalization. We analyze recent dashboards for their ability to provide actionable insights which promote informed responses by learners in making adjustments to their learning habits. Our study finds that most LA dashboards merely employ surface-level descriptive analytics, while only a few go beyond and use predictive analytics. In response to the identified gaps in recently published dashboards, we propose a state-of-the-art dashboard that not only leverages descriptive analytics components, but also integrates machine learning in a way that enables both predictive and prescriptive analytics. We demonstrate how emerging analytics tools can be used in order to enable learners to adequately interpret the predictive model behavior, and more specifically to understand how a predictive model arrives at a given prediction. We highlight how these capabilities build trust and satisfy emerging regulatory requirements surrounding predictive analytics. Additionally, we show how data-driven prescriptive analytics can be deployed within dashboards in order to provide concrete advice to the learners, and thereby increase the likelihood of triggering behavioral changes. Our proposed dashboard is the first of its kind in terms of the breadth of analytics that it integrates, and is currently deployed for trials at a higher education institution.

Keywords: Dashboard, Learner analytics, Actionable insights, Model interpretability, Explainable AI, Counterfactuals

Introduction
Analytics technologies have proliferated across many sectors of society for extracting
data-driven insights, improving decision making and driving innovation. The tertiary educational sector is particularly in tune with the advantages that data analytics can offer and generally seeks to leverage these advances. Deployment of analytics technologies is becoming increasingly important as this sector is undergoing disruptions across
different parts of the world, as well as due to the COVID-19 pandemic crisis (Aristovnik
et al., 2020). The current pandemic responses have shifted education delivery to online
modes, further accelerating ongoing disruptions. The education sector is already facing
financial and competitive pressures (Muhammad et al., 2020) in some regions, and this
global shift to online learning has amplified them even more. These shifts have altered

the competitive landscape between universities in countries with low or non-existent subsidies and those located in countries with strong government support, bringing them into direct competition as geographic and physical boundaries now have diminished relevance. This has been accentuated by the continuing rise in higher education costs (Blankenberger & Williams, 2020), together with the questioning of the value
tion costs (Blankenberger & Williams, 2020), together with the questioning of the value
proposition that higher education offers in many of the available qualifications. These
challenges call for a need to adapt and do things differently. For these reasons, data ana-
lytics with its use of innovative products has risen to become one of the key enablers in
the educational sector. Analytics in this sector focuses on maximizing student retention
rates by identifying the at-risk students in the early stages and then by initiating inter-
ventions, whilst improving the quality of the educational experience for the learners.
One of the analytics innovation products that has been deployed to benefit enrolled learners is the Learning Analytics dashboard (LAD), which has been operationalized in numerous institutions (Table 1) in the last few years. The purpose of
existing LADs is not dissimilar to the dashboards widely used in industry. They aim to
provide learners with a snapshot of how they are progressing in their courses. Graphi-
cal displays highlight trends in the learners’ academic and engagement levels through
the digital footprints generated by learners and provide them with a basis for awareness,
reflection and new insights (Yoo & Jin, 2020). In more advanced cases, the dashboards
are designed to make predictions on where learners are likely to end up in respect to
meeting learning outcomes based on their current trajectory. Analytics tools reveal
learner insights that could prompt a reflection process that would otherwise not be real-
ized. It is assumed that these reflections may (in some cases) trigger positive behavioral
changes that support learners in maximizing learning outcomes and course completions
as well as retention rates.
However, operationalizing these types of analytics products is accompanied by numerous challenges. Beyond the financial investment and the human resource effort involved in their productionization (Mahroeian et al., 2017), it is not altogether clear
what visual elements LADs should possess, that is, what type of information is effective
at triggering positive behavioral adjustments in learners, or what aspects are detrimental
by potentially inducing anxiety among learners. Overall, evidence about whether learn-
ing analytics and LADs improve learning in practice is scarce (Ferguson & Clow, 2017;
Guzmán-Valenzuela et al., 2021; Knight et al., 2020; Rets et al., 2021; Wilson et al., 2017)
and has room for further investigation.
This paper expands on the challenges of operationalizing LADs and identifies gaps in
current LADs. It does this by integrating analyses of recently published LADs by fol-
lowing a systematic line of inquiry. The following subsections offer definitions of vari-
ous analytics layers that we use in our analysis of the current state of practice in LADs,
as well as an outline of our research agenda for establishing an institutional-level dash-
board that offers more value to learners.

Analytics layers
Analytics can provide different levels of informational insights to enable users in mak-
ing informed decisions. At the most basic level, descriptive analytics highlight snap-
shots of variables of interest. These convey information about trends and the current
status relative to other identified measures. Descriptive analytics are the simplest form of insights to extract from data, and while useful, have a limited utility.

Table 1 Reviewed papers overview

Descriptive analytics content
  Conducted: Aljohani et al., 2019; Baneres et al., 2019; Bodily et al., 2018; Chatti et al., 2020; Chen et al., 2019; Fleur et al., 2020; Gras et al., 2020; Han et al., 2021; He et al., 2019; Karaoglan Yilmaz & Yilmaz, 2020; Kia et al., 2020; Kokoç & Altun, 2021; Majumdar et al., 2019; Naranjo et al., 2019; Owatari et al., 2020; Ulfa et al., 2019; Valle et al., 2021

Predictive analytics content and reported accuracy
  Not conducted: Aljohani et al., 2019; Bodily et al., 2018; Chatti et al., 2020; Chen et al., 2019; Gras et al., 2020; Han et al., 2021; He et al., 2019; Karaoglan Yilmaz & Yilmaz, 2020; Kia et al., 2020; Majumdar et al., 2019; Naranjo et al., 2019; Owatari et al., 2020; Ulfa et al., 2019
  Conducted: Baneres et al., 2019; Fleur et al., 2020; Kokoç & Altun, 2021; Valle et al., 2021
  Accuracy not reported: Fleur et al., 2020; Valle et al., 2021
  80–89% accuracy achieved: Kokoç & Altun, 2021
  90–95% accuracy achieved: Baneres et al., 2019

Prescriptive analytics content
  Conducted: None
  Conducted non-data driven: Baneres et al., 2019; Bodily et al., 2018; Gras et al., 2020; Han et al., 2021; Karaoglan Yilmaz & Yilmaz, 2020; Majumdar et al., 2019
  Not conducted: Aljohani et al., 2019; Chatti et al., 2020; Chen et al., 2019; Fleur et al., 2020; He et al., 2019; Kia et al., 2020; Kokoç & Altun, 2021; Naranjo et al., 2019; Owatari et al., 2020; Ulfa et al., 2019; Valle et al., 2021

Model interpretability and explainability
  Conducted: None

Dashboard evaluation and effectiveness
  Evaluation conducted within a pilot study context: Bodily et al., 2018; Chatti et al., 2020; Gras et al., 2020; Han et al., 2021; He et al., 2019; Kia et al., 2020; Kokoç & Altun, 2021; Naranjo et al., 2019; Owatari et al., 2020; Ulfa et al., 2019
  No evaluation conducted within a prototype study context: Chen et al., 2019; Majumdar et al., 2019
  Positive effects on student outcomes reported: Aljohani et al., 2019; Fleur et al., 2020; Han et al., 2021; Kokoç & Altun, 2021

Dashboard color content
  1–3 colors: Fleur et al., 2020
  4–6 colors: Bodily et al., 2018; Kia et al., 2020; Naranjo et al., 2019; Ulfa et al., 2019; Valle et al., 2021
  > 6 colors: Aljohani et al., 2019; Baneres et al., 2019; Chatti et al., 2020; Chen et al., 2019; Gras et al., 2020; Han et al., 2021; He et al., 2019; Karaoglan Yilmaz & Yilmaz, 2020; Kokoç & Altun, 2021; Majumdar et al., 2019; Owatari et al., 2020
Predictive analytics on the other hand emphasize some form of forecasting and
embody the ability to estimate future outcomes based on current and past data patterns.
Predictive analytics are mostly driven by machine learning algorithms which learn from
historic datasets in order to produce classifiers that can make inferences about possible
future outcomes from current data inputs. Data products based on predictive analytics
represent a considerable increase in complexity over mere descriptive analytics and offer
more value, but they also possess shortcomings. One shortcoming is that they usually
produce black-box models which lack transparency into their internal workings (Adadi
& Berrada, 2018). This means that it is often not possible for users to understand how
these models make predictions, and what aspects of the learners’ behaviors are driving
the predictions towards prognosticated outcomes. This lack of model interpretability
and explainability of outputs associated with most predictive models lowers their utility,
and over time erodes the trust of users (Baneres et al., 2021). Therefore, a trend is emerging at regulatory levels [1] requiring predictive models to expose their reasoning behind the predictions in comprehensible ways.

[1] The General Data Protection Regulation (GDPR) is an example of a regulation requiring a "right to explanation" in respect to predictive models.
The most complex and arguably the most insight-rich form of analytics is prescriptive
analytics. Prescriptive analytics can leverage predictive analytics in such a way that the
underlying models are also able to infer possible causal relationships and consequently
generate recommendations and suggestions to users about which specific behavioral
changes are most likely to result in positive outcomes. These prescriptive outputs are
tailored to each learner, but their suggestions are data-driven and thus based on similar
students who achieved positive outcomes in the past. By issuing advice on behavioral
adjustments and learning strategies that learners can undertake to maximize their learn-
ing outcomes, the decision-making process for the learners can be simplified and the
guesswork removed.
Currently, descriptive LADs are most commonly in use with an increasing number
integrating predictive components. However, to the best of our knowledge, examples of
dashboards incorporating data-driven prescriptive aspects of analytics do not exist.

Aims
This paper has three parts and contributions. We first provide an overview of the exist-
ing challenges in developing institutional LADs. We highlight three challenges (namely,
representation and actions, ethics, and agility) faced in deploying LAD initiatives involv-
ing student-facing dashboards. Secondly, we provide an extensive survey of the most
recently published state-of-the-art LADs in literature. Our search identified 17 LADs
and we assess their common characteristics as well as strengths and weaknesses. Thirdly,
we propose our LAD which addresses many of the shortcomings and, to the best of our knowledge, is the first LAD that brings descriptive, predictive and data-driven prescriptive analytics into one display. We conclude by offering inferences on what we see as being future directions and emerging frontiers in LA dashboarding in the short to medium term.
Our research questions are as follows:

RQ1. What are unique challenges in developing student-facing LADs?


RQ2. How ubiquitous are LADs?
RQ3. What is the current evidence for the effectiveness of LADs?
RQ4. What are the strengths and weaknesses of the current approaches to LA dashboarding?
RQ5. What are the future directions of LA dashboarding and how can some of the existing weaknesses be addressed?

Analytics in education
There are broadly three streams of research within educational analytics. Learning Ana-
lytics (LA) focuses on learners. Its primary concern is optimizing teaching and learning
processes. Educational Data Mining (EDM) on the other hand seeks to develop meth-
ods for exploring educational data in order to better understand learners, and to extract
insights about them and the educational systems. Academic Analytics (AA) draws on the
insights gained from educational data for supporting strategic decision-making, resolv-
ing academic issues such as retention and improving marketing strategies.
These three streams intersect at various points and share much of the underlying data,
and though they could all be grouped under the same umbrella as Educational Data Sci-
ence, they differ in the stakeholders which they target. EDM tends to target both teachers
and learners, while LA primarily addresses the needs of learners. However, institutional
administrators, managers and educational policymakers are the key stakeholders of AA
applications. The three streams also affect different levels of the educational systems. LA
is linked to course-level granularity and to department-level concerns within institu-
tions, while EDM spans departmental through to faculty and institutional-level concerns
(Nguyen et al., 2020). Meanwhile, AA affects universities at the institutional level, which has implications for policy making; thus it spans regional, national and possibly international levels.

Challenges for building LADs


While there are some differences between LA, AA and EDM, they all share some com-
mon challenges. Numerous studies have reported implementation details of LA prod-
ucts; however, a recent study by Leitner et al. (2020) pointed out that they rarely provide
comprehensive descriptions of challenges faced in productionizing these systems. This
study shortlisted seven general challenges for deploying LA initiatives:

1. Purpose and Gain: managing expectations of different stakeholders.


2. Representation and Actions: facilitation of actionable insights by LA products.
3. Data: communication to students regarding what is being done with their data, and
formulating suitable policies to manage data processes.

4. IT Infrastructure: balancing the pros and cons of opting to use internal or external
service providers for implementing and running the LA products.
5. Development and Operation: planning and implementation of the process of devel-
oping and operating an LA initiative.
6. Privacy: ensuring both security of learners’ data and compliance with increasingly
stringent legal requirements worldwide.
7. Ethics: ensuring that LA products do not bring harm and provide learners with the
ability to opt-out.

The above challenges are generic and broadly applicable to all LA projects. We draw
on recent literature to expand on two particular challenges above (2 and 7), and we tailor
them to the difficulties which specifically relate to LAD projects. In addition, with supporting literature, we posit an additional challenge, namely Agility, beyond the original seven identified by Leitner et al. (2020).

Representation and actions


Dashboard visualization is more of a science than an art. The dashboard designer must pos-
sess a degree of understanding of how the human visual cortex perceives various visual
cues in order to optimally match different data types to suitable visual representations.
Some data are quantitative and others are ordinal or categorical in their attributes. The
values of each data type are best represented by different cues which could comprise
contrasting colors, differing spatial positions or variations in symbols denoting length,
size, shape and orientation amongst others. The designer also needs to possess both
domain expertise in learning theories and paradigms, as well as technical capabilities in
developing dashboards (Klerkx et al., 2017).
Choosing the correct visualization technique can present difficulties largely due to
the increasing amounts of available data and the candidate variables/indicators that can
be incorporated (Leitner et al., 2019). Ensuring that dashboards are informative with-
out overwhelming the user is a challenging balancing act. From an aesthetic perspec-
tive, Tufte (2001) cautions against use of ‘non-data-ink’ and ‘chartjunk’ in graphs, that
is, he maintains that excessive use of colors, patterns or gridlines can confuse and clog
the recipient’s comprehension. Bera (2016) specifically mentions the overuse and mis-
use of color in business dashboards and the role this has on the users’ decision-making
abilities. Bera’s research finds that contrasting colors vie for user’s attention, and unless
necessary, they distract and affect the decision-making processes. By using eye track-
ing technology, the study demonstrated that the cognitive overload associated with the
misuse of color in dashboards leads to longer fixation periods on irrelevant aspects of
dashboards and prolongs the ability of users to comprehend the information.
Use of predictive modelling is becoming more prominent within LA (Bergner, 2017),
and these techniques are emerging more frequently within dashboards. A recent study
(Baneres et al., 2021) into developing LA technologies acting as early warning systems
for identifying at-risk learners highlighted the need to move beyond ‘old-fashioned’
dashboards that simply rely on descriptive analytics and to instead, orient efforts
towards incorporating predictive analytics amongst other advanced features. However,
building highly accurate and reliable predictive models is not trivial. Firstly, it requires
considerable technical expertise which is not always easy to acquire. Secondly, predict-
ing outcomes based on human behavior reflects a non-deterministic problem. Further,
for scalability reasons, we ideally require generic predictive models which can predict
student outcomes across widely disparate courses. However, since courses have different
attributes, styles of delivery and assessment types, it is a considerable challenge to cre-
ate single generic predictive models that can work optimally across diverse courses. On
the other hand, developing tailored predictive models for each different course creates
technical resource overheads. Tailored models are also likely to perform badly in many
instances due to scarcity of data leading to overfitting, since individual courses may have
small class numbers or have limited historical data. In a recent systematic literature review on the current state of prediction of student performance within LA, Namoun and Alshanqiti (2020) found that the potential of predictive modeling of student outcomes is not fully exploited and warrants further work. The study found that not only does the accuracy of the existing models have room for improvement, but more robust testing of their validity, portability (or generic models) and overall generalizability needs to be conducted.
In a recent study, Umer et al. (2021) concluded that many datasets used to build predic-
tive models in this domain were small, often having fewer than 10% of the overall data points for certain class labels, leading to unreliable predictive accuracies, especially when
course-tailored predictive models are being created. The study also calls for enhancing
the scope of engagement data to cover learner interaction data from forum messages,
identifying pedagogically meaningful features and developing dashboard visualizations
that have some underlying pedagogical intent.
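To illustrate the accuracy pitfall that such class imbalance creates, the following hedged sketch (synthetic data, illustrative names only, not taken from the cited studies) shows how a trivial majority-class model scores roughly 90% raw accuracy while balanced accuracy and F1 reveal that it never identifies a single at-risk student:

# Hypothetical sketch: with a rare "at-risk" class (~10% of labels), raw accuracy
# can look high even for a useless model, so balanced metrics and a stratified
# split give a more honest picture.
from sklearn.datasets import make_classification
from sklearn.dummy import DummyClassifier
from sklearn.metrics import accuracy_score, balanced_accuracy_score, f1_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, weights=[0.9], flip_y=0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)

baseline = DummyClassifier(strategy="most_frequent").fit(X_tr, y_tr)
preds = baseline.predict(X_te)

print("raw accuracy:     ", accuracy_score(y_te, preds))           # ~0.90, looks fine
print("balanced accuracy:", balanced_accuracy_score(y_te, preds))  # 0.5, chance level
print("F1 (at-risk):     ", f1_score(y_te, preds))                 # 0.0, never flags anyone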
Developing accurate classifiers is further complicated by the negative effects of con-
cept drift (Lu et al., 2018). Concept drift describes the degradation in the accuracy of predictive models over time, since the data used to build the models may become disconnected from current real-life data. This can occur when learners' study patterns gradually or abruptly
change (as in the case of pandemic responses), and current digital footprints no longer
correlate with previous patterns in the historic record. For example, the gradual shift
towards the use of virtual learning environments (VLE) over the last 10–15 years repre-
sents a concept drift. Learners’ study patterns prior to this period in the historic record
bear little resemblance to the patterns of learners of today, and thus, data from this historic period will likely degrade predictive accuracies for current students. Concept drift
can also happen suddenly, as indeed the sudden migration to full online learning dur-
ing the recent pandemic crisis brought into play additional technologies and different
digital patterns and footprints that students leave behind. In other words, the relationship between the independent and dependent variables in the historic data used to train the predictive models, where the independent variables serve as inputs to predict the outcomes of current students, is constantly evolving. This phenomenon represents a technical and a
capability challenge for universities, as concept drift needs to be detected and accounted
for, while the mechanisms for achieving this effectively are still being researched (Lu
et al., 2018).
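As a rough illustration of how such a check might operate, the sketch below compares a deployed model's accuracy on an older labelled cohort against its accuracy on the most recent cohort and flags a drop beyond a tolerance as suspected drift; the function, threshold and cohort names are our own assumptions rather than a method taken from the cited works:

# Hypothetical sketch: flag potential concept drift by comparing a trained
# classifier's accuracy on an older labelled cohort with its accuracy on the
# most recent cohort. All names and the tolerance value are illustrative.
from sklearn.metrics import accuracy_score

def drift_suspected(model, X_old, y_old, X_new, y_new, tolerance=0.05):
    """Return True if accuracy drops by more than `tolerance` on the new cohort."""
    acc_old = accuracy_score(y_old, model.predict(X_old))
    acc_new = accuracy_score(y_new, model.predict(X_new))
    return (acc_old - acc_new) > tolerance

# Usage sketch (cohort variables hypothetical):
#   if drift_suspected(model, X_2019, y_2019, X_2021, y_2021):
#       retrain, weighting recent cohorts more heavily or dropping stale data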
The above challenges are considerable. However, even if they can all be addressed, it
is now no longer sufficient to deploy predictive models and solely display their outputs
without providing the learners with explainability of how a model arrived at a given pre-
diction. It is also becoming more apparent that learners will engage with a LAD only if
they understand how displayed values are generated (Rets et al., 2021). Liu and Koedinger (2017) argue for the importance of interpretability, which in turn leads to actionability.
Models need to possess explanatory characteristics so that learners understand why a
model produced given predictions, what the underlying driving factors are, and impor-
tantly, what insights can be derived from these explanations in order to trigger actionable
behavioral adjustments. Not only should interpretability of models and explainability of
their individual predictions be provided to the learners, but also counterfactuals, which
explicitly demonstrate alternative outcomes for the learner if a behavioral change were
to take place in specific areas. Recent studies (Rets et al., 2021; Valle et al., 2021) in LADs
have highlighted the necessity of integrating insights which are prescriptive and take on
forms of recommendations to guide students in their learning. Producing such rich and
sophisticated outputs is a challenge, because extracting simplified representations of
predictive black-box models and their reasoning is complex. There are limited available
tools with sufficient maturity that support this functionality, which again requires a high
level of expertise to implement and leverage.

Ethics
The challenges surrounding ethical use of data within LA products are generally well
understood and accepted. They center around questions of what personal data should
be collected and processed by these systems, what insights should be extracted and with
whom they should be shared. Additional concerns exist around possible consequences
on learners when conveying personalized information; therefore, institutions need to be
aware of intrusive advising or inappropriate labelling that may lead to learner resent-
ment or demotivation (Campbell et al., 2007). As such, avoidance of harm to learners,
alongside compliance with legal requirements are paramount.
Given the importance of practical ethical underpinnings when using LA systems, it is
acknowledged that robust and clear policies need to be formulated on what empirical
data is permitted to be used for analytical purposes and to what end (Kitto & Knight,
2019). The study maintains that awareness of these policies must be communicated to the learners, together with the purported educational benefits that such systems claim to bring and the potential risks. A key concern, however, is the uncertainty
regarding the benefits distribution, which may not be the same for everyone (Rubel &
Jones, 2016); hence, institutions are encouraged to create a sense of transparency about
LA systems by including statements on their data practices and limitations.
Beyond the well accepted dilemmas of LA systems listed above, predictive models
used in LADs bring with them some other acute challenges. Predictive models naturally
embody within them the process of generalization. As the machine learning algorithms
learn and induce predictive models, they move from individual and specific examples
to more general descriptors of the data. With this natural induction process, errors are
invariably introduced. The ethical concern and challenge come into play when we con-
sider both incorrect and correct classifications and the effects that they might have on
learners. If a student is mis-classified as being "at-risk", this might have the effect of discouraging them and eventuate in a "fulfillment of the prophecy", despite the fact that they were originally on track to successful completion. Or, in using oversimplified classifica-
tion labels, we can diminish the predictive value and in turn reduce the trustworthiness
of the analytical approach. This challenge will always remain since learners are not deter-
ministic and predictive models in non-deterministic domains are inherently imperfect.
Likewise, Bowker and Star (2000) note that even with correct predictions, for some this
may be an incentive if they are already motivated and capable of positively adjusting
their course in order to alter their predicted outcome, while for others, the prediction may only serve to further deflate them.

Agility
Agility is the ability to rapidly adapt to changing requirements, be flexible and able to
seize new opportunities. Universities are more resistant to change than industrial enti-
ties (Menon & Suresh, 2020); they are typically considered to be fractured and decen-
tralized (Bunton, 2017), while possessing complex and non-standard business processes
(Mukerjee, 2014a). However, financial constraints coupled with pressure from competi-
tion as a consequence of the unfolding digital revolution, have put universities on high
alert to engage with new technologies (Mukerjee, 2014a). It is recognized that organiza-
tional agility is a crucial capability for universities at these times (Mukerjee, 2014b). Both
the use of data insights and analytics as well as the development of these projects, places
immediate demands of agility on behalf of the organization operationalizing them. Agil-
ity is therefore a key challenge for universities attempting to productionize LADs.
The requirement for agility comes at different levels in respect to LADs. Translating
LADs into products that genuinely improve learning outcomes requires constant moni-
toring and analysis of their usage patterns, user feedback and ultimately the gathering of
evidence into their efficacy. The consequences of this are an increase in resource costs
for maintenance and continuous refinement of the LADs. Continuing support from
the institutions and willingness to provide ongoing long-term refinements need to be
secured ahead of time. Sun et al. (2019) point out that improvements of these types of systems need to go beyond pilot and deployment stages, and that underlying assumptions used to develop these systems need to be re-assessed as adjustments are made to
enhance the design or functionality. For best results, the design of dashboards should be
iterative with continuous feedback from learners in order to ensure that an operational-
ized product is actually useful. This is time and resource intensive and requires agility.
From a data-oriented point of view, agility and the ability to integrate new data streams
into LADs are paramount. Universities are rapidly incorporating modern technologies
for course delivery and improving the learning experience. The technologies sometimes
augment what is already in place, while other times, they completely replace legacy pro-
cesses and systems with new ones. This process has been accelerating recently and will
continue to do so. The consequence is that new and more diverse digital footprints will
continue to be generated by learners especially with the increased demand in online
education in the aftermath of COVID-19. Therefore, adaptability and rapid responses in
integrating new data sources must be set forth to identify new features that can improve
the predictive power of deployed models.
Finally, profound insights are compelling. They demand action if negligence is to be
avoided. Deep insights can be game-changers and often call for swift action even when
this is inconvenient. For example, if predictive models powering the LADs identify cer-
tain qualifications within an institution’s portfolio as being key predictive drivers towards
poor completion rates, then this would need to trigger action and possibly advice on
changes that may neither be convenient for an institution, nor even align with their over-
arching strategic goals. With deployment of LADs, therefore, comes the responsibility of
asking the tough questions in adapting to the suggested changes that can have a better
institutional impact.

Methods
The focus of this study was to review the most recent developments in LADs. To that
end, the search focused on studies published from 2018 until the time the search was
completed (September 2021). The search followed the PRISMA framework (Moher
et al., 2009) which requires a principled approach to defining the inclusion and exclusion
criteria as well as search parameters.
We first conducted a keyword search targeting Google Scholar using the “Publish or
Perish” tool in order to retrieve the initial academic articles. The following search terms
were used: “learning analytics dashboard” or “visualization tool” or “early warning sys-
tem” or “student dashboard”. The search yielded the following total number of results per
year: 2018 n = 340, 2019 n = 977, 2020 n = 960, and 2021 n = 403. A total of 2680 papers
were obtained. This was reduced to 1450 papers following the elimination of duplicates.
Next, papers that were not written in English and those containing fewer than 3 pages were filtered out, resulting in 600 papers. The abstracts of these papers were screened, and finally only the papers that focused on dashboards targeting learners and instructors were retained. This yielded a total of 17 papers that successfully passed all the inclusion criteria, and only these were included in the final analysis. Figure 1 outlines the overall methodology used for data collection, while Fig. 2 depicts the histogram of the 17 LAD papers by year.

Fig. 1 Methodology used in this systematic review (Moher et al., 2009)

Fig. 2 Total number of published articles presenting LADs that are covered in this study. The number of publications for 2021 is listed up to September of that year.
Against the backdrop of recently published literature, the first part of our study has
already identified challenges facing LA and specifically difficulties associated with the
development and deployment of dashboards in educational contexts. The second part
of our study analyzes the data on existing LADs. Our analysis approach is based on
five key assertions that are grounded in dashboard literature. The assertions act as a
prism through which we reviewed the LADs and directed our investigation towards
the design of our dashboard subsequently. These are as follows:

1. Given that LADs using only descriptive analytics are not enough, it is meaningful to
identify dashboards which have started to incorporate predictive and data-driven
prescriptive analytics.
2. Since accuracy of predictive models is an identified challenge, it is informative to
determine what accuracies are being reported for recent LADs using predictive
modeling, and if they communicate the confidence of their predictive outputs to the
learners.
3. Assuming that there is value in providing learners with some level of interpretability
of the underlying predictive models and explanations of how the models have arrived
at predictions for individual students, it is instructive for future research directions to ascertain how prevalent the presentation of these features is on the LADs.
4. Since the evidence of the effectiveness of LADs to affect positive outcomes for learn-
ers is not complete, it is instructive to know what evaluation efforts have been made
in recent studies on the utility of LADs.
5. LADs using a higher number of colors are more likely to misuse and overuse color and contribute towards confusion for the learners.

Analysis
Our analysis is divided into two parts. The first part reviews each of the 17 LADs and
highlights noteworthy aspects of each one. The second part analyzes the dashboards
at an aggregate level and offers analyses of the state of LA dashboarding in a summa-
tive format.

Review of dashboards from literature


Our framework for reviewing all the LAD studies uses a scheme whereby we consider each LAD from the perspective of how it has implemented descriptive, predictive and prescriptive analytics functionalities, as well as the reported evidence outlining the effective-
ness of LADs on learner outcomes.

LADs with descriptive analytics capabilities


All studies incorporated some aspects of descriptive analytics. Frequently, the
descriptive analytics were in the form of graphical displays depicting comparisons
of a student in respect to class averages or patterns in relation to students (Aljohani
et al., 2019) across metrics like assessment scores, participation levels and interaction
with online activities (Chen et al., 2019; Fleur et al., 2020; Gras et al., 2020; Han et al.,
2021; Karaoglan Yilmaz & Yilmaz, 2020; Kokoç & Altun, 2021; Ulfa et al., 2019; Valle
et al., 2021). Some studies focused on status updates of progression through online
course materials such as video, reports on time spent on eBooks and summaries of
course notes (Bodily et al., 2018; He et al., 2019; Majumdar et al., 2019; Owatari et al.,
2020). Other studies also assisted learners in planning and provided alerts of upcom-
ing assessment submission deadlines (Baneres et al., 2019; Kia et al., 2020; Naranjo
et al., 2019). Certain LADs (Chatti et al., 2020) went beyond static dashboards and
enabled direct customizations of them by allowing learners to dynamically generate
indicators of their choice.

LADs with predictive analytics capabilities


Several studies went further than mere descriptive analytics and incorporated pre-
dictive analytics elements into their dashboards. A descriptive and a predictive dashboard were developed by Valle et al. (2021). The descriptive dashboard aimed at
displaying the students’ performance relative to the class average while the predic-
tive dashboard displayed the probability of learners attaining specific grades. The
authors reported that the predictive dashboard helped only the highly motivated stu-
dents to sustain their motivation levels, while both dashboards failed to demonstrate
their effectiveness in affecting final outcomes. Similarly, Fleur et al. (2020) developed
a LAD with class-comparative descriptive components as well as the student’s pre-
dicted final grade. The students in the treatment group accessed the dashboard and
their performance was analyzed in formative and summative assessments. The study
reported that students in the treatment group performed better in the formative
assessment only. Baneres et al. (2019) focused on devising an early warning system
for learners and instructors that identifies at-risk students. Their Graduate At-Risk
(GAR) model used grades to predict course outcomes. Additionally, an intervention
mechanism was incorporated that automated sending personalized messages to at-


risk students. While GAR noted an improvement in performance of the at-risk stu-
dents, it could not be determined which factors were responsible. In a similar vein,
a prescriptive learning dashboard (PLD) using personalized recommendation texts
was developed by Kokoç and Altun (2021), which also focused on generating a student risk status and displaying it on the dashboard. The authors concluded that those
students who used PLD performed significantly better in their courses. None of the
examined studies took steps to communicate to students through the dashboard how
reliable the underlying predictive models were, nor were technologies used which
could elucidate to learners how the models operated, or how the predictions were
generated based on specific student’s data.

LADs with prescriptive analytics capabilities


Certain studies already mentioned above, like Kokoç and Altun (2021) and Baneres et al. (2019), considered the dispatch of personalized messages as prescriptive components.
Indeed, a number of other studies also leveraged different forms of messaging, recom-
mendation techniques and communication features through the LADs in order to claim
prescriptive capabilities. Bodily et al. (2018) developed LADs that recommended content
and skill-building activities such as practice exercises. Their study noted that skill-related
recommendation components were found by students to be more useful compared to
the content recommender features. Along similar lines, Karaoglan Yilmaz and Yilmaz
(2020) took the approach of delivering weekly reports over the course duration along
with personalized recommendations to each student. The study claimed that providing
analytics reports positively increased student motivation. Gras et al. (2020) expanded
the capabilities of their LAD by providing students with an action button which accessed
direct help from the instructor and can therefore be categorized as having prescriptive
aspects. Direct contact between students and instructors was also enabled by the face-
to-face collaborative argumentation (FCA) dashboard developed by Han et al. (2021).
This tool monitored students’ learning progress and facilitated prescriptive interven-
tions by instructors with students requiring additional assistance. Meanwhile, LAView, a dashboard developed by Majumdar et al. (2019), computed an engagement score as an aggregate value across several student interaction measures. Based on the
engagement score, the instructor initiated prescriptive measures in the form of person-
alized emails to corresponding students. LADs are clearly emerging with some forms
of prescriptive components, though many of these can also be defined as human inter-
ventions. Others which have more of an automated algorithmic approach to dispensing
recommendations of content and activities are based on simplistic hard-coded heuristics and thresholds. More sophisticated prescriptive components within LADs leveraging
algorithmic and data-driven analytics have yet to emerge.

Reported effectiveness of LADs


Value assessments of the various LAD projects have taken two distinct approaches
amongst the examined studies. Some studies (e.g., Bodily et al., 2018; Chatti et al.,
2020; Gras et al., 2020; Han et al., 2021; He et al., 2019; Kia et al., 2020; Kokoç & Altun,
2021; Naranjo et al., 2019; Owatari et al., 2020; Ulfa et al., 2019) have largely conducted
qualitative evaluations of the various LAD deployments within pilot contexts. These
included surveys and interviews of usability aspects and covered subjective responses on
the degree that LADs facilitated learning. However, other studies provided quantitative
findings supported by statistical analyses that demonstrated that LAD usage had posi-
tive effects on student outcomes (Aljohani et al., 2019; Fleur et al., 2020; Han et al., 2021;
Karaoglan Yilmaz & Yilmaz, 2020; Kokoç & Altun, 2021).

Dashboard data analysis


Table 1 summarizes all the reviewed dashboards through the analysis approach listed in the "Methods" section. We find that 59% of LADs provide some form of descriptive analytics information to the learners. The remaining LADs focus on assisting students with planning, helping them monitor progress through online learning materials, and providing learners with a medium through which instructors can more effectively interact with the
learners.
The majority of the LADs either did not use any form of predictive analytics or did
not report on this capability if implemented. This large group consisted of 76% of the
most recently developed dashboards for learner-facing educational contexts. Out of
the remaining 24% which did use predictive analytics, one of the dashboards generated
predictive models with an accuracy range between 80 and 89%, while another achieved
higher accuracies reaching up to 95%; however, while these accuracies were reported in
literature, the model accuracies were not presented to the students on the dashboards.
The remaining dashboards that used predictive analytics did not report on their predic-
tive accuracies.
The data also indicates that model transparency approaches and technologies have not
entered usage amongst the dashboard developers. Of all the dashboards which used pre-
dictive modeling, we find that no attempt was made to offer model interpretability to the learners in terms of what the key features were. Additionally, we find that none of the reviewed dashboards tried to explain to the learners how the predictive models actually
arrived at the predictions that were presented to them.
From our review, we found that none of the recent LADs utilize data-driven prescrip-
tive analytics. Our data indicates that 47% used some form of prescriptive features asso-
ciated with the dashboards which took the form of encouraging messages, supportive
emails or instructive suggestions being issued to learners by the teachers. However, none
employed automated instructions or recommendations generated by prescriptive mod-
elling algorithms.
Widely contrasting approaches were adopted by researchers in respect to evaluating
the usability of the dashboards, as well as their overall ability to affect positive learning
outcomes. We found that 59% of the dashboards were deployed in some form of productionized environment and had their usability evaluated through qualitative approaches that involved surveys and interviews with learners. A further 12% of the studies created dashboard prototypes and conducted a qualitative investigation into their usability, while the same proportion developed prototypes but did not evaluate them. A quarter of the studies performed a quantitative investigation into the effectiveness of the dashboards' ability to impact student outcomes. These studies concluded that their LADs exhibited a positive impact on student outcomes. A common feature across
half of the LADs which demonstrated a positive student outcome was that they possessed predictive analytics capabilities. They all depicted information for each student in relation to where they were situated in respect to their peers across various metrics, and one of the dashboards implemented prescriptive features.

Table 2 Dashboard technologies and size of study cohorts

Study | Technology | Programming expertise | Cohort size
Bodily et al., 2018 | N/A | – | 180
Chen et al., 2019 | N/A | – | –
Aljohani et al., 2019 | ASP MVC4, HTML5, jQuery and Highcharts JavaScript | High | 86
Ulfa et al., 2019 | N/A | – | 67
Majumdar et al., 2019 | N/A | – | –
He et al., 2019 | HTML5, JavaScript and Echarts | High | 327
Naranjo et al., 2019 | Vue.js, HTML, CSS | High | 64
Baneres et al., 2019 | Web application | High | 247
Gras et al., 2020 | N/A | – | 127
Karaoglan Yilmaz & Yilmaz, 2020 | LMS messaging tool | Low | 81
Fleur et al., 2020 | Django | High | 79
Chatti et al., 2020 | Google charts and C3.js | High | 414
Kia et al., 2020 | JavaScript, D3.js | High | 449
Owatari et al., 2020 | Web application | High | 108
Han et al., 2021 | Web application | High | 88
Kokoç & Altun, 2021 | Google visualization API and AJAX API | High | 126
Valle et al., 2021 | R and Shiny | High | 179
Our analysis indicates that 59% of the LADs have used six or more different colors
on their display, potentially contributing towards information overload and miscom-
munication of insights. 29% used between four and six colors, while the remaining 12%
employed up to 3 colors only.
Technology is an important aspect through which LADs ought to be considered.
The chosen technology determines the range of capabilities of LADs and the agility of
the projects. Where reported, Table 2 indicates that the chosen tools for implement-
ing LADs have so far mostly been web application frameworks which carry with them
a requirement of a high level of technical expertise. Usage of off-the-shelf commercial dashboarding products, which do not require a high level of technical and programming expertise, appears not to be a chosen medium yet. Both web application frameworks and
off-the-shelf dashboarding products generally possess very limited advanced analytics
capabilities, which would then require additional technologies to overcome this limi-
tation. The exceptions are Shiny (R) and Django (Python), both of which have access to a large ecosystem of analytics capabilities.

Proposed learning analytics dashboard


The previous section reviewed dashboards and highlighted their key themes, analytic
capabilities and reported efficacies. Drawing on contributions from these studies as
well as the strengths and weaknesses of various dashboarding approaches, we pro-
pose our learner-facing dashboard design (shown in Fig. 3). The proposed dashboard
attempts to integrate all levels of analytics capabilities missing in reviewed dashboards. The proposed dashboard has descriptive, predictive as well as prescriptive
components built into it. To the best of our knowledge, this dashboard is the first of
its kind to embed data-driven prescriptive capabilities involving counterfactuals into
its display. In addition, our dashboard possesses a high degree of transparency and
communicates to the learners how reliable the predictive models are; what the key
factors are that drive the predictions, as well as the conversion of a black-box predic-
tive model into a glass-box, human interpretable model for the learners so that they
can understand how their prediction is being derived.

Fig. 3 Learning analytics dashboard designed for students

Analytics layers
The proposed dashboard distributes the descriptive, predictive and prescriptive com-
ponents across three panels seen in Fig. 3. The first panel highlights student engage-
ment levels. This panel contains only descriptive analytics components and compares the learner's engagement with the cohort's average. Engagement includes
weekly login counts into the virtual learning environment, number of learning resources
accessed, and total forum posts created as a measure of communication exchange levels.
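As an illustration of how such an engagement measure could be derived from the Moodle logs, the following hedged sketch computes a weekly activity count per student and its deviation from the cohort average; the file and column names are assumptions, not the institution's actual schema:

# Hedged sketch of one descriptive metric: a weekly engagement count per student
# and its deviation from the cohort mean (a z-score), assuming a Moodle event log
# with hypothetical column names.
import pandas as pd

events = pd.read_csv("moodle_events.csv", parse_dates=["timestamp"])  # hypothetical extract
events["week"] = events["timestamp"].dt.isocalendar().week

# Count all logged actions (logins, resource views, forum posts) per student per week.
weekly = (events.groupby(["student_id", "week"])
                .size()
                .rename("engagement")
                .reset_index())

# Deviation from the cohort: how far each student sits from that week's class average.
cohort = weekly.groupby("week")["engagement"].agg(["mean", "std"])
weekly = weekly.join(cohort, on="week")
weekly["engagement_deviation"] = (weekly["engagement"] - weekly["mean"]) / weekly["std"]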
The second panel displays information regarding a learner’s academic performance.
This panel has both descriptive and predictive analytics components. The descriptive
component, in the top half, displays a snapshot of the learner's assignment, quiz and test grades. The learner's data is again contrasted with that of the cohort. The student
can rapidly see their deviation from the class mean and can also inspect in greater detail
how far their score deviates from their peers by viewing the overall class distribution
through the box-and-whisker plots. The dashboard’s predictive component begins in
the lower half of the second panel. This component provides a student with estimates of
what scores they are likely to achieve in the upcoming assignment and their final exam
based on the learners who have exhibited similar learning attributes in the past.

However, the key predictive analytics component and the novel prescriptive analytics
features are found in the third panel. In this panel, an overall prediction is made regard-
ing the learner’s estimated risk profile for meeting the course’s learning outcomes. Given
the importance of this model, we emphasize key aspects of its nature that were miss-
ing in previous studies. The dashboard communicates the accuracy of the underlying
model to the learner, and provides interpretability to the user in terms of what factors
are deemed important to the model at a high-level when it makes a prediction. In addi-
tion, the dashboard contains an explainability component which communicates to the
learner how the model has arrived at a given prediction for their individual case with the
student’s specific input values. The model reasoning provides the learners with a sug-
gestion of what they can alter in their learning behavior in order to alter their outcomes.
The above model transparency capability is further built upon and expanded by the
dashboard’s prescriptive analytics features which incorporate counterfactuals. The coun-
terfactuals indicate to the learner what specific factors together with minimal changes to
their values, would produce different, and more positive predictive outcomes. The coun-
terfactuals make some plausible assumptions about the existence of causal links in the
underlying data, and based on this, generate automated advice to learners about how to
maximize their learning outcomes.
From an aesthetic point of view, the dashboard attempts to minimize the use of color
and renders the display in three hues, thus minimizing the risk of information overload.
Additionally, the dashboard uses a neutral pastel palette to further reduce the negative effects that colors can have, while attempting to maximize the data-to-ink ratio.
From a functional point of view, the proposed dashboard provides comprehensive ana-
lytics capabilities that are not found in existing LADs and demonstrates the state-of-the-
art in terms of incorporating these functionalities. However, the dashboard is currently
in a pilot stage at a tertiary institution with students from across 20 classes actively trial-
ing the tool and evaluating it for usability; therefore, data on its effects on outcomes is
not yet available.

Dashboard design details


The underlying data for the dashboard originated from Moodle, an open-source learning management system which provides the virtual learning platform for e-learning at the institution. The dashboard was implemented following a client-server architecture. On the client side, the Power BI [2] tool was used to develop the web-based dashboard application. Meanwhile, on the server side, Python [3] was used for both the analytics and the extract, transform, load phases [4].

[2] Power BI is a commercial software package owned by Microsoft. Effective at building dashboards, but at present lacks capabilities for predictive and prescriptive modelling, as well as model interpretability.
[3] Python is a general-purpose programming language with a rich set of capabilities for implementing all required analytics; however, it lacks easy-to-use capabilities for building front-end dashboards.
[4] Both Power BI and Python can effectively be substituted with R and its R Shiny technology for constructing dashboards and developing the underlying analytics functionalities.

We used a mixture of Python's scikit-learn library and the CatBoost (Dorogush et al., 2018) classification algorithm for generating student outcome predictions. The underlying features used to make the predictions were engagement deviation score, engagement rolling average score from Moodle, assignment rolling average score, assignment deviation score, and previous grade from the student management system, and demographic information such as age, English equivalent test, and highest school qualification. Models were trained on a dataset comprising 4000 students. A hold-out method was used, and the final accuracy from all the test datasets was displayed to the learners as a measure of confidence in the reliability of the underlying model. The dataset was divided into weeks, which were used for the prediction analysis. Moreover, the data was used in a cumulative fashion for making predictions; for example, when making predictions for week 2, the students' data from week 1 were taken into consideration as well. The reason is that prediction accuracy improves as more data becomes available in upcoming weeks. Prediction accuracy at early stages is important so that timely interventions can be made to help students.
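As a minimal sketch (not the production code), the training and hold-out evaluation described above might look roughly as follows; the file name, column names and CatBoost settings are illustrative assumptions, and the categorical demographic fields are omitted for brevity:

# Hedged sketch of the outcome classifier: CatBoost trained on cumulative weekly
# features with a simple hold-out split; the test accuracy is what the dashboard
# would surface to learners as a confidence measure. Names are illustrative only.
import pandas as pd
from catboost import CatBoostClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Illustrative weekly feature extract; in practice one table is built per week,
# with cumulative features drawn from all preceding weeks.
df = pd.read_csv("student_features_week.csv")
features = ["engagement_deviation", "engagement_rolling_avg",
            "assignment_rolling_avg", "assignment_deviation",
            "previous_grade", "age"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["at_risk"], test_size=0.2, random_state=42)

model = CatBoostClassifier(iterations=500, verbose=False)
model.fit(X_train, y_train)

holdout_accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"Hold-out accuracy shown on the dashboard: {holdout_accuracy:.2f}")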
Model interpretability was implemented using feature importance analysis, which depicts the relative contribution and importance of each variable towards making predictions. This was also complemented with the use of anchors [5] (Ribeiro et al., 2018). Anchors have recently been devised as an approach for making black-box models interpretable. Anchors create proxy models which mimic the behavior of the underlying black-box model but present themselves to the user as a glass-box model. Proxy models are approximations of the real model, and they present themselves as succinct human-readable decision rules.

[5] Our implementation of anchors used the anchor-exp library: https://pypi.org/project/anchor-exp/
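Continuing the sketch above, feature importances can be read directly from the trained model, and an anchor rule for an individual student might be generated roughly as follows; the anchor-exp calls reflect our reading of that library's interface, and the class labels are assumptions:

# Hedged sketch: global feature importances plus a local "anchor" rule for one
# student, continuing from the CatBoost model sketched earlier.
from anchor import anchor_tabular

# Global interpretability: relative contribution of each feature to predictions.
print(model.get_feature_importance(prettified=True))

# Local explainability: a short human-readable rule that mimics the black-box
# model's behaviour around one student's feature values.
explainer = anchor_tabular.AnchorTabularExplainer(
    ["low-risk", "high-risk"],      # class names (assumed label order)
    features,                       # feature names
    X_train.to_numpy())             # training data used for perturbations

student_row = X_test.to_numpy()[0]
explanation = explainer.explain_instance(
    student_row,
    lambda rows: model.predict(rows).astype(int).ravel(),
    threshold=0.95)

# e.g. "assignment_rolling_avg <= 45.00 AND engagement_deviation <= -1.20"
print(" AND ".join(explanation.names()))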
In order to realize prescriptive capabilities, we used data-driven counterfactuals (Wachter et al., 2017) to suggest to students how an adjustment in certain behavioral learning patterns would result in a more positive prediction. For example, a counterfactual may suggest to a learner that increasing their next assignment mark by a specific amount would change their classification from high-risk to low-risk. Such data-driven counterfactuals are based on correlation and do not guarantee causal links; however, when features are judiciously selected, some degree of potential causality can in many cases safely be assumed. We used the Python counterfactual library⁶ (Mothilal et al., 2020) to generate the prescriptive analytics on the dashboard. The advantage of this tool is that its outputs are once again in a rule-based format that is easy to comprehend. Additionally, the prescriptive suggestions represent the minimum shift in the values of key features that would need to take place in order to achieve a different outcome from the one currently predicted.

⁶ Dice-ml: https://pypi.org/project/dice-ml/
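A corresponding sketch of the prescriptive step with the dice-ml library is shown below, again assuming the data frames and CatBoost model from the earlier snippets. The listed continuous features are illustrative, and the sklearn backend is assumed to work because CatBoost exposes a scikit-learn-compatible interface; details may need adjusting for the production feature set.

import dice_ml

# DiCE needs the full training frame, including the outcome column.
data = dice_ml.Data(
    dataframe=features.drop(columns=["student_id"]),
    continuous_features=["engagement_deviation", "assignment_rolling_avg", "age"],  # illustrative names
    outcome_name="outcome",
)
dice_model = dice_ml.Model(model=model, backend="sklearn")
explainer = dice_ml.Dice(data, dice_model, method="random")

# Generate up to three minimal counterfactuals for one at-risk student, i.e.
# the smallest feature shifts that would flip the predicted class.
query = X_test.iloc[[0]]
counterfactuals = explainer.generate_counterfactuals(
    query, total_CFs=3, desired_class="opposite"
)
counterfactuals.visualize_as_dataframe(show_only_changes=True)

The resulting changed-feature suggestions are what the dashboard renders as concrete, learner-facing advice.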

Discussion
Our study reveals that learner-facing LADs are steadily gaining popularity (Fig. 2), while
it is reasonable to assume that numerous others may have been deployed but remain
unpublished. While the value of LADs is recognized by education providers, we find from the literature that many of the published dashboards are only in their prototype phase, and only a few have reached the pilot implementation stage. This is also in agreement with findings
from other studies (Chen et al., 2019; Karaoglan Yilmaz & Yilmaz, 2020). A speculative
link could be argued between the low deployment rates of LADs covered in this study

and the underlying technology choices taken, as seen in Table 2. The technologies used in the studies are heavy-duty with respect to the design, development and maintenance of LADs, requiring significant resource investments and agility. As discussed, higher education organizations are at present short on both, which are therefore
possibly contributing factors. Given that the reviewed LADs mostly used only descrip-
tive analytics, it could be argued that off-the-shelf commercial dashboarding software
would have delivered the same functionalities for a vastly reduced effort, with higher
prospects of productionization.
Given the significant resources required to operationalize LADs, our study has revealed that there is a paucity of evidence on their effectiveness in affecting learner outcomes. This is again supported by Fig. 2, which suggests that learner-facing LADs are a relatively new and emerging technology in the LA space, and so comprehensive and conclusive meta-study research into their effects has simply not yet taken place. We also see in Table 2 that most of the past LAD papers involved study cohorts too small to support conclusive findings, with a median of 126 subjects. Larger studies involv-
ing several hundred subjects are emerging, and more will be needed in future in order
to answer this question concretely. Encouragingly, our research did find that about a
quarter of the studies concluded that their LADs produced positive impacts on student
outcomes. However, the number of studies was too small to determine which types of
visualizations or dashboard features directly contributed towards positive impacts on
learner outcomes. Further still, it is unclear if any effects could be attributed to dash-
boards themselves, or to the associated human interventions.
Given the current ubiquity of machine learning, it is a little surprising that predictive modelling has not featured in a larger percentage of the reviewed LADs. A possible hypothesis is that stringent ethics requirements and risk-averse positions taken by Research Ethics Committees may be playing a role. Some of the research using predictive analytics could also be encountering obstacles due to emerging legal requirements that predictive modelling be made completely transparent and interpretable and that the predictions be explainable to those affected, thus raising the barrier to entry considerably for those seeking to leverage machine learning.
Undoubtedly though, data-driven prescriptive analytics represents the next frontier
of LAD development. This is the most sophisticated level of analytics with the ability
to offer learners evidence-based concrete suggestions or recommendations about what
adjustments in learning behaviors would most likely result in positive outcomes (Aljo-
hani et al., 2019; Lu et al., 2018).

Future directions
Our final research question considers future directions of LA dashboarding and inquires
into how some of the existing weaknesses can be addressed. We find that personalization of learning, which could be referred to as “precision learning”, is the future, and there
is a role for LADs in supporting this through embedding of recommendation-like fea-
tures which suggest next steps to learners for maximizing outcomes. In addition, LADs
can take on greater roles in early intervention responses to learners identified as being
at-risk. Integrating automated interventions within the dashboards and evaluating their
effectiveness will be one of the future research directions. The focus on personalized
learning and early interventions is also supported by Gras et al. (2020) and Han et al. (2021), while the need to close the loop and ensure that the insights generated by LA systems and dashboards are actionable, and not just interesting, is emphasized by Baneres et al. (2019) and Chen et al. (2019).
Maisarah et al. (2020) noted in their broad survey of LADs the importance of embed-
ding customization capabilities within dashboards in order to make them user-friendly
and thus promote long-term usage of the dashboards. Future research will focus on
developing technologies that possess these capabilities and are able to seamlessly inte-
grate with native platforms used by institutions for their existing Virtual Learning Envi-
ronments. Leitner et al. (2019) also mention the utility of embedding analytics within the dashboards themselves in order to directly gather information on learners’ usage patterns of the dashboards, so that they can be optimized in subsequent iterations of development.
Lastly, Sedrakyan et al. (2020) go further and ambitiously suggest integrating data from activities in the learning process which may not be directly linked to the institutional learning environments. They propose acquiring data from multi-modal sources, such as biofeedback from various wearable sensors and audio/video streams, and using these to augment LADs. Thus, scalability in processing live data streams originating from wearable sensors would form yet another requirement for future LAD work.

Study limitations
We acknowledge that the search time-window of 2018 to 2021 is constrained, and that the data from 2021 are only partially collected, which constitutes a limitation of this study. Data on the usability of the proposed dashboard and its effects on student outcomes are still being collected. A further limitation of this study is that these data cannot yet be presented, nor can the tool be made publicly available for trial purposes at this point in time due to software licensing constraints.

Conclusion
Learning Analytics dashboards (LADs) are becoming increasingly commonplace within
the educational sector with the aims of improving the quality of the learning experience
and thereby maximizing learner outcomes. Our study focused on identifying challenges
associated with LAD projects as well as analyzing characteristics of recent advances in
LADs. We comprehensively surveyed existing LADs and analyzed them through the
prism of the sophistication of insights they deliver and ways in which they help learn-
ers make informed decisions about making adjustments to their learning habits. Finally,
in considering the strengths and weaknesses of existing LADs, we propose a dashboard
currently being deployed for trials at a tertiary institution that attempts to address some
of the gaps we found in the literature. Our research findings have both theoretical and prac-
tical implications.

Theoretical implications
We have added to the body of knowledge surrounding the known challenges in operationalizing Learning Analytics (LA) projects. We have refined these challenges in the context of LAD projects and identified the lack of agility in higher education institutions as
one of the key pressure points. Our work has confirmed that learner-facing LADs are on the rise within higher education institutions, but significant gaps exist in understanding and quantifying their effectiveness. In particular, uncertainty exists about which components within LADs are more effective at improving learning outcomes. We find that predictive modeling functionalities are not used in the majority of the reviewed LADs, and published examples of interpretable models whose predictions are explained to the learners do not yet exist. Additionally, our study reveals the absence of data-driven prescriptive analytics which, together with the other gaps, highlights numerous worthwhile avenues for future studies to pursue.

Practical implications
A key practical implication of this study is a demonstration of how a sophisticated LAD
can be developed which integrates all forms of analytics: descriptive, predictive and pre-
scriptive. We have demonstrated how interpretability of predictive models can be made
available to the learners and, critically, how the specific predictions for a given learner can be explained to them. This establishes trust with users through the transparency gained by moving beyond black-box predictive models and, in the process, satisfies emerging regulatory requirements. Additionally, we have demonstrated how automated, data-driven prescriptive analytics can be leveraged within LADs. Our research also points analytics practitioners towards recently developed technologies which, more than ever, make these capabilities accessible to a wider audience.

Abbreviations
AA: Academic Analytics; EDM: Educational Data Mining; LA: Learning analytics; LADs: Learning analytics dashboard(s);
RQ(s): Research question(s); VLE: Virtual Learning Environment.

Acknowledgements
Not applicable.

Authors’ contributions
Conceptualization, TS, GR and AM; Methodology, TS and GR; Software, TS and GR; Validation, TS, GR and AM; Formal
analysis, TS; Investigation, TS and GR; Data curation, TS and GR; Writing—original draft preparation, TS; Writing—review
and editing, TS and AM; Visualization, TS and GR; Supervision, TS and AM; Project administration, TS. All authors read and
approved the final manuscript.

Funding
Not applicable.

Availability of data and materials
Not applicable.

Declarations
Competing interests
The authors declare no competing interests.

Received: 23 October 2021 Accepted: 14 December 2021

References
Adadi, A., & Berrada, M. (2018). Peeking inside the black-box: a survey on explainable artificial intelligence (XAI). IEEE
Access, 6, 52138–52160. https://​doi.​org/​10.​1109/​ACCESS.​2018.​28700​52
Aljohani, N. R., Daud, A., Abbasi, R. A., Alowibdi, J. S., Basheri, M., & Aslam, M. A. (2019). An integrated framework for course
adapted student learning analytics dashboard. Computers in Human Behavior. https://​doi.​org/​10.​1016/j.​chb.​2018.​03.​
035
Aristovnik, A., Keržič, D., Ravšelj, D., Tomaževič, N., & Umek, L. (2020). Impacts of the COVID-19 pandemic on life of higher
education students: a global perspective. Sustainability. https://​doi.​org/​10.​3390/​su122​08438
Baneres, D., Guerrero-Roldán, A. E., Rodríguez-González, M. E., & Karadeniz, A. (2021). A predictive analytics infrastructure
to support a trustworthy early warning system. Applied Sciences. https://​doi.​org/​10.​3390/​app11​135781
Baneres, D., Rodriguez, M. E., & Serra, M. (2019). An early feedback prediction system for learners at-risk within a first-year
higher education course. IEEE Transactions on Learning Technologies. https://​doi.​org/​10.​1109/​TLT.​2019.​29121​67
Bera, P. (2016). How colors in business dashboards affect users’ decision making. Communications of the ACM. https://​doi.​
org/​10.​1145/​28189​93
Bergner, Y. (2017). Measurement and its uses in learning analytics. Handbook of Learning Analytics. https://​doi.​org/​10.​
18608/​hla17.​003
Blankenberger, B., & Williams, A. M. (2020). COVID and the impact on higher education: The essential role of integrity and
accountability. Administrative Theory & Praxis, 42(3), 404–423. https://​doi.​org/​10.​1080/​10841​806.​2020.​17719​07
Bodily, R., Ikahihifo, T. K., Mackley, B., & Graham, C. R. (2018). The design, development, and implementation of student-
facing learning analytics dashboards. Journal of Computing in Higher Education, 30(3), 572–598. https://​doi.​org/​10.​
1007/​s12528-​018-​9186-0
Bowker, G. C., & Star, S. L. (2000). Sorting things out: Classification and its consequences. MIT Press.
Bunton, T. E. (2017). Agility within higher education it organizations: a loosely coupled systems perspective. https://​dc.​uwm.​
edu/​etd/​1451
Campbell, J. P., Deblois, P. B., & Oblinger, D. G. (2007). Academic Analytics: a new tool for a new era.
Chatti, M. A., Muslim, A., Guliani, M., & Guesmi, M. (2020). The LAVA model: learning analytics meets visual analytics. https://​
doi.​org/​10.​1007/​978-3-​030-​47392-1_5
Chen, L., Lu, M., Goda, Y., & Yamada, M. (2019). Design of learning analytics dashboard supporting metacognition. 16th
International Conference on Cognition and Exploratory Learning in Digital Age, CELDA 2019, 175–182. https://​doi.​org/​10.​
33965/​celda​2019_​20191​1l022
Dorogush, A. V., Ershov, V., & Gulin, A. (2018). CatBoost: gradient boosting with categorical features support. http://​arxiv.​org/​
abs/​1810.​11363
Ferguson, R., & Clow, D. (2017). Where is the evidence? A call to action for learning analytics. Proceedings of the Seventh
International Learning Analytics & Knowledge Conference, 56–65.
Fleur, D. S., van den Bos, W., & Bredeweg, B. (2020). Learning analytics dashboard for motivation and performance. https://​
doi.​org/​10.​1007/​978-3-​030-​49663-0_​51
Gras, B., Brun, A., & Boyer, A. (2020). For and by student dashboards design to address dropout. https://​hal.​inria.​fr/​hal-​02974​
682
Guzmán-Valenzuela, C., Gómez-González, C., Rojas-Murphy Tagle, A., & Lorca-Vyhmeister, A. (2021). Learning analytics in
higher education: a preponderance of analytics but very little learning? International Journal of Educational Technol-
ogy in Higher Education. https://​doi.​org/​10.​1186/​s41239-​021-​00258-x
Han, J., Kim, K. H., Rhee, W., & Cho, Y. H. (2021). Learning analytics dashboards for adaptive support in face-to-face collabo-
rative argumentation. Computers & Education. https://​doi.​org/​10.​1016/j.​compe​du.​2020.​104041
He, H., Dong, B., Zheng, Q., & Li, G. (2019, May 9). VUC. Proceedings of the ACM Conference on Global Computing Education.
https://​doi.​org/​10.​1145/​33001​15.​33095​14
Karaoglan Yilmaz, F. G., & Yilmaz, R. (2020). Learning analytics as a metacognitive tool to influence learner transactional
distance and motivation in online learning environments. Innovations in Education and Teaching International.
https://​doi.​org/​10.​1080/​14703​297.​2020.​17949​28
Kia, F. S., Teasley, S. D., Hatala, M., Karabenick, S. A., & Kay, M. (2020). How patterns of students dashboard use are related
to their achievement and self-regulatory engagement. Proceedings of the Tenth International Conference on Learning
Analytics & Knowledge. https://​doi.​org/​10.​1145/​33754​62.​33754​72
Kitto, K., & Knight, S. (2019). Practical ethics for building learning analytics. British Journal of Educational Technology.
https://​doi.​org/​10.​1111/​bjet.​12868
Klerkx, J., Verbert, K., & Duval, E. (2017). Learning Analytics Dashboards. In Handbook of Learning Analytics (pp. 143–150).
Society for Learning Analytics Research (SoLAR). https://​doi.​org/​10.​18608/​hla17.​012
Knight, S., Gibson, A., & Shibani, A. (2020). Implementing learning analytics for learning impact: Taking tools to task. Inter-
net and Higher Education, 45, 100729. https://​doi.​org/​10.​1016/j.​iheduc.​2020.​100729
Kokoç, M., & Altun, A. (2021). Effects of learner interaction with learning dashboards on academic performance in an
e-learning environment. Behaviour & Information Technology. https://​doi.​org/​10.​1080/​01449​29X.​2019.​16807​31
Leitner, P., Maier, K., & Ebner, M. (2020). Web analytics as extension for a learning analytics dashboard of a massive open online
platform. https://​doi.​org/​10.​1007/​978-3-​030-​47392-1_​19
Leitner, P., Ebner, M., & Ebner, M. (2019). Learning analytics challenges to overcome in higher education institutions. In
D. Ifenthaler, D. K. Mah, & J. Y. Yau (Eds.), Utilizing learning analytics to support study success (pp. 91–104). Springer
International Publishing. 10.1007/978-3-319-64792-0_6.
Liu, R., & Koedinger, K. R. (2017). Going beyond better data prediction to create explanatory models of educational data.
The Handbook of Learning Analytics. https://​doi.​org/​10.​18608/​hla17.​006
Lu, J., Liu, A., Dong, F., Gu, F., Gama, J., & Zhang, G. (2018). Learning under concept drift: A review. IEEE Transactions on
Knowledge and Data Engineering. https://​doi.​org/​10.​1109/​TKDE.​2018.​28768​57
Mahroeian, H., Daniel, B., & Butson, R. (2017). The perceptions of the meaning and value of analytics in New Zealand
higher education institutions. International Journal of Educational Technology in Higher Education. https://​doi.​org/​10.​
1186/​s41239-​017-​0073-y
Maisarah, N., Khuzairi, S., & Cob, Z. C. (2020). The divergence of learning analytics research. International Journal of
Advanced Science and Technology 29(6s). http://​sersc.​org/​journ​als/​index.​php/​IJAST/​artic​le/​view/​9300
Majumdar, R., Akçapınar, A., Akçapınar, G., Flanagan, B., & Ogata, H. (2019). LAViEW: Learning Analytics Dashboard Towards
Evidence-based Education. In: In Companion Proceedings of the 9th International Conference on Learning Analytics and
Knowledge (pp. 1–6). Society for Learning Analytics Research (SoLAR).
Menon, S., & Suresh, M. (2020). Factors influencing organizational agility in higher education. Benchmarking: an Interna-
tional Journal, 28(1), 307–332. https://​doi.​org/​10.​1108/​BIJ-​04-​2020-​0151
Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2009). Preferred reporting items for systematic reviews and meta-analy-
ses: The PRISMA statement. BMJ (online). https://​doi.​org/​10.​1136/​bmj.​b2535
Mothilal, R. K., Sharma, A., & Tan, C. (2020). Explaining machine learning classifiers through diverse counterfactual expla-
nations. FAT* 2020—Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, 607–617. https://​
doi.​org/​10.​1145/​33510​95.​33728​50
Muhammad, R. N., Tasmin, R., & Nor Aziati, A. H. (2020). Sustainable competitive advantage of big data analytics in higher
education sector: An Overview. Journal of Physics: Conference Series. https://​doi.​org/​10.​1088/​1742-​6596/​1529/4/​
042100
Mukerjee, S. (2014a). Agility: A crucial capability for universities in times of disruptive change and innovation. Australian
Universities’ Review, 56, 56–60.
Mukerjee, S. (2014b). Organizational agility in Universities (pp. 15–25). IGI Global.
Namoun, A., & Alshanqiti, A. (2020). Predicting student performance using data mining and learning analytics techniques:
A systematic literature review. Applied Sciences. https://​doi.​org/​10.​3390/​app11​010237
Naranjo, D. M., Prieto, J. R., Moltó, G., & Calatrava, A. (2019). A visual dashboard to track learning analytics for educational
cloud computing. Sensors. https://​doi.​org/​10.​3390/​s1913​2952
Nguyen, A., Gardner, L., & Sheridan, D. P. (2020). Data analytics in higher education: An integrated view. Journal of Informa-
tion Systems Education, 31(1), 61–71.
Owatari, T., Shimada, A., Minematsu, T., Hori, M., & Taniguchi, R. (2020). Real-time learning analytics dashboard for students
in online classes. International Conference on Teaching, Assessment, and Learning for Engineering (TALE). https://​doi.​
org/​10.​1109/​TALE4​8869.​2020.​93683​40
Rets, I., Herodotou, C., Bayer, V., Hlosta, M., & Rienties, B. (2021). Exploring critical factors of the perceived usefulness of
a learning analytics dashboard for distance university students. International Journal of Educational Technology in
Higher Education. https://​doi.​org/​10.​1186/​s41239-​021-​00284-9
Ribeiro, M. T., Singh, S., & Guestrin, C. (2018). Anchors: high-precision model-agnostic explanations. In: Proceedings of the
AAAI Conference on Artificial Intelligence, 1527–1535. www.​aaai.​org
Rubel, A., & Jones, K. M. L. (2016). Student privacy in learning analytics: An information ethics perspective. The Information
Society, 32(2), 143–159. https://​doi.​org/​10.​1080/​01972​243.​2016.​11305​02
Sedrakyan, G., Malmberg, J., Verbert, K., Järvelä, S., & Kirschner, P. A. (2020). Linking learning behavior analytics and
learning science concepts: Designing a learning analytics dashboard for feedback to support learning regulation.
Computers in Human Behavior. https://​doi.​org/​10.​1016/j.​chb.​2018.​05.​004
Sun, K., Mhaidli, A. H., Watel, S., Brooks, C. A., & Schaub, F. (2019). It’s My Data! Tensions among stakeholders of a learning
analytics dashboard. Poceedings of the CHI 2019 Conference on Human Factors in Computing Systems. https://​doi.​org/​
10.​1145/​32906​05.​33008​24
Tufte, E. (2001). The visual display of quantitative information. Graphics Press.
Ulfa, S., Fattawi, I., Surahman, E., & Yusuke, H. (2019). Investigating learners’ perception of learning analytics dashboard to
improve learning interaction in online learning system. 2019 5th International Onference on Education and Technology
(ICET). https://​doi.​org/​10.​1109/​ICET4​8172.​2019.​89872​29
Umer, R., Susnjak, T., Mathrani, A., & Suriadi, L. (2021). Current stance on predictive analytics in higher education: Oppor-
tunities, challenges and future directions. Interactive Learning Environments. https://​doi.​org/​10.​1080/​10494​820.​2021.​
19335​42
Valle, N., Antonenko, P., Valle, D., Sommer, M., Huggins-Manley, A. C., Dawson, K., Kim, D., & Baiser, B. (2021). Predict or
describe? How learning analytics dashboard design influences motivation and statistics anxiety in an online statis-
tics course. Educational Technology Research and Development. https://​doi.​org/​10.​1007/​s11423-​021-​09998-z
Wachter, S., Mittelstadt, B., & Russell, C. (2017). Counterfactual explanations without opening the black box: Automated
decisions and the GDPR. SSRN J, 31, 841.
Wilson, A., Watson, C., Thompson, T. L., Drew, V., & Doyle, S. (2017). Learning analytics: Challenges and limitations. Teaching
in Higher Education, 22(8), 991–1007. https://​doi.​org/​10.​1080/​13562​517.​2017.​13320​26
Yoo, M., & Jin, S.-H. (2020). International forum of educational technology & society development and evaluation of learn-
ing analytics dashboards to support online discussion activities. Technology & Society, 23(2), 1–18. https://​doi.​org/​10.​
2307/​26921​130

Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
