


Learning analytics for learning design: A systematic literature review of analytics-driven design to enhance learning

Katerina Mangaroska, Michail Giannakos, Senior Member, IEEE

Abstract
As the fields of learning analytics and learning design mature, the convergence and synergies between the two are becoming an important area for research. This paper summarizes the main outcomes of a systematic review of empirical evidence on learning analytics for learning design. Moreover, this paper presents an overview of what and how learning analytics have been used to inform learning design decisions, and in what contexts. The search was performed in seven academic databases, resulting in 43 papers included in the main analysis. The results from the review depict the design patterns and learning phenomena that emerge from the synergy of learning analytics and learning design in current learning technologies. Finally, this review stresses that future research should consider developing a framework for capturing and systematizing learning design data, grounded in learning analytics and learning theory, and for documenting which learning design choices made by educators influence subsequent learning activities and performance over time.

Index Terms - Learning analytics, Learning design, Empirical studies, Systematic literature review.

I. INTRODUCTION

The use of analytics to discover important learning phenomena (e.g. a moment of learning or a misconception) and portray learners' experiences and behaviors is evident and commonly accepted due to the pervasiveness of learning technologies. Learning analytics holds a critical role in understanding human learning, teaching, and education, by identifying and validating relevant measures of processes, outcomes, and activities. In addition, learning analytics supports and promotes evidence-based practices derived from the evaluation and assessment of learners' progress, motivation, attitudes, and satisfaction. However, learning analytics lacks the theoretical orientation that can assist researchers to explain inconsistencies, avoid misinterpretations, and consider and clarify any contextual conditions (e.g. instructional, sociological, psychological) that affect learning [1], [2]. Moreover, Reimann highlights that "atheoretical approaches to learning analytics might produce misconceptions because it is the logical (and ethical) error of using descriptions of the past as prescriptions for the future" [2, p.136]. Consequently, without theoretical grounding of learning analytics and contextual interpretation of the collected data, the design capabilities of learning analytics are limited. From this perspective, learning design is utterly important as it provides the framework for analyzing and interpreting data, learner behavior, and successful or inefficient learning patterns.

Learning design defines the educational objectives and the pedagogical approaches that educators can reflect upon, take decisions on, and make improvements to. In other words, learning design is the "application of methods, resources, and theoretical frameworks to achieve a particular pedagogical goal in a given context" [3, p.88]. Moreover, learning design "documents the sequence of learning tasks and teaching methods" as the main premises for re-usability and transferability of good practices across educational contexts [4, p.3]. Yet, past research has focused on "conceptualizing learning design principles, without evaluating what happens after the design process" [5, p.333]. In addition, several studies have tried to understand and improve learning design experiences by utilizing learning analytics, but only few of them grounded the usage of learning analytics in existing principles and theories from the learning sciences, educational research, technology acceptance, and human-computer interaction [6], [7].

As can be observed from the literature, learning analytics is an interdisciplinary field embracing methods and approaches from various disciplines, but it lacks a consolidated model to systematize how those disciplines are merged together [1]. Moreover, research is missing that measures which learning design decisions stimulate productive learning environments, and which learning analytics generate actionable design insights [1]. To bridge the gap, this paper centers on a systematic literature review that aims to examine the intersection between learning analytics and learning design, and to provide important insights beyond the specific research findings within the individual disciplines. Although the field of learning analytics is still relatively young [8], as also indicated by Google Trends (see Figure 1), enough work has already been done to conduct a review [9], [4]. Thus, the study addresses the following research questions:

RQ1: What is the current status of learning analytics for learning design research, seen through the lens of educational contexts (i.e. users and rationale for use), distribution of pedagogical practices, and methodologies (i.e. types of data and data analysis techniques employed)?

RQ2: What learning analytics have been used to inform learning design decisions, and to what extent do learning analytics have the capacity to support dynamic and data-driven learning design decisions?

The rest of the paper is organized as follows. In the next section the authors present the related work; the third section describes the methodology used for the literature review, describing how the studies were selected and analyzed. The fourth section presents the research findings derived from the data analysis based on the specific areas of focus. Finally, in the last section, the authors discuss the results and identify gaps, while making suggestions for future considerations.

Fig. 1. Search interest in Learning Analytics (blue line) and Learning Design (orange line) according to Google Trends.

II. RELATED WORK

A. Learning analytics

In the last ten years, learning analytics has highlighted the gradual shift from a technological towards an educational perspective, despite its roots in business intelligence, recommender systems, and educational data mining [10]. Its emergence as a separate field is due to the increasing trend of digitization in the field of education, the appearance of distributed learning environments, and the increased engagement in online learning experiences [11]. The practice of learning analytics evolved around the idea of harnessing the power of digital technologies to collect the traces that users leave behind, in order to understand activities and behaviors associated with the user's learning [8]. As a result, learning analytics holds the potential to: 1) explain unexpected learning behaviors, 2) identify successful learning patterns, 3) detect misconceptions and misplaced effort, 4) introduce appropriate interventions, and 5) increase users' awareness of their own actions and progress [11]. Undoubtedly, learning analytics is an interdisciplinary field that embraces a holistic approach to study learning contexts and environments, and to address questions in educational research [12]. As a term it has a generally accepted definition, adopted by the Society for Learning Analytics Research (SoLAR): "Learning analytics is the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" [13].

Previous research [14] supports and promotes evidence-based practices of learning analytics' potential in understanding and optimizing the complexities of the learning process. However, recent studies advocate that learning analytics should not promote "one-size-fits-all" research, due to the fact that students' individual differences have strong implications for learning behavior, motivation, and engagement [15], [16]. Thus, designing and applying personalized learning activities could hold the promise of improving students' performance and learning progress [17]. Along the same lines, Papamitsiou and Economides [9], [18] have systematized the significant contribution of learning analytics empirical research and identified some early indications of how learning analytics and data mining might support personalized and adaptive learning experiences utilizing rich data. Consequently, developing and employing personalized learning and feedback mechanisms that support learners in following and regulating their progress involves more than just easily collected data. It actually tackles the learning design activities, grounded in theory and data, where educators decide how to introduce the analytics and how to frame aspects of their interpretation within a socio-technical system. This follows the idea that "human decision-making and consequent actions are as much a part of any successful analytics solution as its technical components" [19, p.4].

B. Learning design

Learning design is another field that has been associated with online and technology enhanced learning (TEL) research since the late 1990s and 2000s, holding a different theoretical background than the domain of instructional design [3]. The term was coined to replace the already established term instructional design, grounded in behaviorism and cognitivism [20], [21], by embracing educational interventions based on socio-constructivist approaches mediated by technology [3], [22]. In addition, the field of learning design has altered the perception of the educator's role in education, in the way Laurillard proposed: "not to transmit knowledge to a passive recipient, but to structure the learner's engagement with the knowledge, practising the high-level cognitive skills that enable them to make that knowledge their own" [23, p.527].

At present, learning design is very diverse, because the conceptualization of the term is contingent on the observer's choice of perspective [24], [3]. This stems from the discourse among the researchers and practitioners who shape the field of learning design [25]. Nonetheless, learning design must be conceptualized before it can be utilized as a process that leads to explicit and sharable design outputs [26].

Thus, some researchers see learning design as "a form of documentation of pedagogical intent that provides the context to interpret the analytics from the diverse data sets" [4, p.1]. For others, learning design is "a methodology that educators use and communicate with each other to make informed decisions in designing learning activities and interventions with effective use of resources and technologies" [25, p.121]. However, in a more general sense, learning design can be defined as "the description of the teaching-learning process that takes place in a unit of learning (e.g., a course, a lesson or any other designed learning event)" [27, p.14].

Although learning design and instructional design perspectives have a substantial overlap in the literature, learning design emphasizes more the learner's context and the constructivist approach in the learning activities [3]. Thus, learning design can be seen "as a creative process of perpetual educational innovation grounded in well-defined context of practice and pedagogical theory for generating new solutions to various educational challenges" [3, p.93]. On one hand, its aim as a field is to explore educational issues and support educators to make pedagogically grounded decisions in their teaching practices [25]. Consequently, a common language for learning design is needed, so as to streamline the process of constructing, validating, and disseminating design knowledge among the research community. Along the same lines rests the idea behind Persico and Pozzi's call for a multi-dimensional framework that brings together a number of approaches and tools for the design of learning, rather than just schemes and graphical representations [28]. All these ideas could lead towards a positive impact on sharing, discussing, and improving educational processes [22].

On the other hand, its aim is also to support educators to orchestrate all of the activities that learning design encompasses, including the constraints and challenges (e.g. time, attention, cognitive overload) they face in their everyday practice [29], [30]. Orchestration is a research phenomenon that deals with the complexity of learning design activities and the application of technological innovations in education [31]. As a definition, orchestration covers "the process by which teachers and other actors design, manage, adapt and assess learning activities, aligning the resources at their disposal to achieve the maximum learning effect, informed by theory while complying pragmatically with the contextual constraints of the setting" [32]. Furthermore, the field of orchestration research proposes tools and frameworks to conceptualize and adapt the available pedagogical and technological innovations, as a way to achieve improvement in teaching and learning. As such, it cannot be overlooked in the discourse apropos of learning design and learning analytics [33].

C. Learning design taxonomy

The term learning design in this paper refers to the process of designing effective learning experiences with the use of technological innovations and resources. If effective, this process could be shared between educators and reused or adapted. Thus, there are several initiatives to create descriptive frameworks of instructional practices so that teaching approaches are shared among educators [34]. One of those initiatives is the Open University Learning Design Initiative, which categorizes learning design into seven broad learning design activities [34]. Assimilative activities are those in which students attend to information as required by their instructors; Finding and handling information includes learning activities which focus on skills development; Communicative activities encompass all communication between students, or between students and instructors; Productive activities focus on active learning where students build artifacts; Experiential activities support students to apply their knowledge in real-world settings; Interactive/adaptive activities include role-play and problem-based scenarios in simulated experiments; and finally, Assessment activities include all forms of assessment.

Having an understanding of specific issues and phenomena during the implementation of learning design activities in technology-rich educational settings is of utmost importance [31], and at the same time very challenging to address without the support of learning analytics. Moreover, there is no other classification of learning design concepts that has been empirically used to compare, on a large scale, module designs across disciplines in university institutions [5]. Consequently, the authors want to explore what types of learning design activities have been used in the selected studies, and what learning analytics the researchers have applied to see how students' behavior relates to a specific learning activity. In particular, the focus is on well-designed learning design activities that provide a foundation for effective scaffolding of students' learning behavior.

D. Theoretical alignment

The adoption of data-driven approaches in learning analytics emphasizes the power of data science methods with unprecedented amounts of data collected from students and teachers in distributed learning environments [35]. However, data-driven approaches were later recognized as not sufficiently informative [36]. Furthermore, as highlighted by Gašević et al. [36], one of the most important tasks of learning analytics is the "development of measures that can increase the understanding into the learning processes and interpret those measures to inform existing theories for the purpose of developing actionable initiatives in teaching practices and design of learning environments". Consequently, theory orientation in learning analytics is essential. It helps to identify meaningful patterns and associations between digital traces and learning outcomes [37], [1]; to decide what questions to research to improve TEL [38]; what methods and analysis to select [39]; and how to interpret the outcomes to produce actionable insights for various stakeholders [36]. In the existing literature, there is a reference model that identifies four critical learning analytics dimensions: what (data is gathered, managed, and analyzed), who (is the target audience), why (data is gathered and analyzed), and how (data will be analyzed), which need to be considered when designing learning activities [40]. Similarly to Chatti et al. [40], Greller and Drachsler [41] identified six critical dimensions of learning analytics that need to be covered by the design to ensure use of learning analytics in an "educationally beneficial way". Another conceptual four-dimensional framework, proposed by Martinez et al. [42], provides guidelines on how to design learning analytics technologies that address orchestration challenges utilizing data from interactive surfaces. Finally, a new conceptual framework (Orchestrating Learning Analytics - OrLA) has been proposed to overcome the gap in the adoption of learning analytics innovations by supporting inter-stakeholder dialogue at the practitioner level [33].

Despite the widely accepted and used term "learning analytics", the reference to "learning" is still young, as learning analytics only recently began to make connections with learning theories [10], [15].

This deficiency leads to another current issue: misalignment between the information generated by learning analytics and the needs, problems, and concerns teachers have with learning design activities. The reason for this misalignment can also be found in the gap between data easily captured from system logs and data that is pedagogically valuable. One possible solution to overcome the disconnection between research and everyday pedagogical practice is the development of a common framework that will systematize the process of establishing effective solutions that apply theory-grounded learning analytics to open educational issues. The advantage of developing a common framework should be seen in establishing understanding, validity, reliability, and direct support through clear guidance on the types of analytics and tools essential for particular learning contexts. For example, we have Persico and Pozzi's [28] framework of representations, approaches, and tools from which teachers' training in learning design can draw, combined with hands-on experience. Next is the proposed conceptual framework that links learning analytics to learning design with the aim to support enquiry-based evaluation and scaffolding of learning designs [43]. Furthermore, the concept of orchestrating learning analytics (OrLA) aims to provide a conceptual framework and guidelines that support teachers' complex activities utilizing learning analytics in authentic educational practices [33]. Another possible solution to overcome the disconnection between research and everyday pedagogical practice is utilizing effective learning techniques, grounded in theory, to help students achieve their learning goals, which can later be empirically validated and modeled to more directly guide behavior [44].

Although learning analytics is receiving close attention in the TEL community, there are issues that the field is struggling to answer. This is due to the lack of theoretical grounding in interdisciplinary approaches, such as educational research, learning sciences, psychology, human-computer interaction, data mining, and research methods [1]. Therefore, Gašević et al. [1] proposed a consolidated model of how theory, design, and data mutually interact and inform decisions related to practice, privacy, ethics, policy, and standards. This model incorporates theory to identify which associations between learning analytics and learning outcomes are meaningful and, as such, insert them into analytical models; design grounded in learning theory and tailored to activate particular learning mechanisms; and data to identify indicators and measures in learning analytics that go far beyond simple counts of click-stream data.

E. Learning analytics for learning design

As the field of learning analytics matures, its convergence and synergy with the field of learning design becomes an important area for research. The alignment between learning analytics and learning design derives from the possibility to:
1) utilize learning analytics to "facilitate the drive from tacit educational practice to explicit" [34],
2) utilize learning design in a pedagogical context to translate the learning analytics' findings into meaningful information [4], [28], [45].

Rienties et al. [34] presented a review study of ten years of research at the Open University UK on aligning learning design with learning analytics, underlining the importance of learning design in learning experiences and teaching practices. Moreover, they also emphasized that "learning design focuses on what students do as part of their learning, rather than on the content that is delivered by the teacher" [46]. However, there is a paucity of evidence on how learners respond to different learning designs, which hinders researchers from exploring which pedagogies and conceptualizations work best [34]. Furthermore, several studies highlighted and acknowledged the need to align both approaches with a conceptual framework that will facilitate further maturation of the fields [36], [6], [47], [48], [49]. Having a generally accepted framework might help researchers to understand how specific design elements (i.e. design of learning tasks) influence students' behaviors, engagement, and learning, while at the same time discovering how students engage and learn within an authentic pedagogical and technological context. In addition, Lockyer and Dawson's [50] work complements these studies by demonstrating the evaluative potential of learning analytics to inform pedagogical action and, accordingly, improvements in the learning design. In their later research, they highlighted the importance of interpreting learners' behaviors and reactions, as well as developing a conceptual model for educators' use of learning analytics when developing learning design strategies [4]. Additionally, Persico and Pozzi [28] collocated much of the work done in the fields of learning analytics and learning design, and highlighted the issues of using a multitude of different approaches, tools, and representations that are not interoperable and, as such, present various epistemological issues. Finally, aligning both approaches with a conceptual framework could also increase the communication among the various stakeholders about the adoption of learning analytics at the practitioner level [33].

Although learning analytics and learning design share common goals, their alignment and convergence is still limited. To address this issue, the research community needs to reach out in both directions: learning analytics needs to consider educational research and theory in the design of the analytics, while learning design needs to utilize data mining and information contextualization before designing for analytics use [1].

Consequently, for the purpose of this review study and the interpretation of the results based on the findings from the wider literature, the authors outline a definition of learning analytics for learning design in the context of the proposed research questions, as: "usage of learner- and educator-produced data to discover behavior patterns that are of core interest to both groups, for the purpose of devising explicit, sharable, and reusable learning designs, practices, resources, and tools, aimed at achieving educational goals in a given learning context". Considering this definition, the current review study investigates what empirically-based learning analytics are employed to inform actionable design decisions, using what types of data and analysis methods, as well as reporting the influence of those analytics-driven design decisions on learning and teaching.

III. METHODOLOGY

To answer the research questions, the authors decided to conduct a systematic review of the literature, following a transparent procedure adopted in the field of computer science in order to minimize potential researcher biases and support reproducibility [51].

A. Systematic review planning

To the authors' knowledge, no previous work has aimed at producing a systematic and comprehensive overview of the existing empirical work on the convergence and synergy between learning analytics and learning design. Thus, the aim of this paper is to systematize and summarize the empirical work in the fields over time, and aggregate the insights from the review. The comprehensive review provided in this paper could help different stakeholders (especially instructional designers and TEL researchers) to understand what has already been explored, implemented, and validated at the intersection of learning analytics and learning design. In particular, the authors aim to investigate the current status of learning analytics for learning design; classify what learning analytics indicators have been used to inform learning design decisions; and offer a synthesis of the existing approaches towards the alignment of learning design and learning analytics.

Search strategies. To find primary studies relevant for the review study, the authors decided to include only empirical peer-reviewed work as a standard for the quality of the selected studies. The peer-reviewed papers need to be published in one of the five main academic electronic databases in Technology Enhanced Learning (TEL): ACM DL, IEEE Xplore, SpringerLink, Science Direct, and Wiley, or in two additional databases, SAGE and ERIC. The second cycle included an independent search in key educational technology journals listed in the Google metrics sub-category Educational Technology, i.e. Computers & Education, British Journal of Educational Technology (BJET), The Internet and Higher Education, Journal of Educational Technology & Society, Journal of Computer Assisted Learning, Educational Technology Research and Development, International Journal of Computer-Supported Collaborative Learning, IEEE Transactions on Learning Technologies, and the International Conference on Learning Analytics and Knowledge (LAK). Moreover, a search in Google Scholar for potentially relevant literature that is not normally indexed in the most common academic databases (e.g., ICLS, EDM, CSCL, etc.) was also performed. The third and final cycle included a search in the reference section of each selected paper in order to find additional relevant papers (i.e. the snowball technique).

Selection criteria. The primary studies retrieved from the databases or the educational and technology journals need to be filtered using different sets of criteria. Initially, the authors will consider four inclusion and four exclusion criteria to select papers to be further analyzed in the review study, as shown in Table I. Next, from the initial selection of papers, the authors will continue the selection process according to another set of eight quality criteria, shown in Table II. These criteria were informed by the Critical Appraisal Skills Programme (CASP) [52], [53] and by principles of good practice for conducting empirical research in software engineering [54]. As Dybå and Dingsøyr [52] specified, the quality criteria need to cover three main issues (i.e. rigour, credibility, and relevance) that need to be considered when evaluating the quality of the selected studies. Finally, the retrieved papers can be duplicates, or overlapping or extended versions from the same authors. In such cases, duplicate papers will be immediately discarded, overlapping papers will be integrated and treated as one paper, while for extended papers the extended publication will always be selected due to the additional details it provides.

TABLE I. INCLUSION/EXCLUSION (I/E) CRITERIA

Inclusion criteria:
1. The research addresses educational practices.
2. The research is an empirical study.
3. The research explores the role of data analytics in supporting learning design activities.
4. The research target audience is students and/or teachers.

Exclusion criteria:
1. No abstract.
2. The paper was written before 2010.
3. The paper is not written in English.
4. Not a research/peer-reviewed paper (e.g. editorial, workshop, expert opinion, work-in-progress).

B. Search string construction

The search string used during the search covers three main terms (analytics, design, and learning) which have to appear in the potentially relevant primary studies. The combination of the three main terms should capture a large share of potential research at the intersection between learning analytics and learning design. The terms analytics and design are the main topics of the study. However, the authors are only interested in how these two terms are used in the field of education; thus the third term, learning, was added. The search string used is: "analytics" AND "design" AND "learning". Due to the high number of irrelevant papers (i.e., false positives) returned by this search string, the authors decided to narrow the search by combining the three words into ("learning analytics" AND "design") and ("analytics" AND "learning design"). In addition to the three main terms, the authors decided to add one more term, orchestration, as shown in Figure 2, a term already embraced by educational researchers to explain the practical issues and tasks that are "not directly linked with learning but can shape learning", making it relevant for utilizing learning analytics in learning settings [55]. The authors added this fourth term to capture potential literature that uses the expression orchestration to refer to the complexity of learning design activities not only in the classroom, but also in online or blended learning scenarios, which otherwise might have been omitted. The additional search string used is: "analytics" AND "design" AND "learning" AND "orchestration". Consequently, the authors decided to use the following search strings:
1) "learning analytics" AND "design"
2) "analytics" AND "learning design"
3) "analytics" AND "design" AND "learning" AND "orchestration"
customized to the specific syntax of each database.

Fig. 2. Search query based on a combination of four words.

C. Systematic review execution

The first step after constructing the search string was the execution of the search queries in the selected databases and journals, from mid-October to mid-December 2016. One of the researchers searched the titles, abstracts, and keywords of the articles in the included electronic databases and the educational and technology journals. A temporal filter was applied, since learning analytics is a relatively new field that emerged in 2010. This search strategy resulted in a total of 3251 "hits" that included 2347 distinct papers, as shown in Table III.

TABLE III. SEARCH RESULTS BY SOURCE
Source | Raw results | I/E criteria
SpringerLink | 473 | 39
Wiley | 258 | 35
ACM Digital Library | 470 | 40
IEEE Xplore | 452 | 65
Science Direct | 306 | 68
SAGE | 108 | 8
ERIC | 280 | 33
Total | 2347 | 288

In the second step, both researchers (i.e. the authors of this paper) went through the titles, abstracts, metadata, and keywords of all studies that resulted from step one, to determine their relevance for the systematic review. At this stage, the researchers excluded studies that were not about educational practices or had nothing to do with learning and teaching. For example, the search returned papers about music due to the inclusion of the term orchestration. In this step, the researchers followed the four inclusion and four exclusion criteria mentioned in Table I. Moreover, at this stage, the researchers faced two issues. First, there were cases in which some authors used witty titles that could mislead about the actual content of the paper. Second, some abstracts were missing, poor, or misleading. Therefore, at this stage, the researchers scanned the full text of those studies, looking at the methodology section and the reported findings. This step returned 288 papers, as shown in Table III. In the third step, each of the 288 studies was assessed independently by both authors, and critically appraised according to the eight criteria shown in Table II. These criteria were informed by CASP and adapted for the purpose of this study, following the Quality Assessment form used in a systematic review of empirical studies of agile software development [52]. Each of the eight criteria was graded on a "yes" or "no" scale. Thus, this step returned 38 papers for which we could say with confidence that the selected studies could make a valuable contribution to this review.

TABLE II. QUALITY CRITERIA
1. Does the study clearly address the research problem?
2. Is there a clear statement of the aims of the research?
3. Is there an adequate description of the context in which the research was carried out?
4. Was the research design appropriate to address the aims of the research?
5. Does the study clearly determine the research methods (subjects, instruments, data collection, data analysis)?
6. Was the data analysis sufficiently rigorous?
7. Is there a clear statement of findings?
8. Is the study of value for research or practice?

Since almost all of the key educational technology journals mentioned in the systematic review planning section are included in the selected databases, the second search cycle (i.e. the independent search in key educational and technology journals) returned no additional papers that needed to be included. The third and final cycle, a search in the reference section of each of the selected 38 papers, also returned no new papers to be included in the systematic review analysis.

As of June 2018, the authors performed an additional search (i.e., for 2017), following the same steps, for papers published after the initial search period (i.e. 2010-2016). The additional search returned 5 papers.

In conclusion, the search process uncovered a total of 43 papers that were read in their entirety, coded, and critically assessed within the review context of this systematic study. The joint probability of agreement measure was 80%, meaning that 80% of the time the two researchers had an overall agreement during the selection stages. The two researchers resolved any disagreements in consensus meetings. A summary of the systematic review execution process is shown in Figure 3.

Fig. 3. Summary of the systematic review execution process.
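The joint probability of agreement mentioned above is simply the share of screening decisions on which both reviewers made the same call. A minimal sketch, with invented include/exclude decision lists rather than the review's actual data:

    # Minimal sketch of the joint probability of agreement; the two
    # include/exclude decision lists are invented examples.
    def joint_agreement(rater_a, rater_b):
        """Fraction of papers on which both raters made the same call."""
        assert len(rater_a) == len(rater_b)
        matches = sum(a == b for a, b in zip(rater_a, rater_b))
        return matches / len(rater_a)

    # 8 of 10 decisions coincide -> 0.8, i.e. 80% agreement.
    a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
    b = [1, 0, 0, 1, 0, 1, 1, 1, 1, 0]
    print(joint_agreement(a, b))  # 0.8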
D. Data coding

During the coding process, the authors extracted data for more than 15 variables. However, a consensus was reached based on the most important variables that could direct unbiased and ethical analysis of the selected papers, with the final aim of answering the research questions.

Thus, the focus areas used for the analysis are: 1) the research design, employing the classification presented in [56], and the topic of the study; 2) the educational context in which the study took place (1), as proposed by [9], and the learning scenario; 3) the sample size and the unit of analysis in the study; 4) the pedagogical practices and goals that have been adopted and used (e.g., collaborative learning, self-regulation); 5) the type of learning platform; 6) the technology and tools used by the subjects during the study (e.g., digital: social media tools; or traditional: mouse, keyboards, pen and paper); 7) the data sources and data collection instruments (e.g., interviews, surveys); 8) the methodology and data analysis techniques; 9) the research objectives reported in the studies (e.g., behavior modelling, student assessment); and 10) the impact of learning analytics on subjects' behavior and learning performance.

(1) VLEs/LMSs: controlled environments used for gathering learner and activity data; MOOC/social learning: informal, social learning settings; Web-based education: web-based e-learning environments other than VLEs, LMSs and MOOCs; Cognitive tutors: special software utilized for the needs of the study; Computer-based education: other environments that include some type of computer technology (e.g. desktop applications) and do not belong to one of the other categories; Multimodality: learner data in different modalities; Mobility: mobile devices used as the primary learning mediator.

Finally, the authors also strove to understand whether the studies integrated Campbell and Oblinger's five-step model of learning analytics [57], or in other words, whether the studies managed to close the learning analytics loop effectively [58]. Based on the categories and subcategories defined (see Table IV), the two researchers coded all the papers and resolved any potential differences. After the coding of the papers, a descriptive analysis of the results was performed to explore the current status of learning analytics for learning design, and classify what learning analytics indicators have been used to inform learning design decisions. The selected variables, with a description and scoring criteria, are reported in Table IV. The results of the coding process are reported in Appendix A.

E. Categorization scheme

In order to provide a more holistic view of the current status of learning analytics for learning design, the authors decided to classify what learning analytics metrics have been used, referring to and applying a categorization scheme proposed by [60]. This way, the descriptive analysis is complemented with a categorization of learning analytics according to five perspectives and six data sources. The perspective category includes: 1) individual student (i.e., indicators dedicated to individual student activities; e.g., receptive activities vs active participation indicators), 2) group (i.e., indicators related to a group of students), 3) course (i.e., indicators for monitoring and analyzing the overall course data), 4) content (i.e., indicators that present students' interactions with the learning content), and 5) teacher (i.e., indicators about the teacher's actions and activities). The data source category includes: 1) student-generated data, 2) context/local data (i.e., data that surround the student, such as local or mobile data), 3) academic profile (e.g., demographic data, data about past performances), 4) evaluation data, 5) course-related performance, and 6) course meta-data (i.e., data regarding the course structure, goals, resources).

F. Towards a learning analytics for learning design taxonomy

In order to comprehensively study learning analytics metrics and learning design decisions, researchers need to find a systematic way to organize, describe, and communicate the research findings using the ontology of the domain (i.e. specifications of conceptualizations) [61]. Thus, selecting and labeling instances under study, and classifying those instances in terms of similarities and differences, leads towards hierarchical classifications of entities within a specific domain [62]. A good taxonomy should separate the entities "into mutually exclusive, unambiguous groups and subgroups that, taken together, include all possibilities" [62, p.52].

Consequently, the authors want to propose a conceptual model towards a learning analytics for learning design taxonomy, deriving the classification from existing research and from the review study. The authors will use the already established learning design taxonomy proposed by Rienties et al. [34] without any alterations, and build upon Campbell and Oblinger's [57] five-step model of learning analytics: capture, report, predict, act, and refine. The mapping between Campbell and Oblinger's five-step learning analytics model and the results from the review study is the following:
1) capture incorporates the different data collection methods commonly used in the selected studies to gather user data;
2) report refers to the techniques researchers used to report the analytics back to the users;
3) predict covers the purposes for which predictive modelling was used;
4) act covers the actions researchers applied;
5) refine refers to the interventions and the re-design of learning activities reported in the selected studies.

The conceptual model of learning analytics for learning design means that a taxonomy of learning analytics metrics must be related to the taxonomy of learning design activities, in order to classify what types of metrics were used for what learning design activities, and what the outcome was in the defined context. This could add more precision and common understanding at the intersection of learning analytics and learning design research. However, establishing selection criteria for extracting pedagogically valuable learning analytics for the development of a taxonomy for research purposes is a challenging task. Therefore, the authors will only propose a conceptual model towards a learning analytics for learning design taxonomy, which will serve as a springboard for further research.
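To make the proposed relation between the two taxonomies concrete, the sketch below encodes the seven learning design activity types and the five model stages named above, plus a record type linking one metric to both; the example record is hypothetical, not drawn from the reviewed studies.

    # Illustrative sketch of the proposed conceptual model: each extracted
    # learning analytics metric is related both to a learning design activity
    # type (Rienties et al.'s taxonomy) and to a stage of Campbell and
    # Oblinger's five-step model. The example record is invented.
    from dataclasses import dataclass
    from enum import Enum

    class DesignActivity(Enum):
        ASSIMILATIVE = "assimilative"
        FINDING_HANDLING_INFO = "finding and handling information"
        COMMUNICATIVE = "communicative"
        PRODUCTIVE = "productive"
        EXPERIENTIAL = "experiential"
        INTERACTIVE_ADAPTIVE = "interactive/adaptive"
        ASSESSMENT = "assessment"

    class AnalyticsStage(Enum):
        CAPTURE = "capture"   # data collection methods
        REPORT = "report"     # feeding analytics back to users
        PREDICT = "predict"   # predictive modelling
        ACT = "act"           # actions applied by researchers
        REFINE = "refine"     # interventions and re-design

    @dataclass
    class MetricRecord:
        metric: str
        activity: DesignActivity
        stage: AnalyticsStage
        outcome: str  # outcome reported in the study's context

    # Hypothetical example of one coded metric:
    example = MetricRecord(
        metric="forum posts per student per week",
        activity=DesignActivity.COMMUNICATIVE,
        stage=AnalyticsStage.REPORT,
        outcome="instructor adjusted discussion prompts mid-course",
    )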

TABLE IV. CODING SCHEMA FOR THE SELECTED RESEARCH PAPERS

Variable | Description | Scoring criteria
Category [56] | What is the design of the study? | Exp - Experiment; CaseSt - Case study; SDA - Secondary Data Analysis; Etno - Ethnography
Research topic | What is the domain subject? | STEM; CS - Computer Science; SS - Social Sciences; AH - Arts & Humanities
Learning environment [9] | What is the setting of the learning environment? | VLEs/LMSs - Virtual Learning Environments / Learning Management Systems; MOOC / social learning; WBE - Web-based education; CT - Cognitive Tutors; CBE - Computer-based education; MM - Multimodality; Mob - Mobility
Learning scenario | Is it intentional and structured learning provided by an educational institution? | F - Formal; NF - Non-formal; IF - Informal
Population | Sample population | MS - Middle School students; HS - High School students; UG - Undergraduate students; G - Graduate students; E - Educators (e.g., teachers, instructors); R - Researchers
Sample size | Size of sample population | Report the actual number of subjects or leave it blank
Unit of analysis | What is the entity that is analyzed in the study? | I - Individual; T - Team (or group); C - Course
Pedagogical approach [59] | What pedagogical approach is adopted? | PBL - Problem-based learning; SRL - Self-regulated learning; IBL - Inquiry-based learning; GBL - Game-based learning; CSCL - Computer Supported Collaborative Learning; Constructivism
Learning platform | What type of learning management system is used? | Moodle; Blackboard; Other - write down if reported
Technology and tools | What type of tools and technology are being used by the subjects in the study? | Write the reported technologies and tools or leave it blank. See Appendix A.
Data collection [56] | Type of data source/collection methods | Write the reported data collection methods or leave it blank. See Appendix A.
Methodology | Type of methodology used | Qual - Qualitative; Quant - Quantitative; MMs - Mixed methods
Data analysis [56] | What type of data analysis methods have been used? | DS - Descriptive statistics; IS - Inferential statistics: P - Parametric, NP - Non-parametric
Research objective | What has been examined in the study? | Write down if the authors have reported any research objectives or leave it blank. See Appendix A.
Behavior | What was the impact of learning analytics on subject's behavior? | Write down if the authors have reported it or leave it blank. See Appendix A.
Performance | What was the impact of learning analytics on learning performance? | Write down if the authors have reported it or leave it blank. See Appendix A.

IV. FINDINGS

The analysis of the studies was performed using non-statistical methods, considering the variables reported in Table IV. Before continuing with the reporting of the findings, it should be noted that most of the studies had more than one sample population, used more than one data analysis technique, or reported more than one research objective, especially studies that outline two or more case studies. Thus, the figures below are aggregated numbers of studies that reported such data. The following findings give an answer to the first research question.

Publication and study design. In regard to the journal/conference of publication, most of the studies are published in one acknowledged peer-reviewed journal, BJET, and one acknowledged peer-reviewed conference, LAK, as shown in Table V. The high number of published papers in BJET was due to a special issue named "Teacher-led Inquiry and Learning Design" in 2015. The published work shows an expanding interest in exploring the intersection between learning analytics and learning design in the last two years. Although it is still in its infancy, one can see that it receives recognition in the research community. Moreover, the increasing trend by year shown in Fig. 4 also indicates the expanding interest in exploring the domain of learning analytics for learning design.

When it comes to the distribution of the selected studies according to the adopted research strategy [56], the majority of the papers were case studies (n = 36 studies), followed by experiments (n = 5 studies), ethnography (n = 1 study), and secondary data analysis (n = 1 study). Regarding the research topic (as reported in the selected studies), the dominant subjects come from computer science (n = 12 studies), of which 6 studies were in programming courses; followed by STEM (n = 8 studies), of which 3 studies concentrated on mathematics and statistics; social sciences (n = 8 studies); and arts and humanities (n = 3 studies).
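Because one study can carry several codes (e.g., two sample populations or multiple research objectives), per-category counts in this section sum to more than the 43 reviewed papers. A minimal sketch of such multi-label tallying, with invented study codes (using the Category abbreviations from Table IV):

    # Minimal sketch of multi-label tallying: a study contributes to every
    # category it was coded with, so counts sum to more than the number of
    # papers. The coded records below are invented examples.
    from collections import Counter

    study_strategies = [
        ["CaseSt"], ["CaseSt", "Exp"], ["Exp"], ["Etno"], ["CaseSt"],
    ]

    counts = Counter(code for codes in study_strategies for code in codes)
    print(counts)  # Counter({'CaseSt': 3, 'Exp': 2, 'Etno': 1}) - 6 codes from 5 studies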

TABLE V. DISTRIBUTION OF PUBLISHED WORK BY JOURNAL/CONFERENCE
Journal/Conference | Num. of studies
British Journal of Educational Technology | 6
Computers and Education | 1
Computers in Human Behavior | 4
The Internet and Higher Education | 2
Entertainment Computing | 1
American Behavioral Scientist | 1
LAK | 8
IEEE Transactions on Learning Technologies | 3
International Journal of CSCL | 1
Journal of Learning Analytics | 1
Journal of Computer Assisted Learning | 2
Other | 13

Fig. 4. Distribution of published work by type and number of publications per year.

Sample population and unit of analysis. The predominant sample populations in the selected papers consisted of undergraduate students (n = 19 studies) and educators (n = 19 studies), as shown in Table VI. Some studies reported only students as a sample population (n = 13 studies) without referring to a specific category, and only two studies included PhD students as a sample population.

TABLE VI. DISTRIBUTION OF PUBLISHED WORK PER SAMPLE POPULATION
Sample population | Num. of studies
Middle school students | 2
High school students | 6
Undergraduate students | 19
Graduate students | 5
Educators (teachers, instructors, teaching assistants) | 19

Almost all of the studies reported the sample size, which ranged from 4 to 111,256 learners. Substantial sample sizes were reported only by studies that examined large-scale MOOCs. Therefore, the researchers decided to calculate the median and the mode of the samples (not considering the sample sizes reported by MOOCs). As a result, for learners (including middle school, high school, undergraduate and graduate students) the median is 40 and the mode is 15. For educators, the median is 7 and the mode is 12.
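As a note on the summary statistics above, the median and mode are far less sensitive than the mean to the extreme MOOC sample sizes, which is presumably why MOOC samples were excluded and these two measures reported. A minimal sketch with invented sample sizes and an assumed MOOC cutoff:

    # Minimal sketch of the median/mode summary; the sample sizes are
    # invented for illustration, and the 10,000 MOOC cutoff is assumed.
    from statistics import median, mode

    learner_samples = [15, 15, 28, 40, 52, 90, 111256]  # last entry: a MOOC study
    non_mooc = [n for n in learner_samples if n < 10_000]
    print(median(non_mooc), mode(non_mooc))  # 34.0 15 - robust to the outlier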
On the other hand, the unit of analysis is a very important indicator which defines what type of data should be collected and from whom [56]. The largest number of papers (n = 12 studies) reported the group as their unit of analysis, followed by individuals (n = 9 studies) and pairs (n = 1 study). Only one study reported the course as a unit of analysis, to examine student learning behavior without ignoring the instructor's role or the interaction between the students and the instructor [63].

Setting of the learning environment. Based on the learning settings, most studies were conducted within VLEs/LMSs (n = 23 studies). Some studies (n = 5 studies) used a combination of VLE and WBE. The rest of the studies used WBE (n = 13 studies), CBE (n = 1 study), multimodality (n = 9 studies), and mobile devices (n = 1 study). Moreover, the studies that were conducted within LMSs and that reported the type of learning platform have shown that Moodle was the most used learning platform (n = 6 studies), followed by Blackboard (n = 2 studies) and other social learning platforms (n = 8 studies) such as EdX [64], Khan Academy [65], Coursera [66], Elgg [67], THEOL [63], ed-Venture [68], a video-based social platform [69], and a virtual world platform [70]. Furthermore, with regard to the setting of the learning environment, some studies were conducted in purely digital learning environments (n = 15 studies), some in blended learning environments (n = 11 studies), and some in face-to-face learning environments (n = 10 studies), where students collaborated face-to-face using digital tools [71] or multi-surface tabletops [42], or used mobile devices as a learning mediator [72].

Another important piece of information about the learning context was the pedagogical approach that was used. The majority of the papers (n = 26) did not report the use of a specific pedagogical approach, but the results for those that did are shown in Fig. 5.

Fig. 5. Distribution of published work by pedagogical approach.

Within the setting of the learning environment, the authors also tried to systematically categorize the technology and tools used by the subjects during the studies. The most used technologies and tools are reported in Table VII.

TABLE VII. T ECHNOLOGY AND TOOLS USED IN THE PUBLISHED TABLE IX. D ISTRIBUTION OF PUBLISHED WORK PER DATA ANALYSIS
STUDIES TECHNIQUE

Technology and tools Num. of studies Data analysis techniques Num. of studies
Software suits and web-based applications 8 Inferential statistics 24
Web 2.0 tools Parametric statistics 16
6
(wikis, chats, blogs, skype, social media tools, google, apps) Non-parametric statistics 4
Dashboards and visualization tools 5 Descriptive statistics 15
Kinetic sensors, EEG, eye-tracking devices 4 Content analysis 5
LA tools 3 Discourse analysis 3
Mobile phones/ iPad 2 Thematic analysis 3
Tabletops 2 Semantic analysis 1
Traditional tools Dispositional analysis 1
2
(whiteboard, pen & paper, laptop, mouse, keyboard)

TABLE VIII. DISTRIBUTION OF PUBLISHED WORK PER DATA COLLECTION METHODS

Data collection methods                                          Num. of studies
System logs (e.g. mobile logs, external 2.0 tools/apps)          19
LMS logs (e.g. course content, assessment, quiz, grades)         19
Surveys/Questionnaires                                           16
Documentation (e.g. notes, diary, peer evaluation, e-mail)       13
Interviews                                                       10
Artifacts                                                        7
Observations                                                     6
Video/audio recordings                                           6
Discussion boards                                                5
Multimodal (e.g. EEG headset, kinetic sensor, eye-tracking)      4
Workshops/meetings                                               2
Data set originally collected for a different research purpose   1

Methodology and data analysis techniques. Regarding the type of methodology, the authors alluded to the type of methods used in the studies. The majority of the studies used quantitative analysis (n = 21 studies), followed by mixed-methods analysis (n = 14 studies) and qualitative analysis (n = 5 studies). The findings show that quantitative analysis is still the dominant methodology in learning analytics research. This shows that most of the analysis is done using learners' data from the LMSs or data gathered using a software suite or web-based application. Considering the data collection methods used in the studies, it can be concluded that various methods have been used; the most practiced data collection methods are presented in Table VIII.

With respect to data analysis, learning analytics adopts a wide range of techniques from statistics, data mining, text analysis, and social network analysis. For the purpose of this study, the authors decided to follow the general classification presented in [56], based on the qualitative and quantitative methods used. Thus, the authors report the results as descriptive or inferential statistics, of which the inferential statistics can be further divided into parametric and non-parametric statistics. The most used techniques are regression (either linear or multiple), bivariate analysis (i.e., correlation), and cluster analysis. Table IX displays the classification of the most used data analysis techniques in the selected papers.
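To make these techniques concrete, the following minimal sketch (in Python, using the pandas and scikit-learn libraries; the feature names and synthetic values are illustrative and not drawn from any of the reviewed studies) shows how correlation, multiple regression, and cluster analysis are typically applied to aggregated learner features:

import pandas as pd
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

# Illustrative aggregated features, one row per learner (synthetic values).
df = pd.DataFrame({
    "logins":         [12, 40, 7, 55, 23, 31, 4, 48],
    "minutes_online": [300, 950, 120, 1400, 610, 800, 90, 1200],
    "forum_posts":    [2, 15, 0, 22, 6, 9, 1, 18],
    "final_grade":    [55, 78, 41, 88, 64, 72, 38, 85],
})

# Bivariate analysis: correlation of each activity indicator with performance.
print(df.corr()["final_grade"])

# Multiple linear regression: grade modelled from the activity indicators.
X, y = df[["logins", "minutes_online", "forum_posts"]], df["final_grade"]
model = LinearRegression().fit(X, y)
print(dict(zip(X.columns, model.coef_)), round(model.score(X, y), 2))

# Cluster analysis: group learners into engagement profiles.
scaled = StandardScaler().fit_transform(X)
df["profile"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
print(df[["logins", "forum_posts", "profile"]])

In the reviewed studies such features would be derived from LMS logs rather than typed in by hand, but the analytic steps remain the same.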
Research objective, behavior and performance. The selected papers mostly focused on the following research objectives:
• Design and management of learning scenarios/activities: examples cover work that underlines the design of learning activities, work that evaluates the effects of teaching practices, the re-design of learning activities, or more efficient management of learning scenarios (n = 23 studies);
• Student's learning behavior and engagement: examples include work that covers monitoring or evaluating student's learning behavior/patterns and engagement within the learning environment (n = 20 studies);
• Usefulness of LA tools: e.g., what are the benefits of using LA tools or how users perceive LA tools (n = 12 studies);
• Teacher's professional development: examples include work that focuses on increasing teacher's awareness or approaches that improve teacher's skills by incorporating new teaching techniques (n = 7 studies);
• Improved orchestration: e.g., enactment-time adaptation of available pedagogical and technological resources in classrooms, or learners' support in online/blended learning environments to help them achieve their intended learning goals (n = 7 studies);
• Student's self-reflection/self-assessment (n = 7 studies);
• Predictive modelling (n = 6 studies);
• Collaboration and interaction (n = 5 studies);
• Student assessment (n = 4 studies);
• Overall user satisfaction (n = 3 studies);
• Student retention (n = 2 studies);
• Personalized learning (n = 2 studies).

Moreover, some of the studies (n = 20 studies in total) reported the impact of learning analytics on subjects' behavior (n = 18 studies) and the impact of learning analytics on learning performance (n = 9 studies). Appendix B lists the impact of learning analytics on subjects' behavior and learning performance as reported by the authors of the selected studies. From the results it can be observed that learning analytics usage generally increased user awareness and user informedness in the learning environment [47], [67], [73], [74]. Next, usage of learning analytics assisted teachers to manage time better [47], [75], [42]; to identify problems in the course design [73], [72]; to follow student behavior and engagement with content over time and apply informed changes to keep up the level of instructional quality [76], [77]; to arrange the monitoring process according to their needs and think a priori about possible solutions [47], [69]; and to utilize real-time tracking and instantly provide corrective feedback [42], [68], [7], [71]. On the other hand, usage of learning analytics helped students to apply diagnostic assessment following their own performance [74], [72], [78], [49]; to initiate and steer conversation with their peers or the instructor [42]; to better navigate and utilize the course content [77]; and to reflect, self-direct their progress, and make informed decisions on how to continue reaching learning goals [7], [78], [71].
However, the selected studies did not directly consider nor clearly report any measurement of learning gains, or any other learning-related constructs. Finally, it was interesting to notice that students perceived learning analytics as "setting reasonable parameters of what is appropriate, what makes sense, and what you are trying to do" [48]. The complete list from the selected studies is presented in Appendix B.

To answer the second research question, the authors looked at what learning analytics were used throughout the studies to inform learning design decisions. As one can notice in Appendix C, most of the analytics used in the studies are extracted analytics, presented back to the learners, usually as visualizations or dashboards. These analytics were used to quickly locate elements in the learning design or in a user's performance that deviated from defined threshold values [49], [77], [79], [80].
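As an illustration of this usage pattern, the sketch below (plain Python; the indicator names and threshold values are hypothetical) flags the indicators that fall below instructor-defined thresholds, which is essentially the computation such dashboards surface visually:

# Hypothetical per-student indicators extracted from platform logs.
students = {
    "s01": {"weekly_logins": 1, "quiz_avg": 0.45, "forum_posts": 0},
    "s02": {"weekly_logins": 6, "quiz_avg": 0.82, "forum_posts": 4},
}

# Instructor-defined minimum thresholds for each indicator.
thresholds = {"weekly_logins": 3, "quiz_avg": 0.60, "forum_posts": 1}

def flag_deviations(indicators, thresholds):
    """Return the indicators that fall below their threshold."""
    return {k: v for k, v in indicators.items()
            if k in thresholds and v < thresholds[k]}

for sid, indicators in students.items():
    flags = flag_deviations(indicators, thresholds)
    if flags:  # a dashboard would typically highlight these entries
        print(sid, "deviates on:", flags)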
On the other hand, few studies reported the use of embedded analytics, which are integrated into the learning environment and can be used for real-time feedback [7], [47], [67], [71]. Furthermore, findings show that learning analytics have mostly been used to explore the physical, digital, and human elements in the learning ecosystem [42], [81]; for post-course reflection and recommendation of resources [67]; for prediction [82], [83], [46], [66]; as a tool [71], [77], [84]; to detect weaknesses and strengths in the learning design [5], [85], [86]; and to define scripts with monitoring information [47]. Another conclusion that can be drawn from the selected studies is the common use of time-related learning analytics to evaluate the design of learning activities and learners' online behavior [87], [71]; of score and frequency learning analytics to evaluate performance, the difficulty of the learning content, and assessment strategies [63], [88]; and of aggregated learning analytics to identify trends and behaviors [5], [89].

The selected studies can also be distinguished by their usage approach to learning analytics. Some of the studies focused on aggregating data from large data sets (usually blended courses or MOOCs) for the purpose of finding patterns within the data that can be applied in different contexts and among various modules [46], [5], [65], [90]. Moreover, these studies also aim to explore how learning design links to students' online behavior [91]. Others focused on actionable analytics in authentic learning settings [43], [42], [48], [49]. Both approaches are highly relevant as they supplement the research work to address the main challenge in learning analytics, i.e., to deliver actionable feedback, derived from theory, the learning context, and the methods in which the learning activities are situated [90], [5].

Moreover, the authors evaluated learning analytics indicators according to five perspectives: individual student, group, course, content, and teacher; and six data sources: student generated data, local data, academic profile, evaluation, course-related performance, and course meta-data. The findings revealed that most of the learning analytics are linked to individual students and their actions, students' interaction with the learning content, or a group's interactions, while learning analytics gathered from educators are less common. In regard to the data sources, the majority of the studies used student generated data and evaluation data, neglecting course meta-data and local data. The complete list of the extracted learning analytics (152 in total) from the selected studies is presented in Appendix D.

Finally, to summarize, learning analytics can support dynamic and data-driven learning design decisions if they are: collected from multiple data sources, modes, or learning settings; embedded in teachers' everyday practice; and a regular part of students' learning processes. Researchers that reported usage of multimodal learning analytics [84], [42], [75], [92], [93] could support the analysis and study of complex, open-ended, hands-on learning experiences in authentic learning settings. This way, researchers have gained more insights into users' needs and expectations as a promising way to support dynamic and real-time learning design activities. Moreover, the use of learning analytics from digital footprints contextualized with qualitative data from users' experiences, framed within the usage process during the learning activities, could also be a promising area that needs to be explored more [7], [47], [78]. A good example is the study by Wise et al. [48], which underlines the importance of setting up the frame for learning analytics use, interpretation, and decision-making as an integral part of students' and teachers' everyday activities tied to goals and expectations. Another example is the study by Rodríguez-Triana et al. [47], which focuses on explicit guidance on how to use, interpret, and reflect on learning analytics findings to adequately refine and re-design learning activities.

V. DISCUSSION AND FUTURE RESEARCH DIRECTIONS

Learning design as a field has produced methodologies, tools, and representations to assist educators in designing learning activities, while learning analytics holds the metrics, analysis, and reporting of data to inform and influence the design process and ensure appropriate refinement. Looking at the publication distribution per year, the interplay between learning analytics and learning design has gained expanding interest in the TEL community for further exploration of their alignment and conditional maturation.

In general, the rationale behind the use of learning analytics for learning design is to discover learning phenomena (e.g., a moment of learning or a misconception) and design improved and pedagogically sound learning environments utilizing technology and resources. Thus, the majority of the studies focused on:
• utilization of learning analytics tools from which analytics were extracted and used to further develop the tools, so as to offer better practical support and informed decision making [71], [77], [76], [94], [74], [93], [73], [80];
• development of frameworks that add theoretical clarity to the learning process, identify analytics metrics, and create guidelines and recommendations that can inform the design of learning activities [43], [87], [75], [48], [69], [49], [81], [64].
A. Interpretation of the results with respect to the first research question

The summary of the selected studies has shown that students and teachers, as users of learning analytics, became more knowledgeable about their learning behaviors and progress. For example, they could anticipate how a lack of information or the emergence of problems could affect their activities and expectations [49], [67], [7], [88]. As a result, increased awareness led to improved and informed decision-making, and potential growth in users' skills and competencies [42], [47]. Next, although the usage of learning analytics could have an impact on users' behavior, there are chances that users utilize learning analytics to monitor their progress but do not necessarily change their behavior [69]. However, learning analytics could also do the opposite and enact unintentional change [7]. Moreover, due to the lack of qualitative studies, researchers might fail to gain awareness of the scale of learning analytics metrics that students find useless or inaccurate [48], [7]. Thus, the research community could benefit from a taxonomy of identified learning analytics that are likely to create an impact on users' behavior (positive or negative) and induce change in situ. Also, the research community needs educators to take part in the process of design and implementation of learning analytics, as this could fill in the gap between what various learning analytics metrics present and what educators actually need [47], [87], [43].

Furthermore, from the analysis of the selected studies (see Appendix A), it can be noted that the use of technology (learning analytics tools, digital learning platforms, sensor-based tools) has increased the range for data collection. Researchers can collect data not just from the digital learning platforms or tools, but also from the physical spaces where learning is happening [42], [31], [93]. However, although data collection from physical spaces is becoming more common, multimodal approaches to analyze the learning experiences in physical spaces are not yet widespread [31]. In addition, the results from this review study support the findings from orchestration research, namely that modelling and supporting teachers' orchestration in technology-rich physical and digital learning environments is of great practical importance for the research community [31].

When it comes to methodology, the results show that quantitative studies still take precedence over mixed-methods and qualitative studies due to the abundance of user activity data from LMSs. Therefore, the most practiced data collection methods were system logs, followed by surveys, while the most practiced data analysis techniques were derived from inferential statistics. However, simple clicking behavior in an LMS is a poor proxy for the actual learning behavior students have [82]. This heavy reliance on digital footprints, often using a single platform as a source of data, focuses only on factors connected to numeric methods and hinders a holistic approach to understanding the learning process as an ecosystem.

In regard to the setting of the learning environment, it can be noted that most of the research is performed in supervised environments (i.e., virtual learning environments/learning management systems). This allows researchers to easily acquire data and answer questions related to quantitative measures of use or differentiation between learning offerings [38]. Hence, researchers use various indicators from system logs to understand (a sketch of computing such indicators follows the list):
• what type of content students use, how often, and how much time they spend interacting with the content [90], [76], [63];
• what types of indicators help teachers to notice behaviors and patterns [65], [72], [71];
• what metrics and frequencies, such as visit duration and number of sessions, are useful for teachers to reflect upon the analytics [5], [83].
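A minimal sketch of how such indicators can be derived from raw event logs (Python with pandas; the event schema and the 30-minute inactivity rule used to split sessions are assumptions made for illustration, not a convention reported by the reviewed studies):

import pandas as pd

# Illustrative clickstream: one row per logged event.
log = pd.DataFrame({
    "student": ["s01", "s01", "s01", "s02", "s02"],
    "content": ["video1", "video1", "quiz1", "video1", "video1"],
    "timestamp": pd.to_datetime([
        "2018-03-01 10:00", "2018-03-01 10:20",
        "2018-03-01 16:00", "2018-03-02 09:00", "2018-03-02 09:05",
    ]),
})

log = log.sort_values(["student", "timestamp"])

# Session count: a gap of more than 30 minutes starts a new session.
gap = log.groupby("student")["timestamp"].diff() > pd.Timedelta("30min")
log["session"] = gap.groupby(log["student"]).cumsum()

# Indicators of the kind listed above: event frequency, number of
# sessions, and how many distinct content items were visited.
indicators = log.groupby("student").agg(
    events=("timestamp", "size"),
    sessions=("session", "nunique"),
    distinct_content=("content", "nunique"),
)
print(indicators)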
However, when it comes to more complex questions, such as "which effects do specific learning offerings have on collaborative learning processes?", researchers need more than just quantitative data [78], [88]. These questions, which are concerned with user satisfaction, preferences, or needs, are only partially answered, due to a shortage of qualitative data. As a result, there is a misalignment between the information generated by learning analytics tools and the needs, problems, and concerns that teachers have regarding learning designs and learning activities [43], [95]. Likewise, students face the same problem as users of analytics, due to:
1) the lack of metrics which are pedagogically valuable [96];
2) failure of thoughtful design to encourage and shape analytics use [48];
3) failure to ground and tie analytics metrics to learners' goals and expectations [7].
These issues place students as passive users of learning analytics, failing to empower them to take responsibility and regulate their own learning and performance. Considering the findings from the selected studies, the authors want to point to:
• the use of a broad set of complementary metrics (e.g., multimodal learning analytics) [42], [84], [75], [92] that will incorporate more aspects that characterize the learners and help researchers learn more about the learning process;
• the importance of grounding learning analytics in theory before introducing and integrating them into the learning environment [2], [94], [97].

Next, throughout the reviewed papers one can observe that there are few learning analytics designs that are grounded in explicit pedagogical models [47], [67], [7]. Many pedagogical models are implicit, or a study may not even focus on any particular model. However, researchers need to explore and document the various pedagogical factors that contribute to student success (e.g., learning gain and engagement during learning) so that subsequent work can have a reference point from past research. For example, Berland et al. [71] reported that using pedagogically defined learning analytics grounded in theory (i.e., the Zone of Proximal Development, specifically for learning Computer Science) provided a strong proof-of-concept that real-time support using personalized and theoretically grounded learning analytics can improve student performance, increase and maintain quality, and engage students to work together on more complex problems.
Consequently, in future work, it is advised to consider the context that critically shapes learning [98], and to investigate if and how data-driven design decisions can be validated. In other words, researchers need to find a way to explicitly label efficient learning design decisions built on particular data-driven analytics that are theoretically justified [97], [2], [58].

B. Interpretation of the results with respect to the second research question

In the last ten years, learning has become more blended and distributed across different learning environments and contexts. Hence, turning to holistic approaches and considering how learning takes place today becomes a necessity. Along these lines, the integration, aggregation, and harmonization of learning-related data across multiple sources and spaces has the potential to offer rich evidence-driven design that could amplify humans' learning capacities [42], [75]. Consequently, if learning analytics integration is neglected, future learning design will be guided by poor insights drawn from limited learning activities.

Learning analytics collected from digital learning spaces are often complemented with data coming from student management systems [73], [79] or from self-reported variables gathered with surveys [88], [99], [5]. On the one hand, dynamic and data-driven learning design decisions would then not be guided solely by the digital footprints of learning systems and numeric methods [88], but would incorporate more aspects and metrics that holistically characterize learners, their needs, and their expectations. On the other hand, utilizing analytics coming from a single and often limited learning environment has little added value when rich and more representative data-sets are available [82]. Consequently, the combination of learning analytics coming from several digital and sometimes even physical footprints (e.g., learning environments, self-reported information, or the use of sensor-based tools) could improve the interpretation of the observed learning behavior and the patterns noticed within the learning environment. This way, educators could properly scaffold the design process through informed decisions utilizing awareness and reflection.
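As a sketch of what such triangulation could look like in practice (Python with pandas; the variables, the join key, and the flagging rule are hypothetical), log-derived indicators can be merged with self-reported measures into a single learner profile before interpretation:

import pandas as pd

# Indicators derived from digital footprints (e.g., an LMS).
logs = pd.DataFrame({
    "student": ["s01", "s02", "s03"],
    "minutes_online": [120, 640, 310],
    "resources_viewed": [5, 22, 14],
})

# Self-reported data from a survey instrument (e.g., Likert scales).
survey = pd.DataFrame({
    "student": ["s01", "s02", "s03"],
    "self_efficacy": [2.5, 4.2, 3.8],
    "perceived_workload": [4.5, 2.0, 3.1],
})

# One profile per learner: behavior plus self-reported context.
profile = logs.merge(survey, on="student", how="left")

# Low activity together with low self-efficacy reads differently
# from low activity with high self-efficacy; flag the former.
at_risk = profile[(profile["minutes_online"] < 200)
                  & (profile["self_efficacy"] < 3.0)]
print(at_risk)

The point of the sketch is only the combination step: interpretation of the same digital footprint changes once the self-reported context is attached.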
When it comes to learning design activities (see Appendix C), most of the studies included in the review have used assimilative, assessment, communication, finding and handling of information, and productive activities, as defined in the learning design taxonomy [34]. This is also supported by the research studies of Rienties and Nguyen on the impact of learning design in university settings [5], [46], [86], [85]. They reported that learning design has a strong influence on learners' satisfaction [5]. In particular, communication and interactive activities engage students to spend more time in VLEs compared to productive and experiential activities [86]. Although the design of learning activities depends on the module [85], educators design learning activities differently over the timeline of the course and reduce the variety of learning design activities when they introduce assessment activities [46].

Although most of the studies follow the traditional paradigm in which the teacher is the main end-user of learning analytics, more and more studies are reporting results utilizing visualized analytics to increase awareness among students for self-monitoring and self-reflection [6], [7], [72]. The results presented in Appendix C show an increase in the usage of learning analytics metrics that are supposed to reinforce students' self-reflection. The main idea behind self-regulated learning derives from "context-specific processes that are selectively used by students to succeed in school" [100], at a time when numerous educational institutions have moved to "technology-rich" environments in which learning and teaching expand beyond the walls of university lecture settings. However, there is limited research on how students accept, interpret, and use learning analytics to follow and improve their own performance [7], [6], [42].

Moreover, from Appendix D one can observe that only a few analytics indicators have been explicitly devised to collect and present educators' data (i.e., data gathered from teachers that allows teachers to reflect upon their teaching practices and course design) [47], [67], [63], [75], [69], [90], [42]. This shows that there is some educators' data stored in databases, mostly data from teachers' interactions with content or students, or log data from teachers' use of dashboards. However, the selected studies failed to report course meta-data, such as data regarding the course structure, resources, or teaching concepts. Consequently, the lack of teachers' data and course meta-data limits educators' opportunities to reflect on their teaching activities, pedagogical practices, and the quality of the learning content and the interactions, which might lead to improvements in their professional development and dissemination of their hands-on experiences [101].

What is really interesting to be further applied at the intersection of learning analytics and learning design is finding meaningful behavior patterns and interactions that can be "mapped back to the planned learning activities to explain why certain peaks and falls occur over time" [102], [83], [42]. For this, information regarding the course meta-data, the intended teaching concepts, and a feedback loop will be necessary. As Reimann has noted, "more is needed than just data to 'discover' meaningful relations" [2]. Thus, the research community could join efforts to develop a conceptual framework that could model the complexities of the learning process towards comprehensible analytics and visualization requirements to transform learning design into a teacher-led enquiry-based practice [43], [33]. Furthermore, what is often overlooked and underestimated but immensely important to educators is the need for explicit guidance on how to use, interpret, and reflect on the learning analytics findings to adequately refine and re-design learning activities [47], [67], [101]. A direction towards closing this gap is to consider establishing a participatory culture of design, and a habit among educators to see learning design as an inquiry process and learning analytics as a part of the teaching culture [28], [33].

Finally, not all of the studies reported that they seek to visualize the impact of learning design activities for the learners and educators from whom they collected the data [90], [102], [63]. At present, the authors agree with Clow's argument that the learning analytics cycle can be completed even if interventions in the learning design do not reach the learners from whom the data was originally generated, as long as it is used to apply improvements for the next cohort [58].
However, for future work, researchers could try to close the cycle with the same group of learners, for reasons of authenticity and context. Returning the data gathered from authentic settings to the students or teachers from whom it was collected could assist in getting practical comments for further improvements to local decisions in which the end-users were involved [47], [71], as well as increase the awareness among the users of the applicability of learning analytics.

C. Theoretical and practical implications

One of the biggest challenges for researchers and practitioners will be to create a strong relation between learning analytics and learning theories to empower reflective practices [10], [1], [103].

Theory validates associations and patterns between digital traces and learning outcomes (i.e., learning-related constructs) that can trigger a learning process to reach specific goals [2]. Applying the consolidated model of theory, design, and data science proposed by [1] could bring invaluable insights to researchers. This means that researchers will know what data to collect in order to understand whether certain learning processes are activated, and what learning outcomes are associated with what design decisions [47]. Failing to consider context could lead to misinterpretations of findings and limit the replication of designs in various learning settings [15], [98].

As can be observed from the findings, much of the work in learning analytics has been related to the development of visualizations [47], [72], [71], [75], [76], [42], [49], [74], [87]. However, there is limited empirical evidence that visually presenting analyzed data could promote desirable learning practices and increase understanding in learning analytics interpretation [94]. Wise et al. [7] conducted a study grounded in computer-supported collaborative learning (CSCL) theory, and proposed embedded and extracted learning analytics to generate visualizations for the purpose of self-directed and self-regulated learning. This work is particularly significant as it aims to establish a feedback mechanism between students and teachers by fostering dialog where learning analytics is the conversation starter. In addition, Lockyer et al. [4] highlighted that learning design should consider the inclusion of learning analytics within a specific pedagogical design as a means to encourage learning. Both studies underline the need for theoretical integration of learning analytics and learning design, which could be seen as a promising start to connect theory, design, and data to inform the research and practice of learning analytics [1]. Also, this review study can serve as a springboard for furthering the alignment between learning analytics and learning design.

Another theoretical implication this study presents is the development of an intermediate-level body of knowledge in learning analytics for learning design. Intermediate-level knowledge (some researchers also refer to it as strong concepts) includes "solution-oriented pieces of generative knowledge, residing on a level of abstraction between instances and theories" [104]. In the systematic literature review, we identified that the majority of the selected studies apply design-based research to inform practice and advance theory by iterations, as suggested by [2]. Thus, applying a concept-driven approach to design-based research could potentially lead to the construction of generative design knowledge [104].

Thinking in this direction, developing and applying strong concepts requires skills for a thorough understanding of particular design scenarios and situations that goes beyond just mapping novel experiences from single studies. This should encourage researchers to engage in longitudinal studies that will not only change the way we think about studies in educational research (e.g., from short-term interventions to continuous monitoring) [47], but also address whether and when certain scaffolds can gradually be removed to avoid cognitive load [105], how to design learning activities differently over time [86], [46], and expertise reversal effects [106]. Theories and principles are, by definition, formulated on a high level of abstraction, so that they can be applied to many different situations (generalization), which are then presented as instantiations of the abstract notions. To elicit principles and pave the way towards a unified theory of learning analytics for learning design, researchers need to triangulate research findings across the different case studies and meta-analyze the empirical knowledge. This will allow researchers to move from instances/case studies, to intermediate-level knowledge, and then to theory construction.

Finally, the authors would like to present a conceptual model towards a learning analytics for learning design (LA4LD) taxonomy (see Figure 6). This taxonomy should derive its classification from existing research and from the review study. Thus, on one hand, the proposed conceptual model incorporates the already existing learning design taxonomy proposed by [34], which identifies seven broad types of learning activities to guide strategy and creativity in the design process. On the other hand, the authors employed Campbell and Oblinger's [107] five-step model of learning analytics: capture, report, predict, act, and refine, and tried to map the findings from the review study in accordance with the model.

Fig. 6. Learning analytics for learning design taxonomy

As described in the methodology section, subsection III.F, the authors proposed second-level branches in the "taxonomy tree" that correspond to the five steps of learning analytics [107]. In particular, capture corresponds to the data collection; report corresponds to the techniques used to provide feedback using learning analytics; predict corresponds to the purpose of using learning analytics for the prediction of grades or failures; act corresponds to the applied actions; and refine corresponds to interventions and the re-design of learning scenarios, tools, etc. This is a proposition on how the findings from the selected studies can be used to derive a classification and establish selection criteria for extracting pedagogically valuable learning analytics metrics from specific learning design activities. Furthermore, the LA4LD taxonomy could summarize the objectives of the selected papers, as shown in the tree trunk (see Figure 6). These objectives are derived from the synergy between learning analytics and learning design, as reported in the selected studies. The proposed LA4LD taxonomy offers a springboard for further work, so other conceptual knowledge development endeavors can utilize it. Thus, the proposed LA4LD taxonomy can be seen as an essential step towards future empirical research, and a support tool for researchers in establishing a connection between learning analytics for learning design and theory.
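A schematic rendering of this mapping (Python; the concrete entries are invented placeholders that only illustrate the structure, not authoritative content of the taxonomy) could associate each learning design activity type from [34] with the five branches from [107]:

from dataclasses import dataclass, field

@dataclass
class LA4LDBranch:
    """Five-step learning analytics branches for one activity type."""
    activity: str                                  # activity type, after [34]
    capture: list = field(default_factory=list)    # data collection
    report: list = field(default_factory=list)     # feedback techniques
    predict: list = field(default_factory=list)    # prediction targets
    act: list = field(default_factory=list)        # applied actions
    refine: list = field(default_factory=list)     # re-design interventions

# Illustrative instance for a communication activity.
communication = LA4LDBranch(
    activity="communication",
    capture=["forum logs", "chat transcripts"],
    report=["participation dashboard"],
    predict=["disengagement risk"],
    act=["prompt inactive students"],
    refine=["re-design discussion prompts for the next cohort"],
)
print(communication)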
D. Future work

Based on the reviewed papers, the authors want to offer the following checklist for future work on learning analytics for learning design:
• provide details about the learning environment and the used pedagogical approaches, where improvements in learning design experiences based on learning analytics outcomes will be measured [47];
• indicate how learning analytics metrics offer insights into the learning process and can be theoretically grounded for meaningful interpretation to inform theory and design [6];
• evaluate and compare what learning design patterns and learning phenomena make learning effective [5];
• evaluate and denote student learning outcomes, or any other learning-related constructs [71];
• evaluate and denote the impact of learning analytics outcomes on learning design decisions and experiences [72];
• evaluate and denote how educators are planning, designing, implementing, and evaluating learning design decisions [101];
• provide common guidance on how to use, interpret, and reflect on the learning analytics to adequately refine and redesign learning activities [7].

E. Limitations

The main limitations of this review can be seen as:
• bias in the selection of databases, journals, and publications due to possible subjectivity and lack of relevant information;
• bias in the search string because keywords are discipline- and language-specific;
• a limited ability to draw more general conclusions since the focus was on empirical research;
• bias and inaccuracy in data extraction as it was performed only by the two authors;
• bias from interpretation of some findings, methods, or approaches, as some parts of the methodology in the selected studies were not described accurately.
However, the authors attempted to ensure an unbiased review process by developing a research protocol in advance with already defined research questions. The search string was developed using the research questions and considering a possible lack of standardization in keywords, as they can be discipline- and language-specific. Furthermore, the authors performed a search of the state-of-the-art in TEL, in terms of journals, databases, and previous review studies. Finally, the selected methodology (i.e., Systematic Literature Review) is an in-depth study of a relatively narrow area using specific and pointed research questions, which entails certain limitations. Other approaches, like a Systematic Mapping Study [108], might not go deep into an area, but create a map of a wide research field. Despite the limitations of the selected methodology, it is a well-accepted and widely used literature review method [9], [18] in TEL, providing a certain assurance of the results.

VI. CONCLUSION

The current review demonstrates the present landscape at the intersection between learning analytics and learning design. Learning analytics and learning design are two complementary fields within educational technology that together hold the promise to optimize the learning process and contribute to the creation of more meaningful learning experiences, tools, and evidence-based practices. The authors analyzed 43 peer-reviewed articles selected from the literature within the period 2010-2017. This review aimed to explore what learning analytics have been used to inform learning design decisions, and what were the main design approaches researchers embraced over the last seven years. Furthermore, the authors explored the convergence of learning analytics and learning design within the reported learning settings, the pedagogical contexts, the data collection methods, the data analysis techniques, and the reported research objectives.

The review has shown that future research should consider developing a framework on how to capture and systematize learning design data, and follow what learning design choices made by educators influence subsequent learning activities and performances over time. Furthermore, it is of utmost importance to theoretically ground the learning analytics because: 1) the choice of methods and analysis used in the studies should be driven by the theory and practice of learning analytics; 2) the findings from the studies should be used to inform theory and design. Addressing these elements could help in the further maturation of the fields of learning analytics and learning design, and provide a foundation for longitudinal and comparative studies among various educational contexts. Furthermore, educators and researchers need to leverage the use of learning analytics and focus on developing students' skills and natural predispositions by designing personalized learning and feedback, while decreasing assimilative activities such as traditional lecturing, reading, or watching videos. Future learning needs to be directed towards personalizing learners' experiences and adapting them to their strengths, interests, and aspirations. Also, educators need to re-think their role of simply being providers of knowledge, towards being designers and facilitators of learning.

As a summary, the authors would like to highlight the main idea of aligning learning analytics with learning design as an essential condition to create more meaningful tools, methods, and representations of data for educators and learners. Thus, this alignment would lead to improved and informed learning decisions, and towards the development of design principles and knowledge between data representation and data-driven actions.

APPENDIX

• Appendix A. Results from the coding process
• Appendix B. Learning analytics impact on subjects' behavior and learning performance
• Appendix C. Learning analytics for learning design
• Appendix D. Learning analytics indicators categorized according to perspective and data source

ACKNOWLEDGMENT

This work was supported by the Research Council of Norway under the project FUTURE LEARNING (255129/H20). In addition, the authors are extremely grateful to the associate editor and the reviewers for their constructive comments and useful insights, which significantly improved the paper.

REFERENCES

[1] D. Gašević, V. Kovanović, and S. Joksimović, "Piecing the learning analytics puzzle: a consolidated model of a field of research and practice," Learning: Research and Practice, vol. 3, no. 1, pp. 63–78, 2017.
[2] P. Reimann, "Connecting learning analytics with learning research: the role of design-based research," Learning: Research and Practice, vol. 2, no. 2, pp. 130–142, 2016.
[3] Y. Mor and B. Craft, "Learning design: reflections upon the current landscape," Research in Learning Technology, vol. 20, no. sup1, p. 19196, 2012.
[4] L. Lockyer, E. Heathcote, and S. Dawson, "Informing pedagogical action: Aligning learning analytics with learning design," American Behavioral Scientist, vol. 57, no. 10, pp. 1439–1459, 2013.
[5] B. Rienties and L. Toetenel, "The impact of learning design on student behaviour, satisfaction and performance: A cross-institutional comparison across 151 modules," Computers in Human Behavior, vol. 60, pp. 333–341, 2016.
[6] D. Gašević, N. Mirriahi, S. Dawson, and S. Joksimović, "Effects of instructional conditions and experience on the adoption of a learning tool," Computers in Human Behavior, vol. 67, pp. 207–220, 2017.
[7] A. Wise, Y. Zhao, and S. Hausknecht, "Learning analytics for online discussions: Embedded and extracted approaches," Journal of Learning Analytics, vol. 1, no. 2, pp. 48–71, 2014.
[8] G. Siemens, "Learning analytics: The emergence of a discipline," American Behavioral Scientist, vol. 57, no. 10, pp. 1380–1400, 2013.
[9] Z. Papamitsiou and A. A. Economides, "Learning analytics and educational data mining in practice: A systematic literature review of empirical evidence," Journal of Educational Technology & Society, vol. 17, no. 4, p. 49, 2014.
[10] R. Ferguson, "Learning analytics: drivers, developments and challenges," International Journal of Technology Enhanced Learning, vol. 4, no. 5-6, pp. 304–317, 2012.
[11] G. Siemens and P. Long, "Penetrating the fog: Analytics in learning and education," EDUCAUSE Review, vol. 46, no. 5, p. 30, 2011.
[12] G. Siemens and D. Gašević, "Guest editorial - learning and knowledge analytics," Educational Technology & Society, vol. 15, no. 3, pp. 1–2, 2012.
[13] P. Long, G. Siemens, G. Conole, and D. Gašević, in Proceedings of the 1st International Conference on Learning Analytics and Knowledge (LAK'11). ACM, 2011.
[14] K. E. Arnold and M. D. Pistilli, "Course signals at Purdue: Using learning analytics to increase student success," in Proceedings of the 2nd International Conference on Learning Analytics and Knowledge. ACM, 2012, pp. 267–270.
[15] D. Gašević, S. Dawson, T. Rogers, and D. Gasevic, "Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success," The Internet and Higher Education, vol. 28, pp. 68–84, 2016.
[16] G. D. Magoulas, Y. Papanikolaou, and M. Grigoriadou, "Adaptive web-based learning: accommodating individual differences through system's adaptation," British Journal of Educational Technology, vol. 34, no. 4, pp. 511–527, 2003.
[17] A. Klašnja-Milićević, B. Vesin, M. Ivanović, Z. Budimac, and L. C. Jain, E-Learning Systems: Intelligent Techniques for Personalization. Springer, 2016, vol. 112.
[18] Z. Papamitsiou and A. A. Economides, "Learning analytics for smart learning environments: a meta-analysis of empirical research results from 2009 to 2015," Learning, Design, and Technology: An International Compendium of Theory, Research, Practice, and Policy, pp. 1–23, 2016.
[19] M. Van Harmelen and D. Workman, "Analytics for learning and teaching," CETIS Analytics Series, vol. 1, no. 3, pp. 1–40, 2012.
[20] K. L. Gustafson and R. M. Branch, Survey of Instructional Development Models. ERIC, 1997.
[21] R. A. Reiser, "A history of instructional design and technology: Part ii: A history of instructional design," Educational Technology Research and Development, vol. 49, no. 2, pp. 57–67, 2001.
[22] P. M. Ghislandi and J. E. Raffaghelli, "Forward-oriented designing for learning as a means to achieve educational quality," British Journal of Educational Technology, vol. 46, no. 2, pp. 280–299, 2015.
[23] D. Laurillard, "Technology enhanced learning as a tool for pedagogical innovation," Journal of Philosophy of Education, vol. 42, no. 3-4, pp. 521–533, 2008.
[24] M. Maina, C. Brock, and M. Yishay, The Art & Science of Learning Design. Springer, 2015, no. 9.
[25] G. Conole, Designing for Learning in an Open World. Springer Science & Business Media, 2012, vol. 4.
[26] E. Dobozy, "Typologies of learning design and the introduction of a 'ld-type 2' case example," eLearning Papers–special edition, 2012.
[27] R. Koper, "Current research in learning design," Educational Technology & Society, vol. 9, no. 1, pp. 13–22, 2006.
[28] D. Persico and F. Pozzi, "Informing learning design with learning analytics to improve teacher inquiry," British Journal of Educational Technology, vol. 46, no. 2, pp. 230–248, 2015.
[29] L. P. Prieto, Y. Dimitriadis, C.-K. Looi et al., "Orchestration in learning technology research: evaluation of a conceptual framework," 2015.
[30] P. Dillenbourg, "Design for classroom orchestration," Computers & Education, vol. 69, pp. 485–492, 2013.
[31] L. Prieto, K. Sharma, Ł. Kidzinski, M. Rodríguez-Triana, and P. Dillenbourg, "Multimodal teaching analytics: Automated extraction of orchestration graphs from wearable sensor data," Journal of Computer Assisted Learning, vol. 34, no. 2, pp. 193–203, 2018.
[32] L. P. Prieto Santos et al., "Supporting orchestration of blended cscl scenarios in distributed learning environments," 2012.
[33] L. Prieto, M. Rodríguez-Triana, R. Martínez-Maldonado, Y. Dimitriadis, and D. Gašević. (2018) Orchestrating learning analytics (orla): supporting the adoption of learning analytics at the practitioner level. [Online]. Available: osf.io/y2p7j
[34] B. Rienties, Q. Nguyen, W. Holmes, and K. Reedy, "A review of ten years of implementation and research in aligning learning design with learning analytics at the Open University UK," Interaction Design and Architecture(s), vol. 33, pp. 134–154, 2017.
[35] K. Verbert, N. Manouselis, H. Drachsler, and E. Duval, "Dataset-driven research to support learning and knowledge analytics," 2012.
[36] D. Gašević, S. Dawson, and G. Siemens, "Let's not forget: Learning analytics are about learning," TechTrends, vol. 59, no. 1, pp. 64–71, 2015.
[37] P. Goodyear and Y. Dimitriadis, "In medias res: reframing design for learning," Research in Learning Technology, vol. 21, no. 1, p. 19909, 2013.
[38] A. L. Dyckhoff, "Implications for learning analytics tools: A meta-analysis of applied research questions," International Journal of Computer Information Systems and Industrial Management Applications, vol. 3, no. 1, pp. 594–601, 2011.
[39] S. Joksimović, A. Manataki, D. Gašević, S. Dawson, V. Kovanović, and I. F. De Kereki, "Translating network position into performance: importance of centrality in different network configurations," in Proceedings of the Sixth International Conference on Learning Analytics & Knowledge. ACM, 2016, pp. 314–323.
[40] M. A. Chatti, A. L. Dyckhoff, U. Schroeder, and H. Thüs, "A reference model for learning analytics," International Journal of Technology Enhanced Learning, vol. 4, no. 5-6, pp. 318–331, 2012.
[41] W. Greller and H. Drachsler, "Translating learning into numbers: A generic framework for learning analytics," Journal of Educational Technology & Society, vol. 15, no. 3, p. 42, 2012.
[42] R. Martinez-Maldonado, B. Schneider, S. Charleer, S. B. Shum, J. Klerkx, and E. Duval, "Interactive surfaces and learning analytics: data, orchestration aspects, pedagogical uses and challenges," in Proceedings of the Sixth International Conference on Learning Analytics & Knowledge. ACM, 2016, pp. 124–133.
[43] A. Bakharia, L. Corrin, P. de Barba, G. Kennedy, D. Gašević, R. Mulder, D. Williams, S. Dawson, and L. Lockyer, "A conceptual framework linking learning design with learning analytics," in Proceedings of the Sixth International Conference on Learning Analytics & Knowledge. ACM, 2016, pp. 329–338.
[44] J. Dunlosky, K. A. Rawson, E. J. Marsh, M. J. Nathan, and D. T. Willingham, "Improving students' learning with effective learning techniques: Promising directions from cognitive and educational psychology," Psychological Science in the Public Interest, vol. 14, no. 1, pp. 4–58, 2013.
[45] Y. Mor, R. Ferguson, and B. Wasson, "Learning design, teacher inquiry into student learning and learning analytics: A call for action," British Journal of Educational Technology, vol. 46, no. 2, pp. 221–229, 2015.
[46] Q. Nguyen, B. Rienties, L. Toetenel, R. Ferguson, and D. Whitelock, "Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates," Computers in Human Behavior, vol. 76, pp. 703–714, 2017.
[47] M. J. Rodríguez-Triana, A. Martínez-Monés, J. I. Asensio-Pérez, and Y. Dimitriadis, "Scripting and monitoring meet each other: Aligning learning analytics and learning design to support teachers in orchestrating cscl situations," British Journal of Educational Technology, vol. 46, no. 2, pp. 330–343, 2015.
[48] A. F. Wise, J. M. Vytasek, S. Hausknecht, and Y. Zhao, "Developing learning analytics design knowledge in the "middle space": The student tuning model and align design framework for learning analytics use," Online Learning, vol. 20, no. 2, 2016.
[49] E. Koh, A. Shibani, J. P.-L. Tan, and H. Hong, "A pedagogical framework for learning analytics in collaborative inquiry tasks: an example from a teamwork competency awareness program," in Proceedings of the Sixth International Conference on Learning Analytics & Knowledge. ACM, 2016, pp. 74–83.
[50] L. Lockyer and S. Dawson, "Where learning analytics meets learning design," in Proceedings of the 2nd International Conference on Learning Analytics and Knowledge. ACM, 2012, pp. 14–15.
[51] S. Keele et al., "Guidelines for performing systematic literature reviews in software engineering," in Technical report, Ver. 2.3 EBSE Technical Report. EBSE. sn, 2007.
[52] T. Dybå and T. Dingsøyr, "Empirical studies of agile software development: A systematic review," Information and Software Technology, vol. 50, no. 9-10, pp. 833–859, 2008.
[53] T. Greenhalgh, How to Read a Paper. BMJ Publishing Group London, 2001, vol. 2.
[54] B. A. Kitchenham, S. L. Pfleeger, L. M. Pickard, P. W. Jones, D. C. Hoaglin, K. El Emam, and J. Rosenberg, "Preliminary guidelines for empirical research in software engineering," IEEE Transactions on Software Engineering, vol. 28, no. 8, pp. 721–734, 2002.
[55] L. P. Prieto, M. Holenko Dlab, I. Gutiérrez, M. Abdulwahed, and W. Balid, "Orchestrating technology enhanced learning: a literature review and a conceptual framework," International Journal of Technology Enhanced Learning, vol. 3, no. 6, pp. 583–598, 2011.
[56] A. Bhattacherjee, Social Science Research: Principles, Methods, and Practices. Global Text Project, 2012.
[57] J. P. Campbell, P. B. DeBlois, and D. G. Oblinger, "Academic analytics: A new tool for a new era," EDUCAUSE Review, vol. 42, no. 4, p. 40, 2007.
[58] D. Clow, "The learning analytics cycle: closing the loop effectively," in Proceedings of the 2nd International Conference on Learning Analytics and Knowledge. ACM, 2012, pp. 134–138.
[59] O. D. Space. Glossary of teaching approaches. [Online]. Available: http://portal.opendiscoveryspace.eu/tr-activity/22-macibam-pieeja-vardnica-669764
[60] A. L. Dyckhoff, V. Lukarov, A. Muslim, M. A. Chatti, and U. Schroeder, "Supporting action research with learning analytics," in Proceedings of the Third International Conference on Learning Analytics and Knowledge. ACM, 2013, pp. 220–229.
[61] R. E. Clark, "Constructing a taxonomy of media attributes for research purposes," AV Communication Review, vol. 23, no. 2, pp. 197–215, 1975.
[62] D. Gaševic, D. Djuric, and V. Devedžic, Model Driven Architecture and Ontology Development. Springer Science & Business Media, 2006.
[63] J. Ma, X. Han, J. Yang, and J. Cheng, "Examining the necessary condition for engagement in an online learning environment based on learning analytics approach: The role of the instructor," The Internet and Higher Education, vol. 24, pp. 26–34, 2015.
[64] C.-U. Lei, X. Hou, T. T. Kwok, T. S. Chan, J. Lee, E. Oh, D. Gonda, Y.-C. A. Yeung, and C. Lai, "Advancing MOOC and SPOC development via a learner decision journey analytic framework," in Teaching, Assessment, and Learning for Engineering (TALE), 2015 IEEE International Conference on. IEEE, 2015, pp. 149–156.
[65] J. A. Ruipérez-Valiente, P. J. Muñoz-Merino, D. Leony, and C. D. Kloos, "Alas-ka: A learning analytics extension for better understanding the learning process in the Khan Academy platform," Computers in Human Behavior, vol. 47, pp. 139–148, 2015.
[66] B. K. Pursel, L. Zhang, K. W. Jablokow, G. Choi, and D. Velegol, "Understanding MOOC students: motivations and behaviours indicative of MOOC completion," Journal of Computer Assisted Learning, vol. 32, no. 3, pp. 202–217, 2016.
[67] P. A. Haya, O. Daems, N. Malzahn, J. Castellanos, and H. U. Hoppe, "Analysing content and patterns of interaction for improving the learning design of networked learning environments," British Journal of Educational Technology, vol. 46, no. 2, pp. 300–316, 2015.
[68] Á. Serrano-Laguna, J. Torrente, P. Moreno-Ger, and B. Fernández-Manjón, "Application of learning analytics in educational videogames," Entertainment Computing, vol. 5, no. 4, pp. 313–322, 2014.
[69] J. Castellanos, P. Haya, and J. Urquiza-Fuentes, "A novel group engagement score for virtual learning environments," IEEE Transactions on Learning Technologies, 2016.
[70] R. Baker, J. Clarke-Midura, and J. Ocumpaugh, "Towards general models of effective science inquiry in virtual performance assessments," Journal of Computer Assisted Learning, vol. 32, no. 3, pp. 267–280, 2016.
[71] M. Berland, D. Davis, and C. P. Smith, "Amoeba: Designing for collaboration in computer science classrooms through live learning analytics," International Journal of Computer-Supported Collaborative Learning, vol. 10, no. 4, pp. 425–447, 2015.
[72] J. Melero, D. Hernández-Leo, J. Sun, P. Santos, and J. Blat, "How was the activity? a visualization support for a case of location-based learning design," British Journal of Educational Technology, vol. 46, no. 2, pp. 317–329, 2015.
[73] B. Florian-Gaviria, C. Glahn, and R. F. Gesa, "A software suite for efficient use of the european qualifications framework in online and blended courses," IEEE Transactions on Learning Technologies, vol. 6, no. 3, pp. 283–296, 2013.
[74] K. Kitto, M. Lupton, K. Davis, and Z. Waters, "Designing for student-facing learning analytics," Australasian Journal of Educational Technology, vol. 33, no. 5, pp. 152–168, 2017.
[75] R. Martinez-Maldonado, K. Yacef, and J. Kay, "Tscl: A conceptual model to inform understanding of collaborative learning processes at interactive tabletops," International Journal of Human-Computer Studies, vol. 83, pp. 62–82, 2015.
[76] A. Tervakari, K. Kuosa, J. Koro, J. Paukkeri, and M. Kailanto, "Teachers' learning analytics tools in a social media enhanced learning environment," in Interactive Collaborative Learning (ICL), 2014 International Conference on. IEEE, 2014, pp. 355–360.
[77] M. A. Chatti, M. Marinov, O. Sabov, R. Laksono, Z. Sofyan, A. M. F. Yousef, and U. Schroeder, "Video annotation and analytics in coursemapper," Smart Learning Environments, vol. 3, no. 1, pp. 1–21, 2016.
[78] A. Pardo, R. A. Ellis, and R. A. Calvo, "Combining observational and experiential data to inform the redesign of learning activities," in Proceedings of the Fifth International Conference on Learning Analytics And Knowledge. ACM, 2015, pp. 305–309.
[79] N. Li, V. Marsh, and B. Rienties, "Modelling and managing learner satisfaction: Use of learner feedback to enhance blended and online learning experience," Decision Sciences Journal of Innovative Education, vol. 14, no. 2, pp. 216–242, 2016.
[80] F. Martin and A. Ndoye, "Using learning analytics to assess student learning in online courses," Journal of University Teaching & Learning Practice, vol. 13, no. 3, p. 7, 2016.
[81] K. Thompson, D. Ashe, L. Carvalho, P. Goodyear, N. Kelly, and M. Parisio, "Processing and visualizing data in complex learning environments," American Behavioral Scientist, vol. 57, no. 10, pp. 1401–1420, 2013.
Katerina Mangaroska is a PhD student at the Department of Computer Science at the Norwegian University of Science and Technology. Her primary research area is learning analytics. Currently, she is engaged in the “Future Learning: Orchestrating 21st Century Learning Ecosystem using Analytics” project, which aims to develop new knowledge about how analytics allow us to better orchestrate different learning tools and practices and design optimal learning environments. She is also working with eye-tracking to gain insights into user behavior in programming environments (e.g., how a user processes information when learning to program or interacts with visual information during coding). Her other research interests center around learning design, intelligent tutoring systems, learning infrastructure for computing education, and human-computer interaction. Mangaroska is a Fulbright scholar.

Michail Giannakos is an associate professor of interaction design and learning technology at the Department of Computer Science, Norwegian University of Science and Technology, and research director of the Centre for Excellent IT Education. His research interests center on making sense of user experiences and practices in order to redesign and optimize educational settings and systems. Giannakos is a member of the executive board of the IEEE Technical Committee on Learning Technology (IEEE TCLT). He has worked on several research projects funded by diverse sources, including the EC, Microsoft Research, the Norwegian Research Council (NRC), NSF, DAAD, and the Cheng Endowment. Giannakos is also a recipient of a Marie Curie fellowship and the Norwegian CAREER award, and he is one of the twenty outstanding academic fellows of NTNU.