Learning Analytics: A Study Guide for
Teachers in Training
What is Learning Analytics?
Learning Analytics (LA) refers to the use of data analysis techniques to gain insights into
teaching and learning processes. In simple terms, it involves collecting and examining
data about learners and their activities in order to understand and improve education.
A widely cited definition describes LA as “the measurement, collection, analysis and
reporting of data about learners and their contexts, for purposes of understanding and
optimising learning and the environments in which it occurs.” In practice, this means
taking the digital traces students leave (e.g. in a Learning Management System) and
turning them into actionable information that can enhance learning outcomes.
Purpose: The ultimate goal of learning analytics is to improve learning and teaching. By
identifying patterns in educational data, educators can make evidence-based decisions
– for example, spotting which students need help, which teaching strategies are working,
or what resources are underused. Learning Analytics builds on long-standing practices
of using evidence to improve education, but exploits new opportunities offered by big
data and computational analysis. In essence, it bridges the gap between raw data and
pedagogical insight, helping teachers optimize instruction and helping learners better
understand their own learning.
Why Learning Analytics Matters
Learning analytics provides tangible benefits to both teachers and students by
supporting a more informed, responsive educational experience:
Better understanding of learning behaviors: Both students and teachers gain
insights into how learning actually happens. For example, students can see
patterns in their study habits, and teachers can identify which concepts students
struggle with. This leads to greater self-awareness for learners and informed strategy
adjustments for instructors.
Personalized assistance: Analytics can enable personalized learning, where
support and content are tailored to individual needs. Teachers can use data to
group students by skill level, recommend resources, or adjust pacing so that each
learner gets the right level of challenge and support.
Timely feedback and intervention: Because many analytics tools work in real-
time, teachers can receive early alerts when a student is falling behind or
disengaging. This allows for quick feedback or intervention – for example, reaching
out to a student who hasn’t logged in or is scoring low on quizzes, before the
problem escalates. Students, in turn, get faster feedback on their performance,
which research shows is critical for learning.
Independent, self-regulated learning: Data can empower students to take
charge of their own learning. Analytics dashboards can show students their
progress, compare it with peers or goals, and thus motivate self-regulation.
Students appreciate information that helps them track their own progress and
adjust their study strategies. In other words, analytics can encourage a growth
mindset by making learning progress visible.
Evidence-based improvement: For teachers and schools, learning analytics
supports an evidence-based culture. Instead of relying purely on intuition,
educators can use concrete evidence of what’s working or not. This leads to more
effective teaching strategies and better outcomes for students. Over time, using
data can help in refining curricula, allocating resources where they’re needed
most, and generally improving the quality of education.
In summary, learning analytics matters because it enhances visibility into the learning
process. It equips teachers in training with data-informed insights to better understand
student needs, personalize instruction, and continuously improve their teaching
practice. Students benefit through more personalized support and a clearer sense of
their own learning trajectory.
Key Features of Learning Analytics (Scale, Speed, Sources, Authenticity)
Certain distinguishing features make learning analytics particularly powerful in the digital
age:
Scale: Learning analytics deals with large-scale data (“big data”) collected from
many learners over time. Modern educational technology can capture everything
from granular click-stream data to course-wide performance. This means insights
can span micro-level details (individual actions) up to macro-level trends across
an entire class, school, or district. Such scale was not possible with traditional
assessments alone.
Speed: Unlike end-of-term evaluations, many analytics systems provide real-
time or on-demand analysis. Teachers can get immediate evidence of student
performance (e.g. a dashboard updating with each quiz attempt), enabling timely
feedback and adjustments. The speed of LA allows teachers and learners to
“course-correct” during the learning process, not after it’s over.
Sources: Learning analytics can draw from multiple data sources. While
Learning Management System (LMS) logs (e.g. logins, page views, forum posts,
quiz results) are a primary source, LA isn’t limited to them. It can incorporate data
from student information systems, online tools, or even physical sensors. For
example, some advanced systems might include biometric or engagement data
(heart rate, facial expression, GPS location, etc.) alongside academic data. By
fusing data from different sources, LA provides a more comprehensive picture of
learning.
Authenticity: Much of the data in learning analytics is collected unobtrusively
during real learning activities. This means it reflects authentic learner behavior
in real settings, rather than just self-reports or one-off test results. Because data
is captured in the background as students interact with materials and activities, it
is less influenced by what students say they do and more by what they actually do.
This authenticity makes the insights more reliable and “evidence-based” –
grounded in actual behavior rather than opinions or memory.
Together, these features (often called the 4 S’s of learning analytics) enable powerful
analyses. Scale gives breadth, speed gives immediacy, diverse sources give richness,
and authenticity gives validity. For a teacher in training, understanding these
characteristics highlights why LA can reveal patterns that traditional assessments might
miss.
Types of Learning Analytics: Descriptive, Diagnostic, Predictive, Prescriptive
Learning analytics techniques can be categorized into different types, each answering a
particular question about the learning process. The main types are descriptive,
diagnostic, predictive, and prescriptive analytics, adapted from general data analytics
frameworks to the education context:
Descriptive Analytics – “What has happened?” This is the most basic form of
analytics, which summarizes past data to reveal what has already occurred. In
learning contexts, descriptive analytics often involves data visualization and
reporting of learning patterns – for example, charts of a student’s attendance or
average quiz scores over time. It provides insights like which assignments had the
lowest scores or what the distribution of grades looks like. Descriptive LA helps
instructors identify trends or anomalies in retrospect. It is typically achieved
through aggregating and mining existing data to report on historical performance.
Diagnostic Analytics – “Why did it happen?” This type digs deeper into causes
and correlations. Diagnostic learning analytics tries to explain why certain
outcomes occurred by examining relationships in the data. For example, if many
students struggled in a unit, diagnostic analysis might reveal that those who
skipped a particular prerequisite activity performed poorly, suggesting a causal
factor. Techniques like drill-down analysis, correlation, or statistical tests are used
to identify factors associated with successes or failures. Hypothesizing factors of
performance and satisfaction is a key aspect of diagnostic LA – it moves beyond
surface metrics to potential reasons (e.g., students who engaged in forum
discussions did better on the exam, indicating the value of discussion).
Predictive Analytics – “What will happen?” Predictive analytics uses patterns in
historical and current data to forecast future outcomes. In education, predictive
LA is commonly used to identify students at risk of dropping out or failing before it
happens. By analyzing indicators (like low engagement or past grades), algorithms
can predict which students are likely to need intervention. For instance, a model
might estimate a probability that a student will not submit the next assignment on
time. Predictive analytics empowers teachers and advisors to take proactive steps
(such as early warning alerts) to improve student success. It’s essentially about
using what we know now to anticipate what’s next.
Prescriptive Analytics – “What should we do?” This is the most advanced form,
which not only predicts future scenarios but also recommends actions to take.
Prescriptive learning analytics might suggest optimal interventions or learning
pathways for a student. For example, if predictive models identify a student as at-
risk in math, a prescriptive system could recommend that the student attend
tutoring or that the teacher provide supplemental material on certain topics. It
often uses machine learning and algorithmic decision rules to propose one or
more prescribed solutions or strategies. In short, prescriptive analytics closes the
loop by advising educators (or learners) on how to optimize outcomes given the
data.
These four types build on each other: descriptive analytics looks backward to describe,
diagnostic looks backward to explain, predictive looks forward to anticipate, and
prescriptive looks forward and suggests next steps. As a teacher in training, recognizing
these types helps you understand the kind of insight an analysis is providing. For
example, a simple grade book report is descriptive; a student risk alert is predictive; an
automated tutoring recommendation is prescriptive. Using all four in concert leads to a
comprehensive analytics approach.
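To make these four types concrete, here is a minimal Python sketch that walks a toy gradebook through all four questions. All names, scores, thresholds, and rules are invented for illustration; a real deployment would use trained models rather than hand-made cutoffs:

```python
# Hedged sketch: the four analytics types on a toy gradebook.
# All names, scores, and thresholds are hypothetical.

quiz_scores = {           # student -> quiz scores (0-100)
    "Ana":   [85, 90, 88],
    "Bilal": [60, 55, 40],
    "Chen":  [75, 70, 72],
}
forum_posts = {"Ana": 12, "Bilal": 1, "Chen": 6}  # engagement proxy

# Descriptive: what has happened?
averages = {s: sum(v) / len(v) for s, v in quiz_scores.items()}
print("Average quiz scores:", averages)

# Diagnostic: why did it happen? Inspect engagement next to outcomes.
for s in quiz_scores:
    print(f"{s}: avg={averages[s]:.1f}, forum posts={forum_posts[s]}")
# Low posters scoring low *suggests* (not proves) engagement matters.

# Predictive: what will happen? A crude hand-made risk rule stands in
# for what would normally be a trained statistical model.
at_risk = [s for s in quiz_scores if averages[s] < 65 or forum_posts[s] < 3]
print("Predicted at-risk:", at_risk)

# Prescriptive: what should we do? Map predictions to actions.
for s in at_risk:
    print(f"Recommend tutoring and a check-in email for {s}.")
```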
Six Dimensions of Learning Analytics (Greller & Drachsler, 2012)
When implementing learning analytics, it’s helpful to consider the framework of six
critical dimensions proposed by Greller and Drachsler (2012). These dimensions
describe the key aspects that define any learning analytics initiative:
Stakeholders: The people involved in or affected by learning analytics. This
includes data subjects (usually learners, who generate the data) and data clients
(the beneficiaries who use the analytics results). For example, students and
teachers are primary stakeholders – students generate data through their
activities, and teachers act on the analytics insights. Other stakeholders might be
academic advisors, administrators, or even parents, depending on the context. It’s
important to ask: Who will use the analytics, and who is it about?
Objectives: The goals and purposes of the analytics. What do we want to achieve
or find out? Common objectives include supporting student reflection, predicting
risk, improving instructional design, enhancing engagement, etc. Each LA
deployment should have clear objectives – e.g., “reduce dropout rates” or “help
students develop self-regulation skills.” Greller & Drachsler highlight two broad
categories of objectives: reflection (helping stakeholders understand and self-
evaluate) and prediction (forecasting outcomes to intervene early).
Data: The educational data being collected and analyzed, along with its context.
This dimension looks at what data is available – for instance, LMS logs, student
demographics, assessment results, discussion posts, etc. – and under what
conditions (where it comes from, how it’s shared). Key considerations include
data quantity, quality, variety, and privacy. Educational institutions often have lots
of data siloed in different systems; combining these can yield new insights.
However, data access can be limited by privacy protections – so this dimension
also considers how open or protected the data is.
Methods (Instruments): The techniques, tools, and algorithms used to analyze
the data. This dimension asks “How will we analyze the data to meet our
objectives?” Methods can range from simple statistical reports to complex
machine learning models. It also includes the theoretical approaches
underpinning the analysis (for example, social network analysis, educational data
mining, or learning theory frameworks). Essentially, this covers the analytics
software, data mining methods, dashboards, and any other instruments used to
transform raw data into meaningful information. The choice of method affects
what insights you can get – e.g., using network analysis might reveal peer
interaction patterns, while a regression model might identify predictors of
performance.
External Constraints: The constraints, requirements, or limitations coming
from outside factors. These often include ethical, legal, and organizational
constraints. For instance, data privacy laws (like GDPR or FERPA) may limit how
data can be collected or shared; school policies might require consent from
students; cultural norms might affect acceptance of monitoring. External
constraints ensure analytics is done responsibly – addressing questions of data
ownership, student consent, and acceptable use. They remind teachers to
consider: What are the policies or ethical boundaries we must respect? Ignoring
this dimension can lead to breaches of trust or compliance issues.
Internal Limitations (Competences): The internal capabilities and limitations
of those involved in the analytics process – essentially, the competencies required
to make use of LA. Even the best data and tools are useless if stakeholders don’t
know how to interpret or act on them. This dimension covers the data literacy and
skills of teachers, students, or administrators who use analytics. It asks whether
users have the training and understanding to derive meaningful insights and
implement changes. For example, a teacher needs some understanding of
reading data visualizations or statistical results. If not, professional development
may be necessary. Internal limitations could also include technical infrastructure
or support within the institution.
These six dimensions provide a holistic checklist when considering learning analytics in
an educational setting. As a teacher in training, you can use them to guide questions like:
Who is this analytics for and what is it trying to achieve? What data and tools are we
using? Are we allowed to do this, and do we have the skills to do it well? By addressing
Stakeholders, Objectives, Data, Methods, Constraints, and Competences, you
increase the chances that a learning analytics project will be educationally effective and
ethically sound.
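One lightweight way to apply the framework is to fill in all six dimensions before launching a project. The sketch below expresses such a checklist as a plain Python dictionary; the field names and sample answers are our own illustration, not part of Greller and Drachsler's paper:

```python
# Hedged sketch: a pre-launch checklist structured around the six
# dimensions. The sample answers are invented for illustration.
la_plan = {
    "stakeholders": ["students (data subjects)", "teachers (data clients)"],
    "objectives": ["reflection: weekly progress dashboards",
                   "prediction: early dropout warnings"],
    "data": ["LMS logs", "quiz results", "forum posts"],
    "methods": ["descriptive dashboards", "risk model"],
    "external_constraints": ["GDPR/FERPA compliance", "student consent"],
    "internal_limitations": ["data-literacy training for teachers"],
}

# A project is only ready when every dimension has an answer.
missing = [dim for dim, answers in la_plan.items() if not answers]
print("Unaddressed dimensions:", missing or "none")
```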
Evidence-Based Learning and Instructional Improvement
Zimmerman’s three-phase model of self-regulated learning (SRL) comprises Forethought (planning and goal-setting), Performance (implementing strategies and monitoring progress), and Self-Reflection (evaluating performance and adjusting). Learning
analytics can act as a “reflection amplifier,” feeding objective feedback into the Self-
Reflection phase for both learners and teachers.
One of the most powerful roles of learning analytics is supporting evidence-based
learning – using concrete evidence to inform decisions in the learning process. We can
think of analytics as providing “learning evidence” that can trigger reflection and
improvement. For students, this means data-driven feedback can prompt them to reflect
on their learning strategies and outcomes. For teachers, analytics offers an objective
mirror of what’s happening in their classroom, which can inform instructional
adjustments. In this way, LA serves as a “reflection amplifier”, enhancing how we
evaluate and respond to learning progress.
In fact, learning analytics aligns closely with established educational theories of self-
regulation and continuous improvement. As noted in one study, analytics can be seen as
part of the self-evaluation process in Zimmerman’s SRL model. Students who receive ongoing data about their performance
(e.g. time spent, quiz scores, concept mastery levels) have concrete evidence to reflect
upon, which strengthens the self-reflection and adjustment cycle. A teacher in training
can leverage this by, for example, showing students their learning dashboard and guiding
them to set goals or adjust study strategies based on the data.
For instructional improvement, analytics provides a form of evidence-based practice.
Rather than guessing which teaching methods are effective, instructors can rely on
analytics findings. For instance, if analytics show that students who engage with a certain
interactive simulation achieve higher mastery, a teacher can increase the use of that
simulation or spend more time on its related content. Or if a heatmap of an online course
shows many students stop reading at a particular slide, the teacher might redesign that
slide or add an engaging element at that point. In essence, every piece of data is
feedback: some feedback is aimed at students (to adjust their learning tactics), and
some is aimed at instructors (to adjust their teaching tactics).
Using data as evidence helps in making informed changes. This ties into the concept of
“closing the loop” in the learning analytics cycle (data → insight → action → improved
learning). Research in learning analytics emphasizes that data by itself doesn’t improve
learning; it’s the actions taken based on data that lead to improvement. Thus, teachers
should approach analytics with the mindset of an iterative improvement cycle: gather
evidence, reflect on what it means, implement an instructional change or intervention,
and then observe the new data to see if it helped.
Finally, learning analytics contributes to building a culture of evidence-based decision
making in education. When teachers in training get used to looking at data (like
assessment results, engagement metrics, etc.) to celebrate successes or identify
problems, they are effectively using evidence to guide their practice. Over time, this can
lead to a more reflective and scientific approach to teaching – akin to action research,
where you hypothesize an instructional strategy, test it out, and use data to confirm its
impact. In summary, learning analytics embeds an evidence-based ethos: it helps
quantify learning, thereby making the process of improvement more transparent and
systematic.
Applications of Learning Analytics in Teaching and Learning
Learning analytics can be applied in many ways to enhance teaching and learning. Here
we focus on four key applications particularly relevant to teachers in training:
personalized learning, improved student performance, enhanced teaching
strategies, and data-driven decision-making. These often overlap – for example,
personalizing learning can lead to improved performance – but each highlights a distinct
benefit of leveraging analytics.
Personalized Learning
One of the most exciting uses of analytics is to enable personalized learning
experiences. Every student learns differently, and analytics provides the data needed to
customize instruction to those di erences. Here’s how learning analytics supports
personalization:
Individualized Learning Paths: By analyzing what a learner has mastered and
where they struggle, an analytics system can suggest the next appropriate topic
or activity for that student. For example, if a student breezed through practice
problems on Topic A but performed poorly on Topic B, the system might
recommend remedial resources for Topic B before moving on. This adaptive
sequencing ensures students get content and support tailored to their actual
needs and pace.
Targeted Interventions: Teachers can use dashboards to filter students by
certain criteria and then intervene appropriately. For instance, a teacher might
identify a group of students who haven’t accessed the course in over a week and
then reach out to re-engage them. Or, if analytics show a student attempts an
assessment multiple times without success, the teacher can offer one-on-one
tutoring on that topic. Such data-informed targeting means support is given when
and where it’s needed most, rather than a one-size-fits-all approach.
Differentiated Instruction: Analytics can reveal learning preferences and
content effectiveness for different students. Suppose some students
consistently engage more with video tutorials while others prefer text readings. A
teacher could use this information to offer materials in multiple formats, or to
guide students toward the format where they learn best. Similarly, tracking which
content each student spends time on (and how that correlates with performance)
helps in tailoring instruction. This is essentially personalization at scale: even in a
large class, data helps the teacher differentiate tasks (like offering an advanced
problem set to those who are ahead, while providing revision exercises to those
who are behind).
Personalized and Timely Feedback: Learning analytics systems often provide
automated feedback to learners. For example, an intelligent tutor might say, “You
seem to have mastered concept X; great job! Now here’s a challenge
question,” or “You struggled with concept Y; review this hint or video.” This kind
of immediate, personalized feedback keeps learners informed of their progress
and what to focus on next. Teachers can augment this by adding personal
comments based on the analytics (e.g., congratulating a student who improved
their quiz scores significantly, as shown on the dashboard).
Overall, personalized learning fueled by analytics leads to a more student-centered
classroom. Students feel more seen – the instruction responds to their performance.
Teachers in training should note, however, that personalization requires careful
interpretation of data and a responsive approach. It’s about using analytics as a compass
to navigate each student’s journey, ensuring everyone reaches their learning destination
through a path that works for them.
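As a deliberately simplified illustration of the adaptive-sequencing idea above, the sketch below routes each student from per-topic mastery scores to a next step; the topics, thresholds, and resource names are all hypothetical:

```python
# Hedged sketch: rule-based adaptive sequencing from mastery data.
# Topics, thresholds, and resource names are invented.
mastery = {"Topic A": 0.92, "Topic B": 0.48, "Topic C": 0.71}

def next_step(topic: str, score: float) -> str:
    if score < 0.60:
        return f"remedial video and practice set for {topic}"
    if score < 0.80:
        return f"mixed review quiz for {topic}"
    return f"challenge problems for {topic}"

for topic, score in mastery.items():
    print(f"{topic} ({score:.0%} mastered): {next_step(topic, score)}")
```

Real adaptive systems use far richer models, but the core pattern is the same: mastery evidence in, differentiated next activity out.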
Improving Student Performance and Outcomes
Learning analytics has a strong track record in helping to boost student performance,
largely by identifying problems early and enabling proactive support. Several strategies
illustrate how teachers can use analytics to improve outcomes:
Early Warning Systems: Many institutions use analytics to create early warning
indicators for at-risk students. For example, if a student’s grade drops two weeks
in a row, or if they haven’t logged into the course for 10 days, the system can flag
this student for attention. Teachers or advisors receive these alerts and can then
act – maybe by contacting the student or providing additional resources. Research
has found that such systems can significantly reduce failure rates when acted
upon, because they catch issues (low engagement, poor understanding, personal
difficulties) before those issues irreversibly impact final grades. (A minimal sketch of such a flagging rule appears after this list.)
Real-Time Progress Monitoring: Teachers can improve performance by
continually monitoring class and individual progress via analytics dashboards.
Instead of waiting for the final exam to see who didn’t understand the material, a
teacher might track weekly quiz scores, assignment submissions, or even time
spent on learning materials. If the data shows, say, that a student is consistently
scoring below passing by mid-term, the teacher can arrange a remedial plan or
tutoring right away. This ongoing insight ensures that students don’t “fall through
the cracks.” It also motivates students, as they can see their progress in real time
and are encouraged to keep up or seek help proactively.
Identifying Knowledge Gaps: Analytics can pinpoint specific content areas or
skills where students are underperforming. For instance, item analysis of quiz
questions might reveal that 70% of the class missed question 5, which was about
a particular concept. This indicates a possible knowledge gap or
misunderstanding. Armed with this information, a teacher can revisit that concept
in the next class, use a different teaching approach for it, or provide
supplementary exercises. Addressing these gaps promptly leads to stronger
overall performance on later assessments. It’s a shift from re-teaching after final
exams (when it’s too late for that cohort) to re-teaching in the moment for
immediate benefit.
Enhanced Feedback and Reinforcement: Improved performance is closely tied
to effective feedback. Analytics often provide the raw material for more nuanced
feedback. For example, rather than just giving a student a final grade on an essay,
a teacher with analytics might also note, “You started the assignment 2 days
before the due date, and spent 3 hours total. Students who started earlier tended
to score higher on this assignment.” This kind of feedback, drawn from data, not
only addresses the outcome (the essay quality) but also the process (study habits)
that led to it. By coaching students on process (time management, consistency of
work, etc.) using analytics, teachers help improve the behaviors that underlie
performance.
Data-Informed Intervention Programs: On a larger scale, schools can analyze
patterns of successful students vs. struggling students to inform programs. For
instance, analytics might show that students who attend extra study sessions or
use an online practice quiz bank improve their course grades by a letter on
average. This could lead to an intervention program where all students are
encouraged or even required to engage in those activities. In this way,
institutional strategies (like mentoring, supplemental instruction, study skill
workshops) can be guided by analytics findings to lift student performance
broadly.
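A minimal version of the early-warning rule from the first bullet might look like this sketch; the 10-day inactivity and two-weeks-of-falling-grades triggers mirror the examples above, but the data layout is our own assumption:

```python
# Hedged sketch: flagging at-risk students with two simple triggers.
# Record fields and example values are hypothetical.
from datetime import date, timedelta

today = date(2024, 10, 15)
students = [
    {"name": "Dana", "last_login": date(2024, 10, 1),
     "weekly_grades": [78, 74, 69]},
    {"name": "Eli", "last_login": date(2024, 10, 14),
     "weekly_grades": [82, 85, 88]},
]

def is_at_risk(s: dict) -> bool:
    inactive = (today - s["last_login"]) >= timedelta(days=10)
    g = s["weekly_grades"]
    dropping = len(g) >= 3 and g[-1] < g[-2] < g[-3]  # fell two weeks running
    return inactive or dropping

for s in students:
    if is_at_risk(s):
        print(f"ALERT: follow up with {s['name']}")
```

An alert like this is a prompt for human judgment, not a verdict: the teacher still decides whether and how to intervene.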
In summary, learning analytics helps turn reactive measures into proactive ones. By
continuously shining a light on student performance and engagement, it enables timely
support that can greatly improve learning outcomes. Teachers in training should view
analytics as a supportive tool – it won’t replace their personal engagement with students,
but it will inform them where to focus that engagement to make the biggest difference.
Enhancing Teaching Strategies
For teachers, one of the most valuable aspects of learning analytics is the insight it
provides into teaching effectiveness. Analytics can reveal which strategies are working
and which might need adjustment, thereby directly contributing to professional growth
and more effective teaching:
Insight into Teaching Methods: By analyzing student data, teachers can evaluate
the impact of their teaching strategies. For example, if a teacher tries a new
interactive activity in week 3 and then looks at the analytics, they might notice
higher student engagement that week (e.g., more forum posts, longer time spent
on the module) and better quiz scores immediately after, compared to prior
weeks. This suggests the activity was effective. Conversely, if introducing a
complex video lecture leads to many students replaying it multiple times or
quitting halfway (data which some platforms provide), it might signal that the
content was confusing or too long. In this way, analytics function like a feedback
loop for instructors, highlighting what pedagogical approaches yield better
student responses and understanding.
Experimentation and Continuous Improvement: Teachers can use analytics to
conduct small teaching experiments. For instance, split a class into two groups
– one gets additional peer discussion opportunities, the other doesn’t – and then
use analytics to compare engagement or performance metrics between the
groups. While formal experiments require rigor, even informal observation of
trends can be illuminating. Over time, a teacher builds a data-informed intuition:
“When I do X, student participation increases, but when I do Y, it decreases.” This
leads to continuously refined strategies, much like a scientist refining a
hypothesis. In fact, this process turns teaching into a form of action research
backed by data. (A worked comparison sketch appears at the end of this subsection.)
Personalizing Teaching Tactics: Just as data helps personalize student learning,
it also helps personalize teaching. A teacher in training might discover through
analytics that different groups of students respond differently to certain methods.
For example, visual learners might thrive with video content (as seen by their high
completion rates of video lessons), whereas others might excel when given hands-
on assignments. A savvy teacher can adapt by incorporating multiple
representations of content. They might start a concept with a brief video and a
written explanation, then check analytics to see which format students accessed
more or performed better after. Adjustments can then be made in real-time or for
the next cohort.
Informed Reflection and Self-Assessment: Many teaching standards encourage
instructors to reflect on each lesson – what went well, what didn’t. Learning
analytics provides concrete data to support this reflection. Instead of purely
relying on memory or anecdotal student comments, a teacher can look at, say, the
drop in activity during a particular unit, or the improvement in assignment scores
after a deadline extension, and reflect on the underlying causes. It adds an
objective lens to teacher self-evaluation. For example: “I notice from the LMS
report that only 50% of students completed last week’s reading before class,
which might be why the discussion fell flat. Perhaps I need to make the reading
more engaging or hold students accountable with a quiz.” This kind of insight leads
to actionable changes in teaching strategy.
Sharing Best Practices: On a broader level, when multiple instructors use
analytics, they can compare notes and share strategies. For instance, if one
teacher’s class consistently shows higher engagement on the analytics
dashboard, colleagues might inquire what strategies they’re using. This
encourages a collaborative, data-driven professional community. Teachers in
training who learn to use analytics will be equipped to contribute to such
discussions with evidence (“In my class, using weekly online polls increased
participation by 20% as seen in the analytics”). It moves pedagogical discussions
beyond philosophy into the realm of measurable outcomes.
In short, learning analytics serves as a coach for teachers. It provides feedback on
teaching the way a fitness tracker provides feedback on exercise – by measuring relevant
metrics. With it, teachers can refine their craft, trying new techniques and immediately
seeing the effects. This leads to more effective teaching strategies and, ultimately, a
better learning experience for students.
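If you want to go beyond eyeballing trends, a two-sample t-test is one common way to compare two teaching conditions. The sketch below assumes SciPy is available and uses invented scores; a real comparison would also need adequate group sizes, attention to confounds, and any required ethical approval:

```python
# Hedged sketch: comparing two teaching conditions on quiz scores.
# Scores are invented; this illustrates the mechanics only.
from scipy import stats

with_discussion = [78, 85, 82, 90, 74, 88, 81]     # group A quiz scores
without_discussion = [72, 70, 80, 68, 75, 73, 77]  # group B quiz scores

t, p = stats.ttest_ind(with_discussion, without_discussion)
print(f"t = {t:.2f}, p = {p:.3f}")
if p < 0.05:
    print("Difference is unlikely to be chance alone; worth a closer look.")
else:
    print("No clear evidence of a difference; keep observing.")
```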
Data-Driven Decision Making in Education
Beyond the classroom-level changes, learning analytics informs broader decision-
making in educational contexts. For teachers (and future school leaders), being data-
driven means that decisions about curriculum, resource allocation, and policy are
grounded in evidence rather than hunches. Key aspects include:
Curriculum and Content Decisions: Analytics can show which parts of a
curriculum are working well and which are not. If data over several semesters
indicate that students consistently struggle with a particular module (low quiz
scores, many forum questions, etc.), educators might decide to redesign that
module or offer additional support when teaching it. Conversely, if a new e-
learning tool introduced for a unit is correlated with improved outcomes, a
decision might be made to integrate it into more courses. Curriculum committees
can use aggregated analytics to decide where to focus revisions. Thus, the
curriculum evolves based on what the evidence suggests will improve learning.
Resource Allocation: At a school or department level, learning analytics can
highlight where resources are most needed. For example, analysis might reveal
that freshman courses have the highest rates of D/F grades or dropouts. In
response, a school might allocate extra tutoring services or supplemental
instruction sessions to those courses. Or, analytics might show low usage of a
particular educational software the school has licensed, prompting a decision to
either train teachers to use it more effectively or to invest in a different tool.
Essentially, data directs where to invest time, money, and support for maximum
impact.
Policy and Planning: Data-driven decision-making extends to policies like
attendance requirements, grading schemes, or support programs. If learning
analytics at a district level show that schools implementing a mandatory
homework policy have better student performance, other schools might adopt
similar policies. Similarly, if flexible deadline policies correlate with equal or
better learning outcomes (perhaps by reducing student stress), an institution
might decide to encourage that practice. On a planning horizon, analytics can help
in forecasting needs – for example, predicting enrollment in certain courses,
which informs how many class sections or teachers will be needed. This
predictive use of data makes institutional planning more accurate.
Evaluating Innovations: When new initiatives are tried (be it a flipped classroom
model, a new app, or a professional development program for teachers), learning
analytics is crucial for evaluating their effectiveness. Rather than assuming an
innovation works, data can confirm or challenge that assumption. For instance, a
school implements project-based learning in some classes – analytics could then
compare student engagement and achievement data between project-based
classes and traditional classes. If the data shows improvement, leaders can
decide to scale up the innovation; if not, they may pivot to a different approach.
This fosters a culture of continuous improvement at the organizational level,
driven by evidence.
Building a Data-Driven Culture: Embracing data-driven decision making means
cultivating an environment where educators regularly look at data and feel
empowered to act on it. Teachers in training should develop data literacy and a
comfort with basic analytics tools so that using data becomes second nature.
Schools might establish “data teams” or set aside time for teachers to review
analytics reports (for example, weekly meetings to discuss recent trends and plan
interventions). When all stakeholders – teachers, administrators, even students –
treat data as a valuable input for decisions, the education system becomes more
responsive and effective. However, it’s equally important to remember that data-
driven does not mean data-dictated: human judgment and pedagogical context
are always needed to interpret analytics correctly.
In conclusion, data-driven decision making in education ensures that choices from the
classroom to the boardroom are informed by real evidence of learning. For a teacher in
training, developing this mindset means always asking “What does the evidence say?”
and being willing to seek out or collect data to answer that question. This approach leads
to more transparent, rational, and effective educational practices.
The eLearning Analytics Cycle
When implementing learning analytics, it’s useful to think in terms of a cycle or loop that
turns raw data into actionable improvements. One model of an e-learning analytics cycle
breaks it into four key parts:
1. Learning Environment (Data Generation): This is where the data originates. It
includes all the platforms and contexts where learning takes place and
stakeholders produce data. For example, an online course in a Learning
Management System, a discussion forum, quizzes, surveys, or even physical
classroom clickers – these environments are where students and teachers
interact, and thus generate user activity data. In this stage, every click,
submission, or interaction by a learner becomes a data point (e.g., a student
answers a quiz question, or accesses a video lecture).
2. Big Data (Data Collection & Storage): All those individual data points are
collected into a massive dataset (hence “big data”). The data from the learning
environment is aggregated and stored, often in a learning record store or
database. At this part of the cycle, we have raw data – which might include logs of
every action, timestamps, user IDs, content items, etc. This data may also be
combined with other institutional data (like student information systems
containing demographics or prior grades) to enrich it. The result is a
comprehensive dataset that can be analyzed. Key here is that this big data needs
to be cleaned, stored securely, and made accessible for the next stage.
3. Analytics (Data Analysis & Metrics): In this stage, we apply analytical
techniques and metrics to the collected data. This is the “number crunching”
part – running reports, visualizations, statistical models, or machine learning
algorithms on the data. The goal is to transform raw data into meaningful insights
or indicators. For example, calculating each student’s course progress
percentage, generating an engagement score, predicting a risk level, or clustering
students into participation categories are all analytics processes. Often this
involves dashboard tools that present various metrics (e.g., average quiz score,
time spent per week, network centrality in a forum, etc.). By the end of this part,
the data has yielded information – patterns and findings that we can interpret.
4. Review/Action (Insight to Practice): In the final stage, the analytics results are
reviewed against the objectives and used to optimize the learning environment.
Essentially, we close the loop by taking action based on the insights. For example,
if the analytics showed certain students at-risk, this is when the teacher
intervenes (emails them, offers help). Or if a particular content item is
underperforming, the teacher modifies it. This stage can also involve feeding the
insights back into the environment in automated ways: sending alerts to students
(“you haven’t logged in this week”), creating personalized dashboards for
instructors or learners to view, or even adaptive systems that alter the course
content (providing learning-path suggestions based on the analytics). After
actions are taken, the cycle starts anew – the learning environment has changed
or learners respond to interventions, generating new data, which will be collected,
analyzed, and reviewed in turn.
Importantly, this cycle is iterative and continuous. Successful learning analytics isn’t a
one-off analysis but a recurring process where data is constantly informing adjustments
to improve learning, and those adjustments produce new data to be analyzed. In the
context of an academic term, this cycle might loop on a weekly basis (weekly reports
leading to weekly tweaks in instruction), and in a larger context, it can loop each term or
year (program-level analytics leading to curriculum changes the next year, for example).
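The loop can also be pictured in code. In the sketch below, every function is an invented stub standing in for whatever your platform actually provides; the point is the shape of the cycle, not any real API:

```python
# Hedged sketch: the four-part analytics cycle as a weekly loop.
# All functions are invented stubs, not a real platform's API.

def collect_events(week):            # 1. learning environment generates data
    return [{"student": "Ana", "quiz": 60 + 5 * week}]

def store_and_clean(events):         # 2. aggregate raw events ("big data")
    return [e for e in events if e.get("quiz") is not None]

def analyze(dataset):                # 3. turn data into indicators
    return [e["student"] for e in dataset if e["quiz"] < 70]

def act_on(struggling):              # 4. review & action close the loop
    for name in struggling:
        print(f"Intervention: check in with {name}")

for week in range(1, 4):             # the cycle repeats, week after week
    act_on(analyze(store_and_clean(collect_events(week))))
```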
Illustration of learning analytics data flow from user activity to insight. In this schematic,
individual student activity data (from an LMS, student information system, etc.) and
institutional data (demographics, academic records, etc.) feed into an analytics cloud
or learning record store for processing. The results are then delivered back as
actionable insights: for example, personalized learning paths, real-time alerts, or
instructor dashboards to guide decision-making.
For a teacher in training, understanding this cycle emphasizes that learning analytics is
not just about tools, but about a feedback loop. You collect data, analyze it, act on it,
and then repeat – continually closing the loop to get ever closer to optimal learning
experiences (this concept is sometimes called “closing the loop” in learning analytics).
Keeping the cycle in mind ensures that you always plan for the crucial last step: acting on
the analytics. Without that, analytics might yield interesting charts, but no real change in
learning.
Challenges and Considerations in Learning Analytics
While learning analytics offers powerful benefits, it also comes with important
challenges and considerations. Teachers in training should be aware of these issues to
use analytics responsibly and effectively:
Privacy and Ethical Issues: Learning analytics involves collecting detailed data
on student behavior, which raises privacy concerns. Educators must ensure they
comply with data protection laws and ethical standards. Students should know
what data is being collected and how it’s used. There can be a tension between
wanting as much data as possible for insight and respecting individuals’ rights. For
example, tracking a student’s location via an app might be technically useful but
is likely a privacy overreach. Ethical use of analytics also means avoiding any
misuse of data (e.g., not publicly shaming a student based on their dashboard
metrics) and ensuring informed consent where appropriate. Schools often
develop policies and guidelines to address these issues, and as a teacher, it’s
crucial to follow them – treating student data with the same care as sensitive
personal information.
Data Interpretation and Bias: Data and analytics are not neutral – they require
interpretation, and they can embed biases. One must be careful not to take
analytics at face value without critical thinking. For instance, if an algorithm
predicts a student is likely to fail, that prediction might be based on historical data
that contained biases (perhaps underestimating certain groups). Blindly acting on
it could reinforce disparities. Also, analytics often deal in probabilities, not
certainties. A teacher should avoid labeling a student as “doomed to fail” just
because a risk model says so; instead, use it as a prompt to provide support,
recognizing the prediction could be wrong. Visualization complexity is another
factor – analytics dashboards can be complex, and misreading a chart might lead
to wrong conclusions. Always consider alternative explanations for what the data
shows and, if possible, corroborate analytics with your own observations or other
feedback.
Pedagogical Alignment: A common critique is that some analytics systems push
a behaviorist or quantitative view of learning (what can be easily measured) at
the expense of deeper learning aspects. For example, analytics might count
discussion posts, but quantity doesn’t equal quality of learning. Teachers should
be cautious that the metrics available don’t skew focus away from important but
less measurable outcomes (like creativity or critical thinking). There’s also a risk
of “teaching to the dashboard” – where instructors focus narrowly on improving
the metrics rather than the underlying learning (similar to teaching to the test). To
mitigate this, use analytics as one input among many, and combine quantitative
indicators with qualitative insights about student learning. Align analytics with
sound pedagogy: if collaboration is a goal, don’t just measure logins – measure
something meaningful about collaboration (perhaps via peer assessment
analytics or social network analysis of interactions).
Student Perceptions and Motivation: How students react to analytics is a
consideration. If not implemented carefully, dashboards can potentially
demotivate or stress students. For example, a student seeing themselves at the
bottom of a leaderboard may feel discouraged. There’s also the phenomenon of
gaming the system – students might focus on whatever the analytics measures,
sometimes in superficial ways, just to improve their metrics (e.g., clicking through
all pages rapidly to mark them “completed” without actually learning, if
completion is tracked). To address this, teachers should frame analytics to
students in a supportive way (“this is information to help you, not judge you”) and
ensure that multiple measures of learning are considered. It might be wise to keep
certain analytics for teacher-use only (like risk predictions) and not expose all data
to students, or to use opt-in approaches where students choose to see comparative
analytics.
Technical and Skill Challenges: Implementing learning analytics requires the
right tools and skills. Not all schools have advanced LMS reporting or data
scientists on hand. For a teacher, a practical challenge can be data overload –
there may be so much data that it’s hard to know what to focus on. Developing the
ability to filter out noise and identify meaningful patterns is key. Additionally, as
mentioned in the six dimensions, teacher competences in data literacy are
crucial. A challenge for many is learning to use analytics software and
interpret statistics. Ongoing professional development and support (perhaps data
coaches or IT support in the school) can help teachers overcome this. There can
also be technical hurdles: integrating data from multiple sources, ensuring data
accuracy, and having analytics tools that play nicely with your LMS. Teachers
should be prepared for a learning curve and advocate for the technical support or
training they need.
Validity and Actionability: Finally, one must ask: Are we measuring what we think
we’re measuring? If an analytics metric is supposed to represent “engagement,”
is it valid? (Maybe a student can be mentally engaged without clicking a lot, which
wouldn’t show up.) Data quality issues (missing data, errors) can also mislead.
And even good data is useless if it’s not actionable – so a challenge is aligning
analytics with things you can actually do something about. For instance, knowing
that some students prefer night study might not be actionable for teaching,
whereas knowing who hasn’t mastered a prerequisite skill is actionable. Always
consider the actionability of an analytic: if a metric doesn’t lead to a potential
action, its value is limited. This ties back to setting clear objectives – collect data
that has a purpose.
In summary, while learning analytics is a powerful tool, it must be used with care. Ethical
considerations, critical thinking in interpretation, and proper training are all essential. For
teachers in training, the takeaway is to approach analytics as a helpful guide, but always
apply your professional judgment and ethical standards when using data in educational
decisions.
LMS Reporting and Useful Metrics to Track
Most learning happens (or is recorded) in a Learning Management System (LMS) or
similar platforms. LMS reporting tools are therefore a primary source of learning
analytics data for teachers. Modern LMSs (like Moodle, Canvas, Google Classroom, etc.)
offer built-in reports and analytics features that can give you a wealth of information
about student activity and performance. Being familiar with these and knowing which
metrics to track will help you monitor your class and intervene effectively.
Here are some useful LMS metrics and reports for teachers in training to pay attention to:
Course Progress & Completion: Metrics like course progress percentage or
module completion rates show how far along students are. You can quickly see
who has completed assignments or modules and who is falling behind. A drop-off
report might highlight at which point students tend to stop or drop out of the
course – useful for identifying problematic units.
Enrollment and Attendance: Reports on how many students are actively enrolled
and participating. For example, active enrollments vs. inactive/withdrawn
students, or a last access report showing the last login date for each student. If a
student hasn’t accessed the course in a long time, that’s a red flag to follow up. In
blended or in-person settings, attendance and participation logs serve a similar
purpose.
Time Spent (Engagement Time): Many systems track the total time a student
spends on the course or on specific activities. While not perfect, it gives a rough
gauge of engagement. If a usually good student only spent 5 minutes on a
homework reading (when others spent 30), that might explain a poor quiz
performance and prompt a review. Conversely, tracking total time online each
week can help identify students who may be struggling (e.g., spending too much
time may indicate they’re confused and rereading content repeatedly). (A sketch after this list shows how such metrics can be computed from a raw activity log.)
Activity Participation: This includes metrics like number of forum posts or
comments, number of quizzes attempted, assignments submitted on time, etc.
User activity tracking can highlight how engaged each student is in discussions or
other interactive elements. For instance, a report might show that Student A
posted 5 messages and replied to 3 peers, whereas Student B hasn’t posted at all.
This could lead you to encourage Student B to participate or check if something is
hindering them.
Assessment Performance: Your LMS gradebook and quiz analytics are key. Look
at average quiz scores, distribution of grades, and individual question
breakdowns. Many systems provide item analysis – e.g., what percentage of
students chose each multiple-choice option. Useful metrics include
quiz/assessment performance reports that identify which questions most
students missed. Also, attempts and answers breakdown can show if students are
guessing (multiple attempts, wide variance in answers) or consistently getting
certain types of questions wrong. Tracking assignment grades over time can reveal
trends (is performance improving after a certain point?).
Content Engagement & Preferences: LMS reports often show which resources
are viewed most. For example, most viewed pages or files, or video analytics that
indicate how much of a video was watched. If some content is rarely accessed or
quickly skimmed, students might be finding it less useful or engaging.
Alternatively, if supplementary materials (like an optional reading or a practice
quiz) have high usage, that tells you those were valued and perhaps should be
made central next time. Content preference data helps in refining course
materials.
Learning Path/Navigation: Some LMS analytics can chart the path a learner takes
(sequence of content accessed). A learning path report might show, for instance,
that students who first review lecture slides before attempting the quiz do better
than those who jump straight to the quiz. Understanding common paths can
identify effective study patterns to recommend. It can also show if students are
skipping around in an unintended way (maybe the navigation is unclear).
Completion of Key Activities: Metrics like number of students who completed a
particular activity or prerequisite can be useful. For example, how many
attempted the practice quiz or how many accessed the library article. Low
numbers might indicate students aren’t finding or bothering with an activity,
suggesting you might need to integrate it more tightly or emphasize its importance.
Engagement Score or Composite Indices: Some systems provide an
“engagement score” for students, which is a composite of various actions (login
frequency, participation, assignments on time, etc.). While each system
calculates it differently, it can be a quick way to identify students who are generally
disengaged versus highly engaged. Treat such scores as starting points for
investigation.
Gamification and Motivation Stats: If your course uses gamification elements
(badges, points, etc.), track those. For example, a badges earned report can show
who has unlocked which achievements. While somewhat peripheral, it can
indicate who is exploring bonus content or going above and beyond (earning
optional badges), or conversely, if no one is earning badges, maybe that feature
isn’t motivating as intended.
Comparative Reports and Trends: Some LMSs allow you to compare sections or
look at trends over time. For instance, comparing this semester’s class average to
last semester’s on the same assessments (useful if you teach the same course
again). Or monitoring weekly progress: Week 1 saw 90% completion of tasks,
Week 2 saw 80% – why the drop? These trends help in course pacing and
identifying tough weeks where students may need extra support.
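To show how a few of these metrics fall out of a raw activity log, here is a hedged sketch; the log format is an invented simplification (real LMS exports, such as Moodle’s logs, have their own schemas):

```python
# Hedged sketch: deriving simple metrics from an activity log.
# The log format is invented; adapt it to your LMS's real export.
from collections import defaultdict
from datetime import datetime

log = [  # (student, action, timestamp, minutes spent)
    ("Ana",   "view_page",  "2024-10-01 09:00", 12),
    ("Ana",   "forum_post", "2024-10-02 10:30",  8),
    ("Bilal", "view_page",  "2024-09-20 14:00",  3),
]

time_spent = defaultdict(int)   # engagement time per student
forum_posts = defaultdict(int)  # activity participation
last_access = {}                # attendance proxy

for student, action, ts, minutes in log:
    time_spent[student] += minutes
    if action == "forum_post":
        forum_posts[student] += 1
    stamp = datetime.strptime(ts, "%Y-%m-%d %H:%M")
    if student not in last_access or stamp > last_access[student]:
        last_access[student] = stamp

for s in sorted(time_spent):
    print(f"{s}: {time_spent[s]} min, {forum_posts[s]} posts, "
          f"last seen {last_access[s]:%Y-%m-%d}")
```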
When tracking these metrics, the key is not to be overwhelmed by data but to focus on
those that align with your educational goals. For example, if fostering active learning is a
goal, then discussion participation and time on task are crucial metrics. If mastery of
content is key, then quiz performance and completion rates matter. Use the LMS
reporting features to spot patterns: Are there students who consistently stand out (either
high or low engagement)? Are there weeks where everyone slacks off (perhaps a sign of
external stress or a less engaging topic)? Are certain resources underutilized?
Once you identify a pattern, close the loop by acting on it. LMS metrics should trigger
questions and actions: Student X hasn’t logged in — I’ll reach out. Half the class didn’t
finish the project on time — was it too hard or unclear? Let’s gather feedback. Everyone
watched the bonus video — maybe I’ll include more like it.
Finally, remember that metrics are indicators, not full stories. Combine them with your
personal knowledge of students. Use them to enhance your intuition, not replace it. With
practice, reading LMS reports will become a routine part of your teaching workflow,
enabling you to be a more responsive and data-informed educator.