
TECHNOLOGY-MEDIATED
FORMATIVE ASSESSMENT:
EFFECT ON STUDENTS'
PERFORMANCE LEVEL IN PHYSICS

Ogarte, Charmalou P.
Teacher III
Zamboanga del Norte National High School-Turno
East District
Region IX

Technology-Mediated Formative Assessment: Effect on
Students' Performance Level in Physics

Ogarte, Charmalou P.

Abstract

This study aimed to investigate the effect of technology-mediated formative
assessment on the level of understanding of Grade 8 students of
Zamboanga del Norte National High School-Turno during the school year
2023-2024. The Quasi-Experimental Design 10 was employed. A total of 71
Grade 8 students from two sections of the Enhanced Basic Education
Program participated: 35 students comprised the control group and 36
students comprised the experimental group. The students responded to a
50-item teacher-made test, and the data were collected and analyzed using the
mean, z-test, and t-test. The study revealed that the students' pretest
performance in the control and experimental groups did not vary
significantly, indicating that the two groups started at comparable levels of
development in the different areas covered in the study. After the
intervention, both groups improved, but the experimental group showed
significantly greater improvement than the control group. Using technology
in formative assessments is indeed effective in improving the academic
performance of students as compared to the traditional way of checking their
level of understanding. It is recommended that Science teachers employ
technology-mediated formative assessment in the classroom. It is also
recommended that school leaders and teachers at all levels be trained in the
use of technology-mediated formative assessment.

Keywords: Formative Assessment; Technology-Mediated; Academic Performance



Introduction

Assessment lies at the heart of the educational process, serving as a
critical tool to measure, evaluate, and enhance learning outcomes.
Assessment takes a broad and comprehensive perspective that
encompasses the entire educational setting, from national and institutional
policies to educational standards and program evaluation. It is a
multifaceted process that guides educational systems in shaping
curriculum, pedagogy, and overall instructional practices to foster optimal
student development and academic achievement.
At its core, assessment seeks to measure students' progress, evaluate
their understanding, and inform pedagogical decisions to enhance their
learning experiences. Two primary types of assessments commonly used in
educational settings are summative assessment and formative assessment.
Each assessment type serves distinct purposes and operates at different
points in the learning journey, providing valuable insights into students'
progress and understanding.
Formative assessment involves evaluating the accomplishments of
students and continually offering feedback to both teachers and students,
aiming to enhance teaching and learning. Additionally, it aids students in
enhancing their performance by preventing them from repeating mistakes
made in the past.
Assessment extends its influence to the international stage, where global
assessments such as the Programme for International Student Assessment
(PISA) and the Trends in International Mathematics and Science Study (TIMSS)
play a pivotal role. These assessments are conducted periodically across
participating countries, providing a platform for comparing educational
performances and identifying best practices worldwide.

Among the 79 participating nations when the Philippines first took part in PISA in
2018, Filipino students performed appallingly in reading comprehension,
mathematics, and scientific literacy (OECD, 2018). When Filipino students
took the said assessment in 2018, there were "factors that negatively impacted
their performance," according to Bay (2012), a Senior Psychometrics Director
at College Board (USA). Bay, who also serves as Senior Advisor of
Frontlearners, said that these factors include their lack of familiarity with the
testing environment as well as their lack of experience with technology-laden
assessment questions.
A variety of studies have been conducted recently on how to enhance
students' relevant learning of science. In order to develop
and produce a wide range of conceptual, procedural, and metacognitive
knowledge as well as a wider range of cognitive processes, science students
demand student-centered learning environments (Brindha, 2018). In order to
help students understand Physics on the macro, micro, and symbolic levels, a
variety of teaching methodologies have been proposed and applied (Miroslav et
al., 2018). Learning Physics in particular necessitates presenting topics
in ways that are both accurate representations of scientific concepts and simple
enough to be understood.
Technology plays a critical role in maintaining such a setting of active
involvement through enhanced content visualizations, simulations, and
modeling, as well as supporting laboratory instructions (Krause et al., 2017;
Haluk and Akbar, 2018). In a science classroom with these types of settings,
students can utilize cell phones, projectors, wireless internet access, interactive
whiteboards, laptop computers, tablets, and other developing technologies.
These tools, when utilized correctly, can improve student-centered education
(Awad, 2014; D’Angelo, 2018; Nawzad et al., 2018; Gupta and Belford, 2019).

However, there has not been enough empirical research on how
technology-supported formative assessment enhances students'
understanding of Physics in particular. In Zamboanga del Norte National High
School-Turno, most of the teachers employ the traditional approach.
Formative assessments are often conducted using pen-and-paper-based
methods. A significant challenge lies in providing timely feedback to students,
as this approach often encounters delays in analyzing and delivering feedback,
hindering its effectiveness in addressing immediate learning needs.
This study was designed to fill this gap by examining the effects
of technology-mediated formative assessment on the academic performance of
secondary school students in Physics. Specifically, it aimed to investigate the
impact of technology-mediated formative assessment on students' academic
performance in Physics.

Literature Review

This section discusses the literature and the results of other
related research to which the present study is related or bears some
similarity. The reviewed literature and studies gave the researcher sufficient
background for understanding the current study.
Technology-Mediated Formative Assessment. According to Sun and
Wu (2020), technology-mediated formative assessment refers to the use of technology
tools and platforms to implement ongoing, continuous assessment practices that
provide feedback to both teachers and students during the learning process. Wang and
Liu (2023) aver that unlike summative assessments, which evaluate learning at the
end of a unit or course, formative assessments are designed to monitor student
understanding, identify areas of difficulty, and inform instructional decisions in real
time. Technology-mediated approaches enhance traditional formative assessment
methods by leveraging digital tools and resources to collect, analyze, and interpret
student data more efficiently and effectively.

By the same token, Yu and Liu (2021) mentioned that one of the key benefits of
technology-mediated formative assessment is its ability to offer immediate and

personalized feedback to students. Digital platforms and tools can provide instant
feedback on quizzes, assignments, and activities, allowing students to understand
their strengths and weaknesses, identify misconceptions, and make necessary
adjustments to their learning strategies. This immediate feedback fosters a more
dynamic and interactive learning environment, promoting student engagement,
motivation, and self-regulation (Azevedo & Hadwin, 2021). Additionally, technology-
mediated formative assessment allows teachers to tailor instruction to meet the
individual needs of students, address misconceptions, and provide targeted support,
leading to enhanced learning experiences and improved academic performance.
The literature further emphasizes that technology-mediated formative
assessment also facilitates enhanced data collection and analysis capabilities
(Panadero et al., 2017). Digital tools can collect a wide range of data, including student
responses, interaction patterns, and progress over time. This rich data set enables
teachers to gain deeper insights into student learning processes, track learning
trajectories, and identify patterns or trends in student performance. Through analyzing
this data, educators can make informed decisions about instructional strategies,
curriculum development, and intervention methods, leading to more effective and
personalized learning experiences for students (Seufert, 2019).
The paper of Tsai, Lin, and Liu (2022) mentioned that as technology continues
to play a central role in education, the integration of technology-mediated formative
assessment into modern educational practices is becoming increasingly prevalent.
Digital platforms and tools are readily available and accessible, making it easier for
educators to implement technology-mediated formative assessment in their
classrooms. This integration aligns with the principles of learner-centered and
technology-mediated learning theories, emphasizing personalized learning, active
participation, and data-driven decision-making. Furthermore, technology-mediated
formative assessment prepares students for future learning environments that rely on
digital literacy, critical thinking, and adaptability, equipping them with essential skills
and competencies for success in the 21st century (Ifenthaler & Kim, 2023).
According to Alonso-Mencía (2022) and Guskey and McTighe (2020),
technology-mediated formative assessment plays a crucial role in enhancing learning
engagement and motivation among learners. By providing immediate feedback
and personalized learning experiences through digital platforms, learners are more
actively engaged in the learning process. The interactive nature of technology-mediated


assessments encourages students to participate more actively, as they receive instant
feedback on their performance, understand their progress, and identify areas for
improvement. This dynamic feedback loop fosters a sense of accomplishment and
motivates learners to set and achieve their learning goals. Additionally, the gamified
elements often incorporated into technology-mediated assessments can make learning
more enjoyable and engaging, further enhancing student motivation and commitment
to learning.
Studies by Hickey et al. (2012), as cited in Hagos and Andargie (2022), have shown
that technology can significantly enhance science education by allowing students to
delve deeper into scientific concepts, engage in a wider range of scientific activities,
and stay motivated throughout the learning process. Technology's impact goes beyond
the classroom experience and influences assessment as well.
Furtak et al. (2016) pointed out that educators now leverage technology to
conduct formative assessments and achieve various goals, including reaching a wider
student population, boosting student motivation, adapting lessons to meet student
needs, and providing students with personalized feedback and scaffolding.
Providing immediate feedback is a crucial part of helping students learn. This
feedback can help them solidify their strengths, identify areas for improvement, and
understand the steps they need to take to achieve their learning goals (Brown, Bull,
& Pendlebury, 1997, as cited in Bahati, 2019). There are many terms for using
technology to deliver this feedback in an educational setting. These terms include
formative e-assessment (Pachler, Daly, Mor, & Mellar, 2010; Pachler et al., 2009, as
cited in Bahati, 2019), online formative assessment (Koç et al., 2015;
Baleni, 2015), computer-based formative assessment (Braber-van den Broek & van
den Berg, 2013), or simply technology-enhanced formative assessment (Spector et al.,
2016).
Technology Integration in Education. Technology refers to
methods, systems, and devices which are the result of scientific knowledge
being used for practical purposes (Collins Dictionary, n.d.). The advent of
technology paved the way for a paradigm shift in different areas of society
including education. Countries worldwide that do not use the power of
technology in commerce, industries, communication, education and the
like lag behind in terms of industrialization and sustainable economic
development.
Technology refers to the use of scientific knowledge, tools, and
techniques to create, design, develop, and improve products, services, and
processes. It can include a wide range of tools, such as machinery,
electronics, software, and other forms of innovation.
The Merriam-Webster dictionary defines technology as "the practical
application of knowledge, especially in a particular area" (Merriam-Webster,
2021). Similarly, the National Academy of Engineering defines technology
as "the systematic application of scientific or other organized knowledge to
practical tasks" (National Academy of Engineering, 2021).
Some of the forms of technology included in the definition are smart
phones, tablets, laptops, desktop computers, speech generating devices,
interactive white boards, software for computers, and the internet.
According to Escueta et al. (2017), technological innovation over the
past two decades has indelibly altered today’s education landscape.
Revolutionary advances in information and communications technology
(ICT)—particularly disciplines associated with computers, mobile phones,
and the Internet—have precipitated a renaissance in education technology
(ed-tech), a term used to refer to any ICT application that aims to improve
education.

The integration of technology in education supports expanded and
easier access to quality education. It facilitates faster and more
efficient communication between teachers and learners, and it helps
learners achieve the learning outcomes in a more interactive manner.
At the K-12 level, much of the experimental evidence cited by Lanz
et al. (2014) suggests that giving a child a computer may have limited
impacts on learning outcomes, but generally improves computer
proficiency and other cognitive outcomes. The study further shows that
computer-assisted learning can be quite effective in helping students learn.
This study reveals a positive impact of using technology in the development
of the cognitive skills of learners.
A wide range of technology tools and platforms have been employed
in formative assessment practices. Learning management systems (LMS),
such as Moodle or Canvas, offer features for online quizzes, discussion
forums, and assignment submission, facilitating continuous assessment
and feedback (Tarhini et al., 2015). Online quiz and survey tools, such as
Kahoot! or Socrative, provide interactive and gamified assessment
experiences, enhancing student engagement and motivation. Digital
portfolios or e-portfolios offer opportunities for students to showcase their
learning progress and reflect on their work, promoting metacognition and
self-assessment.
The use of interactive whiteboards allows for the display and
manipulation of computer images, along with the capability to take and
save handwritten notes (BBC, n.d.). Notably, interactive whiteboards are
commonly linked to whole-class instruction rather than activities centered
on individual students. Student engagement is generally elevated in
classrooms where information and communication technology (ICT) is
readily available for student use.
Paper-Based Formative Assessment. Formative assessment is a
cornerstone of effective Physics education, allowing teachers to gauge
student understanding in real-time, identify areas of difficulty, and adjust
instruction accordingly to optimize learning outcomes (Black & Wiliam,
1998). Traditionally, paper-based formative assessments (PBFA) have been

the mainstay of classroom practice. However, with the increasing


integration of technology in education, technology-mediated formative
assessment (TMFA) is emerging as a valuable alternative (Amasha et al.,
2017).
While both PBFA and TMFA serve the purpose of formative
assessment, they differ in their methods of administration, feedback
delivery, and potential impact on student engagement. PBFA encompasses
a wide range of techniques that utilize paper and pencil to assess student
comprehension throughout a lesson. Common examples include exit
tickets, muddiest point cards, and quick quizzes (McMillan, 2018). These
methods offer a low-cost and readily available option for teachers. However,
PBFA can be time-consuming to administer and grade, and often lacks the
capacity for immediate feedback (Amasha et al., 2017). This delay in
feedback can hinder the opportunity for students to self-correct
misconceptions or address areas of weakness before moving on to new
concepts.
TMFA, on the other hand, leverages technology tools to assess
student understanding and provide immediate feedback during instruction
(Usher & Barak, 2018). Examples of TMFA tools include classroom
response systems (clickers), online quizzing platforms, and collaborative
whiteboards. These tools allow students to submit responses electronically
to teacher-posed questions, facilitating a more interactive learning
environment (Robertson et al., 2016). A key advantage of TMFA is the ability
to provide immediate feedback to students on their responses. This allows
for self-correction and reinforcement of understanding, potentially leading
to improved learning outcomes (Petrović et al., 2017). Additionally, TMFA
offers real-time insights for teachers. Based on the class responses
displayed on the platform, teachers can identify areas of difficulty and
adjust their teaching approach on the fly, tailoring instruction to meet the
specific needs of their students (Amasha et al., 2017).
Another potential benefit of TMFA lies in increased student
engagement. Interactive elements of TMFA tools, like voting or
gamification, can make formative assessments more engaging for students

compared to traditional paper-based methods (Usher & Barak, 2018). This


increased engagement can foster a more positive learning environment and
potentially lead to deeper understanding of the material.
However, TMFA is not without limitations. While PBFA requires
minimal resources, TMFA necessitates access to technology. This can be a
barrier in classrooms with limited technological infrastructure or where
students lack experience or comfort using technology for learning purposes
(Amasha et al., 2017).
Both PBFA and TMFA offer valuable tools for formative assessment in
Physics education. PBFA provides a low-cost and familiar option, but can
be limited by delayed feedback. TMFA offers the advantages of immediate
feedback, real-time adjustments in teaching, and potentially increased
student engagement, but requires access to technology and student
comfort with its use. Ultimately, the most appropriate approach depends
on several factors, including the specific learning objectives, available
resources, and student
Classroom Assessment. Classroom assessment, defined as the
process of identifying, collecting, organizing, and interpreting information
about learners' knowledge and skills, serves various purposes, including
fostering self-reflection and personal accountability (DepEd Order No. 8, s.
2015). Additionally, it provides the groundwork for profiling student
performance in alignment with the learning competencies and standards
outlined in the K to 12 Basic Education Program.
Classroom assessment, when carried out effectively, is dedicated to
securing the success of learners as they transition from guided to
independent demonstrations of knowledge, understanding, and skills.
Additionally, it aims to empower learners to effectively apply and transfer
their knowledge, understanding, and skills to navigate future situations
successfully.
In the realm of education, the terms "testing" and "assessment" are
frequently employed interchangeably to refer to the evaluation of student
learning. While a test is a specific form of assessment, typically associated
with traditional exams or quizzes, the term "assessment" is more inclusive.

Assessment encompasses a broader spectrum of activities, indicating the


gathering of information about student learning. This can extend beyond
tests to include a diverse range of techniques such as performance tasks,
portfolios, and observations (Rogler, 2014).
According to Alakaleek (2022), assessment differs considerably
from testing. While assessment is an ongoing process of collecting evidence
of what learners have learned and what they are expected to be capable of,
testing is built on formal and standardized measurements, where
pupils are familiar with the scoring procedures. Assessment is "any method
used to better understand the current knowledge that a student possesses."
Classroom assessment is the process of gathering evidence of what a
learner knows, what the learner understands, and what the learner is able
to do. There are two types of classroom assessment: formative assessment
and summative assessment. The formative assessment is an ongoing
process to provide learners with immediate feedback on how well they are
learning. Results of this assessment are documented but not included in
the computation of grades. On the other hand, summative assessment is
used to measure whether the learners have met the content and
performance standards. The results of this assessment are used as basis
for computing grades (DepEd Order No. 8, s. 2015).
It is important to note that this distinction refers to the purpose of an
assessment's use, not to an assessment tool itself. An individual assessment
cannot unequivocally be declared formative or summative, as this depends
on the inferences to be drawn. As noted by Black and Wiliam (2022), the
assessment's function is summative when inferences pertain to the
student's status or future potential; when the inferences revolve around
identifying activities that would optimally support the student's learning,
the assessment operates formatively.
Formative Assessment in Classroom. Formative assessment is the
main topic of interest in classrooms since it "provides teachers and
students with continuous, real-time information that informs and supports
instruction" (Ramsey & Duffy, 2016). In order to acquire data for increasing
student learning, it was imperative that teaching and learning be modified

for formative assessment to be prioritized as a key component of classroom


instruction. This is a result of the assistance that formative assessment
provides in guiding students throughout class to comprehend skills and
concepts and in making decisions about how to proceed in order to meet
the course learning objectives.
Formative Assessment Theory, pioneered by Black and Wiliam (2022),
as cited by Ramsey and Duffy (2016), is a pedagogical framework that
underlines the importance of ongoing and dynamic assessment during the
learning process. It centers on assessing student understanding and
progress continuously, providing timely feedback to both educators and
learners. Unlike summative assessment, which evaluates learning
outcomes at the end of a course or unit, formative assessment occurs
throughout the learning journey, guiding instructional decisions in real-
time. This theory posits that effective formative assessment strategies
empower educators to tailor their teaching methods to address the evolving
needs of individual students, ultimately enhancing learning outcomes.
At its core, Formative Assessment Theory encourages a student-
centric approach, emphasizing learner engagement and participation in
their own educational progress. It aligns with the idea that students should
be actively involved in monitoring their learning, setting goals, and
understanding how to achieve them. By promoting an environment where
constructive feedback is a fundamental component, formative assessment
theory seeks to foster a growth mindset, encouraging learners to embrace
challenges and view mistakes as opportunities for growth. As a result,
educators can optimize their teaching strategies, ensuring they are
effective, relevant, and supportive of every student's unique learning
journey.
Classroom formative assessment is an ongoing process of identifying,
gathering, organizing, and interpreting quantitative and qualitative
information about what learners know and can do. It can also measure the
achievement of competencies by the learners (DepEd, 2015).
Formative assessment conducted in different parts of the lesson
serves different purposes. Before the lesson, it helps teachers understand

where the students are in terms of conceptual understanding and


application and provides bases for making instructional decisions, such as
moving on to a new lesson or clarifying prerequisite understanding. During
the lesson, it informs teachers of the progress of the students in relation to
the development of the learning competencies. It also helps the teacher
determine whether instructional strategies are effective. The results of
formative assessment given may be compared with the results of the
formative assessments given before the lesson to establish if conceptual
understanding and application have improved.
On this basis, the teacher can make decisions on whether to review,
re-teach, remediate, or enrich lessons and subsequently, when to move on
to the next lesson. After the lesson, formative assessment assesses whether
learning objectives were achieved. It also allows the teacher to evaluate the
effectiveness of instruction (DepEd Order No. 8 s. 2015).
Technology Tools and Platforms for Formative Assessment in
Physics. Technology tools and platforms play a crucial role in facilitating
formative assessment practices in physics education. This review explores
the existing literature on the use of digital tools for data collection and
online platforms for formative assessment in physics. Specifically, it
focuses on motion sensors and probes, video analysis software, simulations
and virtual laboratories as digital tools, and learning management systems
(LMS), online quiz and survey tools, and collaborative learning platforms as
online platforms for formative assessment.
Inknoe Classpoint. ClassPoint serves as a Classroom Response
System seamlessly integrated into Microsoft PowerPoint. This tool
empowers users to transform their existing slides into interactive
presentations, facilitating the delivery of quiz questions directly within
PowerPoint without the need to switch to another application during
teaching. ClassPoint offers various question modes, such as multiple-
choice questions, short questions, and quick polls, among others (Inknoe,
2021).
Moreover, ClassPoint provides features that allow instructors to
incorporate unlimited whiteboards during a slide show and annotate slides.

Students can actively participate in quizzes and follow the instructor's slide
presentation using either their smartphones or computer-based devices. To
engage with the platform, students simply visit http://classpoint.app,
enter the class code, and create a username, which they will use
throughout the lesson.
The use of ClassPoint in certain courses complements traditional
teacher-centered lecture settings by promoting student engagement. It
enables students to showcase their learning progress and knowledge in an
enjoyable and interactive manner. While in some undergraduate courses at
SUTD, students' responses are not considered in course assessments, there
are others where quiz participation contributes to their attendance and
participation points.
Kahoot in Gamifying Assessments. Gamification, as defined by
Faiella and Ricciardi (2015), involves employing game elements in learning
activities. Its effectiveness in enhancing learning speed and efficiency in
non-game settings has been noted (Sailer & Homner, 2020). Various
authors highlight gamification as a potent motivator (Tan Ai Lin, Gonapathy
& Manjet, 2018; Wang, 2015; Zainuddin et al., 2020) and an effective tool
for increasing student engagement (Hanus & Fox, 2015; Kuo & Chuang,
2016).
Kahoot! serves as a digital game-based learning platform fostering
interactive engagement between teachers and students through competitive
knowledge games. This web-based platform, offering interactive quizzes,
surveys, points, leaderboards, instant feedback, and rewards, has been the
subject of prior studies mainly focusing on its impact on enhancing
engagement and motivation rather than students' grades and academic
achievements.
Simulations and Virtual Laboratories. Simulations and virtual
laboratories provide a digital environment for students to conduct
experiments and explore physics concepts. These tools allow students to
manipulate variables, observe simulations, and gather data for analysis.
Research has indicated that simulations and virtual laboratories enhance
students' engagement, conceptual understanding, and problem-solving

abilities (Smetana & Bell, 2012). They provide an opportunity for formative
assessment by presenting interactive scenarios and collecting students'
responses to evaluate their comprehension.
Academic Performance. According to Ming-Hung et al. (2016), the term
"academic performance" refers to the same concept as learning result,
academic achievement, and learning achievement, i.e., students' academic
learning outcome, or the consistent outcome throughout time. According to
Lubega et al. (2014), learning outcome is a key metric for assessing the
effectiveness of instruction as well as an indicator of how much students
have learned.
The materials reviewed provide ample evidence that the use of
technology-supported formative assessment in various disciplines affects
students' engagement and academic performance. However, they also
showed that there is limited research on how technology-mediated
formative assessment affects the academic performance of students in
Physics in particular. This motivated the researcher to conduct a study
investigating this effect.
This study is mainly anchored on Technology-Mediated Learning
theory, which emphasizes the scaffolding of students' learning by providing
immediate feedback, displaying it in usable ways, and assessing levels
of understanding, thereby dramatically enhancing formative assessment
practices, especially in science education.
Technology-Mediated Learning Theory, introduced by
Rosenberg (2001), elucidates the role of technology in facilitating and
enhancing the learning process. It recognizes that technology can act as a
powerful tool to deliver educational content, promote interactive learning
experiences, and provide personalized learning opportunities (Dhawan,
2020). This theory suggests that when technology is effectively integrated
into educational settings, it can support learners in acquiring knowledge,
developing skills, and achieving learning outcomes more efficiently and
effectively than traditional methods alone (Bower, 2019).
In the same vein, Sung et al. (2019) underscore the importance of
learner-centered approaches in the theory. It advocates for the use of

technology to cater to individual learning styles and preferences, thereby


promoting personalized and adaptive learning experiences. This approach
encourages active participation, collaboration, and engagement among
learners, fostering a more interactive and dynamic learning environment.
Additionally, technology can offer immediate feedback, resources,
and support, enabling students to take ownership of their learning and
progress at their own pace.
In addition, it acknowledges the evolving nature of technology and
its impact on education. It highlights the need for educators to continually
adapt and innovate their teaching practices to leverage the potential of
technology effectively (Lin & Huang, 2020). This involves integrating various
technological tools and platforms into the curriculum, providing
professional development opportunities for teachers, and fostering a
culture of lifelong learning and innovation within educational institutions.
Relating to the context of this study, the focus was on exploring the
effects of technology-mediated formative assessment methods on student
learning outcomes and performance in Physics. Formative assessment
involves ongoing, continuous assessment practices that provide feedback
to both teachers and students during the learning process, allowing for
adjustments and improvements to instruction and learning strategies. The
study employed technology-mediated formative assessment tools and
platforms to collect real-time data on student understanding,
misconceptions, and learning progress in Physics.
In this study, the researcher investigated the effect of Technology-
Mediated Formative Assessment on the academic performance of Grade 8
students at Zamboanga del Norte National High School for the school year
2023-2024.
The Schema of the Study is presented in Figure 1. The
topmost rectangle contains the subject under study, the Zamboanga del
Norte National High School Grade 8 students. Two arrows from the said
rectangle point to two blocks: the left block contains the students in the
control group and the right block contains the students belonging to the
experimental group. From both blocks, arrows point to the center block, in
which the pretest for both the control and experimental groups was
administered. An arrow from the center block points downward to the topics
included in the pretest. For the control group, the downward arrow leads to
the traditional method of assessing the students' level of understanding of
the different topics, while the arrow from the experimental group points
downward to the blocks containing the technology-mediated classroom
formative assessment, which utilizes computers, TV, mobile phones, and
other digital technologies in assessing the academic performance of students.
From both blocks, arrows point to the lowermost part of the figure, which
contains the posttest administered to the students, with the arrow signifying
that the same topics covered in the pretest were included in the posttest,
namely: Propagation, Characteristics and Speed of Sound; Heat and
Temperature; Colors of Light; and Current, Voltage and Resistance.
During the investigation, the researcher first chose two sections of
the Grade 8 level from the five sections under her instruction that qualified
for the quasi-experiment. After the two sections were chosen, the
researcher randomly determined which section would comprise the control
group and which the experimental group by tossing a coin. The two groups
were given a pretest consisting of a valid and reliable teacher-made multiple-
choice test composed of questions on the topics to be discussed. Both groups
were taught the same subject matter, in consonance with the ready-made
lesson plan for Science 8, until the end of the experiment. However, the
members of the control group were taught and assessed using paper-based
assessment methods, while the experimental group was exposed to
Technology-Mediated Formative Assessment. After the experiment, the
same test was given to both groups and served as the posttest. The data
collected were tabulated and analyzed with the use of appropriate
statistical tools.

Figure 1: Schema of the Study



Research Questions
This study aimed to investigate the effect of technology-mediated
formative assessment on the students' performance level in Physics.
Specifically, the study sought answers to the following questions:
1. What is the pretest performance of Zamboanga del Norte National High School-Turno
Grade 8 students in the:
1.1 Control group;
1.2 Experimental group?
2. Is there a significant difference between the pretest performance of the Zamboanga del
Norte National High School Grade 8 students in the:
2.1 Control group;
2.2 Experimental group?
3. What is the posttest performance of the Zamboanga del Norte National High School
Grade 8 students in the:
3.1 Control group;
3.2 Experimental group?
4. Is there a significant difference between the pretest and posttest performance of the
Zamboanga del Norte National High School-Turno Grade 8 students in the:
4.1 Control group;
4.2 Experimental group?
5. Is there a significant difference between the posttest performance of the Zamboanga
del Norte National High School Grade 8 students in the control and experimental
groups?
6. Is there a significant difference in the pre-post mean gain of the control and
experimental groups?

Scope and Limitations

This study investigated the effect of technology-mediated formative
assessment on the students' performance level in Physics. The study
involved the Grade 8 students who were officially enrolled in Zamboanga
del Norte National High School-Turno during the school year 2023-
2024. The students of Grade 8 Peacock were part of the control group while
the students of Grade 8 Sparrow composed the experimental group.
Science topics covered in the study were limited to Sound, Heat and
Temperature, Light, and Electricity. The students in the control group were
exposed to the conventional way of assessing their level of understanding in
the areas of Physics covered in the study before, during, and after
the lesson, while the students in the experimental group were exposed to
technology-mediated formative assessment. The technologies used for
assessment were limited to Kahoot!, ClassPoint, Quizizz, and Google
Forms on mobile phones, computers, LCD projectors, and a Smart TV; in
presenting the lessons, video clips and ready-made PowerPoint presentations
were utilized for both groups.
Method
This section of the study presents the sampling, data collection,
ethical issues, and plan for data analysis.
Research Design
In this study, the researcher utilized the experimental method of
research. The researcher used the Quasi-Experimental Design 10, or the
Pretest-Posttest Control and Experimental Group Design, in order to test
the hypotheses. This design is similar to the true Pretest-Posttest Control and
Experimental Group Design, which contains two groups: one receives an
experimental treatment and the other does not. According to Imelda and
Muyangwa, as cited by Villanueva (2014), this design has some deficiencies
that can seriously threaten the internal validity as a result of not
randomizing the subjects to the experimental and control groups. To deal
with this threat, initial observation and preliminary and first quarter scores
were measured to determine statistical equivalence.
The utilization of this method was deemed appropriate since the
researcher wanted to investigate the application of technology-mediated
formative assessment in Physics in Junior High School. During the
experiment, the students in the control group were taught using the
strategies and methods employed by the researcher in teaching Physics and
were exposed to the paper-based formative assessment method, while the
experimental group was exposed to the same mode of instructional delivery
but assessed through technology-mediated formative assessment.

Research Participants
The Grade 8 students of Zamboanga del Norte National High School-Turno who
belong to sections Sparrow and Peacock, under the Enhanced Basic Education
Program, constituted the respondents of the study. As presented in Table 1, 35
students of section Peacock composed the control group while 36 students from
section Sparrow composed the experimental group. These students were officially
enrolled as Grade 8 students of ZNNHS-Turno.

Table 1
Distribution of Respondents of the Study

Group                                | Number of Respondents | Percentage
Grade 8 Peacock (Control Group)      | 35                    | 49.47%
Grade 8 Sparrow (Experimental Group) | 36                    | 50.53%
Total                                | 71                    | 100%

There are 15 sections in Grade 8 in Zamboanga del Norte National
High School. Three sections belong to the Science, Technology and
Engineering Program. One section belongs to the Special Program in
Journalism and one section belongs to the Special Program in the Arts. The
remaining 10 sections belong to the Enhanced Basic Education Program
(EBEP). The students belonging to the ten sections of the EBEP are in
heterogeneous classes. The researcher chose two Grade 8 sections from the
EBEP as they were under her instruction for the school year 2023-2024.
As to which section constituted the experimental group and which the
control group, the researcher randomly assigned the groups by tossing a coin.

The head represented the students of section Sparrow while the tail
represented the students of section Peacock. After the tossing of the coin,
the researcher was able to identify which section belonged to the
experimental group and which to the control group.

Research Instrument. The researcher developed a 50-item teacher-
made test to measure students' performance in Physics. This is a
multiple-choice test constructed by the researcher with the help of books
and teaching kits. In this test, students in the control and experimental
groups were presented with 50 questions or instructions called stems. They
were directed to select the correct answer or response from the list of
answer options. The test consisted of questions on the topics that were
discussed.
The number of items was dependent on the number of contact hours
spent teaching each topic, and the construction of the instrument followed
Bloom's Taxonomy. The researcher prepared a Table of Specifications (TOS)
as a guide in the test construction to ensure that there was an equitable
distribution of items across all the competencies, and to make sure that all
major aspects of the topics were covered in the test items and in correct
proportions.
Validation of the Test Instrument. To make sure that the test items
were of good quality, their validity and reliability were carefully taken into
consideration. To assure content-related validity, the researcher constructed a
Table of Specifications (TOS) in Science using Bloom's Taxonomy to make
certain that all major aspects were covered by the test items and in correct
proportions.
In constructing the teacher-made multiple-choice test, the researcher
followed the TOS. After constructing the test, the researcher submitted it,
including the TOS, to experts for comments and/or recommendations. These
experts were Master of Arts in Science and Doctor of Education degree holders.
Their corrections and suggestions were eventually incorporated.
Aside from the validity of the instrument, its reliability was also tested.
Before administering the test to the control and experimental groups, it was
administered first to 38 Grade 8 STE students of ZNNHS who were not part of
the study. The upper-lower (U-L) item analysis was utilized in testing the
reliability.
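As a rough illustration of how an upper-lower item analysis of the pilot-test
responses might be carried out, the following Python sketch computes a
difficulty and a discrimination index for each item. The 27 percent group
split, the function name, and the dichotomous (1/0) scoring matrix are
assumptions made for illustration and are not details reported in the study.

    # Illustrative sketch only; the 27% split and all names are assumptions,
    # not the study's documented procedure.
    def ul_item_analysis(score_matrix, group_fraction=0.27):
        """score_matrix: one row per pilot-test student, one 1/0 entry per item."""
        n = len(score_matrix)
        k = max(1, round(n * group_fraction))
        ranked = sorted(score_matrix, key=sum, reverse=True)  # rank by total score
        upper, lower = ranked[:k], ranked[-k:]
        indices = []
        for item in range(len(score_matrix[0])):
            p_upper = sum(row[item] for row in upper) / k  # proportion correct, upper group
            p_lower = sum(row[item] for row in lower) / k  # proportion correct, lower group
            difficulty = (p_upper + p_lower) / 2           # item difficulty index
            discrimination = p_upper - p_lower             # item discrimination index
            indices.append((item + 1, difficulty, discrimination))
        return indices

Items with very low or negative discrimination would typically be revised or
dropped before the final form of the test is administered.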
Scoring Procedure. To describe the overall pretest and posttest
performance of the students in the control and experimental groups in the 50-
item Physics exam, a range of mean scores with corresponding verbal
descriptions and interpretations was used.

Data Analysis
The data gathered were then interpreted using the following
statistical tools.
Weighted Mean. This was used to describe the performance of the Grade 8
students who belong to the control and experimental groups during the pretest
and posttest.
The formula for obtaining the weighted mean is:
X = ∑fw / N
Where: X = mean
∑fw = summation of scores
N = number of students
Z-test. This was used to determine the significant difference between
the hypothetical mean (HM) score and the actual mean (AM) score of the
students. The HM, or level of expectation, was set at 75%. The 75 percent HM
is based on the scores of the respondents, not their grades, as set by the school
where the current investigation was conducted.
The formula is:
z = (x − u) / (s / √N)
Where:
s = standard deviation of the sample
x = the mean score
u = the hypothesized population mean
N = the number of students who took the test
T-test. This was used to test the significant difference between the pretest
performance, posttest performance, and the pre-posttest performance mean gain
of the Grade 8 students in the control and experimental groups.
The formula is:
t = (X1 − X2) / √(S1²/N1 + S2²/N2)
Where: t = the t-test value
X1 = mean gain of the control group
X2 = mean gain of the experimental group
S1² = variance of scores in the control group
S2² = variance of scores in the experimental group
N1 = the total number of students in the control group
N2 = the total number of students in the experimental group
All statistical values were set at the 0.05 level of significance.
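To make the computations above concrete, here is a minimal Python sketch of
the mean, the one-sample z-test against the 75 percent level of expectation
(HM), and the independent-samples t-test. The score lists are placeholders,
not the study's actual data, and the function names are illustrative.

    from math import sqrt

    def mean(scores):
        return sum(scores) / len(scores)

    def sample_sd(scores):
        m = mean(scores)
        return sqrt(sum((x - m) ** 2 for x in scores) / (len(scores) - 1))

    def z_test(scores, hypothetical_mean):
        # One-sample z statistic: (AM - HM) / (s / sqrt(N))
        return (mean(scores) - hypothetical_mean) / (sample_sd(scores) / sqrt(len(scores)))

    def t_test(group1, group2):
        # Independent-samples t statistic with unpooled variances
        se = sqrt(sample_sd(group1) ** 2 / len(group1) +
                  sample_sd(group2) ** 2 / len(group2))
        return (mean(group1) - mean(group2)) / se

    # Placeholder posttest scores for a 50-item test; HM = 75% of 50 = 37.5
    control = [35, 40, 38, 42, 36, 39, 41, 37]
    experimental = [44, 47, 45, 48, 43, 46, 49, 45]
    print(z_test(control, 37.5))
    print(t_test(control, experimental))

Each computed statistic is then compared with the critical value at the 0.05
level of significance to decide whether to reject the null hypothesis.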
Results and Discussion
This portion of the study presents the discussion of
results and recommendations.

Table 3: Pretest Performance of Zamboanga del Norte NHS, Grade 8 Students in
Control Group

Topics                                           | Number of Items | HM    | AM    | SD   | Z Value | D
Propagation, Characteristics and Speed of Sound  | 10              | 7.5   | 5.20  | 1.15 | 1.18    | Fair
Heat and Temperature                             | 10              | 7.5   | 4.60  | 1.01 | 1.09    | Fair
Colors of Light                                  | 15              | 11.25 | 4.83  | 1.03 | 1.11    | Fair
Current, Voltage and Resistance                  | 15              | 11.25 | 3.98  | 0.98 | 0.78    | Fair
Total                                            | 50              | 37.5  | 18.61 | 4.17 | 4.13    | Fair

Table 3 presents the pretest performance of Zamboanga del Norte
National High School students of Grade 8 Sparrow and Grade 8 Peacock. There
were four topics included in the investigation, namely: Propagation,
Characteristics and Speed of Sound; Heat and Temperature; Colors of Light;
and Current, Voltage and Resistance. The level of expectation (HM) was set at
75 percent of the total number of items in each topic, thus 7.5, 7.5, 11.25 and
11.25 respectively.
As shown in the table, the group did not attain the 75 percent level of
performance on Propagation, Characteristics and Speed of Sound, Heat and
Temperature, Colors of Light, and Current, Voltage and Resistance, with AMs of
5.20, 4.60, 4.83 and 3.98 respectively.
As to the topic Propagation, Characteristics and Speed of Sound, the
students in the control group obtained an Actual Mean (AM) of 5.20 with a
standard deviation (SD) of 1.15, described as "Fair" performance, which was
below the expected level of performance of 7.5. The computed z-test value of
1.18 did not exceed the critical value of 1.658 with 34 degrees of freedom,
which implies that the difference from the expected mean is not significant at
the 0.05 level.
As to the topic Heat and Temperature, the group obtained an AM of
4.60 with a standard deviation (SD) of 1.01, which was described as "Fair"
performance. On the topic Colors of Light, the group obtained an AM of 4.83
with an SD of 1.03, described as "Fair" performance, and on the topic Current,
Voltage and Resistance, the group obtained an AM of 3.98 with an SD of 0.98,
which was described as "Fair" performance.
Looking at the overall mean, the performance of the control group during
the pretest was "Fair," having obtained an AM of 18.61 with an SD of 4.17. This
indicates that the group did not reach the 75 percent level of expectation.
The findings of the current investigation regarding the performance of
the control group during the pretest are supported by the study conducted by
Villanueva (2014), which revealed that the control group performed below the
expected level during the pretest.

Table 4: Pretest Performance of Zamboanga del Norte NHS, Grade 8 Students in
Experimental Group

Topics                                           | Number of Items | HM    | AM    | SD   | Z Value | D
Propagation, Characteristics and Speed of Sound  | 10              | 7.5   | 5.35  | 1.34 | 1.09    | Fair
Heat and Temperature                             | 10              | 7.5   | 4.55  | 1.23 | 1.05    | Fair
Colors of Light                                  | 15              | 11.25 | 4.92  | 1.30 | 1.17    | Fair
Voltage, Current and Resistance                  | 15              | 11.25 | 4.01  | 1.17 | 0.98    | Fair
Total                                            | 50              | 37.5  | 18.83 | 1.26 | 4.29    | Fair

Table 4 presents the pretest performance of Zamboanga del Norte NHS
Grade 8 students in the experimental group. Like the control group, the
students in the experimental group were also given a test with the
same topics, namely: Propagation, Characteristics and Speed of Sound; Heat
and Temperature; Colors of Light; and Voltage, Current and Resistance. The
level of expectation (HM) was also set at 75 percent of the total number of items
in each topic, thus 7.5, 7.5, 11.25, and 11.25 respectively.
As shown in the table, the group did not attain the 75% level of
performance on the topics, namely Propagation, Characteristics and Speed of
Sound, Heat and Temperature, Colors of Light, and Voltage, Current and
Resistance, with actual means of 5.35, 4.55, 4.92, and 4.01 respectively, all
described as below the expected performance level.
Results showed that, like those in the control group, the students in the
experimental group did not succeed in obtaining the 75% performance level,
or a score of 37.5. With an AM of 18.83, an SD of 1.26, and a z-value of 4.29 at
the 0.05 significance level with 35 degrees of freedom, the group fell short of
the expected mean of 37.5. This means that the group did not attain the level
of expectation to a significant degree. The group's AM of 18.83 was described
as "Fair" performance.
Numerous foreign studies substantiate the findings of the current study,
as they also found that the experimental group did not perform well during
the conduct of the pretest.
In order to improve performance, the delivery of instruction must
build on the existing knowledge on teaching and learning. As Olayinka (2016)
explained, the teacher is considered the potential force of change; when the
teacher is at his best in a well-equipped room, it is unlikely that the students
will be unable to learn.
Students who come into a situation where they do not have a
foundation of skills, or where they have missed key elements, find themselves
very confused. Hence, teachers need to employ and integrate strategies or
approaches so that learning the subject will be easy and enjoyable and
teaching the subject can be more meaningful.

Table 5: Test of Difference Between the Control and Experimental
Groups' Pretest Performance

TOPICS                                           | Group        | Means | SD   | t-value | Description
Propagation, Characteristics and Speed of Sound  | Control      | 1.23  | 0.89 | 0.11    | Not Significant
                                                 | Experimental | 1.34  | 0.92 |         |
Heat and Temperature                             | Control      | 1.11  | 0.79 | 0.03    | Not Significant
                                                 | Experimental | 1.14  | 1.01 |         |
Colors of Light                                  | Control      | 1.17  | 0.88 | 0.03    | Not Significant
                                                 | Experimental | 1.20  | 0.92 |         |
Current, Voltage and Resistance                  | Control      | 1.37  | 1.05 | 0.29    | Not Significant
                                                 | Experimental | 1.66  | 1.15 |         |

Table 5 presents the test of difference between the control and
experimental groups' pretest performance. On the topic Propagation,
Characteristics and Speed of Sound, the t-value was 0.11, which is less than
the critical value of 1.658 at the 0.05 level of significance with 69 degrees of
freedom. This leads to the non-rejection of the null hypothesis. Therefore, there
is no significant difference in the performance of the students in the control and
experimental groups, although the scores obtained by the experimental group
were slightly higher than those of the control group.
As to the topic Heat and Temperature, the computed t-value of 0.03
did not exceed the critical value of 1.658 at the 0.05 level of significance
with 69 degrees of freedom. This leads to the non-rejection of the null
hypothesis; there is no significant difference in the performance between the
students in the control group and experimental group as to their level of
performance during the pretest.
On the topic Colors of Light, the computed t-value was 0.03, which is not
significant at the 0.05 level with 69 degrees of freedom. This leads to the non-
rejection of the null hypothesis; there is no significant difference in the
performance between the students in the control group and experimental group
as to their level of performance during the pretest.
As to the topic Current, Voltage and Resistance, the computed t-value
of 0.29 did not exceed the critical value of 1.658 at the 0.05 level of significance
with 69 degrees of freedom. This leads to the non-rejection of the null
hypothesis; there is no significant difference in the performance between the
students in the control group and experimental group as to their level of
performance during the pretest.
Finally, the overall performance of the control group and experimental
group obtained means of 18.61 and 18.83 respectively. The computed t-
value was 0.46, which did not exceed the critical value of 1.658 at the 0.05 level
of significance with 69 degrees of freedom. This leads to the non-rejection of the
null hypothesis. Therefore, there was no significant difference in the overall
pretest performance between the control group and experimental group. This
indicates that there was no significant difference in the performance of the two
groups before the intervention.
Generally, the performance of the students in both the control and
experimental groups was comparable during the pretest. The results clearly
suggest that the students in the two groups needed reinforcement so that their
performance level would be elevated from fair to very good, or perhaps to an
excellent level. Teachers should find ways to increase students' performance
in Science 8 and apply them. They may utilize the technology-based integration
approach in conducting formative assessment in teaching the subject so that
learning will be fast-tracked.

Table 6: Posttest Performance of Zamboanga del Norte NHS, Grade 8 Students in
Control Group

Topics                                           | Number of Items | HM    | AM    | SD    | Z Value | D
Propagation, Characteristics and Speed of Sound  | 10              | 7.5   | 8.03  | 2.83  | 3.49    | Good
Heat and Temperature                             | 10              | 7.5   | 8.01  | 1.94  | 2.89    | Good
Colors of Light                                  | 15              | 11.25 | 11.30 | 4.44  | 3.98    | Good
Current, Voltage and Resistance                  | 15              | 11.25 | 11.40 | 4.60  | 4.00    | Good
Total                                            | 50              | 37.5  | 38.81 | 13.81 | 14.36   | Good

Table 6 reveals the posttest performance of the Grade 8 students in the


control group. Similar to the pretest, there were also four topics included in the
post test, to wit: Propagation, Characteristics and Speed of Sound, Heat and
Temperature and Colors of Light, and Current, Voltage, and Resistance. The
same level of expectation (HM) was set at 75 percent of the total number of items
in each topic; thus 7.5,7.5, 11.5 and 11.5 respectively with a 37.5 overall
expected posttest performance.
32

A closer look at the table reveals that the students in the control group
obtained AMs of 8.03,8.01, 11.30, and 11.40 respectively on the topics namely:
Propagation, Characteristics and Speed of Sound, Heat and Temperature, Colors
of Light, and Current, Voltage, and Resistance which were all described as
“good” performance. The AMs were all above 75 percent level of expectation.
These findings imply that the students in the control group attained the
expected mean scores (HM) of 7.5,7.5, 11.25 and 11.25 respectively.
Generally, the students in the control group succeeded in attaining the
HM score of 37.5 in the posttest since they obtained the mean score of 38.81
with SD of 13.81 which was described as “good” performance. The z- value of
14.36 exceeded the critical value of 1.658 at 0.05 level of significance with 34
degree of freedom. This means that the group attained the 75% expected
performance in a significant degree.
This result is supported by the study of Moyer (2013) that students cannot
learn Science by simply introducing “chalk and talk” but providing real
manipulating concrete objects that could help to think critically and internalized
abstract concepts.
In the same vein, Keong, Horani and Daniel (2015) of the Faculty of Information
Technology, Multimedia University, Malaysia, concluded that the
use of technology in teaching can make the teaching process more
effective as well as enhance the students' capabilities in understanding basic
concepts, especially in Science.

Table 7: Posttest Performance of Zamboanga del Norte NHS Grade 8 Students in the Experimental Group

Topics                                            No. of Items   HM      AM      SD      z-value   Description
Propagation, Characteristics and Speed of Sound   10             7.5     8.9     2.69    1.94      Very Good
Heat and Temperature                               10             7.5     9.2     2.83    1.98      Very Good
Colors of Light                                    15             11.25   13.65   3.49    2.17      Very Good
Current, Voltage and Resistance                    15             11.25   14.11   4.46    2.83      Excellent
Total                                              50             37.5    45.86   13.47   8.92      Very Good

The posttest performance of the Grade 8 students in the experimental
group is presented in Table 7. The same level of expectation (HM) was set at 75
percent of the total number of items in each topic: thus 7.5 for topic 1, 7.5 for
topic 2, 11.25 for topic 3, and 11.25 for topic 4.
It is evident in the table that the students in the experimental group
obtained AMs of 8.9, 9.2, 13.65, and 14.11 on the topics Propagation,
Characteristics and Speed of Sound; Heat and Temperature; Colors of Light;
and Current, Voltage and Resistance, respectively. These were described as "very good"
performance (with the last topic rated "excellent"), all beyond the 75 percent level of expectation. These
findings imply that the students in the experimental group learned the four
topics during lessons employing the technology-mediated
formative assessment.
The group also succeeded in reaching the expected performance
level on the topic Propagation, Characteristics and Speed of Sound, obtaining an
AM of 8.9 with an SD of 2.69, which was described as "very good" performance.
The z-value of 1.94 is greater than the critical value of 1.658 at 35 degrees of
freedom, implying that the group attained more than the expected mean score of 7.5 to a
significant degree.
The table also shows that the students belonging to the experimental
group passed the expected mean score of 37.5, as they obtained an AM of 45.86
with an SD of 13.47, which is described as "very good" performance. The z-value of 8.92
exceeded the critical value of 1.658 at 35 degrees of freedom, implying that the
group achieved the 75 percent performance level to a significant degree.
The current findings reveal that using technology-mediated
formative assessment to assess the students' level of understanding
certainly helped them improve their performance in Science.
The study of Paje et al. (2021) cited that technology is essential in
the teaching and learning process. Technology improves the way Physics lessons
are assessed and enhances students' understanding of basic concepts.
This further shows that the use of technology-mediated formative assessment
helps a lot in making instruction more effective.

Table 8: Test of Difference Between the Control Group and Experimental Group

Topics                                            Group          Mean    SD      t-value   Description
Propagation, Characteristics and Speed of Sound   Control        8.03    2.83    2.11      Significant
                                                   Experimental   8.9     2.69
Heat and Temperature                               Control        8.01    1.94    2.18      Significant
                                                   Experimental   9.2     2.83
Colors of Light                                    Control        11.30   4.44    2.17      Significant
                                                   Experimental   13.65   3.49
Current, Voltage and Resistance                    Control        11.40   4.60    2.13      Significant
                                                   Experimental   14.11   4.46

Table 8 reflects the test of significant difference in the posttest
performance between the control and experimental groups. On the first topic,
Propagation, Characteristics and Speed of Sound, the computed t-value is 2.11,
which is significant at the 0.05 level with 69 degrees of freedom as it exceeded
the critical value of 1.658. This means that there is a significant difference
between the posttest performance of the control group and the experimental group on
topics relative to Sound. This implies that the technology-mediated formative
assessment is effective.
As to Heat and Temperature, the computed t-value was 2.18, which
exceeded the critical value of 1.658 at the 0.05 level of significance with 69 degrees
of freedom. This leads to the rejection of the null hypothesis. There is therefore
a significant difference in the posttest performance of the control and
experimental groups on the topic Heat and Temperature.
On the Colors of Light, the computed t-value was 2.17, which also
exceeded the critical value of 1.658 at the 0.05 level of significance with 69 degrees
of freedom. This also leads to the rejection of the null hypothesis. There is
therefore a significant difference in the posttest performance of the control
group and experimental group on the topic Colors of Light.
On the Current, Voltage and Resistance, the computed t-value was 2.13,
which exceeded the critical value of 1.658 at the 0.05 level of significance with 69
degrees of freedom. This leads to the rejection of the null hypothesis. There is
therefore a significant difference in the posttest performance of the control and
experimental groups on the topic Current, Voltage and Resistance.
The overall posttest performance yielded a computed t-value of 8.59, which is greater
than the critical value of 1.658 at the 0.05 level of significance with 69 degrees of
freedom; this leads to the rejection of the null hypothesis. There is therefore a
significant difference between the performances of the control and experimental
groups of students during the posttest.
This indicates that there exists a significant difference in the performance
of the two groups. It implies a significant variation between the performance
of the students who were given formative assessments using technology and
those who were given paper-based formative assessments.
Rogers (2008) was indeed right that the proper use of technology in the
classroom can stimulate interest in learning and foster an active
classroom atmosphere, improving teaching effectiveness and making learning
easy and enjoyable. Suarez (2010), as quoted by Baguinat
(2011), also corroborates the present findings, stating that the experimental group
performed better than the control group after the treatment. It has been claimed
that formative assessment which utilises electronic tools (e-assessments) can
improve and support learners to a greater extent than more
traditional paper-based assessments (Bahati, Fors, Hansen, Nouri, & Mukama,
2019; Pachler, Daly, Mor, & Mellar, 2010). This is because they can provide
immediate grading of student performance and therefore expedite feedback
mechanisms to rapidly address misconceptions (Shieh & Cefai, 2017).
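To make that mechanism concrete, the following minimal Python sketch illustrates how an e-assessment can score a multiple-choice response and return feedback the moment it is submitted. The item numbers, answer key, and messages are hypothetical examples and do not describe the specific platform or instrument used in this study.

# Illustrative sketch only: a minimal auto-scoring routine that grades one
# multiple-choice response and returns feedback immediately on submission.
# The items, keyed answers, and messages below are hypothetical.

ANSWER_KEY = {1: "A", 2: "C", 3: "B"}  # hypothetical items and keyed answers

def score_response(item: int, response: str) -> str:
    """Return immediate feedback for one multiple-choice response."""
    if response.strip().upper() == ANSWER_KEY[item]:
        return f"Item {item}: correct."
    return f"Item {item}: incorrect - review this concept before moving on."

if __name__ == "__main__":
    for item, response in [(1, "a"), (2, "B"), (3, "b")]:
        print(score_response(item, response))

Because the score is computed as soon as the response arrives, the teacher and the learner both see where a misconception occurred without waiting for manual checking, which is the advantage the cited studies describe.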

Table 9: Test of Difference Between the Pretest and Posttest Performance of Students in the Control Group and Experimental Group

Topics                                            Group          Pretest   Posttest   Mean Difference   t-value   Description
Propagation, Characteristics and Speed of Sound   Control        5.20      8.03       -2.83             1.79      Significant
                                                   Experimental   5.35      8.9        -3.55             1.64      Significant
Heat and Temperature                               Control        4.60      8.01       -3.41             1.87      Significant
                                                   Experimental   4.55      9.20       -4.65             3.57      Significant
Colors of Light                                    Control        4.83      11.30      -6.47             1.92      Significant
                                                   Experimental   4.92      13.65      -8.73             3.84      Significant
Current, Voltage and Resistance                    Control        3.98      11.40      -7.42             2.14      Significant
                                                   Experimental   4.01      14.11      -10.1             3.98      Significant
Overall                                            Control        18.61     38.74      -20.13            7.72      Significant
                                                   Experimental   18.83     45.86      -27.03            13.03     Significant

Table 9 shows the test of difference between the pretest and posttest
performance of the control group and experimental group at the 0.05 level of
significance. The table shows that the overall t-values of 7.72 and 13.03 were greater
than the critical value of 1.658 at the 0.05 level of significance with 69 degrees of
freedom. This leads to the rejection of the null hypothesis. This means that there
is a significant difference between the pretest and posttest performance of the
control and experimental groups.
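If the within-group pre-to-post comparison is treated as a dependent (paired) samples test, one common form of the statistic is

t = \frac{\bar{d}}{s_d / \sqrt{n}},

where \bar{d} is the mean of each student's posttest-minus-pretest difference, s_d is the standard deviation of those differences, and n is the number of students in the group. The report does not state which variant was applied, so this is shown only as a reference form.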
There is also a significant difference between the pretest and posttest
performance of the control group and experimental group on the first topic,
Propagation, Characteristics and Speed of Sound. The computed t-values of
1.79 and 1.64 were greater than the critical value of 1.658 at the 0.05 level of
significance with 69 degrees of freedom. This leads to the rejection of the null
hypothesis. This means that there is a significant difference between the pretest and
posttest performance of the control and experimental groups on this topic.
The table also reveals that there exists a significant difference between the
pretest and posttest performance of the control and experimental groups on the
topic Heat and Temperature. The t-values of 1.87 and 3.57 were greater than the
critical value of 1.658 at the 0.05 level of significance with 69 degrees of freedom.
This leads to the rejection of the null hypothesis.
The table also reveals that there exists a significant difference between the
pretest and posttest performance of the control and experimental groups on the
topic Colors of Light. The t-values of 1.92 and 3.84 were greater than the critical
value of 1.658 at the 0.05 level of significance with 69 degrees of freedom. This leads
to the rejection of the null hypothesis.
On the last topic, Current, Voltage and Resistance, as shown in the table,
the t-values of 2.14 and 3.98 were greater than the critical value of 1.658 at the
0.05 level of significance with 69 degrees of freedom. This leads to the rejection
of the null hypothesis.
A closer look at the actual means obtained by the two groups reveals
that the experimental group also performed better than the control group in all
topics.

The current findings reveal that using technology-mediated formative
assessment to assess the students' level of understanding certainly helped
them improve their performance in Science.
This finding resembles that of Paje et al. (2021), who
cited that technology is essential in the teaching and learning process:
it improves the way Physics lessons are assessed and enhances
students' understanding of basic concepts. This further shows that the use of
technology-mediated formative assessment helps a lot in making instruction
more effective.
Additionally, electronic assessments can offer adaptive learning pathways
tailored to individual student needs, further enhancing the personalized
learning experience. Overall, the utilization of electronic tools in formative
assessment has the potential to transform the learning experience by providing
timely feedback, fostering active engagement, and supporting individualized
learning journeys.
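As a purely illustrative sketch of such an adaptive pathway (the thresholds, difficulty labels, and function name below are assumptions, not features of any particular tool), the next item's difficulty can be chosen from a learner's recent accuracy:

# Minimal, hypothetical sketch of an adaptive pathway: the difficulty of the
# next item is chosen from the learner's recent accuracy. Thresholds and
# labels are illustrative assumptions only.

from typing import List

def next_difficulty(recent_correct: List[bool]) -> str:
    """Pick the next item's difficulty from the last few responses."""
    if not recent_correct:
        return "medium"                      # no history yet: start in the middle
    accuracy = sum(recent_correct) / len(recent_correct)
    if accuracy >= 0.8:
        return "hard"                        # ready for more challenge
    if accuracy <= 0.4:
        return "easy"                        # reinforce the basics first
    return "medium"

print(next_difficulty([True, True, False, True, True]))  # -> hard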

Table 10: Mean Gain of the Performance of Control and Experimental Group

Groups          Overall Mean Performance   SD
Control         38.81                      13.81
Experimental    45.86                      13.47
Mean Difference = 7.05     t-value = 2.59     Decision: Significant difference; reject Ho
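As a rough cross-check, the overall comparison in Table 10 can be approximated from the reported summary statistics with a standard independent-samples t-test; the SciPy sketch below uses only values reported in Tables 6, 7, and 10, and it may not exactly match the study's own procedure, which reports t = 2.59.

# Illustrative sketch: approximating the overall posttest comparison from the
# summary statistics reported in Tables 6, 7, and 10. The study's reported
# t-value (2.59) may come from a different procedure, so this is only a check.

from scipy import stats

experimental = dict(mean=45.86, std=13.47, nobs=36)   # Table 7 / Table 10
control      = dict(mean=38.81, std=13.81, nobs=35)   # Table 6 / Table 10

t_stat, p_value = stats.ttest_ind_from_stats(
    experimental["mean"], experimental["std"], experimental["nobs"],
    control["mean"], control["std"], control["nobs"],
    equal_var=True,
)

print(f"mean gain  = {experimental['mean'] - control['mean']:.2f}")  # 7.05
print(f"t (approx) = {t_stat:.2f}, p = {p_value:.3f}")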

Conclusion and Recommendations


The result of the study implies that the performance level of the students
is affected by the use of technology-mediated formative assessment; thus, the null
hypothesis is rejected. This outcome resembles the findings of Bahati,
Fors, Hansen, Nouri, and Mukama (2019) and of Pachler, Daly, Mor, and Mellar
(2010), who suggested that electronic formative assessments offer distinct
advantages over traditional paper-based assessments in supporting and
enhancing student learning. One key advantage lies in the immediacy of feedback
provided by e-assessments, enabling swift identification and correction of student
misconceptions. This rapid feedback mechanism not only facilitates timely
intervention but also fosters a more dynamic and responsive learning
environment.
Based on the findings of this study, the following recommendations are
proposed:
1. Incorporate technology-mediated formative assessment methods into Physics
education by introducing digital tools and platforms that enable real-time feedback and
personalized learning experiences, fostering deeper understanding and engagement
among students.
2. Provide professional development opportunities for teachers to enhance their
digital literacy skills and integrate technology seamlessly into their teaching practices,
ensuring effective implementation of technology-mediated assessment in the classroom.
3. Establish monitoring and evaluation systems to track student progress and
identify areas for improvement, enabling educators to adapt teaching strategies and
interventions accordingly, thus maximizing the effectiveness of technology-mediated
assessment.
4. Invest in technological infrastructure by allocating resources for reliable internet
connectivity, hardware devices, and software applications, facilitating the seamless
integration and utilization of technology in Physics education, enhancing access to digital
learning resources and tools.
5. Encourage further research to investigate the long-term impact of technology-
mediated formative assessment on student learning outcomes, exploring innovative
approaches and best practices for implementation in diverse educational settings.

6. Promote collaborative learning environments where students actively engage in
peer-to-peer interaction and constructive feedback using technology tools, fostering a
collaborative spirit and enhancing learning outcomes through shared knowledge and
experiences.
7. Future studies could compare the effectiveness of different TMFA tools and
strategies to identify the most impactful approaches for various topics and learning
objectives.

REFERENCES

Abadzi, H. (2021). Personalized and adaptive learning. In H. Beetham & R. Sharpe

(Eds.). Rethinking pedagogy for the digital age (3rd ed., pp. 161-174). Routledge.

Agonia, V., & Panoy, D. Ph.D., J. F. (2023, January 12). Technology- Based Formative

Assessment and Learning Outcomes in Science among Grade 10 Students in an

Online Instructional Delivery Program. APJAET - Journal Ay Asia Pacific

Journal of Advanced Education and Technology, 318–328.

https://doi.org/10.54476/apja

Alakaleek, W. (2022). The Utilization of Digital Assessment Techniques in K-12

Education. Retrieved from https://bspace.buid.ac.ae/items/1374f523-94b5-

40a6-9f3e-0c0ca5113f1a/full

Alonso-Mencía, M. E., Gómez-Galán, J., & Rodríguez-Martín, A. (2022). Impact of

formative assessment on academic performance, self-regulation, and motivation

of pre-service teachers. Studies in Educational Evaluation, 77, 101070.

Albert, M. & Beatty, B. (2014). Flipping the Classroom Application to Curriculum

Redesign for an Introduction to Management Course: Impact on Grades.

Retrieved from

https://www.tandfonline.com/doi/abs/10.1080/08832323.2014.929559

Amasha, M. A., Abougalala, R. A., Reeves, A. J., Alkhalaf, S., & Owais, S. M. (2017).

Challenges and Possibilities of ICT-Mediated Assessment in Virtual Teaching

and Learning Processes. MDPI, 27(12), 622. https://www.mdpi.com/1999-

5903/12/12/232

Andrade, et al. (2015). Formative Assessment in Dance Education. Retrieved from

https://www.researchgate.net/publication/277977643_Formative_Assessmen

t_in_Dance_Education

Angelo, T. S., & Cross, K. P. (1993). The classroom assessment techniques handbook.

Jossey-Bass.

Arends, R. I. (2011). Learning to teach (8th ed.). McGraw-Hill.

Avci et al. (2020). Academic Motivation Levels of Secondary School Students and Their

Attitudes Towards a Social Studies Course. Retrieved from

https://www.researchgate.net/publication/341111595_Academic_Motivation_

Levels_of_Secondary_School_Students_and_Their_Attitudes_Towards_a_Social

_Studies_Course

Awad B., (2014), Empowerment of teaching and learning chemistry through information and

communication technologies, Afr. J. Chem. Educ., 4(3), 34–47.

Azevedo, R., & Hadwin, A. F. (2021). Self-regulation, metacognition, and self-regulated

learning in online and blended learning environments. In M. K. Tallis & T. A. Winters

(Eds.), Online teaching and learning: A practical guide (pp. 154-177). Stylus

Publishing.

Bahati, B. (2019). Technology-enhanced formative assessment in higher education : An

intervention design of scaffolding student self-regulated learning. DIVA.

https://su.diva-

portal.org/smash/record.jsf?pid=diva2%3A1281224&dswid=8595

Bansal, G. (2020). Understanding gaps in teacher interpretation of formative assessment

evidence. School Science Review, 101(377), 67-72.

Barefah & McKay (2016). Evaluating the Design and Development of an Adaptive E-

Tutorial Module: A Rasch-Measurement Approach. Retrieved from

https://files.eric.ed.gov/fulltext/ED571591.pdf

Bates, A., W. (2015). Teaching in a digital age. Tony Bates Associate Ltd. P. 201.

Bay, L. (2012). Senior Psychometrics Director at College Board (USA) and Senior Advisor of Frontlearners.

Bennett, R. E. (2018). Formative assessment: A critical review. Assessment in Education:

Principles, Policy & Practice, 18(1), 5–25.

https://doi.org/10.1080/0969594X.2010.513678.

Bhagat, K. & Spector, J. (2017). Formative Assessment in Complex Problem-Solving

Domains: The Emerging Role of Assessment Techniques. Retrieved from

https://www.researchgate.net/publication/318820441_Formative_Assessment_i

n_Complex_ProblemSolving_Domains_The_Emerging_Role_of_Assessment_Techn

ologies

Black & William (2022). A Summary of Evidence Based Formative Assessment Strategies.

Retrieved from https://evidencebased.education/a-summary-of-evidence-based-

formative-assessment-strategies/

Bordoh, A., Eshun, I., Ibrahim, A. W., Bassaw, T. K., Baah, A., & Yeboah, J. (2022).

Technological, Pedagogical Content Knowledge (TPACK) of Teachers and Their

Formative assessment practices in Social Studies Lessons in The Junior High

Schools in Komenda Edina Eguafo Abirem (K.E.E.A) Municipality of Ghana

Universal Journal of Social Sciences and Humanities, 2(4), 201–209. Retrieved

from https://doi.org/10.31586/ujssh.2022.459

Borich, G. A. (2014). Effective teaching methods: Research-based practice (8th ed.).

Boston, MA: Pearson

Bower, M. (2019). Technology-mediated learning theory. In R. E. Ferdig & K. E. Kennedy

(Eds.), Handbook of research on K-12 online and blended learning (2nd ed.) (pp.

159-180). ETC Press.

Brindha V. E., (2018), Creative Learning Methodology using Revised Bloom's Taxonomy.

Çekiç, A., & Bakla, A. (2021). A review of digital formative assessment tools:

Features and future directions. International Online Journal of Education and


Teaching (IOJET), 8(3). 1459-1485

Clark & Mayer (2016). E-Learning and the Science of Instruction: Proven Guidelines

for Consumers and Designer of Multimedia Learning (4 th ed). Hoboken, NJ:

Wiley.

Chen et al. (2016). Challenges Confronting Beginning Researchers in Conducting

Literature Reviews. Studies in Continuing Education, 38 (I), 47-60.

Cheng et al. (2020). Prevalence of Social Media Addiction Across 32 Nations: Meta-

Analysis with Subgroup Analysis of Classification Schemes and Cultural

Values. Retrieved from

https://www.sciencedirect.com/science/article/pii/S0306460321000307

Chou, C. C., Block, L., & Jesness, R. (2012). A case study of mobile learning pilot

project in K-12 schools. Journal of Educational Technology Development and

Exchange, 5(2). https://doi.org/10.18785/jetde.0502.02

College Board, (2021). Texas Success Initiative Assessment 2.0 Technical Manual.

Retrieved from https://accuplacer.collegeboard.org/accuplacer/pdf/tsia2-

technical-manual.pdf

Dalby, D., & Swan, M. (2018, January 18). Using digital technology to enhance

formative assessment in mathematics classrooms. British Journal of

Educational Technology, 50(2), 832–845. https://doi.org/10.1111/bjet.12606

D’Angelo C., (2018), The impact of technology: Student engagement and success,

Technology and the Curriculum: Summer 2018

Denham, A. R. (2015). Strategy instruction and maintenance of basic multiplication

facts through digital game play. Retrieved July 10, 2020, from

https://pdfs.semanticscholar.org/52e2/c435e304eb8b1498d09d6554564cc1

330593.pdf.

Department of Education (2016). K to 12 Curriculum Guide in Science. Retrieved from


https://www.deped.gov.ph/wp-content/uploads/2019/01/Science-

CG_withtagged-sci-equipment-revised.pdf

DepEd Order No. 8, s. 2015.

Dhawan, S. (2020). Online learning: A panacea in the time of COVID-19 crisis. Journal

of Educational Technology Systems, 49(1), 5-22.

https://doi.org/10.1177/0047239520934018.

Edwards, S. (2013). Toward a Model for Early Childhood Environmental Education:

Foregrounding, Developing, and Connecting Knowledge Through Play-Based

Learning. Retrieved from

https://www.researchgate.net/publication/271944679_Toward_a_Model_for_

Early_Childhood_Environmental_Education_Foregrounding_Developing_and_C

onnecting_Knowledge_Through_Play-Based_Learning

El-Hashash et al. (2022). Peer-Review Statements. Retrieved from

file:///C:/Users/OWNER/Downloads/125980813.pdf

Elmahdi, et al. (2018). Using Technology for Formative Assessment to Improve

Students’ Learning. Retrieved from

https://files.eric.ed.gov/fulltext/EJ1176157.pdf

Escueta et al. (2017). Education Technology: An Evidence-Based Review. Retrieved

from https://www.nber.org/papers/w23744

Faiella, F., & Ricciardi, M. (2015). Gamification and learning: A review of issues and

research. Journal of E-Learning and Knowledge Society, 11(3), 13–21.

Fredlund, T., Airey, J., & Linder, C. (2012). Exploring the role of physics

representations: an illustrative example from students sharing knowledge about

refraction. European Journal of Physics, 33, 657-666

Fullan, M. & Higgins, S. (2014). Literature Review on the Impact of Digital Technology

on Learning and Teaching. Retrieved from

https://www.gov.scot/publications/literature-review-impact-digital-
technology-learning-teaching/pages/4/

Gikandi, J., Morrow, D., & Davis, N. (2021). Online formative assessment in higher

education: A review of the literature. Computers & Education, 57(4), 2333-2351.

https://doi.org/10.1016/j.compedu.2011.06.004.

Gobert J. D., Sao Pedro, M. Raziuddin, J. and Baker, R. S., (2013), From log files to

assessment metrics: Measuring students’ science inquiry skills using

educational data mining, J. Learn. Sci., 22(4), 521–563.

Gupta, T. and Belford R. E. (ed.), (2019), Technology Integration in Chemistry

Education and Research (TICER), American Chemical Society.

Guskey, T. R., & McTighe, J. (2020). Implementing effective formative assessment to

improve student learning. In D. Wyse, L. Hayward, & J. Pandya (Eds.), The

SAGE handbook of curriculum, pedagogy and assessment (pp. 839-854). SAGE

Publications Ltd.

Habler et al. (2016). Perspectives on Technology, Resources and Learning. Retrieved

from

https://www.educ.cam.ac.uk/people/staff/watson/Hassler%20et%20al%202

016%20-

%20Perspectives%20on%20Technology,%20Resources%20and%20Learning%2

0(Full).pdf

Hagos, T., & Andargie, D. (2022, December 1). Effects of Technology-Integrated

Formative Assessment on Students’ Retention of Conceptual and Procedural

Knowledge in Chemical EquilibriumConcepts.

Haluk Ö. and Akbar N., (2018), Effect of simulations enhanced with conceptual change

texts on university students’ understanding of chemical equilibrium, J. Serb.

Chem. Soc., 83(1), 121–137.

Hanus, M. D., & Fox, J. (2015). Assessing the effects of gamification in the classroom:

A longitudinal study on intrinsic motivation, social comparison,


satisfaction,effort, and academic performance. Computers & Education, 80,

152–161.

Hattie & Timperley (2007). The Power of Feedback. Retrieved from

https://journals.sagepub.com/doi/abs/10.3102/003465430298487

Heritage, M. (2016). Formative Assessment: An Enabler of Learning. Proven Programs

in Education: Classroom Management & Assessment, 35–38. FA-Heritage-an-

enabler of learning.pdf (michiganasses sment co nsortium.org).

Hickey, D.T., Taasoobshirazi, G., & Cross, D. (2012). Assessment as learning:

Enhancing discourse, understanding, and achievement in innovative science

curricula. Journal of Research in Science Teaching, 49(10), 1240-1270.

Holzberger et al. (2013). How Teachers’ Self-efficacy is Related to

Instructional Quality: A Longitudinal Analysis. Retrieved from

https://psycnet.apa.org/record/2013-14683-001

Ifenthaler, D., & Kim, Y.-H. (2023). Formative assessment in digital learning

environments. Computers in Human Behavior, 142, 107662.

Inknoe (2021). Interactive classroom quiz in PowerPoint. http://www.classpoint.io

(last accessed Dec 2021

Ismail, M. A. A., & Mohammad, J. A. M. (2017). Kahoot: A promising tool for formative

assessment in medical education. Education in Medicine Journal, 9(2), 230.

https://doi.org/10.21315/eimj2017.9.2.2

Ismail, M. A. A., & Fakri, N. M. R. M. (2017). Transforming stressful to joyful classroom

through web 2.0 applications. In Carnival on e-learning (IUCEL) (pp. 199-201).

Malaysia: Negeri Sembilan.

Kaye, T., & Ehren, M. (2021). Computer-Assisted Instruction Tools: A Model to Guide

Use in Low-and Middle-Income Countries. International Journal of Education

and Development using Information and Communication Technology, 17(1), 82-

99.
Kean, D. (2012). Leading with Technology. The Australian Educational Leader, 34 (2),

44.

Keong, C. C., Horani, S., & Daniel, J. (2015). A study on the use of ICT in mathematics

teaching. Malaysian Online Journal of Instructional Technology, 2(3), 43-51.

Retrieved from

https://pdfs.semanticscholar.org/0419/ba0310ac083bdb277238c5800a

059ccd142c.pdf

Koehler, M., & Mishra, P. (2009, March 1). What is Technological Pedagogical Content

Knowledge (TPACK)? What Is Technological Pedagogical Content Knowledge

(TPACK)? - Learning & Technology Library (LearnTechLib).

https://www.learntechlib.org/primary/p/29544/

Krause M., Pietzner V., Dori Y. J. and Eilks I., (2017), Differences and developments in

attitudes and self-efficacy of prospective chemistry teachers concerning the use of

ICT in education, Eurasia J. Math., Sci. Technol. Educ., 13(8), 4405–4417.

Kuo, M. S., & Chuang, T. Y. (2016). How gamification motivates visits and engagement

for online academic dissemination – an empirical study. Computers in Human

Behavior, 55, 16–27.

Kurt, D. S. (2021, February 21). Constructivist Learning Theory - Educational

Technology. Educational Technology.

Lanz et al. (2014). Simulated Galaxy Interaction as Probes of Merger Spectral Energy

Distributions. Retrieved from

https://ui.adsabs.harvard.edu/abs/2014ApJ...785...39L/abstract

Lin, C.-H., & Huang, H.-M. (2020). An investigation of the effects of implementing the

knowledge exploration activity with augmented reality on nursing students'

learning performance, critical thinking, learning motivation, and learning

satisfaction. Interactive Learning Environments, 28.


Lubega, T. J., Mugisha, A. K., & Muyinda, P. B. (2014). Adoption of the SAMR model to

assess ICT pedagogical adoption: A case of Makerere University. International

Journal of e-Education, e-Business, eManagement and e-Learning, 4(2), 106- 115.

Luthfiyyah R., Aisyah A. and Sulistyo G. H., (2021), Technology-enhanced formative

assessment in higher education: A voice from Indonesian EFL teachers, EduLite:

J. Engl. Educ., Literature Culture, 6(1), 42–54.

McDowell, L. (2013). Assessment for learning. Improving Student Engagement and

Development through Assessment: Theory and Practice in Higher Education, 73–

85.

McMillan, J. H., Venable, J. C., & Varier, D. (2013). Studies of the effect of formative

assessment on student achievement: So much more is needed. Practical

Assessment, Research & Evaluation, 18(2), 1–15

Mdlalose, et al. (2022). Using Kahoot! As a Formative Assessment Tool in Science Teacher

Education. Retrieved from

https://www.researchgate.net/publication/354461649_Using_Kahoot_As_A_Forma

tive_Assessment_Tool_in_Science_Teacher_Education

Merriam Webster (2024). Technology. Retrieved from https://www.merriam-

webster.com/dictionary/technology#:~:text=%3A%20a%20capability%20given%20b

y%20the,technical%20processes%2C%20methods%2C%20or%20knowledge

Ming-Hung et al. (2017). A Study of the Effects of Digital Learning on Learning Motivation

and Learning Outcome. Retrieved from https://www.ejmste.com/article/a-study-of-

the-effects-of-digital-learning-on-learning-motivation-and-learning-outcome-4843

Miroslav P., Anna D. and Zuzana H., (2018), Learners’ Understanding of Chemical

Equilibrium at Submicroscopic, Macroscopic Symbolic Levels, Chem. Didact Ecol.

Metrol., 23(2), 97–111.

Mohebi, L. (2022, January 13). Theoretical Models of Integration of Interactive Learning

Technologies into Teaching: A Systematic Literature Review | Mohebi | International


Journal of Learning, Teaching and Educational Research. Theoretical Models of

Integration of Interactive Learning Technologies Into Teaching: A Systematic

Literature Review | Mohebi | International Journal of Learning, Teaching and

Educational Research. https://doi.org/10.26803/ijlter.20.12.14

Moss, C. (2013). Research on Classroom Summative Assessment. Retrieved from

https://sk.sagepub.com/reference/hdbk_classroomassessment/n14.xml

Moyer, P. S. (2013). Are we having fun yet? How teachers use manipulatives to teach

mathematics. Educational Studies in Mathematics, 47(2), 175-197.

https://doi.org/10.1023/A:1014596316942

National Academy of Engineering (2021). Four MIT Researchers Elected to the National

Academy of Engineering for 2021. Retrieved from https://news.mit.edu/2021/four-

mit-researchers-elected-national-academy-engineering-2021-0212

Nawzad L., Rahim D. and Said K., (2018), The effectiveness of technology for improving the

teaching of natural science subjects, Indones. J. Curr. Educ. Technol. Stud., 6(1),

15–21.

Nsabayezu, E., Iyamuremye, A., Mbonyiryivuze, A., Niyonzima, F. N., & Mukiza, J. (2023).

Digital-based formative assessment to support students’ learning of organic

chemistry in selected secondary schools of Nyarugenge District in Rwanda. Education

and Information Technologies, 1-31.

Nu-Man, M.R., and Porter, T.M., (2018). Igniting Your Teaching with Educational

Technology A Resources for New Teachers. Editors: Matt Rhoads & Bonni

Stachowiak. Retrieved from https://edd7032017f2.pressbooks.com/

OECD (2018). Program for International Student Assessment (PISA) Results from PISA 2018.

Retrieved from https://www.oecd.org/pisa/publications/PISA2018_CN_PHL.pdf

Olayinka, A. (2016). Effects of Instructional Materials on Secondary Schools Students’

Academic Achievement on Social Studies in Ekiti State, Nigeria. Retrieved from

https://files.eric.ed.gov/fulltext/EJ1158251.pdf
Panadero, E., Jonsson, A., & Botella, J. (2017). Effects of self-assessment on self-regulated

learning and self-efficacy: Four meta-analyses. Educational Research Review, 22, 74–

98. https://doi.org/10.1016/j.edurev.2017.08.004.

Paje, Y. M., Rogayan, D. V., & Dantic, M. J. P. (2021). Teachers’ utilization of computerbased

technology in science instruction. International Journal of Technology in Education

and Science (IJTES), 5(3), 427-446. https://doi.org/10.46328/ijtes.261

Plybour, C. (2015). Integrating Formative Assessment into Physics Instruction: The Effect

of Formative vs. Summative Assessment on Student Physics Learning and

Attitudes Dissertations. 536. https://scholarworks.wmich.edu/dissertations/536

Rosenberg, M. J. (2001). E-learning: Strategies for delivering knowledge in the digital age.

New York: McGraw-Hill.

Querido, D. V. (2023, July 18). Effectiveness of Interactive Classroom Tool: A Quasi-

Experiment in Assessing Students’ Engagement and Performance in Mathematics

10 using ClassPoint. Applied Quantitative Analysis, 3(1), 79–92.

https://doi.org/10.31098/quant.1601

Ramsey, B., & Duffy, A. (2016). Formative assessment in the classroom: Findings from

three districts. Michael and Susan Dell Foundation and Education, 1. Retrieved

from https://education-first.com/wpcontent/uploads/2016/05/MSDF-

Formative-assessment-Study-Final-Report.pdf

Reynolds, L. M., Attenborough, J. & Halse, J. (2020). Nurses as educators: creating

teachable moments in practice. Nursing Times, 116(2), pp. 25-28.

Roehl et al. (2013). The Flipped Classroom: An Opportunity to Engage Millennial Students

through Active Learning. Journal of Family and Consumer Sciences, 105, 44.

Rogers, P. J. (2008). Using Programme Theory to Evaluate Complicated and Complex

Aspects of Interventions. Evaluation, 14, 29-48.

http://dx.doi.org/10.1177/1356389007084674
Rogler, D. (2014). Assessment Literacy: Building a Base for Better Teaching and Learning.

Retrieved from https://eric.ed.gov/?id=EJ1045594

Sailer, M., & Homner, L. (2020). The gamification of learning: A meta-analysis.

Educational Psychology Review, 32, 77–112. https://doi.org/10.1007/s10648-

019-09498-w

Seufert, S. (2019). The role of technology and immediate feedback for developing self-

regulated learning skills. Technology, Knowledge and Learning, 23(3), 545–563.

Shirley, M. L., & Irving, K. E. (2015). Connected Classroom Technology Facilitates

Multiple Components of Formative Assessment Practice. Journal of Science

Education & Technology, 24(1), 56-68. Doi: 10.1007/s10956-014-9520-x

Shute, V. J., & Rahimi, S. (2017). Review of computer-based assessment for learning in

elementary and secondary education. Journal of Computer Assisted Learning,

33(1), 1–19. https://doi.org/10.1111/jcal.12172.

Skryabin et al. (2015). How the

ICT Development Level and Usage Influence Student Achievement in Reading,

Mathematics, and Sciences. Retrieved from

https://www.researchgate.net/publication/273480843_How_the_ICT_developme

nt_level_and_usage_influence_student_achievement_in_reading_mathematics_an

d_science

Smetana, L. & Bell, R. (2012). Computer Simulations to Support Science Instructions

and Learning: A Critical Review of the Literature. Retrieved from

https://www.researchgate.net/publication/263388491_Computer_Simulations_t

o_Support_Science_Instruction_and_Learning_A_critical_review_of_the_literature

Sneider, C., & Wojnowsk, B. (2013). Opening the Door to Physics Through Formative

Assessment. pdf (aapt.org)

Sun, P., & Wu, J. (2020). Formative assessment in the era of technology: A literature

review. International Journal of Educational Technology in Higher Education,


18(1), 1–23.

Sung et al. (2016). The Effects of Integrating Mobile Devices with Teaching and Learning

on Students’ Learning Performance: A Meta-Analysis and Research Synthesis.

Retrieved from

https://www.sciencedirect.com/science/article/pii/S0360131515300804

Sung, Y. T., Chang, K. E., & Yang, J. M. (2019). The effects of integrating mobile devices

with teaching and learning on students' learning performance: A meta-analysis

and research synthesis. Computers & Education, 130, 136-149.

https://doi.org/10.1016/j.

Tan Ai Lin, D., Ganapathy, M., & Manjet, K. (2018). Kahoot! It: Gamification in higher

education. Journal of Social Siences and Humanities, 26(1), 565–582.

Tarhini et al. (2015). A cross-cultural examination of the impact of social, organisational

and individual factors on educational technology acceptance between British and

Lebanese university students. British Journal of Educational Technology, 46, 739–

755. doi:10.1111/bjet.12169

Terada, Y. (2020). A Powerful Model for Understanding Good Tech Integration. Retrieved

from https://www.edutopia.org/article/powerful-model-understanding-good-

tech-integration/

Timmis et al. (2016). Rethinking Assessment in a Digital Age: Opportunities, Challenges

and Risks. Retrieved from https://bera-

journals.onlinelibrary.wiley.com/doi/10.1002/berj.3215

Tondeur et al. (2017). Understanding the Relationship Between Teachers’ Pedagogical

Beliefs and Technology Use in Education: A Systematic Review of Qualitative

Evidence. Retrieved form

https://www.researchgate.net/publication/308128849_Understanding_the_relat

ionship_between_teachers'_pedagogical_beliefs_and_technology_use_in_education

_A_systematic_review_of_qualitative_evidence
Tosunoglu, T. (2018). Relationship Between Financial Stability and Economic Growth in

Turkey (2002- 2017). Retrieved from

https://ideas.repec.org/p/sek/iacpro/6409266.html

Trumbull, E., & Lash, A. (2013). Understanding Formative Assessment: Insights from

Learning Theory and Measurement Theory.

https://www.wested.org/online_pubs/resource1307.pdf

Tsai, C. W., Lin, C. Y., & Liu, Y. H. (2022). Investigating technology-enhanced formative

assessment to promote higher-order thinking skills: An exploration of design

principles and patterns. British Journal of Educational Technology, 53(5), 1096–

1115.

Turgay Yıldırım, Z., Yılmaz, T., & Yıldırım, S. (2021, August 31).

https://dergipark.org.tr/tr/download/article-file/1845500. Cukurova Anestezi

Ve Cerrahi Bilimler Dergisi, 4(2), 102–112.

https://doi.org/10.36516/jocass.2021.78

Ukobizaba, F. & NIzeyimana, G. (2021). Facing the Efforts of COVID-19 on Grade-12

Students’ Education: A Focus on Science and Mathematics Instructions. Retrieved

from

https://www.researchgate.net/publication/357737573_Facing_the_effects_of_C

OVID-19_on_Grade-

12_students'_education_A_focus_on_science_and_mathematics_instructions

University of Texas Arlington. UTA Works with Boeing and NASA to Understand Social

Network’s Impact on Online Students’ Grades, Completion Rates. Retrieved form

https://www.eurekalert.org/news-releases/536526

Villanueva, G. Facilitating Excellent Learning Through the Use of Educational

Technology. Retrieved from

https://www.slideshare.net/GlennVillanueva5/facilitating-excellent-learning-

through-the-use-of-educational-technology
Virata, R. O., & Javier, C. B. (2019, January). Perceived effects of computer-aided

formative assessments on lesson planning and student engagement.

In Proceedings of the 10th International Conference on E-Education, E-Business,

E-Management and E-Learning (pp. 151-157).

Voitkiv, H. V., & Lishchynskyy, I. M. (2020, September 20). Using of digital tools for the

formative assessment of future physics teachers. Science and Education a New

Dimension, VIII(236)(94), 77–80. https://doi.org/10.31174/send-pp2020-

236viii94-17

Wang, A. I. (2015). The wear out effect of a game-based student response system.

Computers & Education, 82, 217–227

Wang, T., & Liu, X. (2023). Technology-enhanced formative assessment and student

learning: A review of the literature from 2010 to 2020. Educational Technology

Research and Development, 71(3), 823–842. .

Webb M. E., Prasse D., Phillips M., Kadijevich D. M., Angeli C., Strijker A., Laugesen H.

et al., (2018), Challenges for IT-enabled formative assessment of complex 21st

century skills, Technology, Knowl. Learn., 23(3), 441–456.

Yoon et al. (2012). Exploring the Factors that Influence Students; Behaviors in

Information Security. Retrieved from https://aisel.aisnet.org/jise/vol23/iss4/7/

Yu, S., & Liu, X. (2021). The affordances of technology-mediated formative assessment

for formative assessment practices: A systematic review. Assessment & Evaluation

in Higher Education, 46(5), 770–788.

Zainuddin, Z., Kai Wah Chu, S., Shujahat, M., & Perera, C. J. (2020). The impact of

gamification on learning and instruction: A systematic review of empirical

evidence. Educational Research Review, 30, 1–23.

https://doi.org/10.1016/j.edurev.2020.100326

Action Plan

Implementation Step 1: Submit a copy of the Final Research Report to the Division Research Committee
  Responsibilities: Researcher
  Resources: 3 sets of the Final Research Report
  Timeline: May 3, 2024
  Implications: SDO archiving of the completed education research

Implementation Step 2: Present findings during the Division Research Congress
  Responsibilities: Researcher
  Resources: e-Research Report copy
  Timeline: May 10-12, 2024
  Implications: Division-recognized research work

Financial Report

Particulars Cost
ClassPoint Pro account subscription 1,000.00
Internet Connectivity 1,000.00
Professional Fee (Statistician) 7,500.00
Miscellaneous Expense 1,000.00

Total 10,500.00

Science 8
Test Items in Science 8 Pretest/Posttest for Item Analysis

DIRECTION: Choose and write the letter of the best answer on the space before each
number.

1. Which of the following is true about the effect of air temperature on the speed of
sound?
A. The lower the temperature the faster the speed.
B. The higher the temperature the faster the speed.
C. The higher the temperature the slower the speed.
D. The temperature does not affect the speed of sound.
2. What is the speed of sound in dry air at 0 °C?
A. 31 m/s B. 331 m/s C. 3000 m/s D. 300000 m/s
3. How much is the increase in the speed of sound in air for every 1 °C
increase in temperature?
A. 0.06 m/s B. 0.6 m/s C. 6 m/s D. 60 m/s
4. Which of the following statements is true about the speed of sound?
A. Sound travels faster in dry air than in wet air.
B. Molecules move faster so sound travels faster in warmer air.
C. Sound travels at a constant speed even if air temperature changes.
D. When comparing two media with the same phase, the sound will travel
faster in a denser material.
5. Which wave property is observed when a boy shouts and hears his own voice
inside the church?
A. echolocation B. reflection C. refraction D. both reflection and refraction
6. Sound travels faster through warm materials than cold materials because ____.
A. warm particles move slowly
B. gas particles are packed tightly
C. warm particles are moving quickly
D. sound does not travel faster through a warm substance
7. At which air temperature does sound travel faster?
A. 20 °C B. 23 °C C. 25 °C D. 28 °C

8. When a boy yells his name inside a cave, the sound reflects off the walls of the cave and
travels back to his ears. What do you call the reflected sound?
A. density C. echolocation
B. echo D. refraction
For nos.9 and 10:
Below is the data of air temperature of the four cities at the same time:
Metro Manila 29 °C Cebu City 27 °C
Davao City 26 °C Butuan City 25 °C
9. In which city does sound travel the fastest?
A. Butuan City B. Cebu City C. Davao City D. Metro Manila
10. In which city does sound travel the slowest?
A. Butuan City B. Cebu City C. Davao City D. Metro Manila
11. What happens to the temperature of an object when the particles are moving faster?
A. reduces B. increases C. remains constant D. increases then reduces
12. When a substance undergoes thermal expansion, its _________.
A. mass increases B. volume decreases
C. particles get colder D. particles spread out
13. The decrease in temperature of a substance indicates that _______.
A. the number of particles in it decreases
B. the average velocity of its particles increases
C. the average potential energy of particles decreases
D. the average kinetic energy of its particles decreases
14. Which of the following happens when ice changes into a liquid at 0 °C?
A. The molecules move slower than before.
B. The temperature of the substance increases.
C. The potential energy of the molecules increases.
D. The average movement of the molecules increases.
15. What happens to the surface of the water when the rate of evaporation is greater?
A. It becomes cooler. C. It becomes more massive.
B. It absorbs less energy. D. It absorbs greater energy.
16. During warm days, you cool yourself by damping your skin with a wet towel.
Which of the following takes place?
A. Your skin absorbs the coldness of the water.
B. Your skin releases energy when water from your skin evaporates.

C. The temperature of the water on your skin decreases as it evaporates.


D. The temperature of your skin increases as water evaporates from your skin.
17. Which of the following is TRUE about boiling?
A. It is slower than evaporation.
B. It takes place at a specific temperature.
C. It is the same for all liquids at the same temperature.
D. It takes place when bubbles begin to appear in the liquid.
18. Which of the following happens when ice changes into liquid at 0 °C?
A. The molecules are not moving.
B. The molecules move slower when ice changes into liquid.
C. The temperature of the liquid is higher than the temperature of the ice.
D. The temperature of the liquid is the same as the temperature of the ice.
19. A decrease in temperature of a substance indicates that the_______.
A. volume of the substance reduces
B. volume of the substance increases
C. particles of the substance get closer with each other
D. particles of the substance move farther from each other
20. Why does liquid in the thermometer rise when put in hot water?
A. The liquid is boiling.
B. The liquid is evaporating.
C. The liquid gains heat from the hot water causing it to expand.
D. The liquid loses heat from the hot water causing it to contract.
21. Which has the greatest energy among the colors in a rainbow?
A. green B. orange C. red D. violet
22. Among the following colors in a rainbow, which has the least energy?
A. green B. red C. violet D. yellow
23. Which property of light enables the formation of a rainbow?
I. color separation II. Dispersion III. Reflection IV. refraction
A. I and III C. I, III and IV
B. II, III, and IV D. I, II, III, IV

24. What is the term for the separation of white light into different colors as it
passes through a prism?
A. color separation B. dispersion C. reflection D. refraction
25. Which of the following orders of visible light colors shows increasing
wavelength?
A. red, orange, yellow, green, blue, indigo, violet
B. red, yellow, green, orange, violet, blue, indigo
C. violet, indigo, blue, green, yellow, orange, red
D. violet, blue, green, orange, red, indigo, yellow
26. Why does white light separate into different colors as it passes through a prism?
A. The colors are changed by addition.
B. This is an example of color by subtraction.
C. Different colored light has different wavelengths.
D. The side part of a prism only let certain colors of light pass through.
27. Which of the following is true about the relationship between frequency and
energy?
A. The frequency of the color of light and energy are not related.
B. As the frequency of the color of light increases, its energy decreases.
C. As the frequency of the color of light decreases, the energy increases.
D. As the frequency of the color of light increases, the energy also increases.
28. Which of the following statements is incorrect?
A. Short wavelength corresponds to low frequency.
B. Frequency and wavelength are inversely related.
C. High frequency light corresponds to short wavelength.
D. Low frequency light corresponds to long wavelength.
29. White light separated through a prism is an example of ___________.
A. diffraction B. rarefaction C. reflection D. refraction
30. Which of the following arrangements of visible light colors shows decreasing
wavelength?
A. red, orange, yellow, green, blue, indigo, violet
B. red, yellow, green, orange, violet, blue, indigo
C. violet, blue, green, orange, red, indigo, yellow
D. violet, indigo, blue, green, orange, yellow, red

31. Refractive Index is a ratio between the speed of light in vacuum and ____.
A. speed of light in vacuum C. speed of light in a medium
B. speed of sound in vacuum D. speed of sound in a medium
32. The diagram shows a ray of white light passing through a prism and emerges
as a band of colored light which strikes a screen. What is the color of X and Y?
A. X=Blue, Y=Red C. X=Green, Y=Red
B. X=Red, Y=Violet D. X=Green, Y=Blue
33. White light separated through a prism is an example of ___________.
A. diffraction B. rarefaction C. reflection D. refraction
34. Which of the following colors has the highest energy?
A. orange B. red C. violet D. yellow
35. What refers to the bending of light as it passes from one medium into another?
A. frequency B. reflection C. refraction D. wavelength
36. It is a difference in electric potential energy in joule/coulomb.
A. circuit B. current C. resistance D. voltage
37. What is the SI unit of voltage?
A. ampere B. ohm C. volt D. watt
38. It is the number of charges passing through a wire per unit time.
A. current B. power C. resistance D. voltage
39. What is the SI unit of current?
A. ampere B. ohm C. volt D. watt
40. It is the opposition to the flow of electric charges as they travel through a conducting
wire.
A. circuit B. current C. resistance D. voltage
41. What is the SI unit of resistance?
A. ampere B. ohm C. volt D. watt
42. Which of the following is the correct statement of Ohm’s Law?
A. When current increases in a circuit, voltage increases and resistance increases.
B. When current increases in a circuit, voltage decreases and resistance
increases.
C. When current increases in a circuit, voltage increases while resistance
remains constant.
D. When current decreases in a circuit, voltage decreases and resistance increases.

43. According to Ohm’s law, across a resistor with constant resistance, what
happens to the current across it when the voltage applied is halved?
A. halved B. doubled C. quadrupled D. remains the same
44. Consider a simple electric circuit with a voltage source of 20.0 V which has a
current of 0.500 A. What is the resistance of the load?
A. 20.0 ohms B. 30.0 ohms C. 40.0 ohms D. 50.0 ohms
45. A laptop power charger has an output of 5.00 volts and has a resistance of 800
ohms. What is the current output of the charger?
A. 6.25 mA B. 50.0 mA C. 75.0 mA D. 80.0 mA
46. A motorcycle starter motor needs 40.0 A to operate with a resistance of 0.150
ohms. What is the needed voltage to start the motor?
A. 5.00 V B. 6.00 V C. 7.00 V D. 8.00 V
47. What happens to the current across a circuit when the voltage is doubled while
the resistance is held constant?
A. tripled B. halved C. doubled D. remains the same
48. What is the electric current if a circuit has a resistance of 100 Ω and a voltage of 12.0 V?
A. 0.120 A B. 9.00 A C. 12.0 A D. 25.0 A
49. What is the electric current if a circuit has a resistance of 100 Ω and a voltage
of 6.00 V?
A. 0.0600 A B. 4.50 A C. 6.00 A D. 12.5 A
50. What will happen to the current if the voltage is reduced to one half?
A. tripled C. decreased by one half
B. doubled D. decreased by one fourth

END OF TEST

CHARMALOU P. OGARTE

PERSONAL DATA

Place of Birth : Dapitan City, Zamboanga del Norte


Date of Birth : January 3, 1983
Home Address : Sicayab, Dipolog City
Nationality : Filipino
Civil Status : Married
Parents : Hermenegildo G. Pampilo
Herry M. Pampilo
Husband : James Neil A. Ogarte
Daughter : Francine Kizzie P. Ogarte
Son : Ken James Enzo P. Ogarte

EDUCATIONAL BACKGROUND

Elementary : Sindangan Pilot Demonstration School


Municipality of Sindangan
1995

Secondary : Polanco National High School


Polanco, Zamboanga del Norte
1999

Tertiary : Bachelor of Science in Industrial Education


Major: General Science
Jose Rizal Memorial State University
Dipolog City
2003

ELIGIBILITY

Licensure Examination for Teachers


(LET) – July, 2003

WORK EXPERIENCE

Zamboanga del Norte National High School - Turno - 2012-Present


Teacher III, Science Department

Dipolog Community School 2003-2012


Science Teacher

TRAININGS ATTENDED

GURO21 Course 2 (Developing Higher Order Thinking Skills)


February 20, 2022- April 22, 2022

2ND DOST- SEI PROJECT STAR International Conference


November 16, 2022 – November 18 , 2022

PD - Program on Assessment Competencies and Emerging Literacies


October 23, 2021- April 3, 2022

National Science Club Month Webinar Series


September 4, 2021- September 25, 2021

Capacity Building On Learning Resources Mapping , Crafting Of WHLP,


LAS Training
October 15, 2021 – October 20, 2021

Regional Pedagogical Retooling In Mathematics, Language And Science


February 25, 2019 - March 1, 2019
