
Assessment in Learning 1

Melisa O. Derramas Educ2207

Basic Concepts in Assessment

Assidere
• Latin word which means "to sit beside another."

Assessment
• Process of gathering quantitative and qualitative data.

Purpose of Assessment
• Decision making
• Determine the impact of curriculum and instruction on students.

Assessment in Learning
• Systematic and purpose-oriented collection, analysis, and interpretation of evidence of student learning, in order to make informed decisions relevant to the learners.
• Use of evidence on student learning to further promote and manage learning.

Characteristics of Assessment in Learning
• Process
• Based on Specific Objectives
• From Multiple Sources

Terms in Assessment

Measurement
• Process of quantifying the attributes of an object.
• Actual collection of information on student learning through various strategies and tools.

Evaluation
• Process of making value judgments on the information collected from measurement, based on specified criteria.
• Actual process of making a decision or judgment on student learning based on the information collected from measurement.

Testing
• Most common form of assessment.
• Use of a test or battery of tests to collect information on student learning over a specific period of time.

"A test is a form of assessment, but not all assessments use tests for testing."

Types of Tests
• Selected Response: Matching type
• Constructed Response: Essay, Short Answer

Type of Test Based on Format
Objective Format- Direct Choices
• Multiple Choice
• Enumeration
• True or False
• Bias-free; items have exact answers.

Subjective Format
• Essay
• Less objective, especially if no rubric is used.

Table of Specifications (TOS)
• Maps the essential aspects of the test:
  o Test Objectives
  o Contents
  o Topics covered by the test
  o Item Distribution (divided into LOTS and HOTS)
• Used for the design and development of the test.
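A minimal sketch of how a TOS can distribute items is shown below; the topics, hours, total item count, and the 60/40 LOTS/HOTS split are hypothetical and only illustrate the idea of allocating items in proportion to instructional time.

    # Hypothetical sketch: allocate 50 items across topics in proportion to hours taught,
    # then split each topic's items between LOTS and HOTS (assumed 60/40 split).
    hours = {"Basic Concepts": 6, "Item Analysis": 8, "Frameworks": 4, "Grading": 2}
    total_items = 50
    total_hours = sum(hours.values())

    tos = {}
    for topic, hrs in hours.items():
        n_items = round(total_items * hrs / total_hours)   # items proportional to time spent
        lots = round(n_items * 0.6)                         # lower-order thinking skills
        hots = n_items - lots                               # higher-order thinking skills
        tos[topic] = {"items": n_items, "LOTS": lots, "HOTS": hots}

    for topic, row in tos.items():
        print(f"{topic}: {row['items']} items ({row['LOTS']} LOTS, {row['HOTS']} HOTS)")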
"A test is good and effective if it has acceptable psychometric properties."

Good and Effective Test
• Valid- it measures what it intends to measure.
• Reliable- consistency.
• Acceptable level of difficulty.
• Discriminates between learners with higher and lower ability.

Grading
• Process of assigning value to the performance or achievement of the learner based on specified criteria or standards, using tests and tasks.
• Form of evaluation which provides information on whether a learner passed or failed a subject or a particular task.
• Quantifies the performance of the student.

Inclusions in Grading
• Recitation
• Homework
• Seatwork
• Project

Final Grade
• Summation of information from multiple sources (several tasks or requirements).
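As a rough illustration of that summation, a final grade is often a weighted combination of the different sources; the component weights and scores below are assumed for the example, not prescribed by these notes.

    # Assumed example weights; actual weights depend on the school's grading policy.
    weights = {"recitation": 0.15, "homework": 0.15, "seatwork": 0.20, "project": 0.20, "exams": 0.30}
    scores  = {"recitation": 88,   "homework": 92,   "seatwork": 85,   "project": 90,   "exams": 84}

    # Final grade = sum of each component score multiplied by its weight.
    final_grade = sum(scores[c] * w for c, w in weights.items())
    print(round(final_grade, 2))   # 87.2 for these made-up scores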
Frameworks in Assessment

Frameworks- Basis in forming assessment.

Classical Test Theory or True Score Theory
• Variation in the performance of examinees on a given measure is due to variation in their abilities.
• Must assume both the score and the error.
• Example:
  o If Student A's score is 10 and Student B's score is 5, it means that there is really a difference in the ability of Student A and Student B.
• Assumes that an examinee's observed score on each measure is the sum of:
  o the examinee's true score, and
  o some degree of error in the measurement caused by some internal and external conditions.

Internal Condition- Personal situation of the students
External Condition- Environment of the students
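In symbols, this assumption is usually written as observed score = true score + error (X = T + E). A small sketch of the idea, where the Gaussian error and its spread are assumed purely for illustration:

    import random

    # Classical test theory sketch: observed score = true score + error.
    # The error stands for internal conditions (the student's personal situation)
    # and external conditions (the testing environment).
    def observed_score(true_score, error_sd=2.0):
        error = random.gauss(0, error_sd)   # random error centered on zero
        return true_score + error

    random.seed(1)
    print(observed_score(10))   # Student A: near the true score of 10, shifted by error
    print(observed_score(5))    # Student B: near the true score of 5, shifted by error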
Classical Test Theory or True Score Theory
• Assumes that all measures are imperfect, and scores obtained from a measure could differ from the true score (true ability) of an examinee.
• Item Difficulty- Based on the frequency or number of examinees who correctly answered a particular item.
  o Example: If, on a certain item, only five out of 50 got the correct answer, the item is very difficult. If 45 out of 50 got the item correct, the item is easy.
• Item Discrimination- Based on the number of examinees with higher or lower ability who answer a particular item correctly. If an item is able to distinguish between examinees with higher and lower ability, the item is considered to have good discrimination.
• Discrimination Index- Used for item analysis:
  o Below 0.10- poor
  o 0.10-0.19- marginal
  o 0.20-0.29- moderately discriminating
  o 0.30-0.39- discriminating
  o 0.40 and above- very discriminating (a very good item)
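A minimal sketch of how these two indices are commonly computed from scored responses (1 = correct, 0 = wrong); the 27% upper/lower grouping is a common convention assumed here, not taken from the notes.

    def item_difficulty(item_scores):
        # Proportion of examinees who answered the item correctly (0.0 to 1.0).
        return sum(item_scores) / len(item_scores)

    def item_discrimination(item_scores, total_scores, fraction=0.27):
        # Rank examinees by total score, take the upper and lower groups,
        # then subtract the lower group's difficulty from the upper group's.
        n = max(1, int(len(total_scores) * fraction))
        order = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
        lower, upper = order[:n], order[-n:]
        p_upper = sum(item_scores[i] for i in upper) / n
        p_lower = sum(item_scores[i] for i in lower) / n
        return p_upper - p_lower   # compare against the discrimination index scale above

    # Example from the notes: 5 out of 50 correct -> difficulty 0.10 (a very difficult item).
    print(item_difficulty([1] * 5 + [0] * 45))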
Test Reliability Estimation
• Kuder-Richardson 20 (KR-20)
• Cronbach's Alpha
  o Less than 0.70 is below the usual cut-off.
• Cronbach's Alpha checks whether your responses are consistent.
• Ranges from 0 to 1.0 – the cut-off for standard testing is 0.7.
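A small sketch of Cronbach's alpha computed from a score matrix (rows = examinees, columns = items); for items scored 0/1 this is equivalent to KR-20. The sample data are made up for illustration.

    from statistics import pvariance

    def cronbach_alpha(scores):
        # scores: list of examinee rows, each a list of item scores.
        k = len(scores[0])                                   # number of items
        item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
        total_var = pvariance([sum(row) for row in scores])  # variance of total scores
        return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

    data = [
        [1, 1, 1, 0],
        [1, 1, 0, 0],
        [1, 0, 0, 0],
        [1, 1, 1, 1],
        [0, 0, 0, 0],
    ]
    print(round(cronbach_alpha(data), 2))   # 0.8 for this made-up data; compare with the 0.70 cut-off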
Item Analysis
• In CTT, considered simple.
• Item difficulty index – based on the number of correct and incorrect responses to an item.
• Item discrimination index
• Item-Total Correlation- (e.g., if item number 1 is wrong, what is the student's total score? If item number 2 is wrong, what is the student's total score?)
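One standard way to compute the item-total correlation is the correlation between scores on a single item and each examinee's total score (a point-biserial correlation when items are scored 0/1): an item whose wrong answers tend to come from low-scoring students correlates positively with the total. A sketch using made-up data:

    from statistics import mean, pstdev

    def item_total_correlation(item_scores, total_scores):
        # Pearson correlation between one item (0/1) and the total test score.
        mi, mt = mean(item_scores), mean(total_scores)
        cov = mean((i - mi) * (t - mt) for i, t in zip(item_scores, total_scores))
        return cov / (pstdev(item_scores) * pstdev(total_scores))

    item   = [1, 1, 0, 1, 0]        # responses to one item
    totals = [38, 35, 20, 40, 18]   # each examinee's total score
    print(round(item_total_correlation(item, totals), 2))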
Item Response Theory
• Analyzes items by estimating the probability that an examinee answers an item correctly or incorrectly.
• Assumes that the characteristics of an item can be estimated independently of the characteristics or ability of the examinee, and vice versa. (We can know the performance of our students based on their responses.)

IRT Analysis
• Item Difficulty
• Item Discrimination
• Fit Statistics
• Item Characteristic Curve (ICC)
• Test Characteristic Curve (TCC)
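The notes do not name a specific IRT model; one common choice is the two-parameter logistic (2PL) model, where the probability of a correct response depends on the examinee's ability (theta), the item difficulty (b), and the item discrimination (a). A sketch of the item characteristic curve under that assumption:

    import math

    def icc_2pl(theta, a, b):
        # 2PL item characteristic curve: probability that an examinee of ability theta
        # answers correctly an item with discrimination a and difficulty b.
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    # An examinee whose ability equals the item's difficulty has a 0.5 chance of success.
    print(icc_2pl(theta=0.0, a=1.2, b=0.0))   # 0.5
    print(icc_2pl(theta=1.0, a=1.2, b=0.0))   # higher ability -> higher probability

The test characteristic curve (TCC) is then the sum of the ICCs of all items on the test, i.e., the expected total score at each ability level.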
Diagnostic Test- Confirms if the learner is having difficulty.

Types of Assessments

Formative (Assessment for Learning)
• Provides information to both the teachers and the learners on how they can improve the teaching-learning process.
• Used at the beginning of and during instruction.
• Allows the teachers to make adjustments to the instructional process and strategies to facilitate learning.
• Informs learners about their strengths and weaknesses to enable improvement.

Assessment for Learning- Assesses the learners' knowledge, understanding, and skills, shows whether the teacher is doing the right thing, and gives the teacher the chance to modify his or her strategies.

Summative Assessment (Assessment of Learning)
• Aims to determine learners' mastery of content or attainment of learning outcomes.
• Provides information on the quantity or quality of what the students learned or achieved at the end of instruction.
• E.g., Prelims, Midterms, Finals.
• Used for evaluating learners' performance in class.
• Provides teachers with information about the effectiveness of teaching strategies and how they can improve instruction in the future.
• Informs learners what they have done well and what they need to improve.
• Evaluation against outcomes.

Diagnostic
• Aims to detect the learning problems or difficulties of the learners.
• Goal: corrective measures or interventions done to ensure learning.

Placement
• When? At the beginning of the school year.
• Why? To determine what the learners already know or what their needs are, to inform the design of instruction.
• Grouping of learners based on the result (e.g., an entrance examination) is usually done to address their needs or accommodate their entry performance.

Traditional
• Use of conventional strategies or tools to provide information about the learning of the students.
• Objective (Multiple Choice)
• Subjective (Essay)
• Used as a basis for evaluating and grading learners.
• Used due to ease of design and quicker scoring.
• Inauthentic type.

Authentic
• Allows learners to perform or create a product that is meaningful to them, based on real-world contexts.
• Authenticity: best described in terms of degree rather than the presence or absence of authenticity.
Principles in Assessing Learning

Background on OBE (Outcome-Based Education)
• Start with a clear picture of what is important for students to be able to do.
• Organize the curriculum, instruction, and assessment around that picture.

Four Basic Principles of OBE (CDHE)
• Clarity of Focus
  o Focused on what they want students to know, understand, and be able to do (the outcome).
  o Teachers should focus on helping students to develop the knowledge, skills, and personalities that will enable them to achieve the intended outcomes that have been clearly articulated.
• Designing Down
  o Start the curriculum design with the intended outcomes to be achieved by students at the end of the program.
  o Curriculum design, delivery, and assessment should be based on outcomes.
• High Expectations
  o "No student left behind" does not mean automatic pass.
  o Establish high and challenging standards.
  o The purpose of high expectations is to encourage students to engage in class and in learning.
  o Successful learning promotes a more successful learner.
• Expanded Opportunities
  o Teachers must strive to provide expanded opportunities for all students.
  o Principle: Not all learners can learn the same thing in the same way and at the same time.
  o Most students can achieve high standards if they are given appropriate opportunities.

Principles of Assessment

Assessment should have a clear purpose.
• Purpose- the basis for the methods used in collecting information.
• OBE principles: Clarity of Focus and Design Down.

Assessment is not an end in itself.
• Not a simple recording or documentation of what learners know and do not know.
• It should lead to decisions that will allow improvement of the learners.

Assessment is an ongoing, continuous, and formative process.
• Consists of a series of tasks and activities conducted over time.
• Not a one-shot activity; it should be cumulative.
• Important element: continuous feedback.
• OBE principle: Expanded Opportunities.

Assessment is learner-centered.
• It is not about what the teacher does, but about what the learner can do.
• Provides teachers with an understanding of how to improve teaching.

Assessment is both process- and product-oriented.
• Equal importance is given to the learner's performance or product and to the process behind it.
Assessment must be comprehensive and holistic.
• Uses a variety of strategies and tools.
• Conducted in multiple periods over time.
• OBE principle: Expanded Opportunities.

Assessment requires the use of appropriate measures.
• Valid (measures what it intends to measure) and reliable (consistent/accurate).
• Challenging
• Age-appropriate
• Context-appropriate
• OBE principle: High Expectations.

Assessment should be as authentic as possible.
• Closely, if not fully, approximates real-life situations or experiences.

Educational Goals, Standards, and Objectives

Goals

Standards
• Specific statements about what learners should know and are capable of doing at a particular grade level, subject, or course.

4 Types of Educational Standards
• Content- desired outcomes in a content area.
• Performance- what students do to demonstrate competence.
• Development- sequence of growth and change over time.
• Grade Level- outcomes for a specific grade.

Educational Objectives
• Specific statements of learner performance at the end of an instructional unit.
• Also referred to as behavioral objectives and typically stated with the use of verbs.

Learning Targets- Used for specific activities.
Goals- Prescribed by DepEd.
Standards & Objectives- School-based.

Bloom's Taxonomy of Educational Objectives
• Domains (Goals): Cognitive (K), Affective (A), Psychomotor (S)
• Multiple types of test can fall under the six levels of the taxonomy; it only depends on how the question is constructed.

Bloom's Taxonomy of Educational Objectives for Knowledge-Based Goals
• Knowledge
• Comprehension

Cognitive Domain