Assessment of Learning


Assessment = ASSISTment – a mnemonic: assessment exists to assist learning

Classroom Assessment
 Identifying, gathering, organizing, and interpreting data about students’ learning
 Types of Data
1. Qualitative
a. Names
b. Smell
c. Color
2. Quantitative
a. Scores on exam
b. Weight of a person

Basic Concepts

Test – tool/instrument that quantifies students’ knowledge or skills


Evaluation – judging data
Assessment – interpreting data
Measurement – gathering data

Outcome-Based Education
 Shift of Educational Focus from Content to Learning Outcomes
 (CMO 46 s. 2012)
 Motivation – a call for quality and accountability in education
 A method of curriculum design that focuses on what students can
actually do after they are taught.
 Learner-centered approach to education
 Focus – to produce students who are able to apply their learning in real-
life situations.
 Outcomes
1. Immediate Outcomes – can manifest results right after class.
2. Deferred Outcomes – can manifest results after a few years

Cycle
Principles of OBE (Spady, 1996)
1. Clarity of focus
a. School’s focus must be clear
2. Designing down
a. Deductive (general to specific)
3. High expectations
a. Teachers set high, challenging standards of performance that all
students are expected to meet
4. Expanded opportunities
a. Students are given expanded opportunities to learn and to
demonstrate the outcomes

Steps in OBE
1.

Program outcomes for teacher education


1. Articulate the rootedness of education in philosophical, socio-cultural,
historical, psychological, and political contexts
2. Demonstrate mastery of subject matter/discipline
3. Develop innovative curricula, instructional plans, teaching approaches,
and resources for diverse learners
4. Facilitate learning using a wide range of teaching methodologies

Outcome-based Teaching and Learning (OBTL)


 The application of OBE in the teaching and learning process.

Learning Outcomes in Different Levels


1. Learning Outcomes –
 Most specific outcomes; observed after a lesson
2. Course Outcomes –
 Subject-level outcomes; observed after a few weeks, months, or a
semester
3. Program Outcomes –
 What a graduate should attain at the end of the program
4. Institutional Outcomes –
 Graduate attributes
Institution’s Vision and Mission
 Can be a source of student learning outcomes
Mission
 What the school is doing right now
 Present
Vision
 What the school aspires to become
 Future

Students Learning Outcomes


 What we want our students to know or be able to do

Outcomes
 Clear learning results that learners have to demonstrate.
 Content and Performance Standards and Learning Competencies
o Discussed at the beginning of class

Domains of Learning
1. Cognitive
 knowledge
2. Affective
 emotions
3. Psychomotor
 Movements

Bloom’s Taxonomy of Educational Objectives


Old Version (lowest to highest level)
1. Knowledge
2. Comprehension
3. Application
4. Analysis
5. Synthesis
6. Evaluation

New Version (lowest to highest level)
1. Remembering
2. Understanding
3. Applying
4. Analyzing
5. Evaluating
6. Creating
Assessment: determining progress towards the attainment of learning outcomes

Purpose of Assessment
1. Assessment for learning
a. Done before or during instruction
b. E.g. placement exams, pre-test, diagnostic, quizzes

c. Placement assessment – places students in specific learning groups
d. Diagnostic assessment – identifies strengths and weaknesses
e. Formative assessment – monitors learning during instruction to
provide feedback
2. Assessment as Learning
a. Implemented at the end of the lesson
b. Self-assessment
i. Teacher – to understand and perform well their role of
assessing for and of learning
ii. Students – to be self-directed in their own learning
c. Not graded
3. Assessment of Learning
a. Done after instruction
b. Graded
i. Quantitative measurement
ii. Summative assessment
c. Integral part of the teaching-learning process
d. Finds out if the learning objectives were achieved.
e. e.g. prelims, midterms, finals

Modes of Assessment
1. Traditional – pen and paper
 Selected-response type
 Test items with options
 Multiple choice questions; matching type; true or false
 Constructed-response type
 Test items without options
 Short-answer, essay, problem solving
2. Authentic – real-life situations
 Students are asked to perform real-world tasks that demonstrate
meaningful application of essential knowledge or skills
 E.g. OJTs, internships, practicum
 Performance Assessment
1. Process Oriented
2. Product Oriented
 Features
 Meaningful Performance task
 Clear standards and public criteria
 Quality products and performance
 Positive interaction between the assessee and the assessor
 Emphasis on metacognition
 Learning that transfers
Attribute      Traditional            Authentic
1. Action      Selecting a response   Performing a task
2. Setting     Contrived              Simulation
3. Method      Recall                 Application
4. Focus       Teacher-structured     Student-structured
5. Outcome     Indirect evidence      Direct evidence

Non-Test Assessment of Learning


 Formative
 Informal, impromptu feedback
 Measures students directly with real tasks
 E.g. portfolio, teacher observation, slates, debates, panel discussion,
problem solving, demonstrations, projects, games, checklists, etc.

3. Alternative – assessment methods other than traditional paper-and-pencil
tests
4. Performance
a. Demonstration type
i. Teachers demonstrate how to do a task, then students are
assessed on how they perform the task.
b. Creation type
i. There is an output.
7 Criteria in Selecting Performance Assessment Tasks
1. Generalizability
2. Authenticity
3. Multiple-foci
4. Teachability
5. Feasibility
6. Scorability
7. Fairness

Criterion-Referenced Assessment
 There are set criteria or standards that apply to all students
 e.g. Board Examination – there are standards set by the PRC that examinees
must achieve

Norm-referenced Assessment
 Each student’s performance is compared with the performance of other
students
 e.g. PE – grades for a dance are based on how students perform relative to
one another; if the teacher sees that one student stands out, that student’s
performance becomes the norm (a sketch contrasting the two references follows
below)
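
To make the contrast concrete, here is a minimal Python sketch (the function
names and the passing cutoff are hypothetical): criterion-referenced grading
checks each score against a fixed standard, while norm-referenced grading
ranks a score against classmates.

    def criterion_referenced(score, cutoff=75):
        # Every student is judged against the same fixed standard.
        return "pass" if score >= cutoff else "fail"

    def norm_referenced(score, class_scores):
        # The student is judged against peers: percent of classmates below.
        below = sum(1 for s in class_scores if s < score)
        return 100 * below / len(class_scores)

    class_scores = [60, 65, 70, 75, 80, 85, 90]
    print(criterion_referenced(72))           # fail - below the fixed cutoff
    print(norm_referenced(72, class_scores))  # ~42.9 - rank among peers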
Contextualized Assessment
 Focuses on students’ construction of functioning knowledge
Decontextualized Assessment
 Written exams
 Focuses on declarative and procedural knowledge.
 No direct connection to real-life context

5. Portfolio
a. Systematic and organized collection of student’s work that
demonstrates skills and accomplishments
b. Types
i. Showcase
1. Best outputs
ii. Process
1. Cognitive and psychomotor progress
iii. Evaluation/Assessment
1. Diagnose student’s learning
iv. Documentation
1. Learning progress
c. Includes:
i. Artifacts
1. Academic outputs
ii. Reproductions
1. Student work produced outside the classroom
iii. Attestations
1. Evaluative notes of a teacher
iv. Productions
1. Goals, reflections, and captions.
d. Elements
i. Cover letter
ii. Table of contents
iii. Entries
iv. Dates
v. Drafts
vi. ---
e. Principles
i. Content
1. Subject matter that is important for students to
learn
ii. Learning
1. Students become active learners
iii. Equity
1. Allow students to demonstrate their learning styles
and multiple intelligences
f. E-portfolio
i. Digital collection
ii. Types
1. School-centered
2. Student-centered
3. Assessment
4. Learning
5. Career/transfer

Scoring Rubric
 Content
1. Criteria
2. Descriptions of level of performance
 Types
1. Holistic
a. General or overall impression
2. Analytic
a. Specific; describes the level of performance for each criterion
(see the sketch at the end of this section)
3. General
a. Assesses general tasks/skills
4. Specific
a. Assesses a specific task/skill
 Other scoring instruments
1. Likert scale
2. Rating scale
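
As a minimal sketch of the analytic type, a rubric can be represented as
criteria with per-level descriptors, and the total score is the sum of one
rating per criterion (the criteria and descriptors below are hypothetical):

    rubric = {
        # criterion: descriptor per performance level (1 = lowest, 4 = highest)
        "Content":      {1: "off-topic", 2: "thin", 3: "adequate", 4: "thorough"},
        "Organization": {1: "disordered", 2: "loose", 3: "clear", 4: "seamless"},
        "Mechanics":    {1: "many errors", 2: "some errors", 3: "few", 4: "none"},
    }

    def analytic_score(ratings):
        # Analytic scoring: one rating per criterion, summed.
        # A holistic rubric would instead give a single overall rating.
        return sum(ratings[criterion] for criterion in rubric)

    print(analytic_score({"Content": 4, "Organization": 3, "Mechanics": 3}))  # 10 of 12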

Distinguishing and Constructing Various Paper-and-Pencil Tests


1. Selected-response test
a. Binary-choice items
i. Give students only two options from which to select
ii. True or false test
1. Avoid giving clues
2. Avoid using specific determiners (sweeping
generalization) like always, never, all, usually,
impossible.
3. Avoid using trick questions (e.g. misspellings.)
4. Keep item length similar
5. Avoid using negative statements and double
negatives.
6. Include only one concept in each statement
b. Multiple-choice items
i. Students are asked to choose a correct or best answer out
of the choices from a list.
ii. Types
1. Direct-question form
2. Incomplete statement form
3. Negative stem
4. Best answer
5. Group options
6. Contained options
7. Stimulus material-stem options
iii. Pointers
1. The stem should consist of a self-contained question
or problem.
2. Distractors should be equally plausible and
attractive.
3. Avoid options that are synonymous.
4. Avoid double negatives.
5. The options must be in capital letters.
6. Avoid stems that reveal the answer to the next
questions.
7. Avoid complex/awkward words.
c. Matching Type/Association Test
i. Students associate an item in one column with a choice in
the second column.
ii. Types
1. Perfect Matching
2. Imperfect Matching
iii. Pointers
1. Use homogeneous options and stems.
2. Provide more responses than premises.
3. Arrange the options alphabetically, numerically, or
logically
4. Limit the number of items within each set
5. ---
6. ---
7. ---
2. Constructed-Response Test
a. Short-Answer Items/Completion Test
i. Type
1. Cloze Test – students supply words that have been
deleted from a passage
ii. Pointers

b. Essay test
i. Allows greater freedom of response to questions and
requires more writing

Five general item commandments


1. Thou shall not provide opaque directions to students regarding how to
respond to your assessment instrument.
2. Thou shall not employ ambiguous statements in your assessments.
3. Thou shall not provide students with unintentional clues regarding
appropriate responses.
4. Thou shall not employ complex syntax in your assessment item.
5. Thou shall not use vocabulary that is more advanced than required.
Phases of Making a Test
Item Analysis: Difficulty and Discrimination Index
Difficulty Index
 Proportion of students who answered the test item correctly
Discrimination Index
1. Negative Discrimination Index
a. More from the lower group answered the test item correctly
2. Positive Discrimination Index
a. More from the upper group answered the test item correctly
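
A minimal Python sketch of both indices, assuming the common convention of
comparing the upper and lower scoring groups with 0/1 item responses (the
data below are hypothetical):

    def difficulty_index(upper, lower):
        # Proportion of both groups combined that answered correctly.
        return (sum(upper) + sum(lower)) / (len(upper) + len(lower))

    def discrimination_index(upper, lower):
        # Positive: more upper-group students got the item right;
        # negative: more lower-group students did.
        return sum(upper) / len(upper) - sum(lower) / len(lower)

    upper = [1] * 8 + [0] * 2   # 8 of 10 upper-group students correct
    lower = [1] * 4 + [0] * 6   # 4 of 10 lower-group students correct
    print(difficulty_index(upper, lower))      # 0.6 - moderate difficulty
    print(discrimination_index(upper, lower))  # ~0.4 - positive discrimination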
Principles of a high-quality assessment
Characteristics of a good test
1. Validity
a. It refers to the extent to which a test measures what it intends to
measure.
b. Types
i. Face Validity
1. Physical appearance
a. How is your test arranged?
b. Is the font too small?; is the spacing enough?
ii. Construct Validity
1. Determines whether the assessment is a meaningful
measure of an unobservable trait or characteristic.
2. Types
a. Convergent validity
i. The test correlates well with measures of
similar traits
b. Divergent validity
i. The test describes only the intended trait and
does not correlate with measures of other traits
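
A minimal sketch of how these are checked in practice: correlate a new test
with an established measure of a similar trait (expect high r, convergent
evidence) and with a measure of a different trait (expect low r, divergent
evidence). The scores are hypothetical; statistics.correlation requires
Python 3.10+.

    import statistics

    new_test  = [12, 15, 18, 20, 25, 28]
    similar   = [14, 16, 17, 21, 24, 29]   # established measure, same trait
    different = [50, 43, 55, 41, 52, 47]   # measure of an unrelated trait

    print(statistics.correlation(new_test, similar))    # high r -> convergent
    print(statistics.correlation(new_test, different))  # low r  -> divergent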
Test Validity enhancers

2. Reliability
a. Consistency of scores
b. Methods
i. Test-retest
1. Repetition of the same test
ii. Parallel/Equivalent forms
1. Two parallel forms of a test are given to the same
group of students
iii. Split-half
1. One test that is divided into two equivalent halves
iv. Kuder-Richardson
1. Based on the proportion/percentage of students
passing or failing each item (a KR-20 sketch follows below)
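
A minimal sketch of the KR-20 formula, KR-20 = k/(k-1) x (1 - Σpq / σ²),
assuming dichotomously scored (0/1) items; the score matrix is hypothetical:

    def kr20(scores):
        # scores: rows = students, columns = items (1 = correct, 0 = wrong)
        k = len(scores[0])                      # number of items
        n = len(scores)                         # number of students
        totals = [sum(row) for row in scores]   # each student's total score
        mean = sum(totals) / n
        var = sum((t - mean) ** 2 for t in totals) / n  # variance of totals
        pq = sum(p * (1 - p) for p in
                 (sum(row[i] for row in scores) / n for i in range(k)))
        return (k / (k - 1)) * (1 - pq / var)

    scores = [[1, 1, 1, 0], [1, 1, 0, 0], [1, 0, 1, 1], [0, 0, 0, 0], [1, 1, 1, 1]]
    print(round(kr20(scores), 2))   # 0.7 - closer to 1.0 means more consistent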
Statistics
Measures of Central Tendency/Location/Point (see the sketch after this list)
1. Mean – average; only one mean; most affected by outliers
2. Median – middlemost value; only one median; less affected by outliers
3. Mode – most frequent value; there can be more than one mode; not affected by
outliers
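
A minimal sketch using Python’s statistics module on hypothetical exam
scores, showing how an outlier pulls the mean but not the median or mode:

    import statistics

    scores = [70, 75, 75, 80, 98]      # 98 is an outlier
    print(statistics.mean(scores))     # 79.6 - pulled upward by the outlier
    print(statistics.median(scores))   # 75 - less affected
    print(statistics.mode(scores))     # 75 - most frequent; unaffected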

Measures of Variability
 Quartile – divides ranked scores into four equal parts

Stanine
 Standard nine – a nine-point scale that groups scores into nine bands (see
the sketch below)
Measures of Shape
 Skewness and kurtosis of the score distribution
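
A minimal sketch of quartiles and stanine assignment, assuming the standard
stanine bands (bottom 4%, then 7%, 12%, 17%, 20%, 17%, 12%, 7%, top 4%); the
scores are hypothetical and statistics.quantiles requires Python 3.8+:

    import statistics

    scores = [55, 60, 62, 68, 70, 73, 75, 78, 80, 85, 88, 90]
    q1, q2, q3 = statistics.quantiles(scores, n=4)  # quartile cut points
    print(q1, q2, q3)

    def stanine(percentile):
        # Cumulative percentage cutoffs for stanines 1-8; above 96 is 9.
        cutoffs = [4, 11, 23, 40, 60, 77, 89, 96]
        for s, cut in enumerate(cutoffs, start=1):
            if percentile <= cut:
                return s
        return 9

    print(stanine(50))   # 5 - an average percentile falls in the middle stanine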
