
CHAPTER 1: BASIC CONCEPTS AND PRINCIPLES OF 21ST CENTURY ASSESSMENT

As teachers become more familiar with data-driven instruction, they are making decisions about what and how they teach based on the information gathered from their students. They first find out what their students know and what they do not know, and then determine how best to bridge that gap.

MEASUREMENT, ASSESSMENT AND EVALUATION

 MEASUREMENT, as it applies to education, simply means determining the attributes or dimensions of an object, skill or knowledge. It is the process by which information about the attributes or characteristics of things is determined and differentiated, and it answers the question: how much does a student learn or know?
 ASSESSMENT is an ongoing process aimed at understanding and improving student learning. It involves gathering and organizing quantitative and qualitative indicators or evidences of learning as a basis for decision making.
 EVALUATION is a process of summing up the results of measurements and assessments of an educational characteristic, giving them some meaning based on a value judgment about the educational outcome.

Testing is a technique of obtaining the information needed for evaluation purposes. It is a method used to measure the level of achievement or performance of the learners. Tests, on the other hand, are the devices used to obtain the needed information.

PURPOSES OF ASSESSMENT
Assessment is used for various purposes.

VARIOUS ROLES OF ASSESSMENT
The primary role of assessment is to enhance teaching and improve student learning. It plays a major role in how students learn, their motivation to learn, and how teachers teach.

Summative Role
 Assessment tries to determine the extent to which the learning objectives for a course are met and why.
Diagnostic Role
 Assessment is used in determining the gaps in learning or learning processes, hopefully to be able to bridge these gaps. It is usually given at the beginning of instruction. It aims to identify the strengths and weaknesses of the students regarding the topics to be discussed.
Formative Role
 Assessment is used to monitor the learning progress of the students during instruction. It guides the teacher in day-to-day teaching activity; it allows immediate feedback, identifies learning errors, and helps modify instruction to improve both learning and teaching.
Placement Role
 Assessment plays a vital role in determining the appropriate placement of students both in terms of achievement and aptitude. Aptitude refers to the area or discipline where a student would most likely excel or do well.
ASSESSMENT FOR LEARNING
It is an ongoing process that monitors student learning in order to help teachers improve their teaching and students improve their learning. It continuously informs instruction and helps students manage their own learning.
Some examples are:
 concept maps
 progress/monitoring reports
 checklists/surveys
 interviews
 observations
 quizzes, homework
 worksheets

ASSESSMENT AS LEARNING
It is also an ongoing process that helps students to self-reflect, monitor their own learning, and adjust their learning strategies in order to achieve their goals and become more self-directed, metacognitive, independent, successful learners.
Some examples are:
 journals, self-assessment, peer-assessment, personal learning logs

ASSESSMENT OF LEARNING
It refers to strategies designed to confirm what students know, demonstrate whether or not they have met curriculum outcomes or the goals of their individualized programs, or to certify proficiency and make decisions about students' future programs or placements.
Some examples are:
 final oral presentations
 standardized tests
 end-of-unit tests or projects
 recitals
 long exams

PRINCIPLES OF HIGH QUALITY ASSESSMENT
High quality assessments provide an academic guide that helps students, parents, guardians, and educators understand how each learner is progressing towards attaining the learning outcomes.

1. Clarity of Learning Targets
 Quality assessment begins with clear and appropriate learning targets. Learning targets, involving knowledge, reasoning, skills, products and effects, need to be stated in behavioral terms which denote something that can be observed through the behavior of the students.

A. Cognitive Targets
Educational psychologist Dr. Benjamin Bloom in 1956 proposed a hierarchy of educational objectives at the cognitive level. This includes the recall and recognition of specific facts, procedural patterns, and concepts that serve in the development of intellectual abilities and skills. There are six major categories of cognitive processes, which are grouped into Lower Order Thinking Skills (LOTS) and Higher Order Thinking Skills (HOTS).

Lower Order Thinking Skills (LOTS)
Knowledge - acquisition of facts and theories
Comprehension - understanding; involves cognition or awareness of interrelationships
Application - transfer of knowledge from one field of study to another, or from one concept to another concept in the same discipline

Higher Order Thinking Skills (HOTS)
Analysis - breaking down a concept or idea into its components and explaining the concept as a composition of these components
Evaluation - valuing and judgment, or putting a "worth" on a concept or principle
Synthesis - putting together the components in order to summarize the concept

Products, Affects, Reasoning, Skills (PARS)
Products - student's ability to create
Affects - student's emotional attainment
Reasoning - student's ability to use their knowledge
Skills - student's ability to demonstrate what they have learned

2. Appropriateness of Assessment Methods
Once the learning targets are clearly stated, appropriate assessment methods can be easily selected. Assessment methods can be categorized according to the nature and characteristics of each method.
Written-Response Instruments
Written-response instruments are used in assessing cognitive learning. They include objective tests and essay tests. An objective test is a kind of test wherein there is only one answer to each item, while an essay test is one wherein the test taker has the freedom to respond to a question based on how he or she feels it should be answered. Generally, there are two types of objective tests: supply type and selection type.

Product Rating Scales
When assessing the quality of a student's product, it is not necessary to see the student make it. It is the finished product that is important.
Expected learning outcome:
The student will write efficient, documented, error-free computer programs that meet the specifications.
Criteria for success:
A maximum of one item is rated as "Below expectations."

Performance Tests
Used to assess how well the students apply their knowledge, skills, and abilities to a given problem.

Oral Questioning
This method involves the teacher probing students to think about what they know regarding the topic. Questions typically allow the teacher to keep the discussion focused on the intended objective and maintain students' involvement. It allows teachers to challenge students to think beyond the basic levels of Bloom's Taxonomy by asking higher level questions.

Observation and Self Reports
Observation and self-reports are useful supplementary assessment methods when used in conjunction with oral questioning and performance tests. These methods can offset the negative impact on the students brought about by their fears and anxieties during oral questioning or when performing actual tasks under observation.

3. Properties of Assessment Methods

a. Validity
 Arguably the most important criterion for the quality of a test. It refers to the extent to which a test's results are representative of the actual knowledge and/or skills we want to measure and whether the test results can be used to draw accurate conclusions about those knowledge and/or skills.

Types of Validity
Content Validity
 It is the extent to which the content of the test matches the instructional objectives. In judging content validity, one should look at both the topics and subject matter covered in the test as well as the learning targets.
Face Validity
 It is established through a review of the items and not through the use of statistical analyses. It is not investigated through formal procedures and is not determined by subject matter experts.
Criterion-Related Validity
 It demonstrates the degree of accuracy of a test by comparing it with another test, measure or procedure which has been demonstrated to be valid. There are two contexts for using criterion validity:
• Concurrent validity is when both measures are current. This approach allows one to show the test is valid by comparing it with an already valid test. It is a statistical method using correlation, rather than a logical method.
• Predictive validity is when one measure is done in the present and one is done at a later date. The later test is known to be valid.
Construct Validity
 It is the degree to which the test scores can be accounted for by certain explanatory constructs in a psychological theory.
Methods of Expressing Validity

1. Correlation Coefficient Method
 The scores of a newly constructed test are correlated with criterion scores. The coefficient of correlation gives the validity index of the test. For this purpose, the Pearson product-moment correlation is the most widely and popularly used (a computational sketch follows this list).
2. Cross-Validation Method
 A trial of the selected items on new groups. In order to evaluate the usefulness of a test, the prediction equation or cut-off score must be derived from one sample of information and validated on a second sample of subjects from the same universe or population.
3. Expectancy Table Method
 The scores of a newly constructed test are evaluated or correlated with the ratings of supervisors. It provides empirical probabilities of the validity index.
4. Item Analysis Method
 A valid test should have proper difficulty value and discriminating power. The difficulty value and discriminating power of test items can be calculated through item analysis. Item analysis is a process by which the difficulty value and discriminating power of the individual items of a test are calculated (see the sketch after this list).
5. Method of Inter-Correlation of Items and Factor Analysis
 Done through highly statistical methods. Methods of inter-correlation and other statistical methods are used to estimate factorial validity.
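To make the first and fourth methods above concrete, here is a minimal Python sketch, with invented scores, of how a validity coefficient and basic item-analysis indices might be computed. The score lists and the upper/lower grouping convention are illustrative assumptions, not values or rules prescribed by this module.

# Illustrative sketch only: computing a criterion-validity coefficient (Pearson r)
# and simple item-analysis indices (difficulty and discrimination) by hand.
from statistics import mean, pstdev

new_test =  [12, 15, 9, 18, 14, 11, 17, 10]    # scores on the newly constructed test
criterion = [50, 62, 40, 75, 58, 45, 70, 42]   # scores on an already-validated measure

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (pstdev(x) * pstdev(y))

print("Validity coefficient:", round(pearson_r(new_test, criterion), 3))

# Item analysis on one dichotomously scored item (1 = correct, 0 = wrong),
# comparing an upper group and a lower group of examinees (a common convention).
upper_group = [1, 1, 1, 0, 1]   # responses of high scorers to the item
lower_group = [1, 0, 0, 0, 1]   # responses of low scorers to the item

p_upper = mean(upper_group)
p_lower = mean(lower_group)
difficulty = (p_upper + p_lower) / 2    # proportion answering the item correctly
discrimination = p_upper - p_lower      # how well the item separates the two groups
print("Difficulty:", difficulty, "Discrimination:", discrimination)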
b. Reliability
 Reliability refers to consistency between two measures of the same thing. It is the degree to which a test will yield similar or comparable results for the same students when administered at different times.

There are four procedures in common use for computing the reliability coefficient of a test (an illustrative sketch follows this list). These are:
1. Measure of Stability or Test-Retest Method
 To estimate reliability by means of the test-retest method, the same test is administered twice to the same group of pupils with a given time interval between the two administrations of the test.
2. Measure of Equivalence or Parallel Forms Method
 Estimating reliability by means of the equivalent-forms or parallel method involves the use of two different but equivalent forms of the test.
3. Split-Half Method
 It is, theoretically, the same as the equivalent-forms method.
4. Method of Rational Equivalence
 Also known as "Kuder-Richardson Reliability" or "Inter-Item Consistency." It is a method based on a single administration. It is based on the consistency of responses to all items. This method makes it possible to compute the inter-correlation of the items of the test and the correlation of each item with all the items of the test.
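As a rough illustration of the split-half and Kuder-Richardson approaches above, here is a minimal Python sketch. The five-item response matrix, the odd/even split, and the choice of KR-20 as the Kuder-Richardson formula are assumptions made for demonstration only.

# Illustrative sketch: split-half reliability (with Spearman-Brown correction)
# and KR-20 internal consistency, computed from a tiny made-up response matrix.
from statistics import mean, pvariance, pstdev

# Rows = students, columns = items scored 1 (correct) or 0 (incorrect).
responses = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
    [1, 1, 0, 0, 1],
    [1, 1, 1, 1, 0],
]

def pearson_r(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (pstdev(x) * pstdev(y))

# Split-half: correlate odd-item and even-item half scores, then step up
# with the Spearman-Brown formula to estimate full-length reliability.
odd_scores  = [sum(row[0::2]) for row in responses]
even_scores = [sum(row[1::2]) for row in responses]
r_half = pearson_r(odd_scores, even_scores)
split_half_reliability = (2 * r_half) / (1 + r_half)

# KR-20: based on item difficulties (p), their complements (1 - p),
# and the variance of total scores from a single administration.
k = len(responses[0])
totals = [sum(row) for row in responses]
var_total = pvariance(totals)
p = [mean(col) for col in zip(*responses)]
kr20 = (k / (k - 1)) * (1 - sum(pi * (1 - pi) for pi in p) / var_total)

print("Split-half (Spearman-Brown):", round(split_half_reliability, 3))
print("KR-20:", round(kr20, 3))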


c. Balance
 Assessment methods should be able to assess all domains of learning and the hierarchy of objectives.
Domains of Learning
 Cognitive
 Affective
 Psychomotor
d. Fairness
 Give equal opportunities to every student. There should be no discrimination of any kind (racial, age, gender, etc.).
e. Authenticity
 Assessment should touch real-life situations and should emphasize practicability.
f. Practicality and Efficiency
 Assessment should save time, money, etc. It should be resourceful.
g. Continuity
 Since assessment is an integral part of the teaching and learning process, it should be continuous. Assessment may be:
h. Ethics in Assessment
 Assessment should not be used to derogate the students. One example of this is the right to confidentiality.
i. Clear Communication
 Assessment results should be communicated to the learners and the people involved.

1. Tests and Their Uses in Educational Assessment

Tests provide teachers with information that can help them in enhancing instruction. They also provide students with information and feedback that aid them in understanding themselves better.

Uses of Tests
1. Instructional Uses
Tests provide teachers with information that is helpful in providing more effective instructional guidance for individual students and the whole class.
2. Administrative Uses
School administrators utilize test results to make decisions about the effectiveness of programs, teachers, schools, and curriculum. Test results provide school administrators with a clear picture of the extent to which the objectives of the school's instructional program are achieved.
3. Guidance Uses
Test results are useful in predicting an individual's success in a field of study and thus aid the students in choosing the appropriate course or program of study to pursue.
4. Industry Uses
Test results are also used in selection, progression and promotion in industry. They are used to determine the best candidate for a specific position or job.

Types of Tests
1. Educational and Psychological Tests
The primary function of an educational test is the measurement of the results or effects of instruction. Examples of educational tests are achievement tests, reading tests, language tests, spelling tests, etc.
2. Mastery Tests and Survey Tests
Mastery tests are achievement tests which measure the degree to which an individual has mastered certain instructional objectives or specific learning outcomes, while survey tests measure a student's general level of achievement regarding a broad range of learning outcomes.
3. Individual Tests and Group Tests
Individual tests are administered to only one person at a time. Many of the tests in these scales require oral responses from the examinee or necessitate the manipulation of materials. Group tests are administered to a group of persons at a time.
4. Speed Tests and Power Tests
In a speed test there is a time limit within which the test taker is required to answer all the items, which are of the same degree of difficulty. Power tests assess the underlying ability of individuals by allowing them sufficient time.
5. Verbal Tests and Non-Verbal Tests
Verbal reasoning tests assess the ability to understand and comprehend written passages. They are designed to measure verbal comprehension, reasoning and logic, all through your understanding of language. Non-verbal reasoning tests involve the ability to understand and analyze information presented visually and to solve problems logically.
6. Informal Tests and Standardized Tests
Informal tests are those that are used to evaluate a student's own performance and progress individually. A standardized test is an assessment instrument whose validity and reliability have been established by thorough empirical investigation and analysis.
7. Criterion-Referenced Tests and Norm-Referenced Tests
Criterion-referenced tests are designed to measure student performance against a fixed set of predetermined criteria or learning standards; they describe what an individual can do without reference to the performance of others. Norm-referenced tests describe the performance of an examinee in terms of the relative position held in a group; they determine how an individual's performance compares with that of others.
8. Supply Tests and Selection Tests
Supply tests are those in which answers are not given in the questions. The students supply their answers in the form of a word, phrase, number, symbol or drawing.
MODULE FOR WEEK 6

1. Completion Item Tests
Completion items require students to associate an incomplete statement with a word or phrase recalled from memory.
a. Completion Drawing Type
b. Completion Statement Type

2. Identification Type
A brief description is presented and the student has to identify what it is.
a. Simple Recall Type
b. Short Explanation Type

3. Selection Types of Objective Tests
In the selection type of objective tests the student chooses the right answer to each question.
a. Arrangement Type
b. Matching Type
c. Multiple Choice Type
 Contains a question, problem or unfinished sentence followed by several responses.

4. Alternate Response Type
 A test wherein there are only two possible answers to the question.

5. Interpretive Exercise
Often used in testing higher cognitive behavior. This kind of test item may involve analysis of maps, figures or charts, or even comprehension of written passages.

6. Essay Tests
Essay exams are designed to test the ability to synthesize information and to organize one's thoughts on paper. Students are free to select, relate and present ideas in their own words.
a. Brief or Restricted Essay Test
b. Extended Essay Test

CHAPTER 2: TYPES OF ASSESSMENT

1. Traditional and Authentic Assessment
 Paper-and-pencil tests or quizzes are the best examples of traditional assessment, which mainly describes and measures student learning outcomes.

A. The Context of the Assessment
 Realistic activity or context
 The task is performance-based.
 The task is cognitively complex.

B. The Role of the Student
 A defense of the answer or product is required.
 The assessment is formative.
 Students collaborate with each other or with the teacher.

C. The Scoring
 The scoring criteria are known or student-developed.
 Multiple indicators or portfolios are used for scoring.
 The performance expectation is mastery.

Authentic assessment has four basic characteristics:
1. The task should be representative of performance in the field.
2. Attention should be paid to teaching and learning the criteria for assessment.
3. Self-assessment should play a great role.
4. When possible, students should present their work publicly and defend it.

1. Authentic assessments are direct measures.
The main purpose of authentic assessment is to be able to use the acquired knowledge and skills in the real world. Forms of assessment tasks must be applied in authentic situations.
2. Authentic assessments capture the constructive nature of learning.
From a constructivist point of view, learners should create knowledge and meaning based on schemata. Thus, assessments cannot just ask students to repeat information they have received.
3. Authentic assessments integrate teaching, learning, and assessment.
In the authentic assessment model, the same authentic task used to measure the students' ability to apply the knowledge or skills is used as a vehicle for student learning.
4. Authentic assessments provide multiple paths to demonstration.
Students may have different ways by which they could demonstrate what they have learned. Similarly, authentic tasks tend to give the students more freedom in how they will demonstrate what they have learned.
2. Formative Evaluation and Summative Evaluation
 Assessment for Learning pertains to the use of formative evaluation to determine and improve students' learning outcomes. On the other hand, Assessment of Learning uses summative evaluation, which provides evidence of students' level of achievement in relation to curricular learning outcomes.
Formative assessment occurs at three (3) points of instruction:
(1) during instruction;
(2) between lessons; and
(3) between units.

3. Norm and Criterion-Referenced Assessment
 Norm-referenced assessment gives us information on what the student can perform by comparing him or her to another student. It describes student performance in the class by comparing it to that of others (a small computational sketch follows this list).

4. Contextualized and Decontextualized Assessment
 In contextualized assessment, the focus is on the students' construction of functioning knowledge and the students' performance in the application of knowledge in the real work context of the discipline area.

5. Analytic and Holistic Assessment
 Analytic assessment refers to a specific approach in the assessment of learning outcomes. In this procedure, students are given feedback on how well they are doing on each important aspect of the specific task expected from them.
 Holistic assessment refers to a global approach in the assessment of a student learning outcome. Sadler (2009) pointed out that in holistic assessment, the teacher or the assessor has to develop complex mental responses to a student's work, and in evaluating the student's work, the assessor provides a grade and supports it with a valid justification for assigning the grade.
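To illustrate the norm-referenced and criterion-referenced interpretations contrasted above, here is a minimal Python sketch. The class scores, the 75% mastery cut-off, and the percentile-rank formula are illustrative assumptions, not values prescribed by this module.

# Illustrative sketch: the same raw score interpreted two ways.
class_scores = [35, 42, 28, 45, 39, 31, 48, 36, 40, 33]   # made-up class results
student_score = 40
max_score = 50

# Criterion-referenced: compare the score against a fixed standard (e.g. 75% mastery).
cutoff = 0.75 * max_score
criterion_result = "met the criterion" if student_score >= cutoff else "did not meet the criterion"

# Norm-referenced: locate the score relative to the rest of the class (percentile rank).
below = sum(1 for s in class_scores if s < student_score)
percentile_rank = 100 * below / len(class_scores)

print(f"Criterion-referenced: {student_score}/{max_score} -> {criterion_result}")
print(f"Norm-referenced: higher than {percentile_rank:.0f}% of classmates")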

CHAPTER 5: NATURE OF PERFORMANCE-BASED ASSESSMENT

1. Meaning and Characteristics

Performance-Based Assessment
 is one in which the teacher observes and makes a judgment about the student's demonstration of a skill or competency in creating a product, constructing a response, or making a presentation (McMillan, 2007).

Hands-on experiences
 allow students to be more critical, motivated and involved when they are allowed to perform on their own.
Types of activities that best exemplify performance-based assessments include writing a research report, solving and conducting experiments and investigations, demonstrations, speeches, skits, role playing, constructing and implementing a seminar plan, or creating a video presentation.
It is stipulated in DepEd Order No. 7, s. 2012 that the highest level of assessment focuses on the performances (products) which the students are expected to produce through authentic performance tasks.
Linn (1995) stated that performance assessments provide a basis for teachers to evaluate both the effectiveness of the process or procedure.

Process-oriented assessments
 provide insights on the students' critical thinking, logic and reasoning skills. These will lead them to independent learning and set goals for future use.

Some performance assessment proponents contend that genuine performance assessments must possess at least three features (Popham, 2011):
 Multiple evaluation criteria. The student's performance must be judged using more than one evaluation criterion.
 Pre-specified quality standards. Each of the evaluative criteria on which a student's performance is to be judged is clearly explicated in advance of judging the quality of the student's performance.
 Judgmental appraisal. Unlike the scoring of selected-response tests, in which electronic computers and scanning machines can, once programmed, carry on without the need of humankind, genuine performance assessments depend on human judgments to determine how acceptable a student's performance really is.
2. Types of Performance Tasks
 The main objective of the performance task is to capture all the learning targets, which shall be aligned with the teaching and learning objectives, activities and assessment. Thus, the focus of performance-based assessment is the final output that must be developed or completed. These could be in the form of problem solving, demonstrations, tasks and other authentic experiences that would influence the thinking processes, skills and products required from performance tasks. Below are some performance-based assessment tasks (Musial, 2009):

2.1 Solving a Problem.
 Critical thinking and problem solving are important skills that need to be sharpened and developed by the learners. Teachers may include activities that make sense of complex authentic problems or issues to be solved by the students. This helps the students become independent thinkers and learners for life, and helps them meet the challenges of the 21st century.

2.2 Completing an Inquiry.
 An inquiry task is one in which the students are asked to collect data in order to develop their understanding about a topic or issue. Examples of inquiries include science investigations, research-based activities, surveys and interviews, or independent studies.

2.3 Determining a Position.
 This task requires students to make a decision or clarify a position. Case analysis and issue-related activities or debates are some examples of this task.

2.4 Demonstration Task.
 This task shows how the students use knowledge and skills to complete well-defined complex tasks.

2.5 Developing Exhibits.
 Exhibits are visual presentations or displays that need little or no explanation from the creators. An exhibit is offered to explain, demonstrate or show something.

2.6 Presentation Task.
 This is a work or task performed in front of an audience. Storytelling, singing and dancing, musical plays or theatrical acting are some presentations which demonstrate presentation tasks.

2.7 Capstone Performances.
 These are tasks that occur at the end of a program of study and enable students to show knowledge and skills in a context that matches the world of practicing professionals. These tasks include the research paper, practice teaching, internship or on-the-job training.

3. Strengths and Limitations
Advantages of performance assessments over other assessments:

3.1 Performance assessment clearly identifies and clarifies learning targets.
 Authentic performance tasks such as real-world challenges and situations can closely match the various complex learning targets. This offers a direct way to assess what the students know and can do within a variety of realistic contexts.

3.2 Performance assessment allows students to exhibit their own skills, talents, and expertise.
 Tasks show the integration of the student's skills, knowledge and abilities, and provide challenge and opportunities to exhibit their best creation.

3.3 Performance assessment advocates the constructivist principle of learning.
 Students are more engaged in active learning and given more opportunities to demonstrate their learning in different ways in complex tasks. Students use their previous knowledge to build new knowledge structures and are actively involved in exploration and inquiry through different tasks.

3.4 Performance assessment uses a variety of approaches to student evaluation.
 This offers students a variety of ways of expressing their learning and increases the validity of student evaluation. Teachers may share the criteria of assessment before the actual evaluation so that students can use these criteria as well.
3.5 Performance assessment allows the teachers to explore the main goal and processes of the teaching and learning process.
 Teachers may reflect on and revisit learning targets, curriculum and instructional practices, and standards as they utilize performance-based assessment. They may use a variety of teaching strategies and techniques, and explore how students will use the instructional materials and resources given to them.

Though performance assessments offer several advantages over traditional objective assessment procedures, they have some distinct limitations as well:

1. Development of high quality performance assessment is a tedious process.
 Performance assessment needs careful planning and implementation. It is very time consuming to construct good tasks. Teachers have to make sure that the performance tasks expected from the students are authentic and match the outcome to be assessed, and not other qualities that are not part of the outcomes to be assessed. Quality scoring rubrics are difficult to create as well.
2. Performance assessment requires a considerable amount of time to administer.
 Paper-and-pencil tests take 15 to 20 minutes per task to complete, depending on the number of items. Most authentic tasks take a number of days to complete. Most of the time, performance assessment is administered to small groups of students, unlike traditional testing, which is administered simultaneously to an entire class.
3. Performance assessment takes a great deal of time to score.
 The more complex the process and performance, the more time you can expect to spend on scoring. To reduce the scoring time, crafting high quality rubrics is recommended.
4. Performance task scores may have lower reliability.
 This results in inconsistency of scoring by teachers who interpret observations quite differently. With complex tasks, multiple correct answers, and fast-paced performances, scoring depends on teachers' own scoring competence.
5. Performance task completion may be discouraging to less able students.
 Some tasks that require students to sustain their interest for a longer time may discourage disadvantaged students. They may have partial knowledge of the learning target but may fail to complete the task because it does not allow them to utilize this partial knowledge effectively and efficiently.

DEFINING THE PURPOSE OF ASSESSMENT

What is Assessment?
Assessment is the process of gathering and discussing information from multiple and diverse sources in order to develop a deep understanding of what students know, understand, and can do with their knowledge as a result of their educational experiences; the process culminates when assessment results are used to improve subsequent learning (Learner-Centered Assessment on College Campuses: Shifting the Focus from Teaching to Learning, by Huba and Freed, 2000).
Assessment is the systematic basis for making inferences about the learning and development of students. It is the process of defining, selecting, designing, collecting, analyzing, interpreting, and using information to increase students' learning and development (Assessing Student Learning and Development: A Guide to the Principles, Goals, and Methods of Determining College Outcomes, by Erwin, 1991).

Purpose of assessment
1. Assessment drives instruction
A pre-test or needs assessment informs instructors what students know and do not know at the outset, setting the direction of a course.
2. Assessment drives learning
What and how students learn depends to a major extent on how they think they will be assessed.
3. Assessment informs students of their progress
Effective assessment provides students with a sense of what they know and don't know about a subject.
4. Assessment informs teaching practice
Reflection on student accomplishments offers instructors insights on the effectiveness of their teaching strategies.
5. Role of grading in assessment
Grades should be a reflection of what a student has learned as defined in the student learning outcomes.

In order to administer any good assessment, you must have a clearly defined purpose. Thus, you must ask yourself several important questions:
- What concept, skill, or knowledge am I trying to assess?
- What should my students know?
- At what level should my students be performing?
- What type of knowledge is being assessed: reasoning, memory, or process (Stiggins, 1994)?
After asking these four questions, the following steps need to be carried out:
 Choosing the Activity
After you define the purpose of the assessment, you can make decisions concerning the activity. There are some things that you must take into account before you choose the activity: time constraints, availability of resources in the classroom, and how much data is necessary (this consideration is frequently referred to as sampling).
 Defining the Criteria
After you have determined the activity as well as what tasks will be included in the activity, you need to define which elements of the project or task you shall use to determine the success of the student's performance. Sometimes, you may be able to find these criteria in local and state curriculums or other published documents (Airasian, 1991).
 Assessing the Performance
Using this information, you can give feedback on a student's performance either in the form of a narrative report or a grade. There are several different ways to record the results of every type of assessment (Airasian, 1991; Stiggins, 1994); a small illustrative sketch follows this list:
1. "Checklist Approach" -- When you use this, you only have to indicate whether or not certain elements are present in the performances.
2. "Narrative/Anecdotal Approach" -- When teachers use this, they will write narrative reports of what was done during each of the performances.
3. "Rating Scale Approach" -- When teachers use this, they indicate to what degree the standards were met.
4. "Memory Approach" -- When teachers use this, they observe the students performing the tasks without taking any notes.
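As a small illustration of the difference between the checklist and rating scale approaches to recording results, here is a minimal Python sketch. The criteria, the 1-to-4 scale, and the marks are invented for demonstration only.

# Illustrative sketch: recording one student's performance two ways.

# Checklist approach: mark only whether each element is present (True) or absent (False).
checklist = {
    "States the problem clearly": True,
    "Uses appropriate data": True,
    "Explains the solution steps": False,
    "Gives a correct conclusion": True,
}

# Rating scale approach: indicate the degree to which each standard was met (1 = poor, 4 = excellent).
rating_scale = {
    "States the problem clearly": 4,
    "Uses appropriate data": 3,
    "Explains the solution steps": 2,
    "Gives a correct conclusion": 3,
}

present = sum(checklist.values())
print(f"Checklist: {present} of {len(checklist)} elements present")
print(f"Rating scale: average rating {sum(rating_scale.values()) / len(rating_scale):.2f} out of 4")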
Topic: Identifying Performance Tasks
Performance task
- any learning activity or assessment that asks students to perform to demonstrate their knowledge, understanding and proficiency.
Identifying performance tasks measures the learning target that you are about to assess. Some targets imply that the tasks should be structured; others require unstructured tasks. Below are some questions that should be answered in identifying tasks (Nitko, 2011):
A task description must be prepared to provide the listing of specifications of the tasks and to elicit the desired performance of the students. A task description (McMillan, 2007) should include the following:
1. Content and skill targets to be assessed
2. Description of the student activity
3. Group or individual
4. Help allowed
5. Resources needed
6. Teacher role
7. Administrative process
8. Scoring procedures

SUGGESTIONS FOR CONSTRUCTING PERFORMANCE TASKS
The development of high quality performance assessment that effectively measures complex learning outcomes requires attention to task development and to the ways performances are rated. Linn (1995) suggested ways to improve the development of tasks.
1. Focus on learning outcomes that require complex cognitive skills and student performances.
- Tasks need to be developed or selected in light of learning outcomes.
2. Select or develop tasks that represent both the content and the skills that are central to important learning outcomes.
- It is important to specify the range of content and resources students can use in performing the task.
3. Minimize the dependence of task performance on skills that are irrelevant to the intended purpose of the assessment task.
- The key here is to focus the attention of the assessment.

Process-Oriented Performance Task on Problem Solving and Decision-Making
Process-Oriented Performance Task
- is assessment concerned with the actual task performance rather than the output or product of the activity.

Performance Task for Product-Oriented Performance-Based Assessment
Product-Oriented Assessment
- is a kind of assessment wherein the assessor views and scores the final product made and not the actual performance of making that product.

Topic: Developing Scoring Schemes
Ways of assessing the student's performance
 anecdotal records
 interviews
 direct observations using a checklist or Likert scale
 the use of rubrics, especially for performance-based assessment

Rubrics as an Assessment Tool
What is a Rubric?
• A scoring tool that lays out specific expectations for an assignment (Levy, 2005)
• The scoring procedures for judging students' responses to performance tests (Popham, 2011)
• A set of rules specifying the criteria used to find out what the students know and are able to do (Musial, 2009)

A rubric has three important features:
• Evaluative criteria. These are the factors to be used in determining the quality of a student's response.
• Descriptions of qualitative differences for the evaluative criteria. For each evaluative criterion, a description must be supplied so that qualitative distinctions in students' responses can be made using the criterion.
• An indication of whether a holistic or analytic scoring approach is to be used. The rubric must indicate whether the evaluative criteria are to be applied collectively in the form of holistic scoring or on a criterion-by-criterion basis in the form of analytic scoring.

2 major types of rubrics:
Analytic Rubric - it requires the teacher to list and identify the major knowledge and skills which are critical in the development of process or product tasks.
Holistic Rubric - it requires the teacher to make a judgment about the overall quality of each student response.

Developing Scoring Schemes
Developing Rubric components: 4 parts (a small scoring sketch follows this subsection)
1) A task description - the outcome being assessed
2) Scale / The characteristics to be rated (rows) - skills, knowledge, behavior to be demonstrated
3) Dimensions / Levels of mastery (columns) - advanced, intermediate high, intermediate low, novice, or other descriptions (1, 2, 3, 4, etc.); aim for an even number
4) A description of each characteristic at each level of mastery (cells)

TASK DESCRIPTION
Task description involves the performance of the students. Tasks can be taken from assignments, presentations, and other classroom activities. Usually, task descriptions are set in defining performance tasks.
DIMENSIONS
A set of criteria which serves as a basis for evaluating student outputs or performances.
DESCRIPTION OF DIMENSIONS
Dimensions should contain a description of each level of performance as a standard of excellence, accompanied by an example. This allows teachers and students to identify the level of expectation and which dimensions must be given emphasis.
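Tying the rubric components above together, here is a minimal Python sketch of analytic scoring against a rubric. The dimensions, the four-level scale, and the ratings are invented purely for illustration; a holistic rubric would instead assign a single overall level.

# Illustrative sketch: analytic scoring with a simple four-level rubric.
# Each dimension (row) is rated on the same scale (columns); the cell-level
# descriptions are abbreviated here to keep the example short.
scale = {1: "novice", 2: "intermediate low", 3: "intermediate high", 4: "advanced"}

rubric = {
    "Content accuracy": "Facts and concepts are correct and relevant.",
    "Organization": "Ideas follow a clear and logical structure.",
    "Use of evidence": "Claims are supported with appropriate data.",
    "Mechanics": "Grammar, spelling and format are correct.",
}

# Analytic scoring: one rating per dimension, then an overall summary.
student_ratings = {
    "Content accuracy": 4,
    "Organization": 3,
    "Use of evidence": 2,
    "Mechanics": 3,
}

for dimension, rating in student_ratings.items():
    print(f"{dimension}: {rating} ({scale[rating]}) - {rubric[dimension]}")

total = sum(student_ratings.values())
print(f"Total: {total} of {4 * len(rubric)} points")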
Rating the Performance
OBJECTIVE
✔ The main objective of rating the performance is to be objective and consistent. Be sure also that the scoring system is feasible; in most classroom situations, the teacher is both the observer and the rater.
PERSONAL BIAS
✔ Since performance-based assessment involves professional judgment, some common errors in rating should be avoided (McMillan, 2007).
✔ To have personal biases is to be human. We all hold our own subjective world views and are influenced and shaped by our experiences, beliefs, values, education, family, friends, peers and others.

3 components of Personal Bias
1. Generosity error occurs when the teacher tends to give higher scores;
2. Severity error results when the teacher uses the low end of the scale and underrates student performances; and
3. Central tendency error, in which the students are rated in the middle.

Halo Effect
 Occurs when the teacher's general impression of the student affects the scores given on individual traits or performances.
 Students, on the other hand, can assess their own progress. Student participation need not be limited to the use of assessment instruments. It is also useful to have students help develop the instrument.
 In some practices, students rate themselves and compare their ratings with those of the teacher-in-charge.
 Follow-up conference
 Peer and self-evaluation of outputs enable teachers to better understand curriculum and instructional learning goals and the progress being undertaken towards the achievement of the goals.
