PED06 LM-Prelim PermalinoGD


PED 06 - Assessment in Learning 1

Assessment in
Learning 1

Prepared by:
Prof. GRACE D. PERMALINO
Prof. ANTONIO V. ROMANA
Prof. JOY THERESE L. VILLON
SLSU - CTE

Permalino, Romana, Villon | First Semester, AY 2020-2021

Nature and Roles of Assessment


CHAPTER 1 Concepts and Relevance of Assessment

Let’s get into it!


OVERVIEW
Assessment is said to be at the core of the learning process and is considered a vital element in the curriculum process. It is used to determine students' learning needs, monitor their progress, and examine their performance against identified student learning outcomes. As such, it is implemented at different phases of instruction: before (pre-assessment), during (formative assessment), and after instruction (summative assessment). It is therefore very important for students, teachers, and other individuals involved in education to understand what assessment is all about, why it is needed, and how it is connected to measurement, testing, and evaluation.

OBJECTIVES
At the end of this chapter, the students are expected to:
1. Describe the basic concepts and terminologies in assessment.
2. Explain the types and purposes of measurement, assessment, testing, and evaluation.
3. Discuss the relationship of measurement, testing, evaluation, and assessment.

Let's conceptualize!

DISCUSSION
Before examining assessment in depth, it is imperative to have basic knowledge of the different concepts related to this topic. At this point, we will differentiate measurement, testing, assessment, and evaluation. People with no background in these terms usually use them interchangeably.

A. MEASUREMENT
Measurement comes from the Old French word "mesure," which means "limit or quantity." Basically, it is a quantitative description of an object's characteristic or attribute.

In education, teachers measure particular elements of learning, such as readiness to learn, recall of facts, demonstration of specific skills, or the ability to analyze and solve applied problems. They use tools or instruments such as tests, oral presentations, written reports, portfolios, and rubrics to obtain pertinent information.

B. TESTING
Testing is a formal, systematic procedure for gathering information (Russell and Airasian, 2012, as cited in Almeida et al., 2016). A test is a form of assessment: a formal and systematic procedure for measuring a learner's knowledge, skills, or abilities, administered under certain conditions. Tests are traditional assessments. They may not be the best way to measure how much students have learned, but they still provide valuable information about student learning and progress.

B.1 Types of Test
There are several typologies of tests. The successful use of a test depends on the purpose and the construct to be measured.

B.1.1 According to Mode of Response
In terms of the way responses are made, a test may be oral, written, or performance-based.

B.1.2 According to Ease of Quantification of Response
A test can be classified as either objective or subjective. An objective test can be corrected and quantified quite easily, and scores can be readily compared; examples are true-or-false, matching-type, and multiple-choice tests. A subjective test elicits varied responses and may have more than one acceptable answer; examples are the restricted essay and the extended-response essay. In terms of scoring, unlike in an objective test, scores are likely to be influenced by the personal opinion or judgment of the person doing the scoring.

B.1.3 According to Mode of Administration
A test can be given individually or in a group. In the case of a student with special needs, an individual test can be given in order to administer the test appropriately.

B.1.4 According to Test Constructor
Classified by constructor, a test may be either standardized or non-standardized. Standardized tests are prepared by specialists who are versed in the principles of assessment. They can be administered to a large group of students or examinees under similar conditions, and scoring procedures and interpretation are consistent. Non-standardized tests, on the other hand, are usually prepared by teachers and are often called teacher-made tests. At times, this kind of test is constructed haphazardly due to limited time and lack of opportunity to pre-test or pilot-test the items.

B.1.5 According to Mode of Interpreting Results
To interpret test results, we can classify them as norm-referenced or criterion-referenced interpretations.


* Tests that yield norm-referenced interpretations are evaluative instruments that measure a student's performance in relation to the performance of a group on the same test. Likewise, they measure a student's performance in comparison to the performance of same-age students on the same assessment. Comparisons are made and the student's relative position is determined; for instance, a student may rank third in a class of fifty.

* Tests that allow criterion-referenced interpretations describe each student's performance against an agreed-upon or pre-established criterion or level of performance. Moreover, they measure a student's performance based on mastery of a specific set of skills: what the student knows and does not know at the time of assessment. The student's performance is NOT compared to other students' performance on the same assessment.

B.1.6 According to Nature of Answer
Some tests are classified according to the construct they measure, such as personality, intelligence, aptitude, achievement, social relationships, etc.

B.2 FUNCTIONS OF TESTING
Not all types of tests are suited to every use. One needs to be aware of the different functions of testing in order to select the type most appropriate to the needs of the one who will use it.

➢ Instructional Functions
One use of tests in education is to provide feedback to the teacher on how well the students have learned a lesson. At the same time, students can assess their own learning and performance.

➢ Administrative Functions
Tests are used in administrative matters to provide a means of determining how to group students according to their level of ability. Some may use them as a means of accreditation, mastery, or certification, and as a mechanism for quality control.

➢ Research and Evaluation
Tests are utilized in studies to determine the effectiveness of an instructional material, of new pedagogical techniques in teaching, and even of technology intended to enhance learning.

➢ Guidance Functions
Tests are utilized in guidance in order to assess students' abilities, interests, and aptitudes so that they can take advantage of educational, vocational, and personal opportunities.

C. EVALUATION
Evaluation comes in after data have been collected from an assessment task. According to Russell and Airasian (2012), as cited in de Guzman et al. (2015), evaluation is the process of judging the


quality of a performance or course of action. As its etymology indicates (from the French word evaluer), evaluation entails finding the value of an educational task. This means that assessment data gathered by the teacher have to be interpreted in order to make sound decisions about students and the teaching-learning process. Evaluation is carried out by both the teacher and the students to uncover how the learning process is developing.

D. ASSESSMENT
Assessment can be defined as the process of gathering data and fashioning them into an interpretable form for decision-making. It involves collecting data with a view to making value judgments about the quality of a person, object, group, or event (Ajuonnma, 2006, as cited in Ajayi, 2018). The term assessment comes from the Latin word assidēre, which means "to sit beside a judge"; this means that assessment is tied up with evaluation. According to Ajayi (2018), assessment is a broad term that includes testing, which is a special form of assessment. Tests are assessments made under contrived circumstances especially so that they may be administered. In other words, all tests are assessments, but not all assessments are tests. We test at the end of a lesson or unit; we assess progress at the end of a school year through testing; and we assess verbal and quantitative skills through assessment instruments. Whether implicit or explicit, assessment is most usefully connected to some goal or objective for which the assessment is designed.

D.1 Purposes of Assessment
There are three interrelated purposes of assessment. Knowledge of these purposes and how they fit in the learning process can result in more effective classroom assessment.

D.1.1 Assessment FOR Learning
This pertains to diagnostic and formative assessment tasks which are used to determine learning needs, monitor the academic progress of students, and guide instruction.

D.1.2 Assessment AS Learning
This employs tasks or activities that provide students with an opportunity to monitor and further their own learning: to think about their personal learning habits and how they can adjust their learning strategies to achieve their goals.

D.1.3 Assessment OF Learning
This is summative and done at the end of a unit, task, process, or period. Its purpose is to provide evidence of a student's level of achievement in relation to curricular outcomes.

D.2 Relevance of Assessment
Assessment is needed for continued improvement and accountability in all aspects of the education system. To make assessment work for everyone, students, teachers, and other players in the education system should have an

understanding of what assessment provides and how it is used to explain the dynamics of student learning. The following gives insights into how relevant assessment is:

Students - they become actively engaged in the learning process and take responsibility for their own learning.

Teachers - it gives information about students' knowledge and performance base.

Parents - it helps identify the needs of their children for appropriate action and intervention.

Administrators - it helps identify the weaknesses and strengths of the program.

Policymakers - it provides information about students' achievements, which in turn reflects the quality of education being provided by the school. Policymakers utilize assessment results to formulate new laws that will improve the educational system.
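The contrast between norm-referenced and criterion-referenced interpretation (B.1.5) can be made concrete with a small computation. The sketch below is illustrative only; the class scores, the 75% mastery cutoff, and the function names are invented for this example, not taken from the module.

```python
# Two ways to interpret the same raw score (cf. B.1.5).
# All names and numbers here are hypothetical, for illustration only.

def percentile_rank(score, group_scores):
    """Norm-referenced view: position of the score relative to the group."""
    below = sum(1 for s in group_scores if s < score)
    return round(100 * below / len(group_scores))

def mastery(score, max_score, cutoff=0.75):
    """Criterion-referenced view: did the student reach a pre-set standard?"""
    return score / max_score >= cutoff

class_scores = [12, 15, 18, 20, 22, 25, 27, 28, 30, 33]  # hypothetical class of 10
student = 27

print(percentile_rank(student, class_scores))   # -> 60 (above 60% of the class)
print(mastery(student, max_score=40))           # -> False (67.5% < 75% cutoff)
```

The same raw score of 27 thus looks respectable against the group but falls short of the fixed criterion, which is exactly the distinction the two interpretations draw.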


CHAPTER 2 Roles of Assessment

Figure 1. Assessment Implementation Cycle: (1) define/refine student learning outcomes based on input from stakeholders; (2) design assessment tools, criteria, and standards directly linked to each outcome; (3) implement assessment tool(s) to gather evidence of student learning; (4) analyze and evaluate the collected data; (5) identify gaps between desired and actual results; (6) document results and outline needed changes in curriculum, instructional materials, or teaching strategies.

Let's get into it!

OVERVIEW
Assessment has an important role in education and a critical role in the teaching process. Through appropriate assessment, teachers can classify and grade their students, give feedback, and structure their teaching accordingly (Tosuncuoglu, 2018). According to de Guzman et al. (2015), assessment is a cyclic procedure, meaning it is a never-ending activity that we must consider in the teaching-learning process. As illustrated in Fig. 1, program-level learning outcomes are developed from research and input from stakeholders. These are aligned with the instructional outcomes and mapped to the courses within the program through curriculum mapping. Course learning outcomes are assessed using appropriate tools and criteria. Assessment data are gathered, analyzed, and interpreted. Gaps are identified between desired learning outcomes and actual results. Data-driven action plans are then developed for program improvement. Changes in assessment tools, course materials, instructional methods, course prerequisites, or learning outcomes are effected. Goals and objectives are reviewed and refined following evaluation findings. This is referred to as the feedback loop, and the cycle begins anew.

OBJECTIVES
At the end of this chapter, the students are expected to:
1. Identify the roles of assessment used in the instructional process.
2. Explain the various roles of assessment to learners, teachers, parents, and other stakeholders.
3. Determine the role of assessment in the given scenarios.


Let's conceptualize!

DISCUSSION
There are four roles of assessment in the instructional process. Miller et al. (2009), as cited in de Guzman et al. (2015), identified these as the functional roles of assessment in classroom instruction.

Roles of Assessment

1. It is used for student placement.
Placement assessment is basically used to determine a learner's entry performance. Done at the beginning of instruction, teachers assess through a readiness pre-test whether students possess the prerequisite skills needed prior to instruction. If prerequisite skills are insufficient, the teacher can provide learning experiences to help students develop those skills. If students are ready, the teacher can proceed with instruction as planned. This is somewhat similar to a diagnostic test or assessment: you are assessing the skills or ability of a student in order to diagnose where s/he might be deficient or proficient.

2. It is used for formative assessment.
Formative assessment mediates the teaching and learning process. It is learner-centered and teacher-directed. Moreover, a formative test is used as feedback to enhance teaching and improve the process of learning. It is an ongoing process; hence, learners regularly receive feedback. How does this work? For instance, a teacher provides comments and suggestions on an essay about COVID-19 submitted by one of the students. After it is returned, the student revises the work based on the teacher's suggestions and comments before it is finally assessed. Results of formative assessments are recorded right away for the purpose of monitoring students' learning progress. However, these are not used as bases for students' marks.

Some scholars suggest that formative assessment has positive effects on students. It:
* Reactivates or consolidates prerequisite skills or knowledge prior to introducing new material.
* Focuses attention on important aspects of the subject.
* Encourages active learning strategies.
* Gives students opportunities to practice skills and to consolidate learning.
* Provides knowledge of outcomes and corrective feedback.
* Helps students monitor their own progress and develop self-evaluation skills.
* Guides the choice of further learning activities to increase performance; and
* Helps students to feel a sense of accomplishment.

3. It is used for diagnostic assessment.
Diagnostic assessment is intended to identify learning difficulties during instruction. A diagnostic test, for instance, can detect commonly held misconceptions in a subject. Contrary to what others believe, diagnostic


tests are not merely given at the start of instruction. They are also used to determine the causes of persistent learning difficulties despite the pedagogical remedies applied by the teacher. Diagnostic assessment is not used as part of a student's mark of achievement.

4. It is used in assessing students’


learning outcomes.
Summative assessment is done at the end of
instruction to determine the extent to which
the students have attained the learning
outcomes. It is used for assigning and
reporting grades or certifying mastery of
concepts and skills. An example of
summative assessment is the written
examination at the end of the school year to
determine who passes or fails.


Principles of High-Quality Assessment


CHAPTER 3 Principles of Assessment

Let's get into it!

OVERVIEW
Assessment principles set out the key aspects of assessment practice that should be reflected in all assessment practice and procedure. They help explain why we have assessment and guide our approach to assessment matters. There can be circumstances in assessment practice where all the principles cannot be applied at the same time. For example, when choosing a particular method of assessment, it may be that the principle of validity in assessment is overridden by the principle of reliability. The sacrifice of one principle for another is acceptable when there is clear justification. Overall, the principles should provide an underpinning and guiding framework that steers assessment practice.

OBJECTIVES
At the end of this chapter, the students are expected to:
1. Identify the principles of assessment.
2. Explain the principles of assessment.
3. Construct a concept map to explain the principles of assessment.

Let's conceptualize!

DISCUSSION
The following list represents a set of principles for good practice in assessment.

1.1 Assessment should be an integral part of the curriculum
The design of assessment should not be separated from the design of the overall curriculum, which comprises aims, learning outcomes, teaching, learning and assessment activities, and which is described in programme specifications. Therefore, assessment strategies for individual modules should not be decided in isolation from other modules that make up the rest of the year, or that build incrementally year on year. Assessment tasks should relate to the learning outcomes of the module/level/programme.


1.2 Assessment should be an integral part of the students' approach to learning
Students' approaches to learning can both influence and inform assessment practices. Therefore, assessment methods should be chosen so that they encourage a deep, rather than a surface, approach to learning and assist the student in identifying appropriate priorities in learning. Wherever possible, students' approaches to learning should help to inform assessment design.

1.3 The purpose of assessment should be clearly understood by staff and students
There are a variety of purposes of assessment, e.g. to monitor learning, to assess competence, to provide a context for learning, and to provide feedback to staff and students. In deciding on the methods and timing of assessment for a module, it is necessary to clarify the purpose(s) for which the assessment is required, and to consider the extent to which the method of assessment is fit for such purpose(s). Students should be prepared for the assessment tasks they face. Rubrics should be published in advance of assessment taking place, and sample questions and materials made available, so that students know what is expected of them. Such assessment literacy increases student confidence in approaching assessment tasks and improves performance. Peer assessment can also add to student understanding of assessment.

1.4 Assessment should be valid
To be valid, assessment tasks should be designed to ensure that they assess the learning outcomes. Where a module entails multiple learning outcomes, it may be necessary to design different assessment tasks to ensure that all outcomes are appropriately assessed.

1.5 Assessment should be reliable
In assessment, consistent standards of tutor assessment and fairness are important goals to aim for. Both are more likely to be achieved if clear task guidance, explicit assessment criteria, and marking schemes are given to staff and students alike. The 'connoisseur' approach to assessment ("I know it when I see it but I can't put it into words") is not acceptable.

1.6 Assessment should balance the formative and summative so as to provide meaningful feedback
Assessment tasks used for formative purposes should be designed to provide meaningful feedback to students which helps them to know how they are doing and how they can improve. An over-reliance on summative assessment at the conclusion of an element of study gives students a grade but provides very little feedback that will help them develop and improve before they reach the end of the module/programme. It is acknowledged that some methods of assessment can balance both a formative and a summative function. Once a task becomes even partly summative, there is a tendency for the student to focus on the mark achieved rather than the feedback itself; on the other hand, it may be felt that incentives are needed to encourage students to participate. It is important to remain aware of this trade-off in designing assessment tasks.

1.7 Criteria for assessment should be transparent
Criteria for assessment should be as clear as possible to tutors, examiners, and students to ensure equity, validity, and reliability. Assessment criteria (grade descriptors) should be published and provided to all students, markers, and examiners, including external examiners.

1.8 Assessment should be incremental and sufficiently demanding
Assessment tasks need to build on what was expected in previous study. Assessment tasks should be designed to challenge students considered capable of undertaking a module/programme to demonstrate the best level of attainment of which they are capable.

1.9 Assessment should be redeemable
Faculties must follow the regulations for the redeeming of failed assessments as detailed in the school Calendar. All students are permitted repeat opportunities during their programme. This is not only just but may help to avoid high drop-out or failure rates. It is recognized that the number of these may be limited by specific PRSB requirements. The form of a repeat or referral may differ from the original assessment, and multiple elements of the original assessment may be replaced where the learning outcomes can be assessed by a single form of assessment in referral or external repeat.

1.10 Assessment should be efficient
Systems of assessment should be managed so as to use academic and support staff time and resources in appropriate ways. However, efficiency in assessment should not override the preceding principles. In cases where there are trade-offs (losing reliability because of practicality issues, for example), these must be made explicit.

1.11 Assessment should be inclusive
Assessment tasks (including referral assessments) should be selected with an awareness that different methods may be appropriate for different learning styles; therefore a variety of methods should be used to ensure that particular students are not disadvantaged. Faculties should be aware of the range of possible variations to assessment methods that may be recommended for students with disabilities/specific learning difficulties. Assessment tasks and documentation setting out marking criteria etc. should be clear enough for students for whom English is not their first language to understand what is expected of them. Where possible, a balance of different modes of assessment should be utilized in the core and compulsory modules that make up a programme. Assessment tasks should be designed to cater for students from a variety of disciplinary backgrounds following interdisciplinary modules or modules from outside of their home discipline.

1.12 Assessment outcomes should be monitored
Students' performance in different types of assessment tasks should be monitored, including monitoring performance by race, disability, gender, and age, to

ensure that assessment is not inadvertently culturally biased or otherwise disadvantaging particular groups.

1.13 Student assessment workload should be appropriate
Each Faculty is expected to publish details of what is expected of students. In arriving at an appropriate workload, Faculties should take account of the following suggestions:
• Spread assessment throughout the semester/year so as to minimize bunching. Several tasks can be set/assessed early on (e.g. literature searches, book reviews) to help ensure that tasks do not all come at the module end.
• Assess a little rather than a lot: focus on exactly what needs to be assessed and design tasks which measure this primarily. Don't measure the same things repeatedly.
• Adhere to deadlines firmly, but avoid fixing assessment dates to coincide across several modules.
• Assessment dates should be pre-planned and published at the beginning of the semester/year. Choose distributed hand-in deadlines appropriately.
In designing curriculum, staff need to ensure they are making sufficient and appropriate demands so that students are able to demonstrate the highest levels of attainment; this needs to be reflected in student workload requirements.

1.14 Assessment should be coordinated
Assessment design should be coordinated at programme level in order to achieve a balance between alternative modes of assessment, and to ensure that all programme learning outcomes are being assessed. A variety of assessment types may be seen as desirable to ensure inclusivity, fairness, and motivation, but unless this is coordinated at programme level there is a danger that the overall pattern of assessment will become unbalanced. For example, insisting on having an unseen written examination on every module may be unnecessary, but removing all examinations may not be wise. Such imbalance may arise when assessment design is only carried out at module level. The balance of modes of assessment across a programme could usefully be considered as part of programme validation and monitoring.

CHAPTER 4 Appropriateness and Alignment of Assessment Methods to Learning Outcomes

Let's get into it!

OVERVIEW
What principles govern assessment of learning? Chappuis et al. (2009), as cited in Almeida et al. (2015), stated that there are five standards of quality assessment to inform sound instructional decisions: (1) clear purpose; (2) clear learning targets; (3) sound assessment design; (4) effective communication of results; and (5) student involvement in the assessment process.

Classroom assessment begins with the question, "Why are you assessing?" The answer to this question gives the purpose of assessment. The next question is, "What do you want to assess?" This pertains to the student learning outcomes: what the teachers would like their students to know and be able to do at the end of a section or a unit. Once targets or outcomes are defined, the question becomes, "How are you going to assess?" This refers to the assessment tools that can measure the learning outcomes. Assessment methods and tools should be parallel to the learning targets or outcomes, to provide learners with opportunities that are rich in breadth and depth and promote deep understanding. In truth, not all assessment methods are applicable to every type of learning outcome, and teachers have to be skillful in the selection of assessment methods and designs. Knowledge of the different levels of assessment is paramount.

OBJECTIVES
At the end of this chapter, the students are expected to:
1. Understand the taxonomy of learning domains.
2. Describe the types of assessment methods.
3. Determine how the learning outcomes will match with the appropriate assessment method.
4. Match the learning outcomes with the assessment methods.

Let's conceptualize!

DISCUSSION
A learning outcome refers to a particular level of knowledge, skills, and values that a student has acquired at the end of a unit or period of study as a result of his/her engagement in a set of appropriate and meaningful learning

experiences. An organized set of learning outcomes helps teachers plan and deliver appropriate instruction and design valid assessment tasks and strategies. Anderson et al. (2005), as cited in Almeida et al. (2015), stated that there are five (5) steps in assessing students' outcomes. These are:
1. Create learning outcomes.
2. Design teaching/assessments to achieve these outcome statements.
3. Implement teaching/assessment activities.
4. Analyze data on individual and aggregate levels.
5. Reassess the process.

Fig. 2 Steps in Student Outcome Assessment: Create → Design → Implement → Analyze → Reassess

However, this chapter centers only on steps 1 and 2. Hence, to comprehend the principle of appropriateness of assessment methods to learning outcomes, we need to revisit the taxonomy of learning domains and look at the different assessment methods.

TAXONOMY OF LEARNING DOMAINS

A. Cognitive Domain (Knowledge-Based)
The levels of cognitive learning were originally devised by Bloom, Engelhart, Furst, Hill, and Krathwohl in 1956 and revised by Anderson, Krathwohl, et al. in 2001 to produce a two-dimensional framework of Knowledge and Cognitive Processes and to account for 21st-century needs by including metacognition. The revised taxonomy is designed to help teachers understand and implement a standards-based curriculum. The cognitive domain involves the development of knowledge and intellectual skills. It answers the question, "What do I want learners to know?" Krathwohl (2002), as stated in Almeida et al. (2015), stressed that the revised Bloom's taxonomy is used not only in classroom instruction and learning activities to achieve the objectives, but also for assessment.

Fig. 3 Cognitive Levels (Anderson, et al., 2001)

LEVEL 1: REMEMBERING

Description: Retrieving relevant knowledge from long-term memory.

Process: Recognizing, Recalling

Action Verbs Describing Learning Outcomes: Define, describe, identify, label, list, match, name, outline, reproduce, select, state

Sample Learning Competencies:
Define the four levels of mental processes in Marzano and Kendall's Cognitive System.
LEVEL 2: UNDERSTANDING

Description: Constructing meaning from instructional messages, including oral, written, and graphic communication.

Process: Interpreting, Exemplifying, Classifying, Summarizing, Inferring, Comparing, Explaining

Action Verbs Describing Learning Outcomes: Convert, describe, distinguish, estimate, extend, generalize, give examples, paraphrase, rewrite, summarize

Sample Learning Competencies:
Explain the purposes of assessment.

LEVEL 3: APPLYING

Description: Carrying out or using a procedure in a given situation.

Process: Executing, Implementing

Action Verbs Describing Learning Outcomes: Apply, change, classify, compute, demonstrate, discover, modify, operate, predict, prepare, relate, show, solve, use

Sample Learning Competencies:
Write a learning objective for each level of the cognitive domain.

LEVEL 4: ANALYZING

Description: Breaking material into its constituent parts and determining how the parts relate to one another and to an overall structure or purpose.

Process: Differentiating, Organizing, Attributing

Action Verbs Describing Learning Outcomes: Analyze, arrange, associate, compare, contrast, infer, organize, solve, support

Sample Learning Competencies:
Compare and contrast the thinking levels in the revised Bloom's Taxonomy.

LEVEL 5: EVALUATING

Description: Making judgments based on criteria and standards.

Process: Checking, Critiquing

Action Verbs Describing Learning Outcomes: Appraise, compare, conclude, contrast, criticize, evaluate, judge, justify, support, verify

Sample Learning Competencies:
Judge the effectiveness of learning outcomes written using Bloom's Taxonomy.

LEVEL 6: CREATING

Description: Putting elements together to form a coherent or functional whole; reorganizing elements into a new pattern or structure.

Process: Generating, Planning, Producing

Action Verbs Describing Learning Outcomes: Classify, construct, create, extend, formulate, generate, synthesize

Sample Learning Competencies:
Design a classification scheme for writing learning outcomes using the levels of the cognitive system developed by Anderson and Krathwohl.

Whatever taxonomy you choose, it should help you categorize learning outcomes, which is crucial in designing and developing assessments.

B. Psychomotor Domain (Skill-Based)

The psychomotor domain focuses on physical and mechanical skills involving coordination of the brain and muscular activity. It answers the question, "What action do I want learners to be able to perform?"
PED 06 - Assessment in Learning 1

Dave (1970) identified five levels of behavior in the psychomotor domain: Imitation, Manipulation, Precision, Articulation, and Naturalization. In her taxonomy, Simpson (1972) laid down seven progressive levels: Perception, Set, Guided Response, Mechanism, Complex Overt Response, Adaptation, and Origination. Meanwhile, Harrow (1972) developed her own taxonomy with six categories organized according to degree of coordination: Reflex Movements, Basic Fundamental Movements, Perceptual Abilities, Physical Abilities, Skilled Movements, and Non-Discursive Communication. The level descriptions below follow a simplified four-level scheme: Observing, Imitating, Practicing, and Adapting.

LEVEL 1: OBSERVING

Description: Active mental attending of a physical event.

Action Verbs Describing Learning Outcomes: Describe, detect, differentiate, distinguish, relate, select

Sample Learning Competencies:
Relate music to a particular dance step.

LEVEL 2: IMITATING

Description: Attempted copying of a physical behavior.

Action Verbs Describing Learning Outcomes: Begin, display, explain, move, proceed, react, show, state, volunteer

Sample Learning Competencies:
Demonstrate a simple dance step.

LEVEL 3: PRACTICING

Description: Trying a specific physical activity over and over.

Action Verbs Describing Learning Outcomes: Bend, calibrate, construct, differentiate, dismantle, fasten, fix, grasp, grind, handle, measure, mix, organize, operate, manipulate, mend

Sample Learning Competencies:
Display several dance steps in sequence.

LEVEL 4: ADAPTING

Description: Fine-tuning; making minor adjustments in the physical activity in order to perfect it.

Action Verbs Describing Learning Outcomes: Arrange, combine, compose, construct, create, design, originate, rearrange, reorganize

Sample Learning Competencies:
Perform a dance showing new combinations of steps.

C. Affective Domain (Values, Attitudes, and Interests)

LEVEL 1: RECEIVING

Description: Being aware of or attending to something in the environment.

Action Verbs Describing Learning Outcomes: Ask, choose, describe, follow, give, hold, identify, locate, name, point to, reply, select, sit erect, use

Sample Learning Competencies:
Listen attentively to the volleyball introduction.

LEVEL 2: RESPONDING

Description: Showing some new behaviors as a result of experience.

Action Verbs Describing Learning Outcomes: Answer, assist, comply, conform, discuss, greet, help, label, perform, practice, present, read, recite, report, select, tell, write

Sample Learning Competencies:
Assist voluntarily in setting up volleyball nets.

LEVEL 3: VALUING

Description: Showing some definite involvement or commitment.

Action Verbs Describing Learning Outcomes: Complete, describe, differentiate, explain, follow, form, initiate, invite, join, justify, propose, read, report, select, share, study, work

Sample Learning Competencies:
Attend optional volleyball matches.

LEVEL 4: ORGANIZING

Description: Integrating a new value into one's general set of values, giving it some ranking among one's general priorities.

Action Verbs Describing Learning Outcomes: Adhere, alter, arrange, combine, compare, complete, defend, explain, generalize, identify, integrate, modify, order, organize, prepare, relate, synthesize

Sample Learning Competencies:
Arrange his/her own volleyball practice.

LEVEL 5: INTERNALIZING VALUES

Description: Characterization by a value or value complex; acting consistently with the new value.

Action Verbs Describing Learning Outcomes: Act, discriminate, display, influence, listen, modify, perform, practice, propose, qualify, question, revise, serve, solve, use, verify

Sample Learning Competencies:
Join intramurals to play volleyball twice a week.

TYPES OF ASSESSMENT METHODS

Assessments are key components of every education system and play a major role in a student's learning journey. They are the bridge between teaching and learning. By measuring students' achievement and skill mastery, assessment helps students learn, teachers improve instruction, administrators decide how to allocate resources, and policymakers evaluate the efficacy of education programs. There are several methods to assess student learning outcomes (Prasanthi & Vas, 2019), and they can be categorized according to the nature and characteristics of each method (Almeida, et al., 2015). McMillan (2007), as cited in Almeida, et al. (2015), stated that there are four major categories of methods used in assessment, namely: selected-response, constructed-response, teacher observation, and student self-assessment. These are similar to a carpenter's tools, and you need to choose which is apt for a given task. It is not wise to stick to one method of assessment. As the saying goes, "If the only tool you have is a hammer, you tend to see every problem as a nail."

A. Selected-Response Format
In the selected-response format, students select from a given set of options to answer a question or a problem. Examples: multiple choice, alternate response (True or False), matching type, etc.

B. Constructed-Response Format
In this type of assessment, students construct or supply their own answers rather than simply recognizing and selecting a given option. Examples: brief constructed-response items (short answer to an open-ended question, labeling a diagram); performance assessment (written reports, portfolio, audio-visual materials); essay; and oral questioning.
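One practical difference between the two formats is that selected-response items can be scored mechanically against an answer key, while constructed responses call for human judgment or a rubric. A toy sketch in Python; the items and key below are hypothetical, purely for illustration:

```python
# Hypothetical answer key for a four-item selected-response quiz.
KEY = {"Q1": "B", "Q2": "D", "Q3": "A", "Q4": "TRUE"}

def score(responses):
    """Count the items where the pupil's choice matches the key."""
    return sum(responses.get(item) == answer for item, answer in KEY.items())

# One pupil's responses: Q2 is wrong, so the score is 3 out of 4.
print(score({"Q1": "B", "Q2": "C", "Q3": "A", "Q4": "TRUE"}))  # 3
```

No such mechanical check exists for an essay or a portfolio, which is why those methods need rubrics and trained raters.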

C. Teacher Observation
This is a form of on-going assessment, usually done in combination with oral questioning. Teachers check their students' understanding based on what they observe.

D. Student Self-Assessment
This is a process where students are given a chance to reflect on and rate their own work and judge how well they have performed in relation to a set of assessment criteria. Students themselves track and evaluate their own progress or performance.

On the other hand, Prasanthi & Vas (2019) presented other types of assessment which teachers can use in their classes.

A. Diagnostic Assessment
This is also called pre-assessment. Before creating any instruction, it is necessary to know what kind of students we are teaching. Assessing students' strengths, weaknesses, knowledge, and skills prior to instruction is called diagnostic assessment. It is used to identify current knowledge and/or misconceptions about a topic. Based on this data, we can plan our own instruction. E.g.: pre- and post-tests, self-assessments, interviews, observations, polling

B. Formative Assessment
This is used during the first attempts at developing instruction. In this method, a student's performance is assessed during instruction, usually at regular intervals throughout the instruction process. It is used to monitor student learning, to provide feedback during the course, and to track growth over time. E.g.: student observations, homework, peer reviews, informal presentations, Think/Pair/Share, Visual Thinking Strategies, quizzes, feedback

C. Summative Assessment
This measures a student's achievement at the end of instruction, in order to determine mastery and performance levels. It can also reveal the long-term benefits to students who attended the course. E.g.: high-stakes tests such as midterm examinations and end-of-term university examinations

Norm-Referenced Assessment
This compares a student's performance against another group or norm of students. It is relative grading. E.g.: SAT test, IQ test

D. Criterion-Referenced Assessment
This measures a student's performance against a goal, specific objective, or predefined performance standards. It checks what students are expected to know and be able to do at a specific stage of their education; it is absolute grading. E.g.: ACT test, the Smarter Balanced Assessment Test (SBAT)

E. Interim/Benchmark Assessment
This evaluates student performance at periodic intervals, frequently at the end of a grading period. It can predict student performance on end-of-year summative assessments. E.g.: attitude scales, interest inventories, critical thinking tests/checklists

F. Confirmative Assessment
This is an extension of summative assessment. When an instruction has been implemented

in a classroom, it is necessary to assess whether it is still a success after a year or not.

G. Ipsative Assessment
This measures the performance of a student against his/her previous performances. This method checks the student's progress to find improvement.

FORMATIVE ASSESSMENT TOOLS

1. Google Forms: Used to create forms with hyperlinks, images, and videos. They are mainly used for surveying, such as pre-/post-course surveys, and for quizzes.

2. Plickers: Plickers is a simple app that lets teachers collect real-time formative assessment data without the need for student devices. It is an assessment tool made for teachers who are looking for a quick and simple way to check student understanding.

3. Edulastic: A powerful formative assessment tool. It is an efficient platform that supports teachers who are expected to assess and track student progress in meeting standards.

4. Poll Everywhere: A real-time polling app that works with mobile, Twitter, or a web browser. It enhances and amplifies classroom discussion, participation, and understanding.

5. Socrative.com: A free web-based service that lets students access prepared activities or on-the-fly questions to give immediate insight into understanding. This powerful and easy-to-use student-response system has the potential to support responsive teaching.

6. Nearpod: Nearpod works in the browser of any device to let you create or upload a slideshow, to which you then add your own questions. This interactive slideshow tool engages students and promotes collaboration.

7. Playposit: It integrates with a wide variety of learning management systems. Its features are both basic and useful; it is an easy tool to learn, and adding interactivity to video is a snap.

8. Classflow: Collaborative, cloud-based lesson-delivery software for interactive whiteboards and interactive displays. Classflow lets teachers build lessons using cards and create content of their own choice.

9. Spiral: Spiral is an interactive learning platform that teachers can use for quick assessment, student collaboration, interactive video, and flipped-classroom activities.

10. Formative: Formative lets you create lessons using any Internet-connected device and is optimized for 1:1, BYOD, flipped, or blended classrooms. It lets you get students' results and respond in real time.

11. Classkick: It allows teachers to create lessons and assignments that students work through on their devices at their own

pace. Teachers can observe student progress in real time and provide immediate feedback. Teachers can upload a PDF and add text, drawings, photos, hyperlinks, and audio recordings to create dynamic lesson content.

MATCHING LEARNING TARGETS WITH ASSESSMENT METHODS

Matching the assessment method with the learning outcomes requires an examination of the evidences of learning needed and the targeted levels of knowledge, understanding, reasoning, skills, product/performance, and affect as manifested in the learning targets.

McMillan (2007), as cited in Almeida, et al. (2015), prepared a scoreboard as a guide on how well a particular assessment method measures each level of learning. The table below depicts the relative strength of each assessment method in measuring different learning targets.

                                         ASSESSMENT METHODS
Targets                             SR&BCR   E    PT   OQ   O    SSA
Knowledge & Simple Understanding      5      4    3    4    3     3
Deep Understanding & Reasoning        2      5    4    4    2     3
Skills                                1      3    5    2    5     3
Products                              1      1    5    2    4     4
Affect                                1      2    4    4    4     5

Note: Higher numbers indicate better matches.

Legend:
SR&BCR = Selected-Response & Brief-Constructed Response
E = Essay
PT = Performance Task
OQ = Oral Questioning
O = Observation
SSA = Student Self-Assessment

The K to 12 Basic Education Curriculum has a balanced assessment program. It utilizes both traditional and authentic assessment tools and techniques to get valid and reliable evidences of student learning.
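Read as data, the scoreboard lends itself to a quick lookup of which methods best fit a given target. A minimal sketch in Python; the ratings are transcribed from the table above, and the function name is our own:

```python
# McMillan's scoreboard encoded as a lookup table (higher = better match).
# Column order follows the legend: SR&BCR, E, PT, OQ, O, SSA.
SCOREBOARD = {
    "Knowledge & Simple Understanding": (5, 4, 3, 4, 3, 3),
    "Deep Understanding & Reasoning":   (2, 5, 4, 4, 2, 3),
    "Skills":                           (1, 3, 5, 2, 5, 3),
    "Products":                         (1, 1, 5, 2, 4, 4),
    "Affect":                           (1, 2, 4, 4, 4, 5),
}
METHODS = ("Selected-Response & Brief-Constructed Response", "Essay",
           "Performance Task", "Oral Questioning", "Observation",
           "Student Self-Assessment")

def best_methods(target):
    """Return the assessment method(s) rated strongest for a learning target."""
    ratings = SCOREBOARD[target]
    top = max(ratings)
    return [m for m, r in zip(METHODS, ratings) if r == top]

print(best_methods("Skills"))   # ['Performance Task', 'Observation']
print(best_methods("Affect"))   # ['Student Self-Assessment']
```

So a skills target is best evidenced by performance tasks or direct observation, matching the two 5-ratings in that row, while affect is best probed through student self-assessment.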

CHAPTER 5 Reliability and Validity

Let's get into it!

OVERVIEW
It is not unusual for teachers to receive complaints or comments from students regarding tests and other assessments. For one, there may be an issue concerning the coverage of the test. Students may have been tested on areas that were not part of the content domain, or they may not have been given the opportunity to study or learn the material. The emphasis of the test may also be too complex or inconsistent with the performance verbs in the learning outcomes.

Validity alone does not ensure high-quality assessment. The reliability of test results should also be checked. Questions on reliability surface if there are inconsistencies in the results when tests are administered over different time periods, sample questions, or groups.

Both validity and reliability are considered when gathering information or evidences about student achievement.

OBJECTIVES
At the end of this chapter the students are expected to:
1. Determine the distinction between validity and reliability.
2. Cite evidences of validity and reliability in teacher-made tests.
3. Create a recommendation based on the principles of validity and reliability.

Let's conceptualize!

DISCUSSION
A. What is Validity?
Validity comes from the Latin word validus, which means strong. In the context of assessment, a test is deemed valid if it measures what it is supposed to measure. We can say a test is valid if it measures a student's actual knowledge and performance with respect to the intended outcomes and not something else. For instance, an assessment purportedly measuring the arithmetic skills of Grade 4 pupils is invalid if used for Grade 1 pupils because

of issues in content and level of performance.

Two types of validity evidence will be presented here: content-related evidence and criterion-related evidence.

1. Content-Related Evidence
This pertains to the extent to which the test covers the entire domain of content. If a summative test covers a unit with four topics, then the assessment should draw items from each topic.

On the other hand, a test that appears to adequately measure the learning outcomes and content is said to possess face validity. As the name suggests, face validity looks at the superficial, face value of the instrument. It is based on the subjective opinion of the one reviewing it; hence, it is considered non-systematic or non-scientific. A test that was prepared to assess the ability of pupils to construct simple sentences with correct subject-verb agreement has face validity if the test looks like an adequate measure of that skill.

Another consideration related to content validity is instructional validity: the extent to which an assessment is systematically sensitive to the nature of instruction offered. This is closely related to instructional sensitivity, which Popham (2006), as cited in Almeida, et al. (2015), defined as the "degree to which students' performances on a test accurately reflect the quality of instruction to promote students' mastery of what is being assessed." Let us consider the Grade 10 curriculum in Araling Panlipunan. In the first grading period, the class will cover three economic issues: unemployment, globalization, and sustainable development. Only two were discussed in class, but the assessment covered all three issues. Although these were all identified in the curriculum guide and may even be found in a textbook, the question remains as to whether the topics were all taught or not. The inclusion of items that were not taken up in class reduces validity because students had no opportunity to learn the knowledge or skill being assessed.

It is recommended that teachers construct a Table of Specifications (ToS) in order to improve the validity of assessments. A ToS is a blueprint used to identify the content area and describe the learning outcomes at each level of the cognitive domain.

2. Criterion-Related Evidence
Criterion-related evidence for validity refers to the degree to which test scores agree with an external criterion. As such, it is related to external validity. It examines the relationship between an assessment and another measure of the same trait.

Two Types of Criterion-Related Evidence

1. Concurrent Validity
This provides an estimate of a student's current performance in relation to a previously validated or established measure. For instance, suppose a school has developed a new intelligence quotient (IQ) test. Results from this test are statistically correlated with the results from a standard IQ test. If the statistical analysis reveals a strong correlation between the two sets of scores, then there is high criterion validity. It is important to mention that data from the two measures are obtained at about the same time.
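The IQ-test example above boils down to correlating two lists of scores. A plain-Python sketch of the Pearson correlation; the scores below are hypothetical, purely for illustration:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

new_iq      = [98, 105, 112, 91, 120, 101, 108, 95]   # new school-made test
standard_iq = [100, 104, 115, 94, 118, 99, 110, 97]   # established test, same week
print(f"r = {pearson_r(new_iq, standard_iq):.2f}")  # r = 0.97
```

A coefficient near 1 (here about 0.97) would be taken as strong evidence of concurrent validity. The same computation serves predictive validity, except that the criterion measure (e.g., first-year GPA) is collected later rather than at about the same time.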

permalino_romana_villon_2019
2. Predictive Validity
This pertains to the power or usefulness of test scores to predict future performance. For instance, can scores on an entrance examination be used to predict college success? If there is a significantly high correlation between entrance examination scores and first-year grade point average (GPA) assessed later, then there is predictive validity.

B. What is Reliability?
Reliability refers to reproducibility and consistency in methods and criteria. An assessment is said to be reliable if it produces the same results when given to an examinee on two occasions. It is important to stress that reliability is unlikely to reach 100% because no two testing occasions are free of differences. Environmental factors like lighting and noise affect reliability; student error and the physical well-being of examinees also affect the consistency of assessment results.

For a test to be valid, it has to be reliable. Let us look at an analogous situation. Suppose a weighing scale is off by 6 pounds, and you weighed a dumbbell on it for seven consecutive days. The scale revealed the same measurement every day; hence, the results are reliable. However, the scale did not provide an accurate measure and is therefore not valid.

Types of Reliability
1. Internal Reliability
This assesses the consistency of results across items within a test.
2. External Reliability
This gauges the extent to which a measure varies from one use to another.

Sources of Reliability Evidence
There are five classes of reliability evidence, namely: stability, equivalence, internal consistency, scorer or rater consistency, and decision consistency.

A. Stability
Stability is established through the test-retest method, which correlates scores obtained from two administrations of the same test over a period of time. It assumes that there is no considerable change in the construct between the first and second testing. Typically, test-retest reliability coefficients for standardized achievement and aptitude tests are between 0.80 and 0.90 when the interval between testings is 3 to 6 months (Nitko & Brookhart, 2011 as cited in Almeida, et al., 2015).

B. Equivalence
Parallel-forms reliability ascertains the equivalence of forms. In this method, two different versions of an assessment tool are administered to the same group of individuals. The items are parallel, i.e., they probe the same construct, base knowledge, or skill. The two sets of scores are then correlated in order to evaluate the consistency of the results across alternate versions. Equivalent forms are ideal for make-up tests or action research that uses pre- and post-tests. An equivalent test is not just a matter of rearranging the items: new and different items must be devised that still measure the same construct. This is where the difficulty lies. For specific skills tests, like addition of signed numbers, it would be relatively easy; for complex or subjective constructs, it would require time and effort to prepare. Moreover, it is rather impractical to ask students to answer two forms of a test.

C. Internal Consistency
Internal consistency implies that a student who has mastered the lesson will get all or most of the items correct, while a student who knows little or nothing about the subject matter will get all or most of the items wrong. To check internal consistency, the split-half method can be used. This method is done by dividing the test into two (separating the first half and the second half of the test, or splitting by odd and even item numbers) and then correlating the results of the two halves. The Spearman-Brown formula is then applied; it is a statistical correction that estimates the reliability of the whole test rather than of each half. There are other ways to establish internal consistency, like Cronbach's alpha.

D. Scorer or Rater Consistency
People do not rate in the same way. They may disagree as to how well responses or materials truly reflect or demonstrate knowledge of the construct or skill being assessed. Moreover, certain characteristics of the raters contribute to errors like bias, the halo effect, mood, and fatigue, among others.

Just as several test items can improve the reliability of a standardized test, having multiple raters can increase reliability. Inter-rater reliability is the degree to which different raters, observers, or judges agree in their assessment decisions. To mitigate rating errors, the wise selection and training of good judges and the use of applicable statistical techniques are suggested. To estimate inter-rater reliability, Spearman's rho or Cohen's kappa may be used to quantify the agreement among the ratings: the first is used for ordinal data, while the other is for nominal and discrete data.

E. Decision Consistency
Decision consistency describes how consistent the classification decisions are, rather than how consistent the scores are. It is seen in situations where teachers decide who will receive a passing or failing mark, or who is considered to possess mastery or not. This can be examined using the levels of proficiency adopted in the K to 12 program.

References:
de Guzman, E. S., et al. (2015). Assessment of Learning 1. Adriana Publishing Co., Inc., Quezon City, Manila, Philippines.

Ajayi, Victor. (2018). Difference Between Assessment, Measurement and Evaluation in Science Education. Retrieved from https://www.researchgate.net/publication on July 28, 2020.

Norm & Criterion-Referenced Interpretations. Retrieved from http://files.hbe.com.au/flyerlibrary/Brigance/Criterion-referenced%20vs.%20Norm-referenced%20Assessment.pdf on July 28, 2020.

https://pluspng.com/img-png/writing-a-test-png-evaluation-examination-examiner-learner-student-test-writing-icon-389.png
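To close the chapter, the split-half procedure with the Spearman-Brown correction, and Cohen's kappa for two raters' pass/fail decisions, can be sketched in plain Python. All scores and ratings below are hypothetical, purely for illustration:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between paired score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

def spearman_brown(half_r):
    """Step a half-test correlation up to an estimate for the full test."""
    return 2 * half_r / (1 + half_r)

def cohen_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    n = len(rater_a)
    p_obs = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                for c in set(rater_a) | set(rater_b))
    return (p_obs - p_exp) / (1 - p_exp)

# Split-half: odd-item and even-item subtotals for eight pupils.
odd_half  = [10, 14, 9, 16, 12, 8, 15, 11]
even_half = [11, 13, 10, 15, 11, 9, 16, 12]
full_test_r = spearman_brown(pearson_r(odd_half, even_half))
print(f"split-half reliability (corrected) = {full_test_r:.2f}")  # 0.97

# Inter-rater: two teachers' pass/fail decisions on six essays.
teacher_1 = ["pass", "pass", "fail", "pass", "fail", "pass"]
teacher_2 = ["pass", "fail", "fail", "pass", "fail", "pass"]
print(f"Cohen's kappa = {cohen_kappa(teacher_1, teacher_2):.2f}")  # 0.67
```

Note how the Spearman-Brown step raises the half-test correlation, reflecting the principle stated above that longer tests (more items) tend to be more reliable, and how kappa discounts the agreement the two teachers would reach by chance alone.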
