FSU - Systematic Instructional Design Guide

This study guide is a companion to The Systematic Design of Instruction by Walter Dick, Lou Carey, and James O. Carey.

The Instructional Triad: goals & objectives, assessment, and instruction.

Instructional Design
Reiser & Dempsey (2007) define instructional design as a "systematic process that is employed to develop education and training programs in a consistent and reliable fashion" (pg. 11).
Reiser, R.A. & Dempsey, J.V. (2007). Trends and issues in instructional design (2nd ed.). Upper Saddle River, NJ: Pearson Education, Inc.

Instructional Design Models
Instructional design models (or methods) are guidelines (read: blueprints) that instructional designers follow when creating instruction.
- ADDIE instructional design model
- Systematic Design of Instruction Model (Dick & Carey instructional design model)
- Reeves multimedia design model

Systematic Design of Instruction Model (Dick & Carey Instructional Design Model)
- Based on a systems approach for designing instruction
- Based on a behaviorist perspective of learning

Define Instructional Goals (a.k.a. Front-end analysis)
Refer to figure 2.1 p17.
1) Conduct performance analysis
2) Conduct needs assessment / analysis
3) Conduct job analysis
4) Identify the instructional goal
   a) The instructional goal is written as a goal statement.
   b) The goal statement is the broad, general purpose of instruction. It is not measurable, because it does not clarify exactly what a learner must do or how a learner should perform. A goal may be defined as a general statement of desired accomplishment. It does not specify all of the components or steps, or how each step will be achieved on the road to accomplishing the goal. Example goals: (1) Students will master the procedure of setting up a multiple camera production system. (2) Students will understand the concept of basic networking.
5) Classify the goal
   Classify each goal into one of the domains (identify the type of learning outcome specified in the goal statement). The type of assessment and instructional strategy will vary depending upon the learning outcome to be taught. Five types of learning outcomes:
   i) Intellectual skill goals (Cognitive domain)
      (1) Concept – identify examples of concepts. E.g. identify the architectural style of buildings.
      (2) Rule – apply a rule to solve a problem. E.g. compute averages.
      (3) Problem solving – select and apply a variety of rules to solve problems. E.g. write a business letter.
   ii) Psychomotor skill goals (Psychomotor domain) – goals that require basic motor skills and/or physical movement.
   iii) Attitudinal goals (Affective domain) – goals pertaining to attitudes, appreciations, values, and emotions.
   iv) Verbal information goals (Cognitive domain) – facts and knowledge. E.g. name the capitals of various countries.
   v) Cognitive strategy goals (Cognitive domain) – employ a learning strategy. E.g. use a memorization technique.
SYSTEMATIC INSTRUCTIONAL DESIGN
ii) Characteristics directly related to task analysis
   - Already perform the terminal objective?
   - Entry skills
   - Prior knowledge of the topic area
2) Context analysis
The purpose of a context analysis is to identify and describe the environmental factors that inform the design of instruction.
   a) Performance context analysis (analysis of the performance context)
      - Managerial or supervisor support
      - Physical aspects of the site
      - Social aspects of the site
      - Relevance of skills to the workplace
      The major outputs of this phase of the study are 1) a description of the physical and organizational environment where the skills will be used, and 2) a list of any special factors that may facilitate or interfere with the learners' use of the new skills.
   b) Learning context analysis (analysis of the learning context)
      - Compatibility of the site with instructional requirements
      - Adaptability of the site to simulate the workplace
      - Adaptability for delivery approaches
      - Learning site constraints affecting design and delivery
      The major outputs are 1) a description of the extent to which the site can be used to deliver training on skills that will be required for transfer to the workplace, and 2) a list of any limitations that may have serious implications for the project.
3) Evaluation and revision of the instructional analysis

Develop Instructional Objectives (a.k.a. Performance Objectives, Learning Objectives, Behavioral Objectives, or simply Objectives; also Instructional Outcomes or Learning Outcomes)

Purpose: At this stage, it is necessary to translate the needs and goals into objectives that are sufficiently specific to guide the instructor in teaching and the learner in studying. In addition, these objectives form the blueprint for testing as a means of evaluating both the instruction and the learning that has occurred. Example: The student will be able to explain the role of the waterfall model in system analysis and design. Assessment and evaluation measure whether or not learning and/or learning objectives are being met.

The instructional goal is rephrased as a terminal objective describing what students will be able to do in the learning context, and subordinate objectives describe the building-block skills (steps and subordinate skills) that students must master before the terminal objective is attained. The terminal objective describes a specific behavior that learners engage in to demonstrate success with a goal. The subordinate objectives may be general or specific, but they are used as a means of attaining the terminal objective; they are objectives that apply during the learning process. Generally, each instructional unit (or lesson) has a corresponding terminal objective.

Two types of instructional objectives:
   a) Terminal objective (only one)
   b) Subordinate objectives (enabling objectives or intermediate objectives; often more than one)

Instructional objectives:
   b) Are:
      - Specific, in that they describe precisely what the learner is expected to do.
      - Outcome based, in that they state what the learner should be able to do after the instruction is complete. Instructional objectives focus on learning outcomes for students, not actions by the teacher.
      - Measurable, in that they describe learning outcomes that can be measured. Note that goal statements are not measurable.
   c) Are written to include (Mager model for instructional objectives):
      - Audience – the learner(s) the objective is written for. Usually "the learner" or "the student".
      - Behavior (action or performance) – identify the behavior the learner will be taking when he/she has achieved the objective. The behavior should be specific and observable (an observable learner outcome).
      - Condition – describe the relevant conditions under which the learner will be acting: the conditions under which the behavior is to be completed, including what tools or assistance is to be provided. Conditions reflect the context that will be available in the learning environment.
      - Standard (criterion or degree) – describe the level of performance that is desirable, including an acceptable range of answers that are allowable as correct. Do not provide ambiguous standards such as "…to the instructor's satisfaction".

Differences in writing conventions: the goal statement describes the context in which the learner will ultimately use the new skill, while the terminal objective describes the conditions for performing the goal at the end of the instruction. Objectives are to include condition (CN), type of learning (intellectual, behavior, …), and criteria (CR).
   i) See p122-125

Writing the objectives:
1) Edit the goal to reflect the eventual performance context if necessary.
2) Write the terminal objective to reflect the context of the learning environment.
   a) In a job/task analysis, the terminal objective is developed from duties.
3) Write objectives for each step in the goal analysis for which there are no substeps shown.
   a) In a job/task analysis, the enabling objectives are derived from tasks.
4) Write an objective for each grouping of substeps under a major step of the goal analysis, or write objectives for each substep.
5) Write objectives for all subordinate skills.
6) Write objectives for entry skills if some students are likely not to possess them.
7) -OR- only write objectives for those items in your task analysis which you plan to assess.
8) SEE Instructional Objective Helper - http://www.cogsim.com/idea/forms/Inst_obj2.htm

>PRODUCE LEARNER / TASK ANALYSIS REPORT
Report consists of:
1) Learner analysis
2) Task analysis

Develop assessment instruments

Purpose: To diagnose whether a learner possesses the necessary prerequisites (entry skills) for learning the new information. To check the results of student learning during the process of instruction and provide documentation of learners' progress (or improvement). To test the terminal objective and relevant subordinate objectives.
1) Identify which steps and/or subordinate skills to assess.
2) Identify which type of test to conduct.
   a) Four types of criterion-referenced (objective-referenced) tests:
      i) Entry skills test
      ii) Pretest - tests entry skills and instructional objectives
      iii) Practice or rehearsal test (formative assessment)
      iv) Posttest (summative assessment) - measures learners' performance on the subordinate and terminal objectives
3) Identify what domain the objective is in.
   a) Organizing your objectives according to learning domain can also aid you in selecting the most appropriate type of assessment item.
      i) Verbal (a.k.a. knowledge); intellectual (concept, rule, or problem solving); attitude; psychomotor
4) Writing test items
You should write an assessment item for each objective whose accomplishment you want to measure.
   a) Read the objective and determine what it wants someone to be able to do (i.e., identify the performance).
   b) Draft a test item that asks students to exhibit that performance.
   c) Read the objective again and note the conditions under which the performance should occur (i.e., tools and equipment provided, people present, key environmental conditions).
   d) Write those conditions into your item.
   e) For conditions you cannot provide, describe approximations that are as close to the objective as you can imagine.
   f) If you feel you must have more than one item to test an objective, it should be because (a) the range of possible conditions is so great that one performance won't tell you that the student can perform under the entire range of conditions, or (b) the performance could be correct by chance. Be sure that each item calls for the performance stated in the objective, under the conditions called for.
5) Criteria for writing test items:
   - Goal-centered criteria
   - Learner-centered criteria
   - Context-centered criteria
   - Assessment-centered criteria
6) Identify the type of behavior stated in the objective and select the associated types of test items. p.140
7) Objective test item or non-objective test item:
   a) Is the test item an objective test item? Completion, short answer, true/false, matching, and multiple choice.
      i) Sequence items
      ii) Writing directions
      iii) Determine how the instrument will be scored
   b) Is the test item a performance, product, or attitude item (not an objective item)? Essay, product development, live performance.
      i) Writing directions
      ii) Developing the instrument
         (1) Develop a response format
            (a) Checklist (yes/no); rating scale (0, 1, 2, 3, …); frequency count (tally)
         (2) Identify the elements to be evaluated
         (3) Paraphrase each element
         (4) Sequence elements in the instrument
         (5) Select the type of judgment to be made by the evaluator
         (6) Determine how the instrument will be scored
8) Design evaluation
The skills, objectives, and assessments must all refer to the same skills, so careful review is required in order to ensure this congruence. Requirements: instructional analysis diagram, performance objectives, summaries of learner characteristics, performance and learning contexts, and assessments. Complete a design evaluation chart.
Congruence assessment (use the design evaluation chart in some of these steps):
   i) Organize and present the materials to illuminate their relationship.
   ii) Judge the congruence between the information and skills in the instructional goal analysis and the materials created.
   iii) Judge the congruence between the materials and the characteristics of the target learners.
   iv) Judge the congruence between the performance and learning contexts and the materials.
   v) Judge the clarity of all the materials.

Select and Develop an instructional strategy (instructional method)

Purpose: To outline how instructional activities will relate to the accomplishment of the objectives.
Determine the delivery system:
a) Review the instructional analysis and identify logical clusters of objectives that will be taught in an appropriate sequence.
b) Plan the learning components that will be used in the instruction.
c) Choose the most effective student groupings for learning.
d) Specify the most effective media and materials that are within the range of cost, convenience, and practicality for the learning context.
e) Assign objectives to lessons and consolidate media selection.
f) Select or develop a delivery system that best accommodates the decisions made in steps 1 through 5.
Select content sequencing: from left to right, subskills first.
3 exceptions noted.
Select clustering size - the size of the unit(s) of instruction.

Develop instructional materials
Except for the pretests and posttests, all learning components are included within the materials. If instruction is intended to be independent of an instructor, then the materials will have to include all the learning components in the strategy; the instructor in this case is not expected to play a role in delivering instruction.

Develop the instructional package:
- Instructional materials
- Assessments
- Course management information
- Instructional activities

These instructional activities are not intended to serve as the one model for formatting your instructional strategy; they should not be used as a model for the specific sequence of instructional activities you design. The instructional strategy is related to the section of a task analysis. Both Dick & Carey and Reiser have their own instructional strategies; they have different titles, but all the steps are there and they are in the same order. The two instructional strategies both follow, with a high degree of similarity, Gagne's Nine Events of Instruction: gain attention; inform learners of objectives; stimulate recall of prior learning; present content; provide learning guidance; elicit performance (practice); provide feedback; assess performance; enhance retention and transfer to the job.

Instructional activities (components) [Dick and Carey]
a) Pre-instructional activities
   i) Gain attention and motivate learners
   ii) Inform learners of objectives
   iii) Stimulate recall of prerequisite skills
b) Content presentation
   i) Present content*
   ii) Provide learning guidance*
c) Learner participation
   i) Provide practice*
   ii) Provide feedback*
d) Assessment (note: usually not considered part of a lesson)
   i) Entry skills test
   ii) Pretest
   iii) Practice tests
   iv) Posttest
e) Follow-through activities (incorporated throughout the lesson)
   i) Provide memory aids
   ii) Promote transfer of learning
*repeated for each objective or cluster of objectives

Instructional Activities (events) [Reiser]
a) Introductory activities
   i) Motivate learners (at the beginning and throughout)
      (1) ARCS Model
         (a) Gain learners' attention.
         (b) Inform learners of the relevance of what they are learning.
         (c) Confidence: provide material at the appropriate level of difficulty.
         (d) Satisfaction: derived through extrinsic or intrinsic rewards.
      (2) Reiser Model
         (a) Relevant to learners' needs and interests
         (b) Entertaining (via stories, anecdotes, graphics, pictures, etc.)
         (c) Interactive (via practice problems, rhetorical questions, etc.)
         (d) Success-producing (present tasks that learners can accomplish, but don't make tasks so easy that learners will get bored)
         (e) Enthusiasm-provoking (use language that will get learners interested; write in the active voice; avoid dull, dry words)
         (f) Rewarding (incorporate praise, encouragement, etc. that is at the appropriate level for the target audience)
   ii) Inform learners of objectives
      (1) Two objectives for each element for which an objective is written.
         (a) One objective is the three-part objective for the designer.
         (b) The second objective is a short form to provide to the learner.
      (2) Do not provide a long list of objectives to the learner.
      (3) Emphasize the relevance of the objectives.
   iii) Help learners recall prerequisite skills and knowledge
      (1) Tell learners what the prerequisite knowledge and skills are.
      (2) Describe how the prerequisites are performed. Recall stories and examples from previous instruction.
      (3) Question them.
      (4) Give learners cues or a few practice problems.
b) The heart of the lesson: information -> example -> practice -> feedback
   i) Present essential information (rules, procedures, facts, etc.)
      (1) Be succinct, concise, precise.
      (2) Highlight key information.
      (3) Branching if necessary. See index for further information.
      (4) Different strategies:
         (a) Break each individual part down and move forward (exposition theory).
         (b) Give them the big picture and then move from there.
   ii) Present worked examples*
      (1) Properly executed (correctly completed) samples of the desired skill that are shown to learners so they can see how to perform the skill correctly.
      (2) Provide examples.
      (3) Provide non-examples.
   iii) Provide practice*
      (1) The opportunity students are given to perform a particular behavior prior to the time they are formally assessed.
   iv) Provide feedback*
      (1) Provide frequent feedback.
      (2) Feedback should be arranged so learners have difficulty seeing it prior to trying to respond to practice problems.
      (3) Knowledge of results
      (4) Knowledge of correct response
      (5) Corrective/instructional feedback
c) Concluding activities
   i) Promote transfer (often not a separate activity; built in throughout)
      (1) Provide practice opportunities that simulate "real world" conditions.
      (2) Describe
         (a) "real world" conditions and constraints.
         (b) "real world" applications of the skill.
   ii) Conclude the "lesson" (not mentioned by Dick and Carey)
      (1) Restate the goal/objective of the lesson.
      (2) Summarize key points.
      (3) Congratulate learners for attaining the goal.
      (4) Encourage learners to apply the skills taught.
*for each objective or set of objectives

Bloom's original cognitive domain taxonomy
There are three domains to Bloom's taxonomy (cognitive, affective, psychomotor).
Cognitive domain
A learning task is a question or statement that asks a student to complete a certain task.
- A knowledge learning task asks the student to recall content in the exact form that it was presented. It asks learners to recall, locate specific information, or remember or memorize details.
- A comprehension learning task asks the student to restate material in their own words. It asks learners to explain, demonstrate, and translate understanding.
- An application learning task asks the student to apply rules, concepts, principles, and theories in new situations. It asks the learner to use the information, use methods/concepts/theories in new situations, and interpret facts.
- An analysis learning task asks the student to break down information into parts. It asks the learner to dissect information, identify and distinguish
components, and compare and contrast.
- A synthesis learning task asks the student to put together ideas into a new or unique product or plan.
- An evaluation learning task asks the student to evaluate or judge the value of a concept or object. It asks the learner to judge outcomes, dispute thoughts and ideas, and form opinions.

Develop Instructional Materials

Purpose: To select materials and media intended to convey the activities (or events) of instruction.

Delivery system
Three factors often cause compromises in the selection of media and delivery system:
- Availability of existing instructional materials
- Production and implementation constraints
- Amount of facilitation that the instructor will provide

Components of an instructional package
- Instructional materials: written, mediated, or facilitated materials that all students will use to achieve the objectives. Student workbooks, activity guides, problem scenarios, computer simulations, case studies, resource lists...
- Assessments: all instructional materials should be accompanied by objective tests or by product or performance assessments. May include a pretest and posttest. Will the assessments be available to students, or will they appear as part of the instructor's material so they are not available to students?
- Course management information: a general description of the total package, typically called an instructor's manual, that provides an overview of the materials and shows how they might be incorporated into an overall learning sequence for students. May include the tests and other information important for implementing the course. If the management system is a commercial web-based instructional management system, then it may include automated class listing, student tracking, online testing, project monitoring, grade book, and communication tools.
- Existing instructional materials: Sharable Content Object Reference Model (SCORM) is a set of e-learning standards for interchangeability of learning objects. A learning object is what might have been traditionally called a lesson or module, and would include a cluster of content with the required learning components of an instructional strategy.

Learning-centered and technical criteria for evaluating existing materials:
- Goal-centered criteria for evaluating materials: the instructional analysis document provides the basis for determining the acceptability of the content in various instructional materials.
- Learner-centered criteria for evaluating existing materials: your learner analysis document should provide the foundation for consideration of the appropriateness of instructional materials for your target group.
- Learning-centered criteria for evaluating materials: your instructional strategy can be used to determine whether existing materials are adequate as is or whether they need to be adapted or enhanced prior to use.
- Context-centered criteria for evaluating materials: your instructional and performance context analyses can provide the foundation for judging whether existing materials can be adopted as is or adapted for your settings.
- Technical criteria for evaluating material: materials should also be judged for their technical adequacy, according to criteria related to 1) delivery system, 2) packaging, 3) graphic design and typography, 4) durability, 5) legibility, 6) audio and video quality, 7) interface design,
navigation, and functionality, 8) updatability.

Instructional materials and formative evaluation
- Rough draft materials
- Rapid prototyping - a series of informed, successive approximations, emphasizing the word informed because this development approach relies absolutely on information gathered during tryouts to ensure the success of the final product. Concurrent design and development.

Conduct Formative Evaluations

Purpose: To provide data for revising and improving instructional materials.

2) Expert appraisal (reviewers): have SMEs and/or interested specialists not involved in the instructional development project review the instruction.
   a) SMEs comment on the accuracy and currency of the instruction.
   b) A specialist in the type of learning outcome involved may critique and enhance the instructional strategy as it concerns the learning outcome.
   c) Share the draft with someone familiar with the target audience.

Three phases of formative evaluation:
One-to-one (clinical evaluation) - the designer works with individual learners to obtain data to revise the materials.
   Purpose: To remove the most obvious errors in the instruction and to obtain initial performance indications and reactions to the content by learners. During this stage of direct interaction between the designer and individual learners, the designer works individually with three or more learners who are representative of the target population. Take care not to overgeneralize the data gathered from only one individual.
   Learner selection categories:
   (1) By achievement - one who is above average, one who is average, one who is below average.
   (2) By attitude (optional) - a highly positive learner, a neutral learner, a negative learner.
   (3) By previous experience and years on the job (optional) - 10 years on the job, 5 years on the job, less than 1 year.
   Procedure
   An interactive process. The designer should sit diagonally beside the learner; the designer should read (silently) with the learner and, at predetermined points, discuss with the learner what has been presented in the materials. Explain to the learner that a new set of instructional materials has been designed and that you would like his or her reaction to them. You should state that any mistakes that learners might make are probably due to deficiencies in the material and not theirs. Encourage the learners to be relaxed and to talk about the materials. Have them go through the instructional materials and also take the test(s) provided with the material. Learners must be convinced that it is legitimate to be critical of what is presented to them. Support an atmosphere of acceptance and support for any negative comments from the learner. It is best here to use qualitative evaluations rather than quantitative.
   Criteria and Data
   The three main criteria, and the decisions designers will make during the evaluation, are as follows.
   (i) Clarity of instruction: is the message, or what is being presented, clear to individual target learners?
      1. Intended outcome - contains appropriate information
      2. Considerations:
         a. Message: vocabulary level, language complexity,
responses to the instruction, learning time, posttest performance, and responses to an attitude questionnaire.
a) Describe the learners who participated in the one-to-one evaluation and indicate their performance on any entry-skill measures.
b) Bring together all the comments and suggestions about the instruction that resulted from going through it with each learner. This can be done by integrating everything on a master copy of the instruction, using a color code to link each learner to his or her particular problems. It is also possible to include comments from an SME and any alternative instructional approaches that were used with learners during the one-to-one sessions. As you go through the instruction with each learner, take notes on a single copy and use a pen of a different color for each learner.
c) Posttest data are summarized by obtaining individual item performance and then combining item scores for each objective and for a total score. It is often of interest to develop a table that indicates each student's pretest score, posttest score, and total learning time. In addition, student performance on the posttest should be summarized along with any comments for each objective. Begin with those sections that resulted in the poorest performance by learners and those that resulted in the most comments.
d) Determine, based on learner performance, whether the rubric or the test items are faulty. If flawed, then changes should be made to make them clearer or consistent with the objectives and the intent of instruction. If the items are satisfactory and the learners performed poorly, then the instruction must be changed.
e) Three sources of suggestions for change: learner suggestions, learner performance, and your (the designer's) reactions to the instruction.

Data analysis for small group and field trials
The available data include: item performance on the pretest and posttest, responses to an attitude questionnaire, learning and testing time, and comments made directly in the materials. The fundamental unit of analysis for all the assessments is the individual assessment item. Performance on each item must be scored as correct or incorrect. If an item has multiple parts, then each part should be scored and reported separately so that information is not lost. This individual item information is required for three reasons:
- The group's item-by-item performance
- Item-by-objective tables: table 11.1 & table 11.2 (the percentage of learners who mastered each objective should increase from pretest to posttest) & table 11.3 (learners' performances across tests, using the percentage of objectives mastered on each test)

The test-by-objective analysis is threefold: to determine the difficulty of each item for the group, to determine the difficulty of each objective for the group, and to determine the consistency with which the set of items within an objective measures learners' performance on the objective. An item difficulty value is the percentage of learners who answer an item correctly. Item difficulty values above 80 percent reflect relatively easy items for the group, whereas lower values reflect more difficult ones. Similarly, consistently high or low values for items within an objective reflect the difficulty of the objective for the group.

The consistency of item difficulty indices within an objective typically reflects the quality of the items. If items are measuring the same skill, and if there is no inadvertent complexity or clues in the items, then learners' performance on the set of items should be relatively consistent. Within small groups, differences of 10 or 20 percent are not considered large, but differences of 40 percent or more should cause concern. When inconsistent difficulty indices are observed within an objective, it indicates that the items within the set should be reviewed and revised prior to reusing them to measure learner performance.
- Learners' item-by-objective performance: the item-by-objective table provides the data for creating tables to summarize learners' performance across tests.
- Graphing learners' performances: another way to display the data. A graph may show the pretest and posttest performance for each objective in the formative evaluation study. You may also want to graph the amount of time required to complete the instructional materials, as well as the amount of time required for the pretest and posttest. Figure 11.1
- Other types of data

Sequence for examining data
a) Instructional analysis and entry skills - did the learners have the entry skills anticipated? If they did succeed but did not have the required skills, you must question whether you have identified critical entry skills.
b) Objectives, pretests, and posttests - the second step is to review the pretest and posttest data as displayed on the instructional analysis chart. Learners' pretest performances should decrease as you move upward through the hierarchy - there should be poorer learner performance on the terminal objective than on the earlier skills.
c) Examine the pretest scores to determine the extent to which individual learners, and the group as a whole, had already acquired the skills that you were teaching. If they already possess most of the skills, then you will receive relatively little information about the effectiveness of the instruction or how it might be improved. If they lack these skills, you will have more confidence in the analyses that follow.
d) Compare pretest with posttest scores objective by objective, which is the usual procedure. When you examine the instructional analysis chart, you can assess learner performance on each particular objective and begin to focus on specific objectives and the related instruction that appear to need revision. You may need to revise the conditions or the criteria specified in the objectives. Recall that conditions are used to control the complexity of performance tasks, and your criteria may be too lenient or too harsh for the target group.
e) Examine the exact wording of the objective and the associated test items, and the exact student answers to the items. Before revising the instructional materials, refer to your item analysis table to see whether poor test items, rather than the materials, indicate poor learner performance. All that may be needed are revised test items rather than a major revision of the instructional materials.
f) Learning components of instructional strategy and materials - the next step is to examine the instructional strategy associated with the various objectives with which learners had difficulty. Was the planned strategy actually used in the instructional materials? Are there alternative strategies that might be employed? The final step is to examine the materials themselves to evaluate the comments about problem areas made by learners, instructors, and subject matter experts.
g) Learning time - it may be necessary to revise the materials to make them fit within a particular time period.
h) Media, materials, and instructional procedures - data that relate to the implementation of the instructional materials must also be examined. Controllable vs. non-controllable. Were the learners hindered by the logistics required to use the materials? Were there questions about how to proceed from one step to the next? Were there long delays in getting test scores? Some solutions may need to be incorporated into the instructor's manual to make the instructional activity more efficient.

Revision process
a) Summarize data in a clear and accurate fashion.
b) Table 11.4 is a template for summarizing information from a formative evaluation.
c) Note any problems identified with a component.
d) Note the changes that are being proposed based upon the problems.
e) Note the evidence gathered illustrating the problem. The evidence might name source materials, assessments, data summaries, observations, or interviews.

worth of) instruction. This is accomplished through the design of evaluation studies
Impact on job: Are learners able to transfer the information, skills, and
f) The problems may be apparent but the appropriate changes might not be. If a comparison of several approaches has been embedded in the formative evaluation (such as implementing several versions of the instructional material with changes to formatting, steps…), then the results should indicate the type of changes to be made. Otherwise, the strategies suggested for revising instruction following the one-to-one evaluation also apply at this point - namely, use the data, expertise, and sound learning principles as the basis for revisions.
g) Avoid responding too quickly to any single piece of data.
h) In instructor-led instruction it is relevant to note that some students are unlikely to understand the concepts as rapidly as others during a given class. Identifying learners who are performing poorly and inserting appropriate activities are important components of the revision process for the instructor who is using an interactive instructional approach.
i) Do not simply assume that changes are for the better.

Summative Evaluation

Collect data and information in order to make decisions about the acquisition or continued use of (the overall effectiveness and worth of) instruction. This is accomplished through the design of evaluation studies and the collection of data to verify the effectiveness of instruction and instructional materials with target learners.
Did the intervention, including the instruction, solve the problem that led to the need for the instruction in the first place?
Two phases of summative evaluation: the expert judgment phase and the field trial phase.

Expert judgment phase
Do the materials have the potential for meeting this organization's needs?
Congruence analysis: Are the needs and goals of the organization congruent with those in the instruction?
Content analysis: Are the materials complete, accurate, and current?
Design analysis: Are the principles of learning, instruction, and motivation clearly evident in the materials?
Feasibility analysis: Are the materials convenient, durable, cost-effective, and satisfactory for current users?

Field trial phase
Are the materials effective with target learners in the prescribed setting?
Outcomes analysis
Impact on learners: Are the achievement and motivation levels of learners satisfactory following instruction?
Impact on job: Are learners able to transfer the information, skills, and attitudes from the instructional setting to the job setting or to subsequent units of related instruction?
Impact on organization: Are learners' changed behaviors (performance, attitudes) making positive differences in the achievement of the organization's mission and goals (e.g., reduced dropouts, resignations; improved attendance, achievement; increased productivity, grades)?
Management analysis:
Are instructor and manager attitudes satisfactory?
Are recommended implementation procedures feasible?
Are costs related to time, personnel, equipment, and resources reasonable?

Comparison of formative and summative evaluations
Purpose: Formative - locate weaknesses in instruction in order to revise it. Summative - document strengths and weaknesses in instruction in order to decide whether to maintain or adopt it.
Phases or stages: Formative - one-to-one; small group; field trial. Summative - expert judgment; field trial.
Instructional development history: Formative - systematically designed in-house and tailored to the needs of the org. Summative - produced in-house or elsewhere, not necessarily following a systems approach.
Materials: Formative - one set of materials or several differing sets. Summative - one set of materials or several competing sets.
Position of evaluator: Formative - member of the design and development team. Summative - typically an external evaluator.
SYSTEMATIC INSTRUCTIONAL DESIGN
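The objective-by-objective pretest/posttest comparison described under "Sequence for examining data" above can be sketched in a few lines of code. This is a minimal illustration only: the objective names, the learner responses, and the 80% mastery criterion are all invented for the example, not taken from Dick and Carey.

```python
# Hypothetical sketch: summarize the percent of learners mastering each
# objective on the pretest vs. the posttest, then flag objectives whose
# posttest mastery falls below an (assumed) 80% criterion for revision.

# Each entry: objective id -> one (pretest_correct, posttest_correct)
# pair per learner. Data here are invented for illustration.
scores = {
    "obj_1": [(True, True), (False, True), (True, True), (False, True)],
    "obj_2": [(False, True), (False, False), (False, True), (False, False)],
    "obj_3": [(False, True), (False, True), (True, True), (False, True)],
}

MASTERY_CRITERION = 0.80  # assumed cutoff; set per your own objectives

def percent_mastered(pairs, index):
    """Percent of learners answering correctly (index 0=pretest, 1=posttest)."""
    return 100.0 * sum(p[index] for p in pairs) / len(pairs)

def summarize(scores):
    """Build rows of (objective, pretest %, posttest %, needs_review)."""
    rows = []
    for obj, pairs in scores.items():
        pre = percent_mastered(pairs, 0)
        post = percent_mastered(pairs, 1)
        rows.append((obj, pre, post, post < MASTERY_CRITERION * 100))
    return rows

if __name__ == "__main__":
    for obj, pre, post, flag in summarize(scores):
        note = "REVIEW" if flag else "ok"
        print(f"{obj}: pretest {pre:.0f}% -> posttest {post:.0f}%  [{note}]")
```

A row flagged REVIEW simply marks an objective whose posttest mastery fell below the assumed criterion; as the guide cautions, check the wording of the test items themselves before concluding that the materials need revision.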