Instructional System Design Concept Map

(ISD concept map, with areas for Learning Environment, Design, Knowledge, and related topics.)

ID and ISD Models

The main goal of an ID (Instructional Design) model or process is to construct a learning environment that provides learners with the conditions that support the desired learning processes.

ID models differ from ISD (Instructional System Design) models in that ISD models are broader in scope. ID models normally focus on the first two phases of the ISD model, analysis and design: they analyze the skill or knowledge to be trained and then convert that analysis into a training strategy (the design of the learning environment). While ID models normally account only for analysis and design, ISD models normally cover five phases:

  • Analysis
  • Design
  • Development or Production
  • Implementation or Delivery
  • Evaluation

Formative and Summative Evaluations

Formative evaluations are embedded in each of the five phases to judge the value or worth of that phase while the program activities are "forming," or happening; they focus on the processes and activities. A summative evaluation, performed at the end of the ISD process, focuses instead on the outcome (the summation). A rough sketch of this structure follows the source link below.

http://www.nwlink.com/~donclark/hrd/ahold/isd.html
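As an illustration only (not part of the original notes), the sketch below models the five ISD phases with a formative-evaluation hook in each phase and a single summative evaluation at the end. The phase names come from the list above; the Phase class and the default_formative and run_isd_process names are hypothetical choices made here.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Phase:
    name: str
    # Formative evaluation judges the phase while its activities are "forming".
    formative_evaluation: Callable[[str], str]

def default_formative(phase_name: str) -> str:
    # Placeholder check; a real project would review the phase's deliverables.
    return f"Formative review of the {phase_name} phase: activities on track."

ISD_PHASES: List[Phase] = [
    Phase("Analysis", default_formative),
    Phase("Design", default_formative),
    Phase("Development or Production", default_formative),
    Phase("Implementation or Delivery", default_formative),
    Phase("Evaluation", default_formative),
]

def run_isd_process() -> None:
    # A formative evaluation is embedded in every phase...
    for phase in ISD_PHASES:
        print(phase.formative_evaluation(phase.name))
    # ...while one summative evaluation at the end focuses on the outcome.
    print("Summative evaluation: did the finished program achieve its goals?")

if __name__ == "__main__":
    run_isd_process()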


Learning Goals and Learning Objectives

Learning Goal: a statement of purpose or intention describing what learners should be able to do at the conclusion of instruction.

  • When given ( TOPIC ) with ( SPECIFIC ), be able to ( X ).

Learning Objectives: statements that tell what learners should be able to do when they have completed a segment of instruction (see the sketch after the example stems below).

  • Learners will decide X
  • Learners will choose X
  • Learners can X
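A minimal sketch, assuming the goal template and objective stems above are worth reusing programmatically; the function names learning_goal and learning_objective are hypothetical, and the sample topics in the usage lines are made up.

def learning_goal(topic: str, specific: str, behavior: str) -> str:
    """Fill the template: When given (TOPIC) with (SPECIFIC), be able to (X)."""
    return f"When given {topic} with {specific}, learners will be able to {behavior}."

def learning_objective(verb: str, content: str) -> str:
    """Build an objective from an observable action verb such as decide or choose."""
    return f"Learners will {verb} {content}."

if __name__ == "__main__":
    print(learning_goal("a case study", "patient vitals", "choose a triage level"))
    print(learning_objective("decide", "which ISD phase a given activity belongs to"))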

Subject Matter Expert (SME) Draft Questions

  1. What are your expectations for this collaboration?
  2. What is the relevant background on this instruction?
    • i.e., what do you want to do with this instruction?
  3. What are your performance objectives for the instruction?
    • What do you want students to know by the end? For example, should they be able to describe, explain, or compare?
  4. What are the students' actual levels of learning before the instruction?
  5. Who is your target audience (age, level, etc.)?
  6. How would you describe your learners?
    • What is their general reading/writing aptitude and developmental level?
    • What is their prior knowledge of the instructional topic?
    • What is their attitude towards learning?
    • What are their attitudes towards current instructional content and delivery?
  7. What are the best resources for the instruction?
  8. What types of media would you like to use?
  9. What types of deliverables are you expecting?
  10. How long will the instruction be?
  11. Do you want an immediate summative evaluation after the instruction?
    • If so, what types of evaluation do you expect?
  12. How involved do you want to be in the development?
  13. Do you understand our development process?
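A minimal sketch, assuming the SME's answers are worth keeping in one structured record for the analysis phase; the SMEIntake class and its field names are hypothetical and simply mirror the questions above.

from dataclasses import dataclass, field
from typing import List

@dataclass
class SMEIntake:
    # Each field corresponds to one of the draft questions above.
    expectations: str = ""
    background: str = ""
    performance_objectives: List[str] = field(default_factory=list)
    target_audience: str = ""
    learner_profile: str = ""          # aptitude, prior knowledge, attitudes
    resources: List[str] = field(default_factory=list)
    preferred_media: List[str] = field(default_factory=list)
    deliverables: List[str] = field(default_factory=list)
    duration: str = ""
    wants_summative_evaluation: bool = False
    desired_involvement: str = ""

if __name__ == "__main__":
    intake = SMEIntake(
        expectations="A short e-learning module",
        performance_objectives=["Describe the five ISD phases"],
        target_audience="New instructional designers",
        wants_summative_evaluation=True,
    )
    print(intake)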

Kirkpatrick Model: Four Levels of Evaluation


Level 1: Reaction
Description: Participants' reactions to the instructional program just completed.
Benefits:
  • Gives quick feedback about various aspects of the course
  • Data are easy to collect
Limitations:
  • Validity of the information is limited (i.e., it is influenced by many factors)
Methods: Survey instruments requiring quick quantified responses to:
  • Instructor's presentation
  • Content relevance
  • Effectiveness of instructional materials
  • Facilities and arrangements
  • Program strengths and weaknesses

Level 2: Learning
Description: Participant learning that occurred during the instructional program just completed.
Benefits:
  • Gives the most direct objective evidence of training quality
  • Gives learners feedback on their achievement
  • Gives the instructor feedback on course quality for revision purposes
Limitations:
  • Depends on high validity of the tests
  • Assumes the objectives are relevant to the bottom line
  • The instructor may have to deal with negative attitudes
  • Requires additional course time
Methods: Objective tests using:
  • Multiple-choice questions
  • True-false questions
  • Fill-in-the-blank questions

Level 3: Behavior
Description: Participant behavior, usually on the job, that is directly related to the instructional program.
Benefits:
  • Gives direct information about training success on the job
  • Provides a good selling argument to management
Limitations:
  • Impacted by factors other than training
  • Data may be difficult to collect
Methods: Data collected with a time lapse after training (2 to 6 months) using:
  • Workplace observation
  • Direct supervisor reports
  • Self-reports

Level 4: Results
Description: Organizational impact.
Benefits:
  • Provides the most persuasive information for management
Limitations:
  • Impacted by many factors besides training
  • Data may be difficult or impossible to collect
Methods: Data collected over a periodic interval to establish behavior trends and patterns.
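
A minimal sketch (assumed here, not part of the Kirkpatrick model itself) that encodes the four levels above as data and averages a handful of made-up Level 1 reaction ratings, since Level 1 data are quick quantified responses.

from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class KirkpatrickLevel:
    level: int
    measure: str
    method: str

LEVELS: List[KirkpatrickLevel] = [
    KirkpatrickLevel(1, "Reaction", "Post-course survey with quick quantified responses"),
    KirkpatrickLevel(2, "Learning", "Objective tests (multiple choice, true-false, fill-in-the-blank)"),
    KirkpatrickLevel(3, "Behavior", "Workplace observation and reports 2 to 6 months after training"),
    KirkpatrickLevel(4, "Results", "Organizational data collected over time to establish trends"),
]

def mean_reaction_score(ratings: List[int]) -> float:
    # Level 1 data are typically quick numeric ratings, e.g., on a 1-5 scale.
    return sum(ratings) / len(ratings) if ratings else 0.0

if __name__ == "__main__":
    for lvl in LEVELS:
        print(f"Level {lvl.level} ({lvl.measure}): {lvl.method}")
    print("Average Level 1 rating:", mean_reaction_score([4, 5, 3, 4]))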