Workshop Objectives - PowerPoint PPT Presentation

Transcript and Presenter's Notes
1
Workshop Objectives
  • Identify the components of a Thinking Curriculum
  • Identify types of thinking
  • Write learning outcomes to incorporate types of
    thinking
  • Design learning tasks to promote thinking
  • Identify instructional methods that promote
    thinking
  • Evaluate a range of assessment methods as tools
    for assessing thinking

2
Quality in Education as I see it
Curriculum
Management
QA System
3
Key Issues
  • Establishing a practical and valid Model of
    Thinking that teaching professionals could
    understand, agree with and apply in their modules
  • The incorporation of thinking into all stages of the curriculum development process (e.g., Learning Outcomes, Instructional Strategies and Assessment)

4
An Aligned Curriculum
Learning Outcomes
Types of Thinking
Assessment System
Instructional Strategies
In basic terms this means that the types of
thinking incorporated in the Learning Outcomes
must be effectively taught through the
Instructional Strategies used and accurately
measured in the Assessment System.
5
Key Assumptions
"People can learn to think and act intelligently" (Perkins, 1995, p.18)
"Effective thinking strategies can be modelled and utilized by any individual who wishes to do so" (Dilts, 1990, p.193)
"we can improve students' ability to perform the various processes by increasing their awareness of the component skills and by increasing their skill proficiency through conscious practice" (Marzano, 1988, p.65)
6
Thinking: Important for effective learning
"The best thing we can do, from the point of view of the brain and learning, is to teach our learners how to think" (Jensen, 1996, p.163)

"Problem-solving is to the brain what aerobic exercise is to the body. It creates an explosion of internal activity, causing synapses to form, neurotransmitters to activate and blood flow to increase. The more we make school learning like real life, the more the brain, with its rich capabilities, will sort it out" (Jensen, 1997, p.99)
7
Knowledge and rote learning, as well as thinking, are important in effective learning
Debates about the relative merits of teaching content vs. process, transmission of knowledge vs. discovery learning, thinking vs. rote learning, etc., only cloud rather than help effective pedagogy. For example, there is now virtual agreement among cognitive psychologists that effective thinking - however defined - needs an extensive and well organized knowledge base. As Resnick (1989) summarizes: "Study after study shows that people who know more about a topic reason more profoundly about that topic than people who know little about it" (p.4).
Similarly, Satinover (2001), drawing on recent brain research, makes the case for the importance of repetition in the learning process: "these mundane chores are precisely what turns the fourth brain from a mass of randomness into an intellect of dazzling capacity. Genius, according to Thomas Edison, is one percent inspiration and ninety-nine percent perspiration. Of critical thinking skills, he had nothing to say" (p.49)
8
Activity: Find me a girlfriend / potential wife
Wife leaves me for Brad Pitt - what to do, lah?
9
What is thinking?
Thinking is the conscious and goal-directed mental activity we do in order to solve problems.
The following seven slides define and outline a model of thinking.

10
Specific Types of Thinking
Comparison and Contrast
Inference and Interpretation
Metacognition
Analysis
Evaluation
Generating Possibilities
11
Generating Possibilities
What do we do when we generate possibilities?
  • Generate many possibilities
  • Generate different types of possibilities
  • Generate novel possibilities

12
Analysis
What do we do when we analyse?
  • Identify the relationship of the parts to a whole in a system/structure/model
  • Identify the function of each part
  • Identify the consequences to the whole if a part were missing
  • Identify what collections of parts form important sub-systems of the whole
  • Identify if and how certain parts have a synergetic effect

13
Comparison and Contrast
What do we do when we compare and contrast?
  • Identify what is similar between things - objects/options/ideas, etc.
  • Identify what is different between things
  • Identify and consider what is important about both the similarities and differences
  • Identify a range of situations when the different features are applicable

14
Inference and Interpretation
What do we do when we make inferences and
interpretations?
  • Make meaning of information/data available
  • Identify causal relationships
  • Identify key points and emphasis
  • Make predictions concerning future possibilities
  • Separate fact from opinion
  • Identify intentions and assumptions

15
Evaluation
What do we do when we evaluate?
  • Decide on what is to be evaluated
  • Identify appropriate criteria from which evaluation can be made
  • Apply the criteria and make a decision

16
Metacognition
What are we doing when we are metacognitive?
  • Aware that we are thinking in a planned manner
  • Actively thinking about the ways in which we are thinking
  • Monitoring and evaluating how effectively we are thinking
  • Seeking to make more effective use of the different ways of thinking and any supporting learning/thinking strategies/tools

17
Infusion Approach 1
Curriculum
18
Infusion Approach 2
Specific types of thinking that
underpin competent performance
Real world applications of the subject content
Curriculum
19
Identifying the Types of Thinking
  • Step 1: Refocus the curriculum towards real world activities or competency
  • Step 2: Identify the types of thinking that underpin competent performance in these real world activities through COGNITIVE MODELING
  • In doing this it is useful to start by asking the question: How does a highly competent person think in the effective execution of this activity?
  • Example from a Business Law Module: Predict possible legal outcomes in the event of a breach of contract
  • Analyse the components of a contract
  • Compare and contrast the expected and the actual behaviour of defendants
  • Make inferences and interpretations concerning the behaviour
  • Evaluate the possibility of specific outcomes

20
Other Examples
  • Example from Environmental Science
  • Managing Pollution
  • Compare and contrast different types of pollution
    in a range of contexts
  • Analyse the causes of pollution
  • Make inferences and interpretations concerning
    the effects of pollution in different situations
  • Generate possibilities in terms of
    managing/reducing pollutants
  • Evaluate pollution policies
  • Example from Engineering Materials
  • Predict failure in metal structures
  • Analyse metal capability using appropriate tests
  • Compare and contrast metal failure in a range of
    situations
  • Make inferences and interpretations concerning
    the behaviour of metal under different conditions
  • Evaluate the probability of metal failure in a
    range of case scenarios

21
Writing learning outcomes
  • Write in direct performance terms focusing on the Type of Thinking or Product Outcome
  • Analyse the impact of pollution on water quality
  • Compare and contrast a range of retaining
    structures
  • Generate new design options for marketing a
    health food
  • Predict the outcomes of specified legal scenarios
  • Conduct product packaging tests for a specified
    product
  • Prepare a voyage passage plan
  • Write a programme in JavaScript to animate a range of figures
  • Prepare a tender report

22
Learning outcomes for a Social Psychology Module
  • Design a small experiment to test an established
    theory in social psychology
  • Conduct a small experiment following established
    principles of experimental procedure
  • Compare and contrast a range of data sources
  • Make inferences and interpretations from
    experimental data
  • Write up an experiment using a recognized format

23
Promoting thinking: general instructional principles
  • Systematically teach and model the types of
    thinking, taking students through the range of
    cognitive operations for each type of thinking
  • Use appropriate language and specific questions to direct and reinforce types of thinking (e.g., "Let's compare and contrast these two alternatives" rather than "comment on")
  • Involve students in real world learning tasks
    which necessitate direct use of the types of
    thinking
  • Consistently promote values and dispositions
    conducive to good thinking and effective learning
    (e.g., looking for the truth, managing
    impulsivity, persistence, etc)

24
Instructional methods and strategies that provide opportunities for thinking
  • Questioning
  • Small group activities that involve specific
    types of thinking
  • Case studies
  • Projects
  • Performance tasks that involve specific types of
    thinking
  • Discussion/Debates
  • Thinking Tools, e.g., Mind mapping, Thinking
    Hats, Force-Field Analysis

25
Using Questions
  • The effective use of questions is a powerful means of promoting specific types of thinking, for example:
  • What are the similarities and differences between
    Hepatitis A and HIV?
  • In what ways are these differences significant?
  • What inferences and interpretations can be drawn
    from the data on HIV infection in Asia?
  • How might we evaluate the effectiveness of the
    present HIV prevention programme?
  • What is the relationship between HIV infection
    and poverty?
  • What other ways might we make people more aware
    of HIV infection?

26
Steps in designing learning tasks to promote
thinking
  • Step 1: Identify clearly the knowledge, skills and dispositions to be incorporated into the task
  • For this step it is important to:
  • Choose specific topic areas in your curriculum
    that contain knowledge essential for key
    understanding of the subject. For example,
    central concepts, principles and models.
  • Identify the types of thinking that are important
    for promoting student understanding and
    subsequent competence in these topic areas. For
    example, analysis, comparison and contrast,
    evaluation, etc.
  • Identify other process skills and dispositions
    that are important for promoting learning in the
    identified areas. For example, team-working,
    searching for and organising information,
    time-management, etc.

27
Steps in designing learning tasks to promote
thinking
  • Step 2: Produce the learning task
  • It is important that the task:
  • Clearly involves the application of the
    knowledge, skills and dispositions identified
    from Step 1.
  • Is sufficiently challenging, but realistically achievable in terms of students' prior competence, access to resources, and time frames allocated.
  • Permits more than one correct answer, or more than one correct way of achieving the correct answer
  • Has clear notes of guidance, which:
  • Identify the products of the task and what
    formats of presentation are acceptable (e.g.
    written report, oral presentation, portfolio,
    etc)
  • Specify the parameters of the activity (e.g.
    time, length, areas to incorporate,
    individual/collaborative, how much choice is
    permitted, support provided, etc)
  • Cue the types of thinking and other desired
    process skills
  • Spell out all aspects of the assessment process
    and criteria.

28
Example 1: Package Design
Select a food product and design the packaging that you think will give it best marketability. You must be able to identify the product attributes, protection and enhancement needed to satisfy the functional and marketing requirements, and use suitable packaging material(s) and package type. The work produced should reflect the quality of your thinking in the following areas:
  • identify the criteria for evaluating the marketability of a product
  • analyze the components of a product that constitute an effective design
  • generate new ways of viewing a product design beyond existing standard forms
  • predict potential clients' response to the product given the information you have
  • monitor the development of the group's progress and revise strategy where necessary

29
Example 2: Experiment to test the Halo Effect
Over the past two weeks we have been looking at the Halo Effect and its impact on how we perceive and treat people. For this assignment, you are to work in groups of 4-5 and design and conduct a small experiment to test the Halo Effect. You may choose the particular focus for the experiment, but it must:
  • Clearly test the Halo Effect
  • Be viable in terms of accessing relevant data
  • Meet ethical standards in conducting experiments with persons
  • Follow an established method and procedure
  • Produce results that support or refute the hypothesis
Once completed, the experiment should be written up in an appropriate format of approximately 1000 words. It should document the important stages of the experiment and compare and contrast the data found with existing findings on the Halo Effect.

30
Problems in assessing thinking
  • As an internal cognitive activity, thinking is not directly visible
  • The plethora of models/perspectives defining
    thinking
  • Thinking is not separate from knowledge

31
Methods for assessing thinking
  • Fixed response items/objective tests (MCQs)
  • Open response essay items (short and long answer)
  • Performance assessments (real life tasks or
    simulated activity - these can be in the form of
    specific work tasks, projects, case studies,
    reports, presentations, etc)
  • Interviewing using structured/focused questioning

32
Choosing methods/items for assessing thinking
  • NB. Irrespective of the methods used, items must
    be well designed, administered and marked

VALIDITY
EFFICIENCY
33
Designing multiple-choice items for assessing
thinking
  • There are a number of formats for designing MCQs. The formats most suitable for designing items to assess thinking are:
  • Two or more premises in combination, presented in the stem, lead to a particular conclusion. For example:
  • If pass rates for a course have progressively dropped over the past 3 years, and there is no evidence of change in student cohort and staffing, we are most likely to conclude that:
  • Students' attitude to work has deteriorated
  • Lecturers are assessing more stringently
  • Examinations have increased in difficulty
  • Teaching quality has deteriorated

34
  • An information set is provided as a stimulus. This may include a written scenario (case), graph, table, article, etc., or a combination of the above. The stimulus is then followed by a series of MCQs based on the information provided. A short example is provided below:

  Table 1
  Question no.    No. of responses (60)
  1               11     7
  2               30     8
  3               16    10
  4               27    18

  • From the data presented in Table 1, the most likely inference is that:
  • Students had done well overall
  • Some questions were more confusing than others
  • Students had done poorly overall
  • Certain topic areas had been studied in more detail

35
Designing essay-type items
Essay-type items (whether of the short or long answer kind) can be effectively used to assess types of thinking. However, the design of the items is very important. Consider the items below:
1. Compare and contrast two published models of thinking. Identify two similarities and two differences in these models. Briefly outline the significance of these differences for teaching thinking. (10 marks)
2. Evaluate the impact of introducing performance-based assessments in a module you teach. Identify the benefits and concerns which may result from such a curriculum change. (25 marks)
36
Designing performance-based items
Performance-based items are assessments that focus directly on real-world competence. A typical example is a driving test, where the person being assessed is tested on their driving performance against established criteria.
Performance tests are unlike many paper-and-pencil tests, which only measure indicators that cognitive or other performances have been established. For example, does an A grade essay for an MBA validly assess a person's managerial competence in their place of work?
The design of performance-based items was covered earlier (Steps in designing learning tasks). Real world learning tasks, once used for purposes of assessment, become performance-based items.
37
Designing interview questions
  • Asking questions that challenge students to reveal their thinking is a very useful means for assessing the thinking process. The questions used should require students to demonstrate the types of thinking to be assessed. Consider the following examples:
  • If you removed this part from the system, how would performance be affected, and why?
  • What were the most significant features that led you to choose X over Y?
  • On what basis did you make these interpretations from that data?
  • Why did you feel that these criteria were the most important in making this evaluation?
  • What other possibilities did you consider before prioritising these?
  • How did you check that your thinking was effective over the duration of this project?

38
Key planning considerations in producing a
marking scheme
  • Decide on what exactly is to be assessed from the item - the Performance Areas. These must reflect the learning objectives for the module
  • Decide on the Performance Criteria for each of the performance areas. These are the key operations or elements that underpin competence in each of the performance areas
  • Decide on the Marks Weighting for each performance area. This must reflect the table of specifications in the module document
  • Decide on the sources of Performance Evidence to be used in assessing the item (e.g., written, oral, products, observation, questioning)
  • Decide on the Format for the marking scheme - typically a Checklist or Rating Scale/Scoring Rubric

39
High and Low Inference Items
  • Low inference items are those where the
    performances being tested are clearly visible and
    there is a widely established correct answer
    (e.g., conducting a fire drill, setting up an
    experiment)
  • High inference items involve performances that
    are less directly visible and/or more open to
    subjective judgement (e.g., creative writing,
    managing a team)
  • A major challenge to test design is to produce tasks that require low inference scoring systems. Unfortunately, many worthwhile student outcomes reflecting higher order thinking lend themselves more to high inference scoring.

40
Developing a checklist
  • Identify the important components - procedures,
    processes or operations - in an assessment
    activity
  • for example, in conducting an experiment one
    important operation is likely to be the
    generation of a viable hypothesis
  • For each component, write a statement that
    identifies competent performance for this
    procedure, process or operation
  • in the above example, the following may be
    pertinent
  • A clear viable hypothesis is described
  • Allocate a mark distribution for each component -
    if appropriate
  • this is likely to reflect its importance or level
    of complexity

Note: Checklists are most useful for low inference items where the performance evidence is clearly agreed and there is little disagreement relating to effective or ineffective performance (e.g., observable steps).
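For participants who keep their marking schemes in a spreadsheet or script, the checklist idea above can be encoded very simply. The following is a minimal illustrative sketch in Python; the component statements, the mark allocations and the scoring function are assumptions for illustration only, not part of the workshop materials.

```python
# Minimal sketch of a checklist-based marking scheme (illustrative only).
# Each component is a competent-performance statement with an optional mark allocation.

checklist = [
    {"statement": "A clear viable hypothesis is described", "marks": 2},
    {"statement": "The method/procedure is appropriate", "marks": 3},
    {"statement": "Findings are clearly collated and presented", "marks": 3},
    {"statement": "The write-up meets required conventions", "marks": 2},
]

def total_score(ticks):
    """ticks: one True/False per component; True means the evidence shows competent performance."""
    return sum(item["marks"] for item, ticked in zip(checklist, ticks) if ticked)

# Example: the first three components are met, the last is not.
print(total_score([True, True, True, False]))  # 8 out of a possible 10
```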
41
Assessment checklist for Assignment 1: Design and conduct a small experiment to test the Halo Effect
  • Performance Areas
  • 1. The context of the experiment is accurately described
  • 2. A clear viable hypothesis is presented
  • 3. The method/procedure is appropriate
  • 4. There is no infringement on persons
  • 5. Findings are clearly collated and presented
  • 6. Valid inferences and interpretations are drawn from the data, and comparison is made with existing data
  • 7. The write-up of the experiment meets required conventions

42
Developing a rating scale/scoring rubric
  • Define the performance areas for an assessment
  • for example, Valid inferences and
    interpretations are drawn from the data and
    comparison is made with existing data
  • Identify the key constructs/elements that
    underpin competence for each performance area
  • Using the above example
  • validity
  • inference and interpretation
  • comparison and contrast
  • Write a concise description of performance at a
    range of levels from very good to very poor
  • for example, 5 = very good, 1 = very poor

Note: Rating Scales/Scoring Rubrics are most useful for high inference items where the performance evidence requires considerable professional judgement in making an assessment decision.
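By way of contrast with the checklist sketch above, a rating scale/scoring rubric attaches a graded descriptor to each score level rather than a single tick. The sketch below is again a minimal Python illustration; the abbreviated descriptors and the helper function are assumptions, condensed from the rubric shown on the next slide.

```python
# Minimal sketch of a scoring rubric for one performance area (illustrative only).
# Score levels run from 5 (very good) to 1 (very poor); descriptors are abbreviated.

rubric = {
    "performance_area": "Valid inferences and interpretations are drawn from the data "
                        "and comparison is made with existing data",
    "levels": {
        5: "All valid inferences drawn; all essential similarities and differences identified",
        4: "Most valid inferences drawn; most essential similarities and differences identified",
        3: "Some valid inferences drawn; significance only partly established",
        2: "Few valid inferences; comparison with existing data is partial",
        1: "Failure to make valid inferences and interpretations",
    },
}

def descriptor(score: int) -> str:
    """Return the level descriptor the assessor compares the performance evidence against."""
    return rubric["levels"][score]

print(descriptor(4))
```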
43
Scoring Rubric: Valid inferences and interpretations are drawn from the data and comparison is made with existing data
  • Score Description
  • 5 All valid inferences have been derived from the
    data. Interpretations are consistently logical
    given the data obtained. All essential
    similarities and differences are identified
    between this data and existing data. The
    significance of these similarities and
    differences is fully emphasized.
  • 4 Most of the valid inferences have been derived from the data. Interpretations are mainly logical given the data obtained. Most of the essential similarities and differences are identified between this data and existing data. The main significance of these similarities and differences is emphasized.
  • 3 Some valid inferences have been derived from
    the data. Some logical interpretations are made
    from the data obtained. Some essential
    similarities and differences are identified
    between this data and existing data. The
    significance of these similarities and
    differences is only partly established.
  • 2 Few valid inferences derived and limited
    interpretation of findings. Comparison and
    contrast with existing data is partial and the
    significance is not established
  • 1 Failure to make valid inferences and
    interpretations