Methods and Procedures of Students' Learning Assessment
1
Methods and Procedures of Students' Learning
Assessment
  • Slovak University of Technology in Bratislava
    Faculty of Civil Engineering
    www.stuba.sk
  • Gabriela Pavlendova

2
Assessment Makes Teaching into Teaching
  • Mere presentation, without assessment, is not
    teaching.
  • Assessment is not a discrete process, but
    integral to every stage of teaching, from minute
    to minute as much as module to module.

3
Informal Assessment
  • Informal assessment is going on all the time.
  • Whenever a student answers a question, asks one,
    starts looking out of the window, or cracks a
    joke, he or she is providing you with feedback
    about whether learning is taking place.
  • It is more an evaluation of the teaching session
    than of his or her learning, but the two are
    inextricable.

4
Each Assessment is Ultimately Subjective
  • However, we can still make every effort to ensure
    that assessment is
  • valid,
  • reliable, and
  • fair.

5
Validity
  • A valid form of assessment measures what it is
    supposed to measure.
  • It does not assess memory when it is supposed to
    be assessing problem-solving (and vice versa).
  • It does not grade someone on the quality of their
    writing when writing skills are not relevant to
    the topic being assessed, but it does when they
    are.
  • It seeks to cover as much of the assessable
    material as practicable, not relying on inference
    from a small and arbitrary sample (and here it
    spills over into reliability).

6
Reliability
  • Or "replicability".
  • A reliable assessment will produce the same
    results on re-test, and will produce similar
    results with a similar cohort of students, so it
    is consistent in its methods and criteria.
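A minimal sketch of how "the same results on re-test" can be quantified, in Python. The scores and the choice of a Pearson correlation as the test-retest reliability coefficient are illustrative assumptions, not from the slides:

    # Test-retest reliability as the Pearson correlation between two
    # administrations of the same test to the same students.
    # The scores below are invented illustrative data.
    from statistics import correlation  # Python 3.10+

    first_sitting  = [62, 75, 58, 90, 71, 66, 84, 79]
    second_sitting = [65, 73, 60, 88, 70, 69, 81, 77]

    r = correlation(first_sitting, second_sitting)
    print(f"test-retest reliability r = {r:.2f}")  # near 1.0 = consistent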

7
Fairness
  • This is really an aspect of validity, but
    important enough to note in its own right.
  • Fairness ensures that everyone has an equal
    chance of getting a good assessment.

8
Purposes of Assessment
  • To determine whether and to what extent students
    have learned specific knowledge or skills
    (content goals). The assessment should focus on
    outcomes or products of student learning, through
    instruments such as objective assessments and
    projects/products.
  • To diagnose student strengths and weaknesses and
    plan appropriate instruction (process goals). To
    understand where the student is going wrong, you
    need to assess the process as well as the
    product, through activities such as interviews,
    documented observations, student learning logs
    and/or self-evaluations, behavioral checklists,
    and student think-alouds in conjunction with
    multiple-choice tests.

9
Forms of Assessment
  • Summative assessment says whether or not you have
    "passed". It is, or should be, undertaken with
    reference to all the objectives or outcomes of
    the course, and is usually fairly formal. Note
    that all summative assessment can also be
    formative, if the feedback offered is sufficient.
  • Formative assessment is going on all the time.
    Its purpose is to provide feedback on what
    students are learning
  • to the student, to identify achievement and areas
    for further work, and
  • to the teacher, to evaluate the effectiveness of
    teaching to date and to focus future plans.

10
Examples of Formative Assessment, which Can Also
Become Part of Summative Assessment
  • Peer assessment
  • Initial assessment
  • Middle assessment
  • Problem sheets
  • Short-answer questions, etc.

11
No Assessment is Perfect
  • Assume it is possible to tell who is competent in
    a given area and who is not.
  • Such an idealistic construction is known as a
    "gold standard". In the case in the diagram,
    about 80% are competent (or "deserve to pass")
    and about 20% aren't.

12
(No Transcript)
13
Assuming a Highly Valid Assessment Scheme
  • It comes to the "right" answer about 80% of the
    time. Again, this is idealistic, because we
    rarely have a clue as to the quantifiable
    validity of such a scheme.

14
(No Transcript)
15
What happens when we use this scheme with our
group of students?
16
We end up with
  • 64 "True Positives" they are competent, and the
    assessment scheme agrees that they are. In other
    words, they passed and so they ought to have
    done.
  • 16 "True Negatives", who failed and deserved to
    do so.
  • 16 "False Positives" they passed, but they did
    not deserve to do so, and
  • 4 "False Negatives", who failed, but should have
    passed.
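The arithmetic behind these four figures can be written down directly. A minimal sketch in Python, assuming (since the diagram is not reproduced here) that the scheme's 80% validity applies to the competent and the non-competent groups alike; this yields the same four cell sizes, though how the two error cells are apportioned between false positives and false negatives depends on the diagram's construction:

    # Cohort of 100 students: 80% competent (the "gold standard"),
    # assessed by a scheme that is "right" 80% of the time.
    cohort = 100
    competent = 0.80 * cohort           # 80 deserve to pass
    not_competent = cohort - competent  # 20 do not
    validity = 0.80

    true_positives = competent * validity       # 64 pass, rightly
    true_negatives = not_competent * validity   # 16 fail, rightly
    errors_in_competent = competent * (1 - validity)          # 16
    errors_in_not_competent = not_competent * (1 - validity)  #  4

    print(true_positives, true_negatives,
          errors_in_competent, errors_in_not_competent)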

17
Is there a Solution?
  • This is both unfair and a potentially serious
    technical problem. Imagine a pilot had qualified
    as a False Positive!
  • There is of course a "solution": raise the "pass"
    threshold. Unfortunately it's wrong.
  • All it does is change the proportion of False
    Positives and False Negatives. This may be the
    right thing to do if the most important thing is
    to eliminate the False Positives (the people who
    qualified but weren't competent), but the cost to
    the poor characters who should have passed and
    didn't gets even higher.
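A minimal sketch of this trade-off, with invented numbers (the score distributions are assumptions, not from the slides): competent and non-competent students are modelled as two overlapping normal score distributions, and sweeping the pass mark merely moves errors between the two cells:

    import random

    random.seed(1)
    # Invented distributions: competent students average 70,
    # non-competent students average 50, both with spread 10.
    competent     = [random.gauss(70, 10) for _ in range(800)]
    not_competent = [random.gauss(50, 10) for _ in range(200)]

    for pass_mark in (55, 60, 65):
        false_positives = sum(s >= pass_mark for s in not_competent)
        false_negatives = sum(s < pass_mark for s in competent)
        print(f"pass mark {pass_mark}: {false_positives:3d} false"
              f" positives, {false_negatives:3d} false negatives")

Raising the pass mark drives the false positives down while the false negatives climb, which is exactly the point made above.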

18
What Can We Do?
  • We have to live with it, and make strenuous
    efforts to improve validity:
  • Do not rely on a single assessment exercise.
  • Use a variety of different approaches.
  • In our assessment plan, methods and procedures
    should meet the following criteria:
  • Clarity: methods and procedures are clear.
  • Measurements occur at appropriate times in the
    certificate program.
  • Measurements are appropriate for the SLOs
    (Student Learning Outcomes).
  • Methods and procedures reflect an appropriate
    balance of direct and indirect methods.
  • Examples of certificate assessment tools
19
Methods We Can Use
  • Case studies
  • Direct observation
  • Examination (unseen, seen/open book)
  • Multiple-choice tests
  • Performance projects
  • Problem sheets
  • Self-assessment
  • Collaborative/group projects
  • Essays
  • Oral questioning after observation
  • Portfolios
  • Presentations
  • Projects
  • Short-answer questions
  • Viva voce examination
  • Peer assessment
  • Peer assessment via wiki
20
Test Administration
  • 1. Write explicit directions for the test. The
    directions should include:
  • how much time is available, and whether extra
    time will be allowed
  • what to do if they finish early
  • how to record answers
  • whether to show work on problems
  • the weight of different sections and items
  • whether there is a penalty for guessing
  • what can be used during the test, e.g.
    calculators or a crib sheet
  • whether the test booklet will be collected, etc.
  • directions on how to use the answer sheet, if at
    all different from the usual way.
  • 2. State your cheating policy on the test or in
    the directions, and enforce it. (Remember, it is
    easier to prevent cheating than to deal with the
    consequences later.)
  • 3. Explain your grading system on the first day
    of class.
  • 4. Let students know how they are doing in the
    class throughout the semester.
  • 5. Help students to realize that they earn their
    grades.
  • 6. Check that your students did the work they
    handed in for group projects, term papers, etc.
  • 7. Get students to self-assess their preparation
    for and performance on tests.

21
Students' Lack of Insight into their Own
Preparation and Performance on Tests
  • HELP THEM: ask them to reflect with questions
    like these.
  • Did you study the right material?
  • Did you put the right emphasis in your studying
    on concepts and the big picture, and on material
    in the reading but not covered in class?
  • What could you do differently, or how can you
    prepare better for the next test?
  • Would studying in groups be effective? If so,
    what type of students should you meet with, and
    what type of group study is effective?
  • Did you begin studying early enough to master the
    material?
  • How well did you know the material?
  • Where were there gaps in your understanding?

22
Measurements are Appropriate for the SLOs
  • What important cognitive skills do I want my
    students to develop? Select no more than three to
    five skills per subject area.
  • What social and affective skills do I want my
    students to develop? (e.g., to work
    independently, to develop a spirit of teamwork
    and skill in group work, to be persistent in the
    face of challenges, to have a healthy skepticism
    about current arguments and claims, etc.)
  • What metacognitive skills do I want my students
    to develop?
  • What types of problems do I want my students to
    be able to solve?
  • What concepts and principles do I want my
    students to be able to apply?
  • THEN
  • Prioritize these outcomes.
  • List your final set of skills, processes, and
    dispositions (by subject area, if desired).

23
(No Transcript)
24
(No Transcript)
25
Cognitive domain | Affective domain | Psychomotor domain
KNOWLEDGE | ATTITUDE | SKILLS
1. Recall data | 1. Receive (awareness) | 1. Imitation (copy)
2. Understand | 2. Respond (react) | 2. Manipulation (follow instructions)
3. Apply (use) | 3. Value (understand and act) | 3. Develop precision
4. Analyse (structure/elements) | 4. Organise personal value system | 4. Articulation (combine, integrate related skills)
5. Synthesize (create/build) | 5. Internalize value system (adopt behaviour) | 5. Naturalization (automate, become expert)
6. Evaluate (assess, judge in relational terms) | |
26
Fink's Taxonomy
  • 1. Foundational Knowledge. Foundational Knowledge
    includes all of the content, ideas, and
    information that you want your students to know
    at the end of the semester.
  • 2. Application. The Application taxon encompasses
    critical, creative, and practical thinking, as
    well as additional skill sets that may be
    beneficial to students.
  • 3. Integration. Integration includes connecting
    different ideas that might appear in different
    disciplines or across the lifespan.
  • 4. Human Dimension. The Human Dimension taxon
    helps assess if students learn more about
    themselves and others. It stresses the human
    factor and gives human significance to learning.
  • 5. Caring. Caring is the taxon that provides the
    motivation and energy for learning by developing
    new interests, feelings, and values associated
    with the course material.
  • 6. Learning How to Learn. The Learning How to
    Learn taxon provides the ability for long-term
    learning by teaching students to become
    self-directed learners.

27
SOLO Taxonomy
  • SOLO stands for
  • Structure of Observed Learning Outcomes.
  • It describes levels of increasing complexity in a
    student's understanding of a subject, through
    five stages, and it is claimed to be applicable
    to any subject area. Not all students get through
    all five stages, of course, and indeed not all
    teaching (and even less "training") is designed
    to take them all the way.

28
(No Transcript)
29
Krathwohl's Affective Domain Taxonomy
  • The taxonomy is ordered according to the
    principle of internalization.
  • Receiving is being aware of the existence of
    certain ideas, material, or phenomena and being
    willing to tolerate them.
  • Responding is being committed in some small
    measure to the ideas, materials, or phenomena
    involved by actively responding to them.
  • Valuing is being willing to be perceived by
    others as valuing certain ideas, materials, or
    phenomena.
  • Organization is relating the value to those
    already held and bringing it into a harmonious
    and internally consistent philosophy.
  • Characterization by value or value set is acting
    consistently in accordance with the values one
    has internalized.

30
Overview of Development of Taxonomies and their
Domains
31
External Assessment of SLOs
  • Possibilities:
  • Alumni Surveys
  • Archival Data
  • Culminating Assignments
  • Content Analysis
  • Course-embedded Assessment
  • Curriculum Analysis
  • Delphi Technique
  • E-Portfolios
  • Employer Surveys
  • Focus Groups
  • Institutional Data
  • Matrices
  • Observations
  • Oral Exams
  • Performance Assessment
  • Portfolio Evaluations
  • Pre-test/Post-test Evaluation
  • Reflective Essays
  • Rubrics
  • Standardized and Local Test Instruments
  • Student Self-efficacy
  • Student Surveys and Exit Interviews
  • Transcript Analysis
  • Value-Added Assessment (Pre- and Post-testing)
32
Thank you for your attention