Assessment Rubric for Critical Thinking: Rubric Validation Process, Second Workshop

1
Assessment Rubric for Critical Thinking
Rubric Validation Process, Second Workshop
  • Quality Enhancement Plan
  • QEP Team and Faculty Champions

2
Authentic Assessments
  • Authentic assessments serve the dual purposes of
  • encouraging students to think critically, and
  • providing assessment data for measuring improved
    student learning.
  • These assessment techniques fall into three
    general categories:
  • criterion-referenced rubrics,
  • student reports (reflection or self-assessments),
    and
  • student portfolios.

3
Rubrics
  • What is a rubric?
  • Scoring guidelines, consisting of specific
    pre-established performance criteria, used in
    evaluating student work on performance assessments

4
Criterion-referenced Rubrics
  • Complex, higher-order objectives can be measured
    only by having students create a unique product,
    whether written or oral, which may take the form
    of in-class essays, speeches, term papers,
    videos, computer programs, blueprints, or artwork
    (Carey, 2000).

5
Rubrics
  • SPC currently uses rubrics in such programs as
  • College of Education
  • College of Nursing
  • Paralegal

6
Assessment Rubric for Critical Thinking
  • A global rubric template developed to provide a
    snapshot view of how student learning is being
    affected by the critical thinking initiative.
  • Designed to be flexible enough to address a
    number of student project modalities including
    written and oral communications.
  • Will evaluate the student's use of critical
    thinking skills in the development of the paper,
    as opposed to specifically evaluating the quality
    of the student's writing skills.

7
Assessment Rubric for Critical Thinking
  • Development of a rubric is an iterative process;
    the ARC will be improved and strengthened as it
    is used more widely.

8
Assessment Rubric for Critical Thinking
  • ARC was designed by the QEP staff and the Faculty
    Champions to
  • Enhance the QEP
  • Align with the College's definition of critical
    thinking
  • Be flexible for use across multiple disciplines

9
Rubric Development Process
  • Re-examine the learning objectives to be
    addressed by the task →
  • Identify specific observable attributes students
    should demonstrate →
  • Describe characteristics of the identified
    attributes →
  • Write narrative descriptions for each level of
    the continuum (one data-structure sketch follows
    below) →
  • Collect samples of student work →
  • Score student work and identify samples that
    exemplify various levels →
  • Revise the rubric as needed →

Repeat as Needed
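
To make the "narrative descriptions for each level" step concrete, here is a minimal, hypothetical sketch of a criterion-referenced rubric as a data structure. The criterion names, level wording, and four-point scale are illustrative placeholders, not the actual ARC:

    # Hypothetical sketch: a criterion-referenced rubric as criteria mapped
    # to narrative descriptors for each level of the continuum.
    RUBRIC = {
        "Communication": {
            4: "Defines the problem clearly and completely in own words.",
            3: "Defines the problem with minor omissions.",
            2: "Restates the prompt with little interpretation.",
            1: "Does not identify the problem.",
        },
        "Analysis": {
            4: "Compares and contrasts all available solutions.",
            3: "Compares most solutions with some depth.",
            2: "Lists solutions without comparison.",
            1: "Does not address the available solutions.",
        },
    }

    def total_score(scores):
        """Sum one level score per criterion into a total rubric score."""
        return sum(scores[criterion] for criterion in RUBRIC)

    print(total_score({"Communication": 3, "Analysis": 4}))  # 7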
10
Assessment Rubric for Critical Thinking
11
Assessment Rubric for Critical Thinking
12
Assessment Rubric for Critical Thinking
13
ARC Assignment Profile
  • The ARC Assignment Profile is designed to provide
    consistency and accuracy in the evaluation of the
    ARC at the institutional level, as well as to
    provide guidelines for the use of the assessment
    at the course level.
  • The ARC is essentially a tool to evaluate
    critical thinking, but for any tool to be
    effective it must be used in the correct
    situation or job.
  • The purpose of the ARC Assignment Profile is to
    outline the most appropriate course assignment.

14
ARC Assignment Profile
  • 1. Participating faculty should have one
    assignment during the course that can be
    evaluated using the ARC scoring rubric. The
    course assignment could be a graded homework
    assignment or a major assessment for the course.

15
ARC Assignment Profile
  • 2. The course assignment for the ARC should
    include all of the elements of the rubric and
    should be aligned with the task outlined for each
    element. Assignments that only evaluate some of
    the elements or are not aligned with the specific
    ARC tasks will be considered incomplete and not
    used in the institutional analysis.

16
ARC Assignment Profile
  • 3. Faculty may add additional discipline-specific
    rubric elements (such as grammar and punctuation
    in a composition class), but must maintain the
    ARC elements as listed.

17
ARC Assignment Profile
  • 4. Students should be provided a copy of the
    assignment rubric (ARC and any additional
    discipline-specific elements). The specific
    elements and tasks include:
  • Communication: Define the problem in your own
    words.
  • Analysis: Compare and contrast the available
    solutions within the scenario.
  • Problem Solving: Select one of the available
    solutions and defend it as your final solution.
  • Evaluation: Identify the weaknesses of your final
    solution.
  • Synthesis: Suggest ways to improve/strengthen
    your final solution (may use information not
    contained within the scenario).
  • Reflection: Reflect on your own thought process
    after completing the assignment.
  • What did you learn from this process?
  • What would you do differently next time to
    improve?

18
ARC Assignment Profile
  • 5. The evaluating scenario (selected or created)
    should be stated in such a manner as to allow the
    student to address each of the tasks. The QEP
    team is willing to assist with the creation of
    the scenario or to identify possible sources of
    existing scenarios that could be used.

19
ARC Assignment Profile
  • 6. At the end of the semester, please send the
    completed student assignments to Janice Thiel,
    QEP Director, TE 1-111 (X3110). Completed
    student assignments should include a copy of the
    scenario, the assignment provided to the student
    (with the rubric), the student's work, and the
    final graded rubric.

20
ARC Assignment Profile
Competency (KSA)   Problem with Multiple Solutions          Premise with Multiple Perspectives
Communication      Define Problem                           Define Premise
Analysis           Compare/Contrast Solutions               Compare/Contrast Alternative Perspectives
Problem Solving    Select & Defend Final Solution           Select & Defend Final Perspective
Evaluation         Identify Weaknesses of Final Solution    Identify Weaknesses of Final Perspective
Synthesis          Suggest Improvements to Final Solution   Suggest Improvements to Final Perspective
Reflection         Reflect on Thought Process               Reflect on Thought Process
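
As a minimal illustration, the matrix above can be held as a lookup table, with a check implementing the rule from item 2 (an assignment missing any ARC element is incomplete for the institutional analysis). The structure and function name below are illustrative assumptions, not part of the actual profile:

    # The ARC Assignment Profile matrix as a lookup table:
    # competency -> (task for a problem scenario, task for a premise scenario).
    ARC_PROFILE = {
        "Communication":   ("Define Problem", "Define Premise"),
        "Analysis":        ("Compare/Contrast Solutions",
                            "Compare/Contrast Alternative Perspectives"),
        "Problem Solving": ("Select & Defend Final Solution",
                            "Select & Defend Final Perspective"),
        "Evaluation":      ("Identify Weaknesses of Final Solution",
                            "Identify Weaknesses of Final Perspective"),
        "Synthesis":       ("Suggest Improvements to Final Solution",
                            "Suggest Improvements to Final Perspective"),
        "Reflection":      ("Reflect on Thought Process",
                            "Reflect on Thought Process"),
    }

    def is_complete(assignment_elements):
        """Item 2 rule: the assignment must cover every ARC element."""
        return set(ARC_PROFILE) <= set(assignment_elements)

    print(is_complete({"Communication", "Analysis", "Reflection"}))  # False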
21
Sample Scenario (Deer)
  • Three teenagers were seriously injured in a
    car accident when their car swerved to avoid a
    deer on a two-lane road near a small, rural town
    in Florida. The residents of the town have seen
    more and more deer enter the town's populated areas
    over recent years. Local law enforcement has been
    called numerous times this year to remove the
    animals from backyards and neighborhood streets,
    and one deer even caused considerable damage as
    it entered a restaurant in town. The mayor has
    been charged by the city leaders to keep the town
    residents safe.

22
Sample Scenario (Deer)
  • Local crops have even been damaged by the
    animals. Some long-time residents have requested
    that the hunting season and catch limits be
    extended in order to reduce the deer population.
    One city leader even proposed that the city
    purchase electronic devices to deter the deer
    from entering populated areas. Health concerns
    have recently been elevated as three deer
    carcasses were found at the edge of town and
    local law enforcement suspect that the animals
    had been poisoned.

23
Sample Scenario (Deer)
  • Possible Solutions
  • Some long-time residents have requested that the
    hunting season and catch limits be extended in
    order to reduce the deer population.
  • One city leader even proposed that the city
    purchase electronic devices to deter the deer
    from entering populated areas.
  • Health concerns have recently been elevated as
    three deer carcasses were found at the edge of
    town and local law enforcement suspect that the
    animals had been poisoned.

24
Rubric Development Process
  • Re-examine the learning objectives to be
    addressed by the task →
  • Identify specific observable attributes students
    should demonstrate →
  • Describe characteristics of the identified
    attributes →
  • Write narrative descriptions for each level of
    the continuum →
  • Collect samples of student work →
  • Score student work and identify samples that
    exemplify various levels →
  • Revise the rubric as needed →

Repeat as Needed
25
ARC Scoring Workshop Process
  1. After the completion of this PowerPoint
    presentation, the workshop will begin with
    introductions from the participants.
  2. Workshop participants will be provided the ARC as
    well as scoring worksheets. Additional
    instruction will be provided on the scoring
    process.
  3. A sample test item will then be presented on the
    screen, and various responses will be discussed
    and scored based on the scoring rubric given for
    that specific item.
  4. Each scorer will then review the response
    provided for the first item on his/her first
    assessment, and score it based on the scoring
    rubric. This process will be repeated for each
    of the five items on the assessment.

26
ARC Scoring Workshop Process
  • Scorers who encounter a response that does not
    clearly follow the rubric will discuss the
    response with the group for clarification.
  • Each scorer will then pass the scored
    assessment to their scoring partner, and the same
    assessments will be scored by the second scorer.
  • In the event that two scores differ
    significantly, the facilitator will provide the
    assessment to a third scorer, and a third score
    will be recorded (a minimal sketch of this
    adjudication step follows below).
  • When all scoring for an assessment is completed,
    the assessment will be provided to the
    facilitator.
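
The slides do not define when two scores "differ significantly" or how the third score is reconciled; the two-point threshold and median rule below are illustrative assumptions only, not the QEP team's documented procedure:

    def needs_third_scorer(score_a, score_b, threshold=2):
        """Flag a significant disagreement between the two scorers.
        The two-point threshold is an assumption for illustration."""
        return abs(score_a - score_b) >= threshold

    def resolve_score(score_a, score_b, score_c=None):
        """Average two close scores; take the median once a third
        score is recorded (a common convention, assumed here)."""
        if score_c is None:
            return (score_a + score_b) / 2
        return float(sorted([score_a, score_b, score_c])[1])

    print(needs_third_scorer(4, 1))        # True: route to a third scorer
    print(resolve_score(4, 1, score_c=3))  # 3.0: median of the three scores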

27
ARC Scoring Workshop Process
  • Finally, steps 1 through 8 will be repeated for
    each assessment as time allows.
  • Workshop participants will complete the ARC
    Validity and Reliability Form at the end of the
    workshop.
  • Interrater reliability will also be calculated
    from the ARC ratings after the completion of the
    workshop (one possible computation is sketched
    below).
  • Rubric results will be reevaluated after each
    administration, and additional refinements and
    modifications may be made to the instrument, as
    assessment development and validation is intended
    to be an ongoing, dynamic process designed to
    provide the best possible indicator of a
    student's skills.
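
The deck does not say which interrater statistic will be used. As one common possibility, Cohen's kappa for two raters corrects raw percent agreement for chance agreement; the sketch below (plain Python, made-up scores) is illustrative, not the QEP team's actual method:

    from collections import Counter

    def cohen_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters scoring the same assessments."""
        n = len(rater_a)
        # Observed agreement: fraction of assessments scored identically.
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Chance agreement, from each rater's marginal score frequencies.
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        p_e = sum(freq_a[s] * freq_b[s] for s in freq_a) / (n * n)
        return (p_o - p_e) / (1 - p_e)

    # Two raters scoring six papers on a 1-4 rubric scale (made-up data).
    print(cohen_kappa([3, 2, 4, 1, 3, 2], [3, 2, 3, 1, 3, 2]))  # ~0.76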

28
Validity and Reliability
29
Validity
  • Does the rubric measure what it is supposed to
    measure?
  • Validation is the process of accumulating
    evidence that supports the appropriateness of
    inferences made from student responses
    (AERA, APA, NCME, 1999)

30
Validity
  1. Consequences: The effects of the assessment
  2. Content Coverage: Comprehensiveness of
    assessment content
  3. Content Quality: Consistency with current
    content conceptualization
  4. Transfer and Generalizability: Whether the
    assessment is representative of a larger domain
  5. Cognitive Complexity: Whether the level of
    knowledge assessed is appropriate
  6. Meaningfulness: The relevance of the assessment
    in the minds of students
  7. Fairness: Fairness to members of all groups
  8. Cost and Efficiency: The practicality or
    feasibility of an assessment

31
Validity
  • Consequences
  • The effects of the assessment
  • Is the assessment likely to produce results that
    will be used to improve instructional programs or
    otherwise improve student learning?
  • Content Coverage
  • Comprehensiveness of assessment content
  • Does the assessment comprehensively cover the
    content and processes assessed?
  • Is the content covered in sufficient breadth and
    depth?
  • Does the assessment represent important (not
    trivial) components of the content?
  • Together, will the assessments provide sufficient
    evidence about the content?

32
Validity
  • Content Quality
  • Consistency with current content
    conceptualization
  • Is the assessment consistent with the best
    available conceptualization of the knowledge or
    skill assessed?
  • Does the assessment represent current, rather
    than outdated, perspectives?
  • Transfer and Generalizability
  • Whether the assessment is representative of a
    larger domain
  • Can the assessment results be generalized to the
    broader domain (knowledge, skill, or learning
    outcome) they are intended to represent?

33
Validity
  • Cognitive Complexity
  • Whether the level of knowledge assessed is
    appropriate
  • Do the assessment tasks or questions represent
    the cognitive complexity of the knowledge or
    skill they are intended to assess? (For
    example, if an outcome includes higher order or
    critical thinking skills--such as problem solving
    or synthesis--does the assessment measure them?)
  • Does the assessment actually require students to
    use higher-level knowledge or skills, or can
    students simply respond from memory without
    having to think?
  • Meaningfulness
  • The relevance of the assessment in the minds of
    students
  • Are assessment items or tasks meaningful to
    students?
  • Is the assessment relevant to problems students
    will encounter again in school, work, or daily
    living?
  • Does the assessment provide students with
    worthwhile or meaningful experiences?

34
Validity
  • Fairness
  • Fairness to members of all groups
  • Is the assessment biased against students who are
    members of various racial, ethnic, and gender
    groups or students with disabilities? Does it
    contain stereotypes of any groups?
  • Do students of similar ability, regardless of
    group membership, score the same?
  • Cost and Efficiency
  • The practicality or feasibility of an assessment
  • Is the assessment a reasonable burden on
    teachers, instructional time, and finances?
  • Is resulting information worth the required costs
    in money, time, and effort?

35
Reliability
  • Consistency of the assessment scores
  • Types of reliability:
  • Interrater Reliability: the degree to which
    scores agree from instructor to instructor
  • Intrarater Reliability: the degree to which a
    single instructor's scores are consistent from
    paper to paper
  • A test can be reliable and not valid, but never
    valid and not reliable

36
Reliability Concerns
  • Reliability
  • Are the score categories well defined?
  • Are the differences between the score categories
    clear?
  • Would two independent raters arrive at the same
    score for a given student response based on the
    scoring rubric?

37
Improving Scoring Consistency
  • Provide the rubric to students prior to the
    assessment
  • Score papers anonymously
  • Use anchor papers defining levels of proficiency
    for reference
  • Use multiple scorers
  • Calculate interrater reliability statistics
    during training and grading

38
Next Steps
  • The new Faculty Champions will administer
    coursework using the ARC rubric within their
    programs during the Spring semester
  • Faculty Champions will use the ARC Assignment
    Profile to ensure consistency
  • The process will be repeated (Steps 5-7)

39
Assessment Rubric for Critical Thinking
Rubric Validation Process, Second Workshop
  • Quality Enhancement Plan
  • QEP Team and Faculty Champions