Research in Practice: Using Assessment to Improve Student Outcomes in General Education Mathematics
1
Research in Practice: Using Assessment to
Improve Student Outcomes in General Education
Mathematics
  • Gail Wisan, Ph.D.
  • University Director of Assessment
  • Institutional Effectiveness and Analysis
  • Florida Atlantic University
  • Presented at the SAIR 2010 Conference
  • Southern Association for Institutional Research
  • New Orleans, LA
  • September 27, 2010


2
Some Common Faculty Complaints About Assessment
  • Paper pushing
  • Dusty reports sit on shelf
  • Nobody even reads reports
  • Improves nothing
  • Has no impact

3
Perspective/Point of View
  • Evaluation research should drive outcomes
    assessment because:
  • it helps identify what works
  • it provides direct evidence
  • it helps improve educational outcomes.

4
Overview of Presentation: Benefits/Learning
Outcomes
  • Be able to explain evaluation research
  • Be able to identify the benefits of evaluation
    research
  • Be able to explain the use of experimental and
    quasi-experimental design evaluation research in
    education assessment
  • Be able to apply evaluation research strategies
    to outcomes assessment at your institution to
    improve student learning outcomes
  • After this presentation, you should be able to
    improve student learning outcomes at your
    institution. Assessment should seek systematic
    evidence of the effectiveness of existing
    programs, pedagogies, methodologies, and
    approaches to improve student learning outcomes
    and instill a cycle of continuous improvement.

5
The Problem: Student Learning Outcomes in Gen.
Ed. Mathematics
  • The Math Faculty Coordinator was interested in
    improving learning outcomes in General Education
    math courses with a high percentage of D, W, and
    F grades. (Comparative data)
  • This is a problem for students, for the
    department and faculty, and for the university.

6
Improving Outcomes Assessment: Research in
Practice/Evaluation Research
  • The Director of the Mathematics General
    Education program and the Director of Assessment
    worked together on a quasi-experimental design
    to compare the effectiveness of different
    teaching and learning strategies.

7
Outcomes Assessment and Evaluation Research
  • Outcomes Assessment, at its most effective,
    incorporates the tools and methods of evaluation
    research.
  •   1. Outcomes Evaluation Research
  •   2. Field Experiment Research

8
Outcomes Assessment and Evaluation Research
  • Outcomes Evaluation Research assesses the
    effects of existing programs, pedagogies, and
    educational strategies on students' learning,
    competencies, and skills.

9
Outcomes Assessment and Evaluation Research
  • Field Experiment Research assesses the effects
    of new programs, pedagogies, and educational
    strategies on students' learning, competencies,
    and skills.

10
Outcomes Assessment and Evaluation Research
  • Outcomes assessment as evaluation research should
    facilitate faculty acceptance since it involves
    using the tools and methods of science to improve
    student learning.

11
Outcomes Assessment and Evaluation Research
  • Evaluation research can answer the question:
  • How can assessment improve education?

12
Outcomes Assessment and Evaluation Research
  • This presentation describes how assessment
    grounded in evaluation research is being used to
    compare different pedagogies in mathematics
    education (Pre-Calculus) to improve student
    learning outcomes.

13
Outcomes Assessment and Evaluation Research
  • The Director of the Mathematics General
    Education program assigned one mathematics
    professor two sections of Pre-Calculus:
  • a 3-hour lecture class
  • a 2-hour lecture plus 2 hours of hands-on
    problem solving in the computer lab

14
Comparison of Outcomes for Two Teaching/Learning
Strategies

                     2 Hrs Lect./2 Hrs Lab    3 Hrs Lecture
                     Fall 09 (Instr. Smith)   Fall 09 (Instr. Smith)
Number Enrolled      34                       35
Mean Final Grade     2.4                      2.3
A Grade (%)          15                       20
B Grade (%)          32                        9
C Grade (%)          24                       31
D Grade (%)           0                       11
F Grade (%)          18                       11
W Grade (%)          12                       14
15
Comparison of Outcomes for Two Teaching/Learning
Strategies

                                 2 Hr. Lect./2 Hr. Problem-   3 Hr. Lecture
                                 Solving Computer Lab
Number Enrolled                  34                           35
Mean Final Grade                 2.4                          2.3
B or Above Grade (%)             47                           29
C or Above Grade (%)             71                           60
D, W, or F Grade (%)             29                           40
Next Math Course: Calculus (%)   56                           43
Next Math Course: None (%)       21                           41
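A quick way to probe whether the D/W/F difference above could be chance is a chi-square test on the two sections. The sketch below is not from the presentation: it reconstructs approximate counts from the reported percentages (29% of 34 ≈ 10; 40% of 35 = 14) and assumes Python with scipy is available.

```python
# Minimal sketch: chi-square comparison of D/W/F rates in the two
# Fall 2009 Pre-Calculus sections. Counts are rounded reconstructions
# from the reported percentages, not the raw data.
from scipy.stats import chi2_contingency

# Rows: [D/W/F count, C-or-above count] per section
lab_section  = [10, 24]   # 2 hrs lecture + 2 hrs lab, n = 34 (29% D/W/F)
lecture_only = [14, 21]   # 3 hrs lecture, n = 35 (40% D/W/F)

chi2, p, dof, expected = chi2_contingency([lab_section, lecture_only])
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
# With groups this small, a non-significant p cannot rule out a real
# effect; that is one motivation for adding the Fall 2008 control group.
```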
16
Comparison of Inputs for Students in Two
Classes with Different Teaching/Learning
Strategies

                              2 Hrs Lect./2 Hrs Lab      3 Hrs Lecture
                              Fall 2009 (Instr. Smith)   Fall 2009 (Instr. Smith)
Number Enrolled               34                         35
HS GPA                        3.4                        3.4
With HS GPA (%)               100                        91.4
SAT Math                      563                        552
SAT Verbal                    521                        529
ACT Math                      23                         24
ALEX Math Placement Score     53.7                       54.6
Has Math Placement Score (%)  94                         89
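Because the sections were not randomly assigned, the input comparison above is what licenses reading the outcome difference as a teaching effect. As a hedged illustration of how to test baseline equivalence from summary statistics alone, the sketch below applies scipy's ttest_ind_from_stats to the SAT Math means; the standard deviations (~80 points) are assumptions, since the slide reports only means.

```python
# Minimal sketch: baseline-equivalence check on SAT Math from summary
# statistics. Means and group sizes come from the slide above; the
# standard deviations are ASSUMED (~80 points, a typical SAT section
# spread) because only means were reported.
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(
    mean1=563, std1=80, nobs1=34,   # 2 hrs lecture / 2 hrs lab
    mean2=552, std2=80, nobs2=35,   # 3 hrs lecture
    equal_var=False,                # Welch's t-test
)
print(f"t = {t:.2f}, p = {p:.3f}")
# A large p supports (but cannot prove) that the sections were
# comparable on math preparation before the treatment.
```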



17
Comparison of Outcomes for Two Teaching/Learning
Strategies (Additional Control Group- 2008)
2 Hrs Lect./2 Hrs Lab Fall 09 (Instr. Smith) 3 Hrs Lecture Fall 09 (Instruct. Smith) 3 Hrs Lecture Fall 08 (Instruct. Smith) All
Number Enrolled 34 35 109 178
Mean Final Grade 2.4 2.3 1.9 2.1
A Grade 15 20 11 13
B Grade 32 9 9 13
C Grade 24 31 18 22
D Grade 0 11 17 12
F Grade 18 11 16 15
W Grade 12 14 27 21
18
Comparison of Outcomes for Two Teaching/Learning
Strategies (Additional Control Group: Fall 2008)

                      2 Hrs Lect./2 Hrs Lab    3 Hrs Lecture            3 Hrs Lecture            All
                      Fall 09 (Instr. Smith)   Fall 09 (Instr. Smith)   Fall 08 (Instr. Smith)
Number Enrolled       34                       35                       109                      178
Mean Final Grade      2.4                      2.3                      1.9                      2.1
B or Above Grade (%)  47                       29                       20                       27
C or Above Grade (%)  71                       60                       39                       49
D, F, or W Grade (%)  29                       40                       59                       49
19
Comparison of Inputs for Students in Two
Classes with Different Teaching/Learning
Strategies (Additional Control Group)

                   2 Hrs Lect./2 Hrs Lab    3 Hrs Lecture            3 Hrs Lecture            All
                   Fall 09 (Instr. Smith)   Fall 09 (Instr. Smith)   Fall 08 (Instr. Smith)
Number Enrolled    34                       35                       109                      178
Mean Final Grade   2.4                      2.3                      1.9                      2.1
HS GPA             3.4                      3.4                      3.1                      3.2
With HS GPA (%)    100                      91.4                     94                       95
SAT Math           563                      552                      532                      542
SAT Verbal         521                      529                      510                      516
20
Comparison of Inputs for Students in Two
Classes with Different Teaching/Learning
Strategies (Additional Control Group)

                              2 Hrs Lect./2 Hrs Lab    3 Hrs Lecture            3 Hrs Lecture            All
                              Fall 09 (Instr. Smith)   Fall 09 (Instr. Smith)   Fall 08 (Instr. Smith)
Number Enrolled               34                       35                       109                      178
Mean Final Grade              2.4                      2.3                      1.9                      2.1
SAT Math                      563                      552                      532                      542
ACT Math                      23                       24                       22                       23
ALEX Math Placement Score     53.7                     54.6                     54.4                     54.3
Has Math Placement Score (%)  94                       89                       84                       87
21
Research Design Examples: Overview
  • Notation: X, O, R
  • Experimental Design
  • Pre-Experimental Design and its problems in
    educational research:
  • 1. Threats to internal validity (Is X really
    having an effect?)
  • 2. Threats to external validity
    (generalizability)

22
Research Design Examples: Quasi-Experimental
Designs Versus Pre-Experimental Designs
  • Quasi-experimental designs give better answers:
  • 1. Better solutions to internal validity
    threats (Is X really having an effect?)
  • 2. Better solutions to external validity
    threats (generalizability)

23
Notation on Diagrams
  • An X will represent the exposure of a group to an
    experimental variable or teaching method, the
    effects of which are to be measured.
  • O will refer to an observation or measurement.
  • R refers to random assignment of subjects to
    groups.

24
Research Design
  • How Quasi-experimental Design helps to solve the
    problems of Pre-experimental Design

25
Experimental Designs
  • Pretest-Posttest Control Group Design
  • Random assignment to two groups (a simulated
    sketch of this design follows)
  • R  O  X  O
  • R  O     O
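To make the notation concrete, the sketch below simulates this design end to end: R is the random assignment, the two O columns are the pretest and posttest, and X is a treatment effect added to one group. All numbers (group size, effect size, noise) are invented for illustration; numpy and scipy are assumed to be available.

```python
# Minimal sketch: simulate R O X O / R O O and test the treatment
# effect on gain scores. All parameters are invented for illustration.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
n = 35                                   # per group
pre = rng.normal(60, 10, size=2 * n)     # O: pretest for everyone
assign = rng.permutation(2 * n) < n      # R: random assignment
effect = 5.0                             # X: assumed treatment effect
post = pre + rng.normal(3, 8, size=2 * n) + effect * assign

gain = post - pre
t, p = ttest_ind(gain[assign], gain[~assign], equal_var=False)
print(f"mean gain (treated) = {gain[assign].mean():.1f}")
print(f"mean gain (control) = {gain[~assign].mean():.1f}")
print(f"Welch t = {t:.2f}, p = {p:.4f}")
# Randomization (R) is what lets the gain-score difference be read as
# the effect of X rather than of selection.
```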

26
Experimental Designs
  • Pretest-Posttest Control Group Design
  • R  O  X  O
  • R  O     O
  • Sources of Invalidity
  • External
  • Interaction of Testing and X
  • Interaction of Selection and X ?
  • Reactive Arrangements ?

27
Experimental Designs
  • Posttest-Only Control Group Design
  • R  X  O
  • R     O

28
Experimental Designs
  • Posttest-Only Control Group Design
  • R  X  O
  • R     O
  • Sources of Invalidity
  • External
  • Interaction of Selection and X ?
  • Reactive Arrangements ?

29
Experimental Designs
  • Solomon Four-Group Design
  • R  O  X  O
  • R  O     O
  • R     X  O
  • R        O
  • Sources of Invalidity
  • External
  • Interaction of Selection and X ?
  • Reactive Arrangements ?

30
Pre-Experimental Designs
  • One-Shot Case Study
  • X O
  • Sources of Invalidity
  • Internal
  • History
  • Maturation
  • Selection
  • Mortality
  • External
  • Interaction of Selection and X

31
Pre-Experimental Designs
  • One-Group Pretest-Posttest Design
  • O X O
  • Sources of Invalidity
  • Internal
  • History
  • Maturation
  • Testing
  • Instrumentation
  • Interaction of Selection and Maturation, etc.
  • Regression ?
  • External
  • Interaction of Testing and X
  • Interaction of Selection and X
  • Reactive Arrangements ?

32
Pre-Experimental Designs
  • Static-Group Comparison
  • X  O
  •    O
  • Sources of Invalidity
  • Internal
  • Selection
  • Mortality
  • Interaction of Selection and Maturation, etc.
  • Maturation ?
  • External
  • Interaction of Selection and X

33
Threats to Internal Validity
  • History: the specific events occurring between
    the first and second measurement in addition to
    the experimental variable.
  • Maturation: processes within the respondents
    operating as a function of the passage of time
    per se (not specific to the particular events),
    including growing older, growing hungrier,
    growing more tired, etc.
  • Testing: the effects of taking a test upon the
    scores of a second testing.

34
Threats to Internal Validity
  • Instrumentation: changes in the calibration of a
    measuring instrument, or changes in the observers
    or scorers used, may produce changes in the
    obtained measurements.
  • Regression: operates where groups have been
    selected on the basis of their extreme scores.

35
Threats to External Validity
  • Interaction of Testing and X: a pretest might
    increase or decrease the respondents' sensitivity
    or responsiveness to the experimental variable,
    making the results obtained for a pretested
    population unrepresentative of the unpretested
    universe from which the respondents were
    selected.
  • Interaction of Selection and X

36
Threats to External Validity
  • Reactive Arrangements: these would preclude
    generalization about the effect of the
    experimental variable upon persons being exposed
    to it in nonexperimental settings.
  • Multiple-X Interference: this is likely to occur
    whenever multiple treatments are applied to the
    same respondents, because the effects of prior
    treatments are not usually erasable.

37
Threats to Internal Validity
  • Selection: biases resulting in differential
    selection of respondents for the comparison
    groups.
  • Mortality: differential loss of respondents from
    the comparison groups.
  • Interaction of Selection and Maturation, etc.:
    effects which, in certain of the multiple-group
    quasi-experimental designs, might be mistaken for
    the effect of the experimental variable.

38
Quasi-Experimental Designs
  • Nonequivalent Control Group Design
  • O  X  O
  • O     O

39
Quasi-Experimental Designs
  • Nonequivalent Control Group Design: Comparing
    Math Classes Example (an analysis sketch follows)
  • O  X  O
  • O     O
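Without randomization, the standard analysis adjusts the second O for the first. The sketch below shows one conventional approach (ANCOVA via ordinary least squares in statsmodels), not the presenters' actual analysis: regress the posttest on a treatment indicator plus the pretest, and read the treatment coefficient as the pretest-adjusted group difference. All data are simulated.

```python
# Minimal sketch: ANCOVA-style adjustment for a nonequivalent control
# group design (O X O / O O). Data are simulated; this is NOT the
# presenters' own analysis, just one standard way to analyze it.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_lab, n_lec = 34, 35
pre = np.concatenate([rng.normal(59, 12, n_lab),    # lab-section pretest
                      rng.normal(59, 12, n_lec)])   # lecture pretest
treat = np.r_[np.ones(n_lab), np.zeros(n_lec)]      # 1 = lecture + lab
post = 10 + 0.8 * pre + 4.0 * treat + rng.normal(0, 8, n_lab + n_lec)

X = sm.add_constant(np.column_stack([treat, pre]))
fit = sm.OLS(post, X).fit()
print(fit.params)  # [intercept, adjusted treatment effect, pretest slope]
# The treatment coefficient is the group difference after adjusting for
# pretest differences -- the main defense against selection bias when
# intact classes are compared.
```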

40
Quasi-Experimental Designs
  • Nonequivalent Control Group Design
  • O  X  O
  • O     O
  • Sources of Invalidity
  • Internal
  • Interaction of Selection and Maturation, etc.
  • Regression ?
  • External
  • Interaction of Testing and X
  • Interaction of Selection and X ?
  • Reactive Arrangements ?

41
Comparing Math Strategies: First
Observation/First Test, Pre-Calculus

                          Lecture 2 Hrs./Hands-On   Lecture 3 Hrs.
                          Computer Lab 2 Hrs.
Mean Grade                58.97                     59.44
Median Grade              59                        60.5
Lowest Grade              25                        15
Highest Grade             100                       90
Confidence Level (95.0%)  6.37                      6.59
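The "Confidence Level (95.0%)" row is the Excel-style half-width of a 95% confidence interval for the mean, t* × s/√n. The sketch below reproduces it with scipy; the standard deviations (about 18.3 and 19.2) are assumptions back-solved from the reported half-widths, since the slide does not report them.

```python
# Minimal sketch: reproduce the Excel-style "Confidence Level (95.0%)"
# value, i.e. the half-width of a 95% CI for the mean: t* x s / sqrt(n).
# The standard deviations below are ASSUMED (back-solved to roughly
# match the slide), since only the half-widths were reported.
import math
from scipy.stats import t

def ci_half_width(s, n, level=0.95):
    """Half-width of a two-sided confidence interval for a mean."""
    t_star = t.ppf((1 + level) / 2, df=n - 1)
    return t_star * s / math.sqrt(n)

print(ci_half_width(s=18.3, n=34))  # ~6.37 (lecture + lab section)
print(ci_half_width(s=19.2, n=35))  # ~6.59 (lecture-only section)
```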
42
Examples of Other Quasi-Experimental Designs
  • Time Series
  • O  O  O  O  X  O  O  O  O
  • Multiple Time Series
  • O  O  O  O  X  O  O  O  O
  • O  O  O  O     O  O  O  O
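One common way to analyze a time-series design is segmented regression: fit the pre-intervention level and trend, then test for a shift after X. The sketch below is an illustration, not part of the presentation; it simulates eight observations with numpy and estimates the level shift by least squares.

```python
# Minimal sketch: segmented regression for an interrupted time series
# (O O O O X O O O O). Data are simulated for illustration.
import numpy as np

rng = np.random.default_rng(2)
t_idx = np.arange(8)                  # 4 observations before X, 4 after
after = (t_idx >= 4).astype(float)    # 1 once the intervention starts
y = 60 + 0.5 * t_idx + 6.0 * after + rng.normal(0, 1.5, 8)

# Columns: intercept, underlying trend, post-intervention level shift
X = np.column_stack([np.ones(8), t_idx, after])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated level shift at X: {coef[2]:.2f}")
# With so few observations, judging the shift against the pre-X trend
# (rather than a single pretest) is the design's main advantage.
```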

43
Quasi-Experimental Designs
  • Time Series
  • O  O  O  O  X  O  O  O  O
  • Sources of Invalidity
  • Internal
  • History
  • Instrumentation ?
  • External
  • Interaction of Testing and X
  • Interaction of Selection and X ?
  • Reactive Arrangements ?

44
U.S. Dept. of Ed. Focuses on Level of
Evidence
  • The U.S. Department of Education highlights What
    Works in educational strategies.
  • What works is determined by assessing the level
    of evidence provided by educational evaluation
    research.

45
Dept. of Education Evaluates Evidence
46
General Education Learning Outcomes
Assessment: The National Context
  • At the National Symposium on Student Success,
    Secretary of Education Margaret Spellings and
    others called on colleges to measure and provide
    evidence of student learning.
  • Measuring Up (national report cards by state):
    little data on whether students are learning.
  • Outcomes assessment has two purposes:
  • Accountability (standardized national tests?)
  • Assessment/Effectiveness:
  • Are students learning? How much?

47
Performing Assessment as Research in Practice
  • Assessment should seek systematic evidence of
    the effectiveness of existing programs,
    pedagogies, methodologies, and approaches to
    improve student learning outcomes and instill a
    cycle of continuous improvement.
  • Implementation strategy: aim for
    quasi-experimental designs (or experimental
    designs).

48
Revitalizing Assessment: Consider These
Next Steps
  • 1. Academic leadership needed: work with
    academic coordinators and chairs interested in
    improving outcomes
  • 2. Encourage academic action research: outcomes
    evaluation research and field experiments to
    compare learning outcomes for different pedagogies

49
Revitalizing Assessment: Consider These
Next Steps
  • 3. Encourage comparing teaching strategies when
    faculty are teaching more than one section of the
    same course
  • 4. Provide analytic support for academic
    coordinators, faculty, and departments engaged in
    outcomes evaluation research

50
Revitalizing Assessment: Consider These Next
Steps
  • 5. Encourage enthusiasm and excitement (e.g.,
    faculty mini-grants, recognition)
  • 6. Communicate and Use Results

51

Acknowledgements
  • Dr. Roger Goldwyn, Director of the Math General
    Education Program, Florida Atlantic University,
    Boca Raton, FL
  • Dr. Kevin Doherty, Database Administrator,
    Institutional Effectiveness and Analysis, Florida
    Atlantic University, Boca Raton, FL

52

QUESTIONS? Please email gwisan@fau.edu