Responsive Evaluation in the Community College
An Alternative Approach to Evaluating Programs
  • Nathan R. Durdella, PhD
  • Monterey, California
  • April 10, 2006

Presentation Overview
Background, Design, Methods
Research Context and Problem
  • Increasing institutional and accreditation
    requirements to document student outcomes
  • Dominant model: systematic evaluation (Rossi, …)
  • Program objectives, outcomes
  • Alternative evaluation models
  • Recently used successfully (Shapiro, 1988)
  • Responsive evaluation

Evaluation Models: Systematic vs. Responsive
  • Stake's problem with systematic evaluation
  • Systematic evaluations narrow focus to assess
    program goals, measurements, and standards
    (Shadish et al., 1991)
  • Systematic evaluations best suited for summative
    purposes
  • Responsive evaluation's focus
  • The primary purpose should be to respond to
    audience requirements for information (Guba,
    1978, p. 34)
  • Process-oriented issues
  • Program implementation
  • Stakeholder-based
  • Locally generated criteria

Stake's Responsive Evaluation
  • Responsive evaluation's prescriptive steps:
  1. Program staff/participants are identified and
     solicited for those claims (Guba & Lincoln,
     1989, p. 42)
  2. Issues of program staff and participants are
     organized and brought to staff members for comment
  3. Issues not resolved are used as organizers
     for information collection (Guba & Lincoln,
     1989, p. 42)
  4. The evaluator approaches each audience member
     with the evaluation results to resolve all issues
Research Questions
  • Two research questions:
  • How effectively does responsive evaluation theory
    work as a way to evaluate instructional support
    programs?
  • How does responsive evaluation articulate with
    systematic evaluation approaches?

Research Design and Methods
  • Design: Comparative, qualitative case study
  • Case selection
  • Institutions
  • Cerritos College & Santa Ana College (HSIs)
  • Programs
  • Project HOPE & MESA
  • Data sources and sampling
  • Interviews and journals
  • Two-step procedure: purposeful and random
  • Data collection
  • Interviews: 19 total subjects, 23 total interviews
  • Per program: 3 students, 2 staff and 2 faculty,
    2-3 administrators
  • Program directors were interviewed 3 times

Results: Project HOPE & MESA
Results: Project HOPE
  • 1. Faculty resisted cultural pedagogy
  • Project HOPE faculty:
  • "It's a method of learning where you would
    approach your teaching looking at culture."
  • "They don't feel like it would have any impact on
    their students."
  • Faculty and administrators:
  • "We need to serve all of our students equitably."
  • "Well, we're not really a minority anymore."
  • 2. Campus did not value Project HOPE
  • Project HOPE staff:
  • "There are issues of, I'd say, with respect to
    this program and the college in general about the
    value of it, the need for it, because I think
    there's a prevailing thought that we do already
    all we can for students of color just by default,
    because we have such a diverse student population,
    to have programs like these."

Results: Project HOPE
  • Guidance counseling
  • "Well, now I know exactly what am I supposed to be
    taking for every, every semester and everything."
  • Parent, family participation
  • "My mom was telling my dad, 'We have to do our
    taxes because they have to file.' So now she
    knows what we're talking about when we have to do
    our financial aid paperwork."
  • Health Occupations 100 as central
  • "I definitely know I want to stay in L.A. and
    really serve those communities in need."
  • Program communication, coordination
  • "There was nothing said or nothing exchanged."
  • Lack of faculty buy-in, participation
  • "The only things I ever hear is, 'Why aren't we
    part of this?'"

Results: MESA Program
  • Major issue: Program impact
  • In general, MESA students outperform SAC
    math/science students
  • MESA staff central to students
  • "I know you really want to go, call me. If you
    can't make it, call me. If you can't come to
    class, tell me why. If you think you're doing
    bad in class, just talk to me. We can work
    something out."
  • Successful program coordination
  • "We have an organized system."

Results: MESA Program
  • Other emerging themes
  • Student finances: book loans & more
  • "I then use the money I saved to attend events
    sponsored by the Transfer Center."
  • MESA Study Center
  • "The MESA Study Center is a good place if one
    wants to share a friend's company and eat lunch
    while one studies."
  • Program focus: no parent participation
  • "A big obstacle for me as well was the lack
    of information available to my parents."
  • Course scheduling, engineering
  • "These classes are not offered every semester."

Findings & Conclusions
Findings: Responsive Evaluation
  • Ongoing programs, categorically funded or …
  • Program staff cooperation, participation
  • Program challenges, underlying problems
  • Program processes, improvement
  • Programmatic or institutional need
  • Not solely program impact

Further Findings: Responsive Evaluation
  • Politically charged context
  • Personality and power conflicts
  • Project HOPE: preexisting
  • UC's well-established MESA programs
  • Responsiveness: no assurance the model responds to
    all stakeholders
  • Identification, development of issues

Findings: Responsive & Systematic Models
  • Models articulate well
  • Project HOPE: prior evaluations vs. responsive
  • MESA: program impact
  • Results meaningful
  • Project HOPE: new face
  • But reinforce perceptions
  • MESA: few surprises, but useful
  • Student voices

Findings: The Responsive Evaluator
  • Initial phases: conditions present to conduct
    the evaluation
  • Balance between encouraging participation and
    maintaining control
  • Stakeholder-based models
  • Key: understanding programs as an insider while
    maintaining checks
  • Presentation of results is critical

Conclusions: Responsive Evaluation in the
Community College
  • Institutional charge: respond to students,
    faculty, staff, stakeholders
  • Responsive evaluation: a powerful tool for
    community college programs
  • Community colleges' limited resources
  • Research offices overburdened

Thank you for attending. Questions or comments?
  • Nathan R. Durdella, PhD
  • Cerritos College