1
Using Assessment Findings To Make a Difference At Two-Year Colleges
Southeastern Association for Community College Research
July 24, 2006, Nashville, Tennessee
Presented By Trudy W. Banta
Vice Chancellor, Planning and Institutional Improvement
Indiana University-Purdue University Indianapolis
355 N. Lansing St., AO 140
Indianapolis, Indiana 46202-2896
tbanta@iupui.edu
http://www.planning.iupui.edu
2
Organizational Levels for Assessment
  • National
  • Regional
  • State
  • Campus
  • College
  • Discipline

  • Classroom

  • Student

3
Outcomes Assessment
  • a process of assembling credible evidence of the
  • resources
  • processes
  • outcomes
  • of higher education, undertaken for the purpose of guiding improvement of programs, services, and learning.

4
Some Purposes of Assessment
  • 1. Students learn content
  • 2. Students assess own strengths
  • 3. Faculty improve instruction
  • 4. Institutions improve programs/services
  • 5. Institutions demonstrate accountability

5
Outcomes Assessment Requires Collaboration
  • In setting expected program outcomes
  • In developing sequence of learning experiences
    (curriculum)
  • In choosing measures
  • In interpreting assessment findings
  • In making responsive improvements

6
Organizing for Assessment
7
To Ensure That Concepts Are Taught
8
Hagerstown Community College
  • Focuses on SCANS skills (Secretary's Commission on Achieving Necessary Skills):
  • Allocates resources (time, money, people)
  • Works well with others
  • Acquires, organizes, communicates information
  • Monitors, designs, improves systems
  • Applies technology to specific tasks

9
Career Transcript
SCANS Skill    | Source of Information | Level of Performance
Allocates time | Observation           | Prepares and organizes multiple schedules
10
  • Direct Measures of Learning
  • Assignments, exams, projects, papers
  • Indirect Measures
  • Questionnaires, inventories, interviews
  • - Did the course cover these objectives?
  • - How much did your knowledge increase?
  • - Did the teaching method(s) help you learn?
  • - Did the assignments help you learn?

11
Select or Design Assessment Methods
  • 1. Match with goals
  • 2. Use multiple methods
  • 3. Combine direct and indirect measures
  • 4. Combine qualitative and quantitative
    measures
  • 5. Consider a pre-/post-test design to assess gains (see the sketch after this list)
  • 6. Use built-in points of contact with students
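Since item 5 mentions pre-/post-test designs, here is a minimal sketch of how gains from such a design might be summarized; the scores and the simple effect-size index are illustrative assumptions, not data from the presentation.

```python
# Hypothetical pre-/post-test scores for one course section; all numbers are
# illustrative, not data from the presentation.
pre_scores = [52, 61, 48, 70, 65, 58, 74, 63]
post_scores = [60, 72, 55, 78, 70, 66, 80, 69]

def mean(values):
    return sum(values) / len(values)

def pooled_sd(a, b):
    # Pooled standard deviation (equal group sizes) for a simple effect size.
    def var(v):
        m = mean(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)
    return ((var(a) + var(b)) / 2) ** 0.5

gain = mean(post_scores) - mean(pre_scores)
effect_size = gain / pooled_sd(pre_scores, post_scores)  # Cohen's d-style index
print(f"Average gain: {gain:.1f} points; effect size: {effect_size:.2f}")
```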

12
  • The Press to Assess
  • with a
  • TEST

13
TN Most Prescriptive (5.45% of Budget for
Instruction)
  • Accredit all accreditable programs (25%)
  • Test all seniors in general education (25%)
  • Test seniors in 20% of majors (20%)
  • Give an alumni survey (15%)
  • Demonstrate use of data to improve (15%)
  • ___
  • 100% (a worked allocation sketch follows)
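As a rough illustration of how the weighted criteria above could translate into dollars, the sketch below applies the slide's 5.45% supplement and point weights to a hypothetical instructional budget; the budget figure and the points earned are assumptions.

```python
# Worked sketch of Tennessee-style performance funding. The 5.45% supplement
# and the criterion weights come from the slide; the budget and points earned
# are assumed for illustration.
instructional_budget = 20_000_000
supplement_pool = 0.0545 * instructional_budget

weights = {
    "accredit all accreditable programs": 25,
    "test all seniors in general education": 25,
    "test seniors in 20% of majors": 20,
    "give an alumni survey": 15,
    "demonstrate use of data to improve": 15,
}
earned = dict(weights)                              # assume full marks everywhere...
earned["demonstrate use of data to improve"] = 10   # ...except one criterion, to show the effect

share = sum(earned.values()) / sum(weights.values())
print(f"Earned {sum(earned.values())}/100 points -> "
      f"${share * supplement_pool:,.0f} of a possible ${supplement_pool:,.0f}")
```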

14
What I Learned in TN
  • We had to require seniors to test
  • You can't MAKE a student do good work
  • 1 score point cost $60,000
  • No test is that reliable (see the sketch below)
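One way to see why "no test is that reliable": a single score point is typically smaller than the test's standard error of measurement. The reliability and score spread below are assumed values for illustration, not figures from the talk.

```python
# Standard error of measurement: SEM = SD * sqrt(1 - reliability).
# Both inputs are assumed values.
reliability = 0.90   # a strong reliability coefficient for a published test
score_sd = 5.0       # standard deviation of scores, in score points

sem = score_sd * (1 - reliability) ** 0.5
print(f"SEM is about {sem:.2f} score points, larger than a 1-point swing")
```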

15
Other Concerns
  • Mix of majors
  • Mix of faculty expertise
  • transfers
  • adult learners
  • Time to degree

16
Concerns about Value Added
  • Computing value added without SAT/ACT scores
  • Validity of SAT/ACT for adults
  • Test content favors some majors
  • How conscientious is performance (pre- and post-test)?
  • How quickly are individual scores available?
  • Consequences of a low score?

17
Do We Have to Use a Test?
  • May not be a test
  • Test covers material we don't teach
  • Test covers too little of what we do teach

18
Institutional Comparisons
  • Selective private: parents have degrees, books, computers
  • Open-access urban public: a parent may be homeless; the student may have a GED, work, have a family, or care for a sick parent
  • Both serve different missions

19
Best Predictors of Senior Scores
  • Entering freshman characteristics
  • BUT
  • Both will produce some
  • company presidents
  • good parents
  • convicted felons

20
Measurement Issues
  • We need reliable, valid measures of
  • generic skills
  • disciplinary knowledge and skills
  • performance (application of skills)
  • value added by education (see the sketch after this list)
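Since the best predictors of senior scores are entering freshman characteristics, one common (though contested) way to approximate "value added" is to compare each senior's score with the score predicted from an entry measure. The sketch below is a hedged illustration with made-up numbers, not the method used in any state program described here.

```python
# Illustrative value-added estimate: residuals from a simple linear regression
# of senior exam scores on an entering test score. All data are invented.
entry = [18, 21, 24, 27, 30, 22, 25, 19]
senior = [150, 158, 166, 171, 180, 162, 168, 155]

n = len(entry)
mean_x = sum(entry) / n
mean_y = sum(senior) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(entry, senior))
         / sum((x - mean_x) ** 2 for x in entry))
intercept = mean_y - slope * mean_x

# A positive residual means scoring above what entering characteristics predict.
for x, y in zip(entry, senior):
    predicted = intercept + slope * x
    print(f"entry {x}: senior {y}, predicted {predicted:.1f}, residual {y - predicted:+.1f}")
```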

21
  • "In 21st century learning environments, decontextualized, drop-in-from-the-sky assessments consisting of isolated tasks and performances will have zero validity as indices of educational attainments. High-stakes tests will fail badly (on standards of) validity."
  • – James W. Pellegrino (2004)
  • Angoff Memorial Lecture

22
Start with Measures You Have
  • Assignments in courses
  • Course exams
  • Work performance
  • Records of progress through the curriculum

23
8 California Community Colleges
  • Compared CATs (classroom assessment techniques) vs. no CATs
  • Retention increased (up to 8%)
  • More A's, fewer D's and F's
  • Females benefited most
  • Students more involved and satisfied
  • Teachers trained to use CATs are more effective
    overall

24
Longitudinal Tracking Enables institutions to
study continuity or change over time by following
cohorts of students as they move through college
  • Transcript Analysis
  • Produces a realistic picture of academic progress, including course-taking patterns, use of electives, and effectiveness of prerequisites and placement decisions (see the sketch after this list)
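A minimal sketch of longitudinal tracking from enrollment records, assuming a hypothetical table of student terms; the column names, cohorts, and retention definition (enrolled in more than one term) are illustrative choices, not any college's actual system.

```python
import pandas as pd

# Hypothetical enrollment records: one row per student per term enrolled.
records = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3, 4, 4, 4],
    "cohort":     ["F2004", "F2004", "F2004", "F2004", "F2005", "F2005", "F2005", "F2005"],
    "term":       ["F2004", "S2005", "F2004", "F2005", "F2005", "F2005", "S2006", "F2006"],
})

# Count a student as retained if they enrolled in more than one term.
per_student = (records.groupby(["cohort", "student_id"])["term"]
               .nunique()
               .gt(1)
               .rename("retained")
               .reset_index())

# Share of each entering cohort retained beyond its first term.
print(per_student.groupby("cohort")["retained"].mean())
```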

25
Community College of Denver
  • Implemented student tracking by program in 1989
  • Attrition highest for
  • Students not associated with a program
  • Developmental education students
  • Non-native English speakers
  • Improvements undertaken
  • Student entry procedures developed for every
    degree
    program
  • Basic skills labs in English and Math
  • Grants for faculty projects designed to
    reinforce basic skills
  • Collaboration strengthened between faculty and
    student services personnel

26
Dyersburg State Community College
  • Problem: Low graduation rates
  • Solutions:
  • Computerized degree monitoring system instituted (see the sketch below)
  • Faculty evaluation includes review of advisee progress
  • Students are counseled concerning progress
  • Result: 60% increase in number of graduates
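The slide does not say how the monitoring system works internally; as one hedged illustration, a degree-progress check might compare credits completed with the pace needed to finish and flag advisees who fall behind. The data structure, credit totals, and threshold below are assumptions.

```python
from dataclasses import dataclass

# Hypothetical advisee records; all names and numbers are illustrative.
@dataclass
class Advisee:
    name: str
    credits_completed: int
    terms_enrolled: int

REQUIRED_CREDITS = 60    # assumed associate-degree requirement
CREDITS_PER_TERM = 12    # assumed expected full-time pace

def behind_pace(a: Advisee) -> bool:
    expected = min(REQUIRED_CREDITS, a.terms_enrolled * CREDITS_PER_TERM)
    return a.credits_completed < expected

advisees = [Advisee("A. Smith", 38, 3), Advisee("B. Jones", 20, 3)]
for a in advisees:
    if behind_pace(a):
        print(f"{a.name} is behind pace; flag for advising and counseling")
```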

27
Butler County CC Learning Outcomes
  • P Personal Development Skills
  • (time management, teamwork)
  • A Analytical Skills
  • (critical thinking, problem solving)
  • C Communication Skills
  • (writing, speaking)
  • T Technological Skills

28
Butler Co. Community College
  • Faculty use
  • Course assignments
  • Standardized rubrics
  • Observation in and outside class
  • Interviews
  • to assess PACT skills

29
Butler Co. Community College
  • Students receive
  • Grades on their assignments
  • An individualized record of their levels of achievement on PACT skills

30
Butler Co. Community College
  • The Office of Assessment aggregates assessment data across students and across courses to detect strengths and weaknesses on PACT skills (see the sketch below)
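A minimal sketch of the kind of aggregation an assessment office might run: averaging rubric scores by skill across students and courses to surface relative weaknesses. The columns, courses, and 1-4 scale are assumptions, not Butler's actual data.

```python
import pandas as pd

# Hypothetical rubric scores (1-4 scale) by student, course, and PACT skill.
scores = pd.DataFrame({
    "student": ["s1", "s1", "s2", "s2", "s3", "s3"],
    "course":  ["ENG101", "BIO110", "ENG101", "BIO110", "ENG101", "BIO110"],
    "skill":   ["Communication", "Analytical",
                "Communication", "Analytical",
                "Communication", "Analytical"],
    "score":   [3, 2, 4, 2, 3, 3],
})

# Average and count by skill across all students and courses; low means flag
# candidate weaknesses to investigate at the program level.
by_skill = scores.groupby("skill")["score"].agg(["mean", "count"])
print(by_skill.sort_values("mean"))
```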

31
PRINCIPLES OF UNDERGRADUATE LEARNING (PULs)
  • Core communication and quantitative skills
  • Critical thinking
  • Integration and application of knowledge
  • Intellectual depth, breadth, and adaptiveness
  • Understanding society and culture
  • Values and ethics
  • Approved by IUPUI Faculty Council
    May 1998

32
Student Electronic Portfolio
  • Students take responsibility for demonstrating
    core skills
  • Unique individual skills and achievements can be
    emphasized
  • Multi-media opportunities extend possibilities
  • Metacognitive thinking is enhanced through
    reflection on contents
  • - Sharon J. Hamilton
  • IUPUI

33
Community College Student Experiences Questionnaire
  • (CCSEQ)
  • Course learning
  • Library usage
  • Interaction with faculty
  • Student acquaintances
  • Art, music, and theatre
  • Science activities
  • Vocational skills

34
College Student Experiences Questionnaire (sample
item) Library Experience
  • used library as quiet place to study
  • used online catalogue
  • asked librarian for help
  • read reserve book
  • used indexes to journal articles
  • developed bibliography
  • found interesting material by browsing
  • looked for further references cited
  • used specialized bibliographies
  • read document other authors cited

35
Santa Barbara City College
  • Problem: Need to improve retention and satisfaction with goal achievement
  • COMMUNITY COLLEGE STUDENT EXPERIENCES QUESTIONNAIRE
  • Findings: Strong relationships between involvement and both progress toward goals and satisfaction with instruction
  • Response: More extra-curricular activities, departmental clubs, space for group study

36
Dyersburg (TN) State Community College
  • Employer Focus Groups and CCSEQ
  • Improve writing and interaction skills
  • Writing assignments increased
  • Interaction emphasized
  • Now employers are more satisfied

37
Syllabus Analysis at Oakton Community College
  • Course syllabi fail to make clear:
  • required reading and writing
  • types of assignments and tests
  • dates of assignments and tests
  • definitions of class participation
  • components of final grade, with weights
  • Response: Faculty development on syllabus design

38
Nursing Program at Oakton Community College
  • Increased active learning in classes
  • Appointed advisors/mentors for groups of
    students who review student progress
    continuously
  • Taught test preparation
  • Results
  • Pass rate on national board exam: 55% to 95%
  • Attrition rate: 9%

39
Assessment in Dental Hygiene: 56 Competencies
(knowledge, skills, attitudes, communication
and psychomotor skills)
  • National board and state licensing exams
  • 5-hour performance assessment
  • OSCE (Objective Structured Clinical Examination: measure vital signs, interpret radiographs)
  • Communicate with a standardized patient
  • Design and explain a community oral health program
  • Interpret a research article
  • Solve an ethical dilemma
  • – Ann McCann, Baylor College of Dentistry

40
Results of Assessment in Dental Hygiene
  • Add to the curriculum
  • Patient assessment exercises
  • Experience in taking vital signs
    quarterly
  • -Ann McCann
  • Baylor College of Dentistry

41
Johnson County Community College
  • Student evaluations of instruction suggested
    poor test items in use
  • Faculty workshops on item writing/test
    development
  • Now student evaluations have improved
  • Academic program reviews revealed outdated course
    material
  • Faculty encouraged to renew course
    material
  • Now reviews are more positive

42
College of DuPage
  • Faculty review test scores and answer short
    questions about what should be done now.
  • Anonymous responses are posted on the college website.
  • Colleagues review the responses to determine what to do.

43
College of DuPage
  • Reading scores were low
  • Faculty in each division added a reading-related
    goal
  • Reading scores increased

44
Middle States Survey Spring 2005
  • BARRIERS TO ASSESSMENT
  • Campus culture resistant to change
  • Time to do it
  • Lack of leaders' involvement and commitment
  • -Linda Suskie

45
Strong Leadership
  • is needed from
  • Presidents
  • Provosts
  • Deans
  • Chairs
  • But what kind of leadership?

46
Interviews
  • 4 Presidents
  • 6 Provosts
  • 1 VP for Students
  • 2- and 4-year
  • public and private
  • nation-wide
  • – Banta, Effective Practices for Academic Leaders, Stylus (2006)

47
Collaboration
  • - using collective intelligence of
  • those around me (Brown)
  • - get to the grassroots to stimulate
  • growth (Miller)

48
Set a Vision
  • - overarching vision of the possible
  • (Malmberg)
  • - challenge prevailing assumptions . . .
  • set stretch goals (Schroeder)

49
Build Consensus
  • You can have all the transformative vision
  • in the world (and it will be worthless) if
  • you don't have a sufficient cohort of
  • people who buy into it
  • (Lazerson)

50
Communicate
  • - be a good listener
  • - provide information that sets a tone
  • and begins to establish a foundation
  • for trust
  • (Durrer)

51
Other Leadership Characteristics
  • - be extremely focused (Brown)
  • - be willing to compromise (Lazerson)
  • - use authority to make changes
  • (Doherty)

52
Why Assessment?
  • - Dissatisfaction with
  • levels of learning
  • evidence of learning
  • - Regional accreditation
  • - Disciplinary accreditation

53
Vision for Assessment
  • - a powerful force for change
  • - a tool for improving
  • curriculum
  • instruction
  • learning
  • - a way to determine which programs
  • meet current needs . . . prepare
  • students for jobs
  • (Durrer)

54
Stewardship
  • Do we care enough about our students to learn
  • how they experience the
  • university?
  • if we are meeting their needs?
  • (Schroeder)

55
Transactional Leadership
  • Bound by
  • tradition
  • rules
  • procedures

56
Transformational Leadership
  • - Provides intellectual stimulation and
  • inspiration
  • - Emphasizes
  • relationships
  • trust
  • innovation
  • -Bernard Bass (1999)

57
Assessment Leaders
  • Emphasize vision, mission, values
  • Set high expectations
  • Serve as role models
  • Motivate others to collaborate
  • Communicate
  • See challenges as opportunities
  • Support innovation
  • Build trust by using data to improve, not punish

58
Introduce Assessment
  • - Expand the group of believers
  • Meals with faculty (Lazerson)
  • Multiple choice tests (Schroeder)
  • Town hall meetings (Corts)
  • Invite student affairs to join faculty

59
Provide Oversight and Support
  • Campus-wide committee
  • Leader with release time
  • Office to provide data
  • Faculty-staff development
  • - travel to conferences
  • - on-campus workshops
  • - grants to support innovation

60
Provide Incentives
  • - Link with scholarship (Lohmann)
  • - Build into P&T (promotion and tenure) processes
  • - Provide merit pay (Malmberg)
  • - Establish awards
  • - Celebrate successes

61
USE DATA
  • We're driven by data (Lohmann)
  • Data determine what we fund (Brown)
  • We instinctively reach for data (Corts)

62
Sustaining Assessment
  • Committed president and provost
  • Engaged faculty
  • Core of believers and advocates using data
    (Miller)
  • Evidence that use of data CAN improve programs
    and services and be used in research and
    scholarship

63
Conclusions
  • Articulate vision to improve learning and use
    assessment to chart progress
  • Involve stakeholders in changing the culture
  • Offer support and incentives
  • Use data to guide improvements and communicate
    results to all