Measuring Up on College-Level Learning - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Measuring Up on College-Level Learning


1
Measuring Up on College-Level Learning
  • Margaret Miller, Project Director
  • September 2003

2
Measuring Up 2000
3
Learning in the States: Incomplete
  • [Add state map on Incomplete]

4
State Efforts to Measure Learning (taxonomy:
Peter Ewell, Change magazine)
  • Certification of individual students
  • E.g., Texas's TASP, Florida's CLAST
  • Institutional assessment for improvement
  • E.g., Tennessee's performance measures
  • Missouri's accountability program
  • Campus-based assessment
  • Institutional assessment for accountability
  • E.g., S. Dakota and Arkansas

5
National Attention to College-Level Learning
  • Pew's Quality of Undergraduate Education and
    writing assessment projects
  • American Association of Colleges and
    Universities' general education assessment
    project
  • Council for Higher Education Accreditation's
    project on institutional effectiveness
  • Secretary's Commission on Achieving Necessary
    Skills (SCANS) skills
  • Equipped for the Future
  • National Skills Standards Board

6
Key Questions
  • What do the states' college-educated citizens
    know and what can they do that contributes to the
    social good? What kind of educational capital do
    they represent?
  • and

7
Key Questions (cont.)
  • How well do the states' public and private,
    two- and four-year colleges and universities
    collectively contribute to that capital? What do
    those whom they educate know, and what can they
    do?

8
Key Decisions
  • Whose learning will we measure?
  • What learning will we measure?
  • How will we use the information?
  • What strategies will we pursue?

9
Whose Learning
  • The college-educated in the states
  • and
  • college students

10
What Learning
  • National Education Goal 6
  • By the year 2000, every adult American will be
    literate and will possess the knowledge and
    skills necessary to compete in a global economy
    and exercise the rights and responsibilities of
    citizenship

11
What Learning (cont.)
  • National Goal 6, objective for college education
  • By the year 2000, every adult American will be
    literate and will possess the knowledge and
    skills necessary to compete in a global economy
    and exercise the rights and responsibilities of
    citizenship

12
Policy Purposes
  • Higher education policy
  • and
  • K-12 education, economic development, and
    adult literacy policy

13
Direct Strategies
  • National Assessment of Adult Literacy
  • Graduate-admissions and licensing exams
  • General intellectual skills tests

14
National Assessment of Adult Literacy (NAAL): concludes 12/03
  • Disadvantages
  • Labor-intensive, expensive
  • Decadal federal survey -- timing
  • National sample only, except in 6 states
  • Not what colleges think they teach
  • Advantages
  • Advanced literacy levels are a good measure of
    educational capital
  • Assesses general population
  • Comparison group of non-college-educated
  • Household survey -- respondent motivation high

15
Existing Exams
  • Graduate-admissions exams
  • Dental
  • Graduate Management
  • Graduate Record
  • Law School
  • Medical College
  • Optometry
  • Pharmacy
  • Licensing exams
  • Clinical Pathology
  • Dental Hygiene
  • Occupational Therapy
  • Physical Therapy
  • Physician Assistant
  • Nursing
  • Respiratory Therapy
  • Teaching

16
Existing Exams: data gathered by 03/04
  • Disadvantages
  • Selection bias
  • Uneven coverage by discipline
  • Variable (and sometimes small) numbers of test-
    takers in each state
  • Most in health professions
  • Advantages
  • Established, credible instruments
  • Highly motivated test-takers
  • Admissions tests assess general intellectual
    abilities
  • Availability
  • Low cost


17
General Intellectual Skills Tests: administered fall 03
  • WorkKeys to a sample of two-year students in each
    state
  • Applied Math
  • Locating Information
  • Reading for Information
  • Business Writing
  • Collegiate Learning Assessment (CLA) to a sample
    of four-year students in each state

18
WorkKeys and CLA
  • Disadvantages
  • Institutional motivation
  • Test-taker motivation
  • Expense
  • Advantages
  • Excellent tests of general functional
    intellectual skills
  • Can impart useful information to student and
    school

19
Indirect Measures: NSSE/CCSSE co-administered with tests; CRS summer through fall 03
  • National Survey of Student Engagement (NSSE)
  • Community College Survey of Student Engagement
    (CCSSE)
  • College Results Survey (CRS)

20
Surveys
  • Disadvantages
  • Not direct learning measures
  • Not yet cross-correlated with direct measures
  • Advantages
  • Excellent and recently developed instruments
  • Process measure could lead to improvement
  • Both have face validity
  • Respondent motivation good

21
Challenges
  • Political instability in states (gubernatorial,
    SHEEO)
  • Personnel changes among key players
  • Institutional skepticism
  • Faculty resistance
  • Data-collection hurdles
  • Test-taker motivation

22
General Timeline
  • Measuring Up 2002: model tested with
    incomplete data from Kentucky
  • 2002-2004: five-state pilot to test the
    assessment model (IL, KY, NV, OK, SC)
  • Measuring Up 2004: publish the results of the
    pilot
  • Measuring Up 2006: if enough states adopt the
    model, grade states on learning

23
Reasons to Act
  • It is the right thing to do.
  • We can determine how to do it right.
  • This initiative will generate information useful
    to states, institutions, and students.
  • State-level analysis can promote collaborations
    to serve underachieving subpopulations or regions
    of the state.
  • State resources can be effectively targeted.

24
  • http://collegelevellearning.org