Assessing Research-Doctorate Programs: A Methodology Study

1
Assessing Research-Doctorate Programs: A Methodology Study
2
Committee Task
  • Review and revise the methodology used to assess
    the quality and effectiveness of research
    doctoral programs.
  • Explore new approaches and new sources of
    information about doctoral programs and new ways
    of disseminating these data.
  • Recommend whether to conduct a full assessment
    using the methodology developed by the committee.

3
History of NRC Assessments
  • 1982 Assessment of Research-Doctorate
    Programs in the United States
  • Lyle V. Jones (Co-Chair)
  • Gardner Lindzey (Co-Chair)
  • 1995 Research-Doctorate Programs in the
    United States: Continuity and Change
  • Marvin L. Goldberger (Co-Chair)
  • Brendan Maher (Co-Chair)

4
Perceived Strengths of Prior NRC Assessments
  • Authoritative source
  • Comprehensive
  • Clearly stated methodology
  • Temporal continuity
  • Widely quoted and utilized

5
Perceived Weaknesses of Prior NRC Assessments
  • Spurious precision of program rankings
  • Confounding of research reputation and
    educational quality
  • Soft criteria for assessments of programs
  • Ratings based on old data

6
Weaknesses continued
  • Poor dissemination of results for some audiences
  • Taxonomy categories out of date
  • Validation of data inadequate

7
Design of the Methodology Study
  • Formation of a committee and definition of its
    tasks.
  • Panel meetings to define questions and discuss
    methodology. The four panels:
  • Taxonomy and interdisciplinarity
  • Quantitative measures
  • Student processes and outcomes
  • Reputation and data presentation
  • Pilot trials of the questionnaires and taxonomy.

8
Recommendations
  • Spurious precision issue
  • The committee recommends a new statistical
    methodology to make clear the probable range of
    ranking for each assessed academic unit.

9
Alternative Approach to Rankings to Convey
Rating Variability
  • Draw ratings at random.
  • Calculate a rating for that draw.
  • Repeat the process enough times to reach
    statistical reliability.
  • Present the distribution of ratings from all the
    draws (a minimal resampling sketch follows this
    list).
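
The slides do not spell out the committee's actual statistical procedure, so the following is only a minimal illustrative sketch, in Python, of how repeated random draws of rater scores could turn a single point ranking into a range of plausible ranks for each program. The program names and scores are invented for illustration.

```python
import random
import statistics

# Hypothetical rater scores for three programs (illustrative only;
# the study's real data and rating scale are not reproduced here).
ratings = {
    "Program A": [4.2, 3.9, 4.5, 4.0, 4.3],
    "Program B": [4.1, 4.4, 3.8, 4.2, 4.0],
    "Program C": [3.5, 3.7, 3.9, 3.6, 3.8],
}

def one_draw(ratings):
    """Resample each program's ratings with replacement and rank by the mean."""
    means = {
        name: statistics.mean(random.choices(scores, k=len(scores)))
        for name, scores in ratings.items()
    }
    ordered = sorted(means, key=means.get, reverse=True)
    return {name: rank + 1 for rank, name in enumerate(ordered)}

# Repeat the draw many times and collect how often each program
# lands at each rank.
n_draws = 10_000
rank_counts = {name: {} for name in ratings}
for _ in range(n_draws):
    for name, rank in one_draw(ratings).items():
        rank_counts[name][rank] = rank_counts[name].get(rank, 0) + 1

# Present the distribution of ranks each program received across all draws.
for name, counts in rank_counts.items():
    spread = ", ".join(f"rank {r}: {c / n_draws:.0%}" for r, c in sorted(counts.items()))
    print(f"{name}: {spread}")
```

The point is simply that reporting the distribution of ranks across many draws, rather than a single number, conveys how much a program's ranking could plausibly vary.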

10
(No transcript; image-only slide.)
11
Recommendations continued
  • Research versus education issue
  • Drop the reputational estimate of educational
    quality, since it is not independent of the
    reputational estimate of program quality.
  • Add quantitative indicators of educational
    offerings and outcomes.

12
Program Measures and a Student Questionnaire
  • Questions to programs
  • Size
  • Student characteristics and financing
  • Attrition and time to degree
  • Competing programs

13
Program Measures and a Student Questionnaire
continued
  • Questions to students in selected fields
  • Employment Plans
  • Professional Development
  • Program Environment
  • Infrastructure
  • Research Productivity (a sketch of these items
    as data records follows this list)
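
A minimal sketch of how the program-level and student-level questionnaire items listed on these two slides might be captured as data records; every field name below is a placeholder inferred from the slide bullets, not taken from the study's actual instruments.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ProgramResponse:
    """Program-level questionnaire items (field names are illustrative)."""
    program_name: str
    size: int                                # enrolled doctoral students
    student_characteristics: Dict[str, str]  # demographics, financing, etc.
    attrition_rate: float
    median_time_to_degree_years: float
    competing_programs: List[str] = field(default_factory=list)

@dataclass
class StudentResponse:
    """Student-level items for selected fields (field names are illustrative)."""
    employment_plans: str
    professional_development: str
    program_environment: str
    infrastructure: str
    research_productivity: str

# Example record with invented values.
example = ProgramResponse(
    program_name="Example Program",
    size=120,
    student_characteristics={"financing": "mixed RA/TA/fellowship"},
    attrition_rate=0.25,
    median_time_to_degree_years=5.8,
)
```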

14
Recommendations continued
  • Soft criteria issue
  • Add quantitative measures concerning research
    output, citations, student support, time to
    degree, etc.

15
Examples of Indicators
  • Publications per faculty member
  • Citations per faculty member
  • Grant support and distribution
  • Library resources (separating out electronic
    media)
  • Laboratory space
  • Interdisciplinary centers (a per-faculty
    computation sketch follows this list)
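
A minimal sketch of how per-faculty indicators of this kind could be computed from raw program counts; the program name and all numbers are invented for illustration.

```python
# Hypothetical raw counts for one program over a fixed window (illustrative only).
program = {
    "name": "Example Program",
    "faculty_count": 25,
    "publications": 180,
    "citations": 2400,
    "grant_dollars": 6_500_000,
}

def per_faculty(total, faculty_count):
    """Normalize a program-level total by the number of faculty."""
    return total / faculty_count if faculty_count else 0.0

indicators = {
    "publications_per_faculty": per_faculty(program["publications"], program["faculty_count"]),
    "citations_per_faculty": per_faculty(program["citations"], program["faculty_count"]),
    "grant_dollars_per_faculty": per_faculty(program["grant_dollars"], program["faculty_count"]),
}

for name, value in indicators.items():
    print(f"{name}: {value:,.1f}")
```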

16
Recommendations continued
  • Poor dissemination issue
  • Add analytic essays to the archival book output.
  • Add an updateable, current web output.
  • Add electronic assessment tools.
  • Add links from professional societies.

17
Recommendations continued
  • Taxonomy issue
  • Update the 1995 taxonomy.
  • State clear criteria.
  • Consult professional societies, administrators,
    and faculty.
  • Allow for two academic categories (rated
    programs and emerging fields).
  • Name subfields to help universities classify
    their programs.
  • Allow faculty to be in more than one program.
  • Include two sub-threshold humanities fields
    (classics and German) to maintain continuity; a
    small data-structure sketch follows this list.
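
A minimal sketch of one way such a taxonomy could be represented, with named subfields, a separate category for emerging fields, and faculty listed in more than one program; the field, subfield, and faculty names below are invented and are not the study's taxonomy.

```python
# Illustrative taxonomy: each rated field lists named subfields.
taxonomy = {
    "Neuroscience": ["Cellular Neuroscience", "Cognitive Neuroscience"],
    "Materials Science": ["Polymers", "Electronic Materials"],
}

# Emerging fields are tracked in a category separate from rated programs.
emerging_fields = ["Example Emerging Field"]

# Faculty may be listed in more than one program.
faculty_programs = {
    "A. Researcher": ["Neuroscience", "Materials Science"],
    "B. Scholar": ["Neuroscience"],
}

# Example: list the faculty associated with a given rated field.
neuroscience_faculty = [
    name for name, fields in faculty_programs.items() if "Neuroscience" in fields
]
print(neuroscience_faculty)
```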

18
Recommendations continued
  • Validation issue
  • Conduct pilot studies and institute checks, both
    by institutional respondents and by external
    societies.

19
Pilot Institutions
  • University of Maryland
  • Michigan State University
  • Florida State University
  • University of Southern California
  • Yale University
  • University of Wisconsin-Milwaukee
  • University of California, San Francisco
  • Rensselaer Polytechnic Institute

20
What's next
  • Obtain financing for the full study from both
    federal and foundation sponsors.
  • If funding is obtained:
  • The full study would begin in spring 2004.
  • Data collection in 2004/2005 for the previous
    academic year.
  • Final report in summer 2006.

21
Conclusion
  • The study that the Committee recommends is a BIG
    undertaking in terms of survey cost and the time
    of graduate programs and their faculty. Why is
    it worth it?
  • It will provide faculty, students, and those
    involved with public policy an in-depth look at
    the quality and characteristics of the programs
    that produce our future scientists, engineers,
    and those who help us understand the human
    condition.

22
Committee
  • Jeremiah Ostriker, Princeton, (Astrophysics),
    Chair
  • Elton Aberle, U. of Wisconsin (Ag)
  • John Brauman, Stanford U. (Chem)
  • George Bugliarello, PolyNY (Eng)
  • Walter Cohen, Cornell U. (Hum)
  • Jonathan Cole, Columbia U. (Soc Sci)
  • Ronald Graham, UCSD (Math)
  • Paul Holland, ETS (Stat)
  • Earl Lewis, U. of Michigan (History)
  • Joan Lorden, U. of Alabama-Birmingham (Bio)
  • Louis Maheu, U. de Montréal (Soc)
  • Lawrence Martin, SUNY-Stony Brook (Anthro.)
  • Maresi Nerad, U. Wash (Sociology Education)
  • Frank Solomon, MIT (Bioscience)
  • Catharine Stimpson, NYU (Hum)

23
Subcommittee Panels
  • STUDENT PROCESSES AND OUTCOMES
  • QUANTITATIVE MEASURES
  • TAXONOMY AND INTERDISCIPLINARITY
  • REPUTATIONAL MEASURES AND DATA PRESENTATION
  • Panel chairs
  • Joan Lorden (Chair), University of Alabama-Birmingham
  • Catharine Stimpson (Chair), New York University
  • Walter Cohen (Co-Chair), Cornell University
  • Frank Solomon (Co-Chair), Massachusetts Institute of Technology
  • Jonathan Cole (Co-Chair), Columbia University
  • Paul Holland (Co-Chair), Educational Testing Service

24
Additional Panel Members
  • STUDENT PROCESSES AND OUTCOMES
  • Adam Fagen, Harvard Univ. (Bioscience, grad.
    student)
  • George Kuh, Indiana Univ. (Education)
  • Brenda Russell, Univ. of Illinois-Chicago
    (Bioscience)
  • Susanna Ryan, Indiana U. (English, Woodrow
    Wilson Fellow)
  • QUANTITATIVE MEASURES
  • Marsha Moss, Univ. of Texas (Institutional
    Research)
  • Charles E. Phelps, Univ. of Rochester (Provost,
    Econ.)
  • Peter D. Syverson, Council of Graduate Schools

25
Additional Panel Members
  • TAXONOMY AND INTERDISCIPLINARITY
  • Richard Attiyeh, UCSD (Econ.)
  • Robert F. Jones, AAMC (Bioscience)
  • Leonard K. Peters, VPI (Computer Science)
  • REPUTATIONAL MEASURES AND DATA PRESENTATION
  • David Schmidly, Texas Tech (President, Bioscience)
  • Donald Rubin, Harvard (Statistics)

26
Project website
  • http://www7.nationalacademies.org/resdoc/index.html