
Assessing the Mission of Doctoral Research Universities
  • J. Joseph Hoey, Georgia Tech
  • Lorne Kuffel, College of William and Mary
  • North Carolina State University Workshop
  • October 30-31, 2003

Guidelines for This Presentation
  • Please turn off or silence your cell phones
  • Please feel free to raise questions at any time
    during the presentation; we will also leave time
    at the end for general discussion.
  • We are very interested in your participation.

  • Introduction and Objectives
  • Reasons for Graduate Assessment
  • Comparative Data Sources
  • Developing Faculty Expectations for Graduate
    Students
  • Principles of Graduate Assessment
  • Physics Case Study
  • Taking Assessment Online
  • Summary and Discussion

  • Articulate motivations for undertaking graduate
    assessment
  • Increase awareness of comparative data sources
  • Program Linkages for Graduate Assessment
  • Hands-on: develop faculty expectations for
    student competence; utilize diverse data sources
    to evaluate a graduate program's first assessment
    efforts; etc.

Why Assess Graduate Programs?
  • We are all interested in the quality and
    improvement of graduate education
  • To help satisfy calls for accountability
  • Accreditation requirements (e.g., SACS
    accreditation)
  • To change or improve an invisible system, one
    must first make it visible (Schilling and
    Schilling, 1993, p. 172)

Common Internal Reasons for Graduate Assessment
  • Program marketing
  • Meet short-term (tactical) objectives or targets
  • Meet long-term (strategic)
    institutional/departmental goals
  • Funded project evaluation (GAANN, IGERT)
  • Understand sources of retention/attrition among
    students and faculty

SACS Principles of Accreditation
  • Core Requirement 5: The institution engages in
    ongoing, integrated, and institution-wide
    research-based planning and evaluation processes
    that incorporate a systematic review of programs
    and services that (a) results in continuing
    improvement and (b) demonstrates that the
    institution is effectively accomplishing its
    mission.
SACS Principles of Accreditation
  • Section 3 Comprehensive Standards: Institutional
    Mission, Governance, and Effectiveness
  • 16. The institution identifies outcomes for its
    educational programs and its administrative and
    educational support services; assesses whether it
    achieves these outcomes; and provides evidence of
    improvement based on analysis of those results.

SACS Principles of Accreditation
  • Section 3 Comprehensive Standards: Standards
    for All Educational Programs
  • 12. The institution places primary
    responsibility for the content, quality, and
    effectiveness of its curriculum with the faculty.
  • 18. The institution ensures that its graduate
    instruction and resources foster independent
    learning, enabling the graduate to contribute to
    a profession or field of study.

SACS Accreditation
  • The intent of the SACS procedures is to stimulate
    institutions to create an environment of planned
    change for improving the educational process.

  • Much of the assessment literature employs a fair
    amount of industrial or business-speak
  • Feel free to develop and use your own terminology
  • Keep it consistent across the institution
  • Produce and maintain a glossary of terms

So What Do We Need to Do?
  • Do our departments have a clear mission?
  • Do we have departmental plans to evaluate the
    effectiveness of our degree programs?
  • Do our degree programs have clearly defined
    faculty expectations for students?
  • Are they published, and are they measurable or
    observable?
  • Do we obtain data to assess the achievement of
    faculty expectations for students?
  • Do we document that assessment results are used
    to change or sustain the excellence of program
    activities and further student gains in
    professional and attitudinal skills?

So What Do We Need to Do? (Cont.)
  • Based on assessment results, do we reevaluate the
    appropriateness of departmental missions as well
    as the expectations we hold for students?
  • The amount of work needed to satisfy
    accreditation requirements is proportional to the
    number of "No" responses to the above questions.

IE Chart
Needed to Succeed
  • The department should want to engage in this
    process
  • The department must use the information collected
  • The institution must use the information
  • Use participation in the process as part of
    faculty reviews

Focusing Efforts
  • It is important to achieve a strategic focus for
    the program: decide what knowledge, skills,
    abilities, and experiences should characterize
    students who graduate from our program

What is Important to Measure?
  • To decide this, it is first vital to ask:
  • What are our strong areas?
  • What are our limitations?
  • What do we want to accomplish in:
  • Education of students?
  • Research?
  • Service?

Purpose Statement (sample)
  • The Anthropology Department serves the
    institution by offering courses and scholarly
    experiences that contribute to the liberal
    education of undergraduates and the scholarly
    accomplishments of graduate students. Program
    faculty members offer courses, seminars, directed
    readings, and directed research studies that
    promote social scientific understandings of human
    cultures. The Department offers a bachelor's
    degree major and minor, an M.A. degree, and a
    Ph.D. degree.
Developing a Plan to Evaluate Degree Programs
  • How to start a departmental plan: top down or
    bottom up (Palomba and Palomba, 2001)
  • Top Down: As a group of scholars, decide what
    are the important goals or objectives for the
    program
  • Bottom Up: Identify the primary faculty
    expectations for student competence in core
    courses in the program and use this list to
    develop overarching expectations for student
    competence

Develop an Assessment Plan
  • Desirable characteristics for assessment plans
    (Palomba and Palomba, 1999):
  • Identify assessment procedures to address faculty
    expectations for student competence
  • Use procedures such as sampling student work and
    drawing on institutional data where appropriate
  • Include multiple measures
  • Describe the people, committees, and processes
    involved; and
  • Contain plans for using assessment information.

Words to Remember When Starting an Assessment
  • It may be best to tackle the modest objectives
    first
  • Assessment plans should recognize that students
    are active participants and share responsibility
    for their learning experience along with the
    faculty and administration.
  • It takes a long time to do assessment well, so
    be patient and be flexible.
  • The overriding goal is to improve educational
    programs, not to fill out reports or demonstrate
    compliance.
Use a Program Profile to get Started
  • Related to Operational Objectives

Data for Profiles
  • Admissions: applications, acceptance rates, and
    yield rates
  • Standardized Test Scores
  • Graduate Record Examination (GRE)
  • Graduate Management Admission Test (GMAT)
  • Law School Admission Test (LSAT)
  • Undergraduate GPA
  • Headcount or Major Enrollments (Full/Part-Time)
  • Degrees Awarded

Profiles (Cont.)
  • Formula Funding Elements when appropriate
  • Time-to-Degree and/or Graduation/Retention Rates
  • Support for Students (Type of Assistance)
  • Faculty Headcount (Full/Part, Tenure Status)
  • Faculty Salaries
  • Faculty Productivity or Workload Compliance
  • Research Proposals Submitted/Awarded
  • Research Award/Expenditure Dollars
  • Instructional and Research Facility Space

Comparative Data
  • Survey of Earned Doctorates (SED)
  • National Center for Educational Statistics (NCES)
    Institutional Postsecondary Educational Data
    System (IPEDS)
  • National Research Council (NRC) Reports
  • Higher Education Data Sharing Consortium (HEDS)
    Graduate Student Survey (GSS)
  • American Association of University Professors
    (AAUP) or College and University Professional
    Association (CUPA) Faculty Salary Surveys

SED Data
  • The SED is administered annually and has a very
    high annual response rate
  • Doctoral degrees awarded by broad field and
    subfield, by gender, racial/ethnic group, and
    citizenship
  • Institutional ranking by number of doctorate
    awards (top 20) by broad field and by
    racial/ethnic group
  • Time-to-Degree (three measures) by broad field,
    gender, racial/ethnic group, and citizenship

SED Data (Cont.)
  • Financial resources for student support by broad
    field, gender, racial/ethnic group, and
    citizenship
  • Postgraduate plans, employment, and location by
    broad field, gender, racial/ethnic group, and
    citizenship
  • Reports are available at http://www.norc.uchicago.

IPEDS Data
  • Fall enrollments by major field (2-digit CIP
    code) of study, race/ethnicity and citizenship,
    gender, attendance status (full/part-time), and
    level of student (undergraduate, graduate, and
    first professional)
  • The discipline field data are reported in even
    years only.
  • Annual degrees conferred by program (6-digit CIP
    code) or major discipline (2-digit CIP code),
    award level (associate degree, baccalaureate,
    master's, doctoral, and first professional),
    race/ethnicity and citizenship, and gender.
  • Reported annually

IPEDS Data (Cont.)
  • Useful for identifying peer institutions
  • Available at the IPEDS Peer Analysis System
  • These data are also published in the National
    Center for Education Statistics (NCES), Digest of
    Education Statistics

National Research Council
  • Research-Doctorate Programs in the United States
  • This information is dated (1982 and 1993), with a
    new study scheduled for 2004 (?).
  • Benefit is rankings of programs, but some
    critics suggest reputational rankings cannot
    accurately reflect the quality of graduate
    programs (Graham & Diamond, 1999)
  • The National Survey of Graduate Faculty measures:
  • Scholarly quality of program faculty
  • Effectiveness of program in educating research
    scholars and scientists
  • Change in program quality in last five years

Profile Comparison for History and Physics (NRC)
  • History department ranked 46.5
  • Physics department ranked 63
  • (Goldberger, Maher, and Flattau, 1995)

Describing Faculty Expectations for Students

Why Describe Faculty Expectations for Students?
  • To sustain program excellence and productivity
  • To give faculty feedback and the ability to make
    modifications based on measurable indicators
    rather than anecdotal evidence
  • To inform and motivate students
  • To meet external standards for accountability

What Are Our Real Expectations?
  • Read each question thoroughly. Answer all
    questions. Time limit: four hours. Begin.
  • MUSIC: Write a piano concerto. Orchestrate it and
    perform it with flute and drum. You will find a
    piano under your seat.
  • MATHEMATICS: Give today's date, in metric.
  • CHEMISTRY: Transform lead into gold. You will
    find a beaker and three lead sinkers under your
    seat. Show all work, including Feynman diagrams
    and quantum functions for all steps.
  • ECONOMICS: Develop a realistic plan for
    refinancing the national debt. Run for Congress.
    Build a political power base. Successfully pass
    your plan and implement it.

Steps to Describing Expectations - 1
  • Write down the result or desired end state as it
    relates to the program.
  • Jot down, in words and phrases, the performances
    that, if achieved, would cause us to agree that
    the expectation has been met.
  • Phrase these in terms of results achieved rather
    than activities undertaken.

Steps to Describing Expectations - 2
  • Sort out the words and phrases. Delete
    duplications and unwanted items.
  • Repeat the first two steps for any remaining
    abstractions (unobservable results) considered
    important.
  • Write a complete statement for each performance,
    describing the nature, quality, or amount we
    consider acceptable.
  • Consider the point in the program where it would
    make the most sense for students to demonstrate
    this performance.

Steps to Describing Expectations - 3
  • Again, remember to distinguish results from
    activities.
  • Test the statements by asking: "If someone
    achieved or demonstrated each of these
    performances, would we be willing to say the
    student has met the expectation?"
  • When we can answer yes, the analysis is finished.

Steps to Describing Expectations - 4
  • Decide how to measure the meeting of an
    expectation: can we measure it directly?
    Indirectly, through indicators?
  • In general, the more direct the measurement, the
    more content-valid it is.
  • For more complex, higher-order expectations, we
    may need to use indicators of an unobservable
    result.

Steps to Describing Expectations - 5
  • Decide upon a preferred measurement tool or
    student task.
  • Describe the expectation in terms that measure
    student competence and yield useful feedback.

Try it!
  • What Faculty Expectation? Our sample is this:
    "Graduates will be lifelong learners."
  • Decide: Under what condition? When and where will
    students demonstrate skills?
  • Decide: How well? What will we use as criteria?

Try it!
  • Under what condition?
  • Condition: Students will give evidence of having
    the ability and the propensity to engage in
    lifelong learning prior to graduation from the
    program

Try it!
  • How well? Specify performance criteria for the
    extent to which students:
  • Display a knowledge of current disciplinary
    professional journals and can critique them
  • Are able to access sources of disciplinary
    knowledge
    professional development activities
  • Other?

Principles of Graduate Assessment
  • Clearly differentiate master's- and
    doctoral-level expectations
  • Assessment must be responsive to the more
    individualized nature of graduate programs
  • Assessment of real student work is preferable
  • Students already create the products we can use
    for assessment!

Principles of Graduate Assessment (continued)
  • Use assessment both as a self-reflection tool and
    an evaluative tool
  • Build in feedback to the student and checkpoints
  • Use natural points of contact with administrative
    processes

Common Faculty Expectations at the Graduate Level
  • Students will demonstrate professional and
    attitudinal skills, including
  • Oral, written and mathematical communication
  • Knowledge of concepts in the discipline
  • Critical and reflective thinking skills
  • Knowledge of the social, cultural, and economic
    contexts of the discipline
  • Ability to apply theory to professional practice
  • Ability to conduct independent research

Common Faculty Expectations at the Graduate Level (continued)
  • Students will demonstrate professional and
    attitudinal skills, including
  • Ability to use appropriate technologies
  • Ability to work with others, especially in teams
  • Ability to teach others; and
  • Demonstration of professional attitudes and
    values, such as workplace ethics and lifelong
    learning

Areas and Linkage Points to Consider in Graduate Assessment
  • Deciding on what is important to measure
  • Pre-program assessment
  • In-program assessment
  • Assessment at program completion
  • Long-term assessment
  • Educational process assessment
  • Comprehensive assessment (program review)

Use Natural Linkage Points
  • Admission: use diagnostic exam or GRE subject
    test
  • Annual advising appointment/progress check
  • Qualifying/comprehensive exams: embed items
    relevant to program objectives
  • Thesis and dissertation: develop rubrics to rate
    multiple areas relevant to program objectives
  • Exit: exit interview or exit survey at thesis
    appointment, check-out, or commencement

Pre-Program Assessment
  • Re-Thinking Admissions Criteria (Hagedorn and
    Nora, 1997)
  • Problem: graduate persistence.
  • The GRE is only designed to predict first-year
    performance.
  • UG GPA and GRE are not measures of professional
    and attitudinal competency.
  • A variety of skills, talents, and experiences is
    necessary for success but not usually included in
    admissions criteria.
  • Evaluating the fit between the program and the
    student is important.

Other Pre-Program Assessment Tools
  • Portfolio and/or structured interviews featuring
  • Research interests and previous products
  • Critique of a report or research paper
  • Plan for a research project
  • Prior out-of-class experiences
  • Inventories to assess motivation, personality,
    fit to program

In-Program Assessment of Student Learning
  • Based on faculty expectations
  • Methods may include assessment of
  • Case studies, term papers, projects
  • Oral seminar presentations
  • Preliminary exams, knowledge in field
  • Research and grant proposals
  • Portfolios
  • Problem-Based Learning or Team projects
  • Input from advisors, graduate internship director

Assessment at Program Completion
  • Allows demonstration of synthesis of knowledge,
    skills and attitudes learned
  • Ideal comprehensive assessment point, but a
    sense of where the student began is desirable to
    assess change, growth, and value added
  • Qualitative analysis may be appropriate
  • Portfolio of research, scholarly products

Assessment at Program Completion (continued)
  • Methods may include assessment of
  • Thesis/dissertation oral defense
  • Professional registration or licensure exam
  • Published works, conference papers
  • Portfolio
  • Exit interview
  • Exit survey

Long-Term Assessment
  • Common sentiment: graduates can adequately
    self-assess the outcomes of their program only
    after they have been applying their skills for
    several years following graduation.
  • Pursuing long-term assessment, based on
    identified learning objectives, is an important
    component of a graduate assessment program.

Long-Term Assessment (continued)
  • AAU (1998): it is important to track graduates of
    post-baccalaureate programs
  • to gain information on expectations vs. learning
  • to gain data on outcomes and placement.
  • Other reasons: to keep them involved in the life
    of the school, to bring them back as speakers,
    mentors, advisory board members, and donors.

Long-Term Assessment (continued)
  • May include assessment of
  • Job placement and linkage to degree
  • Career success
  • Production of scholarly work
  • Evidence of lifelong learning
  • Awards and recognition gained
  • Participation in professional societies
  • Satisfaction with knowledge gained

Long-Term Assessment (continued)
  • Common Assessment Methods
  • Follow-up interviews, surveys or focus groups
  • Journal publications
  • Citation indices
  • Membership lists and papers presented in
    professional/disciplinary associations

Value of Assessing the Educational Process
  • Widely viewed as key to graduate retention
  • Helps understand the strengths and needs for
    improvement of graduate coursework, research
    experience, teaching experience, advising, and
    support services.
  • For environment and process assessment, see the
    Golde and Dore (2001) survey for the Pew
    Charitable Trusts.

Ways of Assessing the Educational Process
  • Graduate student advisory groups
  • Surveys of students, focus groups
  • Peer review of teaching
  • Institutional data: time to degree, graduation
    rates
  • Advising process
  • Mentoring process

Assessing the Mentoring Process
  • A primary graduate learning and professional
    enculturation process
  • Mentoring at UC Berkeley (Nerad and Miller):
  • All faculty advise individuals, but mentoring is
    the shared responsibility of all members of the
    department
  • Individual faculty mentors to students
  • Departmental seminars and workshops

Comprehensive Assessment Program Review
  • The combination of an internal self-study and an
    external review of the program by qualified
    faculty peers forms a very powerful and
    comprehensive assessment device.
  • Program review encompasses an examination of
    resources, processes, and student learning.

Program Review: Examples of Areas to Evaluate
  • Achievement of Faculty Expectations
  • communication skills appropriate to the
    discipline, professional and attitudinal
    competency, ability to conduct independent
    research, etc.
  • Processes
  • coursework, research opportunities, teaching,
    internships, comprehensive exams, theses, and
    time in residence
  • Resources (Profile)
  • faculty, students, library, instructional and lab
    space, financial support, extramural support, etc.

Putting the Pieces Together
  • Adapted from Baird (1996): a matrix of faculty
    expectations, linkage points to use in conducting
    assessment, and some possible methods to use.
  • Adapt for use by each department by inserting
    appropriate faculty expectations for each program.

Case Study
  • See case study handout
  • Doctoral program in Physics at Muggy Research
    University (MRU)
  • First time through their assessment process
  • Data in hand: what now?
  • You are the consultants!

Case Study Debriefing Questions
  • What do you see in the results?
  • What do you recommend?
  • What actions do they need to take?
  • In light of their mission, what should they do
    next time?

Taking Assessment Online
  • Georgia Tech's Approach: the Online Assessment
    Tracking System (OATS)

  • Annual Assessment Updates are a key piece in
    Tech's efforts to demonstrate compliance with
    SACS Principles of Accreditation.
  • The Annual Assessment Updates concept was
    generated by GT unit coordinators in 1998 as a
    way of documenting Tech's responsiveness to SACS
    recommendations regarding assessment practices.
  • Many people have requested that the process be
    moved to an online environment.
  • The online process provides structure, formalizes
    best practices in assessment of student learning,
    and thus facilitates demonstration of compliance.
  • SACS 2005 will be an electronic remote review.

Annual Assessment Update: Previous Method vs. New Method
  • What Did You Look At?
  • How Did You Look At It?
  • What Did You Find?
  • What Did You Do?

Feature Comparison
  • Old System:
  • Many different formats
  • Hard copy only
  • Difficult to track progress over time
  • Flexibility (but no consistency across Institute)
  • Difficult to provide feedback internally and to
    facilitate institutional sharing of good practices
  • OATS:
  • Consistent format
  • Database storage
  • Ability to track progress over time
  • Flexibility maintained
  • Process facilitates accreditation e-review
  • Easier to provide feedback; facilitates
    institutional sharing

OATS Application
  • Includes user ID/password logon
  • Web accessible from any location
  • Defined format structure: Objectives, Methods,
    Results, and Actions/Impact
  • Allows posting of formatted text (tables, charts,
    etc.)
  • Allows notes and written feedback
  • Review at School/Unit and College level keeps
    everyone in the loop
  • OATS production date: October 1
  • Assessment Updates due December 1 this year

Main Menu: Current Year and History
College Level: Ivan Allen College (example)
School Level: History, Technology & Society (example)
Degree Program Level: BS in HTS (example)
Summary
  • SACS requires assessment of graduate programs,
    research, and public service
  • Make it relevant to the program
  • Keep it simple and focused
  • Consider different assessments for each stage of
    student progress
  • Start now; it takes several years to fine-tune

  • See references in back of handout

Session Evaluation
  • What one aspect was the most useful to you?
  • What one aspect most needs improvement, and what
    kind of improvement?
  • Other suggestions?

Thank You!
  • Questions? Contact us!