Assessing the Work of Higher Education: Institutional Effectiveness and Student Learning

Transcript and Presenter's Notes

Title: Assessing the Work of Higher Education: Institutional Effectiveness and Student Learning


1
Assessing the Work of Higher Education
Institutional Effectiveness and Student Learning
  • Dr. Jo Allen, Senior Vice President and Provost
  • Widener University
  • Middle States Commission on Higher Education,
    October 2008

2
Overview of Presentation
  • Operational Terms
  • Drivers of assessment
  • Assessment of institutional effectiveness
  • Assessment of student learning outcomes
  • Questions and concerns

3
Assessment: An Operational Definition
  • Assessment is the process of asking and answering
    questions that seek to align our stated
    intentions with documentable realities. As such,
    in higher education, it deals with courses,
    programs, policies, procedures, and operations.

4
Evaluation: An Operational Definition
  • Evaluation focuses on individual performance in
    the sense of task performance or job completion
    and quality, typically resulting in merit raises,
    plans for future improvement, or, in less
    satisfying cases, probation and possibly firing.

5
Assessment vs. Evaluation
  • Assessment focuses on the work to be done, the
    outcomes, and the impact on others: typically,
    the aggregate situation, not just the individuals.
  • Evaluation focuses on the work of the
    individuals: their contributions, effectiveness,
    creativity, responsibility, engagement, or
    whatever factors the organization deems most
    desirable.

6
Assessment vs. Evaluation
  • Assessment focuses on the work to be done, the
    outcomes, and the impact on others, not on the
    individuals doing the work.
  • Evaluation focuses on the work of the
    individuals: their contributions, effectiveness,
    creativity, responsibility, engagement, or
    whatever factors the organization deems most
    desirable.

7
Assessment of Institutional Effectiveness vs.
Student Learning
  • Institutional effectiveness: the results of
    operational processes, policies, duties, and
    sites, and their success in working together, to
    support the management of the academy
  • Student learning: the results of curricular and
    co-curricular experiences designed to provide
    students with knowledge and skills

8
What or who is driving assessment?
  • Accreditors
  • charged with distinguishing reputable
    institutions and programs from non-reputable ones
  • charged with ensuring that institutional
    practices support the viability and
    sustainability of the institution and its
    offerings
  • represent disciplinary and institutional
    interests

9
Assessment drivers (contd.)
  • The public: Ivory Tower, liberal bias,
    ratings/rankings
  • Legislators: responsive to citizens' concerns
    about quality, costs, biases ... or?
  • Prospective faculty: quality and meaningful
    contributions to students' lives
  • Prospective parents: real learning and
    preparation for careers
  • Prospective students: How will I measure up?
    And what kind of job can I get when I graduate?
  • Funding agencies/foundations: evidence of an
    institution's or faculty's commitment to learning
    and knowledge, and evidence of prior success

10
Matters of Institutional Quality
  • Can we justify costs/prices of attendance?
  • Can we verify the quality of our educational
    offerings in measurable terms?
  • Can we verify the effectiveness of operational
    contributors to a sustainable educational
    experience?
  • Can we use data and other findings to improve the
    quality of our educational and operational
    offerings?
  • Can we use those findings to align resources
    (financial, staff, curricular, co-curricular) to
    enhance desired outcomes?

11
Sites of Institutional Effectiveness
  • Processes: existence and transparency
  • Enrollment: admissions, financial aid,
    registration
  • Curricular: advising, progress toward degree
    completion
  • Budgeting: operations/salaries; capital; bond
    ratings and ratios; endowment management;
    benefits; etc.
  • Planning: strategic planning, compact planning,
    curricular planning, etc.
  • Judicial: education/training, communication,
    sanctions, etc.
  • Residence Life: housing selection, training for
    RAs, conflict resolution/mediation
  • Advancement: fund-raising, alumni relations,
    public relations, government/corporate relations,
    community relations, etc.

12
Sites of Institutional Effectiveness
  • Units/Offices of operations (samples)
  • Advancement
  • Admissions, Bursar, Registrar
  • Athletics
  • Center for Advising, Academic Support, etc.
  • Campus Safety
  • Institutional Research
  • IT
  • Maintenance

13
The Assessment Cycle: Key Questions for
Institutional Effectiveness
  • What services, programs, or benefits should our
    offices provide?
  • For what purposes or with what intended results?
  • What evidence do we have that they provide these
    outcomes?
  • How can we use information to improve or
    celebrate successes?
  • Do the improvements we make work?

14
What are we looking for?
  • EXAMPLES of EVIDENCE
  • Our admission of students for whom our
    institution is the first choice has risen 30%.
  • 95% of students report satisfaction with the
    housing selection process.
  • 5 faculty committees participated in the last
    planning cycle.
  • Overall, faculty, staff, and students report
    feeling safe on campus, following the new Campus
    Safety Improvement initiatives.

15
Where do we seek improvement and what evidence
will help us?
  • We need to raise the number of students who
    choose our institution as their first choice to
    95% by 2010.
  • All faculty committees will be invited to
    participate in the next planning meeting.
  • Students (39%) still report feeling unsafe in
    the mezzanine of the University Gallery. We
    will ...

16
The Iterative Assessment Cycle for
Institutional Effectiveness
(Cycle diagram) Mission/Purposes and Objectives/Goals →
Outcomes → Implement Methods to Gather Evidence →
Gather Evidence → Interpret Evidence → Make decisions
to improve programs, services, or benefits; contribute
to the institutional experience; inform institutional
decision-making, planning, budgeting, policy, and
public accountability → back to Mission/Purposes

17
What qualities point to institutional
effectiveness?
  • A well-articulated set of processes for critical
    functions
  • A clear line of responsibility and accountability
    for critical functions
  • An alignment between the importance of the
    function and sufficient resources (staff, budget,
    training, etc.) to support it
  • Evidence of institution-wide knowledge of those
    critical functions, processes, and lines of
    responsibility

18
What kinds of evidence point to institutional
effectiveness?
  • Well-managed budgets
  • Accreditation and governmental compliance
  • Clearly defined and supported shared governance
    (board, president, administration, faculty,
    staff, and students)
  • Communication pathways and strategies;
    transparency
  • Consensus on mission, strategic plan, goals,
    priorities, etc.
  • Student (and other constituencies') satisfaction

19
How do we measure institutional effectiveness?
  • Tangible evidence: audited budget statements,
    handbooks, enrollment data, institutional data
  • Records/reports of activities and/or compliance
  • Self-studies pointing to documented evidence
  • Surveys of satisfaction, usage, attitudes,
    confidence, etc.
  • Disciplinary accreditation reports

20
The Assessment Cycle: Key Questions for Student
Learning
  • What should our students know or be able to do by
    the time they graduate?
  • What evidence do we have that they know and can
    do these things?
  • How can we use information to improve or
    celebrate successes?
  • Do the improvements we make work?

21
The Iterative Assessment Cycle
(Cycle diagram) Mission/Purposes and Objectives/Goals →
Outcomes → Implement Methods to Gather Evidence →
Gather Evidence → Interpret Evidence → Make decisions
to improve programs; enhance student learning and
development; inform institutional decision-making,
planning, budgeting, policy, and public accountability →
back to Mission/Purposes

22
Student Learning Assessment
What should students know or be able to
demonstrate by the time they graduate?
  • Technological competence
  • Scientific competence
  • Research skills
  • Cultural competence
  • Interdisciplinary competence
  • Civic responsibility
  • Global competence
  • Economic/financial competence
  • Social justice
  • Civic engagement
  • Diversity appreciation
  • Communication skills
  • Professional responsibility
  • Ethics
  • Critical thinking
  • Collaborative learning
  • Leadership
  • Mathematical or Quantitative competence

23
What might our sources of evidence be?
  • Essays/Theses
  • Portfolios (evaluated by faculty or external
    readers)
  • Quizzes
  • Oral presentations
  • Homework assignments
  • Lab experiments
  • Tests
  • Journal entries
  • Projects
  • Demonstrations

24
What are we looking for?
  • Evidence of students' skill level (basic
    competency to mastery)
  • based on faculty-articulated standards of quality
    and judgments
  • applied evenly to all students' work
  • indicative of aggregate evaluations of
    performance or knowledge
  • informative for course or program improvements

25
Can we use the same processes and strategies to
assess both arenas?
  • Measuring learning versus effectiveness,
    efficiency, and/or satisfaction
  • BEYOND ANECDOTAL INTO EVIDENCE
  • Methods of testing, projects, demonstrations
    versus surveys, records, reports
  • QUALIFY OR QUANTIFY THE OUTCOMES
  • Use of results (revisions versus training versus
    expansion)
  • MODIFY WHAT YOU DO TO AFFECT OUTCOMES

26
What is similar?
  • A commitment to doing the very best job possible
    under whatever conditions exist
  • A commitment to recognizing ways that altering
    those conditions can affect the outcomes
  • A commitment to recognizing that altering the
    outcomes can affect the conditions

27
Ultimately ...
  • We hold ourselves and our colleagues accountable
    for articulating the intentions of our work,
    measuring the realities, and then designing and
    implementing strategies for improvement over time.
  • How are we doing?
  • How can we do better?

28
QUESTIONS?
  • Comments?