Assessment at small colleges and universities: Student learning as the anchor in the seas of change

1
Assessment at small colleges and universities
Student learning as the anchor in the seas of
change
  • Carleen Vande Zande
  • Sheryl Ayala
  • Marian College

2
Discussion Overview
  • 1. How do current campus-wide initiatives support
    student learning?
  • 2. How is a shared vision for student learning
    created and maintained at the campus level?
  • 3. What are the supports and barriers to a shared
    commitment of student learning?
  • 4. Is there a mechanism to coordinate and
    evaluate assessment on your campus?

3
Sea of change: new perspectives
  Traditional:
  • Fragmented approach
  • Periodic
  • Episodic assessments
  • More summative
  • Limited formats
  • Externally motivated
  • Course quality based
  Emerging:
  • Coherent approach
  • Continuous
  • Linked to standards
  • More formative
  • Multiple formats
  • Internally motivated
  • Student performance based

4
Problems and Pitfalls
  • Culture of institution not focused on student
    learning.
  • Changes are not linked to student performance.
  • Changes not linked to institutional planning or
    budget cycle.
  • Results of assessment not linked to program
    review.
  • Changes made in isolation.
  • Faculty not recognized for assessment efforts.

5
Barriers to effective assessment
  • Assessments don't link to stated outcomes.
  • Goals and outcomes are not clearly stated.
  • Lack of articulation with state/professional
    standards.
  • Expectations are not shared.
  • Outcomes are not expressed in terms of what
    students can do.

6
Barriers
  • Lack of multiple measures
  • Assessments are often too specific and do not
    cut across programs
  • Lack of continuity of assessments over students'
    programs
  • Unclear assessment benchmarks
  • Assessments are limited in scope and format
  • Reliability challenges
  • Validity: drawing logical and reasonable
    inferences about learning

7
Barriers
  • Lack of supporting structures/procedures
  • Unclear methods of using, storing, and reporting
    student assessment results
  • Feedback loop not defined for interpreting or
    sharing results
  • No review of the assessment system itself
  • Change not based on or linked to results

8
Supports: Collaboration in efforts to focus on
student learning
  • Build a shared language about assessment
  • Collaboration between our campus and the HLC
    (Higher Learning Commission) to examine
    assessment characteristics across campus and
    answer some of these questions

9
What is the anchor?
  • HLC statement: "The program to assess student
    learning should emerge from and be sustained by a
    faculty and administrative commitment to
    excellent teaching and effective learning."

10
How do we drop the anchor?
  • Provide explicit and public statements regarding
    the institution's expectations for student
    learning
  • Use the information gained from the systematic
    collection and examination of assessment data
    both to document and to improve student learning

11
What criteria do we use?
  • Shared understandings, guiding principles
  • A rating system based on varying levels of
    implementation and adherence to qualities

12
Characteristics of a Coordinated Assessment System
  • Structured
  • Systematic
  • Continuous
  • Shared
  • Coherent
  • Explicit

13
Structured
  • Clear decision points are identified
  • Criteria are well defined
  • Clear links to unit framework are evident

14
Systematic
  • Information gained from the systematic collection
    and examination of assessment data is recorded,
    reported and used for program improvement.
  • Periodic review of system itself
  • Active Assessment Committee as support

15
Continuous
  • Assessment system has elements that cut across
    programs
  • Guides the assessment of all students as they
    progress through a program
  • Occurs at key milestones in the program
  • Uses multiple measures over time

16
Shared
  • The plan is widely known, accepted and routinely
    updated
  • The plan is related to other planning and
    budgeting processes; it is institutionalized
  • Documented changes based on data

17
Coherent
  • The assessment program is linked to the
    commitment to excellent teaching and effective
    learning, and supports national and state
    standards
  • Clearly stated outcomes for all programs

18
Coherent
  • Feedback loop for information flow is clarified
    and linked to student learning
  • Assessment results are linked to program review
    process
  • Need for change is linked to actual data

19
Explicit
  • The assessment program provides explicit and
    public statements regarding the institution's
    expectations for student learning
  • Explicit objectives derived from goals that are
    publicly stated and linked to measures

20
Explicit
  • Shows clear evidence of student learning
  • Assessment information is available and
    distributed
  • Results are shared

21
Looking at our assessment system campus-wide
  • Measuring our progress using the HLC levels of
    implementation

22
What criteria do we use?
  • A system based on varying levels of
    implementation
  • Explicit roles and responsibilities across campus

23
Rationale
  • to measure their progress toward their goals for
    a successful assessment program at the
    institutional level by defining roles and
    responsibilities

24
Rationale
  • to identify the structural, procedural, and
    policy changes their institution needs in order
    for the institutional assessment program to
    become fully realized

25
Rationale
  • to carry out the agreed-upon changes that will
    serve to maintain existing positive attributes of
    the assessment program

26
Rationale
  • to determine how far they have come toward
    carrying on effective assessment programs at the
    academic program level

27
Rationale
  • to confirm or challenge the impressions held by
    the institution's constituents about the quality
    of their assessment program at the programmatic
    level (general education, the major, and graduate
    and professional degree programs)

28
Uses
  • to include in their Self-Study Reports a
    self-evaluation of both the assessment program
    for the institution as a whole and the assessment
    programs

29
A Campus Perspective: Lessons learned at Marian
College
  • Student learning as the anchor

30
Uses at Marian College
  • 1. One way is to use the plan as the basis for
    discussion across programs to raise awareness and
    to identify issues about assessment.
  • 2. A second way to use the document is to make
    it a template for constructing a similar matrix
    for your own institution.

31
Uses at Marian College
  • 3. A third way to use the document when you
    return to your campus is to create a two-part
    survey of your assessment program.

32
Uses at Marian College
  • 4. The plan can be used as a basis for strategic
    planning in assessment.
  • 5. The plan can be used as a budgeting framework
    for identifying support for assessment.
  • 6. The plan can be a planning guide for
    professional development opportunities for
    faculty.

33
Uses at Marian College
  • 7. The assessment characteristics of the plan
    can be the basis for the campus program approval
    or program review process.
  • 8. The components of the plan can be used as a
    reporting framework across campus.
  • 9. The assessment and General Education
    committees can be linked to best support a
    coherent vision of student learning.

34
Discussion
  • Problems and Pitfalls in assessment
  • Supports
  • Characteristics
  • Plan levels

35
Characteristics
  • Structured
  • Systematic
  • Continuous
  • Shared
  • Coherent
  • Explicit