1
Tips for Writing SACSCOC Academic Program
Assessment Reports
  • Office of Planning, Institutional Research, and
    Assessment (PIRA)

Fall 2014
2
Relation Between Existing Assessment and SACSCOC
Reports
  • Ideally you already assess students' learning
  • Ideally you already improve your program to
    increase student achievement
  • Program Assessment Reports should describe these
    activities using SACSCOC guidelines and
    terminology
  • Data or other findings that measure student
    learning should be included, as should
    interpretation of findings
  • But don't create a special data collection process for SACSCOC; just summarize existing processes
  • Initiatives to improve should be included

3
Ensure that Reviewers Will See Clear Evidence
that You Have . . .
  • defined your desired mission, student learning outcomes (SLOs), and related measures,
  • collected and evaluated results from ongoing
    assessment,
  • undertaken actions to continuously improve
    learning.
  • Help reviewers find key components quickly and easily

[Diagram: the assessment cycle. Define SLOs & Measures → Collect Findings → Evaluate Results → Implement Change (Improve) → back to Define SLOs & Measures]
4
Use PIRA Checklist to Ensure Key Elements Are
Included
  • mission and program outcomes (objectives)
  • student learning outcomes (at least 3) and related measures (at least 2 each, 1 of which should be direct)
  • assessment findings: results for measures of student learning from multiple years (if feasible)
  • discussion of results: faculty review of findings, including whether performance of students meets expectations
  • discussion of changes: initiatives to improve student learning and/or the program
  • evidence that continuous improvement has occurred
  • clear narrative and organization to make
    compliance obvious (does everything make sense?)

5
Program Assessment at the University of Miami
[Diagram: University of Miami program assessment model (Office of Planning, Institutional Research, and Assessment, Rev 3-2013). The Mission Statement and Program Outcomes/Objectives feed Student Learning Outcomes 1-3; each SLO is tied to Assessment Measures. Direct measures (dark blue): capstone reviewed with a faculty-developed rating grid; exam questions that clearly relate to outcomes; Graduate School Dissertation & Thesis Rating Grid; other direct measures. Indirect measures (light blue): Graduating Student Surveys; Course Evaluations; other indirect measures. Assessment Findings (data for EACH measure for 2+ years) feed the Discussion for Continuous Improvement: Faculty Review (Do findings show continuous improvement?) and Program Improvement (What changes should be made?).]
  • A program should have 3-5 measurable outcomes, each tied to the program mission.
  • Student learning outcomes relate to attainment of
    knowledge, skills, behaviors, or values.
  • Common outcomes include knowledge of theory and
    research in the field, ability to think
    critically about the field of study, oral and
    written communication skills.
  • For each outcome, 2-3 measures are required; at least one must be a direct measure (direct = dark blue, indirect = light blue).
  • A single measure (e.g., rating grid) can assess
    more than one outcome.
  • Build operationally realistic assessments into
    your annual departmental calendar.
  • Assessment findings should assist in identifying
    areas for improvement within programs.
  • Identified and resolved changes should be
    reflected in the discussion section of reports to
    PIRA.

Your mission statement and program outcomes
(objectives) should align with the mission of the
University and your program's strategic plan.
6
When Writing Your Mission Statement You Should .
. .
  • tie it to the UM Mission: "The University of Miami's mission is to educate and nurture students, to create knowledge, and to provide service to our community and beyond. Committed to excellence and proud of the diversity of our University family, we strive to develop future leaders of our nation and the world."
  • and to your strategic plan
  • describe program outcomes/objectives (e.g., prepare graduates to . . ., teach gen-ed courses, research, service)

7
When Writing Student Learning Outcomes (SLOs) You
Should . . .
  • describe reasonable expectations for student learning (knowledge, skills, values, and behaviors)
  • include at least 3 SLOs, each with correct structure and language
  • make SLOs easy to identify (e.g., use bolding and numbering) and clearly stated (follow the expected structure)
  • Most common error: programs describe what they do.
  • Solution: describe what you want students to learn.

8
Structure of SLOs
  • Start with words like: "Students . . .", "Graduates . . .", "We want students to . . ."
  • Include verbs or phrases like: "will demonstrate", "should have the ability to", "will analyze and synthesize"
  • Include words like: "breadth of understanding of", "mastery of", "a capacity for"
  • Describe the expected competence (e.g., broad knowledge, communication, critical thinking)

9
Examples of Bad and Good SLOs

Instead of:
  "Help students develop research skills by providing opportunities for supervised laboratory practice."
write:
  "Graduates will demonstrate the ability to conduct laboratory research."

Instead of:
  "Students will participate in interpersonal, interpretative, and presentational communicative activities and be guided in the development of literacy skills in the language of study through the communicative acts of reading, writing, and creating discourse around texts of all types."
write:
  "Students will demonstrate literacy skills in the language of study through the communicative acts of reading, writing, and creating discourse around texts of all types."

10
Possible SLOs
  • Students should demonstrate an overall knowledge and understanding of the core concepts in [insert program here], including the essential skills to conduct research in [insert program here].
  • We want students to graduate with strong written
    and/or oral communication skills.
  • Our doctoral students should be able to conduct
    independent research worthy of publication.
  • Graduates should have an understanding and
    capability to work with the systems and hardware
    components that support software.
  • Students should demonstrate critical thinking,
    including the ability to analyze, synthesize, and
    draw valid conclusions.

11
When Writing Measures, You Should . . .
  • ensure each SLO has at least 2 measures
  • ensure at least 1 direct measure (objective or outside source; see p. 4 of Resources)
  • ensure any indirect measure (usually a self-reported measure) is accompanied by a direct one; see p. 4 of Resources
  • instead of course grades or pass rates (which SACSCOC discourages), substitute exam or project grades (plus a description relating the exam/project to the SLO)
  • consider rating grids, since they are easier to trend over time and 1 grid can be used for all SLOs; see pp. 8-9 of Resources
  • Most common error: programs describe how faculty provide feedback to help individual students.
  • Solution: describe the aggregate measures used to evaluate student learning.

12
Examples of Bad and Good Measures

Instead of:
  "Students are given tests"
write:
  "Grades from tests that measure the students' ability to [describe what the test is for] will be used to assess the SLO."

Instead of:
  "Table of grades for course"
use:
  "Table of grades for final paper (plus description of the assignment using the language of the SLO)"
13
Good Graduate Program Measures (Can Rewrite SLOs
to Correspond)
  • Graduate School Rating Grid at the final defense (programs are already supposed to be using it): fast and easy (PIRA will analyze; see pp. 8-9 of Resources)
  • Same rating grid, but used for the proposal defense (and/or for each year in the program); use the same standards for both to show students' progress
  • Qualifying/comprehensive exam (but explain what is tested so the link to the SLO is clear)
  • Rating grids from supervisors of TAs, RAs, GAs, internships
  • Ratings from the audience for presentations on student research
  • Number of publications, conference presentations, grants
  • Graduating Master's Student Survey (items similar to the ones on p. 10 of Resources; available from PIRA)

14
Good Undergraduate Measures (Can Rewrite SLOs to
Correspond)
  • Graduating Senior Survey: very easy (PIRA/Toppel collect, analyze, send); small programs should use combined years (green column) rather than trends (orange columns); see p. 10 of Resources
  • Rating grids for capstone papers, projects, etc.
    (see p. 8 of Resources for sample you can adapt)
  • Grades from items on tests or assignments that
    directly measure a given SLO
  • Rating grids from supervisors of internships,
    practica
  • Additional items relating to improvement in each
    SLO that are added to faculty evaluations or
    final exams
  • Existing items on New General Form for
    faculty/course evaluations that relate to
    critical thinking or communicating on the subject

15
When Writing Assessment Findings, You Should . .
.
  • ensure each measure has corresponding findings
    (and no findings without earlier measure)
  • insert corresponding outcome/measure as heading
    for each set of results
  • ensure multiple years of data, or insert an explanation that data are not provided for a new program or revised measures; for example:

"As part of the major three-year continuous improvement update of our program assessment report in 2013, we decided to start using rating grids in conjunction with XXX [e.g., senior projects] to allow us to more easily monitor changes in student learning over time. Because this is a new measure, we have data for only the 2013-14 academic year, but we will continue to update the data in upcoming years to monitor continuous improvement in student learning."

16
When Writing Assessment Findings, You Should . .
.
  • if a measure is a narrative rather than data, ensure a summary plus sample evaluations, or insert a statement (see p. 6 of Resources)
  • ensure results are presented clearly (tables)
  • decide whether an appendix of findings, the survey instrument, etc. will be necessary (usually not)
  • put findings related to Program Outcomes under a new sub-heading, "Findings Relating to Program Outcomes"
  • Most common errors: programs simply state that they evaluate student learning, or omit measure(s).
  • Solution: provide evidence of assessment activity (table or text summary of findings).

17
When Writing Discussion Section, You Should
Ensure . . .
  • a statement that faculty as a group reviewed the findings (e.g., dates/minutes of the meeting)
  • discussion of whether faculty think students
    demonstrated desired level of learning
  • initiatives you implemented to improve student learning; see p. 6 of Resources
  • whenever possible, an indication of which SLO is
    affected
  • whether improvements seem to be working

18
In Discussion Section . . .
  • Most common errors
  • No statement indicating faculty reviewed
  • No statement of how faculty think students are
    doing
  • No mention of which SLO affected by improvement
    initiatives
  • No mention of whether there has been improvement
    over time
  • Solutions include
  • Dates or minutes of faculty meetings
  • Evaluation of how well each SLO was achieved
  • Which SLO will benefit from improvement (if
    relevant)
  • Effectiveness of prior initiatives and how
    learning will be improved

19
Format/Organization/Wording: Help SACSCOC Reviewers Find What They Need
  • Add bold, indents, and/or underlines to assist
    reviewers
  • Nest measures under related SLOs
  • Label/nest Outcomes/Measures in Findings section
  • Include discussion of improvements/changes in
    Discussion section, not in SLO or Findings
    sections
  • Remove yellow template instructions
  • Use SACSCOC terminology ("Student Learning Outcomes", "Measures of SLOs", etc.)
  • Delete extraneous text and data (clarity more
    important than length)
  • Expand acronyms (e.g., RSMAS, PRISM)
  • Spell-check and fix typos

20
Tips for Writing an Efficient Report
  • Study resources and checklist before starting
  • Use existing assessments and available student
    work whenever possible (saves time and effort)
  • Consider developing a rating grid with 1-2 items per learning outcome; see p. 8 of Resources
  • Contact PIRA for a summary of results from the Graduate School Rating Grid; email PIRA scanned forms for students we don't have
  • Use the Graduating Senior Survey (GSS) or Graduating Master's Student Survey (GMSS) summary
  • Consider starting with measures and then writing SLOs to go with them, rather than in the traditional order

21
Alert: Recent Changes
  • Need to provide evidence of improvement based on
    initiatives, wherever possible (though sometimes
    hard to see, especially with small Ns and short
    time periods)
  • New emphasis from SACSCOC: need to add material (Findings, Improvements, and Discussion) related to Program Outcomes (NOT Processes). Examples:
  • Retention/graduation rates, average time to
    degree (from PIRA)
  • Ratings from GSS or GMSS (from PIRA)
  • Job placement (from Grad Program Review profile)
  • Graduate program review, professional
    accreditation
  • Other measures of program success (e.g., quality,
    effectiveness, interdisciplinary opportunities).

22
Questions for PIRA?
Contact: Dr. David E. Wiles, Executive Director, Assessment and Accreditation, and Institutional Accreditation Liaison, (305) 284-3276