1
Comprehensive Assessment Reports
  • Fred Trapp, Ph.D.
  • Administrative Dean, Institutional
    Research/Academic Services (Retired)
  • Long Beach City College
  • Cambridge West Partnership, LLC
  • Robert Pacheco, Ed.D.
  • Director Of Institutional Planning, Research and
    Resource Development
  • Barstow College

2
Outcomes for the Session
  • The participant will be able to
  • Describe the comprehensive assessment report
    concept.
  • Locate best practice examples from other colleges
    through web links.
  • Discuss how the comprehensive report idea can be
    used as part of the institution's learning
    process and as a means by which the institution
    provides quality assurance to the public.
  • Indicate national trends and efforts of
    consortia/national organizations to provide
    quality assurance about student experiences and
    learning outcomes.
  • Please hold questions until the end.

3
Curriculum Map
4
ACCJC Institutional Effectiveness Rubric
  • Part III Student Learning Outcomes
  • Proficiency
  • Comprehensive assessment reports exist and are
    completed on a regular basis.

5
What Did We Look At, With Whom Did We Consult?
  • ACCJC. Institutional Effectiveness Rubric
  • ACCJC. 2002 Standards
  • ACCJC. Themes
  • ACCJC. Guide to Evaluating Institutions
  • Professional literature
  • Efforts of national groups/institutes
  • Institutional web sites, listservs and colleagues

6
Guiding Questions
  • How can the report writing experience
  • Help faculty explore the student learning
    process?
  • Determine the extent to which the curriculum is
    working?
  • Determine where time, energy and/or money can be
    allocated for continuous improvement in learning?
  • Exploit the writing process and dialogue about
    results to gain broader institutional learning
    experiences?
  • Help meet our quality assurance pledge to the
    community?

7
Illustration Selections
  • Council for Higher Education Accreditation (CHEA)
  • Annual Award for Outstanding Institutional
    Practice in Student Learning Outcomes
  • Demonstrated commitment to developing highly
    effective practices in using SLO assessment
  • Willingness to share the practices they developed
  • Selection committee
  • Selection criteria
  • Articulation & evidence of outcomes
  • Success with regard to outcomes
  • Information to the public about outcomes
  • Using outcomes for improvement

8
Illustration Selections (continued)
  • Cited by scholars and peers
  • Schools with assessment work cited in scholarly
    books and articles
  • Schools with assessment work selected for
    presentation at national conferences
  • Web presentations publicly available for you to
    eavesdrop upon
  • Prominent national movements/initiatives
    regarding learning outcomes assessment
    documentation (including public quality assurance)

9
What Might Be Included?
  • Assessment focus: course, program, general ed,
    etc.
  • What outcomes were assessed?
  • How and when were they assessed?
  • Who was assessed?
  • What were the results?
  • Who reviewed the results, made sense of them,
    and what conclusions were reached?
  • What are the implications for practice and/or
    policy or future assessment work?
  • How were the results used?

10
CC of Baltimore County (MD) Course-level Reporting
  • Middle States Commission on Higher Education
  • Community College Futures Assembly, Bellwether
    Award, 2008
  • Instructional Programs & Services for High Impact
    Course Level Assessment
  • CHEA award winner, 2006
  • Institutional Progress in Student Learning
    Outcomes
  • National Council on Student Development (NCSD)
    Exemplary Practice Award Winner

11
CC of Baltimore County (MD) Course-level Reporting
  • Projects are at least three semesters long
  • Individual and high-impact courses (all sections)
    included
  • Project proposal by a faculty group
  • Measurable objectives
  • External review & approval in selecting
    methods/instruments & analyzing results.
    Benchmarking should be included if possible.
  • Controls and sample size considered.
  • Course improvements based on data analysis
  • Reassessment expected
  • Results/report shared across the college and
    posted on the web

12
CC of Baltimore County (MD) Course-level
Reporting
  • Learning Outcomes Assessment Final Report
    Template
  • Design proposal for the LOA project
  • Implementation of design & data collection
  • Redesign of the course to improve student
    learning
  • Implementation of course revisions & reassessment
    of student learning
  • Final analysis and results
  • Eavesdropping
  • http://www.ccbcmd.edu/loa/CrseAssess.html
  • Two-page executive summaries available

13
CC of Baltimore County (MD) Course-level
Reporting
  • CHEM 108
  • An initial failure turned to success and
    collaboration with a four-year school
  • HLTH 101
  • Addressing an achievement gap with professional
    development and increased communication with
    students
  • CRJU 101 and 202
  • Statewide group assessment development effort and
    creativity in the interventions used

14
(No Transcript)
15
Riverside CC (CA) Course-level Reporting
  • GEG 1
  • Assessment part of the program review 2008
  • GEG 1 appendix
  • GEG 1L appendix
  • Eavesdropping
  • http://www.rcc.edu/administration/academicAffairs/effectiveness/assess/resources.cfm

16
Program-level Reporting
  • North Central Association of Colleges and
    Schools, Higher Learning Commission
  • CHEA award winner, 2008
  • Institutional Progress in Student Learning
    Outcomes

17
Hocking College (OH) Program-level Report
  • All programs have individual assessment plans
  • Mission statement is the central objective for
    assessment
  • Institutional success skills (GE)
  • Program exit competencies
  • Criteria for and means of assessment
  • Reporting of results

18
Hocking College (OH)
  • Learning outcomes data collected in a student
    E-portfolio
  • Direct internal and external evidence (1 to 10
    measures)
  • Indirect evidence (1 to 4 measures)
  • Evidence drawn from samples of student work for
    faculty to apply an agreed-upon holistic rubric
  • Eight general education outcomes (student success
    skills)
  • Discipline-specific exit competencies or outcomes

19
Hocking College (OH) Program-level Report
  • Eavesdropping
  • Cloud reference, not the college URL, as the
    links there are broken
  • Various reports available in each program profile
  • Curriculum matrix
  • Criteria statements (exit competencies)
  • Instructional Program Outcomes (assessment plan)
  • Trend Charts for performance criteria

20
Hocking College (OH) Program-level Report
  • Example reports and analysis
  • Culinary Arts Technology
  • Forestry Management Technology
  • Nursing Technology

21
Mesa College (AZ) General-Education Reports
  • North Central Association of Colleges and
    Schools, Higher Learning Commission
  • CHEA Award winner, 2007
  • Institutional Progress in Student Learning
    Outcomes

22
Mesa College General Education Report
  • Multiple outcomes assessed
  • Annually
  • Annual Report elements
  • Executive Summary
  • Methodology
  • Results & observations (GE & workplace)
  • Indirect measures & findings
  • Appendices of past results

23
Mesa Community College (AZ)
  • General education studies completed 2007-08 &
    2005-06
  • Numeracy
  • Scientific inquiry
  • Problem solving/critical thinking
  • Information literacy
  • Workplace skills (CTE)
  • General education studies completed 2006-07 &
    2004-05
  • Arts & humanities
  • Cultural diversity
  • Oral communication
  • Written communication

24
Mesa College (AZ) General Education Reports
  • Eavesdropping
  • http://www.mesacc.edu/about/orp/assessment/index.html
  • Annual reports and summaries available
  • Eight years of history and experience

25
(No Transcript)
26
Capital CC (CT) General-Education Reports
  • New England Association of Schools and Colleges,
    Commission on Institutions of Higher Education
  • Cited in The Art and Science of Assessing General
    Education Outcomes: A Practical Guide (AAC&U,
    2005)

27
Capital CC (CT) General Education Reports
  • Multiple reports
  • One per outcome
  • Each study commonly takes a year
  • Report elements
  • Introduction
  • Methods
  • Results and findings
  • Conclusions and recommendations
  • Implications for future assessments
  • Appendices of assignment, rubric, notes to
    teachers, etc.

28
Capital CC (CT) General Education Reports
  • General education studies completed
  • Writing, 2001-02
  • Math, 2002-03
  • Critical thinking, 2003-04
  • Global perspective, 2004-05
  • Eavesdropping
  • http://www.ccc.commnet.edu/slat/
  • Annual reports and summaries available

29
(No Transcript)
30
Portland CC (OR) General Education Reports
  • Northwest Accrediting Commission
  • Eavesdropping
  • http://www.pcc.edu/resources/academic/learning-assessment/
  • One general education theme a year
  • Learning Assessment Focus for 2009-10: Critical
    Thinking & Problem Solving
  • Physical Science, Geology and General Science
  • Bioscience Technology
  • Management and Supervisory Development
  • Culinary Assistant Program

31
Truman State University (MO) Various Reports
  • Southern Association of Colleges and Schools,
    Commission on Colleges
  • Eavesdropping
  • Assessment work began in 1970
  • http://assessment.truman.edu/
  • Assessment Almanac: a compilation of results from
    each year's assessment work (versions from 1997
    to 2009 are posted)
  • General Education outcomes are assessed in the
    context of the major field of study
  • Portfolio Project: required of all seniors to
    show best work, assessed by faculty for the
    nature & quality of the liberal arts and sciences
    learning outcomes (versions from 1997 to 2008 are
    posted)

32
Authorship
  • Course-level, program-level & general education
  • Teaching faculty study team with technical
    assistance from
  • institutional research or assessment committee
  • No lone ranger authors
  • Institutional summary
  • Academic administrator with assistance from
  • Learning outcomes coordinator or assessment
    committee
  • Compilation of work accomplished in one or two
    academic years across the institution
  • Automated reporting (TracDat)
  • Sierra College examples

33
Location
  • Location of course, program and general education
    comprehensive assessment reports
  • Teaching faculty study team members, assessment
    committee chair, assessment website
  • Location of institutional summary reports
  • Academic administrator, assessment committee
    chair, assessment website
  • Not in the library basement, actively used to
    promote a learning organization

34
Distribution
  • To all affected participants
  • Campus committees
  • Curriculum, assessment, and resource allocation
    groups; unit (department) leadership; general
    academic and college leadership
  • Campus fairs, brown-bag lunches, poster sessions
    for information sharing
  • Faculty professional development programs
  • Accreditation self-study committee work groups
  • College web site for the public

35
Reports & a Learning Organization
  • Learning organization
  • Environment that promotes a culture of learning
  • Individual & group learning enriches & enhances
    the organization as a whole
  • Systematic problem solving using data for
    decisions
  • Learning from experiences in assessing
    organizational performance
  • Comparing yourself to others (benchmarking) and
    borrowing ideas
  • Adriana Kezar, ed. Organizational Learning in
    Higher Education. New Directions for Higher
    Education, No. 131, Fall 2005. Jossey-Bass.

36
Characteristics of Organizational Learning
  • Researchers have found some critical features of
    learning organizations (Lieberman, 2005, pp.
    87-98). In particular, a college as an effective
    organizational learner
  • Maintains a scholarly approach to the questions
    and problems that the institution faces
  • Approaches the campus problems as learners and
    not as experts
  • Develops a culture of evidence that drives
    decision-making
  • Links the organizational learning to the
    college's mission
  • Makes connections throughout the college and not
    just as individual units (e.g., faculty,
    administration); and
  • Recognizes and rewards the college's efforts to
    become a learner.

37
Reports as Institutional Learning & Resource
Allocation
  • The assessment data sense-making process is a
    faculty learning experience
  • Linking results to future interventions is a
    learning experience for
  • Faculty, assessment committee, academic
    administration, planning & resource allocation
    groups
  • Using results to inform an intervention, then
    reassessing (accomplished one or more terms
    later), is a learning experience for
  • Faculty, assessment committee, academic
    administration
  • Reference for future assessment work and other
    groups on campus

38
Reports as Institutional Learning & Resource
Allocation
  • Hocking College (OH)
  • Student E-portfolio
  • Annual summary
  • Improvements in the program in the previous year
    brought on by study of assessment results
  • Expenditures of time, money & materials for the
    assessment program
  • Requests for assistance in implementing
    assessment
  • Recommendations for altering the institution's
    assessment process
  • Transition from evaluating individual students to
    assessing groups of students & the curriculum
    experience

39
Reports as Institutional Learning & Resource
Allocation
  • Community College of Baltimore County (MD)
  • Learning Outcomes Assessment Advisory Board
  • Links findings in assessment reports to other
    college-wide initiatives and professional
    development opportunities
  • Use of assessment processes and results
    (findings)
  • Challenged faculty to reexamine prompts used in
    assessment
  • Clarity of the written prompt & the extent it
    supports program goals
  • Common assignment options and a common rubric
    increase faculty understanding and buy-in
  • Builds faculty unity toward common goals
  • Public web page enhances communication and
    accessibility to information

40
Reports as Institutional Learning & Resource
Allocation
  • CHEA award winner, 2010
  • Institutional Progress in Student Learning
    Outcomes

41
Reports as Institutional Learning & Resource
Allocation
  • Northern Arizona University: Seals of Assessment
    Achievement & Excellence
  • Purpose
  • To recognize programs that have demonstrated
    significant progress with assessing student
    learning
  • To promote best practices in assessment by
    sharing practical experiences
  • To encourage programs to showcase program-level
    achievements and to adjust curricula when
    appropriate.

42
Reports as Institutional Learning & Resource
Allocation
  • Feedback & recognition
  • Feedback rubric for annual assessment reports
  • Conversations and action
  • Collection and analysis of evidence
  • Implementation of findings
  • Recognition (achievement & excellence)

43
Reports as Institutional Learning & Resource
Allocation
  • Seal of Assessment Achievement
  • Academic programs earning this recognition have
    demonstrated in their annual report that
  • learning outcomes have been assessed through two
    or more methods, and
  • findings have been discussed among the faculty.

44
Reports as Institutional Learning & Resource
Allocation
  • Seal of Assessment Excellence
  • Academic programs earning this recognition have
    demonstrated
  • a thorough implementation of assessment plan(s)
  • the reporting of meaningful assessment data
  • the discussion of findings among faculty
  • and perhaps students
  • the use of findings to showcase
  • student achievements and
  • to make curricular adjustments.

45
Reports as Institutional Learning & Resource
Allocation
  • Mesa College (AZ)
  • Results Outreach Committee
  • Promotes use of outcomes data in relation to
    faculty development, pedagogy and academic
    climate
  • Groups of faculty offer a proposal for summer or
    academic year work above the course level
  • Resulting report placed on the web and used for
    campus discussion and action

46
Report as Quality Assurance
  • National Institute for Learning Outcomes
    Assessment (NILOA)
  • Assists institutions & others in discovering &
    adopting promising practices in the assessment of
    college student learning outcomes.
  • Documenting what students learn, know and can do
    is of growing interest to colleges and
    universities, accrediting groups, higher
    education associations, foundations and others
    beyond campus, including students, their
    families, employers, and policy makers.

47
Report as Quality Assurance
  • NILOA
  • 2010 Webscan report: Exploring the Landscape:
    What Institutional Websites Reveal About Student
    Learning Outcomes Assessment Activities
  • 2010 Connecting State Policies on Assessment with
    Institutional Assessment Activity
  • Eavesdropping
  • www.learningoutcomeassessment.org

48
Report as Quality Assurance
  • Promising Vehicles for Expanding Information to
    the Public
  • Brief narrative reports drawn from annual
    assessment reports
  • Simple statistical reports on learning outcomes
    or surveys
  • Best practices stories supported by assessment
  • Peter Ewell. Accreditation & the Provision of
    Additional Information to the Public about
    Institutional and Program Performance. CHEA, May
    2004.

49
Quality Assurance to the Public
  • Voluntary System of Accountability
  • APLU & AASCU (520 public institutions, awarding
    70% of bachelor's degrees in the US each year)
  • Started 2007
  • College Profile (includes learning outcomes &
    links to campus)
  • Proactive initiative to document learning gains
    and average institutional scores (choice of 3
    national instruments)
  • Proactive initiative to illustrate unique campus
    learning outcomes assessment work
  • Promoting a learning institution
  • Eavesdropping
  • http://www.collegeportraits.org/

50
Quality Assurance to the Public: VSA Example, Cal
Poly Pomona
  • http://www.collegeportraits.org/map
  • Cal Poly Pomona
  • http://www.csupomona.edu/academic/programs/ge_assessment/

51
Quality Assurance to the Public
  • National Association of Independent Colleges and
    Universities
  • Assessment programs on campus tied to the
    institution's mission
  • Eavesdropping
  • http://www.naicu.edu/special_initiatives/accountability/Student_Assessment/id.514/default.asp
  • Pepperdine University
  • http://services.pepperdine.edu/oie/learning-outcomes/learning-outcomes-overview.aspx

52
Where Can I Go? Resources
  • Filesanywhere.com
  • http://www.filesanywhere.com/fs/v.aspx?v=8a69668b5c6773a96f6d

53
Contacts & Questions
  • Robert Pacheco (Barstow College)
  • Rpacheco@Barstow.edu
  • Fred Trapp (Cambridge West Partnership)
  • fredtrapp@gmail.com
  • Questions and Comments

54
Session Evaluation: Outcomes for the Session
  • The participant will be able to
  • Describe the comprehensive assessment report
    concept.
  • Locate best practice examples from other colleges
    through web links.
  • Discuss how the comprehensive report idea can be
    used as part of the institution's learning
    process and as a means by which the institution
    provides quality assurance to the public.
  • Indicate national trends and efforts of
    consortia/national organizations to provide
    quality assurance about student experiences and
    learning outcomes.