1
A Serious Approach to Accountability, Program
Improvement, and Student Learning Must Involve
Assessment
  • James A. Anderson
  • Vice Provost for Undergraduate Affairs
  • North Carolina State University
  • James_Anderson@NCSU.EDU
  • 919/515-3037
  • http://www.ncsu.edu/undergrad_affairs/assessment/assess.htm

2
Accountability Reporting
  • Measures of Institutional Effectiveness
  • Evaluative Feedback from Faculty/Staff/Students
    (Surveys and Focus Groups)
  • Assessment of Outcomes and Outcomes-Based Program
    Review

3
Integrative Assessment Reporting
  • Institutional Goals or Objectives
  • College/Division/Accreditation Goals and
    Objectives
  • Department/Program Goals/Objectives/Outcomes
  • Course-Based Objectives/Outcomes

4
Community College Assessment Challenges
  • Academic programs
  • Administrative services
  • Student development programs
  • Integrate/connect proposed and ongoing assessment
    efforts across units and levels
  • Developing a culture of evidence
  • Emphasize the link between the MACRO-LENS and
    MICRO-LENS of assessment
  • Helping faculty see the link between institutional
    effectiveness, aggregate data gathering, and
    outcomes-based assessment

5
Community College Assessment Challenges
  • The point of effective assessment is not simply
    to gather data and return results; it is a
    process that starts with the questions of
    decision-makers and practitioners, involves them
    in gathering and interpreting data, and informs
    and helps guide continuous improvement
    (especially about student learning and success).
  • Gathering data or results that present a picture
    or profile yields important information, BUT it
    generally doesn't illuminate the questions and
    issues that people really care about, nor does it
    improve the quality of undergraduate education.

6
Community College Assessment Challenges
  • Questions pertaining to the creation and
    improvement of an Effective Academic Learning
    Environment cannot be answered by
  • Admissions/enrollment data
  • Persistence/attrition data
  • Survey data
  • Demographic comparisons

7
Community College Assessment Challenges
  • Global teaching evaluations
  • Department comparisons
  • Teaching loads and credit hours
  • Entering first year student statistics
  • Transfer rates

8
Purpose of Assessment (from Bresciani, M.J.)
  • 1) Reinforce or emphasize the mission of your
    unit
  • 2) modify, shape, and improve programs and/or
    performance (formative)
  • 3) critique a program's quality or value compared
    to the program's previously defined principles
    (summative)
  • 4) inform planning

9
Purpose of Assessment, Cont. (from Bresciani, M.J.)
  • 5) inform decision making
  • 6) evaluate programs not personnel
  • 7) assist in the request for additional funds
    from the institution and external community
  • 8) assist in meeting accreditation requirements,
    models of best practices, and national benchmarks

10
(No Transcript)
11
Typical Components of an Assessment Plan (from
Bresciani, M.J.)
  • Mission
  • Objectives/Goals
  • Outcomes
  • Evaluation Methods
  • By Outcomes
  • Implementation of Assessment
  • Who is Responsible for What?
  • Timeline
  • Results
  • Decisions and Recommendations

12
Learner-Centered College at Maricopa C.C.
  • Traditional Learning Paradigm (TLP)
  • and/or
  • Innovative Learning Paradigm (ILP)

13
Learner-Centered College at Maricopa C.C. Cont.
  • Characteristics of the TLP (What to keep/omit)
  • Primary concern is course content and grades
  • Quantity more important than quality (hrs, load,
    seats, etc.)
  • Emphasis on grades not learning
  • Learning is competitive (Bell Curve)
  • Faculty-centered model in classroom

14
Learner-Centered College at Maricopa C.C. Cont.
  • Characteristics of the TLP Cont.
  • Students treated as monolithic group
  • College has little connection to external
    community
  • Classroom is hierarchical and authoritarian
  • Technology promotes content not active learning

15
Learner-Centered College at Maricopa C.C. Cont.
  • Characteristics of the ILP
  • Learning is process and product/outcome
  • Emphasis on enhancing the quality of
    teaching/learning
  • Curriculum is flexible, relevant, and responsive
    to students
  • Focus on active learning community
  • Authentic evaluation of student learning

16
Learner-Centered College at Maricopa C.C. Cont.
  • Characteristics of the ILP Cont.
  • Diversity considered a strength
  • Larger community connected to campus learning
  • Technology enhances teaching and learning
  • Different learning delivery systems encouraged
  • Incentives/rewards for effective faculty work

17
How Can We Support Faculty Work and Impact
Academic and Student Development?
  • Lead the discussion that illuminates critical
    questions/concerns/issues
  • Provide initial sources of data and documentation
    directly to departments and programs
  • Keep the focus on measurable objectives and
    outcomes
  • Neutralize political concerns
  • Do all necessary programming
  • Facilitate the development and implementation of
    an assessment web site
  • Champion the development of assessment plans
  • Partner in assessment with other providers of the
    service

18
Meaningful Use of Data (from Peggy Maki, Ph.D.)
  • Collect data from different sources to make a
    meaningful point (for example, program samples
    and other samples of student work).
  • Collect data you believe will be useful to
    answering the important questions you have
    raised.
  • Collect data that will help you make decisions
    for continuous improvement.
  • Organize reports around issues, not solely data.
  • Interpret your data so that it informs program
    improvement, budgeting, planning,
    decision-making, or policies.

19
Building an Assessment Website
  • Easy to navigate
  • Definitions
  • Principles of Student Learning
  • Resources and Tool Kits
  • Projects and Portfolios
  • Presentations and Papers
  • Contact Information

20
Why Move Away from Student Satisfaction
Assessment?
  • Student satisfaction, utilization, and needs
    assessment are very important.
  • However, they don't help you understand the
    contributions of your program.
  • They don't tell you how your program contributes
    to student development and learning.
  • They seldom help you make decisions for
    continuous improvement of your programs.

21
Compare Assessment Methods for Satisfaction
  • Self-report satisfaction survey
  • Maybe interviews
  • Maybe observations

22
Compare Assessment Methods for Dev. and Learn.
  • Self-report Survey
  • Interviews based on criteria
  • Observations based on criteria
  • Standardized career service assessment
    instruments
  • Student Portfolios
  • Peer evaluation
  • Self evaluation
  • Evidence of knowledge of discipline in portfolio

23
Some Methods That Provide Direct Evidence
  • Student work samples
  • Collections of student work (e.g. Portfolios)
  • Capstone projects
  • Course-embedded assessment
  • Observations of student behavior
  • Internal juried review of student projects
  • Evaluations of performance

24
Direct Evidence Cont. (from Peggy Maki, Ph.D.)
  • External juried review of student projects
  • Externally reviewed internship
  • Performance on a case study/problem
  • Performance on problem and analysis (Student
    explains how he or she solved a problem)
  • Performance on national licensure examinations
  • Locally developed tests
  • Standardized tests
  • Pre- and post-tests
  • Essay tests blind scored across units

25
Some Methods That Provide Indirect Evidence (from
Peggy Maki, Ph.D.)
  • Alumni, Employer, Student Surveys
  • Focus groups
  • Exit Interviews with Graduates
  • Graduate Follow-up Studies
  • Percentage of students who go on to graduate
    school
  • Retention and Transfer Studies
  • Job Placement Statistics

26
Indirect Evidence Cont.
  • Courses selected or elected by students
  • Faculty/Student ratios
  • Percentage of students who study abroad
  • Enrollment trends
  • Percentage of students who graduate within five
    to six years
  • Diversity of student body
  • CAS Standards

27
Teaching Portfolio
  • Various Definitions Share Common Characteristics
  • A coherent set of materials and work samples
  • A selective portrayal of one's work, not an
    accumulation
  • Indicates self-reflection and improvement
  • Provides authentic evidence of teaching
    effectiveness and student learning
  • Indicates the pedagogical reasoning or thinking
    behind one's teaching performance

28
Teaching Portfolio Cont.
  • Generally divided into six parts
  • Teaching responsibilities
  • Reflective statement of teaching philosophy/goals
  • Representative instructional materials
  • Evidence of student learning
  • Recent evaluations
  • Description of activities to improve teaching

29
Outcome Statement for Faculty
  • To examine the relationship between instructional
    styles and varied learning styles.
  • To understand the fundamental relationship
    between effective teaching, learning outcomes,
    and the diversity among the different student
    populations.
  • To explore, in a practical way, how faculty can
    produce more equitable outcomes in the classroom
    especially when students exhibit disparate needs.

30
Outcome Statement for Faculty Cont.
  • To identify the student learning outcomes and
    associated strategies that instructors should
    expect from the application of diversity in the
    classroom.
  • To examine the incorporation of course goals that
    include student self-evaluation of their own
    learning.
  • To identify instructional strategies that
    facilitate the self-evaluation of student
    learning for traditional and nontraditional
    outcomes.

31
Outcome Statement for Faculty Cont.
  • To evaluate the relationship between student
    support service activities and classroom outcomes
    especially as they pertain to 1) instructional
    approaches and 2) the classroom environment.
  • To incorporate student support activities and
    teaching/learning activities into a total
    learning community.

32
Reasons for Learning Style Assessment that
Incorporates Diversity
  • Self-assessment feedback
  • Cohort Comparisons
  • Cluster Analysis of Behaviors
  • Development of Effective Cooperative Clusters

33
Reasons for Learning Style Assessment that
Incorporates Diversity Cont.
  • Matching of Learning Styles/Teaching Styles
  • Correlation with other dimensions
  • Identification of Critical Dimensions

34
Assessing the Impact of Technology on On-Line
Course Delivery
  • Need
  • Justify expense/investment
  • Answer questions of quality and accountability
  • Measure success and efficiency
  • Provide concrete evidence of learning

35
Assessing the Impact of Technology on On-Line
Course Delivery Cont.
  • Need
  • Reliable and sophisticated tool
  • Tool that can be customized to an institution's
    needs
  • A way to compare assessment results across
    institutions
  • A longitudinal database

36
Academic Computing Assessment Data Depository
(ACADR)
  • ACADR is divided into three parts
  • View and entry tool where students choose an
    available survey and submit responses
  • View summaries of submitted data and download
    them in Excel or SPSS format
  • Use administration tool to create new and modify
    existing surveys

37
Academic Computing Assessment Data Depository
(ACADR) Cont.
  • Collaborations among 2- and 4-year institutions
    provide a searchable, digital library of
    assessment
  • http://itd.shu.edu/repository/contact.html
  • http://repository.itd.shu.edu/
  • Dr. Eric Fountain, Seton Hall University
  • Institute for Technology Development

38
Some Questions about Student Learning and
Development (Adapted from Peggy Maki, Ph.D.)
  • What do you expect your students to know and be
    able to do by the end of their education at your
    institution?
  • What do the curricular and co-curricular
    experiences add up to?
  • What do you do in your programs to promote the
    kinds of learning and development that your
    institution seeks?

39
Some More Questions (Adapted from Peggy Maki,
Ph.D.)
  • Which students benefit from which co-curricular
    experiences?
  • What co-curricular processes are responsible for
    the intended student outcomes the institution
    seeks?
  • How can you help students make connections
    between classroom learning and experiences
    outside of the classroom?
  • How do you intentionally build upon what each of
    you fosters?

40
Questions that Direct the Development of Synergy
between Academic Affairs and Student Affairs
James A. Anderson, Ph.D.
  • What is the thinking task, intellectual
    experience, and/or co-curricula experience that
    needs to be designed relative to the preparation
    level and diversity of the students at your
    institution?
  • Can the interpersonal transactions that occur in
    the everyday life of the student and that reflect
    cultural orientations serve as a basis for
    potential new models of critical thinking? What
    curricular experiences will promote this skill
    development?

41
Questions that Direct the Development of Synergy
between Academic Affairs and Student Affairs
Continued James A. Anderson, Ph.D.
  • What structures need to evolve to assure that
    students have the opportunity to enhance academic
    self-concept and understand their role in the
    culture of learning at your institution?