Transcript and Presenter's Notes

Title: Improving Student Learning and Development Through Meaningful and Systematic Reflection: What Do I Need to Know to Get Started?


1
Improving Student Learning and Development
Through Meaningful and Systematic Reflection
What Do I Need to Know to Get Started?
  • Marilee J. Bresciani, Ph.D.
  • Assistant Vice President for Institutional
    Assessment
  • Texas A&M University
  • mbresciani@tamu.edu

2
Ask Yourself These Questions
  • What decision did you make about your program
    last year?
  • What evidence did you use to inform that
    decision?
  • What was it that you were trying to influence
    about your program when making that decision with
    the stated evidence?

3
Recognize that THAT is Assessment
  • Most people do capitalize on their innate
    intellectual curiosity to find out what works
  • Most people just don't articulate their intended
    end results (e.g., outcomes) ahead of time
  • Most people don't document the decisions made
    based on their results
  • Most people don't follow up later to see if their
    decisions made the intended improvement

4
Find out What People are Already Doing
  • Talk to anyone and everyone
  • Find out who the champions are
  • Find out who has data
  • Find out who has assessment plans
  • Find out who is doing faculty/staff development
  • Find out who is curious about it
  • Find out who hates it
  • Encourage folks to talk to each other

5
Pull A Group of Interested Parties Together to
  • Define assessment
  • Develop a shared conceptual understanding of the
    purpose of assessment
  • Define a common language
  • Develop short-term and long-term goals for
    assessment

6
An Example from
  • North Carolina State University

7
The Assessment Cycle (Adapted from NCSU CUPR
Guidelines)
  • The key questions:
  • What are we trying to do and why? or
  • What is my program supposed to accomplish?
  • How well are we accomplishing that which we say
    we are?
  • How do we know?
  • How do we use the information to improve or
    celebrate successes?
  • Do the improvements we make work?

8
The Iterative, Systematic Assessment Cycle
(Adapted from Peggy Maki, Ph.D., by Marilee J.
Bresciani, Ph.D.)
[Cycle diagram] Mission/Purposes, Goals, Outcomes →
Implement Methods to Deliver Outcomes and Methods
to Gather Evidence → Gather Evidence → Interpret
Evidence → Make decisions to improve programs;
enhance student learning and development; inform
institutional decision-making, planning, budgeting,
policy, and public accountability → (cycle repeats)
9
Assessment (By M.J. Bresciani)
  • Most importantly, assessment should be:
  • Meaningful: faculty (i.e., expert) driven
  • Understood by students
  • Manageable: takes into account varying
    resources, including time; do not assess
    everything every year
  • Flexible: takes into account assessment learning
    curves; some people will be more sophisticated
    in their assessment than others
  • Truth-seeking/objective/ethical
  • Informs decisions for continuous improvement or
    provides evidence of proof
  • Promotes a culture of accountability, of
    learning, and of improvement

10
An Example from...
  • Texas A&M University

11
Purpose of Evidence-Based Decision Making (TAMU
Strategic Plan Draft)
  • To genuinely engage faculty, administrators, and
    students in the day-to-day reflection of
    answering these questions:
  • Why do we do what we do the way we do? and
  • How do we use what we learn from the process to
    inform policy discussions, curriculum
    improvements, purposeful out-of-classroom
    experiences, and resource allocations?

12
Goals for Evidence-Based Decision Making for the
Individual Faculty Member (TAMU Strategic Plan
Draft)
  • Increase the quantity, depth, and durability of
    the learning of students in my course
  • Gather and display data that will allow me to
    make a strong case to NSF, NIH, FIPSE, or other
    funding agencies that my project is worth funding
  • Increase my confidence that I am putting time and
    energy into processes that result in the outcomes
    I value

13
Goals for Evidence-Based Decision Making for the
Academic Unit (TAMU Strategic Plan Draft)
  • Strengthen our ability to say that our graduates
    are well-prepared to enter the work force,
    graduate school, or service organizations
  • Increase our confidence that we are putting our
    time and energy into activities that result in
    the outcomes we value
  • Have ready access to data that will satisfy the
    requirements of accrediting agencies and other
    accountability-driven conversations
  • Gather and display data that will allow us to
    make a strong case for increased university
    funding for our department

14
Goals for Evidence-Based Decision Making for the
University (TAMU Strategic Plan Draft)
  • Increase our confidence that we are putting our
    time and energy into activities that result in
    the outcomes we value as an institution
  • Have ready access to data that will satisfy the
    requirements of accrediting agencies and other
    accountability-driven conversations
  • Increase our confidence that we are allocating
    resources to areas (units, activities, processes)
    that are producing the outcomes we value
  • Ensure the ability to communicate the value and
    prestige of the Texas A&M experience

15
Consistently
  • Remind people what assessment is for (purpose and
    goals). Make sure it does not become a process
    built to sustain itself.
  • Draw upon examples from your own institution of
    decision making based on assessment results to
    role-model those reasons; share these examples
    with each other
  • Commit to flexibility: this is a "thinking
    person's process" (Tom Benberg, SACS
    Commissioner)
  • Nudge and Retreat (Maki, 2001)
  • Celebrate successes

16
First Things First
  • Acknowledge why you are engaging in outcomes
    assessment
  • Acknowledge your political environment
  • Define assessment and decide what to call it
  • Articulate a shared conceptual understanding
  • Define a common language
  • Articulate assessment expectation(s)
  • Define how results will be used

17
First Things First, Cont.
  • Decide what you are going to ask for in the
    assessment plans and reports.
  • Don't ask for what won't be used.

18
Typical Components of an Assessment Plan
  • Mission
  • Objectives
  • Outcomes
  • Evaluation Methods
    • With criteria and by outcome
    • Add limitations if necessary
  • Implementation of Assessment
    • Who is responsible for what?
    • Timeline
  • Results
    • By outcome
  • Decisions and Recommendations
    • For each outcome and for the assessment process
19
First Things First, Cont.
  • Identify what you have already done that is
    evaluation/assessment/planning (formal versus
    informal)
  • Identify easy-to-access resources (data,
    assessment tools, people, technology, etc.)
  • Articulate roles in this process clearly (will an
    assessment committee be evaluating content and
    another committee be evaluating process?)
  • Establish a plan and system to support
    faculty/staff development
  • Establish a communication plan with your
    varying audiences in mind

20
First Things First, Cont.
  • Identify short-range and long-range goals
    (timeline)
  • Identify appropriate resources, including people,
    to support the educational process
  • Identify documentation resources
  • Identify whether you will have a centrally
    coordinated process with decentralized delivery,
    a fully decentralized process, or a fully
    centralized process
  • Establish a plan for meeting your short-range and
    long-range goals. Be sure to identify needed
    resources.
  • Think both in baby steps and in quantum leaps

21
First Things First, Cont.
  • Develop a multi-year assessment plan to evaluate
    your assessment process
  • Answer the question, "What happens if I don't
    engage in assessment?"
  • Move forward together; highlight champions and
    nurture resisters
  • Listen to and address barriers; they are real

22
Where is Your Institution in this Process?
  • What other First Things First steps have you
    incorporated?

23
Questions to Consider
  • Will you have a university/division assessment
    plan timeline for implementation of this process?
  • Will there be someone doing the regular data
    collection (e.g., enrollment figures, retention
    and graduation rates, budget figures)?
  • Will there be someone coordinating the assessment
    planning process?
  • Will assessment plans be centrally located?

24
Questions, Cont.
  • Will there be someone in charge of documentation?
  • How can you use assessment to inform your
    enrollment planning, performance indicators, and
    other types of evaluation?
  • Can key assessment coordinators get release time
    to get the process established?
  • How will you manage the sometimes competing
    information requests from external constituents
    and internal constituents?
  • How will you use the assessment results?

25
Questions, Cont.
  • What are the rewards and incentives for engaging
    in assessment? Who provides those rewards and
    incentives?
  • How will assessment results inform resource
    allocation or re-allocation?
  • Will assessment results be used for personnel
    evaluations?
  • How will all your planning and evaluation
    initiatives link?
  • Will you have institutional learning outcomes
    that all programs need to assess?
  • What if programs and courses cannot link to
    specific institutional goals?

26
What Other Questions Do You Find Are Important to
Answer?

27
Helpful Reminders
  • Clearly communicate assessment expectations
  • Go ahead and write every program outcome down,
    but
  • Don't try to assess every program outcome every
    year.
  • You may want to start with course outcomes and
    build program outcomes from those.
  • You can start with institutional,
    college/division, or departmental outcomes and
    see how each program or course ties to those.
  • Then, move to implementing the entire assessment
    cycle one outcome at a time, making everything
    for that outcome systematic; in other words, we
    want to begin to form habits of assessment.

28
Helpful Reminders, Cont.
  • Faculty/Administrators must understand the
    purpose of assessment; it is not assessment
    for assessment's sake
  • Faculty/Administrators must value what is being
    measured
  • Faculty/Administrators must have ownership of
    the process
  • Respect varying disciplines' academic freedom
  • Recruit influential faculty/administrators to
    lead the process

29
How Will You Know How Well You Are Doing?
  • Visit the Higher Learning Commission's website
    for the Levels of Implementation matrix, found at
    http://www.ncahigherlearningcommission.org/resources/assessment/index.html
  • Use or adapt self-evaluation tools found at
    http://www.ncsu.edu/undergrad_affairs/assessment/files/evaluation/evaluation.htm
  • Look at what your assessment plans and results
    are telling you
  • Ask your faculty, co-curricular specialists, and
    students

30
Questions?
  • mbresciani@tamu.edu

31
One Minute Evaluation
  • What is the most valuable lesson that you learned
    from this workshop?
  • What is one question that you still have?
  • What do you think is the next step that your
    division/program needs to take in order to
    implement systematic program assessment?

32
References
  • Bresciani, M.J. (September 2002). The
    relationship between outcomes, measurement, and
    decisions for continuous improvement. National
    Association of Student Personnel Administrators,
    Inc. NetResults E-Zine.
    http://www.naspa.org/netresults/index.cfm
  • Bresciani, M.J., Zelna, C.L., and Anderson, J.A.
    (2004). Techniques for Assessing Student Learning
    and Development in Academic and Student Support
    Services. Washington, D.C.: NASPA.
  • Ewell, P.T. (2003). Specific Roles of Assessment
    within this Larger Vision. Presentation given at
    the Assessment Institute at IUPUI, Indiana
    University-Purdue University Indianapolis.
  • Maki, P. (2001). Program review assessment.
    Presentation to the Committee on Undergraduate
    Academic Review at NC State University.

33
References, Cont.
  • NC State University, Undergraduate Academic
    Program Review. (2001). Common Language for
    Assessment. Retrieved September 13, 2003, from
    http://www.ncsu.edu/provost/academic_programs/uapr/process/language.html
  • Palomba, C.A. and Banta, T.W. (1999). Assessment
    Essentials: Planning, Implementing, and Improving
    Assessment in Higher Education. San Francisco:
    Jossey-Bass.
  • Texas A&M University. (2004). Strategic Plan for
    Coordinating and Supporting Evidence-Based
    Decision Making.