
Improving Student Learning and Development
Through Meaningful and Systematic Reflection
What Do I Need to Know to Get Started?
  • Marilee J. Bresciani, Ph.D.
  • Assistant Vice President for Institutional
  • Texas A&M University

Ask Yourself These Questions
  • What decision did you make about your program
    last year?
  • What evidence did you use to inform that
    decision?
  • What was it that you were trying to influence
    about your program when making that decision with
    the stated evidence?

Recognize that THAT is Assessment
  • Most people do capitalize on their innate
    intellectual curiosity to find out what works
  • Most people just don't articulate their intended
    end results (e.g., outcomes) ahead of time
  • Most people don't document the decisions made
    based on their results
  • Most people don't follow up later to see if their
    decisions made the intended improvement

Find out What People are Already Doing
  • Talk to anyone and everyone
  • Find out who the champions are
  • Find out who has data
  • Find out who has assessment plans
  • Find out who is doing faculty/staff development
  • Find out who is curious about it
  • Find out who hates it
  • Encourage folks to talk to each other

Pull A Group of Interested Parties Together to
  • Define assessment
  • Develop a shared conceptual understanding of the
    purpose of assessment
  • Define a common language
  • Develop short-term and long-term goals for
    assessment
An Example from
  • North Carolina State University

The Assessment Cycle (Adapted from NCSU CUPR)
  • The key questions:
  • What are we trying to do and why? Or,
  • What is my program supposed to accomplish?
  • How well are we accomplishing that which we say
    we are?
  • How do we know?
  • How do we use the information to improve or
    celebrate successes?
  • Do the improvements we make work?

The Iterative, Systematic Assessment Cycle
(Adapted from Peggy Maki, Ph.D., by Marilee J. Bresciani, Ph.D.)
  • Mission/Purposes, Goals, Outcomes
  • Implement methods to deliver outcomes and methods
    to gather evidence
  • Gather evidence
  • Interpret evidence
  • Make decisions to improve programs; enhance
    student learning and development; inform
    institutional decision-making, planning,
    budgeting, policy, public accountability
Assessment (By M.J. Bresciani)
  • Most importantly, assessment should be:
  • Meaningful: faculty (i.e., expert) driven
  • Understood by students
  • Manageable: takes into account varying
    resources, including time; do not assess
    everything every year
  • Flexible: takes into account assessment learning
    curves; some people will be more sophisticated
    in their assessment than others
  • Truth-seeking/objective/ethical
  • Informs decisions for continuous improvement or
    provides evidence of proof
  • Promotes a culture of accountability, of
    learning, and of improvement

An Example from...
  • Texas A&M University

Purpose of Evidence-Based Decision Making (TAMU
Strategic Plan Draft)
  • To genuinely engage faculty, administrators, and
    students in the day-to-day reflection of
    answering these questions:
  • Why do we do what we do the way we do? And,
  • How do we use what we learn from the process to
    inform policy discussions, curriculum
    improvements, purposeful out-of-classroom
    experiences, and resource allocations?

Goals for Evidence-Based Decision Making for the
Individual Faculty Member (TAMU Strategic Plan Draft)
  • Increase the quantity, depth, and durability of
    the learning of students in my course
  • Gather and display data that will allow me to
    make a strong case to NSF, NIH, FIPSE, or other
    funding agencies that my project is worth funding
  • Increase my confidence that I am putting time and
    energy into processes that result in the outcomes
    I value

Goals for Evidence-Based Decision Making for the
Academic Unit (TAMU Strategic Plan Draft)
  • Strengthen our ability to say that our graduates
    are well-prepared to enter the work force,
    graduate school, or service organizations
  • Increase our confidence that we are putting our
    time and energy into activities that result in
    the outcomes we value
  • Have ready access to data that will satisfy the
    requirements of accrediting agencies and other
    accountability-driven conversations
  • Gather and display data that will allow us to
    make a strong case for increased university
    funding for our department

Goals for Evidence-Based Decision Making for the
University (TAMU Strategic Plan Draft)
  • Increase our confidence that we are putting our
    time and energy into activities that result in
    the outcomes we value as an institution
  • Have ready access to data that will satisfy the
    requirements of accrediting agencies and other
    accountability-driven conversations
  • Increase our confidence that we are allocating
    resources to areas (units, activities, processes)
    that are producing the outcomes we value
  • Ensure the ability to communicate the value and
    prestige of the Texas A&M experience

  • Remind people what assessment is for (purpose and
    goals). Make sure it does not become a process
    built to sustain itself.
  • Draw upon examples from your own institution of
    decision making based on assessment results to
    model those reasons; share these examples
    with each other
  • Commit to flexibility; this is a thinking
    person's process (Tom Benberg, SACS)
  • Nudge and Retreat (Maki, 2001)
  • Celebrate successes

First Things First
  • Acknowledge why you are engaging in outcomes
    assessment
  • Acknowledge your political environment
  • Define assessment and decide what to call it
  • Articulate a shared conceptual understanding
  • Define a common language
  • Articulate assessment expectation(s)
  • Define how results will be used

First Things First, Cont.
  • Decide what you are going to ask for in the
    assessment plans and reports.
  • Don't ask for what won't be used.

Typical Components of An Assessment Plan
  • Mission
  • Objectives
  • Outcomes
  • Evaluation Methods
    • With criteria and by outcomes
    • Add limitations if necessary
  • Implementation of Assessment
    • Who is responsible for what?
    • Timeline
  • Results
    • By outcomes
  • Decisions and Recommendations
    • For each outcome and for the assessment process

First Things First, Cont.
  • Identify what you have already done that is
    evaluation/assessment/planning (formal versus
    informal)
  • Identify easy to access resources (data,
    assessment tools, people, technology, etc.)
  • Articulate roles in this process clearly (will an
    assessment committee be evaluating content and
    another committee be evaluating process?)
  • Establish a plan and system to support
    faculty/staff development
  • Establish a communication plan with your
    varying audiences in mind

First Things First, Cont.
  • Identify short-range and long-range goals
  • Identify appropriate resources, including people,
    to support the educational process
  • Identify documentation resources
  • Identify whether you will have a central
    coordinated process with decentralized delivery,
    or whether you will have a decentralized process,
    or a centralized process
  • Establish a plan for meeting your short-range and
    long-range goals. Be sure to identify needed
    resources.
  • Think both in baby steps and in quantum leaps

First Things First, Cont.
  • Develop a multi-year assessment plan to evaluate
    your assessment process
  • Answer the question, "What happens if I don't
    engage in assessment?"
  • Move forward together, highlight champions and
    nurture resistors
  • Listen to and address barriers; they are real

Where is Your Institution in this Process?
  • What other First Things First steps have you
    taken?

Questions to Consider
  • Will you have a university/division assessment
    plan timeline for implementation of this process?
  • Will there be someone doing the regular data
    collection (e.g., enrollment figures, retention
    and graduation rates, budget figures)?
  • Will there be someone coordinating the assessment
    planning process?
  • Will assessment plans be centrally located?

Questions, Cont.
  • Will there be someone in charge of documentation?
  • How can you use assessment to inform your
    enrollment planning, performance indicators, and
    other types of evaluation?
  • Can key assessment coordinators get release time
    to get the process established?
  • How will you manage the sometimes competing
    information requests from external constituents
    and internal constituents?
  • How will you use the assessment results?

Questions, Cont.
  • What are the rewards and incentives for engaging
    in assessment? Who provides those rewards and
    incentives?
  • How will assessment results inform resource
    allocation or re-allocation?
  • Will assessment results be used for personnel
    evaluations?
  • How will all your planning and evaluation
    initiatives link?
  • Will you have institutional learning outcomes
    that all programs need to assess?
  • What if programs and courses cannot link to
    specific institutional goals?

What Other Questions do you Find are Important to
Consider?

Helpful Reminders
  • Clearly communicate assessment expectations
  • Go ahead and write every program outcome down
  • Don't try to assess every program outcome every
    year
  • You may want to start with course outcomes and
    build program outcomes from those.
  • You can start with institutional,
    college/division, or departmental outcomes and
    see how each program or course ties to those.
  • Then, move to implementing the entire assessment
    cycle one outcome at a time, making everything for
    that outcome systematic. In other words, we want to
    begin to form habits of assessment.

Helpful Reminders, Cont.
  • Faculty/Administrators must understand the
    purpose of assessment; it is not assessment
    for assessment's sake
  • Faculty/Administrators must value what is being
    assessed
  • Faculty/Administrators must have ownership of
    the process
  • Respect varying disciplines' academic freedom
  • Recruit influential faculty/administrators to
    lead the process

How Will you Know how Well You Are Doing?
  • Visit the Commission on Higher Education's
    website for the Levels of Implementation matrix
    found at http//
  • Use or adapt self-evaluation tools found at
  • Look at what your assessment plans and results
    are telling you
  • Ask your faculty, co-curricular specialists, and

One Minute Evaluation
  • What is the most valuable lesson that you learned
    from this workshop?
  • What is one question that you still have?
  • What do you think is the next step that your
    division/program needs to take in order to
    implement systematic program assessment?

References
  • Bresciani, M.J. (September, 2002). The
    relationship between outcomes, measurement, and
    decisions for continuous improvement. National
    Association for Student Personnel Administrators,
    Inc. NetResults E-Zine. http//
  • Bresciani, M.J., Zelna, C.L., and Anderson, J.A.
    (2004). Techniques for Assessing Student Learning
    and Development in Academic and Student Support
    Services. Washington, D.C.: NASPA.
  • Ewell, P. T. (2003). Specific Roles of Assessment
    within this Larger Vision. Presentation given at
    the Assessment Institute at IUPUI. Indiana
    University-Purdue University-Indianapolis.
  • Maki, P. (2001). Program review assessment.
    Presentation to the Committee on Undergraduate
    Academic Review at NC State University.

References, Cont.
  • NC State University, Undergraduate Academic
    Program Review. (2001). Common Language for
    Assessment. Taken from the World Wide Web
    September 13, 2003: http//
  • Palomba, C.A. and Banta, T.W. (1999). Assessment
    essentials: Planning, implementing, and improving
    assessment in higher education. San Francisco:
    Jossey-Bass.
  • Texas A&M University (2004). Strategic Plan for
    Coordinating and Supporting Evidence-Based
    Decision Making.