1
Using Program/Unit Review to Facilitate Change
  • SACS Annual Meeting
  • December 6, 2004
  • Terri M. Manning, EdD
  • Kathy Drumm, DBA, CPA
  • Central Piedmont Community College
  • Charlotte, North Carolina

2
Some history.
  • CPCC is the largest community college in the
    Carolinas, with approximately 60,000 enrolled
    students.
  • We have over 100 instructional program areas,
    including for-credit, continuing education and
    literacy.
  • We have almost 50 administrative and student
    affairs units.

3
Some history.
  • When we started this process (1998), the College
    had done nothing truly IE-related since the last
    SACS visit in 1992, when it received
    recommendations on most of the "must" statements
    in Section III.
  • The IR office staff had turned over completely,
    and all institutional memory was gone.
  • We had fewer than five years to put every program
    and unit through a review process.

4
Why this Process?
  • The evaluation of teaching and educational
    effectiveness is a top priority for most
    institutions
  • However, the evaluation of business,
    administrative and student services units is
    often undervalued and overlooked
  • Evaluating the effectiveness of services (not
    just customer satisfaction) makes good business
    sense

5
Institutional Effectiveness and the Southern
Association of Colleges and Schools
  • At the heart of the SACS philosophy is the
    concept of institutional effectiveness (IE)
  • IE involves a process of planning, evaluation,
    assessment and use of results
  • Under the new core requirements and comprehensive
    standards, there are two mandates that
    specifically address program or unit reviews

6
Under the New Core Requirements and Comprehensive
Standards
  • Core Requirement 2.5 states: "The institution
    engages in ongoing, integrated, and
    institution-wide research-based planning and
    evaluation processes that incorporate a
    systematic review of programs and services that
    (a) results in continuing improvement, and (b)
    demonstrates that the institution is effectively
    accomplishing its mission."
  • Comprehensive Standard 3.3.1 states: "The
    institution identifies expected outcomes for its
    educational programs and its administrative and
    educational support services; assesses whether it
    achieves these outcomes; and provides evidence of
    improvement based on analysis of those results."

7
IE at Central Piedmont
  • CPCC established an IE committee in 1998 and an
    IE website in 2000 (http://inside.cpcc.edu/IE);
    this website gets an average of 61 hits a day,
    most from outside the college.
  • Committee members were from various instructional
    departments, student services and
    administrative/business offices.
  • The committee established an IE plan for the
    college.

8
Visual of the IE Plan
9
Four Major Overlapping Pieces of the IE Plan
  • Annual goal setting and follow-up required by all
    units (Strategic Plan)
  • Program/unit review on a 3-5 year cycle
  • College assessment plan involving surveys,
    assessment and data analysis
  • The evaluation of general education

10
Developing the Process
  • We wanted to create a meaningful program/unit
    review process.
  • We wanted programs to complete the process having
    learned something valuable (not a document to sit
    on a shelf).
  • We wanted the process to be outcome-based so it
    would stand the test of time.

11
Developing the Process
  • The first review process was created for
    instructional programs.
  • During the 1998-99 year, 18 programs were
    reviewed (approximately 100 programs over five
    years).
  • The perception at the beginning was that this was
    just another academic exercise.
  • But the results were very different from any
    review process previously completed.

12
Organizational Stages of Outcome Evaluation
  • Stage 1 - Disbelief, denial, paralysis (passive
    resistance)
  • Stage 2 - Anger and antagonism (resistant,
    reactive)
  • Stage 3 - Bargaining ("no time/no money"),
    seeking outside sources
  • Stage 4 - Depression and exhaustion; compliance
    (passive, reactive)
  • Stage 5 - Acceptance and adaptation; challenge
    and competition; catalyst (proactive)
13
Instructional Program Review
  • The review process contained five sections
  • I. The Program Profile
  • II. Program Content
  • III. Student Learning Outcomes
  • IV. Need for Change
  • V. Future Issues

14
Instructional Program Review
  • Tasks accomplished through review:
  • Defined their program
  • Detailed faculty and staff (credentials, prof.
    dev., accomplishments)
  • Stated the unit mission or purpose
  • Linked the unit mission to the college mission
  • Defined the content of their program
  • Defined the population served
  • Set outcome objectives
  • Performed some means of assessment
  • Analyzed results
  • Determined strengths and weaknesses
  • Created strategies for improvement
  • Determined future needs (including curricular
    change, needed resources, staff and space)

15
Instructional Program Review
  • Rules for the process included:
  • Involvement on the part of all faculty in the
    department under review (not just one person).
    It is recommended that the program begin with a
    brief faculty retreat to discuss and divide
    tasks.
  • All programs must use at least one external
    committee (advisory groups are fine) to provide
    feedback.
  • All programs must utilize feedback from students.

16
What We Learned.
  • The first group was dragged kicking and screaming
    through the process
  • Faculty had very little time for the details
  • Faculty had trouble identifying student learning
    outcomes
  • The results produced great marketing materials
  • Once they saw the results, faculty embraced the
    process

17
Identifying Outcomes
  • We focused on two types of measures:
  • Outcomes
    • Program Outcomes
    • Student Learning Outcomes
  • Administrative Objectives

18
Administrative Objectives
  • Many units do not directly serve students, or
    they want results within their units that are not
    truly outcomes.
  • They want to improve services or approach an old
    problem in a new way.
  • They want to become more efficient and effective.
  • They will set administrative objectives.

19
My Administrative Objectives
  • 1. 80% of faculty/staff responding to the
    faculty/staff survey will perceive that Planning
    and Research responds quickly to their requests
    for data.
  • 2. 80% of faculty/staff responding to the survey
    will perceive that Planning and Research makes a
    significant contribution to the College.
  • 3. 80% of faculty/staff responding to the survey
    will perceive that Planning and Research
    contributes to the effectiveness of CPCC.
  • 4. 80% of faculty/staff responding to the survey
    will indicate that Planning and Research produces
    enough reports to meet the planning and
    information needs of faculty and staff.

20
Outcomes
  • "Outcomes" are benefits for people changes in
    knowledge, values, position, skills, behavior or
    status. More simply stated, outcomes are
    typically what service providers hope recipients
    achieve once they complete a program or receive
    services. This is not the what but the why
    of education.
  • Student learning outcomes are outcomes related to
    the learning that takes place in the classroom.
    We measure improvements in writing, speaking,
    understanding the scientific method, etc.
  • Program outcomes are the benefits to a student
    who receives an associate degree in Nursing or
    completes a certificate in Network
    Administration? Typical outcomes might be
    licensure exam scores, job placement, etc.
  • Outcome objectives are just objectives that
    relate to the identified outcomes.

21
Program Outcome Model
  • INPUTS - Resources: staff, buildings, facilities,
    state funds, FTE; Constraints: laws, state
    regulations
  • ACTIVITIES - Services: education (classes),
    counseling, student activities
  • OUTPUTS - Products or results of activities:
    numbers served, classes taught, students
    recruited, who participated, FTE (input next
    year)
22
Program Outcomes Model
  • INPUTS > ACTIVITIES > OUTPUTS > OUTCOMES
  • OUTCOMES - Benefits for people: new knowledge,
    increased skills, changes in values, modified
    behavior, improved condition, altered status,
    new opportunities
23
We Had to Do Training on How to Set Objectives
  • There's no magic number
  • e.g., 80% or 90%
  • What is reasonable?
  • What can you afford?
  • What realistically can your staff accomplish?
  • May need to benchmark (e.g., enrollment growth)
  • What percent shows you're not committed, and what
    percent shows you're naïve?

24
How to Set Objectives
  • Examples:
  • Fifty percent of students will be able to
    communicate effectively in writing (complete the
    writing exam with a grade of 60 (D) or better)
  • By the end of the spring term, 95% of faculty and
    staff will have completed 20 contact hours of
    professional development (workshops, college
    courses, conferences, onsite trainings, etc.)

25
More Realistic
  • Seventy percent of students will be able to
    communicate effectively in writing (complete the
    writing exam with a grade of 75 (C) or better)
  • By the end of the spring term, the professional
    development office will increase its offerings
    for faculty and staff by 10% over what was
    offered last year (workshops, college courses,
    conferences, onsite trainings, etc.)

26
What Happened
  • Deans used the results to make a case for
    resources
  • The administration became interested in what was
    learned through the review process
  • Instruction created a position to deal mainly
    with program review and IE issues within
    instruction
  • Faculty knew their programs were working

27
Unit Review Lessons Learned
  • To do this well, you need several critical
    pieces:
  • 1. Support from the top (President or Chancellor,
    Vice Presidents or Vice Chancellors)
  • 2. Buy-in from the grassroots level
    (participation in the development of the process)
  • 3. Across-the-college participation (no one is
    exempt)
  • 4. Technology to make it easy (web page and
    review templates)
  • 5. Technical support from institutional research

28
IE Committee Decision
  • The instructional program review process was
    successful.
  • The review of instructional units was important,
    but administrative/ESS units helped create an
    environment conducive to learning at the college
    and supported the learning process.
  • Administrative units should go through a similar
    process.
  • We created a committee to draft a similar process
    for the administrative and student services areas
    of the college.
  • Administrative units were spread across three VP
    areas.
  • Two representatives from each VP area participated
    in drafting the new review process.
  • We spent approximately six months creating a
    workable process.

29
Timeline for Administrative Units
  • During the:
  • 1999-2000 year, six units went through the review
  • 2000-2001 year, eight units went through the
    review
  • 2001-2002 year, nine units went through the
    review
  • Vice Presidents set an annual timeline for their
    units, with due dates for each section

30
The Administrative Unit Review Design
  • I. The Unit/Program Profile
  • A. The Mission/Purpose
  • 1. Role unit plays in the college mission
  • 2. Unit/program goals as they relate to the
    college's mission
  • B. The Staff
  • 1. Professional and administrative staff
  • (since the last review)
  • a. Position description/duties
  • b. Credentials (full and part-time, if any)
  • c. Accomplishments (if applicable)
  • d. Service to college, community and nation

31
The Administrative Unit Review (continued)
  • B. The Staff (continued)
  • e. Professional development activities
  • 2. Classified Staff
  • a. List of names and positions
  • b. List of required credentials (if any)
  • C. The Customer/Client Served
  • 1. Breakdown of students/faculty or staff by
    type or demographic information (thorough
    explanation of who is served)

32
The Administrative Unit Review (continued)
  • II. Definition of Services or Program
  • A. Definition of day-to-day duties of the
    unit
  • B. Innovations, new projects, new
    initiatives, local, state-wide or national
    efforts
  • C. Required functions of unit (description
    and status of compliance)
  • 1. SACS "must" statements
  • 2. State mandates
  • 3. Federal mandates
  • 4. Other

33
The Administrative Unit Review (continued)
  • III. Administrative Objectives and Student
    Outcomes (where appropriate)
  • A. Administrative Objectives (2-3
    objectives)
  • B. Outcomes (or status if incomplete) of
    innovations, new projects, new initiatives,
    local, state or national efforts
  • C. Assessment explanation (what was
    assessed, who, when, how many)
  • D. Results of Administrative Objectives
    Based on Assessment

34
Assessment of Administrative Units
  • During the year of program review, the Annual
    Faculty/Staff Survey contained questions from
    those units being reviewed.
  • Results were given to each unit, broken out by
    campus and job type.
  • http://inside.cpcc.edu/planning
  • There's a link to survey results.

35
The Administrative Unit Review (continued)
  • IV. Need for Change
  • A. Strengths identified by external
    sources, faculty, staff and students
  • B. Weaknesses identified by external sources,
    faculty, staff and students
  • C. Recommendations by faculty, staff, external
    sources and students to improve the unit's
    services and programs
  • D. Strategies for change (based on input from
    Sections A, B and C) - closing the loop
  • E. A one-year follow-up brief report to the
    Unit VP reporting on the progress of D above
    (due at the end of the year following review)

36
The Administrative Unit Review (continued)
  • V. Future Issues - Resources needed for future
    efforts
  • A. Market trends within the broad service
    unit or program area (based on
    best practices, the literature or training
    received)
  • B. Anticipated future changes and needs
    (based on market trends)
  • C. Resources, equipment, space, staffing and
    workload changes needed for future growth or
    continuation
  • D. Future plans of unit

37
Assistance from the Website
  • The IE website:
  • http://inside.cpcc.edu/IE
  • Explanation of IE process
  • Templates and forms for review
  • Perfect examples for clarification
  • Schedule for review

38-41
(No Transcript)
42
Overall Benefits
  • #1: No recommendations in the area of
    Institutional Effectiveness from SACS during the
    October 2002 re-accreditation visit
  • The college became change-oriented
  • Units had to define strategies for change
  • Units didn't have to be perfect, but rather had
    to make continuous progress
  • Strategies for change helped identify needed
    resources for units
  • Units had to close the loop with the one-year
    follow-up (couldn't promise and not deliver)

43
Overall Benefits
  • Units became empowered to perform their functions
    in an optimal manner and to ask for what they
    needed (no one noticed them before; now they do)
  • Created data to support needs
  • Accomplishments were reported to major
    administrative groups across the college
    (President's Cabinet, Planning Council, etc.)

44
Overall Benefits
  • The college community understood the purpose and
    function of every unit
  • Senior administration realized the benefits and
    became strong supporters
  • Data are reported annually from program/unit
    review
  • VPs became stronger advocates for their units
    making changes to improve services

45
Overall Lessons Learned Through Program/Unit
Review
  • The response rate to on-line surveys was double
    that of pencil-paper surveys
  • Faculty and staff did not know what many units
    did
  • Faculty and staff were unsure of many policies
    (how to obtain funds from the Foundation, etc.)
  • Email was the preferred method of communication
    for faculty and staff
  • Overall, faculty and staff were pleased with
    services
  • Wednesday afternoons were the best time to hold
    trainings

46
Overall Lessons Learned Through Program/Unit
Review (cont.)
  • Faculty and staff had good ideas on how to
    improve services, if we just asked them
  • Use of P-Cards was saving the college time and
    money, and faculty/staff preferred using them
  • Faculty and staff wanted to continue to receive
    paycheck receipts in the mail (all employees are
    paid by direct deposit)
  • More than 80% of faculty and staff were using
    remote access to email
  • Faculty and staff wanted changes in the budgeting
    process (too complicated)

47
Administrative Response
  • Units spend a lot of time working on the program
    review.
  • If we want them to take it seriously, we have to
    take it seriously.
  • Once reviews are reported at the end of the year,
    their Dean or Vice President/Chancellor needs to:
  • 1. Read them
  • 2. Respond to them

48
From Personal Experience
  • Sit down with the unit (group meeting)
  • Give them my attention
  • Discuss what was learned
  • Discuss their major issues
  • Discuss what they need to make improvements
  • Actually attempt to direct resources to them to
    make those improvements
  • Then they do not see it as an academic exercise

49
How It Works for Us
  • Administrative Units Reviewed in 2003
  • Administrative Technology Services
  • Compliance Audit, Inventory Control, Basic
    Skills Compliance Reporting
  • Facilities Services
  • Information Technology Services
  • Institutional Advancement/Foundation
  • Planning and Research
  • Resource Development

50
Faculty/Staff Survey 2003
  • During Spring 2003, an on-line faculty/staff
    survey was developed, administered and analyzed.
  • A total of 222 faculty and staff completed the
    survey, with the following numbers by campus and
    type:

  By campus: Central 156, North 6, Northeast 13,
  Levine 20, Southwest 9, West 15, Other 3
  By job type: Classified Staff 51, Professional
  Staff 55, Faculty 68, Administration 48
  By status: Full-time 210, Part-time 12
  Total: 222
51
Facilities Services
  • Consists of the following departments:
  • Facilities Design and Construction: oversees
    planning, design, and construction of new
    facilities and major renovations
  • Facilities Management: operates and maintains
    existing facilities of the College
  • Security: provides physical security services at
    all campuses
  • Distribution Services: manages collection and
    delivery of U.S. and intercampus mail, package
    deliveries, and centralized shipping and
    receiving services
  • Inventory Control: manages the College's
    inventory tracking, reporting, and disposal
    functions

52
Facilities Services
  • Survey respondents gave high rankings (in the
    80-90% range) for:
  • Newly constructed or newly renovated facilities
    being of high quality and meeting needs
  • Knowing what was planned for the future
    facilities of their respective campuses
  • Classrooms in buildings built since 1990 being
    adequate in size, furnishings and other amenities

53
Facilities Services
  • Classrooms and common areas being generally clean
  • Lighting levels being adequate
  • A safe and secure environment being provided for
    all members of the College community
  • Knowing the procedures to follow in the event of
    a medical emergency on campus
  • Providing mail, shipping and receiving, and
    inventory control services

54
Areas Needing Improvement And Actions To Be Taken
  • Only 73% of respondents were pleased with new
    office spaces.
  • To address this concern, the Facilities Standards
    Subcommittee of Facilities Partners will review
    the adopted standards for offices and furnishings
    and make recommendations for revisions.
  • Only 50% of faculty and staff were pleased with
    the temperatures in classrooms.
  • The Central Energy Plant began operating in the
    summer of 2003; when connected to the main body
    of Central Campus, the back-up capabilities of
    this facility should minimize the disruption of
    cooling. The College is also developing a plan
    for the future pairing of boilers to provide
    back-up heating for Central Campus buildings. At
    the suburban campuses, central plant approaches
    are being used to link buildings, thereby
    providing back-up capability.

55
Areas Needing Improvement And Actions To Be Taken
  • Only 57% of the respondents expressed
    satisfaction with the cleanliness of bathrooms.
  • The College entered into a new housekeeping
    contract as of July 1, 2003, which employs a new
    vendor and has enforceable cleanliness standards
    that must be met. Additionally, Facilities
    Management is addressing the appearance of some
    of the older restroom facilities on Central
    Campus as part of a cyclical rehabilitation
    program.

56
Future Considerations
  • The College will be adapting its long-term
    capital planning to new guidelines from the
    County and seeking the use of alternative funding
    sources.
  • Facilities Services will be working with other
    units of the College to ensure coordinated
    planning relating to the staffing and specialty
    services for new facilities.
  • Facilities Services will be developing new
    strategies for meeting operational demands in the
    face of decreasing dollars per square foot in the
    County budgets.

57
Planning and Research
  • Survey respondents gave high rankings (in the
    80-90% range) for:
  • Responding quickly to their need for
    data/information.
  • Producing enough reports annually to meet the
    planning and information needs of faculty and
    staff.
  • Making a significant contribution to the College.
  • Contributing to the effectiveness of CPCC.
  • Assisting in the area of survey
    development/design.
  • Assisting with program/unit review.
  • Having a quality Institutional Effectiveness
    Website.
  • Having a quality Institutional Research Website.
  • Having a quality Fact Book (available online).

58
Strengths and Weaknesses
  • Strengths Identified
  • Serving the College well
  • Contributing to the College's effectiveness
  • Doing a good job
  • Assisting the College well with survey design and
    program review support
  • Providing helpful web sites and Fact Book
  • How We Can Improve
  • Increase awareness by communicating its services
    and results
  • Take a look at the websites for possible
    improvement
  • Conduct more studies

59
Future Issues
  • Strategies for Change
  • The department will consider a name change
  • The webmasters for the two websites will assess
    them over the next year to attempt to reduce
    clutter
  • The senior research analysts will undertake
    additional studies for the college (one each over
    the next year)

60
Future Issues
  • Future Needs/Plans
  • Expect to open the Center for Community-based
    Research
  • Will need to increase space (plus parking) and
    staff for this project
  • Send staff to training on new SACS criteria
  • Increased software for survey analysis and
    qualitative research

61
Information Technology
  • Survey respondents gave high rankings (in the
    80-90% range) for:
  • Courteous and helpful staff in answering
    questions and resolving issues with
    administrative systems access.
  • Providing regularly scheduled reports accurately
    and on time.
  • Reliability and availability of systems,
    especially during heavy registration.
  • Awareness of the statewide implementation of the
    Datatel Colleague system this year.

62
Strengths and Weaknesses - Administrative
Computer Services
  • Strengths Identified
  • Strong staff retention
  • Strong understanding of policies and procedures
  • Consistency of services
  • Established relationships with other departments
  • Reliable Mainframe system/applications
  • What We Can Improve
  • Complex and inconsistent applications
  • Knowledge base
  • Users unclear on best usage/practices on new CIS
  • Inconsistent knowledge/support of both systems
  • Communication

63
Strengths and Weaknesses - Help Desk
  • Strengths Identified
  • Customer Service
  • Progressive Implementation of Technology / New
    Services
  • Existence of Robust Infrastructure to allow for
    implementation of new technology
  • How We Can Improve:
  • Email concerns
  • Automate the account request process / eliminate
    paper
  • Insufficient human resources, preventing suitable
    marketing / awareness of IT Services
  • Communication, internal and external
  • Follow-up and customer service

64
Strategies for Change
  • Speed up email migration
  • Communicate scheduled downtime
  • Automate the account request process / eliminate
    paper
  • Develop awareness of IT Services
  • Communication, internal and external
  • Technology training
  • Follow-up and customer service

65
Strategies for Change
  • Future Needs (staff, resources):
  • Implement a marketing initiative
  • Improve department image to customers
  • Provide campus-wide technology forums
  • Increase customer awareness
  • Resources:
  • Consolidate CPCC Call Center Services to maximize
    college resources, provide consistent
    information, and increase services provided to
    all customers.

66
Future Issues
  • Strategies for Change
  • Develop solution to integrate applications
  • Focus on Application Framework, especially with
    standards
  • Portal development
  • Cross-train staff to improve effectiveness and
    support
  • Understand best means of communication, plan for
    it, and communicate internally/externally
  • Oral communication highlighting major events
  • Published communication detailing projects,
    issues, and maintenance
  • Published communication on major project statuses

67
Future Issues
  • Future Needs (staff, resources, etc.)
  • Staff
  • Architect to properly guide the framework and
    standards development
  • Technical Writer/Trainer to assist with
    communication
  • Especially focus on patches from Datatel/ACS
  • Resources
  • Microsoft Project Server
  • Assist with internal communication of projects

69
An Example from Instruction
  • Workplace Basic Skills
  • This program is a literacy initiative that goes
    directly into the worksite and teaches ESL
    classes, GED prep and GED classes.
  • During their review, they surveyed both employers
    and students.
  • This was the first time they had ever done this.

70
What They Learned
  • Employers said:
  • 43.8% of employers reported increases in employee
    performance as a result of participation in the
    program.
  • 31.3% reported a reduction in absenteeism by
    participants.
  • 87.5% said classes improved the morale of their
    employees.
  • 37.5% said participants received raises.
  • 50% said communication had improved.

71
What Students Said
  • 70.2% reported being able to fill out job forms
    better
  • 35.5% said they could now help their children
    with their homework
  • 91.1% said they felt better about themselves
  • 44.4% said they had received a raise, promotion
    or opportunity as a result of the courses
  • 86.3% said their ability to communicate in the
    workplace had improved

72
What Has Happened Since
  • Their assessment data has shown up in their
    marketing brochures to employers.
  • Their enrollment has grown dramatically.
  • They have received funding and marketing support
    from Charlotte Reads (considered a model adult
    literacy program).

73
Best Results of Best Practice
  • Better use of data across the college
  • We have become more client/customer focused
  • Review gives direction for goals and needed
    changes
  • Departments are empowered to do their jobs (can't
    slip through the cracks and be unnoticed)
  • Problems must be resolved (there is no hiding and
    no excuses)
  • Surveys provide needs assessment data as well as
    evaluation data, which gives departments
    direction

74
For a copy of this presentation
  • Contact Terri Manning or Kathy Drumm:
  • terri.manning@cpcc.edu
  • kathy.drumm@cpcc.edu
  • Download or print the presentation:
  • http://inside.cpcc.edu/planning
  • Click on "Studies and Reports"
  • Listed as "SACS Presentation"