Assessment Techniques for Curricular Improvement
  • Roxanne Canosa, Rajendra K. Raj
  • Department of Computer Science
  • Rochester Institute of Technology

  • What is Assessment?
  • Analytic vs. Holistic Approaches
  • Assessment vs. Grading
  • Terminology
  • Assessment vs. Accreditation
  • Outcomes vs. Objectives
  • Performance Criteria
  • Direct vs. Indirect
  • Evaluation and Continuous Improvement

What is Assessment?
  • Assessment is one or more processes that
    identify, collect, and prepare data to evaluate
    the achievement of program outcomes and
    educational objectives
  • 2006-2007 Criteria for Accrediting Computing
    Programs Appendix A (Proposed Changes)
  • From Section II.D.1 of the ABET Accreditation
    Policy and Procedure Manual

Analytic vs. Holistic Approaches
  • Analytic approach
    • All students/courses analyzed to diagnose areas
      in need of improvement
  • Holistic approach
    • Focus on overall performance of the program
    • Input from employers, alumni, advisory board
  • Develop efficient and effective processes
    • Lean, mean assessment machine
  • "Don't commit random acts of assessment"
    – Gloria Rogers

What is Your Assessment Goal?
  • Assessing all students or specific groups of
    students?
  • Assessing students, the department, or the program?
  • Assessing for short-term or long-term improvement?
  • Assessing for formative or summative purposes?

Grading vs. Assessing
  • Grading
    • Measures the extent to which a student meets faculty
      requirements and expectations for a course
    • Can grades be used to infer a student's achievement
      of an outcome?
    • Confounding factors
      • Student knowledge
      • Work ethic
      • Faculty variance in course content, grading
        components, beliefs, bias, etc.
  • Assessing
    • Measures the extent to which a student achieves each
      course (program) outcome
    • Can we leverage grading components for assessment?
    • Use rubrics, which are pre-announced performance
      criteria

Assessment vs. Accreditation
  • Institutional accreditors (Middle States, SACS,
    etc.) are increasingly requiring direct
    assessment of program objectives and outcomes
  • Jargon may be different, but the essential ideas
    are the same

Terminology (Jargon)
From ABET perspective
Terminology Lessons
  • Use the terminology that fits your situation
  • Sometimes dictated by institutional accreditation
    (SACS, Middle States)
  • Sometimes dictated by program accreditation
  • Keep a glossary of terms handy for any external
    review
  • Stick to your terminology
  • Terms are not fungible without causing too much
    confusion

Proposed Changes to ABET Criteria for Computing
  • Old criteria
  • Intents and Standards
  • New criteria (2008-2009 cycle)
  • General
  • Program Specific

New ABET Criteria
  • 8 General Criteria
  • Students
  • Program Educational Objectives
  • Program Outcomes (a) through (i)
  • Assessment and Evaluation
  • Curriculum
  • Faculty
  • Facilities
  • Support
  • CS Program Specific Criteria
  • Outcomes and Assessment (a) and (b)
  • Faculty Qualifications
  • Curriculum (a), (b), and (c)
  • IT/IS Program Specific Criteria

Program Audit Concern
  • Concern
  • A criterion is currently satisfied; however, the
    potential exists for this situation to change in
    the near future such that the criterion may not
    be satisfied. Positive action is required to
    ensure full compliance with the Criteria.

Program Audit Weakness
  • Weakness
  • A criterion is currently satisfied but lacks
    strength of compliance that assures that the
    quality of the program will not be compromised
    prior to the next general review. Remedial action
    is required to strengthen compliance with the
    Criteria

Program Audit Deficiency
  • Deficiency
  • A criterion is not satisfied. Therefore, the
    program is not in compliance with the Criteria
    and immediate action is required.

Program Objectives
  • Program educational objectives are broad
    statements that describe the career and
    professional accomplishments that the program is
    preparing graduates to achieve.
  • Long-term goals
  • Should be distinct to your program
  • Should be publicly available
  • Must be measurable!

Program Outcomes
  • Program outcomes are narrower statements that
    describe what students are expected to know and
    be able to do by the time of graduation. These
    relate to the skills, knowledge, and behaviors
    that students acquire in their matriculation
    through the program.
  • Should be publicly available
  • Must be measurable!

Objectives vs. Outcomes
  • Example objective
  • Graduates will exhibit effective communication
    skills
  • Example outcomes
  • By the time of graduation, students will
  • demonstrate effective written communication skills
  • demonstrate effective oral communication skills

- Gloria Rogers
Performance Criteria
  • Define and describe progression toward meeting
    important components of work being completed,
    critiqued, or assessed
  • Student provides adequate detail to support
    his/her solution/argument
  • Student uses language and word choice appropriate
    for the audience
  • Student work demonstrates an organizational
    pattern that is logical and conveys completeness
  • Student uses the rules of standard English
  • Provide solid evidence of progression
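The performance criteria above can be made concrete as a rubric. A minimal sketch, assuming a hypothetical 1–4 scoring scale and invented criterion names (none of this is prescribed by ABET or by the presenters):

```python
# Hypothetical sketch: a rubric encodes pre-announced performance
# criteria, each scored on a fixed 1-4 scale (1 = unsatisfactory,
# 4 = exemplary). Criterion names and the scale are illustrative.

RUBRIC = {
    "supporting_detail": "Provides adequate detail to support the solution/argument",
    "audience_language": "Uses language and word choice appropriate for the audience",
    "organization": "Work shows a logical, complete organizational pattern",
    "standard_english": "Follows the rules of standard English",
}

def score_submission(scores: dict) -> float:
    """Average the per-criterion scores for one piece of student work."""
    missing = set(RUBRIC) - set(scores)
    if missing:
        raise ValueError(f"unscored criteria: {sorted(missing)}")
    return sum(scores.values()) / len(scores)

print(score_submission({
    "supporting_detail": 3,
    "audience_language": 4,
    "organization": 3,
    "standard_english": 2,
}))  # 3.0
```

Because the criteria are announced in advance and scored separately, the same rubric can be applied across sections and instructors, which is what makes the scores usable as assessment evidence rather than just grades.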

What is Solid Evidence?
  • Direct Evidence
  • Easier to measure
  • Familiar to most faculty - exam or project
    grades, presentation skills, etc.
  • Indirect Evidence
  • Difficult to measure
  • Attitudes or perceptions
  • For example, a desired outcome of a course may
    include improving students' appreciation of
    teamwork

Direct vs. Indirect Assessment
  • The assessment process should include both
    indirect and direct measurement techniques
  • A variety of sources should be used
  • Employers, students, alumni, etc.
  • Converging evidence from multiple sources can
    reduce the effect of any inherent bias in the data

Direct Assessment
  • Direct examination or observation of student
    knowledge or skills against stated, measurable
    criteria
  • Faculty typically assess student learning
    throughout a course using exams/quizzes,
    demonstrations, and reports
  • Sample what students know or can do
  • Provide evidence of student learning

Direct Assessment of PEOs
  • Employment statistics
  • Promotions and career advancement of graduates
  • Job titles, advanced degrees earned, additional
    course work taken after graduation, etc.
  • PEOs must be assessed separately from POs

Direct Assessment of POs
  • Common final exams
  • Locally developed exit exams
  • Standardized regional or national exit exams
  • External examiner
  • Co-op reports from employers
  • Portfolios of student work

Indirect Assessment
  • Indirect assessment of student learning
    ascertains the perceived extent or value of
    learning experiences
  • Assess opinions or thoughts about student
    knowledge or skills
  • Provides information about student perception of
    their learning and how this learning is valued by
    different constituencies

Indirect Assessment Measures
  • Exit and other kinds of interviews
  • Archival data
  • Focus groups
  • Written surveys and questionnaires
  • Industrial advisory boards
  • Employers
  • Job fair recruiters
  • Faculty at other schools

Survey of Assessment Methods
Direct and Indirect
  • Duality of some instruments, e.g., an exit survey
  • Indirect
  • Survey of opinions about the perceived value of
    the program components
  • Direct
  • If the person asking the questions uses it as a
    way of assessing students' skills (e.g., oral
    communication), then the survey is being used as
    a direct measure of the achievement of that
    outcome

Evaluation
  • Evaluation is one or more processes for
    interpreting the data and evidence accumulated
    through assessment practices. Evaluation
    determines the extent to which program outcomes
    or program educational objectives are being
    achieved, and results in decisions and actions to
    improve the program.

Continuous Improvement
  • Accreditation boards are moving towards
    outcomes-based assessment of CS, IS, and IT
    programs
  • Programs must have an established outcomes-based
    assessment plan in place (or at least be making
    progress in that direction)
  • Process must be documented
  • Process must show continuous improvement (both
    quantitatively and qualitatively)

Faculty Responsibility
  • All faculty must have a commitment to and be
    directly involved in the evaluation of program
    educational objectives and program outcomes, as
    well as the process for continuous improvement of
    the program

Need for Faculty and Staff Buy-In
  • What makes most academics tick?
  • Rewards
  • Money?
  • Fun?
  • Appreciation?
  • Recognition?
  • How to encourage involvement?
  • We all resent any extra work!

Where to Begin?
  • Define your Mission Statement
  • Define your Program Educational Objectives (PEOs)
  • Define your Program Outcomes (POs)
  • Define Course Outcomes (COs)
  • Include specific course outcomes on each course
    syllabus
  • Make publicly available

Then What?
  • Show how course outcomes map to program outcomes
  • Show how program outcomes map to program
    educational objectives
  • Choose measurement tools, both direct and indirect
  • Collect data
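The mapping and data-collection steps above can be sketched in code. This is an illustrative toy, not the presenters' method: the outcome names (CO1, PO_a, ...), the 1–4 scores, and the "score ≥ 3 meets the outcome" threshold are all assumptions for the sake of the example.

```python
# Hypothetical sketch: map course outcomes (COs) to program outcomes
# (POs), then roll per-student rubric scores up into a per-PO
# achievement rate. All identifiers and thresholds are illustrative.

CO_TO_PO = {
    "CO1": ["PO_a"],          # e.g., written communication
    "CO2": ["PO_a", "PO_b"],  # e.g., oral communication, teamwork
}

# Direct-assessment data: one rubric score per student per CO (scale 1-4).
SCORES = {
    "CO1": [4, 3, 2, 4],
    "CO2": [3, 3, 4, 2],
}

THRESHOLD = 3  # a score >= 3 counts as "meets the outcome"

def po_achievement(co_to_po, scores, threshold):
    """Fraction of scores meeting the threshold, aggregated per program outcome."""
    met, total = {}, {}
    for co, pos in co_to_po.items():
        for po in pos:
            for s in scores.get(co, []):
                total[po] = total.get(po, 0) + 1
                met[po] = met.get(po, 0) + (s >= threshold)
    return {po: met[po] / total[po] for po in total}

print(po_achievement(CO_TO_PO, SCORES, THRESHOLD))
# {'PO_a': 0.75, 'PO_b': 0.75}
```

The resulting per-outcome rates are exactly the kind of digestible summary (chart, table) that the next step asks faculty to evaluate.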

  • Present data to faculty in an easily digestible
    format
  • Charts, graphs, tables, etc.
  • Faculty evaluate the data
  • Are students actually learning the material that
    the faculty believe (and claim) they are learning?
  • Faculty make recommendations for improvement as
    needed

The Big Picture
  [Diagram: the assessment cycle. The mission statement,
  with input from stakeholders (students, alumni,
  employers, faculty, ...), drives program objectives and
  program outcomes; performance criteria and course
  outcomes shape educational practices/strategies;
  evidence is assessed (collected and analyzed) and then
  evaluated (interpreted, with action taken), feeding
  back into the program.]
The Big Picture
  • Show relationship between mission statement,
    objectives, and outcomes
  • Assess and evaluate objectives and outcomes
  • Map program outcomes to program objectives
  • Map course outcomes to program outcomes
  • Identify weaknesses and implement focused
    improvements in targeted areas

  • All assessment methods have their limitations and
    contain some bias
  • Meaningful analysis requires both direct and
    indirect measures from a variety of sources
  • Students, alumni, faculty, employers, etc.
  • Multiple assessment methods provide converging
    evidence of student learning

Assessment Lessons
  • Cannot do everything at once
  • Try an approach for the first round; learn and refine
  • Having data isn't all there is to it!
  • Easy to generate lots of bad data
  • One size fits all? NOT!
  • Programs, courses, instructors all differ
  • Be ready to compromise
  • Perfection is neither possible nor desirable
  • Faculty evaluation and promotion
  • Do not tie these to data generated from assessment
