1
Sharing Data: Developing a Model and a Website
for the Collection and Analyses of Data
  • Joni E. Spurlin
  • Sarah A. Rajala

2
Questions Posed by Faculty
  • Do I have to develop new assessment methods for
    each outcome?
  • How much data should I gather to address each
    outcome?
  • What types of data should I gather?
  • How can I ensure that all faculty have access to
    all the data?
  • In what ways can we link program outcomes to
    outcomes in each course?

3
Our Goals
  • Have a process and documentation that would achieve program
    improvement, programmatic accreditation, regional accreditation, and
    institutional program review
  • Principle of Good Practice for Assessing Student Learning: Through
    assessment, educators meet responsibilities to students and to the
    public.
  • Be oriented to program objectives and outcomes, not to methodology
  • Be developed with limited resources: easy to put into place, easy to
    use, and easy to update

4
(No Transcript)
5
(No Transcript)
6
(No Transcript)
7
Faculty Question
  • In what ways can we link program outcomes to
    outcomes in each course?
  • (Break into small groups to discuss)

8
Faculty Question: In what ways can we link
program outcomes to outcomes in each course?
  • Matrix of courses within program vs. program
    outcomes
  • Does that course add significantly to the
    learning of that outcome? Does that course add
    significantly to the assessment of that outcome?
  • Relationship of course outcomes to program
    outcomes
  • Redefine course outcomes as learning outcomes
  • Map to program outcomes

9
ABET Criteria 3a-k
  • A. Math, science, engineering
  • B. Design, conduct experiments
  • C. Design components, systems, processes
  • D. Multi-disciplinary teams
  • E. Solve engineering problems
  • F. Professional conduct, ethics
  • G. Effective communication skills
  • H. Societal and global impact
  • I. Life-long learning
  • J. Contemporary issues
  • K. Modern engineering tools

10
Engineering Program X: Relationship of Courses to Program Outcomes
  • I = Implement (teach) the outcome in the course in a major way
  • A = Assess the outcome in the course in a major way
  • IA = Do both
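Such a matrix is straightforward to capture in software. Below is a
minimal Python sketch (the course numbers and outcome letters are
hypothetical, not from the presentation) that records the I/A/IA
markings and answers the question "which courses assess a given outcome
in a major way?":

```python
# Hypothetical course-vs-outcome matrix; "I" = implement (teach) the
# outcome in a major way, "A" = assess it in a major way, "IA" = both.
matrix = {
    "ENG 201": {"a": "I", "c": "IA"},
    "ENG 302": {"b": "A", "c": "I", "k": "IA"},
    "ENG 402": {"c": "IA", "d": "I", "g": "A"},
}

def courses_assessing(outcome):
    """Courses that assess the given program outcome in a major way."""
    return [course for course, marks in matrix.items()
            if marks.get(outcome) in ("A", "IA")]

print(courses_assessing("c"))  # -> ['ENG 201', 'ENG 402']
```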
 
11
Course Contribution to Program Outcomes
  • Major (4): Topics are fully introduced, developed, and reinforced
    throughout the course in lectures, labs, homework assignments, tests,
    exams, and projects: an "application knowledge"
  • Moderate (2): Topics are introduced and further developed and
    reinforced in course lectures, labs, assignments, tests, etc.: a
    "working knowledge"
  • Minor (1): Topics are introduced in course lectures, labs, homework
    assignments, etc.: a "talking knowledge" or "awareness"

12
List Course Outcomes in a Spreadsheet and Sort as
Needed
13
Develop Classroom-Based Assessment for Related
Program Outcomes
14
Faculty Questions
  • How much data should I gather to address each
    outcome? (e.g., How many different methods? For
    how many years?)
  • What types of data should I gather?
  • (Break into small groups to discuss)

15
Faculty Questions
  • How much data should I gather to address each outcome?
  • What types of data should I gather?
  • How can I ensure that all faculty have access to all the data?
  • See Figure 3, page 6 of the article
  • See pages 7-11 for a detailed explanation of the data that can be
    collected

16
DATABASE
  • Information About Students
  • Course Assessments
  • Students' Ability Reported by Students
  • Satisfaction Reported by Students
  • Students' Ability Reported by Employers
  • Nationally Normed Tests
  • Information About Faculty and Courses
  • Information About Facilities and Equipment
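A minimal sketch of how such a database could be laid out, assuming
SQLite and invented table and column names (the presentation does not
specify the actual NCSU schema):

```python
import sqlite3

conn = sqlite3.connect("assessment.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS outcome (
    id          TEXT PRIMARY KEY,   -- e.g. ABET letter 'a'..'k'
    description TEXT
);
CREATE TABLE IF NOT EXISTS evidence (
    id         INTEGER PRIMARY KEY,
    outcome_id TEXT REFERENCES outcome(id),
    source     TEXT,     -- 'course assessment', 'senior survey',
                         -- 'employer survey', 'FE exam', ...
    year       INTEGER,
    summary    TEXT
);
""")
conn.commit()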
17
(No Transcript)
18
Website Organized by Outcomes
  • Data is organized by outcome
  • Data is collected from 2-4 sources per outcome
  • Only data related to outcomes is collected and analyzed. It is
    possible to collect more data than can be effectively used; this
    process helps limit extraneous data collection
  • Click on an outcome and see all the data related to that outcome for
    the past 3-5 years (see the sketch below)
  • Principle of Good Practice for Assessing Student Learning: Assessment
    is most effective when it reflects an understanding of learning as
    multidimensional, integrated, and revealed in performance over time.
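"Click on an outcome" then reduces to a single query that pulls every
piece of evidence for that outcome across the retained years. A sketch
against the hypothetical schema above:

```python
def evidence_for(conn, outcome_id, first_year):
    """All evidence for one outcome since first_year, newest first."""
    return conn.execute(
        """SELECT year, source, summary FROM evidence
           WHERE outcome_id = ? AND year >= ?
           ORDER BY year DESC, source""",
        (outcome_id, first_year),
    ).fetchall()

# e.g. everything recorded for outcome 'c' over the last several years:
# evidence_for(conn, "c", 1998)
```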

19
(No Transcript)
20
Mapped Survey Questions to Program Outcomes
  • Senior Survey
  • Alumni Survey
  • Employer Survey
  • The more specific the outcome, the easier it is to find matching
    questions
  • See our website for the mapping scheme (a sketch of such a mapping
    follows)
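The mapping itself can be as simple as a lookup table. A sketch with
invented question IDs (the real scheme is on the website cited above):

```python
# Hypothetical survey-question-to-outcome mapping.
question_to_outcome = {
    "senior_q12":  "a",   # confidence applying math and science
    "alumni_q7":   "g",   # communication skills on the job
    "employer_q3": "d",   # performance on multidisciplinary teams
}

def questions_for(outcome):
    """All survey questions mapped to the given program outcome."""
    return [q for q, o in question_to_outcome.items() if o == outcome]

print(questions_for("g"))  # -> ['alumni_q7']
```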

21
Course Assessments
Textile Engineering Outcome: TE graduates will demonstrate the ability
to design and develop useful products, processes, machines, and systems.
Assessment Method: Major report in TE 402, using an evaluation rubric
22
Evaluation of Students' Design Capabilities
TE 402 Textile Engineering Senior Design, Spring 2002 summary
(Rubric excerpt: for each dimension, the percentage of students rated
N/A, poor, good, or superior; dimensions include "Problem Definition:
the student should..." and "Concept Refinement: the student should...")
23
Nationally Normed Tests
  • Industrial Engineering Outcome
  • To demonstrate that graduates have an ability to
    apply knowledge of mathematics, science, and
    engineering, they should show that they can
    employ general principles, theories, concepts,
    and/or formulas from mathematics, science, and
    engineering in the solution of a wide range of
    industrial engineering problems.
  • Assessment Method
  • Fundamentals of Engineering Examination, AM General Part (subject
    areas: Mathematics, Chemistry, Statics, Computers, Electrical
    Circuits, Engineering Economics).

24
FE Exam: Industrial Engineering Students
(Chart: percentage of NCSU students who passed vs. percentage of
students nationally who passed)
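The comparison behind such a chart is simple arithmetic; the counts
below are invented, since the slide's actual figures are not in the
transcript:

```python
# Hypothetical pass counts for one exam administration.
ncsu     = {"passed": 18,   "took": 22}
national = {"passed": 9100, "took": 13000}

ncsu_rate     = ncsu["passed"] / ncsu["took"]
national_rate = national["passed"] / national["took"]
print(f"NCSU: {ncsu_rate:.0%}   National: {national_rate:.0%}")
# -> NCSU: 82%   National: 70%
```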
25
Link to University Planning Interactive Database
University Planning and Analysis maintains an interactive database with
many variables, so faculty do not have to collect this data; they can
simply use it.
26
Faculty Questions
  • Now that I have collected this data for the past
    year, what do I do with it?
  • What do I report on? How do I report it?
  • How do I show changes to the program over time?

27
(No Transcript)
28
Reports Organized by Outcomes
  • Reports are organized by outcome
  • Click on an outcome to see its assessment reports

Principle of Good Practice for Assessing Student Learning: Assessment
makes a difference when it begins with issues of use and illuminates
questions that people really care about.
29
NCSU Annual Report Format

Section 1: Summary of Assessment Results

This section should summarize the results from the assessment conducted
the prior year. It should summarize what you have learned. It should not
be pages of data, but should review what the major findings were and
what they suggest. It is recommended that the summary reflect the
program outcomes or program educational objectives defined in the
assessment plan.

Suggested outline (a generation sketch follows):
  Outcome 1: list outcome
      Assessment method actually used
      Summary of findings
      Location of data
  Outcome 2: list outcome
      Assessment method actually used
      Summary of findings
      Location of data
  Continue until all outcomes have been reported on.
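Once findings live in one place, Section 1 can be generated
mechanically. A sketch with one invented finding (the field names and
data are assumptions, not the NCSU format's internals):

```python
# Hypothetical findings pulled from the assessment database.
findings = [
    {"outcome":  "Apply knowledge of mathematics, science, engineering",
     "method":   "FE exam, AM general part",
     "summary":  "Pass rate above national average; weakness in math",
     "location": "assessment website, outcome (a) page"},
]

for i, f in enumerate(findings, 1):
    print(f"Outcome {i}: {f['outcome']}")
    print(f"    Assessment method actually used: {f['method']}")
    print(f"    Summary of findings: {f['summary']}")
    print(f"    Location of data: {f['location']}")
```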
30
Report Format (Continued)

Section 2: Proposed Modifications to Curriculum or Program

2A. Modifications made during the reporting period: Describe any actions
taken based on the assessment data; list modifications according to
outcomes/objectives.

2B. Modifications planned for the coming year: Describe any actions to
be taken based on the assessment data; list modifications according to
outcomes/objectives.

Section 3: Modifications to the Assessment Plan

Record any modifications to be made to the assessment plan. This section
may be very short. Note any assessment methods that do not need to be
continued, and why.
31
Example
  • Civil Engineering Outcome: To demonstrate that graduates have an
    ability to apply knowledge of mathematics, science, and engineering
  • Assessment Report: Assessment of FE data and survey data indicated
    that students had difficulty with differential equations and other
    math concepts
  • Added Math 245 in Fall 1998.
  • Students' reported ability in math has risen since the new course
    was added to the curriculum. It is also higher than that of students
    in other engineering programs at NCSU.

32
(Chart: students' reported math ability over time, annotated "Added
Math 245 Here")
33
Issues Raised By Faculty
  • Do I keep actual student work or just summary
    data reports?
  • I'd rather see pictures than numbers.
  • I want it all on one page!
  • Should I make comparisons to other programs?

34
Final Discussion
  • How well does the model (Figures 1-3) fit with
    your understanding of assessment?
  • Do you think this model will work on your campus?
  • How would you change the model to fit your
    campus?
  • (Break into small groups to discuss)

35
Final Points on Developing Database and Website
  • Based on Assessment Model
  • Outcomes Oriented
  • All data, findings, and reports in one place
  • Available to ALL faculty, at ALL times
  • Keeps several years of data on display
  • Evaluators may use the website or a CD
  • Keep it Simple!

36
Website URL and Contact Info
  • www.engr.ncsu.edu/assessment
  • Joni E. Spurlin, Ph.D., Director of Assessment, College of
    Engineering, North Carolina State University
    jespurli@eos.ncsu.edu, 919-513-4626
  • Sarah A. Rajala, Ph.D., Associate Dean, College of Engineering,
    North Carolina State University
    sar@eos.ncsu.edu, 919-515-3693