Title: Assessing Short and Long-Term Value of the Research Experience
1. Assessing Short and Long-Term Value of the Research Experience
- Karen Webber Bauer, session facilitator
- Reinvention Center Assessment Specialists Meeting
- 11/8/06 - kwbauer_at_uga.edu
2. Assessment in Higher Education
- An assumed practice; see the recent report from the Spellings Commission
- External and internal reasons
- Institutional effectiveness, learning within the curriculum, and program effectiveness
3. What is Assessment?
- Assessment is a means for focusing our collective attention, examining our assumptions, and creating a shared culture dedicated to continuously improving the quality of higher learning. Assessment requires making expectations and standards for quality explicit and public; systematically gathering evidence on how well performance matches those expectations and standards; analyzing and interpreting the evidence; and using the resulting information to document, explain, and improve performance. (Thomas A. Angelo, AAHE Bulletin, April 1995, p. 11)
4. Why engage in assessment of Undergraduate Research?
- External accountability needs
- Increase funding (or keep what you have)
- Better understand students
  - their needs
  - the learning process
  - the interactions between students and faculty
- The value-added dimensions of the college experience
5. How to best engage in assessment of UR?
- Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time.
- Principles of Good Practice for Assessing Student Learning, AAHE, 2004
6. Assessment vs. Evaluation
7. Steps in Assessment Planning
- Step 1. Specify intended educational outcomes
- Step 2. Identify methods and measures
- Step 3. Obtain assessment results
- Step 4. Report assessment results
- Step 5. Close the loop (i.e., use results to improve the program)
8. Outcomes for Institutional Effectiveness vs. Student Learning
- 1. Institutional effectiveness: the broad, summative measures that indicate success of the institution
- 2. Student learning outcomes: encompass a wide range of student attributes and abilities that measure how the college experience supports individual development
  - Cognitive: acquisition of specific knowledge and skills (i.e., in the major); pre-post measures
  - Affective: how the college experience has affected student values, goals, attitudes, self-concepts, world views, and behaviors
9. Assessment of the University of Delaware's Undergraduate Research Program
10. NSF RAIRE Award
- UD received one of ten RAIRE awards
- Used a portion of the funds to thoroughly assess the value-added dimensions of Undergraduate Research
- Illustrates the short- and long-term perceived value of UR
11. Major Questions That Emerged
- Does participation in undergraduate research:
  - Sharpen ability to think critically, creatively, synthetically?
  - Develop problem-solving, leadership, teamwork abilities?
  - Increase intellectual curiosity and desire to learn?
- Do alumni perceive benefits of UR?
- What motivates faculty to participate, and what are the obstacles?
- What educational outcomes do faculty perceive for students who participate in research?
12. Four Major Components
- I. Content Analysis
  - previous years' formative evaluations
  - science and engineering sophomores
- II. Alumni Survey
  - all majors, UR and non-UR
- III. Faculty Survey
  - all science and engineering departments
- IV. 4-Year Longitudinal Study, Class of 2000
  - UR and non-UR science and engineering students
13. The Four Components: Faculty Survey, Content Analysis, Alumni Survey, Longitudinal Survey
14. I. Content Analysis
- Content categories: perceived learning
  - Increased technical skills: 96
  - Increased independence: 57
  - Insight into graduate school: 45
  - Teamwork learned and valued: 43
  - Learned to work with obstacles and ambiguities: 37
  - Learned to think creatively/synthetically: 32
  - Increased desire to learn: 32
  - Self-confidence gained: 28
  - Communication skills improved: 24
  - Understanding knowledge: 24
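A content analysis like the one above can be tallied programmatically. The sketch below is only an illustration, not the study's actual coding procedure (which used human raters): it counts how many student reflections mention each category, using invented keyword lists.

```python
from collections import Counter

# Hypothetical category -> keyword mapping; the real study's coding
# scheme was applied by human raters, not keyword matching.
CATEGORIES = {
    "technical skills": ["technique", "instrument", "lab skill"],
    "independence": ["independent", "on my own"],
    "teamwork": ["team", "collaborat"],
}

def tally(reflections):
    """Count how many reflections mention each content category."""
    counts = Counter()
    for text in reflections:
        lowered = text.lower()
        for category, keywords in CATEGORIES.items():
            # A reflection counts at most once per category.
            if any(k in lowered for k in keywords):
                counts[category] += 1
    return counts

# Tiny illustrative corpus of student reflections
sample = [
    "I learned new lab skills and worked with a great team.",
    "Working independently taught me a lot.",
    "Our team collaborated on the instrument setup.",
]
print(tally(sample))
```

Dividing each count by the number of reflections would yield percentages comparable to the figures above.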
15. II. Alumni Survey Results (selected)
- Growth in 8 general cognitive and behavioral skills greater for UR than non-UR alumni:
  - Carry out research
  - Develop intellectual curiosity
  - Acquire information independently
  - Understand scientific findings
  - Analyze literature critically
  - Speak effectively
  - Act as a leader
  - Possess clear career goals
- Growth in 3 factors greater for UR than non-UR alumni:
  - Science, math, logic, problem-solving
  - Literature, language, mastery of contexts
  - Personal initiative and communication
16. III. Faculty Survey Results: Student Skills Gained
- Highest-rated skills (by 77-80% of respondents):
  - Develop intellectual curiosity
  - Think logically about complex materials
  - Understand scientific findings
- Also highly rated (by 63-69% of respondents):
  - Synthesize/use information from diverse sources
  - Solve problems independently
  - Approach problems creatively
  - Maintain openness to new ideas
  - Work as part of a team
17. IV. UDAES (UD Academic Experience Study): Longitudinal Study of Science and Engineering Majors
- Goals
  - Capture info from currently enrolled students
  - Measure change in skills gained over time
  - Maintain comparison groups for all measurements
  - Compare results from several types of instruments
18. Select Longitudinal Study Results
- Personality: Although students overall decreased in neuroticism and increased in openness to experience, no significant differences were found between UR and non-UR students.
- CSEQ: UR students perceived greater increases for themselves than did non-UR students in:
  - academic effort (this self-reported information was also reflected in students' course registrations)
  - scientific and technological skills
19. Select Longitudinal Results, cont.
- WGCTA: Although Biological/Physical Sciences/Chemical Engineering majors with intensive research involvement showed a larger increase over 4 years in critical thinking (logic) than did non-research students in these majors, there were no significant differences by UR.
- RCI: (1) Biological/Social Science majors with intensive research involvement showed a larger increase in reflective judgment over 3 years than did majors in these subjects with less or no research experience. (2) Women with intensive research involvement showed higher gains in reflective judgment over 3 years than women with less or no research experience.
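Comparisons like these rest on gain scores (post minus pre) contrasted across UR and non-UR groups. As a hedged illustration only (not the study's actual analysis), the sketch below computes a Cohen's d effect size for two hypothetical sets of gain scores; a significance test such as a t-test would be layered on top.

```python
import math
from statistics import mean, variance

def cohens_d(group_a, group_b):
    """Standardized mean difference between two lists of gain scores."""
    n_a, n_b = len(group_a), len(group_b)
    # Pooled standard deviation (classic Cohen's d denominator)
    pooled_var = ((n_a - 1) * variance(group_a) +
                  (n_b - 1) * variance(group_b)) / (n_a + n_b - 2)
    return (mean(group_a) - mean(group_b)) / math.sqrt(pooled_var)

# Hypothetical pre-to-post gain scores on some skills measure
ur_gains = [3, 4, 5, 4]      # students with intensive research involvement
non_ur_gains = [2, 3, 2, 3]  # comparison group

print(f"Cohen's d = {cohens_d(ur_gains, non_ur_gains):.2f}")
```

By convention, d near 0.2 is a small effect, 0.5 medium, and 0.8 large.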
20. How Did We Come to the Chosen Method and Measures?
- Clear to me that no one measure would answer the questions
- Important to have a non-UR comparison group
- Important that subjects be unaware of the study's connection to undergraduate research
- Important to control for what you can, e.g., motivation
- Important to examine data from multiple perspectives: alumni, faculty, current students
21. Value of Multiple Measures
- Some constructs, such as cognitive growth, are hard to measure
- Academic and psychosocial behavior change is easier to measure but still tough to separate from extraneous factors
- Multiple measures enabled us to look at different educational outcomes affected by UR
22. Value of Multiple Perspectives
- The faculty study enabled us to examine levels of UR involvement and what faculty think students learn
- It also enabled us to better understand why faculty participate and in what ways they benefit
- Alumni have the advantage of distance and of seeing how educational experiences helped with career or graduate school
- Students can describe their perceptions of their own academic experiences
- Multiple perspectives help tell a robust story
23. Important to Consider Use of Resources
- Comparison group → larger sample
- Large sample → statistical power
- Larger sample → impracticality of more qualitative examination through individual interviews
- A high attrition rate would threaten the generalizability of the study, so follow-up is important
- Larger sample → more personnel time to follow up with nonrespondents
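The sample-size/power trade-off above can be made concrete. The sketch below is an illustration, not the study's actual power analysis: it uses the standard normal-approximation formula for the per-group n needed to detect a standardized effect size d in a two-group comparison of means.

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided, two-sample
    comparison of means, via the normal approximation:
        n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # e.g., 1.96 for alpha=.05
    z_power = NormalDist().inv_cdf(power)          # e.g., 0.84 for power=.80
    return math.ceil(2 * ((z_alpha + z_power) / effect_size) ** 2)

# Smaller effects demand much larger samples:
for d in (0.8, 0.5, 0.2):
    print(f"d = {d}: about {n_per_group(d)} students per group")
```

This is why detecting the modest effects typical of educational interventions requires large samples, and why attrition and nonresponse follow-up consume so much personnel time.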
24. Publications
- Bauer, K.W., & Bennett, J.S. (2003). Alumni perceptions used to assess the undergraduate research experience. Journal of Higher Education.
- Zydney, A.L., Bennett, J.S., Shahid, A., & Bauer, K.W. (2002). Impact of undergraduate research experience in engineering. Journal of Engineering Education, 91(2), 151-157. (Engineering data considered separately.)
- Zydney, A.L., Bennett, J.S., Shahid, A., & Bauer, K.W. (2002). Faculty perspectives regarding the undergraduate research experience in science and engineering. Journal of Engineering Education, 91(3), 291-297.
- Content analysis: http://www.udel.edu/RAIRE
25. Thanks to and acknowledgement of those who assisted with this project:
- Dr. Joan Bennett, Director, Undergraduate Research Program, UD
- Dr. Phil Wood, Associate Professor, University of Missouri
- Dr. Abdus Shahid, Research Associate, University of Pennsylvania
- Postdoctoral and graduate students: Hye-Sook Park, Sarah Fine, and Christine Ward
26. So where are we with assessment in UR today?
- The need to assess still exists
- List possible methods that could be used
  - advantages, disadvantages
- List possible measures
  - advantages, disadvantages
- Other considerations
  - Resources available: time, funding, personnel, faculty and admin support
27. Methods and Measures
- Quantitative methods
- Qualitative methods
- Triangulation of data from the above
- Cross-sectional
- Longitudinal
28. Sources for Academic Quality & Effectiveness Data
- Prospective Student Surveys
- Entering Student Surveys (demographics, attitudes, values, goals)
- Assessment of General Education
- Assessment in the Major
- Freshman-to-Senior-Year Cohort Studies
- Student Satisfaction Surveys
- Alumni Studies
- Faculty Productivity
- Program/Department Review
29. Methods of Assessment
- Direct
  - Course-embedded
  - Portfolio
  - Professional juror
  - Performance
  - Thesis/senior report
  - Focus group
  - Test/exam
- Indirect
  - Tracking student data
  - Surveys
    - Paper/pencil
    - Web
    - Telephone poll or interview
30. Qualitative and Course-Embedded
- Developing scoring rubrics
  - http://www.bridgew.edu/AssessmentGuidebook/rubrics.cfm
  - http://www.bgsu.edu/offices/assessment/Rubrics.htm
  - http://pareonline.net/getvn.asp?v=7&n=3
  - http://intranet.cps.k12.il.us/Assessments/Ideas_and_Rubrics/ideas_and_rubrics.html
- Using portfolios in assessment
  - http://www.aacu.org/issues/assessment/portfolio.cfm
  - http://www.indiana.edu/reading/ieo/bibs/portfoli.html
  - http://www.siue.edu/deder/assess/portf.html
- Bloom's Taxonomy
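A scoring rubric is, at bottom, a set of criteria with performance levels. The sketch below is a minimal, hypothetical encoding of that structure (the criteria, weights, and 1-4 scale are invented for illustration), showing how a rater's per-criterion scores roll up into a weighted total.

```python
# Hypothetical rubric for a UR poster presentation; criteria, weights,
# and the 1-4 level scale are invented for illustration.
RUBRIC = {
    "scientific content": 0.4,
    "data analysis": 0.3,
    "communication": 0.3,
}
LEVELS = (1, 2, 3, 4)  # 1 = beginning ... 4 = exemplary

def weighted_score(ratings):
    """Combine per-criterion ratings into a weighted total on the 1-4 scale."""
    if set(ratings) != set(RUBRIC):
        raise ValueError("ratings must cover every rubric criterion")
    for score in ratings.values():
        if score not in LEVELS:
            raise ValueError(f"scores must be one of {LEVELS}")
    return sum(RUBRIC[c] * ratings[c] for c in RUBRIC)

student = {"scientific content": 4, "data analysis": 3, "communication": 3}
print(weighted_score(student))  # 0.4*4 + 0.3*3 + 0.3*3
```

Making the criteria and levels explicit up front is what lets multiple raters score consistently and lets results feed back into program improvement.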
31. http://www.fctel.uncc.edu/pedagogy/resources/ArticlesOnCriticalThinking.html
35. Remember the Steps
- Step 1. Specify intended educational outcomes
- Step 2. Identify methods and measures
- Step 3. Obtain assessment results
- Step 4. Report assessment results
- Step 5. Close the loop (i.e., use results to improve the program)
36. Articulate a Plan
37. References for Survey Research
- Alreck, P.L. (1995). The survey research handbook: Guidelines and strategies for conducting a survey. Chicago: Irwin.
- Bangura, A.K. (1992). The limitations of survey research methods in assessing the problem of minority student retention in higher education. San Francisco: Mellen Research University Press.
- Cook, C., Heath, F., & Thompson, R. (2000). A meta-analysis of response rates in web- or internet-based surveys. Educational and Psychological Measurement, 60(6), 821-836.
- Dillman, D. (1978). Mail and telephone surveys: The total design method. New York: John Wiley & Sons.
- Dillman, D. (2000). Mail and telephone surveys (2nd ed.). New York: John Wiley & Sons.
38. References, cont.
- Fink, A. (1995). The survey handbook. Thousand Oaks, CA: Sage.
- Fowler, F.J. (1993). Survey research methods. Newbury Park, CA: Sage.
- Kalton, G. (1983). Introduction to survey sampling. Beverly Hills: Sage.
- Krueger, R.A. (1994). Focus groups: A practical guide for applied research (2nd ed.). Thousand Oaks, CA: Sage.
- Lederman, L.C. (1990). Assessing educational effectiveness: The focus group interview. Communication Education, 38.
- NPEC Sourcebook: http://nces.ed.gov/pubs2000/2000195.pdf
- Porter, S., & Whitcomb, M. (2003). Impact of lottery incentives on student response rates. Research in Higher Education, 44(4), 389-407.
39. References, cont.
- Rea, L.M. (1992). Designing and conducting survey research: A comprehensive guide. San Francisco: Jossey-Bass.
- Sax, L., Gilmartin, S., & Bryant, A. (2003). Assessing response rates and nonresponse bias in web and paper surveys. Research in Higher Education, 44(4), 409-432.
- Suskie, L. (1997). Questionnaire survey research: What works (2nd ed.). Tallahassee, FL: AIR.
40. Additional References
- AASCU (2006). Value-added assessment: Accountability's new frontier. http://www.aascu.org/publications/default.htm
- Banta, T., & Associates (1998). Assessment in practice: Putting principles to work on college campuses. San Francisco: Jossey-Bass.
- Huba, M., & Freed, J. (2001). Learner-centered assessment on campus: Shifting the focus from teaching to learning. Boston: Allyn & Bacon.
- Maeroff, G. (2006). Beyond the rankings: Measuring learning in higher ed. Hechinger Institute on Education and the Media, Columbia University. http://www.tc.columbia.edu/hechinger
- Palomba, C. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco: Jossey-Bass.
- Spellings Commission Report, US DoE (2006). A test of leadership: Charting the future of US higher education. Washington, DC.
41. Questions?