Transcript and Presenter's Notes

Title: These Confusing Technology Times: Making Decisions About What to Assess and Evaluate


1
These Confusing Technology Times: Making
Decisions About What to Assess and Evaluate
  • Dr. Curtis J. Bonk
  • Indiana University and CourseShare.com
  • http://php.indiana.edu/cjbonk
  • cjbonk@indiana.edu

2
Confusion Reigns
  1. How to allocate time?
  2. When to assess?
  3. How to assess?
  4. How to grade teamwork?
  5. Whose work is it?

3
Other Issues?

4
Student Assessment--Product Focus
5
Traditional Assessment Methods
  • Most often, students are assessed in one of the
    following knowledge-focused ways:
  • Objective test questions
  • Essay test questions
  • Papers/Reports
  • Projects
  • All are product-oriented in nature

6
Most Assessment Tools
  • Focus on tests
  • Automatic grading/feedback
  • Test pools
  • Timing
  • Favor objective questions
  • Few tools to facilitate other forms of assessment
  • File exchange/dropbox

7
Focus of Assessment?
  1. Basic Knowledge, Concepts, Ideas
  2. Higher-Order Thinking Skills, Problem Solving,
    Communication, Teamwork
  3. Both of Above!!!
  4. Other

8
Technology Assessments Possible
  • Online Portfolios of Work
  • Discussion/Forum Participation
  • Online Mentoring
  • Weekly Reflections
  • Tasks Attempted or Completed, Usage, etc.

9
Sample Portfolio Scoring Dimensions (10 pts each)
(see http://php.indiana.edu/cjbonk/p250syla.htm; a
scoring sketch follows the list)
  1. Richness
  2. Coherence
  3. Elaboration
  4. Relevancy
  5. Timeliness
  6. Completeness
  7. Persuasiveness
  8. Originality
  1. Insightful
  2. Clear/Logical
  3. Original
  4. Learning
  5. Feedback/Responsive
  6. Format
  7. Thorough
  8. Reflective
  9. Overall Holistic
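
A minimal scoring sketch in Python, assuming the first set of rubric dimensions above at 10 points each; the dimension names come from the slide, while the function name and the sample ratings are invented for illustration.

```python
# Hypothetical tally for a portfolio rubric in which each dimension is worth
# up to 10 points. Dimension names are from the slide; ratings are made up.
MAX_PER_DIMENSION = 10

def portfolio_score(ratings):
    """Sum the dimension ratings, capping each at the 10-point maximum."""
    return sum(min(score, MAX_PER_DIMENSION) for score in ratings.values())

ratings = {
    "Richness": 8, "Coherence": 9, "Elaboration": 7, "Relevancy": 10,
    "Timeliness": 9, "Completeness": 8, "Persuasiveness": 7, "Originality": 9,
}
print(portfolio_score(ratings), "out of", MAX_PER_DIMENSION * len(ratings))
```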

10
More Possible Assessments
  • Quizzes and Tests
  • Peer Feedback and Responsiveness
  • Cases and Problems
  • Group Work
  • Web Resource Explorations and Evaluations

11
E-Case Analysis Evaluation
  • Peer Feedback Criteria
  • (1 pt per item, 5 pts per peer feedback)
  • (a) Provides additional points that may have been
    missed.
  • (b) Corrects a concept, asks for clarification
    where needed, debates issues, disagrees and
    explains why.
  • (c) Ties concepts to another situation or refers
    to the text or coursepack.
  • (d) Offers valuable insight based on personal
    experience.
  • (e) Overall constructive feedback.

12
Possible Methods of Assessment
  • Review of online group work spaces
  • Evidence of regular and substantial contributions
  • Self and peer assessment
  • Have students rate team members on various
    dimensions
  • Have students indicate where work plan was
    followed/not followed
  • Student reflection
  • Have students write brief reflections on their
    group process, indicating what they might change
    the next time

13
E-Peer Evaluation Form
  • Peer Evaluation. Name ____________________
  • Rate on Scale of 1 (low) to 5 (high)
  • ___ 1. Insight: creative, offers
    analogies/examples, relationships drawn, useful
    ideas and connections, fosters growth.
  • ___ 2. Helpful/Positive: prompt feedback,
    encouraging, informative, makes suggestions and
    gives advice, finds and shares info.
  • ___ 3. Valuable Team Member: dependable, links
    group members, there for group, leader,
    participator, pushes group.
  • ___ Total Rec'd Contribution Pts (out of 15)

14
Assessment Issues
15
Issues to Consider
  1. Bonus pts for participation?
  2. Peer evaluation of work?
  3. Assess improvement?
  4. Is it timed? Give unlimited time to complete?
  5. Allow retakes if the connection is lost? How
    many retakes?

16
Issues to Consider
  1. Cheating? Is it really that student?
  2. Authenticity?
  3. Negotiate tasks and criteria?
  4. How to measure competency?
  5. How do you demonstrate learning online?

17
Catching da cheaters!
18
Increasing Cheating Online ($7-30/page;
http://www.syllabus.com/, January 2002, Phillip
Long, "Plagiarism: IT-Enabled Tools for Deceit?")
  • http://www.academictermpapers.com/
  • http://www.termpapers-on-file.com/
  • http://www.nocheaters.com/
  • http://www.cheathouse.com/uk/index.html
  • http://www.realpapers.com/
  • http://www.pinkmonkey.com/
  • (you'll never buy Cliffnotes again)

19
(No Transcript)
20
(No Transcript)
21
Reducing Cheating Online
  • Ask yourself, why are they cheating?
  • Do they value the assignment?
  • Are tasks relevant and challenging?
  • What happens to the task after submitted--reused,
    woven in, posted?
  • Due at end of term? Real audience?
  • Look at pedagogy b4 calling plagiarism police!

22
Reducing Cheating Online
  • Proctored exams
  • Vary items in exam
  • Make course too hard to cheat
  • Try Plagiarism.com ($300)
  • Use mastery learning for some tasks
  • Random selection of items from an item pool (see
    the sketch after this list)
  • Use test passwords, rely on IP screening
  • Assign collaborative tasks
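
A minimal sketch of the "random selection of items from an item pool" tactic listed above. The pool size, function name, and the idea of seeding by student ID (so a student who reconnects sees the same form) are assumptions for illustration, not features of any particular testing tool.

```python
import random

# Hypothetical 50-question item pool; a real pool would come from the course.
item_pool = [f"Question {n}" for n in range(1, 51)]

def build_exam(student_id, num_items=10):
    """Return a reproducible random selection of items for one student."""
    rng = random.Random(student_id)  # seed by student so a retake sees the same form
    return rng.sample(item_pool, num_items)

print(build_exam("student_042"))
```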

23
Reducing Cheating Online ($7-30/page;
http://www.syllabus.com/, January 2002, Phillip
Long, "Plagiarism: IT-Enabled Tools for Deceit?")
  • http://www.plagiarism.org/ (resource)
  • http://www.turnitin.com/ (software, $100, free
    30-day demo/trial)
  • http://www.canexus.com/ (software: essay
    verification engine, $19.95)
  • http://www.plagiserve.com/ (free database of
    70,000 student term papers and cliff notes)
  • http://www.academicintegrity.org/ (assoc.)
  • http://sja.ucdavis.edu/avoid.htm (guide)

24
(No Transcript)
25
Turnitin Testimonials
  • "Many of my students believe that if they do not
    submit their essays, I will not discover their
    plagiarism. I will often type a paragraph or two
    of their work in myself if I suspect plagiarism.
    Every time, there was a 'hit.' Many students were
    successful plagiarists in high school. A service
    like this is needed to teach them that such
    practices are no longer acceptable and certainly
    not ethical!"

26
Online Assessment Concerns
  • Problem: Cheating on tests
  • Copying from neighbor
  • Copying from course materials
  • Someone else taking the test
  • Problem: Plagiarism
  • Submitting someone else's paper (previous class)
  • Copying from (online) sources
  • Buying paper online
  • Both are product-oriented concerns

27
Assessment Process Focus (Vanessa Dennen, Sept
2002)
28
Assessing Process
  • Easy to do
  • Many technology tools will archive student
    work/interactions
  • Students create a document trail in process
  • Helps students develop metacognitive knowledge
  • Instructors structure/model/encourage productive
    work processes
  • Students learn how to manage their own work
    processes

29
Why Assess Process?
  • For the instructor
  • Provides formative feedback on course (e.g.,
    helps gather data about why students have
    difficulty with product-oriented assessments)
  • Provides sense of instructor guidance
  • Clarifies who is doing most work in small group
    assignments
  • Helps prevent cheating

30
Why Assess Process?
  • For the student
  • Typically improves the quality of their products
  • Helps them develop productive work processes
  • Puts them on a schedule
  • Shows that you care about individual growth

31
Assessment Project Cycle
  • From Classroom Assessment Techniques by Angelo and
    Cross (1993)
  • Step 1: Plan
  • Choose class
  • Focus on assessable question
  • Design project to answer question

32
Assessment Project Cycle 2
  • Step 2: Implement
  • Teach target lesson
  • Collect assessment data
  • Analyze data
  • Step 3
  • Interpret results
  • Communicate results
  • Evaluate assessment project

33
I. Term Papers
  • How to do it online:
  • Have students each start their own thread and
    post topic of interest
  • Peers and instructors give feedback
  • Students post thesis statements, research
    sources, etc., with iterations of feedback
  • Final paper is posted

34
Term Paper Assessments
  • Product: the paper
  • Process: quality and timeliness of student work
    from time when paper is assigned
  • Process: quality and timeliness of feedback
    provided to peers
  • Process: responsiveness to feedback received from
    instructor and peers

35
II. Discussion Assignments
  • 1. Chain of thought
  • Have students develop a solution to a problem
  • Have students indicate what led them to a
    particular conclusion, method or approach
  • Can be done in a discussion board

36
Discussion Assignments
  • 2. Theory to Practice
  • Have students match up theories you are learning
    about to actual problems
  • Present students with problems and have them
    explain what theories they would use to solve
    these problems and how they would approach it
  • Debrief the assignment

37
Discussion Assignment
  • 3. Synthesizer
  • Have students take on the role of weekly
    synthesizer of the class discussion
  • Add a meta level in which students narrate
    their own experiences while reading the weekly
    discussion

38
III. Group Projects
  • Tools used:
  • Chat: brainstorming ideas, making group
    decisions, regular way to feel connected (should
    be archived)
  • Discussion board: commenting on drafts
  • E-mail: quick feedback
  • File exchange: sharing project files
  • MS Word: track changes
  • HINT: If you don't have a tool that will work,
    refer students to Yahoo Groups:
    http://www.groups.yahoo.com

39
Group Project Assessments
  • Product: project files that are turned in
  • Process: online archive demonstrating
  • Who contributed what
  • Who provided peer feedback
  • Who worked in a timely manner
  • How collaborative a group was
  • Process: peer ratings
  • Process: interim instructor consultations

40
III. Project Assignments
  • 1. Work Plans
  • Have students develop a plan of work for their
    project
  • Make them outline topic, schedule, resources
    needed, division of labor and anticipated form of
    final deliverables
  • At end of project, have students evaluate how
    well they followed their own plan and how useful
    it was

41
Project Assignments
  • 2. Research Trail
  • Have students document the steps they took in the
    research process and the results
  • Ask for a brief reflection on how effective their
    process was and what they might change the next
    time

42
Project Assignments
  • 3. Process Presentations
  • Have students focus on their process as well as
    their product in class presentations
  • To maintain focus, ask them to share 3 main
    lessons learned
  • Might ask for some process documents to be
    shared, like an early draft

43
Project Assignments
  • 4. Design Journal
  • Have students maintain a journal of all ideas
    related to their project
  • Encourage sketches, lists, organizational charts,
    etc.
  • Require journals to be turned in with final
    projects

44
IV. Reflection Assignments
  • Have students keep a weekly journal of their
    thoughts on readings and course content AND
    real-world related instances that they noticed
  • May make these public, with each student having
    their own discussion thread

45
Making it Happen
  • Learners need to see that process is valuable
  • Model appropriate processes
  • Provide students with scaffolding (guide sheets)
    to structure their processes
  • Give students feedback on their process
  • Require students to reflect on their processes
  • Grade students on process

46
Online Testing Tools
47
Choice: Select companies that specialize in
online assessment.
48
Or: Use what the courseware package gives ya
49
Test Selection Criteria (Hezel, 1999)
  • Easy to Configure Items and Test
  • Handle Symbols
  • Scheduling of Feedback (immediate?)
  • Provides feedback for each response
  • Randomize Answers Within a Question
  • Weighting of Answer Options
  • Supports multiple item types: multiple choice,
    true-false, essay, keyword

50
More Test Selection Criteria
  • Recording of Multiple Submissions
  • Comprehensive Statistics
  • Summarize in Portfolio and Gradebook
  • Confirmation of Test Submission
  • Incorporate graphic or audio elements?
  • Timed Tests

51
More Test Selection Criteria (Perry and Colon, 2001)
  • Flexible scoring--score first, last, or average
    submission (see the sketch below)
  • Flexible reporting--by individual or by item and
    cross tabulations.
  • Control over number of times students can submit
    an activity or test
  • Provides item analysis statistics (e.g., Test
    Item Frequency Distributions).

Web Resource: http://www.indiana.edu/best/
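
A minimal sketch of the "flexible scoring" criterion above (score the first, last, or average of multiple submissions); the function name, mode strings, and sample submission history are invented and not tied to any real tool's API.

```python
from statistics import mean

def flexible_score(submissions, mode="last"):
    """Pick a grade from a list of submission scores according to a policy."""
    if not submissions:
        raise ValueError("no submissions recorded")
    if mode == "first":
        return submissions[0]
    if mode == "last":
        return submissions[-1]
    if mode == "average":
        return mean(submissions)
    raise ValueError(f"unknown scoring mode: {mode}")

scores = [72.0, 85.0, 91.0]               # invented submission history
print(flexible_score(scores, "average"))  # mean of the three attempts
```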
52
Online Survey Tools for Assessment
53
Sample Survey Tools
  • Zoomerang (http://www.zoomerang.com)
  • SurveyMonkey (http://www.surveymonkey.com/)
  • QuestionMark (http://www.questionmark.com/home.html)
  • SurveyShare (http://SurveyShare.com from
    Courseshare.com)
  • Survey Solutions from Perseus
    (http://www.perseusdevelopment.com/fromsurv.htm)
  • Infopoll (http://www.infopoll.com)

54
Web-Based Survey Advantages
  • Faster collection of data
  • Standardized collection format
  • Computer graphics may reduce fatigue
  • Computer controlled branching and skip sections
  • Easy to answer--clicking
  • Wider distribution of respondents

55
Web-Based Survey Problems: Why Lower Response
Rates?
  • Low response rate
  • Lack of time
  • Unclear instructions
  • Too lengthy
  • Too many steps
  • Can't find URL

56
Survey Tool Features
  • Support different types of items (Likert,
    multiple choice, forced ranking, paired
    comparisons, etc.)
  • Maintain email lists and email invitations
  • Conduct polls
  • Adaptive branching and cross tabulations (see the
    sketch below)
  • Modifiable templates and library of past surveys
  • Publish reports
  • Different types of accounts--hosted, corporate,
    professional, etc.
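
A minimal sketch of a cross tabulation, one of the survey features listed above; the question names ("role", "format") and the responses are invented for illustration.

```python
from collections import Counter

responses = [
    {"role": "undergraduate", "format": "online"},
    {"role": "undergraduate", "format": "on-campus"},
    {"role": "graduate", "format": "online"},
    {"role": "graduate", "format": "online"},
]

# Tally (role, format) pairs into a simple two-way table.
crosstab = Counter((r["role"], r["format"]) for r in responses)
for (role, fmt), count in sorted(crosstab.items()):
    print(f"{role:15s} {fmt:10s} {count}")
```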

57
Web-Based Survey Solutions: Some Tips
  • Send second request
  • Make URL link prominent
  • Offer incentives near top of request
  • Shorten survey, make attractive, easy to read
  • Disclose purpose, use, and privacy
  • E-mail cover letters
  • Prenotify of intent to survey

58
Evaluation
59
Champagne and Wisher (in press)
  • Simply put, an evaluation is concerned with
    judging the worth of a program and is essentially
    conducted to aid in the making of decisions by
    stakeholders (e.g., does it work as effectively
    as the standard instructional approach?).

60
Evaluation Purposes
  • Cost Savings
  • Improved Efficiency/Effectiveness
  • Learner Performance/Competency Improvement/Progress
  • What did they learn?
  • Assessing learning impact
  • How well do learners use what they learned?
  • How much do learners use what they learn?

61
Kirkpatrick's 4 Levels
  • Reaction
  • Learning
  • Behavior
  • Results

62
My Evaluation Plan
63
What to Evaluate?
  1. Student--attitudes, learning, jobs.
  2. Instructor--popularity, course enrollments.
  3. Training--internal and external.
  4. Task--relevance, interactivity, collaborative.
  5. Tool--usable, learner-centered, friendly,
    supportive.
  6. Course--interactivity, completion rates.
  7. Program--growth, long-range plans.
  8. University--cost-benefit, policies, vision.

64
Measures of Student Success (Focus groups,
interviews, observations, surveys, exams, records)
  • Positive Feedback, Recommendations
  • Increased Comprehension, Achievement
  • High Retention in Program
  • Completion Rates or Course Attrition
  • Jobs Obtained, Internships
  • Enrollment Trends for Next Semester

65
1. Student Basic Quantitative
  • Grades, Achievement
  • Number of Posts
  • Participated
  • Computer Log Activity--peak usage, messages/day,
    time on task or in system (see the sketch below)
  • Attitude Surveys
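
A minimal sketch of the basic quantitative log measures noted above (messages per day, peak posting hour); the log format, student names, and timestamps are invented for illustration.

```python
from collections import Counter
from datetime import datetime

# Hypothetical (student, timestamp) records from a discussion forum log.
posts = [
    ("kim", "2002-09-03 14:05"), ("lee", "2002-09-03 14:40"),
    ("kim", "2002-09-04 09:12"), ("pat", "2002-09-04 14:55"),
]
timestamps = [datetime.strptime(ts, "%Y-%m-%d %H:%M") for _, ts in posts]

per_day = Counter(ts.date() for ts in timestamps)   # messages per day
per_hour = Counter(ts.hour for ts in timestamps)    # usage by hour of day

print("messages/day:", dict(per_day))
print("peak hour:", per_hour.most_common(1)[0][0])
```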

66
1. Student High-End Success
  • Message complexity, depth, interactivity, questioning
  • Collaboration skills
  • Problem finding/solving and critical thinking
  • Challenging and debating others
  • Case-based reasoning, critical thinking measures
  • Portfolios, performances, PBL activities

67
2. Instructor Success
  • High student evals and more signing up
  • High student completion rates
  • Utilize Web to share teaching
  • Course recognized in tenure decisions
  • Varies online feedback and assistance techniques

68
3. Training--Outside Support
  • Training (FacultyTraining.net)
  • Courses and Certificates (JIU, e-education)
  • Reports, Newsletters, Pubs
  • Aggregators of Info (CourseShare, Merlot)
  • Global Forums (FacultyOnline.com, GEN)
  • Resources, Guides/Tips, Link Collections, Online
    Journals, Library Resources

69
Certified Online Instructor Program
  • Walden Institute--12 Week Online Certification
    (Cost: $995)
  • 2 tracks: one for higher ed and one for online
    corporate trainers
  • Online tools and purpose
  • Instructional design theory and techniques
  • Distance ed evaluation
  • Quality assurance
  • Collaborative learning communities

70
http://www.utexas.edu/world/lecture/
71
(No Transcript)
72
3. Training--Inside Support
  • Instructional Consulting
  • Mentoring (strategic planning)
  • Small Pots of Funding
  • Facilities
  • Summer and Year Round Workshops
  • Office of Distributed Learning
  • Colloquiums, Tech Showcases, Guest Speakers
  • Newsletters, guides, active learning grants,
    annual reports, faculty development, brown bags

73
Technology and Professional Dev: Ten Tips to Make
it Better (Rogers, 2000)
  • 1. Offer training
  • 2. Give technology to take home
  • 3. Provide on-site technical support
  • 4. Encourage collegial collaboration
  • 5. Send to prof development conference
  • 6. Stretch the day
  • 7. Encourage research
  • 8. Provide online resources
  • 9. Lunch bytes, faculty institutes
  • 10. Celebrate success

74
RIDIC5-ULO3US Model of Technology Use
  • 4. Tasks (RIDIC)
  • Relevance
  • Individualization
  • Depth of Discussion
  • Interactivity
  • Collaboration-Control-Choice-Constructivistic-Community

75
RIDIC5-ULO3US Model of Technology Use
  • 5. Tech Tools (ULOUS)
  • Utility/Usable
  • Learner-Centeredness
  • Opportunities with Outsiders Online
  • Ultra Friendly
  • Supportive

76
6. Course Success
  • Few technological glitches/bugs
  • Adequate online support
  • Increasing enrollment trends
  • Course quality (interactivity rating)
  • Monies paid
  • Accepted by other programs

77
7. Program Considerations
  • Enrollment trends
  • Relevant and current technology
  • Number of Graduates and graduation rates
  • Sense of community
  • Format: Self-paced, collaborative, PBL, mentored,
    performance-based, individual, etc.

78
How are costs calculated in online programs???
79
7. Online Program or Course Budget (i.e., how to
pay, how large the course is, tech fees charged, #
of courses, tuition rate, etc.; a cost sketch
follows the list)
  • Indirect Costs: learner disk space, phone,
    accreditation, integration with existing
    technology, library resources, on-site
    orientation and tech training, faculty training,
    office space
  • Direct Costs: courseware, instructor, help desk,
    books, seat time, bandwidth and data
    communications, server, server back-up, course
    developers, postage
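
A minimal sketch of one simple way such a budget might be tallied: sum the direct and indirect cost categories from the slide and divide by enrollment. All dollar figures, the category subset, and the enrollment number are invented placeholders, not data from the presentation.

```python
# Hypothetical per-course budget using a few of the categories listed above.
direct_costs = {
    "courseware": 5000, "instructor": 8000, "help_desk": 1500,
    "server_and_backup": 2000, "bandwidth": 1200,
}
indirect_costs = {
    "library_resources": 900, "faculty_training": 1100, "office_space": 700,
}

enrollment = 40
total_cost = sum(direct_costs.values()) + sum(indirect_costs.values())
print(f"total cost: ${total_cost:,}")
print(f"cost per student: ${total_cost / enrollment:,.2f}")
```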

80
8. Institutional Success
  • E-Enrollments from:
  • new students, alumni, existing students
  • Press, publication, partners, attention
  • Additional grants
  • Making Money: Cost-Benefit model
  • Faculty and student attitudes
  • Acceptable policies (ADA compliant)

81
Final advice--whatever you do