Part II: Online Learning: Opportunities for Assessment and Evaluation - PowerPoint PPT Presentation

Transcript and Presenter's Notes


1
Part II: Online Learning: Opportunities for
Assessment and Evaluation
  • Dr. Curtis J. Bonk
  • Indiana University and CourseShare.com
  • http://php.indiana.edu/cjbonk
  • cjbonk@indiana.edu

2
Online Student Assessment
3
Assessment Takes Center Stage in Online
Learning (Dan Carnevale, April 13, 2001,
Chronicle of Higher Education)
  • One difference between assessment in classrooms
    and in distance education is that
    distance-education programs are largely geared
    toward students who are already in the workforce,
    where learning often happens by doing.

4
Focus of Assessment?
  1. Basic Knowledge, Concepts, Ideas
  2. Higher-Order Thinking Skills, Problem Solving,
    Communication, Teamwork
  3. Both of Above!!!
  4. Other

5
Assessments Possible
  • Online Portfolios of Work
  • Discussion/Forum Participation
  • Online Mentoring
  • Weekly Reflections
  • Tasks Attempted or Completed, Usage, etc.

6
More Possible Assessments
  • Quizzes and Tests
  • Peer Feedback and Responsiveness
  • Cases and Problems
  • Group Work
  • Web Resource Explorations & Evaluations

7
Sample Portfolio Scoring Dimensions (10 pts
each) (see http://php.indiana.edu/cjbonk/p250syla.htm)
  1. Richness
  2. Coherence
  3. Elaboration
  4. Relevancy
  5. Timeliness
  6. Completeness
  7. Persuasiveness
  8. Originality
  1. Insightful
  2. Clear/Logical
  3. Original
  4. Learning
  5. Fdback/Responsive
  6. Format
  7. Thorough
  8. Reflective
  9. Overall Holistic

8
E-Peer Evaluation Form
  • Peer Evaluation. Name ____________________
  • Rate on Scale of 1 (low) to 5 (high)
  • ___ 1. Insight: creative, offers
    analogies/examples, relationships drawn, useful
    ideas and connections, fosters growth.
  • ___ 2. Helpful/Positive: prompt feedback,
    encouraging, informative, makes suggestions &
    advice, finds & shares info.
  • ___ 3. Valuable Team Member: dependable, links
    group members, there for group, leader,
    participator, pushes group.
  • ___ Total Recommended Contribution Pts (out of
    15)
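The three-criterion form above sums to a recommended contribution score out of 15. A minimal sketch of that tally (function and variable names are invented for illustration, not from the presentation):

```python
def peer_eval_total(insight: int, helpful: int, team: int) -> int:
    """Sum three peer ratings (each 1 = low .. 5 = high) into the
    recommended contribution score out of 15."""
    for rating in (insight, helpful, team):
        if not 1 <= rating <= 5:
            raise ValueError("each rating must be between 1 and 5")
    return insight + helpful + team

# Example: a peer rated 4 on insight, 5 on helpfulness, 3 on teamwork.
total = peer_eval_total(insight=4, helpful=5, team=3)
```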

9
E-Case Analysis Evaluation
  • Peer Feedback Criteria
  • (1 pt per item; 5 pts/peer feedback)
  • (a) Provides additional points that may have been
    missed.
  • (b) Corrects a concept, asks for clarification
    where needed, debates issues, disagrees &
    explains why.
  • (c) Ties concepts to another situation or refers
    to the text or coursepack.
  • (d) Offers valuable insight based on personal
    experience.
  • (e) Overall constructive feedback.

10
Issues to Consider
  1. Bonus pts for participation?
  2. Peer evaluation of work?
  3. Assess improvement?
  4. Is it timed? Give unlimited time to complete?
  5. Allow retakes if lose connection? How many
    retakes?

11
Issues to Consider
  1. Cheating? Is it really that student?
  2. Authenticity?
  3. Negotiating tasks and criteria?
  4. How measure competency?
  5. How do you demonstrate learning online?

12
Increasing Cheating Online ($7-30/page,
http://www.syllabus.com/, January 2002, Phillip
Long, "Plagiarism: IT-Enabled Tools for Deceit?")
  • http://www.academictermpapers.com/
  • http://www.termpapers-on-file.com/
  • http://www.nocheaters.com/
  • http://www.cheathouse.com/uk/index.html
  • http://www.realpapers.com/
  • http://www.pinkmonkey.com/
  • (you'll never buy Cliff Notes again)

13
(No Transcript)
14
(No Transcript)
15
Reducing Cheating Online
  • Ask yourself, why are they cheating?
  • Do they value the assignment?
  • Are tasks relevant and challenging?
  • What happens to the task after it is
    submitted: reused, woven in, posted?
  • Due at end of term? Real audience?
  • Look at pedagogy before calling the plagiarism
    police!

16
Reducing Cheating Online
  • Proctored exams
  • Vary items in exam
  • Make course too hard to cheat
  • Try Plagiarism.com ($300)
  • Use mastery learning for some tasks
  • Random selection of items from item pool
  • Use test passwords, rely on IP screening
  • Assign collaborative tasks
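Random selection from an item pool, one of the anti-cheating tactics above, can be sketched in a few lines. This is an illustrative sketch (names invented): seeding the generator with the student ID makes each student's draw different from a neighbor's but reproducible for re-grading.

```python
import random

def draw_exam(item_pool, n_items, student_id):
    """Draw a per-student random subset of the item pool.

    Seeding with the student ID keeps the draw reproducible."""
    rng = random.Random(student_id)
    return rng.sample(item_pool, n_items)

pool = [f"Q{i}" for i in range(1, 51)]   # a 50-item pool
exam = draw_exam(pool, 10, student_id="student-042")
```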

17
Reducing Cheating Online ($7-30/page,
http://www.syllabus.com/, January 2002, Phillip
Long, "Plagiarism: IT-Enabled Tools for Deceit?")
  • http://www.plagiarism.org/ (resource)
  • http://www.turnitin.com/ (software, $100, free
    30-day demo/trial)
  • http://www.canexus.com/ (software: essay
    verification engine, $19.95)
  • http://www.plagiserve.com/ (free database of
    70,000 student term papers & Cliff notes)
  • http://www.academicintegrity.org/ (assoc.)
  • http://sja.ucdavis.edu/avoid.htm (guide)

18
(No Transcript)
19
Turnitin Testimonials
  • "Many of my students believe that if they do not
    submit their essays, I will not discover their
    plagiarism. I will often type a paragraph or two
    of their work in myself if I suspect plagiarism.
    Every time, there was a 'hit.' Many students were
    successful plagiarists in high school. A service
    like this is needed to teach them that such
    practices are no longer acceptable and certainly
    not ethical!"

20
New Zealand Universities Consider Lawsuit Against
Sites Selling Diplomas in Their Names
  • The Web sites, which already offer fake diplomas
    in the names of hundreds of colleges in the
    United States and abroad, recently added New
    Zealand's Universities of Auckland, Canterbury,
    and Otago to their lineup. The degrees sell for
    up to $250 each.
  • Feb 11, 2002, David Cohen, Chronicle of Higher
    Education

21
Online Testing Tools
22
Choice: Select companies that specialize in
online assessment.
23
Or: Use what the courseware package gives ya
24
Test Selection Criteria (Hezel, 1999)
  • Easy to Configure Items and Test
  • Handle Symbols
  • Scheduling of Feedback (immediate?)
  • Provides Clear Input of Exam Dates
  • Easy to Pick Items for Randomizing
  • Randomize Answers Within a Question
  • Weighting of Answer Options
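Two of the criteria above, randomizing answer order within a question and weighting answer options, can be sketched as follows. All names are illustrative assumptions, not features of any particular testing tool:

```python
import random

def shuffle_options(options, seed):
    """Present answer options in a per-student random order,
    reproducible from the seed."""
    rng = random.Random(seed)
    shuffled = list(options)
    rng.shuffle(shuffled)
    return shuffled

def weighted_score(option_weights, chosen):
    """Return the partial-credit weight of the chosen option
    (0 if the option carries no weight)."""
    return option_weights.get(chosen, 0.0)

presented = shuffle_options(["A", "B", "C", "D"], seed="student-7")
credit = weighted_score({"A": 1.0, "B": 0.5}, chosen="B")
```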

25
More Test Selection Criteria
  • Recording of Multiple Submissions
  • Timed Tests
  • Comprehensive Statistics
  • Summarize in Portfolio and/or Gradebook
  • Confirmation of Test Submission

26
More Test Selection Criteria (Perry & Colon, 2001)
  • Supports multiple item types: multiple choice,
    true-false, essay, keyword
  • Can easily modify or delete items
  • Incorporate graphic or audio elements?
  • Control over number of times students can submit
    an activity or test
  • Provides feedback for each response

27
More Test Selection Criteria (Perry & Colon, 2001)
  • Flexible scoring: score first, last, or average
    submission
  • Flexible reporting: by individual or by item and
    cross tabulations.
  • Outputs data for further analysis
  • Provides item analysis statistics (e.g., Test
    Item Frequency Distributions).

Web Resource: http://www.indiana.edu/best/
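The item-analysis statistics mentioned above can be computed directly from response data. A minimal sketch with invented example data: a frequency distribution of chosen answers, plus the classical difficulty index (proportion answering correctly).

```python
from collections import Counter

def item_frequencies(responses):
    """Frequency distribution of the answers chosen for one item."""
    return Counter(responses)

def difficulty_index(responses, correct):
    """Classical difficulty index: proportion of correct answers."""
    return sum(r == correct for r in responses) / len(responses)

answers = ["B", "B", "C", "A", "B", "D", "B", "C"]  # invented data
freq = item_frequencies(answers)
p_value = difficulty_index(answers, correct="B")
```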
28
Online Survey Tools for Assessment
29
Sample Survey Tools
  • Zoomerang (http://www.zoomerang.com)
  • IOTA Solutions (http://www.iotasolutions.com)
  • QuestionMark (http://www.questionmark.com/home.html)
  • SurveyShare (http://SurveyShare.com from
    Courseshare.com)
  • Survey Solutions from Perseus
    (http://www.perseusdevelopment.com/fromsurv.htm)
  • Infopoll (http://www.infopoll.com)

30
Web-Based Survey Advantages
  • Faster collection of data
  • Standardized collection format
  • Computer graphics may reduce fatigue
  • Computer controlled branching and skip sections
  • Easy to answer clicking
  • Wider distribution of respondents

31
Web-Based Survey Problems: Why Lower Response
Rates?
  • Low response rate
  • Lack of time
  • Unclear instructions
  • Too lengthy
  • Too many steps
  • Can't find URL

32
Survey Tool Features
  • Support different types of items (Likert,
    multiple choice, forced ranking, paired
    comparisons, etc.)
  • Maintain email lists and email invitations
  • Conduct polls
  • Adaptive branching and cross tabulations
  • Modifiable templates & library of past surveys
  • Publish reports
  • Different types of accounts: hosted, corporate,
    professional, etc.
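Adaptive branching, listed above, just means that each answer decides which question appears next. A minimal sketch (question IDs and wording are invented):

```python
# Each question maps possible answers to the id of the next question;
# an empty mapping marks the end of a branch.
survey = {
    "q1": {"text": "Have you taken an online course?",
           "next": {"yes": "q2", "no": "q3"}},
    "q2": {"text": "How satisfied were you?", "next": {}},
    "q3": {"text": "What kept you from enrolling?", "next": {}},
}

def next_question(survey, current_id, answer):
    """Return the id of the next question, or None at a leaf."""
    return survey[current_id]["next"].get(answer)
```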

33
Web-Based Survey Solutions: Some Tips
  • Send second request
  • Make URL link prominent
  • Offer incentives near top of request
  • Shorten survey, make attractive, easy to read
  • Credible sponsorship (e.g., university)
  • Disclose purpose, use, and privacy
  • E-mail cover letters
  • Prenotify of intent to survey

34
Tips on Authentication
  • Check e-mail access against list
  • Use password access
  • Provide keycode, PIN, or ID
  • (Futuristic/Other: palm print, fingerprint, voice
    recognition, iris scanning, facial scanning,
    handwriting recognition, picture ID)
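The keycode/PIN check suggested above can be sketched in a few lines. This is an illustrative sketch (names invented): only a digest of the keycode is stored, and comparison uses a constant-time check. A production system would add a per-user salt and a slow hash such as PBKDF2; this only shows the check itself.

```python
import hashlib
import hmac

def store_keycode(keycode: str) -> str:
    """Store only a digest of the keycode, never the plain text."""
    return hashlib.sha256(keycode.encode()).hexdigest()

def verify_keycode(keycode: str, stored_digest: str) -> bool:
    """Constant-time comparison against the stored digest."""
    candidate = hashlib.sha256(keycode.encode()).hexdigest()
    return hmac.compare_digest(candidate, stored_digest)

digest = store_keycode("PIN-4921")  # invented example keycode
```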

35
Evaluation
36
Champagne & Wisher (in press)
  • "Simply put, an evaluation is concerned with
    judging the worth of a program and is essentially
    conducted to aid in the making of decisions by
    stakeholders (e.g., does it work as effectively
    as the standard instructional approach?)."

37
Evaluation Purposes
  • Cost Savings
  • Improved Efficiency/Effectiveness
  • Learner Performance/Competency
    Improvement/Progress
  • What did they learn?
  • Assessing learning impact
  • How well do learners use what they learned?
  • How much do learners use what they learn?

38
Kirkpatrick's 4 Levels
  • Reaction
  • Learning
  • Behavior
  • Results

39
My Evaluation Plan
40
What to Evaluate?
  1. Student: attitudes, learning, jobs.
  2. Instructor: popularity, course enrollments.
  3. Training: internal and external.
  4. Task: relevance, interactivity, collaborative.
  5. Tool: usable, learner-centered, friendly,
    supportive.
  6. Course: interactivity, completion rates.
  7. Program: growth, long-range plans.
  8. University: cost-benefit, policies, vision.

41
Measures of Student Success (Focus groups,
interviews, observations, surveys, exams, records)
  • Positive Feedback, Recommendations
  • Increased Comprehension, Achievement
  • High Retention in Program
  • Completion Rates or Course Attrition
  • Jobs Obtained, Internships
  • Enrollment Trends for Next Semester

42
1. Student: Basic Quantitative
  • Grades, Achievement
  • Number of Posts
  • Participated
  • Computer Log Activity: peak usage, messages/day,
    time on task or in system
  • Attitude Surveys
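The log-based measures above (messages per day, peak usage) fall out of a forum or server log with a few counters. A minimal sketch with invented log entries:

```python
from collections import Counter

# Hypothetical (timestamp, student) posting log.
log = [
    ("2002-01-07 19:05", "s1"),
    ("2002-01-07 20:30", "s2"),
    ("2002-01-08 19:45", "s1"),
    ("2002-01-08 19:50", "s2"),
    ("2002-01-08 21:10", "s3"),
]

def messages_per_day(log):
    """Count messages posted on each calendar day."""
    return Counter(timestamp.split()[0] for timestamp, _ in log)

def peak_hour(log):
    """Return the hour of day (0-23) with the most posts."""
    hours = Counter(int(timestamp.split()[1][:2]) for timestamp, _ in log)
    return hours.most_common(1)[0][0]
```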

43
1. Student: High-End Success
  • Message complexity, depth, interactivity,
    questioning
  • Collaboration skills
  • Problem finding/solving and critical thinking
  • Challenging and debating others
  • Case-based reasoning, critical thinking measures
  • Portfolios, performances, PBL activities

44
2. Instructor Success
  • High student evals & more signing up
  • High student completion rates
  • Utilize Web to share teaching
  • Course recognized in tenure decisions
  • Varies online feedback and assistance techniques

45
3. Training: Outside Support
  • Training (FacultyTraining.net)
  • Courses & Certificates (JIU, e-education)
  • Reports, Newsletters, Pubs
  • Aggregators of Info (CourseShare, Merlot)
  • Global Forums (FacultyOnline.com & GEN)
  • Resources, Guides/Tips, Link Collections, Online
    Journals, Library Resources

46
Certified Online Instructor Program
  • Walden Institute: 12-Week Online Certification
    (Cost: $995)
  • 2 tracks: one for higher ed and one for online
    corporate trainers
  • Online tools and purpose
  • Instructional design theory & techniques
  • Distance ed evaluation
  • Quality assurance
  • Collab learning communities
47
(No Transcript)
48
(No Transcript)
49
http://www.utexas.edu/world/lecture/
50
(No Transcript)
51
3. Training: Inside Support
  • Instructional Consulting
  • Mentoring (strategic planning)
  • Small Pots of Funding
  • Facilities
  • Summer and Year Round Workshops
  • Office of Distributed Learning
  • Colloquiums, Tech Showcases, Guest Speakers
  • Newsletters, guides, active learning grants,
    annual reports, faculty development, brown bags

52
Technology and Professional Dev: Ten Tips to Make
It Better (Rogers, 2000)
  • 1. Offer training
  • 2. Give technology to take home
  • 3. Provide on-site technical support
  • 4. Encourage collegial collaboration
  • 5. Send to prof development conference
  • 6. Stretch the day
  • 7. Encourage research
  • 8. Provide online resources
  • 9. Lunch bytes, faculty institutes
  • 10. Celebrate success

53
RIDIC5-ULO3US Model of Technology Use
  • 4. Tasks (RIDIC)
  • Relevance
  • Individualization
  • Depth of Discussion
  • Interactivity
  • Collaboration-Control-Choice-Constructivistic-Community

54
RIDIC5-ULO3US Model of Technology Use
  • 5. Tech Tools (ULOUS)
  • Utility/Usable
  • Learner-Centeredness
  • Opportunities with Outsiders Online
  • Ultra Friendly
  • Supportive

55
6. Course Success
  • Few technological glitches/bugs
  • Adequate online support
  • Increasing enrollment trends
  • Course quality (interactivity rating)
  • Monies paid
  • Accepted by other programs

56
7. Online Program or Course Budget (i.e., how to
pay, how large the course is, tech fees charged, #
of courses, tuition rate, etc.)
  • Indirect Costs: learner disk space, phone,
    accreditation, integration with existing
    technology, library resources, on-site
    orientation & tech training, faculty training,
    office space
  • Direct Costs: courseware, instructor, help desk,
    books, seat time, bandwidth and data
    communications, server, server back-up, course
    developers, postage

57
7. Program: Online Content Considerations
  • Self-Paced or Live mentors?
  • Interactive or content dumping?
  • Individual or Collaborative?
  • Lecture or problem-based learning?
  • Factual or performance assessment?

58
8. Institutional Success
  • E-Enrollments from new students, alumni,
    existing students
  • Additional grants
  • Press, publication, partners, attention
  • Cost-Benefit model
  • Faculty attitudes
  • Acceptable policies

59
8. Increase Accessibility
  • Make Web material ADA compliant (Bobby)
  • Embed interactivity in lessons
  • Determine student learning preferences
  • Conduct usability testing
  • Consider slowest speed systems
  • Orientations, training, support materials
  • e.g., CD-ROM

60
Final advice: whatever you do ...