The Ultimate Question: - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: The Ultimate Question:
1
The Ultimate Question
  • Does Your Technology Program Work?
  • Elizabeth Byrom, Principal Investigator
  • Anna Li, Evaluator

2
Objectives
  • Think about the context for evaluating technology
    programs
  • Identify the key elements of an evaluation model
  • Walk through steps for developing an evaluation
    plan
  • Identify evaluation resources

3
Why is evaluating technology programs a challenge?
  • Differences among adopters
  • Scale effects
  • Geography
  • Media as systems
  • Rapid change
  • Trail of use
  • (B. Bruce, 2001)

4
Why is evaluation a challenge?
  • Re-creation of technology
  • New roles for teachers and students
  • Technical characteristics
  • Access
  • (B. Bruce, 2001)

5
Observations from SEIRTEC
  • Evaluation is often the weakest part of a
    technology program.
  • Competing priorities
  • Expertise
  • Policymakers often have unrealistic expectations.
  • Traditional measures do not always apply.

6
Some Things to Consider
  • It takes four or five years for most teachers to
    become highly proficient in teaching with
    technology.
  • Effective use of technology usually requires
    changes in teaching strategies.

7
Some Things to Consider
  • It's the combined effect of good teaching,
    appropriate technologies, and conducive
    environment that makes a difference in student
    achievement.
  • Good technology does not make up for poor
    teaching.

8
Assessment and Evaluation
  • Assessment: the measurement of knowledge, skills,
    and performance (learning), e.g. student
    assessment, self-assessment
  • Evaluation: ways of examining overall technology
    programs as well as the specifics of the program

9
Evaluation Process
  • Select an evaluation model
  • Identify performance indicators
  • Identify or develop data collection methods or
    instruments
  • Collect data
  • Analyze data
  • Write evaluation report(s)
  • Use evaluation results to revise, maintain,
    augment or eliminate

10
Key Elements of an Evaluation Plan
  • Logic map(s)
  • Evaluation questions
  • Indicators of success
  • Information sources
  • Criteria and benchmarks
  • Outcomes

11
(No Transcript)
12
Professional Development Map
  • Inputs: plans, needs assessments, mandates and
    policies, research and best practices
13
Evaluation Questions
  • At least one question per objective
  • Questions on
  • Accountability
  • Quality
  • Impact
  • Sustainability
  • Lessons learned

14
(Evaluation model diagram)
  • Indicators
  • Outcomes
  • Questions
  • Methods
  • Criteria
15
Kinds of Questions
  • Accountability: Is the program doing what it is
    supposed to do?
  • Quality: How well are we implementing program
    activities and strategies? How good (useful,
    effective, well received) are products and
    services?

16
Kinds of Questions
  • Impact: Is the program making a difference? What
    effects are services and products having on
    target populations?
  • Proximal effects
  • Distal effects

17
Proximal and Distal Effects
18
Kinds of Questions
  • Sustainability: What elements are, or need to
    be, in place for a sustained level of improvement
    in teaching and learning with technology to
    occur?
  • Lessons learned: What lessons are we learning
    about the processes and factors that support or
    inhibit the accomplishment of objectives?

19
Sample Questions
  • To what extent are teachers using technology to
    increase the depth of student understanding and
    engagement?
  • How have students been impacted by technology
    integration?
  • How effective has our professional development
    been in helping teachers attain basic technology
    proficiency? In helping them learn effective
    teaching practices?

20
Indicators
  • Definition: a statement of what you would expect
    to find out or see that demonstrates particular
    attributes
  • Focus on quality, effectiveness, efficacy,
    usefulness, client satisfaction, and impact

21
Information Sources
  • Self-reports
  • Questionnaires
  • Interviews
  • Journals and anecdotal accounts
  • Products from participants
  • Sample of work, tests, and portfolios
  • Observations
  • Media: videotape, audiotape, photographs
  • Archives

22
Sources of Information: Tracking Tools
  • Milken Exchange Framework
  • CEO Forum STaR Chart
  • Learning with Technology Profile Tool (NCRTEC)
  • SEIRTEC Technology Integration Gauge for
    Success
  • Profiler (HPRTEC)

23
profiler.hprtec.org
24
Methods/Strategies for Collecting Data
  • Questionnaire
  • Survey
  • Interview
  • Focus group
  • Observation
  • Archival records

25
Criteria and Benchmarks
  • Stick a stake in the ground and say, "We are here
    today"
  • Likert-type scales or rubrics
  • STaR Chart
  • SEIRTEC Progress Gauge
  • Percentages, e.g. a 75% passing rate
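The percentage-benchmark idea above can be sketched in code. A minimal illustration (the cutoff, scores, and 75% target below are hypothetical, not from the presentation):

```python
# Illustrative sketch: checking assessment results against a fixed
# percentage benchmark, e.g. a 75% passing rate.
def passing_rate(scores, cutoff=70):
    """Percentage of scores at or above the cutoff."""
    passed = sum(1 for s in scores if s >= cutoff)
    return 100 * passed / len(scores)

benchmark = 75  # hypothetical target: a 75% passing rate
scores = [62, 88, 74, 91, 70, 85, 79, 66]  # made-up assessment scores
rate = passing_rate(scores)
print(f"Passing rate: {rate:.1f}% (benchmark met: {rate >= benchmark})")
```

The same comparison works against a rubric or Likert-scale cutoff; only the scoring changes.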

26
(No Transcript)
27
Outcomes
  • Decisions are made about maintaining, changing,
    or eliminating aspects of the program
  • Convincing evidence is gathered for proposals and
    plans
  • Products developed and distributed
  • Reports
  • Plans

28
Data Analysis
  • Quantitative data
  • Use an Excel spreadsheet
  • SPSS for Windows or Mac
  • Qualitative data
  • Content analysis
  • Look for emerging themes and summarize
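The quantitative side of the analysis above amounts to descriptive statistics. A minimal sketch of the kind of summary a spreadsheet or SPSS would produce, using only Python's standard library (the survey items and ratings are hypothetical):

```python
# Illustrative sketch: descriptive summary of Likert-style survey data.
# The items and 1-5 ratings below are made up for illustration.
from statistics import mean, median, stdev

ratings = {
    "Training met my needs": [4, 5, 3, 4, 5, 4],
    "Training was timely":   [3, 4, 4, 2, 5, 4],
}

for item, values in ratings.items():
    print(f"{item}: mean={mean(values):.2f}, "
          f"median={median(values)}, sd={stdev(values):.2f}")
```

Qualitative data takes a different route: code the responses, then look for the emerging themes the slide mentions.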

29
Using Evaluation Results
  • Make data-informed decisions
  • Make a case for continued funding
  • Inform research and the public

30
Evaluation Data on Impact
  • SEIRTEC professional development models, listed
    by greatest impact (percentages)

  Model          High Quality   Met Needs   Timely   Important Resource
  Institutes         100           100       100          100
  Academies          100           97.4      97.4         96.5
  Core Groups        94.7          86.2      92.9         96.3
  Workshops          89            84        87.5         85.9
  Presentations      89.3          78.8      83.7         79
31
Hints for Successful Evaluation
  • Think positively: evaluation is an opportunity
    to learn.
  • Try to be objective.
  • Make evaluation an integral part of the program.
  • Involve stakeholders in the evaluation process.
  • Brag on your successes, however small.
  • Ask for help when you need it.

32
Recommended Books
  • King, Morris, and Fitz-Gibbon, How to Assess
    Program Implementation. Sage, 1987.
  • Patton, Utilization-Focused Evaluation, 3rd
    edition. Sage, 1996.
  • Patton, How to Use Qualitative Methods in
    Evaluation. Sage, 1987.
  • Joint Committee on Standards for Educational
    Evaluation, The Program Evaluation Standards, 2nd
    edition. Sage, 1994.
  • Campbell and Stanley, Experimental and
    Quasi-Experimental Designs for Research.
    Houghton Mifflin, 1963.

33
Evaluation Resources
  • SEIRTEC Web Site: http://www.seirtec.org
  • US Department of Education
  • http://www.ed.gov/pubs/EdTechGuide/
  • http://www.ed.gov/Technology/TechConf/1999/whitepapers/paper8.html
  • Education Evaluation Primer:
    http://www.ed.gov/offices/OUS/eval/primer1.html

34
Evaluation Resources
  • Muraskin, Understanding Evaluation: The Way to
    Better Prevention Programs.
    http://www.ed.gov/PDFDocs/handbook.pdf
  • National Science Foundation, User-Friendly
    Handbook for Program Evaluation.
    www.ehr.nsf.gov/EHR/RED/EVAL/handbook/handbook.htm
  • W.K. Kellogg Foundation Evaluation Handbook.
    http://www.wkkf.org/Publications/evalhdbk/default.htm

35
For Further Information, Contact
  • Elizabeth Byrom, Ed.D.
  • Ebyrom@serve.org
  • Anna Li
  • Ali@serve.org
  • SouthEast Initiatives Regional Technology in
    Education Consortium
  • 1-800-755-3277, www.seirtec.org