1
Race to the Top Assessment (RTTA) Program
Overview and Meeting Goals
  • Ann Whalen
  • U.S. Department of Education

2
RTTA Public Meetings
  • First in a series of public meetings on RTTA
  • Funded in part by The William and Flora Hewlett
    Foundation
  • Purpose of the meetings:
  • To provide technical assistance to PARCC and SBAC and to support their
    collaborative efforts as they develop the new assessment systems
  • To expand the knowledge and expertise of the Department and the public
    around key assessment issues
  • To provide a public venue for discussing key components of the systems
    with experts and the public at large

3
RTTA Program Goals
  • Support states in delivering a system of more
    effective and instructionally useful assessments
  • More accurate information about what students
    know and can do
  • Achievement of standards
  • Growth
  • On track for college and career readiness by the time of high school
    graduation
  • Reflects and supports good instructional practice
  • Includes all students, including English language
    learners and students with disabilities
  • Usable to inform:
  • Teaching, learning, and program improvement
  • Determinations of school effectiveness
  • Determinations of principal and teacher
    effectiveness for the purposes of evaluation and
    support
  • Determinations of individual student college and
    career readiness

4
Other Requirements
  • Assessment systems must include one or more summative assessment
    components that are fully implemented by every state in each consortium
    by SY 2014-15 and administered at least once during the academic year
    in, at a minimum:
  • Reading/language arts and mathematics
  • Grades 3-8 and high school
  • Assessments must:
  • Be valid, fair, and reliable
  • Cover the full range of the college- and career-ready content standards
  • Elicit complex demonstrations of knowledge and abilities
  • Accurately measure high- and low-achieving students

5
RTTA Grantees
  • $360 million awarded in September 2010 to two consortia
  • Partnership for Assessment of Readiness for College and Careers (PARCC)
  • Management Partner: Achieve
  • SMARTER Balanced Assessment Consortium (SBAC)
  • Management Partner: WestEd
  • Together the two consortia represent 45 states plus the District of
    Columbia
  • Commitment from higher education institutions into which 90% (PARCC)
    and 74% (SBAC) of students matriculate directly from K-12

6
Technology Requirements
  • Maximize the interoperability of assessments across technology
    platforms by:
  • Developing all assessment items to an industry-recognized,
    open-licensed set of interoperability standards
  • Producing all student-level data in a manner consistent with an
    industry-recognized, open-licensed interoperability standard (a
    data-export sketch appears at the end of this slide)
  • Using technology to its maximum extent to develop, administer, and
    score assessments and report assessment results
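To make the second bullet concrete, here is a minimal sketch of exporting student-level results as structured JSON. The record fields below are hypothetical placeholders, not drawn from any actual standard; a real export would follow the schema of whatever open-licensed interoperability standard the consortia adopt.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class StudentScoreRecord:
    # Illustrative fields only; a real record would follow the adopted
    # open-licensed standard's schema.
    student_id: str          # opaque, locally assigned identifier
    assessment_id: str       # identifies the administered test form
    subject: str             # e.g., "mathematics"
    grade: int               # tested grade: 3-8 or high school
    scale_score: int
    achievement_level: str   # e.g., "on track"

def export_records(records):
    """Serialize records as JSON so any conforming consumer can read them."""
    return json.dumps([asdict(r) for r in records], indent=2)

print(export_records([StudentScoreRecord(
    "S-001", "MATH-G5", "mathematics", 5, 742, "on track")]))
```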

7
Assessment Technology Standards
  • In December 2010, the Department issued a Request
    for Information (RFI) related to assessment
    technology standards for interoperability
  • Responses received from 22 organizations
  • Responses are posted at http://www.ed.gov/oii-news/interoperable-assessment-technology-standards-public-responses
  • ED is finalizing a summary of the comments and
    suggestions to be publicly released soon

8
RTTA Opportunities & Challenges
  • States will work together using an economy of
    scale to leverage resources
  • Presents an opportunity to push the envelope
  • Innovation in how we test students and measure
    their knowledge and abilities
  • Improvement in timeliness of assessment results
  • Use of assessment data to improve instruction
  • Support for exploration and fostering of
    continuous innovation and improvement
  • Transformation of standards and assessments
    requires significant professional development and
    support throughout the system
  • State and local technology needed to support the
    new systems

9
Broadband Availability
  • 98 percent of schools have internet access
  • Broadband availability and speed vary

Source: June 2010, www.data.ed.gov/broadband-availability/
10
Technology Readiness
  • States, districts, and schools must have the
    capacity to administer the tests and report the
    results by SY 2014-15
  • PARCC and SBAC will each conduct a technology readiness assessment to
    determine existing state and district capacity, identify gaps, and
    plan for how to close those gaps (a simple capacity-gap calculation is
    sketched at the end of this slide)
  • What are the technology needs today (e.g.,
    servers, computers, assistive devices)?
  • What are the policies to support the systems
    (e.g., professional development,
    security/validity checks)?
  • What are the technology needs in 2014-2015 to
    support a truly next-generation assessment
    system (e.g., iPads, smartphones)?
  • The consortia are collaborating on the parameters
    for their readiness tool
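A minimal sketch of the kind of capacity-gap arithmetic such a readiness assessment involves. All quantities below are illustrative assumptions, not consortium figures.

```python
import math

def devices_needed(students, sessions_per_student, session_minutes,
                   window_days, testing_hours_per_day):
    # Total seat-time required, divided by the seat-time one device can
    # supply inside the testing window, rounded up.
    total_minutes = students * sessions_per_student * session_minutes
    minutes_per_device = window_days * testing_hours_per_day * 60
    return math.ceil(total_minutes / minutes_per_device)

# Illustrative numbers: 600 students, 2 sessions each, 60 minutes per
# session, a 10-day window with 5 usable testing hours per day.
need = devices_needed(600, 2, 60, 10, 5)
have = 20
print(f"devices needed: {need}, on hand: {have}, gap: {max(0, need - have)}")
```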

11
Expectations for the Meeting
  • We have invited a range of experts to this
    meeting to share their knowledge and experience
    with the consortia members, looking at both what
    is and what could be
  • Focus
  • The morning will focus on general issues around
    technology infrastructure
  • The afternoon will focus on specific next steps
    for the consortia in conducting their technology
    readiness assessments

12
Invited Experts
  • Randy Bennett, Educational Testing Service
  • Rebecca Kopriva, Wisconsin Center for Educational
    Research
  • Shelley Loving-Ryder, Virginia Department of
    Education
  • Rick Rozzelle, Center for Educational Leadership
    and Technology
  • Mike Russell, Measured Progress
  • Sarah Susbury, Virginia Department of Education
  • Denny Way, Pearson
  • Mary Wills, Fauquier County Public Schools, Virginia

13
Public Comments
  • Purpose: ED wants to hear from the public on key considerations
    related to state and local infrastructure needs, either expanding on
    the morning and afternoon discussions or raising new issues on this
    topic not previously addressed
  • When: At the conclusion of the morning and afternoon sessions
    (11:45 a.m. and 2:30 p.m.)
  • Time limit: 3 minutes per person/organization
  • Sign-up: Individuals can sign up during the morning break at the
    registration desk
  • Due to the limited time available, if you are not able to provide
    your comments in person, we ask that you email them to
    racetothetop.assessment@ed.gov

14
Meeting Agenda
  • 8:30-9:00 Welcome/setting the stage
  • 9:00-10:15 Fishbowl discussion
  • 10:15-10:30 Break
  • 10:30-11:45 Fishbowl discussion continued
  • 11:45-12:15 Public comments
  • 12:15-1:00 Lunch
  • 1:00-2:30 Fishbowl discussion
  • 2:30-2:45 Public comments
  • 2:45-3:00 Wrap-up
  • 3:00 Adjourn

15
Reminders
  • Cell phones on vibrate, please
  • Race to the Top resources: applications, FAQs, plus today's materials
    and transcription, available at
  • www2.ed.gov/programs/racetothetop-assessment
  • Additional written input may be submitted to
    racetothetop.assessment@ed.gov
  • Reminder: The purpose is to promote a full discussion and hear a wide
    range of viewpoints on local and state technology infrastructure needs
    for computer-administered assessments, as well as the challenges and
    opportunities afforded by the Race to the Top Assessment program.
    Through this meeting, the U.S. Department of Education is not seeking
    to promote and/or endorse any particular program, project,
    methodology, or approach to this work.

16
Smarter Balanced Assessment Consortium
  • RTTA Public Meeting
  • John Jesse, Utah
  • (SBAC Technology Work Group Co-Chair)
  • Steve Garton, Maine
  • (SBAC Technology Work Group Co-Chair)
  • April 15th, 2011

17
Historical Development of the SMARTER Balanced
Assessment Consortium
  • Computer Adaptive
  • Formative Capacity
  • Integrated System
  • Measuring the Full Range of Knowledge and Skills

18
31 Member States
19
Assessment System Components
20
IT Readiness Tool
  • Evaluate current technology and infrastructure in
    terms of readiness to implement the Consortium
    assessment system
  • Gap Analysis
  • Guide the state education agency (SEA), local education agency (LEA),
    or school through a process of establishing strategies to address
    technology needs identified by the readiness tool
  • SEA, LEA, or school will be able to use the tool as a continuing work
    space to monitor and evaluate changes that are made in the system and
    determine the impact of readiness updates
  • Reports from the readiness tool will be available to assist SEAs,
    LEAs, and schools in technology planning, to describe the plan as
    established by the SEA, LEA, or school, and to describe progress as
    the technology plan is implemented
  • Collaboration with PARCC
  • The IT Readiness Tool will be designed and built
    in collaboration with PARCC.

21
IT Systems Architecture
  • Establish the architectural framework for the
    Consortium assessment system.
  • Architecture
  • Establishes the technical direction for the Consortium's technology
    and systems platform
  • Provides a framework for achieving interoperable software solutions
    and an architecture under which all software can be selected,
    designed, and implemented (a minimal interface sketch appears at the
    end of this slide)
  • Develop an architecture that supports a highly available assessment
    system and that accommodates low technology capability at the state
    and local levels
  • Utilize industry-standard practices for system strategy and enterprise
    architectural frameworks
  • Integrated System
  • Item Development
  • Computer Adaptive Test Administration
  • System Portal
  • Formative Processes and Tools
  • Interim Assessments
  • Accessibility and Accommodations
  • AI Scoring
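As a rough illustration of "an architecture under which all software can be selected, designed, and implemented," the sketch below defines interfaces for three of the listed components and an orchestration function that depends only on those interfaces. All names and signatures are hypothetical, not the Consortium's design.

```python
from abc import ABC, abstractmethod

class ItemBank(ABC):
    """Interface every item-bank component must implement."""
    @abstractmethod
    def get_item(self, item_id: str) -> dict: ...

class Delivery(ABC):
    """Interface for the component that presents items and collects responses."""
    @abstractmethod
    def administer(self, student_id: str, item: dict) -> str: ...

class Scorer(ABC):
    """Interface for human, distributed, or AI scoring back ends."""
    @abstractmethod
    def score(self, item: dict, response: str) -> float: ...

def run_session(bank: ItemBank, delivery: Delivery, scorer: Scorer,
                student_id: str, item_ids: list) -> float:
    # The orchestration layer depends only on the interfaces, so any
    # conforming component can be selected or replaced independently.
    total = 0.0
    for item_id in item_ids:
        item = bank.get_item(item_id)
        response = delivery.administer(student_id, item)
        total += scorer.score(item, response)
    return total
```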

22
The Partnership for Assessment of Readiness for College and Careers
Scott Norton, Louisiana Department of Education
USED RTTA Public Meeting: Technology Infrastructure
April 15, 2011
23
About PARCC
  • PARCC is an alliance of 25 states working
    together to develop a common set of K-12
    assessments in English and math anchored in what
    it takes to be ready for college and careers
  • PARCC is state-led; a subset of PARCC states makes up its Governing
    Board
  • Governing Board: AZ, AR, DC, FL, GA, IL, IN, LA, MD, MA, NJ, NY, OK,
    RI, TN
  • Governing Board Chair: Massachusetts Commissioner Mitchell Chester
  • Fiscal Agent State: Florida (oversees budget, procurement, and
    reporting functions)
  • Project Management Partner: Achieve

24
PARCC States
Map legend: Governing Board States; Participating States
25
The PARCC Vision
  1. Build a pathway to college and career readiness
    for all students
  2. Create high-quality assessments that measure more
    sophisticated, authentic student performances
  3. Support educators in the classroom
  4. Make better use of technology in assessments
  5. Advance accountability at all levels

26
Assessment Design
  • Benefits
  • Through-course assessments provide teachers information on their
    students' performance at key points during the year, allowing them to
    adjust instruction and target interventions
  • Results will be returned quickly so they are more useful to schools
  • Assessments will measure more sophisticated, authentic performances
  • Source: Graphic adapted from a representation prepared by the Center
    for K-12 Assessment & Performance Management (www.k12center.org)

27
Role of Technology
  • Technology will be central to PARCC, providing
    cutting edge solutions to test development,
    administration, scoring and reporting
  • Integrated modular system and interoperable
    platform
  • Administration
  • Assessments delivered to schools online
  • Grades 6-11 and end-of-year: 100% online administration
  • Item types
  • End-of-year: computer-enhanced, innovative items
  • Through-course components: searchable environments for student
    research and dynamic online calculators
  • Preprogrammed accommodations

28
Role of Technology
  • Scoring
  • Automated scoring
  • Distributed scoring
  • Data Management and Reporting: Interactive Data Tool
  • Online reports and customizable data for all stakeholders
  • Open-source system platform; aligned data standards
  • Content Management: Partnership Resource Center
  • Online professional development resources
  • Open source: ability to share, improve, and compare items and
    resources

29
Key Technical Challenges for PARCC
  • There are a number of technical/technological challenges that PARCC
    is currently facing, including:
  • Developing an interoperable technology platform that meets the needs
    of all PARCC states
  • Transitioning states to a computer-based assessment system
  • Will provide state and district needs assessment
  • Will support state and district transition
    planning
  • Developing and implementing automated scoring
    systems and processes
  • Identifying innovative item types that are
    effective measures

30
PARCC Timeline
SY 2010-11: Launch and design phase
SY 2011-12: Development begins
SY 2012-13: First-year pilot/field testing and related research and data
collection
SY 2013-14: Second-year pilot/field testing and related research and data
collection
SY 2014-15: Full administration of PARCC assessments
Summer 2015: Set achievement levels, including college-ready performance
levels
31
Partnership for Assessment of Readiness for College and Careers
http://www.achieve.org/parcc
http://www.fldoe.org/parcc/
32
Assessing Readiness: Recommendations Based on the Virginia Experience
Virginia Department of Education
  • Shelley Loving-Ryder, Assistant Superintendent of Student Assessment
    & School Improvement
  • Sarah Susbury, Director of Test Administration, Scoring & Reporting
Fauquier County Public Schools in Virginia
  • Mary Wills, Division Director of Testing
33
Readiness: A Shared Responsibility
34
Readiness: A Shared Responsibility
  • Technology
  • Assessment
  • Training
  • Communications

35
District and School Readiness: Technology
  • Communicate technology requirements early and
    often.
  • Preliminary information is better than no
    information.
  • Provide documentation to all stakeholders
    regarding
  • Device requirements
  • Minimum specifications for student testing
    devices (processor speed, operating system,
    memory, screen resolution, input devices, etc.)
  • Minimum specifications for administrative devices
  • Software requirements (browser, plug-ins, client
    software, etc.)
  • Any known hardware or software incompatibilities
    or concerns
  • Infrastructure requirements and guidelines
  • Server requirements, firewall configuration
    requirements, IP address requirements, content
    filter configurations, etc.
  • Any known infrastructure or configuration
    incompatibilities

36
District and School Readiness: Technology
  • Provide documentation to all stakeholders
    regarding
  • Bandwidth requirements and guidelines (a rough capacity check is
    sketched at the end of this slide)
  • Minimum bandwidth per test item or test form
  • Compatibility of caching devices or software
  • Electrical power requirements in testing areas
  • Appropriate testing areas (sufficient workspace,
    secure and quiet environment, etc.)
  • Potential printing requirements for student
    authorizations, student score reports, reports
    for teachers, etc.
  • Need for improved communication with examiners or
    proctors during testing (two-way radios, instant
    messaging software, etc.)
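A back-of-the-envelope check of bandwidth sufficiency. The per-student requirement and school uplink below are illustrative assumptions; the real per-student figure would come from the test vendor's published guidelines.

```python
def peak_demand_mbps(concurrent_testers, kbps_per_student):
    # Peak demand is simply concurrent testers times the per-student
    # figure the vendor publishes; all numbers here are illustrative.
    return concurrent_testers * kbps_per_student / 1000

demand = peak_demand_mbps(120, 100)   # 120 students at 100 kbps each
uplink = 20                           # school uplink in Mbps
print(f"peak demand {demand:.1f} Mbps vs. uplink {uplink} Mbps:",
      "OK" if demand <= uplink else "shortfall")
```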

37
District and School Readiness: Technology
  • Provide software tool(s) (a minimal spec check is sketched at the end
    of this slide) to:
  • Verify device requirements and configurations
  • Verify infrastructure configurations and
    connectivity
  • Verify network capacity (during potential testing
    times)
  • Multi-stage readiness certification process
  • Self-reported checklist based on minimum
    specifications
  • Network capacity testing
  • Just-before-testing checklist
  • Alternative certification process for
    technologically challenged districts
  • Technology integrators to evaluate and provide
    solutions
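A minimal sketch of the first verification: comparing a device's self-reported specs against minimums. The minimum values here are hypothetical; real values would come from the consortium's published device requirements.

```python
# Hypothetical minimums; real values would come from the consortium's
# published device requirements.
MINIMUMS = {"cpu_mhz": 1000, "ram_mb": 512,
            "screen_width": 1024, "screen_height": 768}

def check_device(reported):
    """Return the list of specs that fall below the minimums."""
    return [spec for spec, minimum in MINIMUMS.items()
            if reported.get(spec, 0) < minimum]

failures = check_device({"cpu_mhz": 1600, "ram_mb": 256,
                         "screen_width": 1280, "screen_height": 800})
print("failed checks:", failures if failures else "none")
```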

38
District and School Readiness: Assessment
  • Scheduling and planning
  • Consider testing dates, available labs, number of
    testing devices, number of tests to be
    administered
  • Staff testing environments (examiners & proctors)
  • Plan for test accommodations (specific hardware,
    software, testing environments).
  • Plan for make-up tests, retests.
  • Simulate live testing

39
District and School Readiness: Training
  • State-provided training
  • Overview for district-level superintendents and leaders
  • Policies and procedures for designated technology
    and assessment staff
  • Needed to augment vendor training
  • Ongoing process for training and mentoring new
    district testing staff
  • District-provided training
  • Roles and expectations related to assessment,
    technology, and test security
  • Guide school-level staff in assessment planning
  • Testing windows, order of test administration, avoiding scheduled
    school events during testing
  • Handling test irregularities

40
District and School Readiness: Training
  • School-provided training
  • Roles and expectations related to assessment,
    technology, and test security
  • Administration of accommodated tests
  • Handling technology issues during testing
  • Facilitate sharing of model training resources

41
District and School Readiness: Communications
  • Provide a common framework for all stakeholders
  • Offer public access to testing information (released tests, examples
    of the online test interface, etc.)
  • Ensure policy and procedures documents are archived and available to
    districts
  • Monitor and provide details of third-party software updates
  • Provide a web-based system status page

42
  • Michael Russell
  • Vice President of Innovation
  • Infrastructure and Emerging Technologies

43
  • The Final Mile
  • Consider Every School Unique
  • Bandwidth Into vs. Within School
  • Wired vs. Wireless
  • Type and Distribution of Devices
  • Competition for Bandwidth
  • Technical Skill of Personnel

44
  • Opportunities
  • Will One Option Work Across All Settings?
  • Local Caching
  • Pre-Caching (a pre-caching sketch appears at the end of this slide)
  • Peer-to-Peer
  • The Cloud
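A minimal sketch of the pre-caching option: pull content down before test day so the school's inbound link is not saturated while students are testing. The content URL in the usage comment is hypothetical, and encryption at rest and cache invalidation, both necessary for secure test content, are omitted.

```python
import os
import urllib.request

def precache(item_urls, cache_dir="test_cache"):
    # Download content ahead of test day; secure content would also need
    # to be encrypted at rest, which this sketch omits.
    os.makedirs(cache_dir, exist_ok=True)
    for url in item_urls:
        dest = os.path.join(cache_dir, os.path.basename(url))
        if not os.path.exists(dest):          # skip already-cached files
            urllib.request.urlretrieve(url, dest)

# precache(["https://example.org/items/item-001.xml"])  # hypothetical URL
```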

45
  • Delivery Devices
  • They All Have Screens, But Can We Treat Them the
    Same?
  • Desktop and Laptops
  • Netbooks
  • Tablets
  • Smartphones

46
  • Assistive Technologies
  • Keeping Pace with Versioning
  • Cursor Manipulation
  • Communication Input
  • Communication Output

47
  • Adoption Rates
  • If We Build It, How Quickly Will They Come?
  • Wide Variability Across Educational Systems
  • Leading-Edge vs. Tried-and-True

48
Expert Commentary: Security Issues with Online Assessments
Denny Way, Ph.D., Pearson
Presented at the U.S. Department of Education Public Meeting on Online
Assessments
49
Three Broad Areas of Security as Related to
Online Assessments
  • Secure Test Design
  • Secure Test Delivery
  • Secure Test Environment
  • Primary security concern in K-12 testing: the fox guarding the
    henhouse

50
Addressing Security Issues in Test Design
  • Using a test design that leverages the computer
  • Adaptive Testing
  • Testlets
  • Random selection of items or tasks (a minimal sketch appears at the
    end of this slide)
  • Maximize up-front content development
  • Supports leveraging the computer for delivering
    content
  • Provides stability across administrations
  • Consider using technology within tasks to vary
    content
  • Task models for mathematics and ELA can enhance
    security
  • Measurement errors due to task variability may
    not impact the levels of aggregation that matter
    for accountability purposes
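A minimal sketch of random item selection for security. The pool, form length, and seeding scheme are illustrative assumptions; seeding with a per-student value keeps the "random" form reproducible for auditing while still varying across students.

```python
import random

def build_form(pool, items_per_form, seed):
    # A per-student seed makes the selection reproducible for auditing
    # while ensuring no two students see the identical form by default.
    rng = random.Random(seed)
    return rng.sample(pool, items_per_form)

pool = [f"ITEM-{n:03d}" for n in range(1, 101)]   # hypothetical 100-item pool
print(build_form(pool, 10, seed="student-S-001"))
```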

51
Addressing Security Issues in Test Delivery
  • Secure encryption mechanisms are critical for protecting test content
    and test data end-to-end (a minimal sketch appears at the end of this
    slide)
  • Cloud computing could be vulnerable
  • Assistive devices or software running in conjunction with the
    delivery system need to be whitelisted
  • Ownership of the delivery system may impact
    security
  • How open should encryption be?
  • Do proprietary delivery systems have security
    advantages?
  • Mobile devices will need to have built-in lockdown capabilities
  • Many mobile devices do not support kiosk mode (e.g., disabling the
    home button)
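A minimal sketch of symmetric encryption for a test payload in transit, using the third-party Python `cryptography` package (`pip install cryptography`). Key management and distribution, the hard part in practice, are not shown.

```python
# Key generation and distribution are the hard part and are not shown.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, provisioned securely
cipher = Fernet(key)

payload = b'{"item_id": "ITEM-042", "stem": "..."}'
token = cipher.encrypt(payload)  # ciphertext that travels over the network
assert cipher.decrypt(token) == payload
```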

52
Validity Issues with Online Assessments
  • Randy Bennett
  • ETS
  • rbennett@ets.org

Discussion presentation at the US ED Race to the
Top Assessment Meeting 1, Washington, DC, April
15, 2011
53
Skill Under-Representation
  • Due to the technology implementation, the test
    fails to measure one or more important aspects of
    the CCSS
  • Interface Design
  • "I could have answered those show-your-work questions correctly if
    the test had allowed me to draw tables and figures. There's more than
    one way to solve a math problem!"
  • Automated Scoring
  • "I knew that the automated essay grader doesn't really understand
    what I write, so this time I just used lots of hard words and made my
    essay really long. I got a higher score!"

54
Irrelevant Skills
  • Due to the technology implementation, the test
    calls upon extraneous skills
  • Interface Design
  • "I write on the computer a lot, but my word processor works very
    differently from the one in the test, so I couldn't correct and
    revise like I do when I write at home and school."
  • "Entering math equations was not easy! It took me a long time to
    correct my mistakes, and I got very confused by the equation editor."

55
Comparability
  • The test operates differently from one machine to
    the next, or between paper and computer,
    affecting student performance idiosyncratically
  • Technology Infrastructure
  • "There was a long wait between test items on my machine, which was
    very frustrating and made it hard for me to focus. Students using the
    newer machines didn't have that problem."
  • "My screen was much smaller than the ones used by some other kids, so
    I had more trouble reading."
  • "I never hand-write anything! How could I do my best on a paper test
    when the kids in the other middle school get to take their writing
    test on computer?"

56
RTTA Public Meeting: State and Local Technology Infrastructure Needed to
Support Interoperable Online Assessment Systems
April 15, 2011
  • Rick Rozzelle, Center for Educational Leadership and Technology
    (CELT) Corporation

57
LMS Operational Model
58
What is the IIS/LMS Integration Strategy?
Network Communications Infrastructure and Help
Desk
59
Teacher-Student Data Link 2010
60
Teacher-Student Data Link 2015
61
Large-scale Online Assessments: Recommendations Based on the Virginia
Experience
Virginia Department of Education
  • Shelley Loving-Ryder, Assistant Superintendent of Student Assessment
    & School Improvement
  • Sarah Susbury, Director of Test Administration, Scoring & Reporting
Fauquier County Public Schools in Virginia
  • Mary Wills, Division Director of Testing
62
For Context: A Snapshot of Virginia
  • Virginia's Demographics
  • Approximately
  • 1.2M students in grades K-12
  • 88,000 teachers
  • 132 school divisions
  • 1,800 public schools
  • Virginia's Standards of Learning Assessment Program
  • Established in 1998
  • 12 different End-of-Course assessments
  • 22 different tests administered in grades 3-8
  • Mathematics, Reading, Writing, Science, and
    History tests

63
Online Testing in Virginia: A Phased Approach
Timeframe: Milestone
Fall 2000: Request for proposals for demonstration projects published
(11 responses).
Spring 2001: Demonstration phase conducted with 3 vendors in 9 school
districts.
Fall 2001: First operational online tests administered for 3
End-of-Course (EOC) test subjects.
Fall 2004: All EOC test subjects (except writing) available as online
tests; first middle school test subjects available as online tests.
Spring 2006: All grades in all subjects (except writing) available as
online tests.
Spring 2010: Begin development of technology-enhanced items for
mathematics.
Spring 2011: Field test technology-enhanced mathematics items.
Spring 2012: First operational technology-enhanced mathematics items
administered; field test technology-enhanced reading and science items;
field test online direct writing tests at grades 5, 8, and EOC.
Spring 2013: All writing and non-writing tests administered online except
for students with a documented need for paper tests.
64
Online Testing in Virginia: A Phased Approach
Chart data (from the speaker notes):
  • 2007-2008: Paper tests 1,058,623 (39%); Online tests 1,646,614 (61%)
  • 2008-2009: Paper tests 841,630 (31%); Online tests 1,850,013 (69%)
  • 2009-2010: Paper tests 595,709 (22%); Online tests 2,104,490 (78%)
65
Recommendations for Success
  • Partnerships
  • Technology Requirements
  • Training & Support
  • Policies & Procedures

66
Partnerships: Three Entities
67
Importance of Partnerships
  • Partnerships must exist between assessment and
    technology groups at all levels
  • Communication paths must be clear and used
    consistently among all partners
  • Each partner must feel confident it is supported
    by the others.

68
Technology Requirements
  • Must be established and communicated to all
    partners
  • Minimum hardware requirements
  • Minimum operating system requirements
  • Minimum infrastructure and bandwidth requirements
  • Must be continuously reviewed and monitored
  • Will be impacted by test and test delivery
    methods
  • Item types
  • Test Accommodations
  • Real-time delivery or cached delivery

69
Technology Requirements
  • Must provide
  • Desktop security during test administration
  • Access security
  • Administrative access
  • Student test access
  • Secure transmission & storage of test content
  • Secure transmission & storage of student data

70
Technology Requirements
  • Test delivery solution must offer
  • Fault tolerance at all levels
  • Contractor locations
  • Hosting back-ups
  • Database back-ups
  • Limited to no downtime
  • District and school locations
  • Can testing continue during local connectivity
    failure?
  • Will student responses be saved when local issues occur? (a
    buffering sketch appears at the end of this slide)
  • Power failure?
  • Connectivity failure?
  • Hardware failure?
  • Single point of access
  • A single web-based portal for all state
    assessment activities
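A minimal sketch of one answer to the response-saving question: buffer responses locally and remove them only on confirmed uploads, so a dropped connection never loses a student's answer. The `send` callable and the use of `ConnectionError` are assumptions standing in for a real upload client.

```python
class ResponseBuffer:
    """Queue responses locally and upload when connectivity returns."""

    def __init__(self, send):
        self.send = send      # callable that uploads one response dict
        self.pending = []     # responses not yet confirmed by the server

    def record(self, response):
        self.pending.append(response)   # persist locally first, always
        self.flush()

    def flush(self):
        while self.pending:
            try:
                self.send(self.pending[0])
                self.pending.pop(0)     # drop only after a confirmed send
            except ConnectionError:
                break                   # offline; keep buffering, retry later
```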

71
Training & Support
  • Training must be provided for all entities.
  • State education agency staff
  • Assessment and technology staff
  • Local education agency staff
  • School and district staff
  • Contractor staff
  • Help desk staff and program team staff
  • Training must address
  • Assessment-related topics
  • Technology-related topics
  • Training should include
  • Train-the-trainer format and materials for
    districts
  • Multiple delivery modes (e.g., Web-based,
    face-to-face, hands-on, etc.)

72
Training & Support
  • Support models must be adapted to online test
    delivery.
  • Immediate responses are needed when students are
    testing and issues occur.
  • Staff must be prepared to troubleshoot technology
    issues and assessment issues.
  • A clear communication protocol is needed for
    dealing with technology and assessment issues.
  • Designated points of contact for assessment and
    technology.
  • Consistent responses and support must be provided
    to all districts.
  • Financial support must be ongoing to address technology replacement
    and technology changes.

73
Training & Support
  • Must provide student training opportunities
  • Using online tools
  • Navigating within the test
  • Familiarity with functionality included in test
    items
  • Provide information for other stakeholders
  • Teachers, parents, community members
  • Policy makers, board members, legislators
  • Vendors of hardware, software, and infrastructure
  • Internet service providers

74
Policies & Procedures
  • Established and well-communicated
  • Anticipate problems and prepare plans
  • Have an established plan for districts and
    schools to receive information when a problem
    occurs
  • Provide a system status page (a minimal payload sketch appears at the
    end of this slide)
  • Reduces phone calls and emails during an issue
  • Ensures all users receive the same information
  • Host it on a server external to the assessment system
  • Keep it available for posting custom messages quickly
  • Provide contingency plans for districts and
    schools
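A minimal sketch of what an externally hosted status page could serve. The JSON schema is an assumption; the point is one machine- and human-readable answer to "is the system up right now?" that every district sees identically.

```python
import json
from datetime import datetime, timezone

def status_payload(state, message):
    # The schema is illustrative, not a published format.
    return json.dumps({
        "state": state,       # e.g., "up", "degraded", "down"
        "message": message,   # custom message posted quickly during an issue
        "updated": datetime.now(timezone.utc).isoformat(),
    })

print(status_payload("degraded",
                     "Logins slow in Region 3; testing may continue."))
```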

75
Race to the Top Assessment Program Technical Assistance Public Meeting:
Closing Comments
  • Ann Whalen
  • U.S. Department of Education

76
Reminders
  • Transcript from today's meeting will be available at
  • www2.ed.gov/programs/racetothetop-assessment
  • Additional written input may be submitted to
    racetothetop.assessment@ed.gov

77
Future Public Meetings
  • Additional public meetings will be planned for
    this year
  • Future meetings may focus on
  • the use of artificial intelligence scoring of
    assessments
  • selection of a uniform growth model consistent
    with test purpose, structure, and intended uses
  • innovation in item types and how to leverage
    technology
  • the inclusion of students with disabilities and
    English learners
  • Dates, topics, and locations of future meetings will be posted at
    ed.gov once they are determined, and stakeholder groups will be
    notified