Title: Design Patterns: Supporting Task Design by Scaffolding the Assessment Argument
Slide 1: Design Patterns: Supporting Task Design by Scaffolding the Assessment Argument
Robert J. Mislevy, University of Maryland
Geneva Haertel and Britte Haugan Cheng, SRI International
DR K-12 grant 0733172, Application of Evidence-Centered Design to State Large-Scale Science Assessment. NSF Discovery Research K-12 PI Meeting, November 10, 2009, Washington, D.C. This material is based upon work supported by the National Science Foundation under Grant No. DRL-0733172. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
Slide 2: Overview
- Design patterns
- Background
- Evidence-Centered Design
- Main idea
- Layers
- Assessment Arguments
- Attributes of Design Patterns
- How they inform task design
Slide 3: Design Patterns
- Design Patterns in Architecture
- Design Patterns in Software Engineering
- Polti's Thirty-Six Dramatic Situations
Slide 4: Messick's Guiding Questions
- What complex of knowledge, skills, or other attributes should be assessed?
- What behaviors or performances should reveal those constructs?
- What tasks or situations should elicit those behaviors?
- Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher, 23(2), 13-23.
Slide 5: Evidence-Centered Assessment Design
- Organizing formally around the Messick quote
- Principled framework for designing, producing, and delivering assessments
- Conceptual model, object model, design tools
- Connections among design, inference, and the processes to create and deliver assessments
- Particularly useful for new / complex assessments
- Useful to think in terms of layers
Slide 6: Layers in the assessment enterprise
- Domain Analysis: What is important about this domain? What work and situations are central in this domain? What knowledge representations (KRs) are central to this domain?
- Domain Modeling: How do we represent key aspects of the domain in terms of an assessment argument? Conceptualization.
- Conceptual Assessment Framework: Design structures: student, evidence, and task models. Generativity.
- Assessment Implementation: Manufacturing the nuts and bolts: authoring tasks, automated scoring details, statistical models. Reusability.
- Assessment Delivery: Students interact with tasks, performances are evaluated, feedback is created. Four-process delivery architecture.
- From Mislevy & Riconscente, in press
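The four-process delivery architecture named in the Assessment Delivery layer comprises activity selection, presentation, evidence identification (response scoring), and evidence accumulation (summary scoring). As a minimal sketch of how the four processes cycle per task (the function bodies are hypothetical stubs, not PADI code):

def activity_selection(state):
    # Pick the next task; here, simply the next unadministered one.
    return state["remaining"].pop(0) if state["remaining"] else None

def presentation(task):
    # Administer the task and capture a work product (stubbed).
    return {"task": task, "response": "response-to-" + task}

def evidence_identification(work_product):
    # Score the work product into observable variables (stubbed rule).
    return {"correct": work_product["response"].endswith("1")}

def evidence_accumulation(state, observables):
    # Update the summary student model with the new observables.
    state["score"] += int(observables["correct"])

state = {"remaining": ["task1", "task2"], "score": 0}
while True:
    task = activity_selection(state)
    if task is None:
        break
    observables = evidence_identification(presentation(task))
    evidence_accumulation(state, observables)
print(state["score"])  # 1: only task1's stubbed response scores as correct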
Slide 7: Layers in the assessment enterprise (diagram repeated from Slide 6).
Slide 8: Layers in the assessment enterprise (repeated), with a callout highlighting: assessment argument structures; Design Patterns. (From Mislevy & Riconscente, in press.)
Slide 9: Layers in the assessment enterprise (repeated), with a callout highlighting: psychometric models; automated scoring; task templates; object models; simulation environments. (From Mislevy & Riconscente, in press.)
Slide 10: Layers in the assessment enterprise (repeated), with a callout highlighting: authoring interfaces; simulation environments; re-usable platform elements. (From Mislevy & Riconscente, in press.)
Slide 11: Layers in the assessment enterprise (repeated), with a callout highlighting: interoperable elements (IMS/QTI, SCORM); feedback / instruction / reporting. (From Mislevy & Riconscente, in press.)
Slide 12: Toulmin's Argument Structure
Data support a Claim ("so"), since a Warrant licenses the inference; the Warrant rests on Backing; and the Claim holds unless an Alternative explanation applies.
Slide 13: Assessment Argument Structure (build 1 of 4)
Data concerning performance.
Slide 14: Assessment Argument Structure (build 2 of 4)
Data concerning performance support ("so") a Claim about the student, since a Warrant for the assessment argument licenses the inference, unless Alternative explanations hold.
Slide 15: Assessment Argument Structure (build 3 of 4)
Adds Data concerning the situation; both kinds of data arise from the Student acting in the assessment situation.
Slide 16: Assessment Argument Structure (build 4 of 4)
Alternative explanations include, e.g., near or far transfer, familiarity with tools, assessment format, representational forms, evaluation standards, and task content/context. Other information concerning the student vis-à-vis the assessment situation also enters the argument; it is not in measurement models, but it is crucial to inference. A Warrant for scoring ("since") links the student acting in the assessment situation to the data concerning the performance, and a Warrant for task design links it to the data concerning the situation.
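Slides 12 through 16 describe the argument as a structure, so a minimal sketch may help; the record types and example values below are a hypothetical illustration of the schema, not PADI's object model:

from dataclasses import dataclass, field
from typing import List

@dataclass
class Warrant:
    statement: str     # why the inference is licensed
    backing: str = ""  # research or experience behind the warrant

@dataclass
class AssessmentArgument:
    claim_about_student: str
    data_concerning_performance: List[str]
    data_concerning_situation: List[str]
    warrant: Warrant                  # warrant for the assessment argument
    warrant_for_scoring: Warrant      # licenses performance -> data about it
    warrant_for_task_design: Warrant  # licenses situation -> data about it
    alternative_explanations: List[str] = field(default_factory=list)
    other_information: List[str] = field(default_factory=list)  # student vis-a-vis situation

# Hypothetical instance for a model-formation task:
argument = AssessmentArgument(
    claim_about_student="Can form a model to account for unexplained data",
    data_concerning_performance=["coherence of the constructed model"],
    data_concerning_situation=["task presents data the given model cannot explain"],
    warrant=Warrant("Forming a sound model here requires the focal knowledge"),
    warrant_for_scoring=Warrant("Rubric credits coherence and fit to the data"),
    warrant_for_task_design=Warrant("Task features ensure a new model is required"),
    alternative_explanations=["unfamiliarity with the representational forms"],
    other_information=["student has used the modeling tool in class"],
)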
Slide 17: PADI Design Patterns
- Structured around assessment arguments
- Substance based on recurring principles, ways of thinking, inquiry, etc.
- E.g., NSES on inquiry, unifying themes
- Science education and cognitive psychology research
Slide 18: Some PADI Design Patterns
- Model-Based Reasoning
- Model Formation, Evaluation, Revision, Use
- Model-Based Inquiry
- Design under Constraints
- Generate Scientific Explanations
- Troubleshooting (with Cisco)
- Assessing Epistemic Frames (in progress with David Williamson Shaffer)
Slide 19: The Structure of Assessment Design Patterns
Slide 20: How Design Patterns Support Thinking about the Assessment Argument
Slide 21: How Design Patterns Support Thinking about the Assessment Argument
Associated with Characteristic Features of Tasks.
Slide 22: How Design Patterns Support Thinking about the Assessment Argument
Slide 23: How Design Patterns Support Thinking about the Assessment Argument
Additional KSAs play multiple roles. You need to think about which ones you really DO want to include as targets of inference (validity) and which ones you really DON'T (invalidity).
Slide 24: How Design Patterns Support Thinking about the Assessment Argument
Connected with Variable Features of Tasks.
Slide 25: How Design Patterns Support Thinking about the Assessment Argument
Connected with Variable Features of Tasks and Work Products.
Slide 26: How Design Patterns Support Thinking about the Assessment Argument
The Characteristic Features of Tasks help you think about critical data concerning the situation: what you need in order to get evidence about the Focal KSAs.
Slide 27: How Design Patterns Support Thinking about the Assessment Argument
Variable Features of Tasks also help you think about data concerning the situation, but now to influence difficulty.
Slide 28: How Design Patterns Support Thinking about the Assessment Argument
Slide 29: How Design Patterns Support Thinking about the Assessment Argument
Potential Work Products help you think about what you want to capture from a performance: a product, a process, a constructed model, a written explanation, etc.
Slide 30: How Design Patterns Support Thinking about the Assessment Argument
Slide 31: How Design Patterns Support Thinking about the Assessment Argument
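Taken together, the attributes walked through above give a design pattern a regular shape. A minimal sketch of that shape as a record, with hypothetical example values (the attribute names follow the slides; the type and values are an illustration, not PADI's object model):

from dataclasses import dataclass, field
from typing import List

@dataclass
class DesignPattern:
    name: str
    focal_ksas: List[str]               # targets of inference (claims)
    additional_ksas: List[str]          # possible targets or sources of invalidity
    characteristic_features: List[str]  # task features needed to evoke the Focal KSAs
    variable_features: List[str]        # task features that tune difficulty
    potential_work_products: List[str]  # what to capture from the performance
    potential_observations: List[str] = field(default_factory=list)  # qualities to evaluate

# Abbreviated, hypothetical entry in the spirit of "Model Formation":
model_formation = DesignPattern(
    name="Model Formation",
    focal_ksas=["Form a scientific model for a real-world situation"],
    additional_ksas=["science content knowledge", "familiarity with representations"],
    characteristic_features=["a situation whose data call for a model"],
    variable_features=["level of scaffolding", "familiarity of context", "data complexity"],
    potential_work_products=["constructed model", "written explanation"],
    potential_observations=["coherence of the model", "fit of the model to the data"],
)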
Slide 32: For more information
- PADI: Principled Assessment Design for Inquiry
- http://padi.sri.com
- NSF project, collaboration with SRI et al.
- Links to follow-on projects
- Bob Mislevy's home page
- http://www.education.umd.edu/EDMS/mislevy/
- Links to papers on ECD
- Cisco applications
Slide 33: Now for the Good Stuff
- Examples of design patterns with content
- Different projects
- Different grain sizes
- Different users
- How they evolved to suit the needs of users
- Same essential structure
- Representations, language, emphases, and affordances tuned to users and needs
- How they are being used
Slide 34: Use of Design Patterns in STEM Research and Development Projects
- Britte Haugan Cheng and Geneva Haertel
- DRK-12 PI Meeting, November 2009
Slide 35: Current Catalog of Design Patterns
- ECD/PADI-related projects have produced over 100 Design Patterns
- Domains include science inquiry, science content, mathematics, economics, and model-based reasoning
- Design Patterns span grades 3-16
- Organized around themes, models, and processes, not surface features or formats of tasks
- Support the design of scenario-based, multiple-choice, and performance tasks
- The following examples show how projects have used and customized Design Patterns in ways that suit their needs and users
Slide 36: Example 1. DRK-12 Project: An Application of ECD to a State, Large-Scale Science Assessment
- Challenge in the Minnesota Comprehensive Assessment of science
- How to design scenario-based tasks and technology-enhanced interactions, grounded in standards, both EFFICIENTLY and VALIDLY
- Design Patterns support storyboard writing and task authoring
- Designers are a committee of MN teachers, supported by Pearson
- Project focuses on a small number of Design Patterns for hard-to-assess science content/inquiry
- Based on Minnesota state science standards and benchmarks and the NSES inquiry standards
- Design Patterns are Web-based and interactive
Slide 37: Design Pattern: Observational Investigation
- Relates science content/processes to components of the assessment argument
- Higher-level, cross-cutting themes, ways of thinking, and ways of using science, rather than many finer-grained standards
- Related to relevant standards and benchmarks
- Interactive Features
- Examples and details
- Activates pedagogical content knowledge
- Presents exemplar assessment tasks
- Provides selected knowledge representations
- Links among associated assessment argument components
Slide 38: Design Pattern: Observational Investigation
Slide 39: Design Pattern: Observational Investigation (cont.)
Slide 40: Design Pattern: Observational Investigation (cont.)
Slide 41: Interactive Feature: Details
Slide 42: Interactive Feature: Linking assessment argument components
Slide 43: Design Pattern Highlights: Observational Investigation
- Relates science content/processes to components of the assessment argument
- Higher-level, cross-cutting themes, ways of thinking, and ways of using science, rather than many fine-grained standards
- Interactive Features
- Examples and details
- Activates pedagogical content knowledge
- Presents exemplar assessment tasks
- Provides selected knowledge representations
- Relates relevant standards and benchmarks
- Links among associated assessment argument components
Slide 44: Design Pattern: Reasoning about Complex Systems
- Relates science content/processes to components of the assessment argument
- Across scientific domains and standards
- Convergence among the design of instruction, assessment, and technology
- Interactive Features
- Explicit support for designing tasks around a multi-year learning progression
Slide 45: Design Pattern: Reasoning about Complex Systems
Slide 46: Interactive Feature: Details
Slide 47: Interactive Feature: Linking assessment argument components
Slide 48: Design Pattern Highlights: Reasoning about Complex Systems
- Relates science content/processes to components of the assessment argument
- Across scientific domains and standards
- Convergence among the design of instruction, assessment, and technology
- Interactive Feature
- Explicit support for designing tasks around a multi-year learning progression
Slide 49: Example 2. Principled Assessment Designs in Inquiry: Model-Based Reasoning Suite
- Relates science content/processes to components of the assessment argument
- A suite of seven related Design Patterns supports curriculum-based assessment design
- Theoretically and empirically motivated by Stewart, J., & Hafner, R. (1994). Research on problem solving: Genetics. In D. L. Gabel (Ed.), Handbook of research on science teaching and learning. New York: Macmillan.
- Aspects of model-based reasoning, including model formation, model use, model revision, and coordination among aspects of model-based reasoning
- Multivariate student model: scientific reasoning and science content
- Interactive Feature
- Supports the design of both independent tasks associated with an aspect of model-based reasoning and steps in a larger investigation comprising several aspects, including model conceptualization, model use, and model evaluation
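As one hedged illustration (a sketch of the idea, not the suite's specified psychometric model), a multivariate student model with scientific-reasoning and science-content proficiencies can be written as a two-dimensional item response model:

\[
P(X_{ij} = 1 \mid \theta_{i,\mathrm{reas}}, \theta_{i,\mathrm{cont}})
  = \mathrm{logit}^{-1}\!\left( a_{j1}\,\theta_{i,\mathrm{reas}} + a_{j2}\,\theta_{i,\mathrm{cont}} - b_j \right),
\]

where the loadings a_{j1} and a_{j2} express how much evidence task j carries about student i's reasoning and content proficiencies, respectively, and b_j is the task's difficulty.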
Slide 50: Design Pattern: Model Formation
Slide 51: Design Pattern: Model Formation (cont.)
Slide 52: Interactive Feature: Links among Design Patterns
Slide 53: Design Pattern Highlights: Model-Based Reasoning Suite
- Relates science content/processes to components of the assessment argument
- Facilitates the integration of model-based reasoning skills into any science content area
- Serves as the basis of a learning progression
- Interactive Features
- Support the design of both independent tasks associated with an aspect of model-based reasoning and steps in a larger investigation comprising several aspects, from conceptualization of a model to its use and evaluation
- Explicit supports (links among Design Patterns) for designing both investigations and focused tasks
Slide 54: Example 3. Principled Science Assessment Designs for Students with Disabilities: Designing and Conducting Scientific Investigations Using Appropriate Methodology
- Relates science content/processes to components of the assessment argument
- Guides refinement of science assessment tasks across multiple states by identifying and reducing sources of construct-irrelevant variance
- Integrates six categories of Universal Design for Learning (UDL) into the assessment design process: perceptual, linguistic, cognitive, motoric, executive, affective
- Interactive Feature
- Highlights relationships among Additional KSAs, Variable Task Features, and Potential Work Products to reduce construct-irrelevant variance in a systematic manner (a sketch of this linking follows below)
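A minimal sketch of that systematic linking (the category labels echo the UDL list above; the mappings, names, and values are hypothetical illustrations, not the project's tooling):

# Each non-focal demand (Additional KSA) maps to Variable Task Features and
# Potential Work Products that remove the demand as a barrier without
# touching the Focal KSA.
UDL_ADJUSTMENTS = {
    "fine motor control (motoric)": {
        "variable_features": ["click-to-select instead of drag-and-drop"],
        "work_products": ["selected option", "dictated response"],
    },
    "decoding dense text (linguistic)": {
        "variable_features": ["read-aloud support", "simplified language"],
        "work_products": ["oral explanation"],
    },
    "holding multi-step directions (executive)": {
        "variable_features": ["step-by-step prompts", "persistent checklist"],
        "work_products": ["staged responses, one per step"],
    },
}

def adjust_task(additional_ksa_demands):
    # Collect feature and work-product options for a task's non-focal demands.
    features, products = [], []
    for demand in additional_ksa_demands:
        adjustment = UDL_ADJUSTMENTS.get(demand, {})
        features += adjustment.get("variable_features", [])
        products += adjustment.get("work_products", [])
    return features, products

features, products = adjust_task(["fine motor control (motoric)"])
print(features)  # ['click-to-select instead of drag-and-drop']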
Slide 55: Design Pattern: Designing and Conducting a Scientific Investigation Using Appropriate Methodology
Slide 56: Design Pattern: Designing and Conducting a Scientific Investigation Using Appropriate Methodology (cont.)
Slide 57: Interactive Feature: Linking Additional KSAs and Potential Work Products
Slide 58: Design Pattern Highlights: Designing and Conducting a Scientific Investigation Using Appropriate Methodology
- Relates science content/processes to components of the assessment argument
- Integrates UDL into the assessment design process rather than applying accommodations to an existing task
- Supports the selection of task features that reduce construct-irrelevant variance and enhance the performance of all test takers
- Particular attention to knowledge representation and executive processing demands
- Further customization of Design Patterns to develop assessment tasks for students with particular disabilities
- Interactive Feature
- Relates the perceptual and expressive capabilities required to complete an assessment task to that task's features (Additional KSAs, Variable Task Features, and Potential Work Products)
Slide 59: Example 4. Alternate Assessments in Mathematics: Describe, extend, and make generalizations about geometric and numeric patterns
- Relates math content/processes to components of the assessment argument
- Standards-based Design Patterns co-designed across three states to guide the development of statewide assessment tasks for students with significant cognitive disabilities
- Integration of six UDL categories into the design process
- Interactive Feature
- For logistical reasons, a Word document was used to create Design Patterns
- Attributes visualized in accordance with the assessment argument, resulting in increased efficiency and improved quality of argument
- New arrangement now under development for use in an online system
Slide 60: Design Pattern: Describe, extend, and make generalizations about geometric and numeric patterns
Slide 61: Design Pattern: Describe, extend, and make generalizations about geometric and numeric patterns (cont.)
Slide 62: Design Pattern: Describe, extend, and make generalizations about geometric and numeric patterns (cont.)
Slide 63: Design Pattern: Describe, extend, and make generalizations about geometric and numeric patterns (cont.)
Slide 64: Interactive Feature: Horizontal View: Aligning Focal KSAs, Potential Observations, and Potential Work Products
Slide 65: Interactive Feature: Horizontal View: Aligning Additional KSAs and Variable Task Features
Slide 66: Design Pattern Highlights: Describe, extend, and make generalizations about geometric and numeric patterns
- Relates math content/processes to components of the assessment argument
- Deconstruction of NCTM expectations to identify KSAs that are less difficult, or tasks that assess related cognitive background knowledge
- Supports the principled alignment of task difficulty and scope with challenges to accessibility
- Interactive Feature
- Use of multiple views of the Design Pattern to support understanding of the relationships among components of the assessment argument
- Increased efficiency of design and validity of the assessment argument
Slide 67: Summary
- Design Patterns are organized around assessment arguments and key ideas in science and math, as opposed to surface features of assessment tasks.
- Support designing tasks that move in the ways NSES and NCTM advocate, building on research and experience.
- Design Patterns support task design for different purposes and different formats (e.g., learning, summative, classroom, large-scale, hands-on, paper-and-pencil, simulations).
- Especially important for newer forms of assessment:
- Technology-based, scenario-based tasks in Minnesota
- Scenario-based learning assessment (Foothill-De Anza project)
- Simulation-based tasks (network troubleshooting, with Cisco)
- Games-based assessment (just starting, with MacArthur project)
Slide 68: Summary (cont.)
- Design Patterns are eclectic: they are not associated with any particular underlying theory of learning or cognition; all psychological perspectives can be represented
- Document design decisions
- Represent hierarchical relationships among Focal KSAs, sequential steps required for the completion of complex tasks, or superordinate, subordinate, and coordinate relations among concepts
- Re-usable: a family of assessment tasks can be produced from a single Design Pattern
- Enhance the integration of UDL with the evidence-centered design process
- Technology makes evident the relationships among Design Pattern attributes and their roles in the assessment argument