Science Task Screener
Task Title: Lyme Disease Ecology Simulation Student Task
Grade: High School
Date: 2024-05-18
Instructions
- Before you begin: Complete the task as a student would. Then, consider any support materials provided to teachers or students, such as contextual information about the task and answer keys/scoring guidance.
- Using the Task Screener: Use this tool to evaluate tasks designed for three-dimensional standards. For each criterion, record your evidence for the presence or absence of the associated indicators. After you have decided to what degree the indicators are present within the task, revisit the purpose of your task and decide whether the evidence supports using it.
Criterion A. Tasks are driven by high-quality scenarios that are grounded in phenomena or problems.
i. Making sense of a phenomenon or addressing a problem is necessary to accomplish the task.
What was in the task, where was it, and why is this evidence?
- Is a phenomenon and/or problem present?
The task centers on the real-world observation that Lyme disease incidence in human populations fluctuates dramatically from year to year, seemingly without an immediate, obvious cause like a sudden change in deer populations.
- Is information from the scenario necessary to respond successfully to the task?
Students cannot respond successfully without analyzing the multi-year population cascade data generated by the simulation, tracking how an oak mast year influences mouse and tick populations over subsequent years.
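The delayed cascade students must trace can be sketched as a simple lagged difference model. This is an illustrative sketch only; the parameter values and update rules are assumptions, not the task's actual simulation:

```python
# Illustrative sketch of a delayed ecological cascade (hypothetical
# parameters and update rules; not the task's actual simulation model).

def simulate(years=6, mast_year=1):
    """Track relative acorn, mouse, and tick abundance over several years."""
    acorns, mice, ticks = [1.0], [1.0], [1.0]
    for year in range(1, years):
        # Oak mast: a one-year pulse of extra acorns.
        acorns.append(5.0 if year == mast_year else 1.0)
        # Mice respond to the PREVIOUS year's acorn crop (one-year lag).
        mice.append(0.5 + 0.5 * acorns[year - 1])
        # Infected nymphal ticks respond to the previous year's mice.
        ticks.append(0.5 + 0.5 * mice[year - 1])
    return acorns, mice, ticks

acorns, mice, ticks = simulate()
# The mouse peak trails the mast year by one year; the tick peak trails
# by two years -- the delayed Lyme-risk spike students must explain.
print("acorns:", acorns)
print("mice:  ", mice)
print("ticks: ", ticks)
```

Even this toy version reproduces the core puzzle: the tick (and hence disease-risk) peak arrives two years after the acorn pulse, with no visible cause in that year's data.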
ii. The task scenario is engaging, relevant, and accessible to a wide range of students.
Features of engaging, relevant, and accessible tasks:
| Features of scenarios | Yes | Somewhat | No | Rationale |
|---|---|---|---|---|
| Scenario presents real-world observations | [x] | [ ] | [ ] | Lyme disease incidence in human populations really does fluctuate sharply from year to year. |
| Scenarios are based around at least one specific instance, not a topic or generally observed occurrence | [x] | [ ] | [ ] | The scenario traces one specific causal chain: an oak mast year driving mouse and then tick populations. |
| Scenarios are presented as puzzling/intriguing | [x] | [ ] | [ ] | The disease spike is delayed, with no immediate, obvious cause such as a change in deer populations. |
| Scenarios create a “need to know” | [x] | [ ] | [ ] | Students cannot explain the delayed spike without tracing the multi-year cascade in the simulation. |
| Scenarios are explainable using grade-appropriate SEPs, CCCs, DCIs | [x] | [ ] | [ ] | The phenomenon is explainable with HS-level LS2.C, Engaging in Argument from Evidence, and Stability and Change. |
| Scenarios effectively use at least 2 modalities (e.g., images, diagrams, video, simulations, textual descriptions) | [x] | [ ] | [ ] | The task combines an interactive simulation, graphs, numerical readouts, and textual descriptions. |
| If data are used, scenarios present real/well-crafted data | [x] | [ ] | [ ] | The simulation generates multi-year population data modeled on real-world tick ecology. |
| The local, global, or universal relevance of the scenario is made clear to students | [x] | [ ] | [ ] | Lyme disease is locally relevant in New England and globally relevant as a zoonotic disease. |
| Scenarios are comprehensible to a wide range of students at grade-level | [x] | [ ] | [ ] | The visual, interactive model makes the scenario accessible regardless of background knowledge or language proficiency. |
| Scenarios use as many words as needed, no more | [x] | [ ] | [ ] | The scenario and instructions use straightforward, grade-appropriate language without excess text. |
| Scenarios are sufficiently rich to drive the task | [x] | [ ] | [ ] | The scenario sustains the full arc from guided data collection through the final evidence-based argument. |
| Evidence of quality for Criterion A: [ ] No | [ ] Inadequate | [ ] Adequate | [x] Extensive |
Suggestions for improvement of the task for Criterion A:
No changes needed for Criterion A. The phenomenon of delayed Lyme disease spikes is engaging and real-world, and fully explaining the complex interactions requires working with the simulation (the scenario).
Criterion B. Tasks require sense-making using the three dimensions.
i. Completing the task requires students to use reasoning to sense-make about phenomena or problems.
Consider in what ways the task requires students to use reasoning to engage in sense-making and/or problem solving.
The task asks students to construct an evidence-based argument explaining how ecosystem stability is maintained or disrupted, moving beyond rote memorization.
ii. The task requires students to demonstrate grade-appropriate dimensions:
Evidence of SEPs (which element[s], and how does the task require students to demonstrate this element in use?)
Students Engage in Argument from Evidence to evaluate claims about ecosystem stability using the simulation data.
Evidence of CCCs (which element[s], and how does the task require students to demonstrate this element in use?)
Students use the lens of Stability and Change to analyze how populations fluctuate and return (or fail to return) to baseline.
Evidence of DCIs (which element[s], and how does the task require students to demonstrate this element in use?)
Students apply LS2.C Ecosystem Dynamics to explain the interactions between trees, mice, ticks, and deer.
iii. The task requires students to integrate multiple dimensions in service of sense-making and/or problem-solving.
Consider in what ways the task requires students to use multiple dimensions together.
The final task requires students to integrate all three dimensions by using evidence (SEP) about ecological dynamics (DCI) to argue whether the system remains stable (CCC).
iv. The task requires students to make their thinking visible.
Consider in what ways the task explicitly prompts students to make their thinking visible (surfaces current understanding, abilities, gaps, problematic ideas).
Through written explanations and data tables, students make their evaluation and reasoning visible.
| Evidence of quality for Criterion B: [ ] No | [ ] Inadequate | [ ] Adequate | [x] Extensive |
Suggestions for improvement of the task for Criterion B:
No changes needed. The task requires three-dimensional sense-making and avoids rote recall.
Criterion C. Tasks are fair and equitable.
i. The task provides ways for students to make connections of local, global, or universal relevance.
Consider specific features of the task that enable students to make local, global, or universal connections to the phenomenon/problem and task at hand. Note: This criterion emphasizes ways for students to find meaning in the task; this does not mean “interest.” Consider whether the task is a meaningful, valuable endeavor that has real-world relevance, one that some stakeholder group locally, globally, or universally would be invested in.
The phenomenon of Lyme disease ecology is highly relevant locally in New England and globally as a study of zoonotic disease.
ii. The task includes multiple modes for students to respond to the task.
Describe what modes (written, oral, video, simulation, direct observation, peer discussion, etc.) are expected/possible.
The task utilizes visual graphs, interactive sliders, numerical readouts, and written text.
iii. The task is accessible, appropriate, and cognitively demanding for all learners (including English learners or students working below/above grade level).
| Features | Yes | Somewhat | No | Rationale |
|---|---|---|---|---|
| Task includes appropriate scaffolds | [x] | [ ] | [ ] | The 5E structure provides a clear scaffold, starting with guided exploration and moving towards complex argumentation. |
| Tasks are coherent from a student perspective | [x] | [ ] | [ ] | The progression from learning about acorn masting to evaluating a claim about Lyme disease is logical and builds coherently. |
| Tasks respect and advantage students’ cultural and linguistic backgrounds | [x] | [ ] | [ ] | The task provides visual, interactive simulation data that can bridge gaps in background knowledge or language proficiency. |
| Tasks provide both low- and high-achieving students with an opportunity to show what they know | [x] | [ ] | [ ] | The initial data collection is accessible to all students, while the final argumentation allows for deep analysis and synthesis by high-achieving students. |
| Tasks use accessible language | [x] | [ ] | [ ] | The student instructions and prompt use straightforward, grade-appropriate language. |
iv. The task cultivates students’ interest in and confidence with science and engineering.
Consider how the task cultivates students' interest in and confidence with science and engineering, including opportunities for students to reflect on their own ideas as a meaningful part of the task; make decisions about how to approach a task; engage in peer/self-reflection; and engage with tasks that matter to students.
The puzzling nature of the delayed disease spike captures student interest, and the simulation's interactive sliders let students decide for themselves which scenarios to test.
v. The task focuses on performances for which students’ learning experiences have prepared them (opportunity to learn considerations).
Consider the ways in which provided information about students’ prior learning (e.g., instructional materials, storylines, assumed instructional experiences) enables or prevents students’ engagement with the task and educator interpretation of student responses.
All students can interact with the simulation, ensuring equitable opportunity to generate evidence.
vi. The task presents information that is scientifically accurate.
Describe evidence of scientific inaccuracies explicitly or implicitly promoted by the task.
The scientific relationships and data modeled in the simulation are accurate to real-world tick ecology.
| Evidence of quality for Criterion C: [ ] No | [ ] Inadequate | [ ] Adequate | [x] Extensive |
Suggestions for improvement of the task for Criterion C:
No changes needed. The task is highly accessible and equitable; the interactive model lets all students generate the evidence they need for the final argument.
Criterion D. Tasks support their intended targets and purpose.
Before you begin:
- Describe what is being assessed. Include any targets provided, such as dimensions, elements, or PEs:
Evaluates student ability to use evidence to argue about ecosystem stability.
- What is the purpose of the assessment? (check all that apply)
- [x] Formative (including peer and self-reflection)
- [x] Summative
- [x] Determining whether students learned what they just experienced
- [x] Determining whether students can apply what they have learned to a similar but new context
- [x] Determining whether students can generalize their learning to a different context
- [ ] Other (please specify):
i. The task assesses what it is intended to assess and supports the purpose for which it is intended.
Consider the following:
- Is the assessment target necessary to successfully complete the task?
Students must use all three dimensions to generate a correct evaluation.
- Are any ideas, practices, or experiences not targeted by the assessment necessary to respond to the task? Consider the impact this has on students’ ability to complete the task and interpretation of student responses.
The task minimizes requirements for outside knowledge, relying on the provided simulation.
- Do the student responses elicited support the purpose of the task (e.g., if a task is intended to help teachers determine if students understand the distinction between cause and correlation, does the task support this inference)?
Student responses support the intended inference: whether students can use simulation evidence to reason about ecosystem stability, consistent with the task's summative purpose.
ii. The task elicits artifacts from students as direct, observable evidence of how well students can use the targeted dimensions together to make sense of phenomena and design solutions to problems.
Consider what student artifacts are produced and how these provide students the opportunity to make visible their 1) sense-making processes, 2) thinking across all three dimensions, and 3) ability to use multiple dimensions together [note: these artifacts should connect back to the evidence described for Criterion B].
The written argument serves as a clear artifact of student understanding.
iii. Supporting materials include clear answer keys, rubrics, and/or scoring guidelines that are connected to the three-dimensional target. They provide the necessary and sufficient guidance for interpreting student responses relative to the purpose of the assessment, all targeted dimensions, and the three-dimensional target.
Consider how well the materials support teachers and students in making sense of student responses and planning for follow up (grading, instructional moves), consistent with the purpose of and targets for the assessment. Consider in what ways rubrics include:
- Guidance for interpreting student thinking using an integrated approach, considering all three dimensions together as well as calling out specific supports for individual dimensions, if appropriate:
The rubric or scoring guide must assess the integration of SEP, DCI, and CCC.
- Support for interpreting a range of student responses, including those that might reflect partial scientific understanding or mask/misrepresent students’ actual science understanding (e.g., because of language barriers, lack of prompting or disconnect between the intent and student interpretation of the task, variety in communication approaches):
The staged 5E structure makes partial understanding visible; for example, students who collect cascade data correctly but struggle to connect it to a claim can be identified before the final argument.
- Ways to connect student responses to prior experiences and future planned instruction by teachers and participation by students:
Feedback can guide students on how to better use evidence or apply ecological concepts.
iv. The task’s prompts and directions provide sufficient guidance for the teacher to administer it effectively and for the students to complete it successfully while maintaining high levels of students’ analytical thinking as appropriate.
Consider any confusing prompts or directions, and evidence for too much or too little scaffolding/supports for students (relative to the target of the assessment—e.g., a task is intended to elicit student understanding of a DCI, but their response is so heavily scripted that it prevents students from actually showing their ability to apply the DCI).
The instructions clearly detail the deliverables expected from students.
| Evidence of quality for Criterion D: [ ] No | [ ] Inadequate | [ ] Adequate | [x] Extensive |
Suggestions for improvement of the task for Criterion D:
No changes recommended; the task fulfills its stated formative and summative purposes.
Overall Summary
Consider the task purpose and the evidence you gathered for each criterion. Carefully consider the purpose and intended use of the task, your evidence, reasoning, and ratings to make a summary recommendation about using this task. While general guidance is provided below, it is important to remember that the intended use of the task plays a big role in determining whether the task is worth students’ and teachers’ time.
The Lyme Disease Ecology Student Task is a robust, NGSS-aligned assessment that effectively utilizes a complex simulation to evaluate students’ understanding of ecosystem dynamics, stability, and change, driven by an engaging phenomenon.
Final recommendation (choose one):
- [x] Use this task (all criteria had at least an “adequate” rating)
- [ ] Modify and use this task
- [ ] Do not use this task