Science Task Screener
Task Title: Urban Watershed Mitigation Design
Grade: High School
Date: Current
Instructions
- Before you begin: Complete the task as a student would. Then, consider any support materials provided to teachers or students, such as contextual information about the task and answer keys/scoring guidance.
- Using the Task Screener: Use this tool to evaluate tasks designed for three-dimensional standards. For each criterion, record your evidence for the presence or absence of the associated indicators. After you have decided to what degree the indicators are present within the task, revisit the purpose of your task and decide whether the evidence supports using it.
Criterion A. Tasks are driven by high-quality scenarios that are grounded in phenomena or problems.
i. Making sense of a phenomenon or addressing a problem is necessary to accomplish the task.
What was in the task, where was it, and why is this evidence?
- Is a phenomenon and/or problem present?
The task centers on an urban river experiencing an algal bloom and fish die-off after heavy spring rains wash agricultural and urban runoff into the waterway. Making sense of this problem requires students to construct an engineering design solution by testing and refining four potential mitigation technologies.
- Is information from the scenario necessary to respond successfully to the task?
Students must test multiple mitigation strategies in the simulation, record empirical data, and use that evidence to draft a formal proposal weighing effectiveness against cost, reliability, and land use.
ii. The task scenario is engaging, relevant, and accessible to a wide range of students.
Features of engaging, relevant, and accessible tasks:
| Features of scenarios | Yes | Somewhat | No | Rationale |
|---|---|---|---|---|
| Scenario presents real-world observations | [x] | [ ] | [ ] | The phenomenon is a specific, observable event in a local river ecosystem that prompts student inquiry. |
| Scenarios are based around at least one specific instance, not a topic or generally observed occurrence | [x] | [ ] | [ ] | The scenario explicitly focuses on a single instance of a river algal bloom rather than just the topic of ‘pollution’. |
| Scenarios are presented as puzzling/intriguing | [x] | [ ] | [ ] | The sudden die-off of fish after heavy rains presents an engaging puzzle. |
| Scenarios create a “need to know” | [x] | [ ] | [ ] | Students must determine which solutions best prevent the die-off within the $10M budget. |
| Scenarios are explainable using grade-appropriate SEPs, CCCs, DCIs | [x] | [ ] | [ ] | The scenario requires HS-level engineering design (ETS1.B) and human impacts (ESS3.C) DCIs. |
| Scenarios effectively use at least 2 modalities (e.g., images, diagrams, video, simulations, textual descriptions) | [x] | [ ] | [ ] | It uses a computational simulation alongside textual description. |
| If data are used, scenarios present real/well-crafted data | [x] | [ ] | [ ] | The simulation provides realistic output data (pollutant % and biodiversity index) based on input variables. |
| The local, global, or universal relevance of the scenario is made clear to students | [x] | [ ] | [ ] | Water quality and urban runoff are locally relevant to most students. |
| Scenarios are comprehensible to a wide range of students at grade-level | [x] | [ ] | [ ] | The reading level is high school appropriate. |
| Scenarios use as many words as needed, no more | [x] | [ ] | [ ] | The text is concise and focused purely on setting up the simulation boundaries. |
| Scenarios are sufficiently rich to drive the task | [x] | [ ] | [ ] | The scenario directly drives the investigation into the 4 technological refinements. |
| Evidence of quality for Criterion A: [ ] No | [ ] Inadequate | [ ] Adequate | [x] Extensive |
Suggestions for improvement of the task for Criterion A:
None; the phenomenon effectively anchors the task.
Criterion B. Tasks require sense-making using the three dimensions.
i. Completing the task requires students to use reasoning to sense-make about phenomena or problems.
Consider in what ways the task requires students to use reasoning to engage in sense-making and/or problem solving.
Yes: students cannot complete the task without reasoning from empirical simulation data. They test mitigation strategies, record pollutant and biodiversity outcomes, and weigh effectiveness against cost, reliability, and land use to solve the runoff problem.
ii. The task requires students to demonstrate grade-appropriate dimensions:
Evidence of SEPs (which element[s], and how does the task require students to demonstrate this element in use?)
Constructing Explanations and Designing Solutions: Students evaluate the proposed refinements for cost, safety, aesthetics, and reliability.
Evidence of CCCs (which element[s], and how does the task require students to demonstrate this element in use?)
Stability and Change; Influence of Engineering, Technology, and Science on Society and the Natural World: Students analyze how their proposed system design acts as a stabilizing feedback loop while managing financial tradeoffs.
Evidence of DCIs (which element[s], and how does the task require students to demonstrate this element in use?)
ESS3.C (Human Impacts on Earth Systems) and ETS1.B (Developing Possible Solutions): Students evaluate how human-made refinements lower pollutant loads to prevent ecosystem degradation, balancing these interventions against financial constraints.
iii. The task requires students to integrate multiple dimensions in service of sense-making and/or problem-solving.
Consider in what ways the task requires students to use multiple dimensions together.
Students integrate all three dimensions when writing their final proposal to city officials, using data analysis (SEP) to justify how their technology design (DCI) stabilizes the ecosystem (CCC).
iv. The task requires students to make their thinking visible.
Consider in what ways the task explicitly prompts students to make their thinking visible (surfaces current understanding, abilities, gaps, problematic ideas).
Students make their thinking visible through the explicit data table and the formal written proposal comparing tradeoffs.
| Evidence of quality for Criterion B: [ ] No | [ ] Inadequate | [ ] Adequate | [x] Extensive |
Suggestions for improvement of the task for Criterion B:
None; the sense-making sequence is well structured.
Criterion C. Tasks are fair and equitable.
i. The task provides ways for students to make connections of local, global, or universal relevance.
Consider specific features of the task that enable students to make local, global, or universal connections to the phenomenon/problem and task at hand. Note: This criterion emphasizes ways for students to find meaning in the task; this does not mean “interest.” Consider whether the task is a meaningful, valuable endeavor that has real-world relevance–that some stakeholder group locally, globally, or universally would be invested in.
The scenario involves an urban river, allowing students to make direct connections to local watersheds or municipal water treatment facilities.
ii. The task includes multiple modes for students to respond to the task.
Describe what modes (written, oral, video, simulation, direct observation, peer discussion, etc.) are expected/possible.
Students respond via interaction with the simulation, data tables, and a written formal proposal.
iii. The task is accessible, appropriate, and cognitively demanding for all learners (including English learners or students working below/above grade level).
| Features | Yes | Somewhat | No | Rationale |
|---|---|---|---|---|
| Task includes appropriate scaffolds | [x] | [ ] | [ ] | Step-by-step instructions and a clear data table scaffold the complex simulation. |
| Tasks are coherent from a student perspective | [x] | [ ] | [ ] | The 5E structure provides a clear, logical progression. |
| Tasks respect and advantage students’ cultural and linguistic backgrounds | [x] | [ ] | [ ] | It values local environmental justice contexts regarding water quality. |
| Tasks provide both low- and high-achieving students with an opportunity to show what they know | [x] | [ ] | [ ] | The visual simulation allows access for lower-achieving students, while the tradeoff analysis challenges high-achievers. |
| Tasks use accessible language | [x] | [ ] | [ ] | Vocabulary is explicitly defined or visually represented in the simulation. |
iv. The task cultivates students’ interest in and confidence with science and engineering.
Consider how the task cultivates students' interest in and confidence with science and engineering, including opportunities for students to reflect on their own ideas as a meaningful part of the task; make decisions about how to approach a task; engage in peer/self-reflection; and engage with tasks that matter to students.
The task cultivates interest by placing students in the role of an engineering team making real financial decisions within a $10M budget, with visible ecosystem consequences.
v. The task focuses on performances for which students’ learning experiences have prepared them (opportunity to learn considerations).
Consider the ways in which provided information about students’ prior learning (e.g., instructional materials, storylines, assumed instructional experiences) enables or prevents students’ engagement with the task and educator interpretation of student responses.
The task assumes prior knowledge of basic ecosystems, which is standard preparation for HS-ESS3.
vi. The task presents information that is scientifically accurate.
Describe evidence of scientific inaccuracies explicitly or implicitly promoted by the task.
None found. The simulation correctly models the tradeoffs of common civil engineering solutions (e.g., wetlands requiring high land use, filtration being expensive).
| Evidence of quality for Criterion C: [ ] No | [ ] Inadequate | [ ] Adequate | [x] Extensive |
Suggestions for improvement of the task for Criterion C:
Consider adding multi-lingual support to the simulation interface if possible.
Criterion D. Tasks support their intended targets and purpose.
Before you begin:
- Describe what is being assessed. Include any targets provided, such as dimensions, elements, or PEs:
The final formal proposal, which assesses students’ ability to evaluate a technological solution against constraints.
- What is the purpose of the assessment? (check all that apply)
- Formative (including peer and self-reflection)
- Summative
- Determining whether students learned what they just experienced
- Determining whether students can apply what they have learned to a similar but new context
- Determining whether students can generalize their learning to a different context
- Other (please specify): N/A
i. The task assesses what it is intended to assess and supports the purpose for which it is intended.
Consider the following:
- Is the assessment target necessary to successfully complete the task?
Yes: students must explicitly use the SEP of designing solutions, evaluated against the stated constraints, to formulate their proposal.
- Are any ideas, practices, or experiences not targeted by the assessment necessary to respond to the task? Consider the impact this has on students’ ability to complete the task and interpretation of student responses.
No outside or untaught concepts are required; all necessary constraint data is generated by the simulation.
- Do the student responses elicited support the purpose of the task (e.g., if a task is intended to help teachers determine if students understand the distinction between cause and correlation, does the task support this inference)?
Yes, the proposal directly demonstrates students’ ability to evaluate competing solutions based on constraints.
ii. The task elicits artifacts from students as direct, observable evidence of how well students can use the targeted dimensions together to make sense of phenomena and design solutions to problems.
Consider what student artifacts are produced and how these provide students the opportunity to make visible their 1) sense-making processes, 2) thinking across all three dimensions, and 3) ability to use multiple dimensions together [note: these artifacts should connect back to the evidence described for Criterion B].
The primary artifact is the written proposal, which requires synthesizing data (SEP), evaluating human impacts (DCI), and managing system stability (CCC).
iii. Supporting materials include clear answer keys, rubrics, and/or scoring guidelines that are connected to the three-dimensional target. They provide the necessary and sufficient guidance for interpreting student responses relative to the purpose of the assessment, all targeted dimensions, and the three-dimensional target.
Consider how well the materials support teachers and students in making sense of student responses and planning for follow up (grading, instructional moves), consistent with the purpose of and targets for the assessment. Consider in what ways rubrics include:
- Guidance for interpreting student thinking using an integrated approach, considering all three dimensions together as well as calling out specific supports for individual dimensions, if appropriate:
The teacher notes map exactly how the student proposal should demonstrate the integration of the three dimensions.
- Support for interpreting a range of student responses, including those that might reflect partial scientific understanding or mask/misrepresent students’ actual science understanding (e.g., because of language barriers, lack of prompting or disconnect between the intent and student interpretation of the task, variety in communication approaches):
The open-ended nature of the proposal allows teachers to interpret partial understanding by comparing the student’s data table with their written conclusion.
- Ways to connect student responses to prior experiences and future planned instruction by teachers and participation by students:
The extension options provide direct ways to connect student responses to physical modeling or local GIS analysis.
iv. The task’s prompts and directions provide sufficient guidance for the teacher to administer it effectively and for the students to complete it successfully while maintaining high levels of students’ analytical thinking as appropriate.
Consider any confusing prompts or directions, and evidence for too much or too little scaffolding/supports for students (relative to the target of the assessment—e.g., a task is intended to elicit student understanding of a DCI, but their response is so heavily scripted that it prevents students from actually showing their ability to apply the DCI).
The prompts clearly detail what the final proposal must include (combinations, evidence, and constraint analysis) without overly scripting the answer; scaffolding is present throughout the 5E sequence.
| Evidence of quality for Criterion D: [ ] No | [ ] Inadequate | [ ] Adequate | [x] Extensive |
Suggestions for improvement of the task for Criterion D:
None; the task targets are strongly supported.
Overall Summary
Consider the task purpose and the evidence you gathered for each criterion. Carefully consider the purpose and intended use of the task, your evidence, reasoning, and ratings to make a summary recommendation about using this task. While general guidance is provided below, it is important to remember that the intended use of the task plays a big role in determining whether the task is worth students’ and teachers’ time.
The task robustly supports HS-ESS3-4 by requiring students to iteratively design and evaluate a technological solution. The simulation seamlessly tracks pollution and biodiversity against constraints like budget and land use, enabling rigorous 3D learning. The 5E structure provides clear scaffolding.
Final recommendation (choose one):
- [x] Use this task (all criteria had at least an “adequate” rating)
- [ ] Modify and use this task
- [ ] Do not use this task