Science Task Screener
Task Title: Wind Turbine Optimization Challenge
Grade: 9-12
Date: 2024-04-25
Instructions
- Before you begin: Complete the task as a student would. Then, consider any support materials provided to teachers or students, such as contextual information about the task and answer keys/scoring guidance.
- Using the Task Screener: Use this tool to evaluate tasks designed for three-dimensional standards. For each criterion, record your evidence for the presence or absence of the associated indicators. After you have decided to what degree the indicators are present within the task, revisit the purpose of your task and decide whether the evidence supports using it.
Criterion A. Tasks are driven by high-quality scenarios that are grounded in phenomena or problems.
i. Making sense of a phenomenon or addressing a problem is necessary to accomplish the task.
What was in the task, where was it, and why is this evidence?
- Is a phenomenon and/or problem present?
Yes. The task presents an authentic engineering problem: designing a wind turbine configuration that maximizes power output under given constraints (12 m/s wind speed, a limited budget).
- Is information from the scenario necessary to respond successfully to the task?
Yes. Students cannot simply guess the answer; they must use the scenario's constraints, run trials, collect power output data, and apply mathematical reasoning to determine the optimal configuration.
ii. The task scenario is engaging, relevant, and accessible to a wide range of students.
Features of engaging, relevant, and accessible tasks:
| Features of scenarios | Yes | Somewhat | No | Rationale |
|---|---|---|---|---|
| Scenario presents real-world observations | [x] | [ ] | [ ] | Wind turbines are a familiar, observable technology, and the scenario mirrors real wind-farm design decisions. |
| Scenarios are based around at least one specific instance, not a topic or generally observed occurrence | [x] | [ ] | [ ] | The task centers on one specific design challenge: a turbine operating at 12 m/s wind speed under a limited budget. |
| Scenarios are presented as puzzling/intriguing | [x] | [ ] | [ ] | The optimal configuration is non-obvious, so students must investigate rather than guess. |
| Scenarios create a “need to know” | [x] | [ ] | [ ] | Students must collect power output data across trials to determine which configuration is best. |
| Scenarios are explainable using grade-appropriate SEPs, CCCs, DCIs | [x] | [ ] | [ ] | The phenomenon requires understanding energy transfer; students apply ETS1.B and system models. |
| Scenarios effectively use at least 2 modalities (e.g., images, diagrams, video, simulations, textual descriptions) | [x] | [ ] | [ ] | The task pairs an interactive simulation, robust enough for exploration, with textual descriptions and data tables. |
| If data are used, scenarios present real/well-crafted data | [x] | [ ] | [ ] | The simulation generates well-crafted power output data grounded in accurate aerodynamic principles. |
| The local, global, or universal relevance of the scenario is made clear to students | [x] | [ ] | [ ] | Wind energy is a critical global technology, making the problem universally relevant. |
| Scenarios are comprehensible to a wide range of students at grade-level | [x] | [ ] | [ ] | The scenario is clearly introduced with relatable concepts like blade length. |
| Scenarios use as many words as needed, no more | [x] | [ ] | [ ] | The prompt is concise, quickly transitioning students into exploration. |
| Scenarios are sufficiently rich to drive the task | [x] | [ ] | [ ] | The scenario provides enough constraints (12 m/s wind speed, limited budget) to make the challenge non-trivial. |
| Evidence of quality for Criterion A: [ ] No | [ ] Inadequate | [ ] Adequate | [x] Extensive |
Suggestions for improvement of the task for Criterion A:
Consider asking students to brainstorm their own constraints before revealing the 12 m/s wind speed constraint.
Criterion B. Tasks require sense-making using the three dimensions.
i. Completing the task requires students to use reasoning to sense-make about phenomena or problems.
Consider in what ways the task requires students to use reasoning to engage in sense-making and/or problem solving.
Students must engage in mathematical reasoning to interpret simulation data, finding non-obvious patterns to determine the optimal configuration instead of guessing.
ii. The task requires students to demonstrate grade-appropriate dimensions:
Evidence of SEPs (which element[s], and how does the task require students to demonstrate this element in use?)
Constructing Explanations and Designing Solutions: students use simulation data as empirical evidence to evaluate competing configurations and propose a specific, optimized engineering solution.
Evidence of CCCs (which element[s], and how does the task require students to demonstrate this element in use?)
Students reason about trade-offs within the turbine as a system, balancing physical and societal constraints such as weight and cost when evaluating designs.
Evidence of DCIs (which element[s], and how does the task require students to demonstrate this element in use?)
ETS1.B: Developing Possible Solutions: students evaluate possible turbine designs, taking into account constraints such as performance limits and cost.
iii. The task requires students to integrate multiple dimensions in service of sense-making and/or problem-solving.
Consider in what ways the task requires students to use multiple dimensions together.
Students must integrate the dimensions: they use empirical simulation data (SEP) to evaluate designs against engineering limits (DCI) while balancing physical and societal constraints like weight and cost (CCC) in a single justified proposal.
iv. The task requires students to make their thinking visible.
Consider in what ways the task explicitly prompts students to make their thinking visible (surfaces current understanding, abilities, gaps, problematic ideas).
Students must write an Engineering Proposal that makes their thinking visible, explicitly justifying their chosen configuration with the data recorded in their trial tables.
| Evidence of quality for Criterion B: [ ] No | [ ] Inadequate | [ ] Adequate | [x] Extensive |
Suggestions for improvement of the task for Criterion B:
Ensure students explicitly state the specific constraints they are optimizing for in their final proposal.
Criterion C. Tasks are fair and equitable.
i. The task provides ways for students to make connections of local, global, or universal relevance.
Consider specific features of the task that enable students to make local, global, or universal connections to the phenomenon/problem and task at hand. Note: This criterion emphasizes ways for students to find meaning in the task; this does not mean “interest.” Consider whether the task is a meaningful, valuable endeavor that has real-world relevance–that some stakeholder group locally, globally, or universally would be invested in.
Wind energy is a critical part of the global effort to combat climate change, making the problem universally relevant to modern engineering challenges.
ii. The task includes multiple modes for students to respond to the task.
Describe what modes (written, oral, video, simulation, direct observation, peer discussion, etc.) are expected/possible.
Students respond via direct data collection in tables, answering short written sense-making questions, and composing a formal Engineering Proposal.
iii. The task is accessible, appropriate, and cognitively demanding for all learners (including English learners or students working below/above grade level).
| Features | Yes | Somewhat | No | Rationale |
|---|---|---|---|---|
| Task includes appropriate scaffolds | [x] | [ ] | [ ] | The pre-formatted data table and clear parameter boundaries serve as strong scaffolds. |
| Tasks are coherent from a student perspective | [x] | [ ] | [ ] | The 5E structure provides a logical flow from engagement to final proposal. |
| Tasks respect and advantage students’ cultural and linguistic backgrounds | [ ] | [x] | [ ] | Visual simulation elements help multilingual learners, though specific cultural tie-ins are broad. |
| Tasks provide both low- and high-achieving students with an opportunity to show what they know | [x] | [ ] | [ ] | The open-ended proposal allows for varying depths of engineering analysis. |
| Tasks use accessible language | [x] | [ ] | [ ] | Technical terms (Pitch, Efficiency) are defined visually via the simulation UI. |
iv. The task cultivates students’ interest in and confidence with science and engineering.
Consider how the task cultivates students’ interest in and confidence with science and engineering, including opportunities for students to reflect on their own ideas as a meaningful part of the task; make decisions about how to approach a task; engage in peer/self-reflection; and engage with tasks that matter to students.
The simulation gives students direct agency over the design process; running self-directed trials builds confidence in independent problem-solving.
v. The task focuses on performances for which students’ learning experiences have prepared them (opportunity to learn considerations).
Consider the ways in which provided information about students’ prior learning (e.g., instructional materials, storylines, assumed instructional experiences) enables or prevents students’ engagement with the task and educator interpretation of student responses.
The task focuses precisely on evaluating an engineering solution and assumes only prior instruction on basic energy forms, so students with typical preparation can engage without opportunity-to-learn barriers.
vi. The task presents information that is scientifically accurate.
Describe evidence of scientific inaccuracies explicitly or implicitly promoted by the task.
The task and simulation present a scientifically accurate model of wind turbine performance, correctly reflecting aerodynamic principles such as Betz’s law (which caps the extractable fraction of the wind’s kinetic energy at 16/27, about 59.3%).
| Evidence of quality for Criterion C: [ ] No | [ ] Inadequate | [ ] Adequate | [x] Extensive |
Suggestions for improvement of the task for Criterion C:
Consider adding a localized context, like placing the wind farm near the students’ actual city, to increase relevance.
Criterion D. Tasks support their intended targets and purpose.
Before you begin:
- Describe what is being assessed. Include any targets provided, such as dimensions, elements, or PEs:
Assesses students’ ability to evaluate an engineering design against criteria and constraints (HS-ETS1-3).
- What is the purpose of the assessment? (check all that apply)
- [x] Formative (including peer and self-reflection)
- [ ] Summative
- [ ] Determining whether students learned what they just experienced
- [ ] Determining whether students can apply what they have learned to a similar but new context
- [ ] Determining whether students can generalize their learning to a different context
- [ ] Other (please specify): N/A
i. The task assesses what it is intended to assess and supports the purpose for which it is intended.
Consider the following:
- Is the assessment target necessary to successfully complete the task?
Yes, the engineering proposal directly elicits whether the student can evaluate a solution and balance trade-offs, supporting the formative purpose.
- Are any ideas, practices, or experiences not targeted by the assessment necessary to respond to the task? Consider the impact this has on students’ ability to complete the task and interpretation of student responses.
No extraneous or overly advanced knowledge (such as calculus-based fluid dynamics) is required, so the assessment targets exactly what it intends without interference.
- Do the student responses elicited support the purpose of the task (e.g., if a task is intended to help teachers determine if students understand the distinction between cause and correlation, does the task support this inference)?
The final written proposal serves as the primary observable artifact, making visible the student’s ability to synthesize empirical data (SEP) with engineering constraints (DCI) and societal tradeoffs (CCC).
ii. The task elicits artifacts from students as direct, observable evidence of how well students can use the targeted dimensions together to make sense of phenomena and design solutions to problems.
Consider what student artifacts are produced and how these provide students the opportunity to make visible their 1) sense-making processes, 2) thinking across all three dimensions, and 3) ability to use multiple dimensions together [note: these artifacts should connect back to the evidence described for Criterion B].
The data tables and the final Engineering Proposal are the direct, observable artifacts: the tables make the sense-making process visible across trials, and the proposal shows students using empirical data (SEP), design constraints (DCI), and trade-off reasoning (CCC) together to justify a solution.
iii. Supporting materials include clear answer keys, rubrics, and/or scoring guidelines that are connected to the three-dimensional target. They provide the necessary and sufficient guidance for interpreting student responses relative to the purpose of the assessment, all targeted dimensions, and the three-dimensional target.
Consider how well the materials support teachers and students in making sense of student responses and planning for follow up (grading, instructional moves), consistent with the purpose of and targets for the assessment. Consider in what ways rubrics include:
- Guidance for interpreting student thinking using an integrated approach, considering all three dimensions together as well as calling out specific supports for individual dimensions, if appropriate:
The Teacher Notes explicitly map the task deliverable to the specific NGSS evidence statements for HS-ETS1-3, providing a clear 3-point scoring rubric based on data use and constraint evaluation.
- Support for interpreting a range of student responses, including those that might reflect partial scientific understanding or mask/misrepresent students’ actual science understanding (e.g., because of language barriers, lack of prompting or disconnect between the intent and student interpretation of the task, variety in communication approaches):
The rubric allows teachers to evaluate students’ grasp of ETS1.B independent of written fluency by looking for accurate data tables and correct conceptual trend identification.
- Ways to connect student responses to prior experiences and future planned instruction by teachers and participation by students:
The task’s final question about remaining problems (unpredictable wind) provides a natural segue into future lessons on battery storage or grid integration.
iv. The task’s prompts and directions provide sufficient guidance for the teacher to administer it effectively and for the students to complete it successfully while maintaining high levels of students’ analytical thinking as appropriate.
Consider any confusing prompts or directions, and evidence for too much or too little scaffolding/supports for students (relative to the target of the assessment—e.g., a task is intended to elicit student understanding of a DCI, but their response is so heavily scripted that it prevents students from actually showing their ability to apply the DCI).
The prompts provide sufficient scaffolding during the Explore phase to administer the simulation effectively, while leaving the cognitive heavy-lifting to the student’s final proposal.
| Evidence of quality for Criterion D: [ ] No | [ ] Inadequate | [ ] Adequate | [x] Extensive |
Suggestions for improvement of the task for Criterion D:
Distribute the detailed 3-point scoring rubric to students before they begin writing their engineering proposals.
Overall Summary
Consider the task purpose and the evidence you gathered for each criterion. Carefully consider the purpose and intended use of the task, your evidence, reasoning, and ratings to make a summary recommendation about using this task. While general guidance is provided below, it is important to remember that the intended use of the task plays a big role in determining whether the task is worth students’ and teachers’ time.
The Wind Turbine Optimization Challenge is an excellent, three-dimensional learning task tightly aligned with HS-ETS1-3. It effectively uses the interactive simulation to allow students to evaluate design choices and make evidence-based engineering proposals within authentic constraints. The 5E structure provides strong support for implementation.
Final recommendation (choose one):
- [x] Use this task (all criteria had at least an “adequate” rating)
- [ ] Modify and use this task
- [ ] Do not use this task