Science Task Screener

Task Title: Puerto Rico Resilient Microgrid Simulation Task

Grade: High School

Date: 2024-05-20

Instructions

Criterion A. Tasks are driven by high-quality scenarios that are grounded in phenomena or problems.

i. Making sense of a phenomenon or addressing a problem is necessary to accomplish the task.

What was in the task, where was it, and why is this evidence?

  1. Is a phenomenon and/or problem present?

Yes, the task scenario presents a complex, real-world engineering problem: designing a resilient microgrid to provide continuous power to a Puerto Rican community during a 72-hour hurricane outage. This is grounded in a natural phenomenon (hurricanes disrupting standard power grids).

  2. Is information from the scenario necessary to respond successfully to the task?

Yes, students must use the parameters of the scenario (a 72-hour hurricane simulation with constraints on generation and storage) to formulate and test their design.

ii. The task scenario is engaging, relevant, and accessible to a wide range of students.

Features of engaging, relevant, and accessible tasks:

Features of scenarios Yes Somewhat No Rationale
Scenario presents real-world observations [x] [ ] [ ] Outages after severe storms are widely documented and observed.
Scenarios are based around at least one specific instance, not a topic or generally observed occurrence [x] [ ] [ ] Focused specifically on a microgrid survival test during a hurricane.
Scenarios are presented as puzzling/intriguing [x] [ ] [ ] It presents an engineering puzzle: optimize the grid without wasting money/capacity.
Scenarios create a “need to know” [x] [ ] [ ] Students need to know how solar power and batteries interact to keep the grid alive.
Scenarios are explainable using grade-appropriate SEPs, CCCs, DCIs [x] [ ] [ ] Requires computational modeling (SEP), engineering solutions (DCI), and system interaction analysis (CCC).
Scenarios effectively use at least 2 modalities (e.g., images, diagrams, video, simulations, textual descriptions) [x] [ ] [ ] Uses textual descriptions and an interactive simulation.
If data are used, scenarios present real/well-crafted data [x] [ ] [ ] The simulation provides realistic output data (e.g., kilowatt-hours over time).
The local, global, or universal relevance of the scenario is made clear to students [x] [ ] [ ] Puerto Rico’s hurricane resilience is presented as a locally urgent and globally relevant engineering challenge.
Scenarios are comprehensible to a wide range of students at grade-level [x] [ ] [ ] The concept of a power outage and the basic functions of solar panels and batteries are highly comprehensible.
Scenarios use as many words as needed, no more [x] [ ] [ ] The text is concise, focusing heavily on interactive simulation exploration.
Scenarios are sufficiently rich to drive the task [x] [ ] [ ] The simulation’s variables and outputs provide ample data for analysis and argumentation.
Evidence of quality for Criterion A: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion A:

None. The scenario is authentic, engaging, and directly drives the need to utilize the computer simulation.

Criterion B. Tasks require sense-making using the three dimensions.

i. Completing the task requires students to use reasoning to sense-make about phenomena or problems.

Consider in what ways the task requires students to use reasoning to engage in sense-making and/or problem solving.

Students must use reasoning to interpret how the interdependent variables (solar generation vs. continuous load vs. battery capacity) affect the overall system. They must reason about trade-offs, such as why load shedding is necessary to protect critical infrastructure even though it turns off non-critical power.
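The load-shedding trade-off described above can be sketched as a simple decision rule. This is a hypothetical illustration of the reasoning students engage in, not the task's actual simulation logic; the function name, parameters, and reserve threshold are assumptions:

```python
# Hypothetical load-shedding rule: when stored energy runs low, shed
# non-critical load so that critical infrastructure stays powered.
def required_load_kw(battery_kwh, critical_kw, noncritical_kw,
                     reserve_kwh=10.0):
    """Return the load (kW) the microgrid should serve this hour."""
    if battery_kwh <= reserve_kwh:
        # Battery at or below reserve: protect the critical supply only.
        return critical_kw
    return critical_kw + noncritical_kw

print(required_load_kw(battery_kwh=50.0, critical_kw=20.0, noncritical_kw=15.0))  # 35.0, full load
print(required_load_kw(battery_kwh=5.0, critical_kw=20.0, noncritical_kw=15.0))   # 20.0, shed
```

The rule makes the trade-off concrete: shedding "costs" the non-critical 15 kW but guarantees the critical 20 kW survives the outage.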

ii. The task requires students to demonstrate grade-appropriate dimensions:

Evidence of SEPs (which element[s], and how does the task require students to demonstrate this element in use?)

Students interact directly with a computer simulation (a computational model) to predict the effects of different design solutions (solar size, battery capacity) on the microgrid system (HS-ETS1-4 Evidence Statement 2a).

Evidence of CCCs (which element[s], and how does the task require students to demonstrate this element in use?)

Students must explicitly define the boundaries of the modeled system and analyze the flow of energy (generation, storage, consumption) within it (HS-ETS1-4 Evidence Statement 1a).

Evidence of DCIs (which element[s], and how does the task require students to demonstrate this element in use?)

Students use the computer simulation to test multiple ways to solve the blackout problem, evaluating the efficiency and trade-offs (e.g., auto load shedding) of their solutions.

iii. The task requires students to integrate multiple dimensions in service of sense-making and/or problem-solving.

Consider in what ways the task requires students to use multiple dimensions together.

The task successfully intertwines the dimensions. Students use the simulation (SEP) to test solutions (DCI) while reasoning about the energy flows and interactions within the boundaries of the microgrid system (CCC). They cannot successfully complete the final recommendation without integrating all three dimensions.

iv. The task requires students to make their thinking visible.

Consider in what ways the task explicitly prompts students to make their thinking visible (surfaces current understanding, abilities, gaps, problematic ideas).

Students make their thinking visible by recording their iterative trial data in the table, answering specific sense-making questions about system principles and trade-offs, and writing an evidence-based recommendation to the community council.

Evidence of quality for Criterion B: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion B:

None. The task rigorously targets the specific evidence statements for HS-ETS1-4.

Criterion C. Tasks are fair and equitable.

i. The task provides ways for students to make connections of local, global, or universal relevance.

Consider specific features of the task that enable students to make local, global, or universal connections to the phenomenon/problem and task at hand. Note: This criterion emphasizes ways for students to find meaning in the task; this does not mean “interest.” Consider whether the task is a meaningful, valuable endeavor with real-world relevance, that is, one that some stakeholder group locally, globally, or universally would be invested in.

Power outages are a universal experience, and the specific context of hurricane resilience in Puerto Rico highlights global climate and infrastructure challenges, fostering empathy and engagement.

ii. The task includes multiple modes for students to respond to the task.

Describe what modes (written, oral, video, simulation, direct observation, peer discussion, etc.) are expected/possible.

Students engage through direct interactive simulation (visual/kinesthetic), data recording (quantitative), short-answer responses (analytical), and a final written recommendation (argumentative/evaluative).

iii. The task is accessible, appropriate, and cognitively demanding for all learners (including English learners or students working below/above grade level).

Features Yes Somewhat No Rationale
Task includes appropriate scaffolds [x] [ ] [ ] The task begins with a baseline trial so all students start from the same shared experience before optimizing.
Tasks are coherent from a student perspective [x] [ ] [ ] The 5E structure provides a clear, logical progression.
Tasks respect and advantage students’ cultural and linguistic backgrounds [x] [ ] [ ] The real-world context respects the lived experiences of students in storm-prone areas.
Tasks provide both low- and high-achieving students with an opportunity to show what they know [x] [ ] [ ] Low-achieving students can successfully iterate to find a working solution, while high-achieving students are challenged to evaluate negative consequences and model limitations.
Tasks use accessible language [x] [ ] [ ] Technical terms (load shedding, capacity) are defined in plain language.

iv. The task cultivates students’ interest in and confidence with science and engineering.

Consider how the task cultivates students’ interest in and confidence with science and engineering, including opportunities for students to reflect on their own ideas as a meaningful part of the task; make decisions about how to approach a task; engage in peer/self-reflection; and engage with tasks that matter to students.

By casting the student as the lead engineer for a community council, the task empowers them to make consequential decisions and see the immediate simulated impact of those decisions.

v. The task focuses on performances for which students’ learning experiences have prepared them (opportunity to learn considerations).

Consider the ways in which provided information about students’ prior learning (e.g., instructional materials, storylines, assumed instructional experiences) enables or prevents students’ engagement with the task and educator interpretation of student responses.

The task assumes a basic prior understanding of what solar panels and batteries do, which is grade-appropriate for high school.

vi. The task presents information that is scientifically accurate.

Describe evidence of scientific inaccuracies explicitly or implicitly promoted by the task.

The simulation accurately models the relationship between generation (kW), capacity (kWh), and consumption over time. The constraints and trade-offs reflect real-world microgrid engineering challenges.
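The kW/kWh relationship noted above can be illustrated with a minimal hourly energy balance. This is a sketch of the kind of model the simulation embodies, not its actual code; the function name, capacity figures, and one-hour timestep are assumptions:

```python
def simulate_hour(battery_kwh, capacity_kwh, solar_kw, load_kw):
    """Advance the battery's state of charge by one hour.

    Power (kW) integrated over one hour yields energy (kWh), so the
    net change in stored energy is (solar_kw - load_kw) * 1 h.
    """
    battery_kwh += solar_kw - load_kw                       # kW * 1 h = kWh
    return max(0.0, min(battery_kwh, capacity_kwh))         # clamp to [0, capacity]

# Example: 40 kWh stored, 100 kWh capacity, 30 kW solar, 25 kW load
print(simulate_hour(40.0, 100.0, 30.0, 25.0))  # 45.0: the 5 kW surplus charges the battery
```

Iterating this step across 72 hours of varying solar input is exactly the optimization loop students reason through when sizing generation and storage.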

Evidence of quality for Criterion C: [ ] No [ ] Inadequate [x] Adequate [ ] Extensive

Suggestions for improvement of the task for Criterion C:

Teachers could enhance equity by allowing the final recommendation to be presented orally or visually as a presentation to the “community council” rather than strictly written.

Criterion D. Tasks support their intended targets and purpose.

Before you begin:

  1. Describe what is being assessed. Include any targets provided, such as dimensions, elements, or PEs:

HS-ETS1-4: Use a computer simulation to model the impact of proposed solutions to a complex real-world problem with numerous criteria and constraints on interactions within and between systems relevant to the problem.

  2. What is the purpose of the assessment? (check all that apply)
    • [x] Formative (including peer and self-reflection)
    • [ ] Summative
    • [ ] Determining whether students learned what they just experienced
    • [ ] Determining whether students can apply what they have learned to a similar but new context
    • [ ] Determining whether students can generalize their learning to a different context
    • [ ] Other (please specify):

i. The task assesses what it is intended to assess and supports the purpose for which it is intended.

Consider the following:

  1. Is the assessment target necessary to successfully complete the task?

Yes, students must use a computer simulation, the core of the HS-ETS1-4 target, to gather the data needed to make their recommendations.

  2. Are any ideas, practices, or experiences not targeted by the assessment necessary to respond to the task? Consider the impact this has on students’ ability to complete the task and interpretation of student responses.

Basic arithmetic is required, but it does not distract from or prevent students from achieving the science objectives.

  3. Do the student responses elicited support the purpose of the task (e.g., if a task is intended to help teachers determine if students understand the distinction between cause and correlation, does the task support this inference)?

Yes, the task directly assesses the evidence statements for HS-ETS1-4.

ii. The task elicits artifacts from students as direct, observable evidence of how well students can use the targeted dimensions together to make sense of phenomena and design solutions to problems.

Consider what student artifacts are produced and how these provide students the opportunity to make visible their 1) sense-making processes, 2) thinking across all three dimensions, and 3) ability to use multiple dimensions together [note: these artifacts should connect back to the evidence described for Criterion B].

The completed data table, the short-answer responses defining system boundaries and trade-offs, and the final evidence-based recommendation serve as direct, observable artifacts of 3D learning.

iii. Supporting materials include clear answer keys, rubrics, and/or scoring guidelines that are connected to the three-dimensional target. They provide the necessary and sufficient guidance for interpreting student responses relative to the purpose of the assessment, all targeted dimensions, and the three-dimensional target.

Consider how well the materials support teachers and students in making sense of student responses and planning for follow up (grading, instructional moves), consistent with the purpose of and targets for the assessment. Consider in what ways rubrics include:

  1. Guidance for interpreting student thinking using an integrated approach, considering all three dimensions together as well as calling out specific supports for individual dimensions, if appropriate:

Scoring Rubric (HS-ETS1-4):

  2. Support for interpreting a range of student responses, including those that might reflect partial scientific understanding or mask/misrepresent students’ actual science understanding (e.g., because of language barriers, lack of prompting, or disconnect between the intent and student interpretation of the task, variety in communication approaches):
  3. Ways to connect student responses to prior experiences and future planned instruction by teachers and participation by students:

Teachers can use struggles in identifying system boundaries as a pivot to review defining systems (CCC); struggles in identifying model limitations can prompt discussions on climate variability and structural engineering.

iv. The task’s prompts and directions provide sufficient guidance for the teacher to administer it effectively and for the students to complete it successfully while maintaining high levels of students’ analytical thinking as appropriate.

Consider any confusing prompts or directions, and evidence for too much or too little scaffolding/supports for students (relative to the target of the assessment—e.g., a task is intended to elicit student understanding of a DCI, but their response is so heavily scripted that it prevents students from actually showing their ability to apply the DCI).

The 5E structure provides a clear roadmap for both teacher administration and student completion, scaffolding the cognitive load appropriately without over-scripting the answers.

Evidence of quality for Criterion D: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion D:

None. The rubrics provide clear guidance for teachers to evaluate student mastery of the standard’s evidence statements.

Overall Summary

Consider the task purpose and the evidence you gathered for each criterion. Carefully consider the purpose and intended use of the task, your evidence, reasoning, and ratings to make a summary recommendation about using this task. While general guidance is provided below, it is important to remember that the intended use of the task plays a big role in determining whether the task is worth students’ and teachers’ time.

The Puerto Rico Resilient Microgrid Simulation Task is a robust, high-quality formative assessment aligned seamlessly with HS-ETS1-4. It uses a compelling real-world scenario to drive interaction with a computational model. Students must integrate SEPs, DCIs, and CCCs to optimize a solution, analyze trade-offs, and critique the model itself. The task is well-structured, equitable, and rigorously evaluates the explicit evidence statements of the standard.

Final recommendation (choose one):