Science Task Screener

Task Title: Modeling Feedbacks in Ocean Acidification & Coral Bleaching

Grade: High School

Date: 2026-04-25

Instructions

Criterion A. Tasks are driven by high-quality scenarios that are grounded in phenomena or problems.

i. Making sense of a phenomenon or addressing a problem is necessary to accomplish the task.

What was in the task, where was it, and why is this evidence?

  1. Is a phenomenon and/or problem present?

The task centers on an observable, real-world occurrence: the mass bleaching of vibrant coral reefs in La Parguera, Puerto Rico.

  2. Is information from the scenario necessary to respond successfully to the task?

Students must interact with the simulation, collecting specific data on global emissions and reef health, in order to explain the underlying feedback mechanism and complete the argumentation deliverable.

ii. The task scenario is engaging, relevant, and accessible to a wide range of students.

Features of engaging, relevant, and accessible tasks:

Features of scenarios Yes Somewhat No Rationale
Scenario presents real-world observations [x] [ ] [ ] Bleaching in La Parguera is a documented real-world phenomenon.
Scenarios are based around at least one specific instance, not a topic or generally observed occurrence [x] [ ] [ ] It focuses specifically on La Parguera, not just general ocean warming.
Scenarios are presented as puzzling/intriguing [x] [ ] [ ] It asks students to figure out how activities ‘thousands of miles away’ destroy reefs.
Scenarios create a “need to know” [x] [ ] [ ] The engage prompt requires students to generate their own questions about the mechanism.
Scenarios are explainable using grade-appropriate SEPs, CCCs, DCIs [x] [ ] [ ] Fully aligned to high school earth science standards regarding climate feedbacks.
Scenarios effectively use at least 2 modalities (e.g., images, diagrams, video, simulations, textual descriptions) [x] [ ] [ ] Uses text description and a full interactive computational simulation.
If data are used, scenarios present real/well-crafted data [x] [ ] [ ] The simulation outputs accurate relative scales for CO2 concentration (ppm) and temperature.
The local, global, or universal relevance of the scenario is made clear to students [x] [ ] [ ] Directly links global carbon emissions to localized ecosystem destruction.
Scenarios are comprehensible to a wide range of students at grade-level [x] [ ] [ ] Written in accessible language without assuming prior chemical knowledge.
Scenarios use as many words as needed, no more [x] [ ] [ ] The engage scenario is concise and directly leads into exploration.
Scenarios are sufficiently rich to drive the task [x] [ ] [ ] The scenario provides the foundation for the entire 5E instructional sequence.
Evidence of quality for Criterion A: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion A:

None.

Criterion B. Tasks require sense-making using the three dimensions.

i. Completing the task requires students to use reasoning to sense-make about phenomena or problems.

Consider in what ways the task requires students to use reasoning to engage in sense-making and/or problem solving.

Students cannot complete the final argument without reasoning about how the data showing increased temperature and decreased pH mechanistically explain the observed coral bleaching.

ii. The task requires students to demonstrate grade-appropriate dimensions:

Evidence of SEPs (which element[s], and how does the task require students to demonstrate this element in use?)

Analyzing and Interpreting Data: Students must run simulation trials, record data at different emission rates, and identify causal relationships.

Evidence of CCCs (which element[s], and how does the task require students to demonstrate this element in use?)

Stability and Change: Students must explicitly identify the positive feedback loop in which ocean warming reduces the ocean's CO2 absorption, destabilizing the climate system.

Evidence of DCIs (which element[s], and how does the task require students to demonstrate this element in use?)

ESS2.A, ESS2.D, ESS3.D: Students apply knowledge of Earth’s dynamic interacting systems and human impacts on global climate.

iii. The task requires students to integrate multiple dimensions in service of sense-making and/or problem-solving.

Consider in what ways the task requires students to use multiple dimensions together.

The final deliverable requires students to use their analyzed data (SEP) to explain the feedback mechanism (CCC) between human atmospheric emissions and ocean/biosphere health (DCI).

iv. The task requires students to make their thinking visible.

Consider in what ways the task explicitly prompts students to make their thinking visible (surfaces current understanding, abilities, gaps, problematic ideas).

The 5E sequence includes explicit sensemaking questions where students must write out their explanations of relationships and feedbacks before drafting the final argument.

Evidence of quality for Criterion B: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion B:

None.

Criterion C. Tasks are fair and equitable.

i. The task provides ways for students to make connections of local, global, or universal relevance.

Consider specific features of the task that enable students to make local, global, or universal connections to the phenomenon/problem and task at hand. Note: This criterion emphasizes ways for students to find meaning in the task; this does not mean "interest." Consider whether the task is a meaningful, valuable endeavor that has real-world relevance: one that some stakeholder group locally, globally, or universally would be invested in.

The phenomenon bridges global industrial activity (carbon emissions) with a specific, culturally relevant local impact (Puerto Rican coral reefs).

ii. The task includes multiple modes for students to respond to the task.

Describe what modes (written, oral, video, simulation, direct observation, peer discussion, etc.) are expected/possible.

Students engage through written text and an interactive computational simulation (with visual and data-chart outputs), and they construct written arguments.

iii. The task is accessible, appropriate, and cognitively demanding for all learners (including English learners or students working below/above grade level).

Features Yes Somewhat No Rationale
Task includes appropriate scaffolds [x] [ ] [ ] The 5E structure moves from guided observation to structured data collection before requiring an independent argument.
Tasks are coherent from a student perspective [x] [ ] [ ] The flow from phenomenon to simulation to explanation is logical and continuous.
Tasks respect and advantage students’ cultural and linguistic backgrounds [x] [ ] [ ] Focusing on a Caribbean location provides representation often missing in standard climate examples.
Tasks provide both low- and high-achieving students with an opportunity to show what they know [x] [ ] [ ] The extension activity provides differentiation, while the core task offers clear scaffolding.
Tasks use accessible language [x] [ ] [ ] Complex terms like ‘positive feedback’ are defined in context within the sensemaking section.

iv. The task cultivates students’ interest in and confidence with science and engineering.

Consider how the task cultivates students' interest in and confidence with science and engineering, including opportunities for students to reflect their own ideas as a meaningful part of the task; make decisions about how to approach a task; engage in peer/self-reflection; and engage with tasks that matter to students.

By allowing students to directly manipulate the ‘dial’ on global emissions and see immediate, localized consequences, the task builds confidence in using models to understand complex global issues.

v. The task focuses on performances for which students’ learning experiences have prepared them (opportunity to learn considerations).

Consider the ways in which provided information about students’ prior learning (e.g., instructional materials, storylines, assumed instructional experiences) enables or prevents students’ engagement with the task and educator interpretation of student responses.

The task relies entirely on data generated within the simulation itself, ensuring all students have the opportunity to succeed regardless of prior content knowledge.

vi. The task presents information that is scientifically accurate.

Describe evidence of scientific inaccuracies explicitly or implicitly promoted by the task.

The simulation’s modeled relationships between CO2, ocean warming, ocean acidification (pH), and coral bleaching thresholds are scientifically accurate and align with current climate models.
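The causal chain credited here (emissions raise atmospheric CO2; warming reduces ocean CO2 uptake, a positive feedback; dissolved CO2 lowers pH; sustained warming crosses a bleaching threshold) can be illustrated with a minimal toy model. Everything below, including the parameter values, the threshold, and the function name, is a hypothetical sketch for reviewers, not the task's actual simulation:

```python
# Toy sketch of the modeled chain (NOT the task's actual model; all
# parameter values are hypothetical stand-ins chosen for illustration).

BLEACH_THRESHOLD_C = 2.0  # hypothetical sustained-warming threshold (deg C)

def step(co2_ppm, emissions_gtc):
    """Advance the toy model one year and return the updated state."""
    temp_anomaly = 3.0 * (co2_ppm / 280.0 - 1.0)    # crude linear warming stand-in
    airborne = min(1.0, 0.45 + 0.05 * temp_anomaly)  # warmer ocean absorbs less CO2
    co2_ppm += emissions_gtc * 0.127 * airborne      # hypothetical ppm-per-GtC factor
    ph = 8.2 - 0.001 * (co2_ppm - 280.0)             # more dissolved CO2 -> lower pH
    return co2_ppm, temp_anomaly, ph

co2 = 420.0
history = []
for year in range(80):  # run the toy model for 80 years of steady emissions
    co2, temp, ph = step(co2, emissions_gtc=20.0)
    history.append((co2, temp, ph))

bleached = history[-1][1] > BLEACH_THRESHOLD_C  # did warming cross the threshold?
```

The `airborne` term is the positive feedback the task targets: as the temperature anomaly grows, a larger fraction of each year's emissions stays in the atmosphere, so CO2 (and warming) climbs faster while pH falls.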

Evidence of quality for Criterion C: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion C:

None.

Criterion D. Tasks support their intended targets and purpose.

Before you begin:

  1. Describe what is being assessed. Include any targets provided, such as dimensions, elements, or PEs:

The task assesses students’ ability to analyze simulation data to construct an argument about climate feedbacks and forecast future impacts (HS-ESS2-2 and HS-ESS3-5).

  2. What is the purpose of the assessment? (check all that apply)
    • [x] Formative (including peer and self-reflection)
    • [x] Summative
    • [ ] Determining whether students learned what they just experienced
    • [ ] Determining whether students can apply what they have learned to a similar but new context
    • [ ] Determining whether students can generalize their learning to a different context
    • [ ] Other (please specify):

i. The task assesses what it is intended to assess and supports the purpose for which it is intended.

Consider the following:

  1. Is the assessment target necessary to successfully complete the task?

Yes, understanding the feedback loops and analyzing the data are both strictly required to draft the final evidence-based argument.

  2. Are any ideas, practices, or experiences not targeted by the assessment necessary to respond to the task? Consider the impact this has on students’ ability to complete the task and interpretation of student responses.

No non-targeted ideas are required; the required concepts are provided or discovered via the simulation.

  3. Do the student responses elicited support the purpose of the task (e.g., if a task is intended to help teachers determine if students understand the distinction between cause and correlation, does the task support this inference)?

The final deliverable directly supports the purpose by requiring an evidence-based forecast (HS-ESS3-5) based on feedback mechanisms (HS-ESS2-2).

ii. The task elicits artifacts from students as direct, observable evidence of how well students can use the targeted dimensions together to make sense of phenomena and design solutions to problems.

Consider what student artifacts are produced and how these provide students the opportunity to make visible their 1) sense-making processes, 2) thinking across all three dimensions, and 3) ability to use multiple dimensions together [note: these artifacts should connect back to the evidence described for Criterion B].

Students produce a completed data table, written sensemaking responses, and a final scientific argument that makes their causal reasoning visible.

iii. Supporting materials include clear answer keys, rubrics, and/or scoring guidelines that are connected to the three-dimensional target. They provide the necessary and sufficient guidance for interpreting student responses relative to the purpose of the assessment, all targeted dimensions, and the three-dimensional target.

Consider how well the materials support teachers and students in making sense of student responses and planning for follow up (grading, instructional moves), consistent with the purpose of and targets for the assessment. Consider in what ways rubrics include:

  1. Guidance for interpreting student thinking using an integrated approach, considering all three dimensions together as well as calling out specific supports for individual dimensions, if appropriate:

The Teacher Notes explicitly map the components of the student deliverable back to the specific dimensions and evidence statements.

  2. Support for interpreting a range of student responses, including those that might reflect partial scientific understanding or mask/misrepresent students’ actual science understanding (e.g., because of language barriers, lack of prompting or disconnect between the intent and student interpretation of the task, variety in communication approaches):

The scaffolded questions in Part 3 allow teachers to identify exactly where a student’s understanding breaks down (e.g., if they struggle with organizing data vs. identifying the feedback).

  3. Ways to connect student responses to prior experiences and future planned instruction by teachers and participation by students:

The concepts of positive feedback and tipping points introduced here provide a strong foundation for future instruction on climate mitigation strategies.

iv. The task’s prompts and directions provide sufficient guidance for the teacher to administer it effectively and for the students to complete it successfully while maintaining high levels of students’ analytical thinking as appropriate.

Consider any confusing prompts or directions, and evidence for too much or too little scaffolding/supports for students (relative to the target of the assessment—e.g., a task is intended to elicit student understanding of a DCI, but their response is so heavily scripted that it prevents students from actually showing their ability to apply the DCI).

Student directions are clear, numbered, and specific regarding simulation usage.

Evidence of quality for Criterion D: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion D:

None.

Overall Summary

Consider the task purpose and the evidence you gathered for each criterion. Carefully consider the purpose and intended use of the task, your evidence, reasoning, and ratings to make a summary recommendation about using this task. While general guidance is provided below, it is important to remember that the intended use of the task plays a big role in determining whether the task is worth students’ and teachers’ time.

The task is a highly effective, three-dimensional learning experience. It successfully leverages a computational simulation to allow students to investigate complex climate feedbacks and forecast ecosystem impacts, strongly addressing the targeted high school Earth and Space Science performance expectations.

Final recommendation (choose one):