Science Task Screener

Task Title: Ecosystem Resilience and Disturbances Task

Grade: High School

Date: 2024

Instructions

Criterion A. Tasks are driven by high-quality scenarios that are grounded in phenomena or problems.

i. Making sense of a phenomenon or addressing a problem is necessary to accomplish the task.

What was in the task, where was it, and why is this evidence?

  1. Is a phenomenon and/or problem present?

Yes, the task begins with the anchoring phenomenon of the Yellowstone National Park fires and recovery, contrasted with permanent drought changes.

  2. Is information from the scenario necessary to respond successfully to the task?

Yes, the student must investigate the simulation data explicitly tied to the concepts introduced in the phenomenon to form their argument.

ii. The task scenario is engaging, relevant, and accessible to a wide range of students.

Features of engaging, relevant, and accessible tasks:

| Features of scenarios | Yes | Somewhat | No | Rationale |
| --- | --- | --- | --- | --- |
| Scenario presents real-world observations | [x] | [ ] | [ ] | The anchoring phenomenon contrasts recovery from the Yellowstone fires with permanent change from severe drought. |
| Scenarios are based around at least one specific instance, not a topic or generally observed occurrence | [x] | [ ] | [ ] | The scenario is anchored in a specific instance (the Yellowstone fires and their recovery) rather than a general topic. |
| Scenarios are presented as puzzling/intriguing | [x] | [ ] | [ ] | The contrast between ecosystems that recover and those that change permanently is framed as a puzzle to resolve. |
| Scenarios create a “need to know” | [x] | [ ] | [ ] | Students must investigate the simulation to understand why some systems recover while others change, creating a genuine need for inquiry. |
| Scenarios are explainable using grade-appropriate SEPs, CCCs, DCIs | [x] | [ ] | [ ] | The task integrates Argument from Evidence, Stability and Change, and Ecosystem Dynamics to explain the scenarios. |
| Scenarios effectively use at least 2 modalities (e.g., images, diagrams, video, simulations, textual descriptions) | [x] | [ ] | [ ] | Uses textual scenarios, an interactive simulation, and graphs/data tables. |
| If data are used, scenarios present real/well-crafted data | [x] | [ ] | [ ] | The simulation generates well-crafted, mathematically modeled population data in real time. |
| The local, global, or universal relevance of the scenario is made clear to students | [x] | [ ] | [ ] | The Yellowstone example establishes real-world relevance. |
| Scenarios are comprehensible to a wide range of students at grade-level | [x] | [ ] | [ ] | The instructions and questions use clear, direct, high-school-level language. |
| Scenarios use as many words as needed, no more | [x] | [ ] | [ ] | The prompt is concise and focuses on the core conceptual task without unnecessary reading. |
| Scenarios are sufficiently rich to drive the task | [x] | [ ] | [ ] | The scenarios require running the simulation, collecting data, and writing a comprehensive evaluative argument. |
Evidence of quality for Criterion A: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion A:

Ensure students clearly define what constitutes a ‘stable condition’ before applying disturbances.

Criterion B. Tasks require sense-making using the three dimensions.

i. Completing the task requires students to use reasoning to sense-make about phenomena or problems.

Consider in what ways the task requires students to use reasoning to engage in sense-making and/or problem solving.

The task requires students to collect baseline stability data, then apply a modest disturbance and observe the recovery (resilience). Students must also apply an extreme disturbance to observe the breakdown of stability and the formation of a new ecosystem state, and use reasoning to evaluate a claim based on this evidence.

ii. The task requires students to demonstrate grade-appropriate dimensions:

Evidence of SEPs (which element[s], and how does the task require students to demonstrate this element in use?)

Students use Engaging in Argument from Evidence by explicitly evaluating the given explanation (the politician’s claim) regarding ecosystem resilience using data gathered from their simulation runs.

Evidence of CCCs (which element[s], and how does the task require students to demonstrate this element in use?)

Students use Stability and Change by tracking population consistency and identifying the breaking point where extreme change results in a new ecosystem rather than recovery.

Evidence of DCIs (which element[s], and how does the task require students to demonstrate this element in use?)

Students demonstrate LS2.C by observing interactions that keep populations stable, documenting recovery from modest disturbances (resilience), and verifying that extreme disturbances can permanently lower carrying capacity.

iii. The task requires students to integrate multiple dimensions in service of sense-making and/or problem-solving.

Consider in what ways the task requires students to use multiple dimensions together.

Students must integrate the DCI (ecosystem interactions and carrying capacity) with the CCC (stability and change) to construct their argument evaluating the claim (SEP). The sense-making cannot be accomplished with any single dimension alone.

iv. The task requires students to make their thinking visible.

Consider in what ways the task explicitly prompts students to make their thinking visible (surfaces current understanding, abilities, gaps, problematic ideas).

Students make their thinking visible by recording quantitative data in structured tables and writing a formal argument that explicitly states their Claim, Evidence, and Reasoning.

Evidence of quality for Criterion B: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion B:

Provide a structured data table to help students track population numbers at specific time intervals.

Criterion C. Tasks are fair and equitable.

i. The task provides ways for students to make connections of local, global, or universal relevance.

Consider specific features of the task that enable students to make local, global, or universal connections to the phenomenon/problem and task at hand. Note: This criterion emphasizes ways for students to find meaning in the task; this does not mean “interest.” Consider whether the task is a meaningful, valuable endeavor that has real-world relevance–that some stakeholder group locally, globally, or universally would be invested in.

The phenomenon explicitly links the simulation’s abstract concepts to real-world forest fires (Yellowstone) and droughts, highlighting the global and local relevance of ecosystem management.

ii. The task includes multiple modes for students to respond to the task.

Describe what modes (written, oral, video, simulation, direct observation, peer discussion, etc.) are expected/possible.

Expected modes include hands-on interaction with the simulation, numerical data collection, short written responses, and a formal written argument.

iii. The task is accessible, appropriate, and cognitively demanding for all learners (including English learners or students working below/above grade level).

| Features | Yes | Somewhat | No | Rationale |
| --- | --- | --- | --- | --- |
| Task includes appropriate scaffolds | [x] | [ ] | [ ] | Step-by-step instructions for data collection provide necessary structure for the open-ended argumentation. |
| Tasks are coherent from a student perspective | [x] | [ ] | [ ] | The task follows the 5E sequence (Engage, Explore, Explain, Elaborate, Evaluate), creating a logical flow. |
| Tasks respect and advantage students’ cultural and linguistic backgrounds | [x] | [ ] | [ ] | The task does not assume prior cultural knowledge; all necessary information is provided in the phenomenon and simulation. |
| Tasks provide both low- and high-achieving students with an opportunity to show what they know | [x] | [ ] | [ ] | The open-ended argumentation allows for varying levels of sophistication in reasoning while assessing the core standard. |
| Tasks use accessible language | [x] | [ ] | [ ] | Uses clear, direct language and defines simulation terms like ‘carrying capacity ($K$)’. |

iv. The task cultivates students’ interest in and confidence with science and engineering.

Consider how the task cultivates students’ interest in and confidence with science and engineering, including opportunities for students to reflect their own ideas as a meaningful part of the task; make decisions about how to approach a task; engage in peer/self-reflection; and engage with tasks that matter to students.

The simulation allows students to make independent decisions about when to apply disturbances and requires them to form their own argument, fostering agency and confidence in data analysis.

v. The task focuses on performances for which students’ learning experiences have prepared them (opportunity to learn considerations).

Consider the ways in which provided information about students’ prior learning (e.g., instructional materials, storylines, assumed instructional experiences) enables or prevents students’ engagement with the task and educator interpretation of student responses.

The task assumes basic understanding of what plants, herbivores, and carnivores are, which is standard prior knowledge for high school biology.

vi. The task presents information that is scientifically accurate.

Describe evidence of scientific inaccuracies explicitly or implicitly promoted by the task.

No scientific inaccuracies found. The mathematical model correctly implements Lotka-Volterra equations with carrying capacity limits.
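For reference, the kind of model described above can be sketched as a Lotka-Volterra predator-prey system in which prey growth is limited by a carrying capacity. The sketch below is illustrative only: the function name, parameters, and numerical values are assumptions, not taken from the task's actual simulation.

```python
def simulate(prey0, pred0, steps=2000, dt=0.01,
             r=1.0, K=100.0, a=0.02, b=0.01, d=0.5,
             disturbance_step=None, disturbance_frac=0.0):
    """Euler integration of a Lotka-Volterra model with logistic prey growth.

    dP/dt = r*P*(1 - P/K) - a*P*Q   # prey, limited by carrying capacity K
    dQ/dt = b*P*Q - d*Q             # predator
    All parameter values are illustrative, not from the actual task.
    """
    P, Q = float(prey0), float(pred0)
    history = []
    for t in range(steps):
        if t == disturbance_step:
            P *= (1.0 - disturbance_frac)  # pulse disturbance: cull some prey
        dP = r * P * (1.0 - P / K) - a * P * Q
        dQ = b * P * Q - d * Q
        P = max(P + dP * dt, 0.0)  # clamp populations at zero
        Q = max(Q + dQ * dt, 0.0)
        history.append((P, Q))
    return history
```

With these illustrative parameters, a one-time pulse disturbance (removing a fraction of the prey) lets the system return to its prior equilibrium, demonstrating resilience, while permanently lowering $K$ below the prey level needed to sustain predators drives the predator population to extinction, i.e., a new ecosystem state rather than recovery.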

Evidence of quality for Criterion C: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion C:

Explicitly prompt students to cite the carrying capacity of plants and its change after an extreme event.

Criterion D. Tasks support their intended targets and purpose.

Before you begin:

  1. Describe what is being assessed. Include any targets provided, such as dimensions, elements, or PEs:

The task assesses students’ ability to evaluate a claim about ecosystem resilience using evidence from a population dynamics model (HS-LS2-6).

  2. What is the purpose of the assessment? (check all that apply)
    • Formative (including peer and self-reflection)
    • Summative
    • Determining whether students learned what they just experienced
    • Determining whether students can apply what they have learned to a similar but new context
    • Determining whether students can generalize their learning to a different context
    • Other (please specify): N/A

i. The task assesses what it is intended to assess and supports the purpose for which it is intended.

Consider the following:

  1. Is the assessment target necessary to successfully complete the task?

Yes, students must understand the target concepts of resilience, disturbances, and carrying capacity (DCI) to successfully evaluate the politician’s claim (SEP).

  2. Are any ideas, practices, or experiences not targeted by the assessment necessary to respond to the task? Consider the impact this has on students’ ability to complete the task and interpretation of student responses.

No external ideas are strictly necessary, though basic graphing literacy is helpful for interpreting the simulation’s output.

  3. Do the student responses elicited support the purpose of the task (e.g., if a task is intended to help teachers determine if students understand the distinction between cause and correlation, does the task support this inference)?

Yes, the argument artifact directly supports determining if students understand the limits of ecosystem resilience and how to use data to evaluate claims.

ii. The task elicits artifacts from students as direct, observable evidence of how well students can use the targeted dimensions together to make sense of phenomena and design solutions to problems.

Consider what student artifacts are produced and how these provide students the opportunity to make visible their 1) sense-making processes, 2) thinking across all three dimensions, and 3) ability to use multiple dimensions together [note: these artifacts should connect back to the evidence described for Criterion B].

Students produce a completed data table, answers to sensemaking questions, and a final written argument linking data to the concepts of stability and change.

iii. Supporting materials include clear answer keys, rubrics, and/or scoring guidelines that are connected to the three-dimensional target. They provide the necessary and sufficient guidance for interpreting student responses relative to the purpose of the assessment, all targeted dimensions, and the three-dimensional target.

Consider how well the materials support teachers and students in making sense of student responses and planning for follow up (grading, instructional moves), consistent with the purpose of and targets for the assessment. Consider in what ways rubrics include:

  1. Guidance for interpreting student thinking using an integrated approach, considering all three dimensions together as well as calling out specific supports for individual dimensions, if appropriate:
  2. Support for interpreting a range of student responses, including those that might reflect partial scientific understanding or mask/misrepresent students’ actual science understanding (e.g., because of language barriers, lack of prompting or disconnect between the intent and student interpretation of the task, variety in communication approaches):

Scoring guidance should provide examples of ‘partial’ understanding, such as a student correctly noting that population numbers drop (DCI) but failing to connect this to the broader concept of permanent ecosystem change vs. resilience (CCC).

  3. Ways to connect student responses to prior experiences and future planned instruction by teachers and participation by students:

Teacher guidance should suggest linking the simulation’s findings to local ecological phenomena, such as a recent regional drought, allowing students to apply their newly formed claims to their own environment.

iv. The task’s prompts and directions provide sufficient guidance for the teacher to administer it effectively and for the students to complete it successfully while maintaining high levels of students’ analytical thinking as appropriate.

Consider any confusing prompts or directions, and evidence for too much or too little scaffolding/supports for students (relative to the target of the assessment—e.g., a task is intended to elicit student understanding of a DCI, but their response is so heavily scripted that it prevents students from actually showing their ability to apply the DCI).

The instructions are highly structured during the exploration phase, ensuring students gather the necessary data, but leave the synthesis open-ended to maintain cognitive demand for the final argumentation.

Evidence of quality for Criterion D: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion D:

Provide a sample student response in the Teacher Notes to further clarify expectations for the argument.

Overall Summary

Consider the task purpose and the evidence you gathered for each criterion. Carefully consider the purpose and intended use of the task, your evidence, reasoning, and ratings to make a summary recommendation about using this task. While general guidance is provided below, it is important to remember that the intended use of the task plays a big role in determining whether the task is worth students’ and teachers’ time.

The task provides a robust and interactive method for students to evaluate claims about ecosystem stability, resilience, and change using clear, observable evidence from a population dynamics simulation.

Final recommendation (choose one):