Science Task Screener

Task Title: Spacecraft Reentry Optimization Simulation

Grade: High School

Date: 2024-04-25

Instructions

Criterion A. Tasks are driven by high-quality scenarios that are grounded in phenomena or problems.

i. Making sense of a phenomenon or addressing a problem is necessary to accomplish the task.

What was in the task, where was it, and why is this evidence?

  1. Is a phenomenon and/or problem present?

The task requires students to make sense of the phenomenon of atmospheric reentry. They must analyze how the spacecraft’s kinetic energy is converted into extreme heat as atmospheric drag decelerates the capsule, and logically deduce how manipulating the design variables affects the spacecraft’s survival.

  2. Is information from the scenario necessary to respond successfully to the task?

Yes. The mission constraints and the quantitative data the simulation generates (Peak Temp, Max G-Force) are essential to the task: students must test multiple variable combinations and justify their recommended design with that data in order to complete the final ‘Evaluate’ task.
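The task’s simulation code is not part of the screened materials. As a rough, hypothetical sketch of the kind of quantitative model students interact with (the function name, constants, thresholds, and formulas below are illustrative assumptions, not the task’s actual physics engine), the pass/fail computation might look like:

```python
# Hypothetical toy model, for illustration only -- not the task's actual simulation.
# All constants, thresholds, and formulas below are assumptions.
import math

def reentry_outcome(mass_kg, diameter_m, entry_angle_deg,
                    entry_speed_ms=7800.0, temp_limit_k=2500.0, g_limit=8.0):
    """Return (peak_temp_k, peak_g, passed) for one capsule design."""
    area_m2 = math.pi * (diameter_m / 2.0) ** 2
    angle_rad = math.radians(entry_angle_deg)

    # A steeper entry angle means a shorter path through the atmosphere,
    # so the same speed must be shed over less distance (harder deceleration).
    path_m = 80_000.0 / max(math.sin(angle_rad), 0.05)
    decel_ms2 = entry_speed_ms ** 2 / (2.0 * path_m)
    peak_g = decel_ms2 / 9.81

    # Kinetic energy is converted to heat; a larger capsule face spreads the load.
    kinetic_j = 0.5 * mass_kg * entry_speed_ms ** 2
    heating_index = kinetic_j / (area_m2 * path_m)      # crude heat-flux proxy
    peak_temp_k = 300.0 + 0.05 * heating_index          # arbitrary scaling for the toy model

    passed = peak_temp_k <= temp_limit_k and peak_g <= g_limit
    return peak_temp_k, peak_g, passed

if __name__ == "__main__":
    # One hypothetical trial: an 800 kg, 2 m capsule entering at a shallow 10 degree angle.
    print(reentry_outcome(mass_kg=800, diameter_m=2.0, entry_angle_deg=10))
```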

ii. The task scenario is engaging, relevant, and accessible to a wide range of students.

Features of engaging, relevant, and accessible tasks:

Features of scenarios Yes Somewhat No Rationale
Scenario presents real-world observations [x] [ ] [ ] The scenario is grounded in the real phenomenon of atmospheric reentry, in which a spacecraft’s kinetic energy is converted into extreme heat and deceleration.
Scenarios are based around at least one specific instance, not a topic or generally observed occurrence [x] [ ] [ ] The scenario centers on designing a reentry capsule to safely return a specific payload, a concrete instance of the broader engineering challenge.
Scenarios are presented as puzzling/intriguing [x] [ ] [ ] The extreme consequences (burning up vs. crushing G-forces) and the interactive nature of balancing competing variables make the problem puzzling and intriguing.
Scenarios create a “need to know” [x] [ ] [ ] Students need to understand the relationship between angle, mass, and heat in order to find a solution that passes the mission constraints.
Scenarios are explainable using grade-appropriate SEPs, CCCs, DCIs [x] [ ] [ ] The scenario can be explained using the targeted SEP (Using Mathematics and Computational Thinking), DCI (ETS1.B: Developing Possible Solutions), and CCC (Systems and System Models) at the high school level.
Scenarios effectively use at least 2 modalities (e.g., images, diagrams, video, simulations, textual descriptions) [x] [ ] [ ] The scenario pairs a textual mission briefing with an interactive computational simulation.
If data are used, scenarios present real/well-crafted data [x] [ ] [ ] The simulation generates real-time, mathematically derived data (Peak Temp, Max G-Force) based on the inputs provided.
The local, global, or universal relevance of the scenario is made clear to students [x] [ ] [ ] Space travel and the safe return of a payload provide a context of universal relevance that does not depend on any particular local setting.
Scenarios are comprehensible to a wide range of students at grade-level [x] [ ] [ ] The task is widely comprehensible because the basic inputs (size, mass, angle) are intuitive to adjust, and the outcomes are clearly marked as pass/fail.
Scenarios use as many words as needed, no more [x] [ ] [ ] The mission briefing provides the necessary background and constraints concisely, without extraneous text.
Scenarios are sufficiently rich to drive the task [x] [ ] [ ] The scope is appropriately narrow (the physics and design of the reentry phase) yet rich enough to drive iterative experimentation and sense-making toward the engineering goal.
Evidence of quality for Criterion A: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion A:

None. The simulation naturally drives students to experiment and make sense of the results to achieve the engineering goal.

Criterion B. Tasks require sense-making using the three dimensions.

i. Completing the task requires students to use reasoning to sense-make about phenomena or problems.

Consider in what ways the task requires students to use reasoning to engage in sense-making and/or problem solving.

To meet the mission constraints, students must reason about how the design variables interact: they analyze how angle, mass, and capsule size affect Peak Temp and Max G-Force, interpret the trial data the simulation returns, and deduce which tradeoffs produce a design that survives reentry.

ii. The task requires students to demonstrate grade-appropriate dimensions:

Evidence of SEPs (which element[s], and how does the task require students to demonstrate this element in use?)

Using Mathematics and Computational Thinking: students use the provided computational simulation to model different design solutions, track the quantitative data it generates (Peak Temp, Max G-Force), and compare those results against the mission constraints.

Evidence of CCCs (which element[s], and how does the task require students to demonstrate this element in use?)

Systems and System Models: students treat the spacecraft and the atmosphere as a system model, recognize how changes to one component affect the behavior of the entire system, and evaluate the limitations of this simplified model compared to real-world reentry.

Evidence of DCIs (which element[s], and how does the task require students to demonstrate this element in use?)

ETS1.B (Developing Possible Solutions): students use the computer simulation to iteratively test and refine multiple design configurations against specific engineering constraints and then recommend the best solution.
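As a hypothetical illustration of this iterative test-and-refine cycle (reusing the toy reentry_outcome() sketched under Criterion A; the grid values and the “best design” rule are assumptions, not the task’s actual requirements):

```python
# Hypothetical sketch of the iterative design loop, reusing the toy reentry_outcome()
# from the Criterion A sketch above; grid values and selection rule are assumptions.
from itertools import product

def run_trials():
    """Test a grid of design configurations and recommend one that meets both constraints."""
    masses_kg = [600, 800, 1000]
    diameters_m = [1.5, 2.0, 2.5]
    angles_deg = [5, 10, 20, 30]
    trials = []
    for mass, dia, angle in product(masses_kg, diameters_m, angles_deg):
        temp_k, g, passed = reentry_outcome(mass, dia, angle)
        # Students would record every row in their data table, pass or fail.
        trials.append({"mass_kg": mass, "diameter_m": dia, "angle_deg": angle,
                       "peak_temp_k": round(temp_k), "peak_g": round(g, 1), "passed": passed})
    passing = [t for t in trials if t["passed"]]
    # One possible "best" rule: the passing design with the lowest peak G-force.
    best = min(passing, key=lambda t: t["peak_g"]) if passing else None
    return trials, best
```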

iii. The task requires students to integrate multiple dimensions in service of sense-making and/or problem-solving.

Consider in what ways the task requires students to use multiple dimensions together.

The task integrates the three dimensions: students use a computational model (SEP) to test possible solutions to an engineering problem (DCI) while analyzing the interactions and limits within the reentry system model (CCC).

iv. The task requires students to make their thinking visible.

Consider in what ways the task explicitly prompts students to make their thinking visible (surfaces current understanding, abilities, gaps, problematic ideas).

Student thinking is made visible through the data table they construct, their written explanations of the physics tradeoffs, and the final Mission Readiness Review memo.

Evidence of quality for Criterion B: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion B:

Ensure students actually record the data from multiple trials as requested.

Criterion C. Tasks are fair and equitable.

i. The task provides ways for students to make connections of local, global, or universal relevance.

Consider specific features of the task that enable students to make local, global, or universal connections to the phenomenon/problem and task at hand. Note: This criterion emphasizes ways for students to find meaning in the task; this does not mean “interest.” Consider whether the task is a meaningful, valuable endeavor that has real-world relevance–that some stakeholder group locally, globally, or universally would be invested in.

Safely returning a payload from orbit is a challenge of universal relevance: space exploration is a context students can recognize as mattering well beyond the classroom, and the mission briefing makes clear why a real stakeholder would be invested in a design that survives reentry.

ii. The task includes multiple modes for students to respond to the task.

Describe what modes (written, oral, video, simulation, direct observation, peer discussion, etc.) are expected/possible.

Students respond through several modes: they manipulate the interactive simulation directly, record quantitative results in a data table, write short explanations of the physics tradeoffs, and compose a final Mission Readiness Review memo.

iii. The task is accessible, appropriate, and cognitively demanding for all learners (including English learners or students working below/above grade level).

Features Yes Somewhat No Rationale
Task includes appropriate scaffolds [x] [ ] [ ] The task scaffolds learning using the 5E sequence, beginning with an Engage prompt and moving to structured Explore trials before requiring independent synthesis in Evaluate.
Tasks are coherent from a student perspective [x] [ ] [ ] The questions logically build on each other, moving from raw data collection to physics interpretation to recognizing model limitations.
Tasks respect and advantage students’ cultural and linguistic backgrounds [x] [ ] [ ] The task provides a universal scientific context (space exploration) that avoids regional or cultural bias, allowing all students to engage equally.
Tasks provide both low- and high-achieving students with an opportunity to show what they know [x] [ ] [ ] The open-ended nature of the optimization allows for diverse responses by accepting any variable combination that meets the mission constraints, allowing students to prioritize different design paths.
Tasks use accessible language [x] [ ] [ ] The language used is clear and straightforward, explaining necessary concepts like ‘kinetic energy’ and ‘G-forces’ directly in the Engage section without overly complex jargon.

iv. The task cultivates students’ interest in and confidence with science and engineering.

Consider how the task cultivates students’ interest in and confidence with science and engineering, including opportunities for students to see their own ideas reflected as a meaningful part of the task; make decisions about how to approach a task; engage in peer/self-reflection; and engage with tasks that matter to students.

The interactive nature of the simulation and the open-ended design challenge give students ownership over the solution, cultivating interest in and confidence with solving complex engineering problems. Because any variable combination that meets the constraints is accepted, students make genuine decisions about which design path to prioritize.

v. The task focuses on performances for which students’ learning experiences have prepared them (opportunity to learn considerations).

Consider the ways in which provided information about students’ prior learning (e.g., instructional materials, storylines, assumed instructional experiences) enables or prevents students’ engagement with the task and educator interpretation of student responses.

The task provides all necessary context in the mission briefing, so students have the opportunity to learn and succeed regardless of prior aerospace knowledge or specific prior instructional experiences.

vi. The task presents information that is scientifically accurate.

Describe evidence of scientific inaccuracies explicitly or implicitly promoted by the task.

None observed. The physics principles (conversion of kinetic energy to heat, deceleration) are accurate, and the model’s simplifications are explicitly acknowledged in the ‘Model Limitations’ section.

Evidence of quality for Criterion C: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion C:

None.

Criterion D. Tasks support their intended targets and purpose.

Before you begin:

  1. Describe what is being assessed. Include any targets provided, such as dimensions, elements, or PEs:

The task assesses HS-ETS1-4: using a computer simulation to model the impacts of proposed design solutions to a complex real-world problem. The targeted dimensions are the SEP of Using Mathematics and Computational Thinking, the DCI ETS1.B (Developing Possible Solutions), and the CCC of Systems and System Models. Students demonstrate the target by iteratively testing designs in the simulation and writing a Mission Readiness Review memo that justifies their design using data and explicitly identifies model limitations.

  2. What is the purpose of the assessment? (check all that apply)
    • [ ] Formative (including peer and self-reflection)
    • [ ] Summative
    • [ ] Determining whether students learned what they just experienced
    • [ ] Determining whether students can apply what they have learned to a similar but new context
    • [ ] Determining whether students can generalize their learning to a different context
    • [ ] Other (please specify): N/A

i. The task assesses what it is intended to assess and supports the purpose for which it is intended.

Consider the following:

  1. Is the assessment target necessary to successfully complete the task?

Yes. Students cannot complete the task without using the computational model to predict the effects of different design solutions on the reentry system and its constraints, which is precisely the performance the targeted standard requires.

  2. Are any ideas, practices, or experiences not targeted by the assessment necessary to respond to the task? Consider the impact this has on students’ ability to complete the task and interpretation of student responses.

The tasks provided directly assess the specific components of HS-ETS1-4, leaving no extraneous work or non-targeted ideas that do not contribute to evaluating the standard.

  3. Do the student responses elicited support the purpose of the task (e.g., if a task is intended to help teachers determine if students understand the distinction between cause and correlation, does the task support this inference)?

The task allows teachers to see partial understanding—for example, if a student collects good data but fails to identify model limitations, the teacher can isolate the misunderstanding.

ii. The task elicits artifacts from students as direct, observable evidence of how well students can use the targeted dimensions together to make sense of phenomena and design solutions to problems.

Consider what student artifacts are produced and how these provide students the opportunity to make visible their 1) sense-making processes, 2) thinking across all three dimensions, and 3) ability to use multiple dimensions together [note: these artifacts should connect back to the evidence described for Criterion B].

Students produce a completed data table, written explanations of the physics tradeoffs, responses to the ‘Elaborate’ questions about model limitations, and a final Mission Readiness Review memo that justifies the design using simulation data. The ‘Elaborate’ responses in particular provide direct insight into students’ understanding of system models, which can guide future instruction on complex systems.

iii. Supporting materials include clear answer keys, rubrics, and/or scoring guidelines that are connected to the three-dimensional target. They provide the necessary and sufficient guidance for interpreting student responses relative to the purpose of the assessment, all targeted dimensions, and the three-dimensional target.

Consider how well the materials support teachers and students in making sense of student responses and planning for follow up (grading, instructional moves), consistent with the purpose of and targets for the assessment. Consider in what ways rubrics include:

  1. Guidance for interpreting student thinking using an integrated approach, considering all three dimensions together as well as calling out specific supports for individual dimensions, if appropriate:

The inline rubric explicitly measures the student’s ability to use the computational model (SEP) to develop a solution (DCI) and identify system model limitations (CCC), fully addressing the three-dimensional target. The grading criteria also differentiate between a successful design (meets all constraints) and an unsuccessful one while assessing the quality of the student’s reasoning, which supports consistent evaluation of the memos.

  2. Support for interpreting a range of student responses, including those that might reflect partial scientific understanding or mask/misrepresent students’ actual science understanding (e.g., because of language barriers, lack of prompting or disconnect between the intent and student interpretation of the task, variety in communication approaches):

The inline rubric explicitly separates the three dimensions (SEP, DCI, CCC) into distinct criteria with ‘Proficient’ and ‘Developing’ indicators. This allows a teacher to pinpoint if a student is struggling with the mathematical analysis (SEP), the engineering design iteration (DCI), or the conceptual understanding of models (CCC).

  3. Ways to connect student responses to prior experiences and future planned instruction by teachers and participation by students:

The rubric provides clear examples of ‘Developing’ performance (e.g., failing to use specific data points, or struggling to balance constraints). A teacher can use this specific diagnostic information to guide future instruction—for example, doing a mini-lesson on data integration if the SEP dimension is lacking.

iv. The task’s prompts and directions provide sufficient guidance for the teacher to administer it effectively and for the students to complete it successfully while maintaining high levels of students’ analytical thinking as appropriate.

Consider any confusing prompts or directions, and evidence for too much or too little scaffolding/supports for students (relative to the target of the assessment—e.g., a task is intended to elicit student understanding of a DCI, but their response is so heavily scripted that it prevents students from actually showing their ability to apply the DCI).

The prompts are clearly structured with specific bulleted requirements, and the data table is pre-formatted, providing strong scaffolding without doing the thinking for the student; the directions are sufficient for teachers to administer the task and for students to complete it while maintaining high analytical demand.

Evidence of quality for Criterion D: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion D:

Provide clear point values for each section if used as a formal assessment.

Overall Summary

Consider the task purpose and the evidence you gathered for each criterion. Carefully consider the purpose and intended use of the task, your evidence, reasoning, and ratings to make a summary recommendation about using this task. While general guidance is provided below, it is important to remember that the intended use of the task plays a big role in determining whether the task is worth students’ and teachers’ time.

This task is rated Green. It provides a robust, interactive scenario that closely aligns with HS-ETS1-4. The screener confirms the task requires authentic use of a computer simulation to test design solutions and evaluate constraints while addressing all required dimensions.

Final recommendation (choose one): Green