Science Task Screener

Task Title: Electric & Magnetic Field Energy Task — Screener

Grade: 9-12

Date: 2026-04-22

Instructions

Criterion A. Tasks are driven by high-quality scenarios that are grounded in phenomena or problems.

i. Making sense of a phenomenon or addressing a problem is necessary to accomplish the task.

What was in the task, where was it, and why is this evidence?

  1. Is a phenomenon and/or problem present?

Evidence: The Engage section presents an explicit anchoring phenomenon: engineers notice that two magnets/charged objects sometimes require different amounts of work to separate depending on orientation and starting separation. This phenomenon drives student questions in the handout.

  2. Is information from the scenario necessary to respond successfully to the task?

Evidence: Yes. Students must manipulate the simulation and use its observed outputs (distance, force vectors, and the energy meter) to collect evidence and support claims; the scenario and simulation are central to data collection and model construction.

ii. The task scenario is engaging, relevant, and accessible to a wide range of students.

Features of engaging, relevant, and accessible tasks:

| Features of scenarios | Yes | Somewhat | No | Rationale |
| --- | --- | --- | --- | --- |
| Scenario presents real-world observations | [x] | [ ] | [ ] | Engineers’ challenge designing magnetic docking systems is an applied, observable context. |
| Scenarios are based around at least one specific instance, not a topic or generally observed occurrence | [x] | [ ] | [ ] | Focus is on two interacting objects with explicit orientations and separations. |
| Scenarios are presented as puzzling/intriguing | [x] | [ ] | [ ] | The variability in work required invites student inquiry. |
| Scenarios create a “need to know” | [x] | [ ] | [ ] | Designing functional docking systems creates authentic engineering constraints. |
| Scenarios are explainable using grade-appropriate SEPs, CCCs, DCIs | [x] | [ ] | [ ] | The task aligns to HS-PS3-5 and requires modeling and data interpretation. |
| Scenarios effectively use at least 2 modalities (e.g., images, diagrams, video, simulations, textual descriptions) | [x] | [ ] | [ ] | Simulation visuals plus student-generated tables/diagrams serve multiple modalities. |
| If data are used, scenarios present real/well-crafted data | [ ] | [x] | [ ] | Simulation outputs are qualitative/visual (px and bar width) rather than SI numeric data; teacher guidance makes these usable. |
| The local, global, or universal relevance of the scenario is made clear to students | [x] | [ ] | [ ] | Relevance is explicit via the engineering application example. |
| Scenarios are comprehensible to a wide range of students at grade-level | [x] | [ ] | [ ] | Language and steps are simple; teachers can scaffold percent estimation. |
| Scenarios use as many words as needed, no more | [x] | [ ] | [ ] | Instructions are concise and focused on investigation steps. |
| Scenarios are sufficiently rich to drive the task | [x] | [ ] | [ ] | Interaction types and variable magnitudes allow investigative depth. |

Evidence of quality for Criterion A: [ ] No [ ] Inadequate [ ] Adequate [x] Extensive

Suggestions for improvement of the task for Criterion A:

Criterion B. Tasks require sense-making using the three dimensions.

i. Completing the task requires students to use reasoning to sense-make about phenomena or problems.

Consider in what ways the task requires students to use reasoning to engage in sense-making and/or problem solving.

Evidence: Students must design an investigation, select magnitudes and separation points, estimate energy from a visual meter, compute averages, interpret trends, and construct a model linking forces to stored energy — all requiring reasoning beyond rote procedures.

ii. The task requires students to demonstrate grade-appropriate dimensions:

Evidence of SEPs (which element[s], and how does the task require students to demonstrate this element in use?)

Evidence: Developing and Using Models — students draw and annotate models showing object positions, force directions, and qualitative energy storage; Planning and Carrying Out Investigations — students select variables and measurement points; Analyzing and Interpreting Data — students compute averages and identify trends.

Evidence of CCCs (which element[s], and how does the task require students to demonstrate this element in use?)

Evidence: Cause and Effect — students reason about how changes in separation cause changes in force magnitude and stored field energy.

Evidence of DCIs (which element[s], and how does the task require students to demonstrate this element in use?)

Evidence: PS3.C Relationship Between Energy and Forces — students use the model and data to determine whether stored field energy increased, decreased, or remained the same, and relate this to work done.

iii. The task requires students to integrate multiple dimensions in service of sense-making and/or problem-solving.

Consider in what ways the task requires students to use multiple dimensions together.

Evidence: Students must use modeling (SEP) informed by observed data (analysis) and DCIs about energy/forces to make cause-and-effect claims.

iv. The task requires students to make their thinking visible.

Consider in what ways the task explicitly prompts students to make their thinking visible (surfaces current understanding, abilities, gaps, problematic ideas).

Evidence: Prompts ask for data tables, averaged values, model sketches, and written claims with evidence and reasoning, producing multiple observable artifacts.

Evidence of quality for Criterion B: [ ] No [ ] Inadequate [x] Adequate [ ] Extensive

Suggestions for improvement of the task for Criterion B:

Criterion C. Tasks are fair and equitable.

i. The task provides ways for students to make connections of local, global, or universal relevance.

Consider specific features of the task that enable students to make local, global, or universal connections to the phenomenon/problem and task at hand.

Evidence: The engineering docking example connects to local industry and technological applications; teachers can prompt students to identify local analogs.

ii. The task includes multiple modes for students to respond to the task.

Describe what modes (written, oral, video, simulation, direct observation, peer discussion, etc.) are expected/possible.

Evidence: Students can respond with written reports, diagrams, oral presentations, and simulation recordings (screen captures); multiple modes support diverse learners.

iii. The task is accessible, appropriate, and cognitively demanding for all learners (including English learners or students working below/above grade level).

| Features | Yes | Somewhat | No | Rationale |
| --- | --- | --- | --- | --- |
| Task includes appropriate scaffolds | [x] | [ ] | [ ] | The task offers suggested procedures, a sample data table, and extension options. |
| Tasks are coherent from a student perspective | [x] | [ ] | [ ] | Instructions are stepwise and concrete. |
| Tasks respect cultural/linguistic backgrounds | [x] | [ ] | [ ] | The engineering context is broadly accessible; teacher prompt options can increase relevance. |
| Tasks provide both low- and high-achieving students with an opportunity to show what they know | [x] | [ ] | [ ] | Extensions and scaffolded options allow differentiation. |
| Tasks use accessible language | [x] | [ ] | [ ] | Minimal jargon and explicit steps; teacher notes can further scaffold. |

iv. The task cultivates students’ interest in and confidence with science and engineering.

Evidence: Hands-on simulation manipulation and a real-world engineering framing support engagement and confidence-building when students succeed in modeling.

v. The task focuses on performances for which students’ learning experiences have prepared them (opportunity to learn considerations).

Evidence: The task assumes students have prior exposure to basic force/energy vocabulary and averaging calculations; teachers may need to provide a brief refresher.

vi. The task presents information that is scientifically accurate.

Evidence: The simulation qualitatively implements the PS3.C idea (field energy changes with position). Teachers should clarify that distance is reported in pixels and that energy is a normalized visual indicator rather than an SI quantity.

Evidence of quality for Criterion C: [ ] No [ ] Inadequate [x] Adequate [ ] Extensive

Suggestions for improvement of the task for Criterion C:

Criterion D. Tasks support their intended targets and purpose.

Before you begin:

  1. Describe what is being assessed. Include any targets provided, such as dimensions, elements, or PEs:

This task assesses students’ ability to develop and use a model of two objects interacting through electric or magnetic fields (HS-PS3-5), to collect and analyze simulation-based data (distance in px and estimated energy percent), and to construct evidence-based explanations linking forces and stored field energy.
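The student data analysis described above (averaging repeated energy-meter estimates at each separation and identifying the trend) can be sketched as follows. The separation values and percent readings below are hypothetical sample data for illustration, not values from the task itself.

```python
# Illustrative sketch of the expected student analysis: average repeated
# energy-meter estimates (percent) at each separation (px), then check
# whether average stored energy rises or falls as separation increases.
# All readings below are hypothetical, not drawn from the task.

readings = {
    # separation_px: repeated energy-percent estimates from the meter
    100: [62, 58, 60],
    200: [41, 39, 40],
    300: [22, 20, 21],
}

# Average the repeated trials at each separation.
averages = {sep: sum(vals) / len(vals) for sep, vals in readings.items()}

for sep in sorted(averages):
    print(f"{sep} px -> {averages[sep]:.1f}% (avg of {len(readings[sep])} trials)")

# Trend check: is average stored energy strictly decreasing with separation?
seps = sorted(averages)
decreasing = all(averages[a] > averages[b] for a, b in zip(seps, seps[1:]))
print("Energy decreases with separation:", decreasing)
```

Whether energy increases or decreases with separation depends on the interaction (attractive vs. repulsive) the students chose; the sketch only shows the averaging and trend-identification steps.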

  2. What is the purpose of the assessment? (check all that apply)
    • Formative (including peer and self-reflection)
    • Summative
    • Determining whether students learned what they just experienced
    • Determining whether students can apply what they have learned to a similar but new context
    • Determining whether students can generalize their learning to a different context

i. The task assesses what it is intended to assess and supports the purpose for which it is intended.

  1. Is the assessment target necessary to successfully complete the task?

Evidence: Yes — developing a model and interpreting simulation outputs are required to complete the investigation and support claims.

  2. Are any ideas, practices, or experiences not targeted by the assessment necessary to respond to the task?

Evidence: Minimal background in averaging and basic force/energy vocabulary is needed; the task avoids requiring calculus or advanced mathematics.

  3. Do the student responses elicited support the purpose of the task?

Evidence: Yes — tables, models, and written explanations provide observable artifacts to judge three-dimensional understanding.

ii. The task elicits artifacts from students as direct, observable evidence of how well students can use the targeted dimensions together.

Evidence: Student artifacts (data table, averaged values, model diagrams, and a written argument) make reasoning and integration visible.

iii. Supporting materials include clear answer keys, rubrics, and/or scoring guidelines that are connected to the three-dimensional target.

Evidence: A full 12-point rubric is provided below in this screener (Claim & Explanation, Data & Analysis, Model & Visuals, Reasoning & Connection to Evidence). The task includes a teacher checklist for quick grading and the screener contains detailed scoring anchors and model responses.

Example model responses (scoring exemplars)

Full 12-point rubric (scoring anchors)

Total: 12 points distributed as A (Claim & Explanation) = 4 pts, B (Data & Analysis) = 4 pts, C (Model & Visuals) = 2 pts, D (Reasoning & Connection to Evidence) = 2 pts.

Criterion A — Claim & Explanation (4 pts)

Criterion B — Data & Analysis (4 pts)

Criterion C — Model & Visuals (2 pts)

Criterion D — Reasoning & Connection to Evidence (2 pts)

Scoring bands (final rubric):

Suggestions for improvement (rubric items):

iv. The task’s prompts and directions provide sufficient guidance for the teacher to administer it effectively and for the students to complete it successfully while maintaining high levels of students’ analytical thinking as appropriate.

Evidence: Stepwise directions, suggested measurement points, and explicit deliverable options (written or diagrammatic) support consistent administration; teacher notes call out pixel/percent limitations.

Evidence of quality for Criterion D: [ ] No [ ] Inadequate [x] Adequate [ ] Extensive

Suggestions for improvement of the task for Criterion D:

Overall Summary

The task aligns to HS-PS3-5 and elicits evidence across SEPs, DCIs, and CCCs. With minor clarifications about units and an optional scaffold for percent estimation, the task is appropriate for both formative and summative use.

Final recommendation (choose one):