Science Task Screener
Task Title: Electric & Magnetic Field Energy Task — Screener
Grades: 9–12
Date: 2026-04-22
Instructions
- Before you begin: Complete the task as a student would. Then, consider any support materials provided to teachers or students, such as contextual information about the task and answer keys/scoring guidance.
- Using the Task Screener: Use this tool to evaluate tasks designed for three-dimensional standards. For each criterion, record your evidence for the presence or absence of the associated indicators. After you have decided to what degree the indicators are present within the task, revisit the purpose of your task and decide whether the evidence supports using it.
Criterion A. Tasks are driven by high-quality scenarios that are grounded in phenomena or problems.
i. Making sense of a phenomenon or addressing a problem is necessary to accomplish the task.
What was in the task, where was it, and why is this evidence?
- Is a phenomenon and/or problem present?
Evidence: The Engage section presents an explicit anchoring phenomenon: engineers notice that two magnets/charged objects sometimes require different amounts of work to separate depending on orientation and starting separation. This phenomenon drives student questions in the handout.
- Is information from the scenario necessary to respond successfully to the task?
Evidence: Yes. Students must manipulate the simulation and use its observed outputs (distance, force vectors, and the energy meter) to collect evidence and support claims; the scenario and simulation are central to data collection and model construction.
ii. The task scenario is engaging, relevant, and accessible to a wide range of students.
Features of engaging, relevant, and accessible tasks:
| Features of scenarios | Yes | Somewhat | No | Rationale |
|---|---|---|---|---|
| Scenario presents real-world observations | [x] | [ ] | [ ] | Engineers’ challenge designing magnetic docking systems is an applied, observable context. |
| Scenarios are based around at least one specific instance, not a topic or generally observed occurrence | [x] | [ ] | [ ] | Focus is on two interacting objects with explicit orientations and separations. |
| Scenarios are presented as puzzling/intriguing | [x] | [ ] | [ ] | The variability in work required invites student inquiry. |
| Scenarios create a “need to know” | [x] | [ ] | [ ] | Designing functional docking systems creates authentic engineering constraints. |
| Scenarios are explainable using grade-appropriate SEPs, CCCs, DCIs | [x] | [ ] | [ ] | The task aligns to HS-PS3-5 and requires modeling and data interpretation. |
| Scenarios effectively use at least 2 modalities (e.g., images, diagrams, video, simulations, textual descriptions) | [x] | [ ] | [ ] | Simulation visuals plus student-generated tables/diagrams serve multiple modalities. |
| If data are used, scenarios present real/well-crafted data | [ ] | [x] | [ ] | Simulation outputs are qualitative/visual (px and bar width) rather than SI numeric data; teacher guidance makes these usable. |
| The local, global, or universal relevance of the scenario is made clear to students | [x] | [ ] | [ ] | Relevance is explicit via engineering application example. |
| Scenarios are comprehensible to a wide range of students at grade-level | [x] | [ ] | [ ] | Language and steps are simple; teachers can scaffold percent estimation. |
| Scenarios use as many words as needed, no more | [x] | [ ] | [ ] | Instructions are concise and focused on investigation steps. |
| Scenarios are sufficiently rich to drive the task | [x] | [ ] | [ ] | Interaction types and variable magnitudes allow investigative depth. |
| Evidence of quality for Criterion A: [ ] No | [ ] Inadequate | [ ] Adequate | [x] Extensive |
Suggestions for improvement of the task for Criterion A:
- Make explicit in teacher notes that distance is reported in pixels and stored energy is visual; provide an optional conversion scaffold or pre-filled measurement table for classes needing stronger scaffolding.
Criterion B. Tasks require sense-making using the three dimensions.
i. Completing the task requires students to use reasoning to sense-make about phenomena or problems.
Consider in what ways the task requires students to use reasoning to engage in sense-making and/or problem solving.
Evidence: Students must design an investigation, select magnitudes and separation points, estimate energy from a visual meter, compute averages, interpret trends, and construct a model linking forces to stored energy — all requiring reasoning beyond rote procedures.
ii. The task requires students to demonstrate grade-appropriate dimensions:
Evidence of SEPs (which element[s], and how does the task require students to demonstrate this element in use?)
Evidence:
- Developing and Using Models: students draw and annotate models showing object positions, force directions, and qualitative energy storage.
- Planning and Carrying Out Investigations: students select variables and measurement points.
- Analyzing and Interpreting Data: students compute averages and identify trends.
Evidence of CCCs (which element[s], and how does the task require students to demonstrate this element in use?)
Evidence: Cause and Effect — students reason about how changes in separation cause changes in force magnitude and stored field energy.
Evidence of DCIs (which element[s], and how does the task require students to demonstrate this element in use?)
Evidence: PS3.C Relationship Between Energy and Forces — students use the model and data to determine whether stored field energy increased, decreased, or remained the same, and relate this to work done.
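The PS3.C trend the task targets can be sketched with Coulomb's law as a stand-in for the simulation's qualitative energy meter. This is purely illustrative (not part of the task materials); the charge magnitudes and separations are assumed values, and the simulation itself reports only pixels and a normalized bar.

```python
# Illustrative only: Coulomb potential energy U = k * q1 * q2 / r for two
# point charges, used to show the qualitative PS3.C trend students should
# recover from the simulation's energy meter.
K = 8.99e9  # Coulomb constant, N*m^2/C^2

def potential_energy(q1, q2, r):
    """Electric potential energy (J) of two point charges separated by r (m)."""
    return K * q1 * q2 / r

# Opposite charges (attractive): energy rises (toward zero) as separation
# grows, because an external agent does positive work pulling them apart.
energies = [potential_energy(5e-6, -5e-6, r) for r in (0.06, 0.10, 0.20)]
assert energies[0] < energies[1] < energies[2]
```

For a repulsive pair (like charges), the same function shows the opposite trend: stored energy decreases as separation grows, which is the contrast the task asks students to explain.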
iii. The task requires students to integrate multiple dimensions in service of sense-making and/or problem-solving.
Consider in what ways the task requires students to use multiple dimensions together.
Evidence: Students must use modeling (SEP) informed by observed data (analysis) and DCIs about energy/forces to make cause-and-effect claims.
iv. The task requires students to make their thinking visible.
Consider in what ways the task explicitly prompts students to make their thinking visible (surfaces current understanding, abilities, gaps, problematic ideas).
Evidence: Prompts ask for data tables, averaged values, model sketches, and written claims with evidence and reasoning, producing multiple observable artifacts.
| Evidence of quality for Criterion B: [ ] No | [ ] Inadequate | [x] Adequate | [ ] Extensive |
Suggestions for improvement of the task for Criterion B:
- Add a short rubric exemplar or teacher script for guiding students through the CER (Claim–Evidence–Reasoning) connection during discussion.
Criterion C. Tasks are fair and equitable.
i. The task provides ways for students to make connections of local, global, or universal relevance.
Consider specific features of the task that enable students to make local, global, or universal connections to the phenomenon/problem and task at hand.
Evidence: The engineering docking example connects to local industry and technological applications; teachers can prompt students to identify local analogs.
ii. The task includes multiple modes for students to respond to the task.
Describe what modes (written, oral, video, simulation, direct observation, peer discussion, etc.) are expected/possible.
Evidence: Students can respond with written reports, diagrams, oral presentations, and simulation recordings (screen captures); multiple modes support diverse learners.
iii. The task is accessible, appropriate, and cognitively demanding for all learners (including English learners or students working below/above grade level).
| Features | Yes | Somewhat | No | Rationale |
|---|---|---|---|---|
| Task includes appropriate scaffolds | [x] | [ ] | [ ] | The task offers suggested procedures, a sample data table, and extension options. |
| Tasks are coherent from a student perspective | [x] | [ ] | [ ] | Instructions are stepwise and concrete. |
| Tasks respect cultural/linguistic backgrounds | [x] | [ ] | [ ] | The engineering context is broadly accessible; teacher prompt options can increase relevance. |
| Tasks provide both low- and high-achieving students with an opportunity to show what they know | [x] | [ ] | [ ] | Extensions and scaffolded options allow differentiation. |
| Tasks use accessible language | [x] | [ ] | [ ] | Minimal jargon and explicit steps; teacher notes can further scaffold. |
iv. The task cultivates students’ interest in and confidence with science and engineering.
Evidence: Hands-on simulation manipulation and a real-world engineering framing support engagement and confidence-building when students succeed in modeling.
v. The task focuses on performances for which students’ learning experiences have prepared them (opportunity to learn considerations).
Evidence: The task assumes students have prior exposure to basic force/energy vocabulary and averaging calculations; teachers may need to provide a brief refresher.
vi. The task presents information that is scientifically accurate.
Evidence: The simulation qualitatively implements the PS3.C idea (field energy changes with position). Teachers should clarify that distance is reported in pixels and that energy is a normalized visual indicator rather than an SI quantity.
| Evidence of quality for Criterion C: [ ] No | [ ] Inadequate | [x] Adequate | [ ] Extensive |
Suggestions for improvement of the task for Criterion C:
- Provide an alternate data-collection scaffold that maps pixel distances to teacher-defined approximate SI distances for classes requiring numeric units.
Criterion D. Tasks support their intended targets and purpose.
Before you begin:
- Describe what is being assessed. Include any targets provided, such as dimensions, elements, or PEs:
This task assesses students’ ability to develop and use a model of two objects interacting through electric or magnetic fields (HS-PS3-5), to collect and analyze simulation-based data (distance in px and estimated energy percent), and to construct evidence-based explanations linking forces and stored field energy.
- What is the purpose of the assessment? (check all that apply)
- [x] Formative (including peer and self-reflection)
- [x] Summative
- [ ] Determining whether students learned what they just experienced
- [ ] Determining whether students can apply what they have learned to a similar but new context
- [ ] Determining whether students can generalize their learning to a different context
i. The task assesses what it is intended to assess and supports the purpose for which it is intended.
- Is the assessment target necessary to successfully complete the task?
Evidence: Yes — developing a model and interpreting simulation outputs are required to complete the investigation and support claims.
- Are any ideas, practices, or experiences not targeted by the assessment necessary to respond to the task?
Evidence: Minimal background in averaging and basic force/energy vocabulary is needed; the task avoids requiring calculus or advanced mathematics.
- Do the student responses elicited support the purpose of the task?
Evidence: Yes — tables, models, and written explanations provide observable artifacts to judge three-dimensional understanding.
ii. The task elicits artifacts from students as direct, observable evidence of how well students can use the targeted dimensions together.
Evidence: Student artifacts (data table, averaged values, model diagrams, and a written argument) make reasoning and integration visible.
iii. Supporting materials include clear answer keys, rubrics, and/or scoring guidelines that are connected to the three-dimensional target.
Evidence: A full 12-point rubric is provided below in this screener (Claim & Explanation, Data & Analysis, Model & Visuals, Reasoning & Connection to Evidence). The task includes a teacher checklist for quick grading, and this screener contains detailed scoring anchors and a model response.
Example model response (scoring exemplar)
- High (11/12): Student chooses Electric (Opposite Charges) with q₁ = q₂ = 5. Shows averaged energies that increase with separation (example averaged values: 60px → 15%, 100px → 40%, 200px → 75%). Claim: “Stored field energy is higher when the charges are pulled apart for attractive interactions; as distance increases, energy stored increases because work is done against the attractive force.” Model diagram labels forces pointing toward each other and annotates increasing stored energy with larger separations.
Full 12-point rubric (scoring anchors)
Total: 12 points distributed as A (Claim & Explanation) = 4 pts, B (Data & Analysis) = 4 pts, C (Model & Visuals) = 2 pts, D (Reasoning & Connection to Evidence) = 2 pts.
Rubric Component A — Claim & Explanation (4 pts)
- 4 pts: Clear, scientifically accurate claim that directly answers how stored field energy changes with separation for the chosen interaction; explanation links the claim to measured evidence and describes the sign/direction of work (e.g., work done against an attractive field increases stored energy).
- 3 pts: Claim is correct and supported by evidence but explanation omits one linkage (e.g., mentions work qualitatively but lacks an explicit connection to specific data points).
- 2 pts: Claim present but weakly supported by evidence or contains partial misconceptions.
- 0–1 pts: Claim missing or incorrect.
Rubric Component B — Data & Analysis (4 pts)
- 4 pts: Data are collected for the requested separations, averages computed correctly, and analysis correctly identifies the trend (increase/decrease/no change) with clear use of at least three data points.
- 3 pts: Data largely complete and averaged correctly but analysis misses one aspect (e.g., insufficient explicit linking of which data support the claim).
- 2 pts: Data present but incomplete or computed incorrectly; trend only weakly established.
- 0–1 pts: Little or no usable data provided.
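The Data & Analysis expectation (average repeated readings per separation, then identify the trend across at least three points) can be mirrored in a short sketch. The helper names and the trial readings are hypothetical; only the averaged values (60px → 15%, 100px → 40%, 200px → 75%) come from the scoring exemplar above.

```python
# Hypothetical helper mirroring the Data & Analysis rubric expectation:
# average repeated energy-meter readings per separation, then classify
# the overall trend across the measured separations.
def average(readings):
    return sum(readings) / len(readings)

def trend(avg_by_separation):
    """avg_by_separation: list of (separation_px, avg_energy_pct), sorted by px."""
    values = [v for _, v in avg_by_separation]
    if all(a < b for a, b in zip(values, values[1:])):
        return "increase"
    if all(a > b for a, b in zip(values, values[1:])):
        return "decrease"
    return "no clear trend"

# Assumed trial readings averaging to the exemplar's values:
data = [(60, average([14, 16])), (100, average([39, 41])), (200, average([74, 76]))]
# trend(data) -> "increase"
```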
Rubric Component C — Model & Visuals (2 pts)
- 2 pts: A labeled model/sketch showing object positions, force directions, and annotated notes about how stored energy changes with separation; diagram accurately reflects the chosen interaction.
- 1 pt: Model present but missing labels or contains small inaccuracies.
- 0 pts: No model or entirely incorrect model.
Rubric Component D — Reasoning & Connection to Evidence (2 pts)
- 2 pts: Explicit reasoning connects the data/analysis and model to the claim, using correct physics language (work, field energy) and explaining why the trend follows from the interaction type.
- 1 pt: Partial reasoning that attempts to connect evidence to claim but leaves gaps.
- 0 pts: No reasoning or reasoning contradicts the evidence.
Scoring bands (final rubric):
- 10–12: High — Student demonstrates integrated three-dimensional understanding with clear claim, accurate data analysis, sound model, and strong reasoning.
- 7–9: Medium — Student shows adequate evidence and partial integration; one or two rubric elements are incomplete or partially incorrect.
- 4–6: Low — Student has some relevant elements but lacks consistent data, explanation, or accurate model.
- 0–3: No/Minimal — Insufficient or incorrect evidence to support the target performance.
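For calibration across graders, the point distribution (A = 4, B = 4, C = 2, D = 2) and the scoring bands above can be captured in a small scorer. The function name is an assumption for illustration; the cutoffs are exactly the bands listed.

```python
# Illustrative scorer for the 12-point rubric: component points sum to a
# total, which maps to the screener's scoring bands (10-12 High, 7-9 Medium,
# 4-6 Low, 0-3 No/Minimal).
def score_band(a, b, c, d):
    assert 0 <= a <= 4 and 0 <= b <= 4 and 0 <= c <= 2 and 0 <= d <= 2
    total = a + b + c + d
    if total >= 10:
        return total, "High"
    if total >= 7:
        return total, "Medium"
    if total >= 4:
        return total, "Low"
    return total, "No/Minimal"

# The exemplar response above (11/12) falls in the High band:
# score_band(4, 4, 2, 1) -> (11, "High")
```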
Suggestions for improvement (rubric items):
- Provide a short teacher-facing scoring guide with exemplar student responses at 12, 8, and 4 points to calibrate scoring across graders. Include explicit mapping of observed data ranges (approximate energy-percent patterns) to expected qualitative claims for attractive vs. repulsive interactions.
iv. The task’s prompts and directions provide sufficient guidance for the teacher to administer it effectively and for the students to complete it successfully while maintaining high levels of students’ analytical thinking as appropriate.
Evidence: Stepwise directions, suggested measurement points, and explicit deliverable options (written or diagrammatic) support consistent administration; teacher notes call out pixel/percent limitations.
| Evidence of quality for Criterion D: [ ] No | [ ] Inadequate | [x] Adequate | [ ] Extensive |
Suggestions for improvement of the task for Criterion D:
- Add a pre-filled example table and a short teacher script for administering the task under time limits.
Overall Summary
The task aligns to HS-PS3-5 and elicits evidence across SEPs, DCIs, and CCCs. With minor clarifications about units and an optional scaffold for percent estimation, the task is appropriate for both formative and summative use.
Final recommendation (choose one):
- [ ] Use this task (all criteria had at least an “adequate” rating)
- [ ] Modify and use this task
- [ ] Do not use this task