Science Task Screener

Task Title: Electromagnetism & Induction Task

Grade: High School

Date: 2026-04-24

Instructions

Criterion A. Tasks are driven by high-quality scenarios that are grounded in phenomena or problems.

i. Making sense of a phenomenon or addressing a problem is necessary to accomplish the task.

What was in the task, where was it, and why is this evidence?

  1. Is a phenomenon and/or problem present?

The task begins with an engaging anchoring phenomenon in ‘Part 1: Engage’ about a compass reacting wildly during a lightning strike.

  2. Is information from the scenario necessary to respond successfully to the task?

Yes, the task asks students to generate questions based on this scenario, which leads them into the simulation to collect data about the relationship between electricity and magnetism.

ii. The task scenario is engaging, relevant, and accessible to a wide range of students.

Features of engaging, relevant, and accessible tasks:

Features of scenarios Yes Somewhat No Rationale
Scenario presents real-world observations [ ] [ ] [ ] A compass responding to lightning is an observable, real-world event.
Scenarios are based around at least one specific instance, not a topic or generally observed occurrence [ ] [ ] [ ] It is a specific instance (being lost in the woods, experiencing a nearby lightning strike).
Scenarios are presented as puzzling/intriguing [ ] [ ] [ ] It explicitly poses a puzzling question.
Scenarios create a “need to know” [ ] [ ] [ ] Students must figure out the causal relationship.
Scenarios are explainable using grade-appropriate SEPs, CCCs, DCIs [ ] [ ] [ ] The scenario directly maps to the DCI, SEP, and CCC.
Scenarios effectively use at least 2 modalities (e.g., images, diagrams, video, simulations, textual descriptions) [ ] [ ] [ ] The task uses a textual description and a simulation.
If data are used, scenarios present real/well-crafted data [ ] [ ] [ ] The simulation generates structured, physically accurate data.
The local, global, or universal relevance of the scenario is made clear to students [ ] [ ] [ ] The lightning-and-compass scenario could happen anywhere, giving it universal relevance.
Scenarios are comprehensible to a wide range of students at grade-level [ ] [ ] [ ] The task uses clear language and visual simulation.
Scenarios use as many words as needed, no more [ ] [ ] [ ] The textual description is concise; the simulation carries much of the detail.
Scenarios are sufficiently rich to drive the task [ ] [ ] [ ] The scenario generates the questions that drive the full investigation and the final argument.
Evidence of quality for Criterion A: [ ] No [ ] Inadequate [ ] Adequate [ ] Extensive

Suggestions for improvement of the task for Criterion A:

None

Criterion B. Tasks require sense-making using the three dimensions.

i. Completing the task requires students to use reasoning to sense-make about phenomena or problems.

Consider in what ways the task requires students to use reasoning to engage in sense-making and/or problem solving.

The task requires students to construct an evidence-based argument synthesizing their findings to evaluate a specific claim.

ii. The task requires students to demonstrate grade-appropriate dimensions:

Evidence of SEPs (which element[s], and how does the task require students to demonstrate this element in use?)

Planning and Carrying Out Investigations: students must investigate how voltage and the number of wire loops affect a magnetic field, and how a moving magnet induces a current.
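The relationships students are asked to investigate follow standard textbook formulas (field strength scales with current and loop density; induced current scales with the rate of flux change). A minimal Python sketch for reference; all parameter values are illustrative assumptions, not taken from the task or its simulation:

```python
from math import pi

MU0 = 4 * pi * 1e-7  # vacuum permeability, T*m/A

def solenoid_field_mT(turns_per_m, current_A):
    """Field inside an ideal solenoid: B = mu0 * n * I, reported in mT."""
    return MU0 * turns_per_m * current_A * 1e3

def induced_current_A(n_loops, dflux_Wb, dt_s, resistance_ohm):
    """Faraday's law, EMF = -N * dPhi/dt (finite-difference form),
    then Ohm's law for the induced current."""
    emf = -n_loops * dflux_Wb / dt_s
    return emf / resistance_ohm

# Doubling the current doubles the field: the linear relationship
# students should see in their data tables (values are illustrative).
b1 = solenoid_field_mT(1000, 1.0)   # ~1.26 mT
b2 = solenoid_field_mT(1000, 2.0)   # ~2.51 mT

# A magnet moved faster (same flux change over less time) induces
# a proportionally larger current.
i_slow = induced_current_A(50, 2e-4, 0.5, 10.0)
i_fast = induced_current_A(50, 2e-4, 0.1, 10.0)
```

This also shows why magnetic-field readings in the millitesla range are plausible for a classroom-scale simulation.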

Evidence of CCCs (which element[s], and how does the task require students to demonstrate this element in use?)

Cause and Effect: the final argumentation prompt explicitly requires students to use their collected empirical evidence to explain the causal mechanism linking electricity and magnetism.

Evidence of DCIs (which element[s], and how does the task require students to demonstrate this element in use?)

PS2.B (Types of Interactions): prompts require students to explain that an electric current produces a magnetic field and that a changing magnetic field induces a current, supported by the data they record in tables.

iii. The task requires students to integrate multiple dimensions in service of sense-making and/or problem-solving.

Consider in what ways the task requires students to use multiple dimensions together.

The final argument requires students to integrate the dimensions: they apply the DCI (the current–magnetic field relationship) through the SEP of argumentation, framed by the CCC of cause and effect, to explain the compass's behavior.

iv. The task requires students to make their thinking visible.

Consider in what ways the task explicitly prompts students to make their thinking visible (surfaces current understanding, abilities, gaps, problematic ideas).

Students respond via data tables, short answers, and a final written argument.

Evidence of quality for Criterion B: [ ] No [ ] Inadequate [ ] Adequate [ ] Extensive

Suggestions for improvement of the task for Criterion B:

None

Criterion C. Tasks are fair and equitable.

i. The task provides ways for students to make connections of local, global, or universal relevance.

Consider specific features of the task that enable students to make local, global, or universal connections to the phenomenon/problem and task at hand. Note: This criterion emphasizes ways for students to find meaning in the task; this does not mean “interest.” Consider whether the task is a meaningful, valuable endeavor with real-world relevance, one that some stakeholder group locally, globally, or universally would be invested in.

The lightning-and-compass scenario is universally relevant: it could happen anywhere, and the relationship between electricity and magnetism it surfaces matters well beyond the classroom.

ii. The task includes multiple modes for students to respond to the task.

Describe what modes (written, oral, video, simulation, direct observation, peer discussion, etc.) are expected/possible.

Students respond through simulation interaction, completed data tables, short written answers, and a final written argument.

iii. The task is accessible, appropriate, and cognitively demanding for all learners (including English learners or students working below/above grade level).

Features Yes Somewhat No Rationale
Task includes appropriate scaffolds [ ] [ ] [ ] The structure allows for varying levels of detail in responses.
Tasks are coherent from a student perspective [ ] [ ] [ ] The step-by-step investigation flows logically.
Tasks respect and advantage students’ cultural and linguistic backgrounds [ ] [ ] [ ] The scenario uses universal concepts (lightning, compasses).
Tasks provide both low- and high-achieving students with an opportunity to show what they know [ ] [ ] [ ] The simulation allows exploration while the argument requires synthesis.
Tasks use accessible language [ ] [ ] [ ] The instructions avoid overly complex vocabulary.

iv. The task cultivates students’ interest in and confidence with science and engineering.

Consider how the task cultivates students’ interest in and confidence with science and engineering, including opportunities for students to reflect their own ideas as a meaningful part of the task; make decisions about how to approach a task; engage in peer/self-reflection; and engage with tasks that matter to students.

The task lets students independently explore the simulation and decide which variables to test, fostering interest and confidence through discovery.

v. The task focuses on performances for which students’ learning experiences have prepared them (opportunity to learn considerations).

Consider the ways in which provided information about students’ prior learning (e.g., instructional materials, storylines, assumed instructional experiences) enables or prevents students’ engagement with the task and educator interpretation of student responses.

The task assumes only basic familiarity with electricity and magnetism and builds the more complex relationships through guided inquiry, so students with standard prior instruction should be able to engage.

vi. The task presents information that is scientifically accurate.

Describe evidence of scientific inaccuracies explicitly or implicitly promoted by the task.

No inaccuracies were found. The simulation generates physically accurate data (magnetic field in mT, induced current in A) consistent with standard physics principles.

Evidence of quality for Criterion C: [ ] No [ ] Inadequate [ ] Adequate [ ] Extensive

Suggestions for improvement of the task for Criterion C:

None

Criterion D. Tasks support their intended targets and purpose.

Before you begin:

  1. Describe what is being assessed. Include any targets provided, such as dimensions, elements, or PEs:

This task assesses understanding of HS-PS2-5.

  2. What is the purpose of the assessment? (check all that apply)
    • [ ] Formative (including peer and self-reflection)
    • [ ] Summative
    • [ ] Determining whether students learned what they just experienced
    • [ ] Determining whether students can apply what they have learned to a similar but new context
    • [ ] Determining whether students can generalize their learning to a different context
    • [ ] Other (please specify): N/A

i. The task assesses what it is intended to assess and supports the purpose for which it is intended.

Consider the following:

  1. Is the assessment target necessary to successfully complete the task?

Yes. The student handout requires students to record data, write explanations, and construct a final argument, none of which can be completed without the targeted understanding.

  2. Are any ideas, practices, or experiences not targeted by the assessment necessary to respond to the task? Consider the impact this has on students’ ability to complete the task and interpretation of student responses.

No substantial untargeted demands were identified; the scenario and simulation provide the information students need beyond the targeted dimensions.

  3. Do the student responses elicited support the purpose of the task (e.g., if a task is intended to help teachers determine if students understand the distinction between cause and correlation, does the task support this inference)?

Yes. The data tables, explanations, and final argument give teachers direct evidence of whether students understand the targeted relationship between electricity and magnetism.

ii. The task elicits artifacts from students as direct, observable evidence of how well students can use the targeted dimensions together to make sense of phenomena and design solutions to problems.

Consider what student artifacts are produced and how these provide students the opportunity to make visible their 1) sense-making processes, 2) thinking across all three dimensions, and 3) ability to use multiple dimensions together [note: these artifacts should connect back to the evidence described for Criterion B].

Artifacts include completed data tables, short written explanations, and a final evidence-based argument; together they make visible students’ sense-making and their use of the targeted DCI, SEP, and CCC in combination.

iii. Supporting materials include clear answer keys, rubrics, and/or scoring guidelines that are connected to the three-dimensional target. They provide the necessary and sufficient guidance for interpreting student responses relative to the purpose of the assessment, all targeted dimensions, and the three-dimensional target.

Consider how well the materials support teachers and students in making sense of student responses and planning for follow up (grading, instructional moves), consistent with the purpose of and targets for the assessment. Consider in what ways rubrics include:

  1. Guidance for interpreting student thinking using an integrated approach, considering all three dimensions together as well as calling out specific supports for individual dimensions, if appropriate:

The teacher notes provide alignment to the NGSS dimensions and explain how student responses demonstrate the evidence statements.

  2. Support for interpreting a range of student responses, including those that might reflect partial scientific understanding or mask/misrepresent students’ actual science understanding (e.g., because of language barriers, lack of prompting or disconnect between the intent and student interpretation of the task, variety in communication approaches):

The 5E sequence and clearly stepped prompts scaffold the investigation so that a range of students can produce interpretable responses.

  3. Ways to connect student responses to prior experiences and future planned instruction by teachers and participation by students:

The 5E structure situates the task within an instructional sequence, giving teachers a basis for connecting student responses to prior learning and for planning follow-up instruction.

iv. The task’s prompts and directions provide sufficient guidance for the teacher to administer it effectively and for the students to complete it successfully while maintaining high levels of students’ analytical thinking as appropriate.

Consider any confusing prompts or directions, and evidence for too much or too little scaffolding/supports for students (relative to the target of the assessment—e.g., a task is intended to elicit student understanding of a DCI, but their response is so heavily scripted that it prevents students from actually showing their ability to apply the DCI).

The task provides clear prompts and directions, with rubrics aligned to the targeted DCI and SEP; the scaffolding structures the investigation without scripting students’ explanations.

Evidence of quality for Criterion D: [ ] No [ ] Inadequate [ ] Adequate [ ] Extensive

Suggestions for improvement of the task for Criterion D:

None

Overall Summary

Consider the task purpose and the evidence you gathered for each criterion. Carefully consider the purpose and intended use of the task, your evidence, reasoning, and ratings to make a summary recommendation about using this task. While general guidance is provided below, it is important to remember that the intended use of the task plays a big role in determining whether the task is worth students’ and teachers’ time.

The task is comprehensive, accurate, and ready for use.

Final recommendation (choose one):