February 1, 2025 | Vol. 82, No. 5

How to Unpack a Learning Standard Using ChatGPT

Teachers can use AI to generate assessment criteria, rubrics, hands-on activities, and more.


Technology
My dad remembers exactly where he was on July 20, 1969. Adjusting the rabbit-ear antenna atop an old black-and-white television, he marveled at the grainy images of Neil Armstrong setting foot on the moon. I’ll always remember where I was on February 10, 2023—at an educational technology meeting—when my friend and colleague Matt asked me whether I’d ever used ChatGPT. 
I had no idea what he was referring to. I slid my chair over, and he proceeded to type in a few prompts on his computer. He explained, “Google finds things that already exist, whereas ChatGPT makes it up as we go along—it’s generative.” He asked it to solve a math equation, suggest a dinner plan and include a shopping list, write computer code, and compose a song about flying kites atop Mount Everest. In mere seconds, it generated each of these things. I immediately wondered how this game-changing technology would disrupt education. 
Generative artificial intelligence (GenAI) is a disrupter. It’s here to stay, and embracing it can improve instruction, learning, and assessment. In the months that followed that meeting, I took a deep dive into GenAI. In both my district role as vice principal for communicating student learning and as an educational speaker and consultant, I designed presentations on integrating GenAI into teaching and assessment practices for educators, many of whom were unfamiliar with GenAI. Although the teachers were amazed at GenAI’s brainstorming potential, many struggled with how to blend it into their practice. 

I focused on learning standards—the building blocks of lesson planning and curriculum design—and have since developed an eight-step process that educators can use to unpack a learning standard with ChatGPT (or a similar chatbot). This process:
  • Reflects a purpose-driven approach by including clear assessment criteria and rubrics.
  • Combines effective educational practices with new technology.
  • Affirms the important role of the educator.
  • Works with a variety of tools, such as ChatGPT, Copilot, Gemini, or any other large language model.
  • Saves time and effort.
  • Enables teachers to learn about GenAI along with their students.
As we dive into the example below, notice the conversational approach to the GenAI prompts: We converse more than search. Technology experts argue that you get better results by taking a more human approach, by articulating what’s most important for you and why, and even saying “please” and “thank you” (Wood, 2024). The example I provide documents the process for unpacking a learning standard that I used in early 2024; it follows an eight-step sequence (see fig. 1), with each prompt building on the previous one. Since developing this process, I’ve witnessed many teachers effectively use it to guide the infusion of GenAI into their classrooms and practice.
Figure 1. How to Unpack a Learning Standard Using ChatGPT
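For teachers who would rather script this back-and-forth than type it into the chat window, the sketch below shows the underlying pattern: each new prompt is appended to a running message history that is resent in full, which is what lets every step build on the last. It uses the OpenAI Python SDK; the model name and the way the prompts are abbreviated are illustrative assumptions, not part of the process described in this article.

```python
# A minimal sketch of the conversational, context-carrying pattern described
# above, using the OpenAI Python SDK. The model name and shortened prompts
# are illustrative placeholders, not the article's exact wording.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

messages = []  # the running conversation; resent in full with every request


def ask(prompt: str) -> str:
    """Append the next prompt, send the whole history, and store the reply."""
    messages.append({"role": "user", "content": prompt})
    response = client.chat.completions.create(model="gpt-4o", messages=messages)
    reply = response.choices[0].message.content
    messages.append({"role": "assistant", "content": reply})
    return reply


# Step 2: confirm the model understands the standard.
print(ask("I'm a 9th grade science teacher, and I want to explore the "
          "following topic with my students: Investigate the characteristic "
          "properties of metals, nonmetals, and metalloids, and classify "
          "elements according to these properties. Do you know what this "
          "all means?"))

# Step 3: because the history is resent, there is no need to restate the standard.
print(ask("Let's start with a demonstration. Can you suggest three "
          "interesting activities that will hook my students?"))
```

Whether in code or in the chat window, the point is the same: the conversation, not any single prompt, is the unit of work.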

Step 1: Select a learning standard

Working with a group of high school science teachers during a professional development session, we chose to examine this learning standard:  
Investigate the characteristic properties of metals, nonmetals, and metalloids, and classify elements according to these properties. (Example: conductivity of heat and electricity.)
Remember that a learning standard is essentially the combination of a verb and a noun (Dueck, 2021). In this example, the verb is investigate and the noun, or content, consists of metals, nonmetals, metalloids, and conductivity.

Step 2: Confirm AI accuracy and credibility

While working collaboratively with the teachers, I demonstrated the process by starting with a simple, conversational, and welcoming prompt that included the exact words of the standard:
I’m a 9th grade science teacher, and I want to explore the following topic with my students: Investigate the characteristic properties of metals, nonmetals, and metalloids, and classify elements according to these properties. Do you know what this all means?
Moments after I pressed the “enter” key, I got a brief introduction and several bullet points under each of the key terms. For instance, under the heading “metals,” ChatGPT noted that metals are:
  • Generally shiny or metallic in lustre 
  • Good conductors of heat and electricity 
  • Malleable (can be hammered into thin sheets) and ductile (can be stretched into wire)
  • Elements with high melting and boiling points
  • Typically solid at room temperature (with the exception of mercury)
After reading this information, the science teachers gave a thumbs up for accuracy. This is just one of several steps in the sequence where teachers’ knowledge and subject matter expertise are crucial. Nearly every GenAI tool informs the user that the output may not be accurate. For instance, beneath its search bar, ChatGPT notes, “ChatGPT can make mistakes. Check important info” (OpenAI, 2024). If the AI prompt does not elicit a correct response, it’s imperative to follow up with a more precise prompt or prompts.

Step 3: Develop an introductory activity, lab, or quest

With this step, we began to harness the incredible brainstorming power of GenAI, while, as you’ll see, never relinquishing the valuable role of the teacher. Because this was a continuous conversation, we didn’t need to repeat the learning standard in subsequent prompts. We next used GenAI to suggest a hook or an interesting activity to kick off the unit:
Let’s start with a demonstration. Can you suggest three interesting activities that will hook my students?
ChatGPT immediately offered three activities that would “illustrate the properties of metals and their ability to conduct electricity.” One was titled, “Copper vs. Aluminum Wire Conduction Race” and included a list of materials and a step-by-step procedure for the demonstration. The activity was designed to be “visually appealing” because students would see the lightbulb illuminate faster with copper than with aluminum. ChatGPT also provided an overall explanation of the activity and discussion prompts for the class. 
To the untrained eye, this demonstration looked fantastic. But once again, we encountered the importance of educator experience and knowledge. When I shared the demonstration with a long-time colleague who teaches physics, he immediately said, “That won’t work!” He welcomed me to his lab to illustrate his concern. He hooked up the two different metals to a power source, and, sure enough, the light lit up immediately, regardless of which metal strip we used. Clearly this activity wasn’t “visually appealing” because it was impossible for the human eye to detect any difference in illumination. Instead of a conduction race between copper and aluminum, we redesigned the demonstration to use a multimeter to “see” the difference, and we replaced aluminum with nichrome, which is higher in resistance and thus a poorer conductor of electricity.
This underscored two important truths: (1) GenAI could certainly help us brainstorm interesting activities; in fact, the conduction race could be a neat idea, provided the correct metals are used. And (2) we needed to keep our hands on the wheel to be sure the activity was effective and doable in the classroom. With just a couple of changes, we created a better and more scientifically accurate classroom demonstration.
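A rough calculation (mine, not the teachers’) shows why the substitution matters. For a wire of length L and cross-sectional area A, resistance is R = ρL/A; the sketch below compares copper, aluminum, and nichrome for an assumed one-meter length of 1 mm-diameter wire.

```python
# Back-of-the-envelope comparison of wire resistance, R = rho * L / A.
# Wire dimensions are assumed for illustration; resistivities are
# approximate room-temperature values.
import math

RESISTIVITY_OHM_M = {
    "copper":   1.68e-8,
    "aluminum": 2.82e-8,
    "nichrome": 1.10e-6,
}

LENGTH_M = 1.0        # assumed wire length
DIAMETER_M = 1.0e-3   # assumed wire diameter (1 mm)
AREA_M2 = math.pi * (DIAMETER_M / 2) ** 2

for metal, rho in RESISTIVITY_OHM_M.items():
    resistance_ohms = rho * LENGTH_M / AREA_M2
    print(f"{metal:9s} {resistance_ohms:7.3f} ohms")

# copper ~0.021 ohms, aluminum ~0.036 ohms, nichrome ~1.400 ohms:
# copper and aluminum differ by hundredths of an ohm (invisible in a bulb's
# brightness), while nichrome is dozens of times more resistive than copper,
# a gap a multimeter displays clearly.
```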
Armed with this improved version of the demonstration, I then asked ChatGPT the following:
Thanks, now could you design a lab for my students to investigate the conductivity of metals? Please use nichrome wire instead of aluminum and incorporate a multimeter to measure conductivity.
ChatGPT immediately produced an effective lab plan consisting of a title, a list of needed materials, a procedure, follow-up questions, and opportunities to extend the learning.

Step 4: Develop specific success criteria

The follow-up questions in the lab plan that ChatGPT generated included: 
  • What factors contribute to the conductivity of a material?
  • How do the conductivity results align with the hypothesis?
  • Why are certain materials, such as metals, better conductors than other materials?
Teachers can use questions like these in the next step of the sequence, as I did in my subsequent prompt:
I like the three follow-up questions. I’ll give them as an assignment to my students. Can I get some specific assessment criteria?
What ChatGPT was able to do next has the potential to catapult teacher competence in not only unpacking a learning standard, but also assessing student understanding of that standard. After I pressed “enter,” ChatGPT gave me specific criteria under the following headings:
  • Scientific understanding
  • Data analysis and interpretation
  • Critical thinking and application
For example, the assessment criteria for scientific understanding were as follows:
  • Demonstrates a clear understanding of the factors influencing conductivity.
  • Provides accurate explanations of the observed results in relation to the properties of materials.
  • Incorporates relevant scientific concepts and terminology in the explanation.
In addition, ChatGPT suggested a four-level scoring system:
  • Advanced (4): Thorough and insightful understanding demonstrated with detailed and accurate explanations
  • Proficient (3): Clear understanding with accurate explanations 
  • Basic (2): Adequate understanding with some inaccuracies or lack of detail 
  • Below Basic (1): Limited understanding with significant inaccuracies or lack of detail
Now that we had clear assessment criteria and a scoring system, it was time to create a usable rubric. This brought us to Step 5.

Step 5: Develop an assessment tool for the specific success criteria (rubric)

Asking teachers to embrace a standards-based approach while unpacking a learning standard can be daunting, but GenAI can make this important work much easier. Instead of creating a long and not-so-user-friendly document using the criteria from the three headings that ChatGPT supplied, I entered the following prompt:
This material is a little long. Can you summarize it a bit and build a four-column rubric?
With the press of a button, a rubric with headings, columns, and descriptors appeared. I shook my head in disbelief, never having imagined that a tool would be able to instantly generate a customized rubric based on my requests. I then asked ChatGPT to replace the columns with terms that reflected the proficiency scale used in my jurisdiction: Extending, Proficient, Developing, and Emerging. In addition, I asked it to articulate all levels in strength-based terminology. It was like I was waving a magic wand—it did it all. Figure 2 shows the revised rubric. 
Figure 2. How to Unpack a Learning Standard Using ChatGPT
Next, my focus shifted to application and personalization—steps 6, 7, and 8.

Steps 6, 7, and 8: Generate real-world applications and activities

The final three steps are more holistic than the previous ones because they may overlap, blend, and be multifaceted. For this reason, this section combines these steps and demonstrates how they deepened our understanding of the science standard. 
Here are the last three steps in the process:
  • Step 6. Apply understanding to real-world problems.
  • Step 7. Generate individual student inquiries.
  • Step 8. Generate instructional activities that encourage community and student interaction.
Educators might want to consider one of them, all of them, or an alternative not listed here.

Steps 6 and 7

Using GenAI for steps 6 and 7—real-world application and student inquiry—was inspired by teachers expressing concerns about students turning to AI to plagiarize. On this note, unless schools’ traditional approaches surrounding assignments, homework, grading, and assessment evolve, they may not survive the GenAI era. As with the dinosaurs, some practices will soon face extinction. 

Speaking of extinction, in a separate conversation about homework, plagiarizing, and student behavior, ChatGPT offered me an interesting list of items under the heading: “To mitigate the issue of copying, educators can consider the following strategies.” Eight practical strategies followed that not only mitigate student copying, but also promote differentiated assessment, a more positive classroom climate, and academic integrity. One suggestion grabbed my attention: “Use educational technology tools that generate unique questions or randomize elements of assignments, discouraging copying.”
While constructing the conductivity sequence, an idea came to mind. We could ask GenAI to blend a real-world application with individual student-led inquiry—steps 6 and 7. I couldn’t type the next prompt fast enough.
On the topic of conductivity, I’d like to have each of my 28 students tackle an inquiry combining conductivity with a real-world application. Can I get 28 different prompts that would be appropriate for grade 9 students?
Could it really generate this request? I held my breath as I pressed “enter.”
You’ve got to be kidding me! It generated 28 different avenues for learning. Here’s a sampling:
  • Inquiry 1. Investigate how the conductivity of various metals influences their use in electric wiring. Explore the importance of choosing the right material for this application.
  • Inquiry 7. Investigate how conductivity plays a role in the design of wearable technology. Discuss the advantages and challenges of using conductive materials. 
  • Inquiry 19. Investigate the conductivity of materials used in prosthetic limbs, and discuss how it influences the performance and integration with the human body.
The conductivity inquiries were as interesting as they were different. Other inquiries focused on such varied areas as electric cars, agriculture, paint applications, and sports equipment.
Now if I had a grade 9 science class, I’d copy this list, paste it into a document, print it, and then cut it up so that each inquiry appeared on a separate strip of paper. The following day, I’d have students pick one out of a jar; they could either pursue their question, negotiate a trade, or modify the inquiry. Regardless, each student would be on a different learning journey. Rather than being concerned over the potential for copying, we would be offering students an opportunity for collaboration and communication.
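For teachers who would rather skip the scissors, a few lines of code can do the drawing-from-the-jar instead. This is a hypothetical sketch with placeholder names, not part of the article’s process.

```python
# Hypothetical digital version of the "pick one from a jar" routine: shuffle
# the 28 GenAI-generated inquiries and hand one to each student. The inquiry
# text and class list are placeholders to be replaced with real content.
import random

inquiries = [f"Inquiry {n}: ..." for n in range(1, 29)]   # paste the 28 prompts here
students = [f"Student {n}" for n in range(1, 29)]         # or load a real class list

random.shuffle(inquiries)
for student, inquiry in zip(students, inquiries):
    print(f"{student} -> {inquiry}")
```

Students could still trade or modify their draw afterward, exactly as with the paper strips.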

Step 8 

While GenAI offers powerful digital tools for learning, educators must consider how to use it thoughtfully in classrooms. This is particularly important given that concerns over adolescent screen time are mounting—recent data suggest that many young people are on screens for more than seven hours a day (Duarte, 2023). Many educators and parents worry that young people don’t spend enough time communicating face-to-face, exploring their communities, and going outside. Considering this angst, I was both encouraged and emboldened by a book on AI by Fei-Fei Li, The Worlds I See: Curiosity, Exploration, and Discovery at the Dawn of AI (Flatiron Books, 2023). Li argues that AI has the potential to make our world better—if we embrace these tools through human-centeredness. As educators for future generations, we have a crucial role to play in ensuring technology enhances rather than replaces human interaction in learning.

Before AI becomes just another reason for students to stare at screens, might we use Darth Vader to slay Darth Vader? In other words, could we use AI to limit the use of AI? Maybe. 
Consider these three prompts that reflect our eighth and last step: Use AI to generate instructional activities that encourage community and student interaction.
What categories of people or types of occupations in my community might I contact to further my students’ understanding of conductivity? [You might add additional context here, such as, “I teach in an urban elementary school” or “Our community is a rural, agriculturally based economy.”]
Speaking of conductivity, can you give me five examples of field trips I might take in [insert a nearby town or city] to explore this topic?
As we explore conductivity, can you suggest 10 ways students could work together, share their thinking, and learn from each other without using screens or digital tools? I’m looking for hands-on, interactive approaches that get students talking and collaborating.
Prompts such as these, in addition to tackling the learning outcome, can develop a human-centeredness in our schools and communities. Like many educators I’ve worked with, these science teachers appreciated having a clear, scaffolded pathway for using GenAI—a way to harness this powerful tool’s potential while maintaining the teacher’s essential role in student learning.

Human—and Future—Focused

Generative AI. It’s here, more of it’s coming—and it’s clear it can improve instruction, assessment, and learning. We must be aware, however, that becoming an effective user of GenAI is not like learning to use a calculator or a microwave. Users can expect platforms to change, add features, and get smarter. Therefore, although reading the owner’s manual used to be an adequate way to get up to speed, we’ll need to join our students as learners. GenAI has the potential to change every time we use it. By embracing this evolution alongside our students, we can harness GenAI’s capabilities to create deeper learning experiences while maintaining the human element that is essential to good teaching.

Reflect & Discuss

➛ How could you adapt the eight-step process for unpacking standards to fit your subject area and grade level?

➛ In what ways might you use AI to assess student understanding while maintaining the “human-centeredness” Dueck emphasizes?

References

Duarte, F. (2023, November 10). Average screen time for teens. Exploding Topics. 

Dueck, M. (2021). Giving students a say: Smarter assessment practices to empower and engage. ASCD.

OpenAI. (2024, October 17). ChatGPT. https://chatgpt.com/

Wood, M. (2024, February 26). Charles Duhigg on how to build new habits for the AI era. The Worklab Podcast. Microsoft.

For 23 years, Myron Dueck has worked as an educator and administrator. Through his current district position, as well as working with educators around the world, he continues to develop grading, assessment, and reporting systems that give students a greater opportunity to show what they understand, adapt to the feedback they receive, and play a significant role in reporting their learning.

Dueck has been a part of administrative teams, district groups, school committees, and governmental bodies in both Canada and New Zealand sharing his stories, tools, and first-hand experiences that have further broadened his access to innovative ideas. He is the author of the bestselling book Grading Smarter, Not Harder (ASCD, 2014) and the new book Giving Students a Say (ASCD, 2021).
