In Arkansas, a group of students at Bentonville West High School created an AI-powered mobile app that allows patients to take images of their mouths to screen for signs of oral cancer (Samsung Newsroom US, 2025). Their project didn’t start with a unit on machine learning. It started from a place of curiosity and human empathy: How can we address this critical health problem in rural, low-income areas of our state where there is limited access to dentists and doctors?
In Massachusetts, high school students in the FutureMakers program at MIT created KineX, a physical therapy app that uses a computer’s camera to observe and evaluate a patient as they complete exercises (MIT Raise Initiative, 2025). Their spark was the desire to help people recover faster and better, even at home. Their curiosity about how to make physical therapy more effective led to the innovation.
Elementary students in Minnesota, inspired to prevent some of the 250 deaths that occur each year from falling through thin ice, developed a device with sensors and a microcomputer to measure the thickness of ice. They then built an app that works with the device to report the ice’s weight-bearing capacity (Samsung Newsroom US, 2025).
These students aren’t just “using” AI. They are applying it to solve real-world problems. This is applied intelligence: the synergy of human intelligence and artificial intelligence to solve problems, create new things, and do work that neither could accomplish alone.
Students from Bentonville West High School in Arkansas used AI to solve a real-world problem: early detection of oral cancer. Here they prepare to present their app and prototype to a panel of judges for the Samsung Solve for Tomorrow STEM competition.
This powerful, empathetic, and creative work, however, is not what is happening in most schools. Instead, three years after the release of ChatGPT, many students are living in a complicated academic world when it comes to AI. When schools initially banned AI tools and fed the rhetoric that these tools were for cheating, they inadvertently triggered the classic “forbidden fruit” effect. Students became even more curious about and eager to experiment with technology that educators had labeled as illicit, transforming AI from just another digital tool into something tantalizingly off-limits.
Now, students are navigating mixed messages about when to use AI tools. In some classrooms, they are back to using pencil and paper, writing essays under strict supervision. In others, teachers encourage using bots for feedback or for ethically questionable tasks like “chatting” with historical figures, which can lead to students internalizing AI-hallucinated facts. In many classrooms, students are using pretrained bots for specific use cases including tutoring, feedback, brainstorming, and summarizing. Students are explicitly forbidden to turn in work generated by a large language model, so they have gotten craftier about making their work look authentically human. This has led to a convoluted game of cat and mouse, with teachers trying to catch students using AI to complete low-level cognitive tasks while students try to fly under the radar.
The concerns about cheating and misinformation are important, but they miss some of the larger points. First, the reason students are turning to this technology is because they are infinitely curious about artificial intelligence. It may be true that some students are using AI to complete their work, but even more students are using AI outside of school in all sorts of interesting ways. This technology is fascinating, and many people are toying with it to discover its potential. Additionally, the real-world labor market is rapidly adopting these tools, which will affect the workforce. A recent study (Brynjolfsson et al., 2025) found that since the adoption of generative AI, early-career workers (ages 22–25) in the most AI-exposed occupations have already seen a significant relative decline in employment. The automation of low-level tasks, the very “clerk work” we often assign in schools, is already having a disproportionate impact on young people entering the workforce.
*Applied* intelligence is the synergy of human intelligence and artificial intelligence to solve problems, create new things, and do work that neither could accomplish alone.
If our students are to navigate the world they are inheriting, they must be the drivers of this technology, not its passengers. They will need to design the bots, not just be users of them. How can we make this happen? Educators will need to shift the focus from AI as a “gotcha” to AI as a colleague. The Brynjolfsson study provides a clear mandate for this shift: Employment declines are concentrated in occupations where AI automates tasks. In contrast, roles centered on augmenting human labor remain stable or are growing. Teaching applied intelligence is therefore no longer just a pedagogical choice; it is an economic imperative.
Putting Students in the Driver’s Seat
How do we get from the cat-and-mouse game to the thin-ice app? It begins with curiosity, with shifting the goal from “using AI” to “applying AI.” When students use AI, they rely on it to provide an answer, often bypassing productive cognitive struggle. When students apply AI, they expand their curiosity and creativity and become creators. In these cases, AI is part of the solution to a larger, messier problem that cannot be solved or answered with a simple prompt.
High school student Matteo Paz, driven by an interest in astronomy he’d had since grade school, used machine learning to detect more than 1.9 million previously unknown objects in space, publishing his first single-author piece in *The Astronomical Journal* before graduating from high school (Motrunich, 2025). His spark was pure curiosity. As a high school senior, Paz wondered if by using a machine learning model he could identify and categorize variable objects (like stars and quasars) that were hidden in the NEOWISE data already collected by NASA. He hypothesized that by using an AI model to sort through the billions of data points, he would be able to identify previously unknown objects. His curiosity has had a real impact on the astronomical community.
High school senior Matteo Paz explains his poster presentation at the 2025 Regeneron Science Talent Search event. Photo courtesy of Chris Ayers Photography/Licensed by Society for Science.
The shift to partnering with AI doesn’t need to wait until high school. In an article for Connected Classroom (2025), 3rd grade teacher Timothy Cook describes how his students didn’t just use an AI tool, they built one. Three students approached him and asked if they could build an AI chatbot that could help their class with their schoolwork. They got the green light. Naming it Growth Spurt, the 8-year-olds designed the entire instruction set, deciding together how the system should help students while keeping it focused on learning, not just giving answers. Then their whole class systematically tested its boundaries and its impact.
As Cook’s class shows, you don’t need to be a professional programmer to be a system architect; you need to be curious and then use creativity and problem solving to apply AI to create a solution. To design an effective bot, students (and adults) must exercise deep critical thinking and problem-solving skills. They need to ask themselves: What specific problem am I solving? What logic must the AI follow to give a helpful answer? How do I structure the rules to avoid hallucinations? What guardrails do I need to put in place to prevent the bot from doing things it shouldn’t do? By engaging in this iterative design process, students aren’t just learning a tech skill; they are learning to deconstruct real-world problems and apply critical thinking to systematize their own curiosity.
If students are intimidated by the idea of coding or don’t have a strong background in computer science, they can still tap into their curiosity to create solutions through vibe coding—the practice of building software using plain English rather than programming languages. Students can rapidly prototype apps using natural language on platforms like Google’s AI Studio or Claude Code and create webapps, websites, and mobile apps without the skills of a computer scientist. Writing code in a language like Python or Java is now no longer a barrier for anyone who wants to create a technology solution to a problem.
But first, teachers can start by building chatbots with students, not for students. A simple class discussion—“We are going to build our own AI writing tutor. What rules should it have? How should it respond if you ask it to write the essay for you?”—is the lesson. The act of writing these rules teaches critical thinking, ethics, and system design—the very “augmenting” skills the labor market demands.
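To make this concrete, here is a minimal sketch of what a class-authored instruction set and one simple guardrail might look like in Python. The rules, phrases, and function names below are illustrative assumptions for discussion, not the actual rules from the Growth Spurt project or any particular platform.

```python
# A class-designed "instruction set" for a writing-tutor bot.
# These rules are examples a class might agree on together.
TUTOR_RULES = """
You are a writing tutor for our class.
1. Ask guiding questions instead of giving answers.
2. Point out one strength and one area to improve.
3. Never write the essay, paragraph, or sentences for the student.
"""

# Phrases the class decided should trigger a redirect instead of help.
OFF_LIMITS = [
    "write my essay",
    "write the essay",
    "do my homework",
    "write it for me",
]

def guardrail(student_request: str) -> str:
    """Return 'redirect' if the request asks the bot to do the work,
    otherwise 'allow' so the request can be passed to the model."""
    request = student_request.lower()
    if any(phrase in request for phrase in OFF_LIMITS):
        return "redirect"
    return "allow"
```

In a working bot, `TUTOR_RULES` would be supplied as the system prompt and a "redirect" would return a canned coaching message rather than an answer. The code itself is trivial; the valuable classroom work is the debate over which rules and phrases belong on the list.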
Patterns and Predictions
Applied intelligence can also mean teaching students to pair what a bot does well with what humans do well. For example, AI bots are excellent at noticing patterns in large sets of data. Machine learning has repeatedly sorted through mountains of data that no human could analyze efficiently or accurately, leading to advances in medicine, environmental science, and disaster relief. In a recent study, Mass General Brigham researchers demonstrated that an AI tool could screen 458 patients for eligibility in a heart failure clinical trial in the time it took human staff to screen 284 (Unlu, 2025). With faster screening, patients can more quickly get access to lifesaving treatments.
When students apply AI, they expand their curiosity and creativity and become creators.
In a classroom setting, a teacher covering the Civil Rights Movement could have students feed an AI tool newspaper articles from 1955–1968 drawn from different regions and perspectives (Northern vs. Southern papers, Black-owned vs. white-owned publications, etc.). Students could ask the tool to identify the words that appear most frequently, noticeable shifts in language, which events receive more coverage, and how those events are framed. While students could do this work themselves, an AI tool can do it more efficiently and likely more accurately. Teachers should still have students fact-check the outputs, since hallucination remains one of the risks of using artificial intelligence.
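The frequency-and-framing comparison described above can even be approximated in a few lines of code, which helps students see that the “AI magic” is partly just counting at scale. The two snippets below are invented stand-ins, not real 1950s newspaper text, and the variable names are illustrative.

```python
from collections import Counter
import re

# Invented stand-in snippets; a real analysis would load full articles.
northern_article = "Protesters marched peacefully as the boycott continued."
southern_article = "Agitators disrupted the city as the boycott continued."

def word_counts(text: str) -> Counter:
    """Lowercase the text and count each word."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

north = word_counts(northern_article)
south = word_counts(southern_article)

# Words that appear in one paper but never in the other hint at
# framing choices (e.g., "protesters" vs. "agitators").
framing_gap = (set(north) - set(south)) | (set(south) - set(north))
```

Students can then do the human part: asking why one paper chose “protesters” and the other “agitators,” which a word counter alone can never explain.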
After students gather this data, they must interpret why these patterns matter. This is where human intelligence becomes essential. Students can interview community elders about their memories of reading these papers, research the economic pressures on different publications, and connect the linguistic patterns to broader power structures. Students can then create multimedia presentations that weave together the AI’s data insights with human stories, historical context, and their own analysis of how media shaped public opinion.
In this Civil Rights Movement lesson, AI provides the analytical foundation, but students provide the critical thinking, empathy, historical reasoning, and creative synthesis that transforms raw patterns into meaningful understanding. Neither AI nor the student alone could produce this depth of insight. Students learn both historical content and how to collaborate effectively with AI tools they’ll encounter throughout their lives.
Fostering Human-Only Skills
A significant collateral win of teaching students to use AI to solve problems is that it naturally fosters the very “human” skills we champion for our graduates. These are precisely the skills that AI cannot replicate and that the modern workforce demands. And it all starts with something so naturally human: curiosity.
Reflect & Discuss
Look at a recent assignment. Does it ask for “clerk work” (summarizing, retrieving facts) or “applied intelligence”? How could you redesign it so AI becomes a necessary tool, rather than a way to bypass the work?
What are some “messy,” real-world problems in your school’s community that students could solve if they viewed AI as a partner rather than a shortcut?