I’m Not Worried AI Helps My Students Cheat. I’m Worried How It Makes Them Feel (Opinion)
I recently displayed a photo to a 9th grade humanities class and asked what should have been an easy question: “Is this real?”
The students voted, and half of them believed the image was AI-generated. It was not. It was a photo I had taken at a local Vermont beach.
I tried again, this time with a photo of my cat, Rafiki. The results were even starker. Almost all the students thought it was an AI-generated image. Nope. It was real, taken in my kitchen.
When we talk about artificial intelligence in schools, we usually focus on cheating and plagiarism. But what I saw that day wasn’t about academic integrity. It was about trust.
When students can’t be sure whether a picture of their teacher’s cat is real, we are facing something much bigger than a student using ChatGPT on a history assignment. We are facing a world where certainty itself feels unstable and school suddenly feels like just another place where students aren’t sure they can trust the version of reality being offered.
For most of modern schooling, facts might get debated, but we didn’t question whether they existed. We argued over interpretation and meaning, not whether the basic thing in front of us was real. Our shared reality was the starting point. You could trust your eyes.
Many of us grew up inside that shared reality. I remember the World Book Encyclopedia. If someone had the “G” volume, you waited your turn before starting your report on Greenland or germs. Once you got the book, you trusted it. Information was scarce, and that book represented a version of the truth we generally agreed on, even when it was flawed and incomplete. It still gave us a shared starting point.
Today, that common floor has dropped out. Our students don’t have that shared reality. They exist within a digital feed that doesn’t stop and never seems to agree with itself. This erosion isn’t new. The internet had already made it easier to question everything and trust nothing before the widespread use of generative AI, but generative AI has accelerated it dramatically.
Truth used to feel like something we could find if we searched hard enough, but that’s not the experience our students are having. In fact, the more they search, the harder it can become to tell fact from fiction. In this new AI landscape, we’re all sorting through endless versions of reality and deciding which one we’re willing to live with. It’s exhausting and it’s exactly what I saw in my students when they looked at those photos.
This isn’t just an AI problem. If you can’t trust an image of a cat in a kitchen, it becomes harder to trust the larger promises society makes about the future. Gen Z financial commentator Kyla Scanlon calls this the “end of predictable progress.” For decades, the path was clear for many of us. You went to school, got a job, and eventually bought a house. But that path has dissolved into the same fog as AI-generated images.
Our students feel this instability everywhere. They are told AI may replace careers before they even start them. They see a housing market that feels permanently closed. They live in what Scanlon calls a “casino economy,” where a viral moment can feel more valuable than years of steady work.
The version of learning we’re offering no longer matches the world our students are trying to survive. When even a teacher’s photo doesn’t feel stable, the old model of school cracks. If students are taught to question every headline and doubt every promise of the future, why would they walk into a classroom and trust us? School can start to feel like just another simulation, a game of compliance disconnected from the physical world they actually have to navigate.
School is too important to be a game. We have to stop asking the small questions. We spend so much time debating whether AI can do a student’s work, but the students are stuck on much more existential questions. They are trying to figure out if the work still matters, if school still matters, and honestly, if they still matter.
If school is going to mean anything in this world, maybe it’s time to shift from “learning” as a way to prepare for the future to learning as a way to understand and change the present. Students are demanding relevance. We can’t just hand them information anymore or tell them to trust us that what we’re teaching them today will matter in the future. We have to give them work that carries real and immediate consequence.
We need students creating things they can touch and solving problems in their own schools and neighborhoods that won’t get fixed unless they are there to do it. We need them grappling with what it means to be human, what it means to be needed, to be necessary. AI can write a report, but it can’t stand in the cold Vermont snow to help a neighbor. It can’t make students feel like they matter. That’s what will actually make school feel real.
With AI reshaping everything we see, showing our students how we live with uncertainty may be the most honest thing we can do. We have to stop pretending we have the answers and start making our own questions visible. When a source feels unreliable, we should think out loud. We need to model how we weigh evidence and how we decide what actually deserves our trust. No handbook or district policy can do that for us.
Trust is built by showing up for a student day after day. A chatbot can generate a perfect answer, but it can’t recognize the moment when a teenager finally starts to understand who they are and it can’t understand what it takes to keep showing up when everything feels uncertain. That is human work, and it is where teachers matter more than ever.
The goal is no longer just to teach the curriculum. The goal is to give students something real.
