Guide excerpt · Field Note 4 · April 20, 2026 · 5 min read

Why Students Reach for AI

Students reach for AI for predictable reasons: panic, confusion, fear, or perceived irrelevance. When we address those causes through design, AI over-reliance usually drops on its own.

When we discover a student used AI heavily, the first reaction is often disappointment or anger. "They took the easy way out." "They don't care about learning."

Sometimes that's true. But usually it's more complicated.

Students reach for AI for reasons that make sense from their perspective. When we understand those reasons, we can design around them. This isn't about excusing shortcuts. It's about recognizing that behavior has causes, and many of those causes are things we can influence.

Six reasons, and what design can do

1. Panic and time pressure

The assignment is due tomorrow, they haven't started, and they're overwhelmed. AI promises a lifeline.

Design response: Add checkpoints with real deadlines and grades. When an outline is due two weeks before the final draft and counts for 15% of the grade, students can't wait until the last minute. Build in studio time so starting happens in class, with support, not at 11pm alone.

2. "This feels like busy work"

The student doesn't see the point. The work feels disconnected from anything real, and if it doesn't seem to matter, letting AI do it feels reasonable. This is usually about perceived value rather than laziness.

Design response: Make purpose explicit. Add a "why this matters" line to every assignment. Connect to authentic contexts: real audiences, actual problems, local examples. Offer choice in topic or format.

3. Blank-page paralysis

They want to do the work but sit down, stare at the empty page, and nothing comes. AI feels like the only way to get unstuck.

Design response: Provide scaffolding: sentence starters, templates, or frameworks. Model the process, and show messy first attempts. Create low-stakes starting points ("write three bad ideas" is easier than "write your thesis").

4. Unclear expectations

They're not sure what's allowed or what quality means, so they're guessing, and AI fills in the gaps. This is confusion, not defiance.

Design response: State your AI policy clearly on every assignment. Use the Stoplight System. Make success criteria concrete: not "analyze effectively" but "identify three specific examples and explain how each supports your claim." Show models of what A, B, and needs-improvement look like.

5. Skill gaps they're hiding

They don't have the skills the assignment assumes. They're using AI to hide what they don't know, driven more by shame and self-protection than bad character.

Design response: Build in legitimate support structures so using resources is normal. Create skill-building moments within the assignment itself. Offer choices that work with different strengths. Normalize revision, and frame work as "not there yet" rather than "wrong."

6. Everyone else is doing it

Their friends use AI, they see other students succeeding with it, and it feels like the new normal. This is a social norms problem more than an individual moral failure.

Design response: Establish clear class norms early. Make appropriate AI use visible and valued. Design assignments where AI shortcuts are obvious. When you verify learning through checkpoints and oral defenses, heavy AI use becomes visible, and social norms shift.

What this means for design

| Student reason | Design response |
|---|---|
| Panic and time pressure | Checkpoints, studio time, visible timelines |
| Feels like busy work | Clear purpose, authentic contexts, choice |
| Blank-page paralysis | Scaffolding, models, unsticking support |
| Unclear expectations | AI policy visible, concrete criteria, examples |
| Skill gaps | Support structures, choices, revision valued |
| Social norms | Clear class norms, verified learning |

Notice that none of these solutions involve better detection or stricter punishment. They're all about teaching and assignment design.

The question to ask

When you suspect a student used AI heavily, before acting on the anger or disappointment, ask:

What about this assignment made AI feel like the right move to this student?

Sometimes the answer is "they took an unethical shortcut." But often the answer points to something we can fix.


Chris Meehan

I lead academic technology at Berkshire School and recently finished my master's at Brown, researching AI in grades 9-12. I publish frameworks, tools, and articles for secondary-school educators.