Here's something that doesn't make it into the AI-in-education keynotes:

A student gets into an altercation during lunch. Nothing extreme — a shove, some words, the kind of thing that happens at every school in America three times a week.

What happens next should take 10 minutes. What it actually takes is 45.

The teacher or admin sits down to write it up. They're pulling language from SEL frameworks, trying to draft a reflection that's developmentally appropriate, constructing a parent letter that's firm but not punitive, documenting everything for the file. They're doing this between classes, during prep, after school — time that was supposed to be for instruction or, god forbid, eating lunch.

And here's the part that actually matters: by the time all that documentation is done, the moment is gone. The student has moved on. The emotional window for a real restorative conversation has closed. The paperwork outlasted the learning opportunity.

That's the problem we decided to solve.

How it started

This came out of the Silicon Schools Fund's Exploratory AI program in 2024-25. Researchers from the Center on Reinventing Public Education at ASU were tracking more than 80 teachers and administrators across 18 California schools who were building and piloting their own AI tools. Our team at Gilroy Prep was one of them.

We didn't walk in with a product idea. We walked in with a question: where are teachers losing the most time on work that isn't actually teaching?

The answer kept coming back to the same place — the documentation layer around student behavior. Navigator Schools uses restorative justice practices for discipline, and we believe in that model. But the reality is that generating a restorative activity, writing a parent communication, and documenting the incident takes serious time. Time that competes directly with the restorative conversation itself.

What we built

The Restorative Practice Generator takes a few inputs: a description of what happened, the severity, the grade and reading level of the students involved, the behavioral goals you're targeting (empathy, responsibility, accountability), and how much time you have. It returns a restorative activity — usually a reading passage with reflection and discussion questions — plus a parent letter you can review and send.
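For readers curious what that input-to-output flow looks like under the hood: the fields above map naturally onto a structured prompt. This is a hypothetical sketch in Python — the field names, template, and function are my own illustration, not the tool's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class IncidentInput:
    """Hypothetical input fields mirroring the generator's form."""
    description: str        # what happened, in the teacher's own words
    severity: str           # e.g. "minor", "moderate", "major"
    grade: int              # grade level of the students involved
    reading_level: str      # target reading level for the passage
    goals: list[str]        # behavioral goals: empathy, responsibility, accountability
    minutes_available: int  # how much time the teacher actually has


def build_prompt(incident: IncidentInput) -> str:
    """Assemble a single prompt for the language model.

    Note: no student names or identifiers appear anywhere in the input,
    consistent with the privacy constraint described later in this post.
    """
    return (
        f"A {incident.severity} behavior incident occurred: {incident.description}\n"
        f"Write a restorative activity for grade {incident.grade} students "
        f"(reading level: {incident.reading_level}) that targets "
        f"{', '.join(incident.goals)} and fits in {incident.minutes_available} minutes. "
        "Include a short reading passage, reflection and discussion questions, "
        "and a parent letter the teacher can review before sending."
    )


incident = IncidentInput(
    description="Two students shoved each other during lunch after an argument.",
    severity="minor",
    grade=6,
    reading_level="6th grade",
    goals=["empathy", "accountability"],
    minutes_available=15,
)
print(build_prompt(incident))
```

The design point worth noticing is in the dataclass itself: everything the model sees is about the incident, not the student, which is why the tool can't track patterns across incidents — a tradeoff the team made deliberately, as described below.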

The whole thing takes about three minutes. Not three minutes of clicking through menus. Three minutes from description to deliverable.

Ally Funk was on the development team and one of the first teachers to use it in a real situation. She was a 6th-grade STEM teacher at Gilroy Prep at the time — she's since stepped into the assistant principal role. During a field trip last year, a couple of students acted up. She used the tool on the spot. It generated a related reading with reflection questions and a parent communication letter she could proofread and send. The incident got addressed while it was still fresh, the parents were informed the same day, and Ally didn't lose her evening to paperwork.

That's what 42 minutes back actually looks like. Not in theory. On a field trip.

What we got wrong first

I want to be honest about the messy part, because this is where the real learning happens.

It took weeks of trial and error to get the outputs right. The team could upload our school's behavior policies and decision matrix to shape the tool's responses, but we couldn't — for privacy reasons — enter personal student data. That meant the tool couldn't detect behavioral patterns across incidents for individual students. It can respond to what you tell it about this incident, but it doesn't know the student's history.

That's a real limitation. And it's one we chose deliberately. The privacy constraint shapes what the tool can do, and we'd rather have a tool that's useful within clear boundaries than one that's powerful but unsafe.

The other thing the team learned: you have to keep refining. A chatbot is only as good as what you teach it, and the first versions weren't producing reflections that matched what teachers actually needed. The people doing the work — Ally and the rest of the dev team — kept changing the requirements as they tested. That's not a bug in the process. That is the process.

The thing Ally said that I keep coming back to

When Education Week covered this project last summer, Ally said something that I think every district leader needs to hear. She was clear that the tool only works within the context of strong student-teacher relationships. The generator gives you a piece of paper with questions. But if you haven't built trust with that student first, the paper doesn't matter.

She put it simply: relationships should be priority number one.

That's not a caveat. That's the whole design philosophy. The tool doesn't replace the human work. It clears the path so the human work can actually happen — while it still matters, while the student still cares, while the moment is still alive.

What the researchers found

CRPE's Chelsea Waite, who was tracking all 18 school teams, put it in a way that stuck with me. She said AI could be a "core accelerator" — fueling teachers' capacity to deliver on an instructional goal — or it could be a "paint job." The difference wasn't the technology. It was whether the school had a clear vision for what problem they were solving.

That tracks with everything I've seen. The schools where AI tools actually stuck were the ones where the tool was built to solve a specific, painful workflow problem. Not "let's add AI to something." More like "this thing takes 45 minutes and it should take 10. Can we fix that?"

The Restorative Practice Generator is now being expanded across all Navigator Schools campuses for 2025-26. Not because someone mandated it from the top. Because teachers at Gilroy Prep kept using it, and teachers at other campuses started asking for it.

That's the adoption pattern I trust. Not rollout. Pull.

The question for your district

If your teachers are spending more time documenting student behavior than actually addressing it — and I'd bet most of them are — that's not a discipline problem. It's a systems problem.

And the fix isn't "buy an AI tool." The fix is sitting down with the people doing the work, mapping where the time actually goes, and building something together that gives them those minutes back.

That's what cobuilding looks like. Not a vendor demo. A field trip, a couple of kids acting up, and a teacher who had the right tool because she helped design it.

— Dan

P.S. If you want to see where your district stands on AI readiness across six dimensions — policy, privacy, teacher support, student-facing AI, tool governance, and leadership vision — I built a free self-assessment for exactly that. No vendor pitches in the margins. [checklist.smarterbydesign.app]
