AI Won't Fix Your Leadership Problem

A principal recently wrote about how AI saves him time, letting him focus on relationships. But reading through his examples, I kept seeing the same pattern: AI as a band-aid for leadership problems that shouldn’t exist in the first place.

That IS the Job

The author complains that “it’s very challenging for principals to find time for ‘principaling’” because they’re too busy “balancing instruction and safety, managing the endless stream of unexpected crises, and communicating with staff, parents, and students.”

I mean, that is principaling. Like it or not, that’s the job. If you’re waiting for a calm moment to finally do “principal work,” you’ve misunderstood the role entirely.

Boring Meetings Are a Leadership Problem, Not a Tech Problem

He admits he “sometimes dread[s] faculty meetings as much as many teachers do.” His solution? Ask AI how to spice them up.

Here’s the thing: if you’re the principal and your meetings are boring, that’s on you. You have the power to make them engaging from the start. Nobody is going to complain because the meetings stopped being boring. Boring meetings aren’t a contractual requirement; they’re a sign of lackluster leadership.

And what does AI suggest? Staff recognitions, student performances, teachers sharing strategies. Fine ideas. But they’re generic answers to a generic problem because he asked a generic question. I imagine his prompt went something like: “I am a principal and my faculty meetings are boring. How can I spice them up?”

Compare that to a purposeful prompt: “It’s February, we’re starting a new semester. One-fourth of my 35 teachers have new students. Our goal is to discuss interventions for kids who struggled first semester. Help me create a plan that gives teachers time to collaborate using their expertise.”

The first prompt gets AI slop. The second gets something useful. But you have to know what you’re actually trying to accomplish first.

Having AI Read Student Feedback Kills Relationships

This one floored me. He had 25 students submit “thoughtful, detailed responses” to an application to join a student feedback forum. His proud solution? AI read them in 30 seconds.

He basically told those students: “I don’t actually care what you think, because I had an AI read your responses and pick the winners.”

How are kids supposed to trust that he’ll listen to them when he’s proud that a machine ingested their thoughtful responses? You can’t claim to prioritize relationships while outsourcing the act of listening.

Generic Prompts, Generic Results

For hiring, he uses AI to “generate targeted, competency-based interview questions” with “just a brief description of the role.” But feeding AI just the role gets you AI slop questions that don’t reveal anything meaningful about candidates.

He’s starting to get it when he notes there’s a difference between asking for “teacher interview questions” versus “questions to identify an elite English teacher who prioritizes student learning.” But what does “elite” even mean? Are we talking Freedom Writers? Dangerous Minds? Stand and Deliver? AI does better when you define your terms.

And using AI to find inspirational quotes for morning announcements? There are hundreds of websites that do exactly this. That’s just a search engine with a couple fewer steps and a little more customization.

His Advice Contradicts His Examples

The author’s big takeaway is to “only use AI for tasks that don’t undermine relationships with other people.”

But look at his own examples:

  • “Spicing up” meetings for the sake of not being boring, without adding real value
  • Having AI decide which student feedback matters
  • Finding quotes that could be wrong and reflection questions that lack context

Each one, if you’re being honest, undermines the relationships he claims to prioritize.

He even says teachers “shouldn’t use it to grade an essay.” But it’s fine to use it to grade applications to a student feedback forum?

The Question He Never Asks

AI can save time. AI can make things easier. He’s right that relationships matter most. But he never asks the real question: Should I even be doing these things in the first place?

Before you optimize a process with AI, ask whether that process should exist. Before you speed through student feedback, ask whether speed is what those students need from you.

AI can’t create human moments of connection. That part he got right. But the rest of the article shows how easily we can convince ourselves we’re building relationships while automating them away.


This article is a response to “Four Ways I Use AI as a Principal” by S. Kambar Khoshaba.
