AI in classrooms is inevitable. Equity isn't.
AI is a buzzword in education right now, and it’s not hard to see why. It promises to ease teacher workloads, flag problems before they spiral, and personalise learning like never before. In a sector perpetually starved of time, money, and staff, that kind of help feels revolutionary.

But here’s the part we’re not saying loud enough: the arrival of AI in classrooms isn’t guaranteed to be a good thing. It’s a turning point. And like all turning points, where we end up depends entirely on the choices we make now.

Because if we’re not careful, if we don’t design for equity from day one, we won’t just risk repeating old mistakes, we’ll code them in.
We’ve been here before
During COVID, digital tools flooded classrooms overnight. Some students barely flinched, armed with high-speed internet, quiet rooms, and laptops of their own. Others squinted through cracked phone screens, shared devices with three squabbling siblings, and crossed their fingers that their home Wi-Fi would hold for the duration of the lesson. And it showed. The already stark attainment gap only grew.

The digital divide isn’t new. But the pandemic made it impossible to ignore.

Now, with AI, we’re once again watching technology move faster than schools can keep up with, and once again assuming that access equals impact.

Ask any underfunded or overstretched school what it means to ‘roll out’ new tech. It’s not just about plugging it in. It’s about whether the Wi-Fi works, whether there are enough devices, and importantly, whether teachers have time — any time — to learn the system, troubleshoot it, and still do their jobs.

This isn’t resistance to change. It’s reality.
The promise is real. But so is the risk.
No one’s denying that AI could make a meaningful difference in classrooms. Teachers spend hours (often unpaid ones) marking work, writing feedback, and chasing data. The cognitive load is immense, and the time it steals is irreplaceable.

Imagine a system that drafts feedback, flags patterns across classes, and handles admin at scale. That doesn’t replace the teacher, but gives them more room to be one. That’s powerful. Necessary, even.

But we also have to reckon with how AI ‘reads’ student work. These systems don’t come neutral. If they’re trained on narrow data — standardised language, middle-class patterns of expression, neurotypical outputs — they risk penalising what they don’t understand. A student using non-standard grammar might be flagged as underperforming. A neurodivergent answer could be misread as incoherent. AI can’t yet ‘read’ nuance the way a teacher can. If we’re not careful, we risk building systems that confuse difference with deficiency.

AI isn’t just a tool. It’s a lens. And unless we tune that lens with care, we risk filtering out the very students we’re trying to support.
This time, we can plan ahead.
Unlike COVID, AI isn’t an emergency. It’s a slow-build revolution. And that gives us something precious: time to think. To build with intention. To ask: who is this really for?

Is it for students still waiting on functional IT suites? For teachers logging into five different platforms a day just to track attendance and homework? For families with no digital literacy support?

Or is it for the schools already ahead of the curve, the ones with EdTech budgets, internal IT teams, and a parent community that knows how to advocate?

Look internationally and you’ll see what intentionality can look like. In Finland, AI is already embedded into the national curriculum, not just as a tool, but as something students are taught to understand and question. Singapore’s digital strategy pairs long-term infrastructure investment with mandatory teacher training. The tech isn’t just delivered. It’s supported.

What would it look like to have that level of readiness here?

Because unless we design with the most stretched schools in mind — unless those classrooms are the baseline, not the exception — we’ll build yet another system that works beautifully for the few and barely at all for the rest.

The best AI in education won’t just speed things up. It’ll deepen what matters. It’ll help teachers notice more: the kid who’s quietly slipping, the class that’s losing focus, the pattern that’s not just data but a call for help.

And most importantly, it will protect space for the things machines can’t do: listen, care, and ask the questions that change everything.

The UK government seems to agree. It recently invested £1 million into 16 EdTech companies to build these AI-powered marking and feedback tools with equity in mind.

As Professor Sylvie Delacroix said at the Human Intelligence in the Age of AI conference:
‘The challenge isn’t whether AI belongs in education — it’s whether we can design systems that augment rather than automate away the art of paying attention.’
That’s it. That’s the whole game.
What we’re building towards
At Classify, we’re not trying to be the smartest tool in the classroom. We’re trying to be the most useful. Our focus is simple: give teachers better, faster feedback tools that work in the real world — across patchy infrastructure, packed timetables, and impossible demands.

We don’t believe AI replaces teachers. We believe it can finally give them some room to breathe.

But that only happens if we build with them. Not just for the high-performing schools with sleek tech stacks. For the ones doing the hardest work in the toughest conditions.

Because AI in classrooms is inevitable.

Equity isn’t.

Let’s choose it anyway.