Sep 22 / Patrick Hickey

Is there an AI crisis in Irish education?

I ask the same question at the start of every training session: "How many of you are using AI regularly in your teaching practice?" The response is nearly always the same: teachers looking around the room, waiting for someone else to raise their hand first. Maybe two or three hands creep up, to the surprise of the vast majority.

What follows reveals the true scale of our crisis. I demonstrate what AI can do in proficient hands, watching their expressions transform from polite scepticism to genuine shock as they witness AI generating differentiated lesson plans, creating personalised feedback, and producing resources that would take them hours to develop. The looks on their faces are unmistakable: How did I not know about this? But I don't stop there. The demonstration takes a darker turn as I show how easily AI produces authoritative-sounding responses filled with dangerous misinformation and creates student work that's nearly impossible to detect. At this point, the natural progression would be to discuss implementing the school's AI policy. However, that conversation cannot happen. That policy doesn't exist.

The Policy Vacuum

Eighteen months after the Department of Education promised official guidelines, we still have nothing. The scale of this problem becomes starkly apparent when we consider that Leaving Certificate reforms have already been rolled out. Project work generally carries a weighting of 40% of students' final grades, yet teachers find themselves in an impossible position. They have legitimate concerns about not being equipped to guide students on ethical AI use, whilst simultaneously witnessing the fundamental unfairness that sees students with premium AI subscriptions gaining huge advantages over those without.

These are not abstract concerns about future possibilities; they are immediate realities affecting assessment fairness right now. Meanwhile, educators remain paralysed by the complete absence of policy guidance. They cannot create meaningful school-level guidelines without national direction, yet they watch their students race ahead with powerful tools they themselves don't fully understand.

Students are already using AI

Our students are using AI. Not proficiently, but they are using it nonetheless. This is not a future threat we can prepare for at leisure; it is happening in every classroom right now. This reality creates a dangerous educational vacuum where students experiment with powerful technology whilst their teachers lack the expertise to guide them safely.
Making this crisis worse are the forces actively pushing AI adoption without regard for teacher readiness. Tech companies are flooding schools with seductive promises: lesson plans generated in seconds, PowerPoints created at the click of a button, and more. Their marketing suggests that teaching has been reduced to content delivery, where educators become mere facilitators of AI-generated materials. But of course, the tech industry's agenda is shaped more by profit than pedagogy, prioritising shareholder returns over educational outcomes. They are selling AI as a replacement for teachers, not as a tool to empower them.

Keeping Teachers at the Centre of Education

We must use AI as a tool for teaching and learning, not become the tool of AI. CPD work must focus on keeping teachers' human expertise at the centre of all AI processes. To ensure that balance, we can use the 5 Ps framework:

1. Professional Expertise: Defining the goal and framing the task with subject knowledge.

2. Prompting: Crafting the actual instruction or question posed to the AI.

3. Priming: Providing relevant context or examples to guide the AI's response.

4. Prechecking Prompt: Evaluating the prompt before submission to predict likely pitfalls.

5. Professional Expertise (for validation): Critically assessing the AI's output for suitability.

 Notice how professional expertise sandwiches the entire process. Only a qualified, experienced teacher knows what they're looking for at the beginning, and only they possess the knowledge to recognise when something isn't right and needs to be fixed or rejected
entirely. Consider the difference between a history teacher using AI to generate a World War II timeline versus a student doing the same. The teacher brings decades of subject knowledge, understanding of historical context, awareness of common misconceptions, and intimate knowledge of their students' prior learning. They can immediately spot factual errors and adapt output for their specific classroom needs.

 The student, lacking this domain knowledge, may accept AI output uncritically, potentially internalising misinformation. Yet this is precisely what's happening in our schools daily.

The Student Safety Imperative

Unsupervised student use of AI is educationally irresponsible. Students using AI without guidance aren't just risking poor grades; they're potentially compromising their intellectual development. They may become overly reliant on technology rather than developing critical thinking skills, and lose the ability to distinguish between authentic inquiry and AI-assisted shortcuts.

Yet this seems to be precisely what numerous IT and EdTech companies are promoting with their seductive promises of feedback on your work in seconds, personalised study guides, and instant academic support. These companies are actively encouraging the very unsupervised student use that educators recognise as dangerous, marketing directly to students and parents whilst bypassing teacher expertise entirely.

Action Required

We need immediate, comprehensive action. First, the Department of Education must prioritise teacher AI literacy as a national educational imperative. We need standardised mandatory training programmes before any discussion of student access can begin. Second, teacher training institutions must integrate AI literacy into initial teacher education programmes immediately. New teachers must be equipped with these skills from day one. Third, we need clear policies addressing educator and student use of AI tools.

One thing is crystal clear: teachers want to understand AI. They know it’s their responsibility to guide students safely, but they are being failed by the complete lack of support and training. The Department of Education must now deliver the long-overdue national AI guidelines for schools. This cannot be delayed any longer. Mandatory training for all Irish educators must
follow, built around those guidelines. Only then can we begin to have a serious conversation about student access.

Without this foundation, we are placing powerful tools in the hands of young people while leaving their teachers underprepared and unsupported.
Every day these guidelines are delayed, AI continues to reshape classrooms without oversight. The choice facing our educational leaders is clear: act now, or take responsibility for the chaos that may follow.