Using AI responsibly means knowing when not to use it
By Sam Illingworth, Professor of Creative Pedagogies, Edinburgh Napier University. Published: February 18, 2026
AI and professional responsibility
Educators need to distinguish when AI supports learning and when it substitutes for the cognitive work that produces understanding. Journalists need criteria for evaluating AI-generated content. Healthcare professionals need protocols for integrating AI recommendations without abdicating clinical judgement. This is the work I pursue through Slow AI, a community exploring how to engage with AI effectively and ethically.

The current trajectory of AI development assumes we will all move faster, think less and accept synthetic outputs as a default state. Critical AI literacy resists that momentum.

None of this requires rejecting technology. The Luddites, the textile workers who organised against factory owners across the English Midlands in the early 19th century, were not opposed to progress. They were skilled craftsmen defending their livelihoods against the social costs of automation. When Lord Byron rose in the House of Lords in 1812 to deliver his maiden speech against the frame-breaking bill, which made the destruction of frames punishable by death, he argued these were not ignorant wreckers but people driven by circumstances of unparalleled distress.

The Luddites saw clearly what the machines meant: the erasure of craft and the reduction of human skill to mechanical repetition. They were not rejecting technology. They were rejecting its uncritical adoption. Critical AI literacy asks us to recover that discernment, moving beyond "how to use" toward an understanding of "how to think".

The stakes are not hypothetical. Decisions made with AI assistance are already shaping hiring, healthcare, education and justice. If we lack frameworks to evaluate these systems critically, we outsource judgement to algorithms whose limitations remain invisible.

Ultimately, critical AI literacy is not about mastering prompts or optimising workflows. It is about knowing when to use AI and when to leave it the hell alone.
Sam Illingworth, Professor of Creative Pedagogies, Edinburgh Napier University. This article is republished from The Conversation under a Creative Commons license. Read the original article.