There Are Engineering Skills Hiding in the Arts Department
- Bartholomew Lind
- 6 days ago
- 9 min read
Why the skills we’ve historically treated as “soft” are becoming the most powerful levers of the AI age.
Spend any time around schools, universities, teacher training, or early-career recruitment at the moment and you’ll feel it: a strange quiet panic, an uneasy hum vibrating under everything. Education has always been a game of aiming at a moving target, but right now it feels like that target is moving and the ground under your feet is sliding sideways.
There’s a dizzy inevitability to AI — the sense that this technology is here, it’s everywhere, and it’s clearly important… yet no one quite knows how to think about it. We’ve got inevitability with a healthy serving of existential uncertainty piled on top. Every week brings a new breakthrough, a new “GPT-plus-plus-turbo-ultra-max,” a new think-piece announcing utopia or doom. Every sector has an opinion, every pundit has a hot take, and every parent hears something different on their favourite podcast.
So instead of adding to the noise, this post attempts something calmer and oddly radical: a principle we can (and urgently need to) stand on. A simple, durable idea for deciding which skills we should be teaching, learning, and valuing - regardless of how the technology evolves.
Literacy, communication, and critical thinking - the core skills traditionally taught in the Arts - will become more valuable, not less, in the age of AI.
Not because we’re saving Shakespeare from the robots. Not because I’m trying to sneak more philosophy into your child’s curriculum like a spinach leaf in a smoothie. But because this conclusion falls out naturally from one of the most useful ways to understand AI:
AI isn’t just new software. It’s a new user interface for computing: a shift from coding to natural language.
That lens matters because it’s:
universal (it applies to everyone)
durable (this shift can’t be undone)
familiar (we’ve seen interface revolutions before)
Before we dive in, two caveats (because what’s an Arts-adjacent argument without some caveats?).
This isn’t a brand-new discovery.
Scholars, technologists, and employers have been hinting at this already. McMaster University philosopher Johannes Steizinger writes, “more than ever, we need citizens who have learned to think for themselves and developed the capacity to critically assess the complex challenges in the world.” The conversation is happening - it’s just not reaching the people who need it most: students, teachers, and those making career choices today.
I’m biased.
I’m writing from a Humanities background, shaped by teachers and artists who genuinely believe the world is made better through critical thought, curiosity, and interpretation. That passion can come with a bias. When people like the philosopher above argue that thinking deeply is the key to a meaningful, capable life, they’re often describing the life they’ve found value in. It’s a bit of what Nietzsche warned about: philosophers imagining that the “good life” looks suspiciously like their own.
Then again, the only reason I can spot that bias is because I vividly remember a stuffy summer lecture where my first philosophy professor - with a healthy drop of Nietzsche’s madness in his eyes - described this very issue. There is a chance we overestimate the importance of our discipline because it shaped us. But I truly believe the logic outlined here stands on its own feet; it makes sense regardless of who is making the argument.
This isn’t about replacing STEM. It’s about recognising that literacy and critical thinking are now infrastructure. They enable every other discipline to harness AI effectively.
With that, let’s get to the heart of it.
Natural Language: The New User Interface
We have spent decades teaching people that computers require specialised languages. SQL, Python, JavaScript, C++, R. To talk to a machine, you had to speak machine. Now the most powerful software on Earth responds to: English. Te reo Māori. Mandarin. Spanish. Txt-speak. Emojis, sometimes.
AI tools can interpret, analyse, manipulate, generate, and act using nothing but plain language instructions. That’s why MIT describes prompting as “a machine you are programming with words.” You don’t need to know how the engine works - only how to drive it, with language as the steering wheel.
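The interface shift can be made concrete by putting the two side by side: the same request expressed in a query language and as a plain-language instruction. This is an illustrative sketch only - the `ask()` helper below is a hypothetical stand-in for a call to whatever chat API you happen to use, not a real library.

```python
# Two interfaces to the same capability.
# The old interface: to talk to a machine, you had to speak machine.
sql = "SELECT region, SUM(revenue) FROM sales GROUP BY region ORDER BY 2 DESC;"

# The new interface: the instruction itself is the program.
prompt = (
    "From the attached sales data, total the revenue for each region "
    "and list the regions from highest to lowest earner."
)

def ask(instruction: str) -> str:
    """Hypothetical stand-in for a chat-model call. A real version
    would send `instruction` to an API and return the model's reply."""
    return f"[model response to: {instruction!r}]"

print(ask(prompt))
```

Both express the same intent; only the second is legible to anyone who can read and write carefully - which is exactly the point.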
This shift is bigger than “AI helps you write emails faster.” It means:
Writing is becoming a technical skill.
Reading critically is becoming cybersecurity for your brain.
Vocabulary is becoming a control panel.
Conversation is becoming a form of programming.
This isn’t hype. It’s a transformation in how we operate technology itself.
Let’s break down why this matters - and why it’s a huge green flag for Arts-department skills.
1) Universality: Everyone Becomes a User
Not everyone will be an AI engineer. But nearly everyone will interact with AI as part of their daily work - just as we all learned to use web browsers, email, and smartphones without being software developers.
Three-quarters of global knowledge workers were already using generative AI by 2024. Eighty-six percent of students were using it for study in that same period. In New Zealand, even primary teachers are joining in, albeit more slowly, with 66% adoption.
Outside productivity? Millions are using AI recreationally - as a companion, a creative toy, a tutor - or consuming AI-generated content without even knowing it.
The takeaway:
If the interface of AI is language, then everyone who can think clearly and write precisely becomes more capable.
Everyone who can’t… becomes increasingly dependent on whatever the model spits out.
Literacy becomes a competitive advantage. Illiteracy becomes a liability.
2) Durability: We Can’t Put This Shift Back in the Box
Many AI debates focus on fragile predictions:
Which models will “win”?
Will regulation slow things down?
How many jobs will disappear?
Will AI destroy civilisation, or just marketing departments?
All important questions - but they miss the durable foundation. Even if:
compute plateaus,
performance stagnates,
the industry consolidates,
or we collectively decide AI should slow down…
We still don’t go back.
We already unlocked the scientific breakthroughs that matter:
Deep learning architectures
Transformer models
Scaled training on massive datasets
Multimodal alignment
Natural language instruction as a control layer
The architectural and technological breakthroughs of “Attention Is All You Need” will not be un-invented. The public’s intuitive discovery - “I can command machines by talking to them” - cannot be un-felt. Even if every tech company went bankrupt tomorrow, someone would rebuild this tech from the knowledge alone.
The genie is out, and it speaks plain English.
This means:
The only long-term difference between people using AI well and badly will be their ability to think, judge, and communicate.
Not their ability to code in six languages or memorise the internal workings of a GPU cluster. There will always be specialists for that. For everyone else, the technical barrier is already melting away - leaving literacy as the differentiator.
3) Familiarity: We’ve Seen Interface Revolutions Before
This isn’t the first time a new interface made technology accessible to millions.
When we moved from command lines to graphical user interfaces, computers stopped being tools for experts and became tools for everyone.
When we moved from low-level programming to high-level languages and compilers, software went from elite craft to global industry.
When we moved from print publishing to the social-media “Publish” button, communication exploded - along with misinformation and attention-hack business models. (A cautionary tale we should probably not repeat.)
Every interface revolution does three things:
Reduces the technical barrier
Expands the number of users
Introduces new risks we didn’t anticipate
Sound familiar?
We know this pattern. It means:
Adoption will be messy but inevitable.
Value won’t emerge from technology alone.
Risks will come from human behaviour, not code.
And history gives us something else useful:
The winners of new interface revolutions are rarely the most technical. They are the ones who understand what these tools are for.
That requires creativity, reasoning, ethics, strategy - in other words, the Humanities.
Vocabulary Is Your Control Panel
Let’s get practical. The better your command of language and concepts, the more powerful you become with AI.
A strategist who knows Porter’s Five Forces can ask for a competitor analysis using that exact lens.
A marketer who understands tone and rhetorical devices can request wry minimalism instead of “friendly.”
An artist can ask for Baroque chiaroscuro with brutalist forms and get something no “draw me a cool building” prompt ever could.
If you can be specific and articulate, you unlock more value.
Zoom out a little and you can see the next layer: someone who understands all three domains - strategy, communication, and creative aesthetics - and is comfortable with cross-contextual thinking, could string them together into a multi-stage system. With the right vocabulary and the ability to think abstractly, that person could orchestrate a phased pipeline of specialised AI assistants: one conducting market research, another analysing competitor positioning, another drafting high-performing creative, and one optimising deployment. Not just a ‘chatbot user’, but something closer to an AI architect - someone who engineers outcomes by engineering language, sequencing, and perspective.
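The “AI architect” idea above can be sketched as a phased pipeline: each stage is a specialised assistant defined by a prompt template, and each template consumes the output of the stage before it. Everything here is illustrative - `run_stage()` is a hypothetical stand-in for a real model call, and the stage prompts are placeholders for the domain vocabulary the paragraph describes.

```python
# A minimal sketch of a multi-stage AI pipeline. Each stage is a
# specialised "assistant" expressed as a prompt template; every
# stage receives the previous stage's output as context.

STAGES = [
    ("market_research", "Survey the market for {topic}. Context: {context}"),
    ("competitor_analysis", "Analyse competitor positioning for {topic}, "
                            "using Porter's Five Forces. Context: {context}"),
    ("creative_draft", "Draft campaign copy for {topic} in a wry, "
                       "minimalist tone. Context: {context}"),
]

def run_stage(prompt: str) -> str:
    """Hypothetical stand-in for a chat-model call. A real version
    would send the prompt to an API and return the model's reply."""
    return f"<output of: {prompt[:40]}...>"

def run_pipeline(topic: str) -> dict:
    """Run each stage in order, feeding every stage the output
    of the one before it."""
    context = "none yet"
    results = {}
    for name, template in STAGES:
        prompt = template.format(topic=topic, context=context)
        context = run_stage(prompt)  # this stage's output feeds the next
        results[name] = context
    return results

report = run_pipeline("electric cargo bikes")
```

Notice that none of the code is difficult; the leverage sits entirely in the templates - in knowing which lens, tone, and sequence to ask for.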
That ability doesn’t come from shortcuts, templates, or “Top 20 Prompts for Marketing Success.” It comes from literacy, critical thinking, and conceptual breadth. In other words: the exact skills Arts education has been teaching for thousands of years.
Without those skills?
You get generic answers.
You fail to critique the output.
You don’t know what to ask next.
You can’t distinguish insight from nonsense.
South Australia’s EdChat pilot with Microsoft showed that the most successful students weren’t the ones who knew the most tech. They were the ones who could articulate what they needed, critically evaluate what they received, and iterate.
UNESCO warns that AI’s confident tone can produce uncritical acceptance of misinformation, leading people to trust incorrect or biased answers simply because they sound smart.
The solution isn’t more tech training. It’s literacy training.
Words aren’t fluff. They’re levers. They’re system commands. They’re quality control.
This is the opposite of what schools have been told for a decade.
The Skills the Economy Is Shouting For
Employers have quietly figured out what our education systems haven’t yet caught up to. The World Economic Forum’s 2025 Future of Jobs Report doesn’t put cloud engineering, Python, or database management at the top of its list. The top skills are:
Analytical Thinking
Creative Thinking
LinkedIn’s 2024 study shows 92% of executives now rank communication and critical reasoning as more valuable than technical expertise. McKinsey’s 2025 report warns that the biggest barrier to AI adoption isn’t cost or capability - it’s “a shortage of critical thinking needed to redesign workflows and decisions around AI.”
Here’s the important part:
Literacy - broadly understood as reading deeply, writing clearly, and thinking critically - is not a nostalgic relic of the pre-AI world. It’s the differentiator of success in the AI world.
Yet the systems we’ve built to train our ākonga, our workforce, and even ourselves don’t reflect that reality. Our policies, curricula, and cultural attitudes still treat Arts subjects as “nice-to-haves,” electives, enrichment activities, or worse: fallback options for students “not suited” to STEM or business. The irony is astounding - we are deprioritising the exact skills that now make technology valuable, ethical, and effective.
This isn’t a soft-skills problem. It’s a design problem.
If we treat reading, writing, and critical analysis as secondary disciplines, we will produce a generation that can access AI, but cannot wield it - a workforce that consumes machine output but cannot challenge, redirect, compose, or orchestrate it.
We risk creating users, not thinkers. Operators, not architects.
But the inverse is incredibly hopeful.
A once-in-a-generation opportunity
We could unlock a new wave of productivity, creativity, and innovation simply by empowering the people already doing the work: the teachers, librarians, music tutors, drama coaches, debate mentors, arts lecturers, writing instructors, historians, and the misfit educators building critical thinkers in classrooms where no one funded their printer ink.
These aren’t “extras.” They are infrastructure.
We have already invested 5,000 years in developing the very skills that AI amplifies - and we do not need to reinvent them. We simply need to value them, design around them, and teach them like they matter economically, civically, and creatively. Because they do.
So what should we do?
Double down on deep reading, discussion, and argumentative writing.
Treat prompt-writing as a form of communication, not a cheat code.
Encourage teams and students to cross disciplines and build conceptual range.
Celebrate fluency, skepticism, creativity, and curiosity as technical assets.
Frame these skills as levers, not luxuries.
These competencies will soon be measured, tested for, and prioritised in hiring - not because they are fluffy or fashionable, but because they materially change the quality of outcomes when working with AI. This cultural shift matters as much as any policy or software upgrade.
This Has to Be the Point
Let’s drop the economics, the job market, and the policy debates for a moment. Here’s the question underneath it all:
What do we actually want humans to do when machines can do so much for us?
We cannot automate spreadsheets so we have more time… to make more spreadsheets. We cannot free workers from drudge tasks only to fill their freed-up calendars with more meetings about being overwhelmed. That’s not innovation. That’s administrative masochism.
The point of technology is not efficiency for efficiency’s sake. It’s freedom:
to think more,
to create more,
to empathise more,
to build meaningful things.
Automation should be the great unlocking - not of productivity alone, but of potential.
AI does not make Arts skills obsolete. It makes them the whole point.
The goal isn’t to produce humans who act like machines.
The goal is to let machines handle the machine work so humans can become more human.
That is the future I’m interested in building - not just software adoption, but a world where technology amplifies curiosity, creativity, literacy, and critical thinking. A world where the Arts aren’t fighting for their place; they’re powering the engine.