TL;DR: August’s edition examines how companies are eliminating graduate jobs while redirecting recruitment budgets to failing AI pilots. From ancient rhetoric becoming an essential survival skill to pre-social media university life, this issue explores why authentic human connection is our ultimate competitive advantage.

Image created on Midjourney
The future
“First, we made our jobs robotic. Then we trained the robots how to do them. If AI takes your job, it won’t be because it’s so smart. It will be because over time we’ve made work so narrow, so repetitive, so obsessed with minimising variance and deferring to metrics, that it’s perfectly designed for machines.”
Tom Goodwin’s observation, posted on LinkedIn in mid-August, feels remarkably prescient as thousands of teenagers prepare for university. When I interviewed the business transformation consultant, bullshit caller / provocateur, and media expert in 2022, following an update to his book Digital Darwinism, he warned about a looming leadership crisis. That crisis is now playing out in ways he probably didn’t fully anticipate.
The timing of the post couldn’t be more brutal. It’s been 25 years since I received my A-level results. Yet I can still easily recall the pre-reveal trepidation, followed by relief that I’d scraped the grades to study English Literature and Language at the University of St Andrews (as a peer of Prince William – more on this below, including a 20-year-old graduation picture).
What a thrilling time of year it should be: the end of school, then taking the next step on life’s magical journey, be it straight to university without passing go, a gap year working and then travelling, or eschewing higher education to begin a career.
I wonder how this year’s A-level leavers feel, given they’re walking into the most uncertain graduate job market in a generation. The promises made to them about university – to study hard, earn a degree, and secure a good job – are crumbling in real time.
Data from job search engine Adzuna suggests that job postings for entry-level positions requiring a degree have dropped by almost two-thirds in the UK since 2022, roughly double the decline for all entry-level roles (as quoted in the Financial Times). The same research found that entry-level hiring is down 43% in the US and 67% in the UK since ChatGPT launched in November 2022.
The study results tally with other sources. In June, for instance, UK graduate job openings had plunged by 75% in banking and finance, 65% in software development, and 54% in accounting compared to the same month in 2019, according to Indeed (also in the FT piece).
Meanwhile, students graduating from universities in England in 2025 have an average student loan debt of approximately £53,000, with total outstanding loans reaching £267 billion. Frankly, is university worth it today?
I was fortunate enough to be part of the last cohort to benefit from minimal tuition fees in Scotland before they were introduced for all students in the 2005-6 academic year. Further, when I studied for my postgraduate degree in magazine journalism at Cardiff University’s JOMEC, we were (verbally and anecdotally) guaranteed jobs within a year; as far as I know, all my peers achieved that. Such certainty feels alien now, even quaint.
But where does this trend lead? What happens when an entire generation faces systematic exclusion from entry-level professional roles?
A Stanford University study tracking millions of workers through ADP payroll data revealed something rather more troubling: young workers aged 22-25 in “highly AI-exposed” jobs experienced a 13% employment decline since OpenAI released its LLM just under three years ago, while older colleagues in identical roles saw steady or rising employment.
Arguably, we’re witnessing the first generation of graduates for whom machines are genuinely better at the very tasks their universities trained them to do.
Erik Brynjolfsson, one of the Stanford paper’s co-authors (and a professor whom I interviewed a couple of months after ChatGPT was unveiled – even back then he was warning about the likely problems with AI acceleration and jobs), put it bluntly: “There’s a clear, evident change when you specifically look at young workers who are highly exposed to AI.”
The research controlled for obvious alternatives — COVID effects, tech sector retrenchment, interest rate impacts — and the correlation held. Software developers and customer service agents under the age of 25 saw dramatic employment drops. Home health aides, whose work remains both physically and emotionally demanding, saw employment rise.
The distinction matters. AI isn’t replacing workers at random; it’s targeting specific types of work. The Stanford team found that occupations where AI usage is more “automative” (completely replacing human tasks) showed substantial employment declines for young people. In contrast, “augmentative” uses (where humans collaborate with AI) showed no such pattern.
Anthropic CEO Dario Amodei warned in May that half of “administrative, managerial and tech jobs for people under 30” could vanish within five years. He’s probably being conservative.
However, what’s especially troubling about this shift is that new MIT research, The GenAI Divide: State of AI in Business 2025, suggests that many AI deployment programmes are failing to deliver expected returns on investment, with companies struggling to show meaningful productivity gains from their technology spend. Specifically, 95% of generative AI pilots at companies are failing, delivering next to no return on investment.
Despite this, organisations continue redirecting budgets from graduate recruitment to AI initiatives. Half of UK companies now want to redirect money from staff to AI, according to Boston Consulting Group research.
This creates a dangerous paradox: companies are cutting the graduate pipeline that develops future leaders while betting on technologies that haven’t yet proven their worth. What happens to organisational capability in five years, when the cohort of junior professionals who should be stepping into senior roles doesn’t exist, or when those who are in the job market lack meaningful experience?
This connects directly to Tom Goodwin’s observation. The combined forces of consulting culture, efficiency obsessions, and metric-driven management have reshaped roles once built on creativity, empathy, relationships, imagination, and judgment into “checklists, templates, and dashboards”. We stripped away the human qualities that made work interesting and valuable, creating roles “perfectly designed for machines and less worth doing for humans”.
Consider those entry-level consulting, law, and finance roles that have vanished. They were built around tasks like document review, basic data analysis, research synthesis, and report formatting – precisely the narrow, repetitive work at which large language models excel.
Yet amid this disruption, there are signals of adaptation and hope. During a recent conversation I had with Joanne Werker, CEO of the people engagement company Boostworks, she shared statistics and personal insights that capture both the challenges and the opportunities facing this generation. Her organisation’s latest research, published in late July, indicates that 57% of Gen Z and 71% of Millennials are exploring side hustles, not as passion projects, but to survive financially. Taking a positive view of this situation, one could argue that this will be a boon for innovation, given that necessity is the mother of invention.
Also noteworthy is that nearly one in five Gen Zers is already working a second job. Joanne’s daughters illustrate a different relationship with AI entirely. One, aged 30, works in music, while the other, 24, is in fashion, both creative fields where AI might be expected to pose a threat. Instead, they don’t fear the technology but embrace it strategically. The younger daughter used ChatGPT to redesign their family’s living room, inputting photos and receiving detailed interior design suggestions that impressed even Jo’s initially sceptical husband. As Joanne says, both daughters use AI tools not to replace their creativity, but to “be smarter and faster and better”, for work and elsewhere. “The younger generation’s response to AI is just, ‘OK, this is another thing to make life easier.’”
This strategic approach extends to education. Nikki Alvey, the (brilliant) PR pro who facilitated my conversation with Jo, has children at the right age to observe this transition. Her son, who just completed his A-levels, used AI extensively for revision, creating quizzes, podcasts, and even funny videos from his notes. As Nikki pointed out: “I wish I’d had that when I was doing GCSEs, A-levels, and my degree; we barely had the internet.”
Elsewhere, her daughter, who is studying criminology at the University of Nottingham, operates under different constraints. Her university maintains a blanket ban on AI use for coursework, though she uses it expertly for job applications and advocacy roles. This institutional inconsistency reflects higher education’s struggle to adapt to technological reality: some universities Nikki’s son looked at were actively discussing AI integration and proper citation methods, while others maintain outright bans.
Universities that nail their AI policy will recognise that future graduates need capabilities that complement, rather than compete with, AI. This means teaching students to think critically about information sources.
As I described during a recent conversation with Gee Foottit on St James’s Place Financial Adviser Academy’s ‘The Switch’ podcast: “Think of ChatGPT as an army of very confident interns. Don’t trust their word. They may hallucinate. Always verify your sources and remain curious. Having that ‘truth literacy’ will be crucial in the coming years.”
Related to this, LinkedIn’s Chief Economic Opportunity Officer Aneesh Raman describes this as the shift from a knowledge economy to the “relationship economy”, where distinctly human capabilities matter most.
The Stanford research offers clues about what this means. In occupations where AI augments rather than automates human work – such as complex problem-solving, strategic thinking, emotional intelligence, and creative synthesis – young workers aren’t being displaced.
Success won’t come from competing with machines on their terms, but from doubling down on capabilities that remain uniquely human.
On The Switch podcast episode, which will be released soon (I’ll share a link when it does), I stressed that the future belongs to those – young and old – who can master what I call the six Cs, the skills to dial up:
- Collaboration
- Communication
- Compassion
- Courage
- Creativity
- Curiosity
These are no longer soft skills relegated to HR workshops but survival capabilities for navigating technological disruption.
There’s a deeper threat lurking, though. The real issue isn’t that younger generations are AI-literate while their elders struggle with new technology; it’s whether any of us understand how to maintain our humanity while leveraging these tools.
No doubt nurturing the six Cs will help, but a week or so ago Microsoft’s AI chief, Mustafa Suleyman, described something rather more unsettling: “AI psychosis”, a phenomenon in which vulnerable individuals develop delusions after intensive interactions with chatbots. In a series of posts on X, he wrote that “seemingly conscious AI” tools are keeping him “awake at night” because of their societal impact, even though the technology isn’t conscious by any human definition.
“There’s zero evidence of AI consciousness today,” Suleyman wrote. “But if people just perceive it as conscious, they will believe that perception as reality.”
The bitter irony is that the capabilities we now desperately need – namely, creativity, empathy, relationships, imagination, and judgement – are exactly what we stripped out of entry-level work to make it “efficient”. Now we need them back, but we’ve forgotten how to cultivate them at scale.
The generation entering university in September may lack traditional job security, but they possess something their predecessors didn’t: the ability to direct AI while (hopefully) remaining irreplaceably human. And that’s not a consolation prize. It’s a superpower.
The present

On stage with Tomás O’Leary at Origina Week
I tap these words on the morning of August 29 from seat 24F on Aer Lingus EI158 from Dublin to London Heathrow, flying high after a successful 48 hours on the Emerald Isle. A software client, Origina, flew me in. I’ve been assisting the CEO, Tomás O’Leary, with his thought leadership and the company’s marketing messaging for over a year (his words of wisdom on pointless software upgrades and needless infrastructure strain featured in my July newsletter).
Having struck up a bond – not least thanks to reminiscing about our days playing rugby union (we were both No8s, although admittedly I’m a couple of inches shorter than him) – Tomás invited me to participate in Origina Week. This five-day extravaganza mixes serious business development with serious fun and learning.
Tomás certainly made me work for my barbecued supper at the excellent Killashee Spa Hotel: on Thursday, I was on stage, moderating three sessions over three consecutive hours. The final session – the last of the main programme – involved Tomás and me having a fireless “fireside” chat about technology trends as I see them, and his reflections on their relevance to the software space.
I was grateful to make some superb connections, be welcomed deeper into the bosom of the Origina family, and hear some illuminating presentations, especially behavioural psychologist Owen Fitzpatrick’s session on the art of persuasion.
Watching Owen work was a masterclass in human communication that no AI could replicate. For 90 minutes, around 250 people from diverse countries and cultures were fully engaged, leaning forward, laughing, and actively participating. This was neural coupling in action: the phenomenon where human brains synchronise during meaningful interaction. No video call, no AI assistant, no digital platform could have generated that energy.
This is what Tomás understood when he invested in bringing his global team together in the Irish capital. While many executives slash training budgets and rely on digital-only interactions, he recognises that some learning only happens face-to-face. That’s increasingly rare leadership in an era where companies are cutting human development costs while pouring billions into AI infrastructure.
Owen’s session focused on classical rhetoric: the ancient art of persuasion, which has become increasingly relevant in our digital age. He walked us through the four elements: ethos (credibility), logos (logic), pathos (emotion), and demos (understanding your audience). These are precisely the human skills we need as AI increasingly handles our analytical tasks.
It was a timely keynote. Those who have completed their A-levels this summer are entering a world where the ability to persuade, connect, and influence others becomes more valuable than the ability to process information.
Yet we’re simultaneously experiencing what recent research from O.C. Tanner calls a recognition crisis. Its State of Employee Recognition Report 2025 found that UK employees expect in-person interactions with recognition programmes to increase by 100% over the next few years, from 37% to 74%. These include handwritten notes, thank you cards, and award presentations. People are craving authentic human interaction precisely because it’s becoming scarce.
Recent data from Bupa reveals that just under a quarter (24%) of people feel lonely or socially isolated due to work circumstances, rising to 38% among 16-24-year-olds. Over a fifth of young workers (21%) say their workplace provides no mental health support, with 45% considering moves to roles offering more social interaction.
Also, new research from Twilio reveals that more than one-third of UK workers (36%) demand formally scheduled “digital silence” from their workplace. Samantha Richardson, Twilio’s Director of Executive Engagement, observed: “Technology has transformed how we work, connect, and collaborate – largely for the better. But as digital tools become increasingly embedded in everyday routines, digital downtime may be the answer to combating the ‘always-on’ environment that’s impeding productivity and damaging workplace culture.”
This connects to something that emerged from Owen’s session. He described how the most powerful communication occurs through contrast, repetition, and emotional resonance – techniques that require human judgment, cultural understanding, and real-time adaptation. These are precisely the skills that remain irreplaceable in an AI world.
Consider how Nikki’s son used AI for revision. Rather than passively consuming information or getting out the highlighter pens and mapping information out on a big, blank piece of paper (as I did while studying, and still do sometimes), he actively prompted the technology to create quizzes, podcasts, and videos tailored to his learning style. This was not AI replacing human creativity, but human creativity directing AI capabilities.
The challenge for today’s graduates isn’t avoiding AI, but learning to direct it purposefully. This requires exactly the kind of critical thinking and creative problem-solving that traditional education often neglects in favour of information retention and standardised testing.
What’s particularly striking about the current moment is how it echoes patterns I’ve observed over the past year of writing this newsletter. In June 2024’s edition, I explored how AI companions were already changing human relationships. I’ve also written extensively about the “anti-social century” and our retreat from real-world connection. Now we’re seeing how these trends converge in the graduate employment crisis: technology is doing more than just transforming what we do. It is also changing how we relate to each other in the process.
On this subject, I’m pleased to share the first of a new monthly podcast series I’ve begun with long-term client Clarion Events, which organises the Digital Transformation Expo (DTX) events in London and Manchester. The opening episode of DTX Unplugged features Nick Hodder, Director of Digital Transformation and Engagement at the Imperial War Museums (IWM), highlighting why meaningful business transformation begins with people, not technology.
The answer, whether in a hotel conference room in Dublin or a corporate office in Manchester, remains the same: in a world of AI, our ability to connect authentically with other humans has become our competitive edge.
The past
Twenty-four years ago in September, I sat in my first tutorial at the University of St Andrews — nine students around a table, including Prince William and seven young women. That tutorial room held particular energy. We were there to think, question, argue with texts and each other about ideas that mattered. Will, who played for my Sunday League football team, was just another student.
The economic backdrop was fundamentally different. Graduate jobs were plentiful, and social media was (thankfully) nascent – Facebook was still three years away when I arrived and only mildly registered in my final year, 2004-05 – so partying with peers was authentic, free from fears of being digitally damned. Moreover, the assumption that a degree led to career success felt unshakeable because it was demonstrably true.
The social contract was clearer, too. Society invested in higher education as a public good that would generate returns through increased productivity, innovation, and civic engagement. Students could focus on learning rather than debt management because the broader community bore the financial risk in exchange for shared benefits.

My graduation day at the University of St Andrews in 2005
Looking back, what strikes me most is the simplicity of the intellectual environment. We read physical books, researched in libraries using card catalogues, and didn’t have any digital devices in the lecture halls or tutorial rooms. (And the computers we had in our rooms took up a colossal amount of space.) Our critical thinking developed through friction: the effort required to find information, synthesise arguments from multiple sources, and express ideas clearly without technological assistance.
Knowledge felt both scarce and valuable precisely because it was hard to access. You couldn’t Google historical facts during seminars. If you hadn’t done the reading, everyone knew. If your argument was poorly constructed, there was nowhere to hide. The constraints forced genuine intellectual development.
The human connections formed during those four years proved more valuable than any specific subject knowledge. Late-night debates in residence halls, study groups grappling with challenging texts, and casual conversations between lectures – these experiences shaped how we thought and who we became.
We could explore medieval history, philosophical arguments, or literary criticism without worrying whether these subjects would directly translate to career advantages. The assumption was that broad intellectual development would prove valuable, even if connections weren’t immediately obvious. (Again, I was fortunate to be in the last cohort of subsidised university education.)
That faith in indirect utility seems almost lost now. Today’s students, facing massive debt burdens, quite reasonably demand clear pathways from educational investment to career outcomes. The luxury of intellectual exploration for its own sake becomes harder to justify when each module costs hundreds – if not thousands – of pounds.
Some elements remain irreplaceable. The structured opportunity to develop critical thinking skills, build relationships with peers and mentors, and discover intellectual passions in supportive environments still offers unique value.
Indeed, these capabilities matter more now than they did a quarter of a century ago. When information is abundant but truth is contested, when AI can generate convincing arguments on any topic, and when economic structures are shifting rapidly, the ability to think independently becomes genuinely valuable rather than merely prestigious.
My 10-year-old son will reach university age by 2033. By then, higher education will have undergone another transformation. The economics might involve shorter programmes, industry partnerships, apprenticeship alternatives, or entirely new models that bypass traditional degrees. But the fundamental question remains unchanged: how do we prepare young minds to think independently, act ethically, and contribute meaningfully to society?
The answer may require reimagining university education entirely. Perhaps residential experiences focused on capability development rather than content transmission. Maybe stronger connections between academic learning and real-world problem-solving. Possibly more personalised pathways that recognise different learning styles and career ambitions. What won’t change is the need for structured environments where young people can develop their humanity while mastering their chosen fields of expertise.
The students who opened their A-level results last month deserve better. They deserve educational opportunities that develop their capabilities without crushing them with debt. They deserve career pathways that use their human potential rather than forcing them to compete with machines on machine terms. Most importantly, they deserve honest conversations about what higher education can and cannot provide in an age of technological disruption.
Those conversations should start with acknowledging what that tutorial room at St Andrews represented: human minds engaging directly with complex ideas, developing intellectual courage through practice, and building relationships that lasted decades (although my contact with Prince Will mysteriously stopped after I began working at the Daily Mail Online!).
These experiences – whether at university or school, or elsewhere – remain as valuable as ever. The challenge is whether we can create sustainable ways to provide them without bankrupting the people who need them most.
Statistics of the month
🎓 A-level computing drops
Computing A-level entries fell by 2.8% in the UK despite years of growth, though female participation rose 3.5% to reach 18.6% of students taking the subject. Meanwhile, maths remains the most popular subject, with 112,138 students, but girls represent just 37.3% of the cohort. 🔗
👩‍💼 AI skills gender divide widens
Only 29% of women report having AI skills compared to 71% of men, while nearly 70% of UK jobs face high AI exposure. Under half of workers have been offered AI-related upskilling opportunities. 🔗
💰 Office return costs surge
UK employees spend an average of £25 daily on commuting and expenses when working from the office, potentially costing nearly £3,500 annually in commuting alone if expected to be in the office five days a week. 🔗
🏢 Summer hiring advantage emerges
Some 39% of UK businesses have struggled to hire in the last 12 months, with competition and slow hiring cited as key barriers. 🔗
🌍 Extreme poverty redefined
The World Bank raised its International Poverty Line from $2.15 to $3 per day, adding 125 million people to extreme poverty statistics. However, global extreme poverty has still fallen from 43% in 1990 to 10% today. 🔗
Thank you for reading Go Flux Yourself. Subscribe for free to receive this monthly newsletter straight to your inbox.
All feedback is welcome, via oliver@pickup.media. If you enjoyed reading, please consider sharing it via social media or email. Thank you.
And if you are interested in my writing, speaking and strategising services, you can find me on LinkedIn or email me using oliver@pickup.media.