The Quiet Quitting of Learning: Why Your Training Programs Have an Engagement Crisis
There’s a pattern I’ve been seeing across organisations, and it’s starting to worry me. Employees attend training programs. They tick the completion boxes. They might even pass the assessments. But they’re not learning. Not really.
I’m calling it the quiet quitting of learning, and it’s happening more widely than most L&D teams realise.
The Numbers Don’t Lie
Let’s look at the data. LinkedIn’s 2026 Workplace Learning Report found that while training hours per employee increased by 12% in 2025, self-reported skill confidence only improved by 3%. That gap tells a story. People are spending more time in learning environments without feeling meaningfully more capable.
In my conversations with L&D professionals across Australia, a consistent picture emerges. Completion rates are high, sometimes above 90%, because organisations tie them to performance reviews or compliance requirements. But application rates (the percentage of people who actually use what they learned on the job) hover between 15% and 25%.
We’re creating a compliance theatre of learning, and it’s costing organisations in ways they haven’t calculated.
Why It’s Happening
The causes are layered, but a few stand out:
Training overload. The average employee in a mid-to-large Australian organisation is now expected to complete 40-60 hours of training per year. That’s up from about 25 hours five years ago, driven largely by the explosion of AI literacy programs stacked on top of existing compliance, safety, and professional development requirements. People are overwhelmed, and the rational response to overwhelm is to disengage while appearing to comply.
Irrelevance, perceived or real. When an employee in accounts payable is required to complete a generic “Introduction to AI” course that uses examples from software engineering, they check out. Not because they’re uninterested in AI, but because nobody bothered to make the training relevant to their actual work.
Bad timing. Pushing training during peak work periods guarantees surface-level engagement at best. I’ve seen organisations launch major upskilling initiatives in Q4, right when every team is scrambling to hit year-end targets. The training gets done, technically. But nothing sticks.
No follow-through. Even good training fails when there’s no mechanism for applying it. If someone completes an excellent AI prompt engineering course and then returns to a role where AI tools aren’t available or aren’t approved, what was the point?
The Engagement Signals You’re Missing
Most L&D platforms track completion and time spent. Those are the wrong metrics. Here’s what to look for instead:
Voluntary re-engagement. Are people coming back to the learning content after the initial requirement? If not, the content wasn’t valuable enough to revisit.
Peer sharing. Are employees recommending training resources to colleagues? This is one of the strongest signals that learning is actually landing.
Question quality. In live training sessions, are people asking clarifying questions that show genuine processing, or are they sitting silent until the session ends?
Application attempts. Even failed attempts to apply new skills are positive signals. If nobody’s even trying to use what they learned, the training isn’t creating motivation to change behaviour.
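To make these signals measurable, here is a minimal sketch in Python. The event log, field names, and event types are entirely hypothetical (invented for illustration, not drawn from any particular LMS), but most platforms can export something similar, and the same four signals can be computed from whatever schema yours provides:

```python
# Hypothetical LMS event log: (employee_id, course_id, event_type, day).
# Event types are assumptions for this sketch: "complete" (required finish),
# "revisit" (voluntary return after completion), "share" (recommended to a
# colleague), "apply_attempt" (tried to use the skill on the job).
events = [
    ("e1", "ai-101", "complete", 10),
    ("e1", "ai-101", "revisit", 24),
    ("e2", "ai-101", "complete", 11),
    ("e3", "ai-101", "complete", 12),
    ("e3", "ai-101", "share", 13),
    ("e3", "ai-101", "apply_attempt", 20),
]

def engagement_signals(events, course_id):
    """Rates of each engagement signal among employees who completed the course."""
    completed, revisited, shared, applied = set(), set(), set(), set()
    for emp, course, etype, _day in events:
        if course != course_id:
            continue
        if etype == "complete":
            completed.add(emp)
        elif etype == "revisit":
            revisited.add(emp)
        elif etype == "share":
            shared.add(emp)
        elif etype == "apply_attempt":
            applied.add(emp)
    n = len(completed) or 1  # guard against division by zero
    return {
        "completions": len(completed),
        "voluntary_reengagement_rate": len(revisited & completed) / n,
        "peer_sharing_rate": len(shared & completed) / n,
        "application_attempt_rate": len(applied & completed) / n,
    }

print(engagement_signals(events, "ai-101"))
```

The point of the sketch is what it ignores: completion counts appear only as the denominator. Everything worth watching is in the numerators.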
What Good Looks Like
The organisations doing this well share some common characteristics:
They make training optional where possible. This sounds counterintuitive, but optional training that’s genuinely useful gets better engagement than mandatory training that’s merely checked off. Keep compliance training mandatory, obviously. But for skills development, give people agency.
They connect training to real projects. One financial services firm I work with assigns a “learning project” alongside each training program. Complete the AI literacy module, then use what you learned to improve one real process in your team. The learning project is what gets reviewed, not the module completion.
They respect time. The best L&D teams I know have an explicit principle: we won’t ask for more than 30 minutes of anyone’s time without a clear and immediate payoff. Microlearning isn’t just a trend. It’s a recognition that attention is finite and precious.
They measure behaviour change, not completion. This is harder and more expensive to measure, but it’s the only metric that matters. Did the training change how people work? If you can’t answer that question, your measurement framework needs work.
The Hard Conversation
Here’s the part that’s uncomfortable for L&D professionals: some of this is our fault. The industry has been so focused on content creation and platform deployment that we’ve neglected the fundamentals of adult learning. Adults learn when they see immediate relevance, when they have autonomy, and when they can connect new knowledge to existing experience.
If your training programs don’t account for those principles, you can have the fanciest learning platform in the world and people will still quietly disengage.
The fix isn’t more training. It’s better training, delivered at the right time, in the right context, with the right follow-through. And sometimes it’s less training, focused on fewer things that actually matter.
The quiet quitting of learning is a signal, not a sentence. It’s telling us that our current approach isn’t working. The question is whether we’re willing to listen.
Michelle Torres is an L&D strategist focused on workforce transformation and capability building.