The promise was efficiency. The reality is exhaustion.
Here’s what we were told: AI will make your job easier. It will handle the tedious tasks. It will free you up to do more meaningful work. You’ll be more productive, more creative, more strategic. AI is a tool, not a replacement: it augments human intelligence rather than replacing it.
And in some ways, that’s true. AI can draft emails in seconds. It can analyze data faster than any human. It can generate first drafts, summarize documents, create basic designs. It’s genuinely impressive technology.
But here’s what nobody warned us about: when your company adopts AI and it actually works, you don’t get to do less work. You get to do twice as much.
A recent Harvard Business Review study, conducted over eight months at a U.S. tech company with about 200 employees, confirms what many of us are experiencing: AI use did not shrink work; it intensified it and made employees busier. The study found that tasks expanded because AI filled gaps in knowledge, so people started doing work that used to belong to other roles or would have been outsourced or deferred. That shift created extra coordination and review work for specialists, including fixing AI-assisted drafts and coaching colleagues whose work was only partly correct or complete.
Even more telling: boundaries blurred because starting work became as easy as writing a prompt, so work slipped into lunch, meetings, and the minutes right before stepping away. Multitasking rose because people ran multiple AI threads at once and kept checking outputs, which increased attention switching and mental load. Over time, this faster rhythm raised expectations for speed through what became visible and normal, even without explicit pressure from managers.
Because now that you can draft a report in an hour instead of four, you’re expected to produce four reports. Now that you can analyze a dataset in minutes instead of days, you’re expected to analyze all the datasets. Now that AI can generate a first draft, you’re expected to produce ten first drafts, edit all of them, and still add the “human touch” that makes them actually usable.
The efficiency isn’t making our jobs easier. It’s just raising the baseline of what’s considered acceptable output. And we’re burning out trying to meet it.
The Productivity Paradox Nobody Talks About
There’s a well-documented phenomenon in economics called the productivity paradox: when you introduce technology that should make workers more productive, overall productivity doesn’t increase proportionally, and sometimes worker wellbeing actually decreases.
We saw this with email. We saw it with smartphones. We saw it with Slack and Teams and every “efficiency tool” that promised to streamline communication but actually just made us available 24/7.
And now we’re seeing it with AI.
The problem isn’t the technology itself. The problem is the assumption that efficiency gains should immediately translate to increased output rather than improved work-life balance, reduced hours, or simply less stressful workdays.
When a company adopts AI and discovers that tasks that used to take four hours now take one hour, the response isn’t “great, now you can work four-hour days” or “great, now you have time for deep work and strategic thinking.” The response is “great, now you can do four times as much.”
The efficiency is captured by the company, not the worker. You’re more productive, which means you’re more valuable, which means they can extract more from you. The savings don’t go to you. They go to the bottom line.
The Human Touch Nobody Wants to Pay For
Here’s the other thing: AI output isn’t actually finished work. It’s a starting point.
AI can generate a draft, but it can’t understand nuance, context, audience, or intention the way a human can. It hallucinates facts. It produces generic, soulless prose. It misses cultural sensitivities. It lacks judgment.
Research from BetterUp Labs and Stanford found that 41% of workers have encountered what they call “workslop”: low-quality AI-generated content that appears polished but lacks real substance. Each instance costs nearly two hours of rework, creating downstream problems for productivity, trust, and collaboration. Workers described opening AI-generated documents and feeling first confusion, then frustration, wondering whether the sender had simply used AI to generate large blocks of text instead of thinking it through.
So yes, AI can produce something in minutes that would have taken hours. But then you have to spend time editing it, fact-checking it, rewriting the parts that sound robotic, adding the insights and analysis that actually matter, making it sound like something a human with expertise would write.
You’re not doing less work. You’re doing different work, and often, it’s more cognitively demanding work because you’re quality-controlling AI output while also doing your actual job.
The promise was that AI would handle the grunt work so you could focus on the high-level thinking. The reality is that AI does the grunt work badly, so now you’re doing the high-level thinking AND cleaning up after AI AND you’re expected to do it all at twice the speed because “AI makes you more efficient.”
When “Augmented Intelligence” Becomes “Doubled Workload”
The scale of this problem is growing rapidly. AI adoption has exploded across industries; companies are racing to integrate these tools into every aspect of work. But nobody is measuring the human cost.
Let’s be specific about what this looks like in practice:
Marketing and content creation: You used to write three blog posts a week. Now AI can generate drafts, so you’re expected to produce ten, and still edit all of them, ensure brand voice consistency, fact-check claims, optimize for SEO, add original insights, and make them actually worth reading.
Customer service: You used to handle thirty support tickets a day. Now AI can draft responses, so you’re expected to handle eighty, and still personalize each one, de-escalate frustrated customers, solve complex problems AI can’t understand, and make sure the AI didn’t say something wildly inappropriate.
Design and creative work: You used to create five concepts for a client. Now AI can generate variations instantly, so you’re expected to present twenty options, and still do the actual creative thinking, understand the client’s needs, refine the AI slop into something usable, and explain why most of the AI generations are terrible.
Legal, medical, and technical fields: AI can research faster, but you’re still responsible for the accuracy, the judgment calls, the ethical implications, the human expertise that AI cannot replace. Except now you’re expected to see twice as many clients, review twice as many cases, handle twice as many projects, because AI “helps.”
The through line: the time you save with AI doesn’t become your time. It becomes your employer’s time, and they fill it with more work.
The Burnout Is Just Repackaged
We’ve known for years that overwork leads to burnout. We’ve known that constant context-switching damages productivity. We’ve known that trying to do more with less creates stress, anxiety, health problems, and turnover.
Research from the World Health Organization has classified burnout as an occupational phenomenon characterized by energy depletion, mental distance from one’s job, and reduced professional efficacy. It’s caused by chronic workplace stress that hasn’t been successfully managed.
So what happens when AI is introduced as a solution to efficiency, but the result is that workers are expected to produce exponentially more in the same amount of time? What happens when the “help” just raises the bar?
You get the same burnout. Just faster.
The exhaustion of having to do twice as much work doesn’t go away because AI is involved. If anything, it’s worse, because now there’s this expectation that you should be able to keep up, because “technology is helping you.” If you can’t handle the increased workload, clearly you’re not leveraging AI effectively enough. Clearly you’re the problem, not the unrealistic expectations.
The Lie of the “AI-Augmented Worker”
The narrative around AI in the workplace is always about augmentation, not replacement. “We’re not replacing you; we’re giving you tools to be more effective!”
And on some level, that’s true. Most companies aren’t firing their workforce and replacing them with ChatGPT. They’re keeping their workers and giving them AI tools.
But what they’re not saying is that “AI-augmented worker” often means “one person doing the job of three people, with AI making it technically possible even if it’s unsustainable.”
The Harvard Business Review research revealed a critical disconnect: senior leaders, often insulated from the day-to-day mechanics of AI-assisted work, tend to view the technology through the lens of output metrics (more reports generated, more emails sent, more code written). Meanwhile, employees on the ground described feeling like “quality-control inspectors for an unreliable but prolific junior colleague who never sleeps.” The cognitive load didn’t decrease; it shifted to a new and uniquely draining form of vigilance.
We’re being sold the idea that we’re more empowered, more capable, more valuable. What we’re actually experiencing is more work, more pressure, more expectation, and the same or sometimes worse compensation and working conditions.
Because here’s the thing about AI making you “more productive”: productivity gains that benefit the company don’t automatically benefit you. Your salary doesn’t double because your output doubles. Your workload just doubles. And when you can’t keep up with the new expectations, you’re the one who’s failing, not the system that’s demanding the impossible.
What Employers Get Wrong
If you’re a manager, executive, or business owner reading this, this is for you.
Adopting AI and immediately increasing workload expectations is not a sustainable strategy. It’s a short-term productivity boost that will lead to long-term problems: burnout, turnover, decreased quality of work, loss of institutional knowledge, and a workforce that resents the tools they’re supposed to be grateful for.
Here’s what AI adoption should look like if you actually care about your employees:
Use efficiency gains to improve quality of life. If AI cuts a task from four hours to one, let your employee use those three hours for something else: deep work, professional development, strategic thinking, or simply not working. Don’t immediately fill them with three more tasks.
Maintain realistic expectations. Just because something is technically possible doesn’t mean it’s sustainable. One person producing the output of three people isn’t impressive; it’s a recipe for burnout.
Invest in training and transition time. Learning to work with AI effectively takes time. Learning to prompt correctly, to fact-check output, to integrate it into workflows: that’s a skill. Don’t just drop AI tools on your team and expect them to figure it out while maintaining their existing workload.
Pay people for increased productivity. If AI is making your workers significantly more productive and your company is capturing those gains, compensate them for it. Profit-sharing, bonuses, raises: something that acknowledges that they’re creating more value.
Recognize that AI doesn’t replace human expertise. The person using AI to draft a document still needs to understand the subject matter, check for accuracy, add nuance, and ensure quality. That expertise is valuable. Don’t devalue it just because AI exists.
Monitor workload and burnout indicators. If you’ve adopted AI and your team is suddenly working longer hours, missing deadlines, showing signs of stress, or experiencing higher turnover, the AI isn’t helping. You’re just asking for too much.
What Workers Need to Do
If you’re an employee dealing with increased workload expectations post-AI adoption, here’s what you need to know:
You’re not imagining it. If it feels like you’re working harder since AI was introduced, you probably are. You’re not failing to leverage the technology. The expectations are genuinely unrealistic.
You don’t have to match the machine. AI doesn’t get tired. AI doesn’t need boundaries. AI doesn’t have a life outside work. You do. Just because AI can generate output 24/7 doesn’t mean you should try to keep up with it.
Document your workload. Track what you’re being asked to do, how long it actually takes, what’s realistic vs. what’s not. When you’re asked to do more, you need data to push back with.
Set boundaries. If the expectation is that AI makes you so efficient you can handle double the work, and that’s not sustainable for you, say so. “This workload isn’t sustainable” is a complete sentence.
Demand compensation for increased output. If your productivity has doubled, your compensation should reflect that. Don’t let your company capture all the gains.
Find collective power. One person pushing back is easier to ignore than an entire team. Talk to your colleagues. Are they experiencing the same thing? Can you advocate together?
The goal is not to reject AI. The goal is to refuse to let AI become an excuse for exploitation.
The Future We Should Be Building
AI has the potential to genuinely improve our working lives. It could reduce drudgery, handle tedious tasks, free us up for creative and strategic work, and give us more time for the parts of our jobs we actually find meaningful.
But that only happens if we make deliberate choices about how we implement it.
If we choose to use AI to extract more work from fewer people, we’re just accelerating burnout with better technology. If we choose to use AI to improve quality of life (shorter work weeks, more time for deep work, less stress, better work-life balance), then it becomes the transformative tool we were promised.
Right now, we’re choosing the former. And we’re all suffering for it.
The question is: does it have to be this way? Or can we build something better?
Because the technology isn’t the problem. It’s what we’re choosing to do with it.
And right now, we’re choosing exhaustion.
PSA and quick mental health check: If you just read this entire article nodding along furiously while simultaneously responding to Slack messages, editing an AI-generated document, and attending a Zoom meeting on mute, you might want to close your laptop and take a walk. Seriously. We’ll still be here when you get back. The work will wait. Your nervous system won’t.