Author’s note: I started writing this post in Dec. 2025 when I decided to launch the site. Even in those few short weeks, it was hard to keep up with the changing direction and even my own sentiment on the topic because of how quickly it’s evolving. This piece is a snapshot in time. Probably a very brief snapshot, but I hope it will at least help you organize your own thoughts on the matter.
There’s a lot of existential crisis going around among us knowledge workers. And hang on, kids, because we’re gonna compartmentalize.
AI, by which I mean generative and agentic technologies (not the self-aware stuff, which scares the Sarah Connor bejesus out of me), has flipped the world on its head, putting the discovery of fire, the Renaissance, the Industrial Revolution, the Internet, and Taylor Swift’s wedding to shame.
Set aside for a moment the head-spinning WTAF consequences of universal AI-ification. You know, the environmental impacts, privacy concerns, rampant nefarious activity, outsourcing of thought and rotting of minds, economic bubble-bursting, job devaluation, and theft of copyrighted works.
(That’s a big compartment. We’ll get back to it.)
Regardless of your stance on it, AI isn’t going away. There’s no putting this horse back in the barn. A new normal will emerge, and future generations will look back on us and wonder how we ever got by with primitive “mobile phones” and “laptops.”
The trouble is, the technology is evolving so fast no one knows what the new normal will look like. We don’t know how to prepare for it nor, at a gut-check level, what it will do to our paychecks. If you’re a knowledge worker, you’re likely waiting for the pink slip to land. Especially if you’re using AI at work and it’s not producing the time savings your boss wants to see.
Why this particular change hurts so much
Humans might grouse and drag their feet a bit, but they’re pretty good at adapting to change. Heck, this is the third total-disruption event I’ve seen in my own career. This one’s different, though, because the speed and breadth cut so deeply into personal identity and the way the world in general works.
My official stance on AI is optimistic pragmatist, which is an amalgamation of the compartments I sort it into. I’m idealistic enough to believe human creativity and ingenuity won’t be lost in the new normal. I’m pragmatic enough to know the entire paradigm of work will shift and those who don’t shift with it will be left behind. And I’m cynical enough to suspect exploitative but monetizable uses will be a distraction to real progress for a while to come. (Looking at you, ChatGPT sexytalk.)
Image and video generators are an example. Exploitative uses are depressingly common: six-fingered slop, deepfakes, pseudo-porn, outright porn, and uncanny-valley ads that put One Weird Thing to shame. Gaming is an interest of mine, and I swear, if I see another female anime character with over-exaggerated assets come through my social feeds despite consistently marking “not interested,” I will scream.
Wait. I think I just fell victim to a rage-bait engagement algorithm. Ugh.
So yeah, there’s that. But these tools are also enabling creative works that are downright amazing. The creator isn’t putting brush to canvas, but it’s still an art form to craft the references and prompts and workflows and iterations that translate what’s in their head into what comes out of the tool. That doesn’t change as the technology improves and gets easier for noobs like me to work with.
In other words, skills that were formerly specialized have become democratized.
This is what hurts so much. Suddenly everyone can do what you do. Or worse, robots can do what you do. Or even worse, your company thinks robots can do what you do. And overnight, you’re devalued in a sea of slop. Cue: Existential crisis.
Like with any skill, though, the devil’s in what you do with it. Just because you can create something doesn’t automatically make it good, or even a good idea.
Here’s my prediction for how this will play out for knowledge workers in the short term. The technology is too volatile to see much beyond that.
We’re going to end up with two tracks. OK, three if you count the cynical stuff.
The utilitarian track
Full disclosure: My current role involves facilitating this track.
If there’s even a remote possibility a task or process can be automated, it will be AI-ified to the hilt. Agents will perform the tasks, and humans will shift toward training, orchestration, and in-the-looping behind the scenes.
This doesn’t mean inserting AI into the old workflow. The step-change the execs are looking for won’t happen until we fundamentally change the systems that were created for the pre-AI era. Everything — tools, processes, roles & responsibilities, governance — needs to be redesigned specifically for human-AI collaboration. Otherwise, it’s just AI theater.
Standard business processes and tasks are in this track, as well as the operational and transactional necessities that go along with doing business. Often, these are “cost center” activities — meaning, they’re a cost of doing business but don’t directly bring revenue in. This includes a lot of customer support and help stuff, as well as starter code, anything templatable, and creation of generic or “stock” media, like my featured image above. Face it, if you’re busy relaxing in a spa, you probably don’t notice or care if the dulcet tones of the pan flute in the background were 100% produced by Zamfir. (Google the name or ChatGPT it. But double-check the reference links if you get the answer from AI.)
I’ve incriminated the work of many current and former colleagues in that paragraph. And Zamfir. To all of whom, I apologize.
I’m guessing you’re not opposed to getting rid of the chores in your work. I’m guessing you *are* opposed to the fruits of your labor being deemed low-value enough that we can cut the robots loose on it without too many worries. But this is simply the latest iteration of a familiar business cycle. Anyone remember the offshoring trend? The we-don’t-need-no-stinkin’-copy-editors trend? Content farms?
We survived all that, and we’ll survive this, too.
In this track, there’s a fair chance you’ll be tapped to fix bad AI outputs by hand. This, of course, defeats the purpose of AI as a productivity driver and can be soul-crushing when you know you could do it better yourself. (Side quest: This piece from Cory Doctorow of enshittification fame is doom-and-gloomy but explains the risk well.)
If you’re not keen on cleaning up robot messes after the fact, you can move your skillset upstream. You know things AI doesn’t, and you’re the best equipped to test, build, and fine-tune agents for the workflows you know by heart. You can architect the right knowledge bases and context graphs. Perform quality evaluations. Build solutions for the non-AI-savvy. Teach the next generation of knowledge workers. Create governance guardrails. Guide your newly democratized colleagues in how to use these systems to best effect.
And if you’re the first on your team to start thinking about agentic orchestration and governance, you will be regarded as a genius. You’ll also have transferable skills that insulate you no matter where the winds of business blow.
Fair warning: You might have to teach yourself those skills. Companies don’t have a good handle on the whole AI readiness thing yet. (Sorry, Talent & Development. “Ask our intranet chatbot a question!!” doesn’t count.) Commercial educational outlets are stepping up, but if you’re among the laid-off, you might need to find paths for demonstrating how you’ve applied the skills.
The authenticity track
Full disclosure: My current side gigs are in this track. Yep, I’m playing both sides.
At the time I’m writing this, there’s a growing backlash against ubiquitous slop and the injection of AI into every possible interface (hi, Windows). See also: Just because you can do something doesn’t make it a good idea.
Despite the AI companies’ fumbling attempts at persona-ing (hi, xAI), AI has no personality and no inherent talent. It generates based on math and probability and needs a lot of guidance to not liken itself to Hitler (hi again, xAI). This puts true human connection, taste, perspective, and judgment at a premium. Discernment, too, because we’ve entered a world where you can no longer default to believing what you see or read.
Startups, solopreneurs, and money-making teams (business speak: revenue centers) are reading the room and leaning heavily into human authenticity as a core tactic. You’ll see this in: Marketing campaigns that leverage un-polishedness. Products for underserved customer segments. Newsletters and — gasp! — email. Community-building. Thought leadership that isn’t the bland equivalent of a Live Laugh Love sign. The creator economy (the real one, not the TikTok click-bait).
Authenticity doesn’t preclude AI from being used in the creative process. Speed-to-market is important in a business environment, and AI can be useful for jump-starting ideas, breaking through writer’s block, generating mood boards, rapid prototyping, creating mock-ups, and all the logistical tasks involved in distribution, outreach, and customer management. But you can’t outsource your decision-making to it. It’s an accelerator, not a replacement.
A side note about artistic endeavors — authors, musicians, filmmakers, etc. The line between “create art for commercial use” and “create art to capture authentic human experience” can be quite blurry. These folks are in a tough spot, and the decision on how or if to use AI in their work is deeply personal and polarizing. An argument can be made that artists choose the medium for their work, and generative tools are an option. An argument can also be made that this entire technology is based on *cough* borrowing other people’s work, and that a lot of people are making a lot of money bypassing the artist. Not to mention all the quick-buck hucksters who are churning out so much crap that it devalues the quality stuff.
Fair point. And I don’t know how this will settle out over time.
The cynical track
We’ll see two other disappointing trends continue. First: The companies that are inserting AI into every nook and cranny of the customer experience won’t stop unless their bottom line takes a big enough hit from reputational backlash or economic crash. That’s business, folks. Second: Hucksters, bad actors, and horny idiots will always find a way to monetize or churn out garbage. AI didn’t cause this, but it definitely magnified it.
All the WTAF-eries I listed above are real, too, and they’re being swept aside to the detriment of society. (I told you we’d get back to it!) There will be a reckoning, but it will get worse before it gets better. Speak up, lobby the politicians, vote with your wallet, and do what you can to encourage responsibility.
Unfortunately, I think a lot more people will lose their jobs, not specifically because of AI but because of the convergence of changes going on in the world at large. There’s a better-than-average chance I’ll be among those folks, but I’ve reinvented myself before, and I can do it again. I’ve made peace with the existential crisis by learning all I can about the technology: what its limits are, how to make it do what I want it to do, etc. I’m rather fond of receiving a paycheck, and this is now a necessity, just like learning the ins and outs of digital content was a necessity earlier in my career.
The Rules of the World have changed. I don’t have any control over that. I do have control over what I do instead.
All opinions here are my own. All text is my own, too, including the em dashes. I welcome constructive comments and discussion on LinkedIn and Bluesky.