Hey honeys and hustlers,
Today, I’m bringing you some words from Joe Lazer, the writer behind the Storytelling Edge newsletter. This post is adapted from a chapter of his book, Super Skill: Why Storytelling Is the Superpower of the AI Age. I read a chapter of his book on Honey & Hustle for the daily podcast challenge I’m doing for the month of February. Why preview one chapter when you could have two?! I started following his newsletter when he was publishing only on LinkedIn (*gasp*), and his writing was so good and valuable that I followed him to Substack. I appreciate his perspective on AI replacing writers. I’ve dabbled in vibe coding to see what all the hype was about, and I’m inclined to agree: storytellers are here to stay.
This post is a bit long and may get clipped in your inbox. You can always read it in the browser by clicking the link in the top corner (and leave comments! I like those!). We have a message from our sponsor and a community spotlight waiting for you at the end, so be sure to check those out!

Have you ever been shopping online and seen an item that was marked down from $100 to $59.99? What was your first thought? That’s a damn good deal, right?
At that moment, you’re a victim of one of advertisers’ favorite psychological tactics: anchoring bias.
Humans have a common inclination: We form opinions and make decisions based on the first piece of information we receive. This is why when advertisers show us the original price of an item ($100) and then the discount price ($59.99), we’re more likely to think we’re getting a good deal. Our perspective anchors to the original price, even if it’s total bullshit.
The same dynamic plays out with explosive new technology like generative AI.
This week marks three years since ChatGPT was released, and in today’s newsletter, I want to lay out the case that our initial anchoring bias (and much of the conventional wisdom about which skills will be most valuable in the next age of work) was all wrong.
For three years, we’ve heard that writers are cooked, an endangered profession boiling in the pot over the undying fire of a red-hot AI economy. But what if the opposite is true? What if great writers have never been more valuable, and storytelling is the most valuable skill of the AI age?
The Ted Chasm
ChatGPT felt like it came out of nowhere, but the technology behind it had been around for years.
Starting in 2021, anyone could use OpenAI’s GPT-3 model to generate ad copy or blog posts through the company’s research playground. Few did. That’s why OpenAI built ChatGPT: they wanted faster feedback on their latest model, GPT-3.5, so they wrapped it in a simple chatbot interface to attract users.
The development of AI models is terrifyingly opaque, and for whatever reason, GPT-3.5 made a significant leap in writing ability. GPT-3 was like Steve, a C-minus freelance copywriter who you’d probably fire after two or three jobs. ChatGPT was like Ted, a B-minus copywriter with tons of ideas and the occasional flash of brilliance. We’ve all had Teds on our team and thought, “Yeah, he’s solid.” So once AI crossed the Ted Chasm, people lost their fucking minds.
Because writing was the first major use case that most people experienced with generative AI, an anchoring bias took hold: We assumed writers were cooked. The humanities had been in decline since the Great Recession, while coders were the ascendant kings of the economy. This was the final battle in which the code economy would take full control.
But three years later, the reality is much different. Most people just don’t realize it yet. Generative AI’s writing abilities have stagnated, while its coding prowess has exploded. When a new model like Gemini 3 comes out, it impresses by vibe-coding a translation platform in minutes, not by writing the Great American Novel. (It can’t even write the Great American Thought Leadership Post.)
And there’s reason to believe this trend will continue.
AI coding vs AI writing
Why has AI made bigger leaps in coding than in writing? And why might we expect that to continue into the future?
I spent three years as an exec inside an AI startup and just spent a year writing a book about AI and the future of storytelling (to be announced soon!). And I believe it comes down to three primary factors:
Training: How LLMs are trained.
Culture: How people respond to AI taking on certain tasks, and the long-term impact that will have.
Investment: How AI companies are investing in LLM development, and where they’re making trade-offs in improving its capabilities.
Training
Large language models are tricky to train, and when it comes to writing well, they face a few big limitations:
Our Silicon Valley overlords have trained their models on all the available text on the web (copyright be damned), much of it poorly written. AI writing is better than the data it’s based on, but it’s still not very good.
When human workers fine-tune AI models, they’re instructed to rate verbose, jargon-y answers more highly, which is why AI often rambles like it chased a six-pack of beer with 30 mg of Adderall.
When an LLM isn’t sure of an answer, it’s trained to hedge by spitting out extra text, basically like a student bullshitting their way through an answer.
The bigger problem: AI writing is much harder to objectively define than AI coding. With coding, AI can simulate tests of its code until it produces something that works when deployed. The training data for AI code, while not perfect, is also considered much higher quality.
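To make that concrete, here’s a toy sketch of why code offers an objective training signal in a way prose doesn’t: candidate outputs can be checked automatically against tests, and only the ones that pass survive. (This is purely an illustration; the hard-coded candidate functions stand in for model generations, and no lab trains its models this simply.)

```python
# Toy illustration: code has a pass/fail oracle, prose doesn't.
# Each "candidate" stands in for a model-generated solution to
# the task "compute the average of a list (0.0 if empty)."

def candidate_a(nums):
    # A buggy generation: off by one.
    return sum(nums) / max(len(nums), 1) + 1

def candidate_b(nums):
    # A correct generation.
    return sum(nums) / len(nums) if nums else 0.0

def passes_tests(fn):
    # The automatic, objective reward signal: run the tests.
    tests = [(([2, 4, 6],), 4.0), (([5],), 5.0), (([],), 0.0)]
    return all(fn(*args) == expected for args, expected in tests)

# Keep only the candidates that pass.
survivors = [name for name, fn in [("a", candidate_a), ("b", candidate_b)]
             if passes_tests(fn)]
print(survivors)  # → ['b']
```

There’s no equivalent `passes_tests` for a blog post or a novel, which is a big part of why coding ability keeps climbing while writing ability plateaus.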
With writing, AI systems plateau at a level where they sound like an insufferable junior McKinsey analyst from Connecticut named Brett. And honestly, that’s good enough for low-stakes business emails and decks, so there’s not a huge incentive to make them better. After all, there’s much more money in replacing engineers than copywriters. (More on that in a second).
Culture
It’s telling that the entire category of AI-generated content has earned the epithet of “AI slop,” while AI-generated coding has a much more positive nickname — “vibe coding.”1
AI coding has proven much more culturally acceptable than AI writing.
Dozens of studies of AI-generated content have come to the same conclusion: When people think a piece of content was generated by AI, they rate it poorly regardless of its quality. For example, in late 2024, researchers at the University of Florida gave people two versions of the same story. One version was written by a human, while the other was written by generative AI.
Sometimes, they correctly labeled the AI-written story as AI-generated. Other times, they labeled the human-written story as AI-generated. Whenever either story was labeled as AI, people rated it poorly. Similarly, researchers at the University of Vienna found that when people believe that an author has used AI to generate content, their perception of the author plummets.
Over the past few years, we’ve seen this play out dozens of times. In July, a group of Redditors discovered that a new classic rock band called The Velvet Sundown, which had started flooding Spotify playlists, was secretly AI. Users were furious with Spotify and vowed to cancel their subscriptions because the streaming service had no policy on the labeling or monetization of AI-generated content. They also saw it as a direct affront to human artists who were losing streams to an AI-created band.
The next month, J.Crew came under fire after a Substack newsletter, Blackbird Spyplane, published an investigation revealing that J.Crew was using AI-generated models in a new nostalgia campaign. Followers flooded the brand’s Instagram page, expressing their disappointment and disgust that the company would publish AI-generated content without disclosing it.
By contrast, brands like Aerie are differentiating themselves by vowing not to use AI and seeing a huge jump in brand engagement as a result. You can feel a vibe shift. Is there a market for AI slop? Certainly, but it comes with real risks; if you’re trying to build genuine connections with people as a “thought leader” or brand, blatantly using AI is a dangerous game.
And as SparkToro founder Rand Fishkin deftly showed this week, “NOT sounding like AI is a linguistic superpower.” People love it when they feel like you took the time to write something for them; by contrast, AI-generated content just feels impersonal.
The data also indicates that people’s use of AI tools has been shifting. In early 2025, Anthropic became the first AI company to publish a detailed breakdown of how people used its model. Computer programming and software development were the largest use cases by far, accounting for over 37% of all usage, nearly four times greater than writing tasks.

When OpenAI published ChatGPT’s 2025 usage report, it revealed that the share of writing tasks on the platform had fallen by a third, from 36% the previous year to 24%.
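Worth being precise about the math here: that’s a one-third decline in relative terms, not a drop of 33 percentage points.

```python
prev_share, new_share = 36.0, 24.0  # % of ChatGPT tasks that were writing

point_drop = prev_share - new_share                    # 12 percentage points
relative_drop = (prev_share - new_share) / prev_share  # 12/36 = a one-third decline

print(point_drop, round(relative_drop * 100))  # → 12.0 33
```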
Investment
Most telling, though, is the direction that AI giants are taking their products.
AI investments are staggering. OpenAI reportedly lost $11.5 billion last quarter and has made $1.4 trillion in commitments against $13 billion in revenue. Anthropic (only?) burned $5.6 billion last year. To close that gap, AI giants need to focus on use cases that deliver tremendous value to enterprise companies.
That’s why, after releasing reasoning models that dramatically improved their coding capabilities over the past year, the CEOs of Silicon Valley’s AI giants began speculating that AI could do much of the work of software engineers, the most expensive line item on most companies’ budgets.
In March, Anthropic CEO Dario Amodei told the crowd at a Council on Foreign Relations event, “I think we will be there in three to six months, where AI is writing 90% of the code. And then, in 12 months, we may be in a world where AI is writing essentially all of the code.”
Shortly after, Zuckerberg predicted on a podcast that AI would be writing all of Meta’s code by the end of 2026, claiming, “It writes higher-quality code than the average person on the team already.”
OpenAI CEO Sam Altman said that AI’s coding abilities would soon be “just as good as those of an experienced software engineer.”
Amazon CEO Andy Jassy went further, predicting that AI would take on all technical tasks, telling CNBC in June 2025 that AI would take over coding, research, analytics, security, website localization, and spreadsheets.
Were these CEOs just talking their own book? Without a doubt. As I write this, Amodei’s prediction has not come true. Nowhere near 90% of code is being written by AI, and many software engineers will tell you that AI is still nowhere close to getting the job done on its own. But it’s trending up.
According to one extensive meta-analysis, 30% of all Python code was written by AI as of December 2024. Startup CEOs have reported a similar trend. In March, Y Combinator CEO Garry Tan revealed that AI was producing “95% of the code” inside some of the accelerator’s startups. In September, Brian Armstrong, co-founder of Coinbase, reported that over 40% of the code at the company was being written by AI, with the goal of rapidly increasing that figure.
AI’s coding capabilities have progressed rapidly. Claude Opus 4.5 and Gemini 3, released over the past month, have made another tremendous leap. And coding is the use case where AI labs are focusing the most.
Daniel Balsam is the co-founder and Chief Technology Officer at Goodfire, a prominent AI research lab in Silicon Valley that uses interpretability software to perform “brain surgery” on large language models and make them safer and more effective2. After GPT-5 was released in August, he told me we’ve reached a stage where the producers of large language models need to make trade-offs about what they want their models to get smarter at, thanks to limitations in power, microchip supply, and cost.
“That’s forcing companies to make certain distribution trade-offs,” he explained. “Anthropic is all-in on code. And Claude 4 is not a particularly better chat model. In fact, it might be a worse chat model than the Claude 3 series, but it automates non-trivial parts of software engineering.”
Balsam buys into the AI CEOs’ predictions that AI agents will soon do the vast majority of coding. He predicts that, in two years, AI will be able to write code that works 90% of the time, significantly reducing demand for people with technical coding skills. You’ll still need some people with the domain expertise to understand the code base and fix bugs, but those will be a small number of roles, which shifts the skills he prioritizes when hiring. “I’d much rather hire people with really good research or product instincts,” he said.
There are clear signs of a real-world shift. Computer programmer jobs have plummeted to their lowest level since the 1980s, according to the Bureau of Labor Statistics, falling 27.5% since ChatGPT came out.
At the start of this year, Salesforce CEO Marc Benioff announced a hiring freeze for engineers after the company reported a 30% increase in productivity from AI. According to a report from the Federal Reserve Bank of New York, computer engineering graduates aged 22-27 have one of the highest unemployment rates at 7.5%; by comparison, art history grads are at 3%.
For over a decade, Silicon Valley firms ran massive marketing campaigns targeting schools and schoolchildren, promising that if they learned to code, a six-figure job would be waiting for them. As New York Times reporter Natasha Singer said on The Daily: “This represents a stunning breakdown in the promise Silicon Valley made to American school kids.”
Balsam agrees that AI is progressing faster at coding than writing, and that’s likely to continue for logical reasons. “There’s a lot more money if you can automate code than creative writing.”
Tech companies aren’t just targeting coding. Anthropic and OpenAI are spending billions to train AI agents in technical business applications like Salesforce and Excel. AI has found its killer use case: the very technical skills we’d been told were future-proof.
Welcome to the Storytelling Economy
Some economists and researchers anticipated AI’s impact on technical jobs. Beginning in 2023, economists and researchers working with LinkedIn began tracking where humans were most vulnerable to AI job loss and where the greatest opportunities would be. They estimated that AI would replace 96% of a software engineer’s current skills. “Technical and data skills that have been highly sought after for decades appear to be among the most exposed to advances in artificial intelligence,” they wrote in The New York Times.
What did these researchers find to be the most durable skills for the next era of work? The ones we’ve long derided as “soft”: communication, leadership, empathy, and critical thinking.
For much of the past two decades, we’ve seen our value at work reflected in our ability to operate like high-performing machines, executing tasks. Across roles, our value will move from technical tasks to high-touch activities. Engineers’ value will come less from writing code and more from understanding the business and how to solve customers’ problems with novel technological approaches.
Marketers’ value will not come from writing Google search ad copy, but from building relationships with customers, unearthing their success stories, and telling those stories in engaging ways.
Doctors will spend less time on diagnosis and administrative work, and more time practicing empathetic listening to patients’ stories to build trust and unearth information that might otherwise go unsaid.
CFOs will spend less time crunching spreadsheets and more time building strategic relationships across the business. Salespeople will spend less time updating call and deal notes in Salesforce and more time forging connections with customers and prospects.
In an age when AI offers infinite ideas and outputs, the most valuable workers will have the taste to discern the best path and rally people towards a shared goal.
While others see their communication and critical thinking skills atrophy due to an over-reliance on AI, the ones who resist and continue to hone their writing and storytelling chops will have a huge advantage.
Deep down, executives preaching the AI dogma secretly agree. The World Economic Forum’s 2025 Jobs Report found an interesting dynamic. When you ask executives what skills they think will be most valuable in the future, they follow the conventional wisdom that AI skills trump all.
But when you ask them about the skills they want right now, the top five most in-demand skills are all soft skills: critical analytical thinking, flexibility and agility, leadership and social influence, creative thinking, and motivation and self-awareness.
AI skills placed 11th. Computer programming was 23rd, just below environmental stewardship and barely beating out “manual dexterity.”
The good news for many of us is this: There’s one skill that makes us better at all the soft skills that will be most valuable in the next age of work: storytelling. We lead through stories. We communicate and sell through stories. We develop empathy and trust through stories, and we collaborate best when we feel like protagonists in the same plot.
When we tell stories, the listener’s brain activity mirrors that of the storyteller through a process known as neural coupling, which helps create a shared emotional state and builds bonds. If you want to become a desirable teammate and a sharper thinker, there’s no more important skill than storytelling.
In an era in which customers are flooded with generic, AI-generated emails and copy, people with deeply resonant stories and a distinct voice will earn the loyalty and trust of customers and colleagues.

Today, a great storyteller can reach billions of people worldwide in seconds. They can tell the story of a new company, and without a single customer, it’ll be worth billions. Sam Altman’s ability to tell the brilliant story of AI’s sci-fi future has OpenAI valued at $500 billion, even though OpenAI’s business loses $5-15 billion annually.
In April, Altman bought famed Apple designer Jony Ive’s design company, IO, for $6.5 billion — even though Ive’s company was little more than a fiction incorporated. It didn’t have a single product or customer. By all accounts, Altman did it for the plot. The acquisition washed Google I/O — the flagship AI event of OpenAI’s biggest competitor — out of the news.
Three years ago, we saw AI write a passable blog post and anchored to a false future. The storytellers who kept writing, thinking, and honing their craft weren’t naive. They knew something essential. Storytelling has been humanity’s cognitive superpower for millennia. And it will stay that way for years to come.
Thanks for reading! 💌

Joe Lazer (FKA Joe Lazauskas) is an award-winning marketing leader, author, and keynote speaker who transforms how businesses build relationships with their customers.
🚀 Community Spotlight
Myles Youngblood is the founder and CEO of SFG Media, a content growth agency that helps creators, podcasters, and founders turn long-form content into high-performing short-form that drives audience growth and demand. He’s worked behind the scenes with creators across finance, business, and media, building repeatable content systems designed for scale. Myles also hosts Creator CEO, where he breaks down how modern creators turn attention into leverage, cash flow, and real businesses.
Open to: promo/ad swaps, newsletter co-recommendations, and other collaborations.
Want to be featured in this newsletter? Add your name to the Creator Database or nominate someone by replying to this email! I’d love to share your story and what you’re working on with our community!
This post is supported by our sponsor.
Fast, accurate financial writeups
When accuracy matters, typing can introduce errors and slow you down. Wispr Flow captures your spoken thinking and turns it into formatted, number-ready text for reports, investor notes, and executive briefings. It cleans filler words, enforces clear lists, and keeps your voice professional. Use voice snippets for standard financial lines, recurring commentary, or compliance-ready summaries. Works on Mac, Windows, and iPhone. Try Wispr Flow for finance.
If you’d like to sponsor editions of Please Hustle Responsibly and reach {{active_subscriber_count}} marketers, creators, and entrepreneurs, you can respond to this email or visit our media page below.

