You’re Not the Worst One Here

Every morning I open my phone and there it is again. Another model dropped. Another framework that “changes everything.” Another person on X/LinkedIn explaining, with quiet confidence, how they’ve already integrated AI into every corner of their workflow while the rest of us were apparently still figuring out folders.

It’s exhausting in a way that’s hard to describe. Not the regular kind of tired — the kind where you feel behind before you’ve even started. Where opening a browser to learn something feels like arriving to a party three years after it ended.

I’ve been in this feeling for a while. And I want to talk about it honestly, because I don’t think enough people do. Most people either pretend they’re fine or quietly give up.

[Image: A smartphone resting on a rumpled bed with light-colored sheets and pillows, partially covered by blankets.]

Bots talking to bots. You're just watching

Tyler Durden said it in Fight Club: “You are not your khakis.” But there’s another line that’s been living rent-free in my head lately. The narrator, describing himself on a photocopier: “I am Jack’s copy of a copy of a copy.”

In 2026, that’s the internet.

Large language models trained on content produced by earlier large language models. Image generators scraping outputs from other image generators. Social network posts written by AI, reshared by bots, rewritten by other AIs, and served to you as proof that everyone is moving faster than you. Each generation slightly flatter than the last. Each iteration a little more confident, a little less grounded in anything real.

The people who understand what they're talking about at a deep level are a small fraction of the noise. The rest is signal decay: a high-fidelity simulation of expertise running on a loop, and it feels completely artificial.

By some estimates, almost all online content may be synthetically generated by 2026. Sit with that for a moment. Nine out of every ten things you come across online, whether posts, opinions, or assertive explanations, might involve no human input whatsoever.

More than half of long-form LinkedIn posts are likely AI-generated, and honestly, I skip most of them; reading content that feels machine-written feels like wasting my time. Those polished, well-organized posts on "how I incorporate AI into my daily routine" exist mainly to farm clicks for their stats. The content itself is mediocre at best, and chances are you've seen it all before.

The term "AI slop" was even named a Word of the Year in 2025 by some institutions. It calls to mind the Dead Internet theory from 2016, and it underscores how quickly this new normal is taking shape right before our eyes.

[Image: A comic strip illustrating imperfection over time, showing an original character on the left followed by progressively distorted copies labeled A, B, C, and D.]

The atmosphere of constant, breathless expertise is manufactured: by bots, by agents configured by people with something to sell, and by people resharing content they don't fully understand, wrapped in authoritative-sounding language, simply to attract an audience.

And here's the irony, and the whole thesis of this post: the technology making you feel behind is the same technology creating the illusion that everyone is ahead. To me, that's the perfect trap built into this real-world simulator we're all living in.

And those numbers about "everyone" using AI? Only 1 in 6 people globally were actively using generative AI tools in late 2025. Not experimenting once. Not having autocomplete finish a sentence. Actually using it. The "everyone is already an expert" feeling is just an algorithmically amplified, synthetically generated lie.

If you sensed something was off — if the noise seemed hollow and the confidence felt staged — you were right. That instinct is important to trust.

The internet broke your brain on purpose

I’m not going to medicalize it with clinical terms, because that’s not what this is about. But I will say this: if you feel paralyzed, that’s not because you’re weak. It’s because the environment is genuinely irrational.

You sit down to learn one thing, and by the time you understand it, three newer things have replaced it. The pressure to constantly absorb, adapt, and build — while also doing your actual job, living your actual life — is a weight that compounds silently. And the people generating the pressure don’t carry it themselves, because most of them are either bots or people who’ve learned to perform certainty they don’t feel.

The people who seem unaffected? They’re either very deep in and past the initial overwhelm, or they stopped genuinely engaging and started performing. Neither is something to aspire to. The discomfort is a sign you’re taking this seriously — not a sign you’re too late.

The people ahead of you are also lost

Yes, some people started earlier. Yes, that matters. I’m not going to pretend the gap is imaginary.

But from what I'm reading, the skills required for AI-exposed roles are changing much faster than those for other jobs. That statistic has changed how I think about all of this. It means the people who were "early" are constantly re-learning too. The model they mastered three months ago is already partially obsolete.

From my own experience, I see people every day using ChatGPT exactly as they did in 2023. They started back then and haven't really changed their habits, sticking to what they know. But whenever you sit with them and show them tools like Claude Code hooked up to Claude Opus 4.6, their minds are blown by possibilities they never even considered.

The thing is, I get it. Staying with what works feels rational. But in a field where the tools themselves are changing underneath you, comfort is a slow way to fall behind. I learned that the hard way. And I’d rather say it out loud than pretend I figured it all out gracefully.

Nobody is settled. Nobody is done.

The playing field doesn't reset to zero, but it resets. Getting in now still puts you ahead of the people who never step in. And the opportunity isn't closed: demand for AI talent grew sharply year over year in 2025. The field is still expanding, not gatekeeping (at least in 2026).

But first, a warning. I mean it seriously.

There’s a version of “starting with AI” that I think is a trap, and I want to name it before you walk into it. People will tell you: just let AI do it. Generate the whole project. Don’t worry about understanding the underlying code or logic — the AI handles it.

Here’s the scenario:

You ask AI to build something in a language you don't know, for a codebase you've never touched. It works. You ship it. Three weeks later a customer reports a critical bug. The AI isn't available: maybe the service is down, maybe you've hit your usage limit, maybe the model's been updated and doesn't produce the same output. Whatever the reason, it's gone. And you're standing in front of code you cannot read, for a problem you cannot diagnose, with a customer who is not happy.

[Image: A cartoon dog sitting calmly at a table in a burning room, coffee mug beside it, saying "THIS IS FINE".]

This isn’t hypothetical. It’s a cliff a lot of people are walking toward with confidence.

My rule: always work in a language you understand. Always know what the code is doing. Let AI make you faster at things you already grasp — don’t use it to bypass the grasping. If you can do something yourself and understand it, then let AI handle the parts you don’t enjoy. Delegate the tedious, not the critical.

AI is an assistant. A powerful one. But if it becomes a replacement for your own understanding, you’re not building leverage — you’re building dependency. And dependency, in a field this unstable, is the most dangerous thing you can build.

Everyone’s fighting for the same door

The mainstream AI conversation is obsessed with a very narrow set of things — the same tools, the same language, the same problems, chased by the same people. But the interesting work isn’t happening at the center. It’s happening at the edges, where AI meets something it doesn’t understand on its own.

A nurse who understands AI beats an AI engineer who knows nothing about healthcare. A teacher who can use AI to rethink how students learn has something no prompt engineer can replicate. Domain expertise combined with AI literacy is more valuable than AI expertise alone — and almost everyone in the mainstream conversation is chasing the latter while ignoring the former.

There’s another angle most people miss: as AI-generated content floods everything, genuinely human perspectives become rarer, not less valuable. The more the internet fills with synthetic text and manufactured takes, the more a real voice — with real experience, real uncertainty, real texture — stands out. You being human, with a specific history and a specific lens, is becoming an asset in ways that weren’t true two years ago.

I don’t know exactly where my angle is yet. But I’m looking. And if you’re reading this, you probably are too.

Start anyway

Not because everything is going to be fine. Not because the overwhelm goes away. But because the question you’re asking — is there still room for me? — only gets answered by stepping in and finding out.

The noise will still be there tomorrow. Most of it will be fake. The gap is smaller than it looks, the field is more open than it feels, and the fact that you’re here — sitting with the discomfort instead of scrolling past it — matters.

[Image: A cherry blossom tree with clusters of pink flowers against a clear blue sky.]

Start as someone building to understand. And know that wherever you’re starting from, you’re not the worst one here.
