
Posted by napolux 1/11/2026

The next two years of software engineering (addyosmani.com)
328 points | 383 comments | page 2
austin-cheney 1/11/2026|
I have been telling people that, titles aside, senior developers are the people not afraid to write original code. I don’t see LLMs changing this. I only envision people wishing LLMs would change this.
CSSer 1/11/2026||
I almost think what a lot of people are coming to grips with is how much code is unoriginal. The ones who've adjusted the fastest were humble to begin with. I don't want to claim the title, but I can certainly claim the imposter syndrome! If anything, LLMs validated something I always suspected: the amount of truly unique, success-critical code in a given project is often very small. More often than not, it's not grouped together either; most of the time it's tailored to a given piece of functionality. For example, a perfectly accurate Haversine distance is slower than an optimized one with tradeoffs. LLMs are not yet adept at identifying the need for those tradeoffs in context, reliably or consistently, so you end up with generic code that works but isn't great. Can the LLM adjust if you explicitly instruct it to? Sure, sometimes! Sometimes it gets caught in a thought loop too. Other times you have to roll up your sleeves and do the work like you said, which often still involves traditional research or thinking.
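
To make that Haversine point concrete, here's a rough sketch of the two variants (Python; spherical-Earth model, and the approximation chosen here is just one illustrative tradeoff, not anything from a real project):

    import math

    EARTH_RADIUS_KM = 6371.0  # mean Earth radius, spherical model

    def haversine_km(lat1, lon1, lat2, lon2):
        # The "perfectly accurate" version (exact for a sphere):
        # full great-circle distance via the haversine formula.
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    def equirectangular_km(lat1, lon1, lat2, lon2):
        # The "optimized one with tradeoffs": treat the patch as flat.
        # Cheaper per call, but error grows with distance and latitude;
        # knowing whether that matters is the context an LLM tends not
        # to flag on its own.
        x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
        y = math.radians(lat2 - lat1)
        return EARTH_RADIUS_KM * math.hypot(x, y)

For nearby points the two agree closely; for far-apart or high-latitude points the approximation is badly wrong. Knowing which regime your data lives in is the part that still takes a human.
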
HarHarVeryFunny 1/11/2026||
I disagree.

1) Senior developers are more likely to know how to approach a variety of tasks, including complex ones, in ways that work, and are more likely to (maybe almost subconsciously) stick to these proven design patterns rather than reinvent the wheel in some novel way. Even if the task itself is somewhat novel, they will break it down in familiar ways into familiar subtasks/patterns. For sure, if a task does require some thinking outside the box, or a novel approach, then the senior developer might have better intuition on what to consider.

The major caveat to this is that I'm an old-school developer who started professionally in the early 80's, a time when you basically had to invent everything from scratch, so there is certainly no mental block to having to do so. And I'm aware there is at least a generation of developers that grew up with Stack Overflow and have much more of a mindset of building stuff using cut and paste, and less of having to sit down and write much complex/novel code themselves.

2) I think the real distinction of senior vs junior programmers, the one that will carry over into the AI era, is that senior developers have had enough experience, at increasing levels of complexity, that they know how to architect and work on large complex projects where a more junior developer will flounder. In the AI coding world, at least for the time being, until something closer to AGI is achieved (could be 10-20 years away), you still need to be able to plan and architect the project if you want to achieve a result where the outcome isn't just some random "I let the AI choose everything" experiment.

austin-cheney 1/12/2026|||
I completely agree with your second point. For your first point my experience tells me the people least afraid to write original code are the people least oppositional to reinventing wheels.

The distinguishing behavior is not the quantity of effort involved but the total cost after considering dependency management, maintenance time, and execution time. The people who reinvent wheels do so because they want to learn, and because they want the same kind of effort to cost less work in the future.

BoiledCabbage 1/12/2026|||
> in the early 80's, a time when you basically had to invent everything from scratch, so certainly there is no mental block to having to do so, and I'm aware there is at least a generation of developers that grew up with stack overflow and have much more of a mindset of building stuff using cut and paste, and less having to sit down and write much complex/novel code themselves.

I think this is really underappreciated and was a big driver of how a lot of people felt about LLMs. I found it even more notable on a site named Hacker News. There is an older generation for whom computing was new, the 80s through the 90s probably being the prime of that era (for people still in the industry). There was constantly a new platform, language, technology, or concept to learn. And nobody knew any best practices, nobody knew how anything "should work", nobody knew what anything was capable of. It was all trying things and figuring them out. It was way more trailblazing / exploring new territory, the birth of the internet being one of the last examples of this from that era.

The past 10-15 years of software development have been the opposite. Just about everything was evolutionary, rarely revolutionary: optimizing things for scale, improving libraries, or porting successful ideas from one domain to another. A lot of shuffling deck chairs on things that were fundamentally the same. Just about every new "advance" in front-end technology was this. Things hailed as groundbreaking really took little exploration, mostly solution-space optimization. There was almost always a clear path. Someone always had an answer on Stack Overflow; you were never "on your own". A generation+ grew up in that environment, and it felt normal to them.

LLMs came about and completely broke that. And people who remembered when tech was new, full of potential, and nobody knew how to use it loved that: here is a new alien technology, and I get to figure out what makes it tick, how it works, how to use it. On the flip side, people who were used to there being a happy path, or a manual to tell you when you were doing it wrong, got really frustrated at there being no direction, feeling perpetually lost, with it not working the way they wanted.

I found it especially ironic, being on Hacker News, how few people seemed to have a hacker mindset when it came to LLMs. So much was "I tried something, it didn't work, so I gave up", or "I just kept telling it to work and it didn't, so I gave up". Explore. Pretend you're in a sci-fi movie. Does it work better on Wednesdays? Does it work better if you stand on your head? Does it work differently if you speak pig latin? Think sideways. What behavior can you find that makes you go "hmm, that's interesting..."?

Now I think there has been a shift very recently, with people getting more comfortable with the tech, but I was still surprised at how little of a hacker mindset I saw on Hacker News when it came to LLMs.

LLMs have reset the playing field from a well-manicured lawn to an unexplored wilderness. Figure out the new territory.

Terr_ 1/12/2026|||
To me, the "hacker" distinction is not about novelty, but understanding.

Bashing kludgy things together until they work was always part of the job, but that wasn't the motivational payoff. Even if the result was crappy, knowing why it was crappy and how it could've been better was key.

LLMs promise an unremitting drudgery of the "mess around until it works" part, facing problems that don't really have a cause (except in a stochastic sense) and which can't be reliably fixed and prevented going forward.

The social/managerial stuff that may emerge around "good enough" and velocity is a whole 'nother layer.

layer8 1/12/2026||||
No, the negative feelings about LLMs are not because they are new territory, it’s because they lack the predictability and determinism that draw many people to computers. Case in point, you can’t really cleverly “hack” LLMs. It’s more a roll of the dice that you try to affect using hit-or-miss incantations.
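
To be concrete about the dice roll, here's a toy sketch (Python; invented numbers, not any real model's API) of why the same prompt can yield different outputs: generation samples from a distribution, and prompt "incantations" only reshape that distribution.

    import math
    import random

    def sample_next_token(logits, temperature=0.8):
        # Softmax sampling: identical input, potentially different
        # output on every call. Tweaking the prompt shifts the logits,
        # but never makes the draw deterministic.
        weights = [math.exp(l / temperature) for l in logits]
        total = sum(weights)
        probs = [w / total for w in weights]
        return random.choices(range(len(logits)), weights=probs)[0]

    # The same "prompt" three times, three possibly different answers:
    print([sample_next_token([2.0, 1.5, 0.3]) for _ in range(3)])
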
bossyTeacher 1/12/2026||
>the negative feelings about LLMs are not because they are new territory, it’s because they lack the predictability and determinism that draw many people to computers

Louder for those turned deaf by LLM hype. Vibe coders want to turn a field of applied math into dice casting.

hooverd 1/12/2026||||
an unexplored wilderness that you pour casino chips into (unless you're doing local model stuff yea yea)
bossyTeacher 1/12/2026|||
>I found it especially ironic being on hacker news how few people seemed to have a hacker mindset when it came to LLMs

You keep using the word "LLMs" as if Opus 4.x came out in 2022. The first iterations of transformers were awful: GPT-2 was more of a toy, and GPT-3 was an eyebrow-raising chatbot. It has taken years of innovations to reach the point of usable output without constant hallucinations. So don't fault devs for the flaws of early LLMs.

burnermore 1/12/2026||
Something is very odd about the tone of this article. Is it mostly AI-written? There are a lot of references and info, but I feel far more disconnected from it.

For the record, I was genuinely trying to read it properly, but it became unbearable by mid-article.

nerdsniper 1/12/2026||
Yes, lots of AI style/writing in this article. I wouldn't necessarily discredit an article just based on stylization if the content were worth engaging with... but like you mentioned, when the AI is given too much creative control, it goes off the rails by the middle and turns into what the kids call "AI slop".

It resembles an article, it has the right ingredients (words), but they aren't combined and cooked into any kind of recognizable food.

burnermore 1/12/2026|||
Thanks a lot for taking the time to confirm. Not hating on AI slop or anything, but I do genuinely feel that if they had invested the time to write it themselves, people would consume and enjoy it more.

It's hard to put my finger on it, but it lacks soul, the "it" factor, or whatever you want to call it. Feels empty in a way.

I mean, this is not the first AI-assisted article I'm reading, but usually the assistance is at a negligible level. Maybe it's just me. :)

godshatter 1/12/2026|||
I suppose that eventually enough people will have grown up reading mostly AI slop that that way of speaking will become the norm.
gofreddygo 1/12/2026||
100% has that AI slop smell.

Intro... Problem... (The Bottom Line... What To Do About It...) looped over and over, and then finally...

I want to read it, but I can't get myself to.

burnermore 1/12/2026||
Understandable. I usually only recognise AI assistance because someone in the comment section points it out, but the off-putting tone of this one was blatantly obvious. This is by far the most AI-influenced article I have read yet.
reedf1 1/12/2026||
All well-documented knowledge fields will be gone if software goes. Then the undocumented ones will become documented, and they too will go. The best advice for junior devs is to get a hands-on job before robotic articulating sausages are perfected and humans become irrelevant blobs of watery meat.
dw_arthur 1/12/2026||
I think the GPT-3 or four-minute-mile moment for robotics will be when we see a robotic hand with the dexterity of a 6-year-old. Once that happens, it will quickly be over.
djeastm 1/12/2026||
I'm looking forward to the antics of first-generation robotic plumbers and electricians, myself.
jimbokun 1/13/2026||
It will be amusing watching them fail at common tasks for a couple of years; then they will suddenly be better than any human plumber or electrician.
bwfan123 1/12/2026||
IMO, the OP has bad AI-assisted takes on almost every single "critical question". This makes me doubt whether he has breadth of experience in the craft. For example:

> Narrow specialists risk finding their niche automated or obsolete

Exactly the opposite. Those with expertise will oversee the tool. Those without expertise will take orders from it.

> Universities may struggle to keep up with an industry that changes every few months

Those who know the theory of the craft will oversee the machine. Those who don't will take orders from it. Universities will continue to teach the theory of the discipline.

solaire_oa 1/13/2026|
I think this is a fair take (despite the characteristic HN negativity/contrarianism), and succinctly summarizes a point that I was finding hard to articulate while reading the article.

My similar (verbose) take is that seniors will often be able to wield LLMs productively: good-faith LLM attempts will be the first step, but will frequently be discarded when they fail to produce the intended results (personally, I find myself swearing at the LLMs when they produce trite garbage: output that gets `gco .`-ed immediately, or LLM MRs/PRs that get closed in favor of manually accomplishing the prompted task).

Conversely, juniors will often wield LLMs counterproductively, unknowingly accepting tech debt that neither the junior nor the LLM will be able to correct past a given complexity.

misja111 1/12/2026||
> Senior developers: Fewer juniors means more grunt work landing on your plate

I'm not sure I agree with that. Right now, as a senior, my task involves reviewing code from juniors; replace juniors with AI and it means reviewing code from AI. More or less the same thing.

thw_9a83c 1/12/2026||
> More or less the same thing.

Worse. The AI doesn't share any responsibility.

cheschire 1/12/2026||
And can’t be mentored by the senior except in some ersatz flat text instruction files.
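
For illustration, such a file might look like this (the name and conventions vary by tool; CLAUDE.md and AGENTS.md are common ones, and these particular rules are invented):

    # AGENTS.md -- the closest thing to "mentoring" the model gets
    - Prefer small, composable functions; avoid files over ~300 lines.
    - All database access goes through the repository layer; no raw SQL in handlers.
    - Write a failing test before adding a new endpoint.
    - If requirements are ambiguous, stop and ask instead of guessing.

Unlike a junior, the model only follows these rules while they sit in its context window; it never internalizes them.
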
icedrift 1/12/2026|||
And the mistakes AI makes don't carry the same code smells that juniors' mistakes do. There are markers in human code that signal how well the author understood the problem; AI code more often looks correct at a glance even when it's horribly problematic.
sanderjd 1/12/2026|||
Yeah, this is a big thing. AIs (at the moment) don't learn. You wait for a new model to come out and hope it is better than the last one. But that isn't the same thing as a person learning over time.
thw_9a83c 1/12/2026||
> AIs (at the moment) don't learn.

Yes, and even when it "learns" (because there's a new version of the AI model), it doesn't learn according to your company's or team's values. Those values might be very specific to your business model.

Currently, AI (LLM) is just a tool. It's a novel and apparently powerful tool. But it's still just a tool.

girvo 1/12/2026|||
The juniors get better and closer to the ideal my team requires via this process. Current AIs don't, not in the same way.
raw_anon_1111 1/12/2026|||
And then, because of salary compression and inversion, where new employees at the same level as your former junior developers get paid more based on market trends while HR won't give your former juniors raises, they end up leaving once they get "good enough".

So why hire juniors at all instead of poaching a mid level ticket taker from another company?

If you are a line level manager, even if you want to retain your former junior now mid level developer, your hands are probably tied.

glouwbug 1/12/2026|||
Humans resemble AGI more than they do LLMs
groguzt 1/12/2026|||
Currently my job as a junior is to review vibe code that was "written" by seniors. It's just such bullshit, and they make mistakes I wouldn't even have dared to make in my first year of school.
BeetleB 1/12/2026||
Except that the AI doesn't get tired, and your superiors know it. The volume of code you'll have to review will increase.
globular-toast 1/12/2026||
This article suggests it is specialists who are "at risk", but as much more of a generalist, I was thinking the opposite and starting to regret not specialising more.

My value so far in my career has been my very broad knowledge of basically the entirety of computer science, IT, engineering, science, mathematics, and beyond. Basically, I read a lot, at least 10x more than most people, it seems. I was starting to wonder how relevant that is now, given that LLMs have read everything.

But maybe I'm wrong about what my skill actually is. Everyone has had LLMs for years now and yet I still seem better at finding info, contextualising it and assimilating it than a lot of people. I'm now using LLMs too but so far I haven't seen anyone use an LLM to become like me.

So I remain slightly confused about what exactly it is about me and people like me that makes us valuable.

cowl 1/12/2026||
LLMs have read EVERYTHING, yes. That includes a lot of suboptimal solutions, repeated mantras about past best practices that are no longer relevant, thousands of blog posts about how to draw an owl by drawing two circles and leaving the rest as an exercise for the reader, etc.

The value of a good engineer is their current-context judgment, something that LLMs cannot do well.

Second point, something that is mentioned occasionally but not discussed seriously enough: the Dead Internet Theory is becoming a reality. The supply of good, professionally written training material is by now exhausted, and LLMs will start to feed on their own slop. See how little LLMs' core competency increased in the last year even with the big expansion of their parameter counts.

Babysitting LLM output will be the big thing in the next two years.

falloutx 1/12/2026||
I mean, there is no strategy that saves you 100% from it. The layoffs are kind of random, based on teams they don't see any vision for, or engineers who don't perform. Generalising is better, IMO.
danieltanfh95 1/12/2026||
The most useful thing juniors can do now is use AI to rapidly get up to speed with the new skill floor. Learn like crazy. Self-learning is empowered by AI.

Engineers > developers > coders.

kace91 1/12/2026||
AI has a lot of potential as a personal, always-on teaching assistant.

It's also a "skip intro" button for the friction that comes with learning.

You're getting a bug? Just ask the thing rather than spending time figuring it out. You don't know how to start a project from scratch? Ask for scaffolding. Your first boss assigns you a ticket? Better not to screw up; hand it to the machine just to be safe.

If those temptations are avoided, you can progress, but I'm not sure that lots of people will succeed. Furthermore, will people be afforded the space to be slow when their colleagues are going at 5x?

Modern life offers little hope. We're all using Uber Eats to avoid the friction of cooking, Tinder to avoid the awkwardness of a club, and so on. Frictionless tends to win.

haspok 1/12/2026|||
That is quite some wishful thinking there. Most juniors won't care; they'll just vibe-code their way through.

It takes extra discipline and willpower to force yourself to do the painful thing if there is a less painful way to do it.

amrocha 1/12/2026|||
Because employers famously hire based on skill and not credentials or experience.
auggierose 1/12/2026||
Scientists > engineers > developers > coders
auggierose 1/12/2026||
Mathematicians > scientists > engineers > developers > coders
auggierose 1/12/2026||
https://xkcd.com/435/
keybored 1/12/2026||
Is there a Jeopardy for guessing prompts? "Give an executive summary of GenAI trends where GenAI is the destiny and everything reacts to it. Touch on all 'problems'. Don't be divisive by making hard proclamations. Summarize in a safe way by appealing to the trope of the enthusiastic programmer who dutifully adapts to the world around them in order to stay 'up to date', the passive drone who accepts whatever environment they are placed in and never tries to change it. But add insult to injury by paradoxically concluding that the only safe future is the one you, the individual, 'actively engineer'."

I’m not saying that this was prompted. I’m just summarizing it in my own way.

sanderjd 1/12/2026||
> The flip scenario: AI unlocks massive demand for developers across every industry, not just tech. Healthcare, agriculture, manufacturing, and finance all start embedding software and automation. Rather than replacing developers, AI becomes a force multiplier that spreads development work into domains that never employed coders. We’d see more entry-level roles, just different ones: “AI-native” developers who quickly build automations and integrations for specific niches.

This is what I expect to happen, but why would these entry-level roles be "developers"? I think it's more likely that they will be roles that already exist in those industries, where the responsibilities include (or at least benefit from) effective use of AI tools.

I think the upshot is that more people should probably be learning how to work in some specific domain while learning how to effectively use AI to automate tasks in that domain. (But I also thought this is how the previous iteration of "learn to code" should be directed, so maybe I just have a hammer and everything looks like a nail.)

I think dedicated "pure tech" software, where the domain is software rather than some other thing, is more likely to be concentrated in companies building the infrastructure that is still needed to make this all work. That is, the models themselves and all the surrounding tools, and all the services and databases etc. that are used to orchestrate everything.

pphysch 1/12/2026||
Being an AI-enabled developer is still a full-time job that requires SWE expertise. I think the quoted portion is correct, but it will be a gradual change as CTOs/CIOs realize the arbitrage opportunity in replacing most of their crappy SaaS subscriptions with high-velocity in-house solutions. The savvy ones, at least.
sanderjd 1/12/2026||
This is true if you want to build professional software. But what I foresee is a lot more tasks being accomplished with task-specific tools created by the people responsible for doing those tasks. Like how people use spreadsheets to get their jobs done, but with a much broader set of use cases.

Once it is easier to just make almost anything yourself than it is to go through a process of expressing your requirements to a professional software development group and iterating on the results, that will be a popular choice.

jimbokun 1/13/2026||
That works until it’s important enough to secure, monitor, operate, debug, and have people on call in case it breaks.

At that point it gets handed over to the engineers.

sanderjd 1/14/2026||
Yes, but I think things become that important when they are used by a lot of people. People (or small teams) building purpose-specific tools for themselves don't require any of that.
jimbokun 1/16/2026||
Depends.

If it touches private customer data, you better have security right from version 1.

tigrezno 1/12/2026|
The next two years of software engineering will be the last two years of software engineering (probably).
amelius 1/12/2026||
I don't see the market flooded yet with software that was "so easy to build using LLMs".

Last year was, it seems, just a normal year in terms of global software output.

steve1977 1/12/2026|||
If anything, looking for example at what Microsoft has been releasing, it's been a below-average year (in terms of quality).
falloutx 1/12/2026||||
You are not looking in the right places. GitHub repo counts have been high since 2020 because there are companies & individuals who run fork scripts, so AI can't match those numbers.

But on Product Hunt, the project counts tell a different story: 5000+ in the first week of this January, versus roughly 4000 in the entire month of January 2018.

amrocha 1/12/2026||
That doesn’t mean industry output is high, it means people are starting new products.

Has the output of existing companies/products increased substantially?

Have more products proven successful and started companies?

Hard to say, but maybe a little.

falloutx 1/12/2026||
>Has the output of existing companies/products increased substantially?

Would be impossible to tell.

amrocha 1/12/2026||
No, it would be pretty easy if it looked like new features were shipping significantly faster. But they’re not.
theshrike79 1/13/2026||||
The software being written isn't coming to "the market". It's all internal development.

Look at smaller SaaS offerings and people selling tiny utility apps; those will slowly go away.

Why would I pay for something when I can make it for my own (or company internal) use in an afternoon?

cmpxchg8b 1/12/2026|||
This is such a stupid argument. A very significant amount of code never makes it into the public sphere. None of the code I've written professionally in the last 26 years is publicly accessible, and if someone uses a product I've written, they likely don't care whether it was written with the aid of an LLM or not.

Not to mention agent capabilities at the end of last year were vastly different to those at the start of the year.

amelius 1/12/2026||
Even if a portion of software is not released to the general public, you'd still expect an increase in the amount of software released to the general public.

Even if LLMs became better during the year, you'd still expect an increase in releases.

cmpxchg8b 1/15/2026||
Maybe, but these days a vast amount of software is hidden behind online services. I'm not sure you'd see the hidden iceberg.
izacus 1/12/2026|||
What are you willing to bet on that prediction? Your car? Your home?

Talk is cheap, let's see the money :D

kubb 1/12/2026||
Please don’t get my hopes up. Adaptable people like me will outcompete hard in the post-engineering world. Alas, I don’t believe it’s coming. The tech just doesn’t seem to have what it takes to do the job.
falloutx 1/12/2026||
Some related fields will be gone too. And the jobs which will remain will be impossible to get.
menaerus 1/12/2026|||
> And the jobs which will remain will be impossible to get.

Exactly my thoughts lately... Even by yesterday's standards it was already very difficult to land a job, and by tomorrow's standards it appears as if only the very best of the best, and those in positions of decision-making power, will be able to keep their jobs.
