
Posted by napolux 1/11/2026

The next two years of software engineering (addyosmani.com)
328 points | 383 comments
jason_s 1/13/2026|
> Senior developers: Position yourself as the guardian of quality and complexity.

Agreed, but it's not an easy charge to fulfill.

bjt12345 1/12/2026||
I honestly think middle management will be where AI causes job cuts.

In today's corporate environment, 70% of the costs are in management and admin muddle. Do we really think these "people skills" translate into anything useful in an AI economy?

The junior devs have far more hope.

fogzen 1/12/2026|
I agree. The lack of discussion about replacing middle-management with AI betrays the real politics of business.

Middle managers output exactly what LLMs do: chats, documents, summaries, particularly when working remotely. They don't even generate tickets/requirements; that's pushed to engineers and product people.

qsera 1/12/2026||
I would like to see how things will be when using AI costs half of a dev's current paycheck.
mawadev 1/12/2026||
I mean it's pretty simple: management will take bad quality (because they don't understand the field) over having and paying more employees any day. Software engineer positions will shrink and be unrecognizable: one person expected to be doing the work of multiple departments to stay employed. People may leave the field or won't bother learning it. When the critical mass is reached, AI will be paywalled and rug pulled. Then the field evens itself out again over a long, expensive period of time for every company that fell for it, lowering the expectations back to reality.
Anamon 1/13/2026||
That's one more data point for the conclusion that there's nothing new under the sun. This is the exact dynamic that played out during the offshoring hype.

Now it's expecting senior engineers to "orchestrate" 10 coding agents; back then it was expecting them to orchestrate 10 cheap developers on the other side of the world. The reckoning came when those offshore developers realised that if they produced code as good as that of a "1st world" engineer, they could ask for a similar salary too, and the offshoring clients who didn't want to pay up were left with the contractors who weren't good enough to do so. This time, it will be agent pricing approaching the true costs. Both times, the breaking point is when managers realise that writing code was never the bottleneck in the first place.

falloutx 1/12/2026||
This is truly the problem: you either get fired or you get to work 10x more to survive. The only question is how many of us will be in the first group and how many in the second. It's a lose-lose situation.
mawadev 1/12/2026||
Exactly. Some jobs moved from database, backend, frontend and devops to "fullstack", which means four jobs with the pay of one. People do that job, but with only 8-10 hours in a day the quality is as expected. I think overall people will try to move out of the field, no matter how much of a force multiplier AI might be. It's simply a worse trade to carry so much responsibility and burden when you can work in IT or outside of IT in a less cognitively demanding field with set hours and expectations for the same pay (in the EU; very hyperbolic statement tbh). Especially when the profit you bring in dwarfs the compensation, with all the frustrations that come with knowing that and being kept down in the corporate ladder.
crassus_ed 1/12/2026||
This was an amazing read! The entire thought of being so versatile as the article mentions is similar to the book Homo Deus by Harari.

In my opinion, we always needed to be versatile to stand any chance of being comfortable in these insanely rapidly changing times.

claytongulick 1/12/2026||
One option that didn't seem to be discussed in TFA is turning away from AI.

There's an implicit assumption in the article that the coding models are here to stay in development. It's possible that assumption is incorrect for multiple reasons.

Maybe (as some research indicates) the models are as good as they are going to get. They're always going to be a cross between a chipper stochastic parrot and that ego-inflated junior dev who refuses to admit a mistake. Maybe when the real (non-subsidized) economics present themselves, the benefit isn't there.

Perhaps the industry segments itself to a degree. There's a big difference in tolerance for errors between a cat fart app and a nuclear cooling system. I can see a role for certified 100% AI-free development. Maybe vibe coders go in one direction, with lower quality output but rapid TTM, while a segment of more highly skilled developers focuses on AI-free development.

I also think it's possible that over time the AI hyper-productivity stuff is revealed to be mostly a mirage. My personal experience and a few studies seem to indicate this. The purported productivity boost is a result of confirmation bias and ridiculous metrics (like LOC generated) that have little to do with actual value creation. When the mirage fades, companies realize they are stuck with heaps of AI slop and no technical talent able to deal with it. A bitter lesson indeed.

Since we're reading tea leaves, I think the most likely outcome is that the massive central models for code generation fade due to enormous costs and increased endpoint device capabilities. The past 50 years have shown us clearly that computing will always distribute, and centralized mainframe style compute gets pushed down to powerful local devices.

I think it settles at an improved intellisense running locally. The real value of the "better search engine" that LLMs offer today diminishes as hard economics drive up subscription fees and content is manipulated by sponsors (the same thing that happened to Google search results).
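
If that's where things land, the shape of it is easy to sketch: a small model running locally behind an editor hook, with no subscription in the loop. A rough sketch using the Hugging Face transformers API (the model name is illustrative; any small local code model would do):

    # "Improved intellisense" as a small local model; no remote calls.
    # Assumes the transformers library; the model choice is illustrative.
    from transformers import pipeline

    completer = pipeline(
        "text-generation",
        model="Salesforce/codegen-350M-mono",  # any small local code model
        device=-1,                             # run on CPU
    )

    prefix = "def fizzbuzz(n):\n    for i in range(1, n + 1):\n"
    print(completer(prefix, max_new_tokens=32)[0]["generated_text"])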

For end users, I think the models get shoved into a box to do the things they're really good at, like providing a much more intuitive human-computer interface, with the structured data from that handed off to a human developer to reason about. MCP will expand and become the glue.
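
To make the "glue" concrete: a minimal MCP tool, assuming the current shape of the official Python MCP SDK's FastMCP helper (the tool and its payload are made up for illustration). The model gets the intuitive interface; what crosses the boundary is structured data a human defined:

    # Minimal MCP server sketch; the tool and its data are hypothetical.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("inventory")

    @mcp.tool()
    def check_stock(sku: str) -> dict:
        """Return structured stock data for a SKU (stubbed out here)."""
        return {"sku": sku, "in_stock": 42, "warehouse": "east"}

    if __name__ == "__main__":
        mcp.run()  # serves the protocol over stdio by default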

I think that over time market forces will balance between AI and human created content, with a premium placed on the latter. McDonald's vs. a 5-star steakhouse.

rcxdude 1/13/2026||
Assuming AI is at all useful, it's likely to be used for safety-critical software development too. Safety-critical processes aren't likely to care much about LLM involvement, much as they already don't generally care about the competence of those doing the work.
godshatter 1/12/2026||
> Maybe (as some research indicates) the models are as good as they are going to get. They're always going to be a cross between a chipper stochastic parrot and that ego-inflated junior dev who refuses to admit a mistake. Maybe when the real (non-subsidized) economics present themselves, the benefit isn't there.

I'd put my money on this. From my understanding of LLMs, they are basically mashing words together via Markov chains, with a bit of subject classification added via attention, a little short-term memory, and enough grammar to lay things out correctly. They don't understand anything they are saying, they are not learning facts and trying to build connections between them, and they are not learning from their conversations with people. They aren't even running the equivalent of a game loop where they could think about things. I would expect something we're trying to call an AI to call you up sometimes and ask you questions. Trillions of dollars have gotten us this far; how much further can it actually take us?
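
For what it's worth, the "mashing words together" half of that picture fits in a few lines. A toy word-level Markov chain (illustrative only; real LLMs attend over thousands of prior tokens, which is exactly where this analogy gets contested):

    import random
    from collections import defaultdict

    # Toy Markov chain: the next word depends only on the previous word.
    def train(text):
        chain = defaultdict(list)
        words = text.split()
        for prev, nxt in zip(words, words[1:]):
            chain[prev].append(nxt)
        return chain

    def generate(chain, word, length=10):
        out = [word]
        while len(out) <= length and chain.get(word):
            word = random.choice(chain[word])
            out.append(word)
        return " ".join(out)

    chain = train("the cat sat on the mat and the cat ate the cream")
    print(generate(chain, "the"))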

I want my actual AI personal assistant, one I have to somehow coerce into doing things for me, like an emo teen.

mishkovski 1/12/2026||
This article reads like it was written by an AI.
jillesvangurp 1/12/2026||
Change is a constant for software engineers. It always has been. If your job is doing stuff that should be automated, either you are automating it or you are not a very good software engineer.

A few key fallacies at play here.

- Assuming a closed world: we'll do the same amount of work but with fewer people. This has never been true. As soon as you meaningfully drop the price of a unit of software (pick your favorite), demand goes up and we need more of them. It also opens the door to building software that would previously have been too expensive. That's why the number of software engineers has consistently increased over the years, despite a lot of stuff getting a lot easier over time.

- Assuming the type of work always stays the same. This too has never been true. Stuff changes over time. New tools, new frameworks, new types of software, new jobs to do. And the old ones fade away. Being a software engineer is a life of learning. Very few of us get to do the same things for decades on end.

- Assuming people know what to ask for. AIs do as you ask, which isn't necessarily what you want. The quality of what you get correlates very much with your ability to ask for it. The notion that you get a coherent bit of software in response to poorly articulated, incoherent prompts is about as realistic as getting a customer to produce coherent requirements. That never happened either. Converting customer wishes into maintainable/valuable software is still a bit of a dark art.

The bottom line: many companies don't have a lot of in-house software development capacity or competence. AI doesn't really help these companies fix that, in exactly the same way that Visual Basic didn't magically turn them into software-driven companies. They'll use third party companies to get the software they need because they lack the in-house competence to even ask for the right things.

Lowering the cost just means they'll raise the ambition level and ask for more/better software. The type of companies that will deliver that will be staffed with people working with AI tools to build this stuff for them. You might call these people software engineers. Demand for senior SEs will go through the roof, because they know what good software looks like and what to ask for, and so deliver the best AI-generated software. That creates a lot of room for enterprising juniors to skill up and join the club because, as ever, there simply aren't enough seniors around. And thanks to AI, skilling up is easier than ever.

The distinction between junior and senior was always fairly shallow. I know people in their twenties who got labeled senior barely out of college, maybe on their second or third job. It was always a bit of a vanity title that got awarded early because of the high demand for SEs of any kind. AI changes nothing here; it just creates more opportunities for people to use tools to work themselves up to senior level quicker. And of course there are lots of examples of smart young people who managed to code pretty significant things and create successful startups. If you are ambitious, now is a good time to be alive.

doug_durham 1/11/2026||
The author has a bizarre idea of what a computer science degree is about. Why would it teach cloud computing or dev ops? The idea is you learn those on your own.
happytoexplain 1/11/2026||
If that's "the idea", then clearly we need a more holistic, useful degree to replace CS as "the" software degree.
kibwen 1/12/2026|||
Despite what completely uninformed people may think, the field "computer science" is not about software development. It's a branch of mathematics. If you want an education in software development, those are offered by trade schools.
AnimalMuppet 1/12/2026|||
What I want is for universities to offer a degree in Software Engineering. That's a different field from Computer Science.

You say that belongs in a trade school? I might agree, if you think trade schools and not universities should teach electrical engineering, mechanical engineering, and chemical engineering.

But if chemical engineering belongs at a university, so does software engineering.

xboxnolifes 1/12/2026|||
Many do. Though the one I'm familiar with is basically a CS-lite degree with software-specific project design and management courses.

Glad I did CS, since SE looked like it consisted mostly of group projects writing 40 pages of UML diagrams before implementing a CRUD app.

collingreen 1/12/2026||||
Plenty of schools offer software engineering degrees alongside computer science, including mine ~20 years ago.

The bigger problem when I was there was undergrads (me very much included) not understanding the difference at all when signing up.

none2585 1/12/2026||||
Saying this as a software engineer who has a degree in electrical engineering: software "engineering" is definitely not the same as other engineering disciplines and definitely belongs in a trade school.
menaerus 1/12/2026||
Right, because the guy sitting next to me designing a PCB for the next revision of the rPi is so much more of an engineer than the other guy designing a distributed computing algorithm? That shows you've only dealt with the trivial parts of SE. There are very complex areas in both disciplines, and as easily as I can find trivial things in SE, I can do the same for EE. Let's just not pretend it's science fiction when it's not.
none2585 1/12/2026||
Developing a distributed computing algorithm would, I think, fall squarely under CS. Engineering is the application of stuff like that.
mxkopy 1/12/2026||||
Last I checked ASU does, and I’m certain many other universities do too.
pkaye 1/12/2026|||
My university had Electrical Engineering, Computer Engineering, Software Engineering and Computer Science degrees (in addition to all the other standard ones).
happytoexplain 1/13/2026||||
I didn't say it teaches you software development (though it does, secondarily); I said it was '"the" software degree', i.e. the degree you get if you want to get into software in general. That is what people actually believe, and reality/pragmatism is all that matters, even if you feel so superior to those people as to resort to insults.

To be clear, I did not go into CS. But I do live in this world.

kerblang 1/12/2026|||
This widely circulated claim ignores the fact that math is not science.
throwaway7783 1/12/2026||||
The degree is (should be) about CS fundamentals and not today's hotness. Maybe a "trades" diploma in CS could teach today's hotness.
wrs 1/12/2026||||
Cloud computing is not some new fundamental area of computer science. It’s just virtual CPUs with networks and storage. My CS degree from 1987 is still working just fine in the cloud, because we learned about CPUs, virtualization, networks, and storage. They’re all a lot bigger and faster, with different APIs, but so what?

Devops isn’t even a thing, it’s just a philosophy for doing ops. Ops is mostly state management, observability, and designing resilient systems, and we learned about those too in 1987. Admittedly there has been a lot of progress in distributed systems theory since then, but a CS degree is still where you’ll find it.

School is typically the only time in your life that you’ll have the luxury of focusing on learning the fundamentals full time. After that, it’s a lot slower and has to be fit into the gaps.

wakawaka28 1/11/2026|||
There has to be a balance of practical skills and theory in a useful degree, and most CS curricula are built that way. It should not be all about random hot tech because that always changes. You can easily learn tech from tutorials, because the tech is simple compared to theory. Theory is also important to be able to judge the merits of different technology and software designs.
Ekaros 1/12/2026|||
I am not sure about devops, but cloud computing likely has a lot of science behind it when done properly. Cloud platforms are no less complex to reason about than code itself. And I mean understanding and designing cloud platforms, not deploying code to them.
tibbar 1/11/2026||
Why is this necessarily true?
sys_64738 1/11/2026||
A CS degree is there to teach you concepts and fundamentals that are the foundation of everything computing related. It doesn't generally chase after the latest fads.
tibbar 1/12/2026||
Sure, but we need to update our definitions of concepts/fundamentals. A lot of this stuff has its own established theory and has been a core primitive for software engineering for many years.

For example, the primitives of cloud computing are largely explained by papers published by Amazon, Google, and others in the mid-'00s (Dynamo, Bigtable, etc.). If you want to explore massively parallel computation or container orchestration, it is natural to do that using a public cloud, although of course many of the platform-specific details are incidental.

Part of the story here is that the scale of computing has expanded enormously. The DB class I took in grad school was missing lots of interesting puzzle pieces around replication, consistency, storage formats, etc. There was a heavy focus on relational algebra and normal forms, which is just... far from a complete treatment of the necessary topics.
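
To make one of those missing pieces concrete: the quorum rule from the Dynamo paper is that read and write sets must overlap, i.e. R + W > N. A toy in-memory sketch (illustrative only, no failure handling or versioning conflicts):

    import random

    # Dynamo-style quorum: N replicas, writes ack at W, reads poll R.
    # With R + W > N every read quorum overlaps every write quorum,
    # so a read always sees at least one copy of the newest write.
    N, W, R = 3, 2, 2
    replicas = [{} for _ in range(N)]  # replica: key -> (version, value)

    def write(key, value, version):
        for i in random.sample(range(N), W):  # only W replicas ack
            replicas[i][key] = (version, value)

    def read(key):
        answers = [replicas[i].get(key, (0, None))
                   for i in random.sample(range(N), R)]
        return max(answers)[1]  # highest version wins

    write("cart", ["book"], version=1)
    write("cart", ["book", "pen"], version=2)
    print(read("cart"))  # always the version-2 value, since R + W > N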

We need to extend our curricula beyond the theory required to execute binaries on individual desktops.

zipy124 1/12/2026|||
This is mostly software engineering not computer science though. That is but a small sub-section of computer science.
tibbar 1/12/2026||
I just don't see the distinction. Looking at it from the other direction: most CS degrees will have you spend a lot of time looking at assembly language, computer architecture, and *nix tools. But none of these are mathematical inevitabilities - they're just a core part of the foundations of software engineering.

However, in the decades since that curriculum was established, it's clear that the foundation has expanded. Understanding how containerization works, how k8s and friends work, etc., is just as important today.
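
And the fundamentals there are learnable from first principles: on Linux, a "container" is mostly kernel namespaces plus cgroups. A minimal sketch of the namespace half (Linux-only, needs root; the constant comes from <sched.h>):

    import ctypes, os, socket

    # Unshare the UTS namespace: this process gets its own hostname,
    # invisible to the rest of the system. The same trick, applied to
    # PIDs, mounts, and the network, is the core of a container.
    CLONE_NEWUTS = 0x04000000  # from <sched.h>

    libc = ctypes.CDLL("libc.so.6", use_errno=True)
    if libc.unshare(CLONE_NEWUTS) != 0:
        err = ctypes.get_errno()
        raise OSError(err, os.strerror(err))

    socket.sethostname("toy-container")
    print(socket.gethostname())  # "toy-container", only in this process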

sys_64738 1/13/2026||
Containerization would be covered in a lecture on OS concepts. A CS degree isn't there to teach you how to use containerization; take a course specific to that.
michaelsalim 1/12/2026|||
I do agree that the scale has expanded a lot, but this is true of any other field too. Does that mean you need to learn everything? At some point it becomes unfeasible.

See doctors, for example: you learn a bit of everything, but if you want to specialise, you choose one.
