Posted by napolux 1 day ago

The next two years of software engineering (addyosmani.com)
261 points | 277 comments | page 3
dhruv3006 7 hours ago|
I recently started as a developer advocate. I have similar opinions to the author: junior devs have a hard time getting hired, and flipping to something like devrel makes a lot of sense.
Havoc 9 hours ago||
One of the better analyses of this question, I think.

On the optimistic side, I suspect it may turn out that software gets infused into more niches, but I'm not sure it follows that this helps the jobs market. Or, put differently, demand for software and demand for SWEs might decouple somewhat for much of that additional software demand.

slfnflctd 8 hours ago|
I'm mostly convinced at this point that the jobs market will only be affected temporarily.

This is really just another form of automation, speeding things up. We can now make more customized software more quickly and cheaply. The market is already realizing that fact, and demand for more performant, bespoke software at lower costs/prices is increasing.

Those who are good at understanding the primary areas of concern in software design generally, and who can communicate well, will continue to be very much in demand.

megamix 12 hours ago||
The most important question is: who will get paid the most? I don't think the future of software engineering will be attractive if all you do is more work for the same or even less pay. A second danger is that too much reliance on AI tools will centralise knowledge, and THAT is the scariest thing. Software systems will need to perform for a long time, so having juniors on board, and people who understand software architecture, will be massively important. Or will all software crash when this generation retires?
jpadkins 6 hours ago||
the people who start successful new companies will get paid the most.
falloutx 12 hours ago||
The people who don't lose their jobs will also not be in a great spot. There won't be a guarantee that they will never lose their jobs; they will continue to live on a wobbly, uncertain foundation and will get fired for the first "no" they say to management. If software engineering falls, all the related industries will fall too, creating a domino effect that none of the execs can imagine right now.
menaerus 10 hours ago|||
I really do wonder what sort of economic change is coming, because companies will hypothetically need to hire fewer people to sustain today's output. They can basically do that today, so it's not even hypothetical anymore; it just needs some time to take off.

The question IMO is: who will be creating the demand on the other side for all of these goods if so many people are left without jobs? UBI, redistribution of wealth through taxes? I'm not so convinced about that ...

Ray20 9 hours ago|||
> The question IMO is: who will be creating the demand on the other side for all of these goods if so many people are left without jobs?

There is no reason people will be left without jobs. Ultimately, a "job" is simply a superstructure for satisfying people's needs. As long as people have needs and the ability to satisfy them, there will be jobs in the market. AI changes nothing in those respects.

menaerus 7 hours ago||
I think it very much does. Those exact needs have so far been fulfilled by N people's jobs. Today those same needs are going to be fulfilled by N-M people's jobs. For your hypothesis to work, human (or rather, market) needs have to scale such that the M people left redundant are needed to cover the new gap. The thing is, I am not so sure about the "scaling" part. Not to mention that people's skills also need to scale so that they can deliver the value behind that market growth. The skills we had until yesterday are slowly becoming a thing of the past, so I wonder what type of skills people will need in order to get those "new" jobs. I would genuinely like to hear opinions, because I am not convinced the market will self-adjust such that the economy remains the same.
falloutx 10 hours ago|||
UBI is just a pipe dream. The rich are clutching their pearls even harder.
menaerus 10 hours ago||
I think so too.
hoss1474489 11 hours ago|||
> there won't be a guarantee that they will never lose their jobs; they will continue to live on a wobbly, uncertain foundation

The people who lose their jobs prove this was always the case. No job comes with a guarantee, even ones that say or imply they do. Folks who believe their job is guaranteed to be there tomorrow are deceiving themselves.

Eong 20 hours ago||
Love the article. I struggled with my new identity and had to write https://edtw.in/high-agency-engineering/ for myself, but I also came to the realisation that the industry is shifting, especially for junior engineers.

Curious how the Specialist vs Generalist theme plays out: who is going to feel it *first* as AI gets better over time?

mellosouls 23 hours ago||
On the junior developer question:

A humble way for devs to look at this, is that in the new LLM era we are all juniors now.

A new entrant with a good attitude, curiosity and interest in learning the traditional "meta" of coding (version control, specs, testing etc) and a cutting-edge, first-rate grasp of using LLMs to assist their craft (as recommended in the article) will likely be more useful in a couple of years than a "senior" dragging their heels or dismissing LLMs as hype.

We aren't in coding Kansas anymore, junior and senior will not be so easily mapped to legacy development roles.

snovv_crash 13 hours ago|
Sorry, but no. Software engineering is so high-dimensional that there is no rulebook for doing it the way there is for building a bridge. You need to develop taste, much like high-level Go players do. This is even more critical as LLMs start to spit out code at an ever higher rate, allowing entropy to accumulate much faster and letting unskilled people paint themselves into corners.

I think of it a bit like ebike speed limits. Previously, to go above 25 mph on two-wheeled transport you needed a lot of time training on a bicycle, which gave you the skills, or you needed your motorcycle license, which required you to pass a test. Now people can jump straight on a Surron and hare off at 40 mph with no handling skills and no license. Of course this leads to more accidents.

Not to say LLMs can't solve this eventually, RL approaches look very strong and maybe some kind of self-play can be introduced like AlphaZero. But we aren't there yet, that's for sure.

mellosouls 7 hours ago||
I don't think that conflicts with what I said, but perhaps it counters something I didn't say: your ebike analogy implies a recklessness that a junior with the attributes I mentioned would be averse to. Conversely, the senior with a full grasp of LLMs, plus the "taste" and judgement, will naturally be ahead.

But the comparison I made was between the junior with a good attitude and an expert grasp of LLMs, and the stick-in-the-mud, disinterested "senior". Those are the cases where the senior and junior roles will become more ambiguous in demarcation as time moves forward.

mishkovski 7 hours ago||
This article reads like it was written by an AI.
zqna 14 hours ago||
My question: those people who were building crappy, brittle software, full of bugs and other suboptimal behavior, which was the main reason the evolution of that software slowed down, will they now begin writing better software because of AI? Answering yes implies that the main cause of those problems was that those developers didn't have enough time to analyze the problems or to build protective harnesses. I would strongly argue that was not the case: the main cause is intellectual and personal in nature, an inability to build abstractions, to follow up root causes (and thus acquire the necessary knowledge), or to avoid being distracted by some new toy. In 2-5 years I expect the industry to go into panic mode, as there will be a shortage of people who can maintain the drivel now being created en masse. The future is bright for those with the brains; they just need to wait this out.
streetcat1 18 hours ago||
For some reason the article misses two important points:

1) The AI code maintenance question: who will maintain the AI-generated code?

2) The true cost of AI: once the VC/PE money runs out and companies charge the full cost, what will happen to vibe coding?

cyberpunk 4 hours ago||
AI assists the maintenance. A lot of posts seem to assume that once the code is committed, the AIs just, what, go away? If you can write a test for a bug, it can likely be fully or partially fixed by an AI even today.
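To make the "test for a bug" idea concrete, here is a minimal sketch (assuming Python; the `slugify` helper and its whitespace bug are invented for illustration, not from any real project). The failing test gives an agent a machine-checkable target:

```python
import re

def slugify(text: str) -> str:
    # Buggy version: replaces each whitespace character separately,
    # so "a  b" becomes "a--b" instead of "a-b".
    return re.sub(r"\s", "-", text.strip().lower())

def slugify_fixed(text: str) -> str:
    # Fixed version: collapse each run of whitespace into one hyphen.
    return re.sub(r"\s+", "-", text.strip().lower())

def test_slugify_collapses_whitespace():
    # Fails against slugify(), passes against the fix; that pass/fail
    # signal is what an agent can iterate on.
    assert slugify_fixed("Hello   World") == "hello-world"
```

The point is not the fix itself but that the test turns "there is a bug" into a binary signal an agent can loop against.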
NitpickLawyer 16 hours ago|||
I think this post is a great example of a different point made in this thread. People confuse vibe-coding with llm-assisted coding all the time (no shade for you, OP). There is an implied bias that all LLM code is bad, unmaintainable, incomprehensible. That's not necessarily the case.

1) Either you, the person owning the code; or you + LLMs; or just the LLMs in the future. All of these can work, and they work better with a bit of prep work.

The latest models are very good at following instructions. So instead of "write a service that does X", you can ask for specifics (e.g. write a modular service that uses concept A and concept B to do Y; it should use the x y z tech stack, this ruleset, these conventions; before testing, run these linters and formatters; fix every env error before testing; etc.).

That's the main difference between vibe-coding and llm-assisted coding. You get to decide what you ask for, and you get to set the acceptance criteria. The key point that non-practitioners always miss is that once a capability becomes available in these models, you can layer it on top of previous capabilities and get a better end result. Higher instruction adherence -> better specs -> longer context -> better results -> better testing -> better overall loop.
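For illustration, a spec-style prompt of that kind might look like this (every project name, file, and command below is a made-up example):

```
Write a modular invoicing service in TypeScript using Express.
- Follow the conventions in CONTRIBUTING.md (named exports only).
- Validate every request body with a Zod schema before touching the DB.
- Acceptance criteria: `npm run lint` and `npm test` pass with no warnings.
- Fix every environment/setup error before declaring the task done.
```

The acceptance criteria at the bottom are what separate this from vibe-coding: the model is told up front how its output will be judged.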

2) You are confusing the fact that some labs subsidise inference costs (in exchange for access to data, usage metrics, etc.) with the true cost of inference at a given model size. You can already get a good indication of what inference costs today for any model size: 3rd-party inference shops exist, and they are not subsidising costs (they have no reason to). You can do the math as well and figure out an average cost per token for a given capability. And those open models are out, they're not going to change, and you can get the same capability tomorrow or in 10 years (and likely at lower cost, since hardware improves, the inference stack improves, etc.).
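As a back-of-the-envelope sketch of "doing the math" (all prices and token counts below are invented placeholders, not real rates from any provider):

```python
# Hypothetical per-token pricing; real rates vary by provider and model.
PRICE_PER_MTOK_IN = 0.50    # $ per million input tokens (assumed)
PRICE_PER_MTOK_OUT = 2.00   # $ per million output tokens (assumed)

def monthly_cost(tasks_per_day: int, in_tok: int, out_tok: int,
                 workdays: int = 22) -> float:
    """Estimated monthly spend: per-task token cost times task volume."""
    per_task = (in_tok * PRICE_PER_MTOK_IN +
                out_tok * PRICE_PER_MTOK_OUT) / 1_000_000
    return per_task * tasks_per_day * workdays

# e.g. 40 agent runs/day, each using 30k input + 5k output tokens
estimate = monthly_cost(40, 30_000, 5_000)  # 22.0 dollars/month
```

Swap in a provider's published rates and your own usage numbers to judge whether the "half a paycheck" scenario is plausible for a given model size.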

haspok 11 hours ago||
Perhaps thinking about AI generated code in terms of machine code generated by a compiler helps. Who maintains the compiled program? Nobody. If you want to make changes to it, you recompile the source.

In a similar fashion, AI generated code will be fed to another AI round and regenerated or refactored. What this also means is that in most cases nobody will care about producing code with high quality. Why bother, if the AI can refactor ("recompile") it in a few minutes?

qsera 11 hours ago|
I would like to see how things look when using AI requires half of a dev's current paycheck.