
Posted by nilirl 22 hours ago

Why senior developers fail to communicate their expertise(www.nair.sh)
644 points | 285 comments
donatj 18 hours ago||
Speed… speed… velocity… speed. All I hear about these days. Every meeting.

Honest question: does high velocity / being a first mover ever really pay off these days?

I don't feel like being first to market with AI slop has actually paid off for anyone. Am I wrong? Am I missing something? Am I out of touch?

The way I see it, first movers do a lot of the work of proving the idea, and then everyone else swoops in with a better product, or at least a cheaper one.

Beyond that, let's take the company I work for, for example. We have an ingrained and actually relatively happy customer base on a subscription model. I feel like the only thing increased velocity can do is rapidly ruin their experience.

dotancohen 7 hours ago||
I stopped communicating my experience-derived lessons when I discovered that 1. it cheapened the perception of "my genius", and 2. nobody wants to hear it anyway. From non-tech workers for whom I'd write a .bat or bash script, to engineers for whom I'd debug a complex race condition - they all just want the answer and care nothing about how I got it.

Fine, then, I'll keep the experience to myself.

drbojingle 14 hours ago||
They fail to communicate in the same way we fail to download a copy of "the truths of the world as we know it" into every child's brain. It's easy to say "look both ways when you cross the road," but speech is so one-dimensional. It's a slow tape reel, and that's just the encoding.
robin64 9 hours ago||
I enjoyed reading this, and I agree with the underlying message: communicating better with our audience.

I think the framing started on the right path and then took a slightly wrong turn.

Both loops presented benefit from being tighter and faster. One takes a system to a "stable" (maintainable) setpoint quickly; the other handles uncertainty.

And the additional insight about splitting the systems to better adapt to AI… we’ve described spikes for years, well before AI went mainstream.

block_dagger 16 hours ago||
It seems to me that the author fails to extrapolate to the effects of recursive self-improvement. The only things preventing 95% engineer obsolescence will be compute/energy constraints and the speed of adoption, which can take years for large infrastructure companies. But it's coming.
halfcat 15 hours ago|
Cuts both ways. If supply chain attacks also benefit from recursive self-improvement, everyone's going to be working in air-gapped facilities. Departments will also need to be air-gapped from one another. And each team air-gapped. And so on.

There’s a speed limit, because the faster you go, the less room for error you have. It’s the same as being heavily leveraged with debt. If you have a cash investment and it drops by 50%, you can just wait. If you’re leveraged 100-to-1, a 1% drop forces liquidation and wipes you out.
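The leverage arithmetic above can be sketched as a quick check (a minimal illustration with a made-up helper name, not financial code):

```python
def equity_after_drop(cash, leverage, drop):
    """Remaining equity after the position drops by `drop` (a fraction).

    Position size is cash * leverage; the loss comes out of your cash.
    """
    position = cash * leverage
    loss = position * drop
    return cash - loss

# Unleveraged: a 50% drop leaves half your money, so you can wait it out.
print(equity_after_drop(100, 1, 0.50))    # 50.0

# 100-to-1 leverage: a mere 1% drop consumes all equity (forced liquidation).
print(equity_after_drop(100, 100, 0.01))  # 0.0
```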

don-code 20 hours ago||
I agree with the author's premise - that one feedback loop optimizes for speed and the other for scale - but I don't think the market is bearing out the conclusion - that AI should be used to enable more rapid experimentation, so we can better scale what works.

Many vendors seem to be learning (or not learning, but throwing their weight against it anyway) that hastily generated AI features cause customer dissatisfaction, as more people brand them "slop".

In the best case, the users give the company more chances. Infinitely more chances.

In a worse case, the users assume the new feature will always be bad, given their first impression. It's hard for a vendor to make people reconsider a first impression.

The absolute worst case is that AI enables a new market, but the first attempts are so poor that the first movers make people write that market off as a dead end, leading to a lost opportunity.

egorfine 5 hours ago||
> this is my senior developer. The avoider, the reducer, the recycler. They want to avoid development as much as they can

And push an insurmountable pile of technical debt onto the successor.

Well, yeah, I understand the idea and I'm all for it: the less code the better, the fewer changes the better.

However, in certain industries it is no longer the right approach. In modern frontend development, if you don't update your codebase for even a couple of months, it falls so far behind that pushing an upgrade becomes far more expensive than making daily minor package updates. Yeah, I hate this as much as you do, but this is the pace frontend moves at, and if you don't keep up, technical debt mounts.

doxeddaily 9 hours ago|
I actually think the article makes some pretty interesting points. It's not about the name of it though.