
Posted by xk3 12/19/2025

Kernighan's Lever (linusakesson.net)
112 points | 52 comments
teo_zero 12/22/2025|
Why does everybody conflate "twice as hard" with "needs someone twice as clever"? Why does nobody contemplate twice the time, a team of twice the people, debugging tools twice as powerful, or twice the cost?
commandlinefan 12/22/2025||
Far be it from me to disagree with Kernighan but... when I think of "clever" code, I think of things like Duff's device. That's clever as hell. It's also perfectly debuggable. When I deal with undebuggable code in the wild, it's usually due to people doing things like declaring global (sorry, "public static") variables that connect to live databases and start downloading definition tables into memory before the code can run.
vintagedave 12/22/2025|||
Because doubling the people doesn’t halve the difficulty of a problem.
randallsquared 12/22/2025||
I thought you were serious until the very last alternative. Well done!
userbinator 12/22/2025||
(2012)

This article can be summarised in one word: learning. I've noticed over the years that there seems to be a growing divide amongst programmers, between those who believe in learning and those who don't (and actively try to avoid it). Unfortunately the latter has become the majority position, but I still try to show others this article when they don't understand code that I've written and would rather I stoop to their level.

A look around the site at what else he has accomplished should be enough evidence that he isn't just a charlatan, unlike some others who have made a consulting career out of spouting pompous hot air about methodology.

gregw2 12/22/2025||
I applaud the author for thinking afresh on this topic.

I am also comfortable with the closing comments that you can't always dumb down your code or you stagnate and never learn new tricks/techniques. It is a good thing to keep in mind.

But I have also seen people waste a lot of their (and others') time trying to be clever in ways which, as an outsider with additional context, I could anticipate wouldn't pan out. And I've let it slide and watched it end not-well: long unnecessary debugging cycles, missed deadlines, and boilerplate YAGNI abstraction and complexity that didn't "make the easy things easy and the hard things possible" but instead made the easy things complicated.

I myself have been accused of that when trying to design optimal "scalable" architectures up front. And I myself have patched over inherited "clever" things with flaws that I handled by adding yet more incremental "cleverness" when, N years later, I wished I had just cut the Gordian knot of complexity on day 1.

I think Kernighan's Law is perhaps best applied as a cautionary question to ask yourself or others along the way: are you getting too clever, and can you (and those around you) really debug the cleverness you are pursuing?

Complexity and cleverness may be needed, but have you considered re-approaching the problem from a standpoint that minimizes the need for cleverness?

Put another way, there is cleverness that brings "simplicity of code" that does not bring "simplicity of debugging or maintenance" by yourself or others. It's wise to be aware of that.

I view cleverness as somewhat like "innovation tokens"... you should "pick a small handful" of them strategically but not overuse them. I don't see that caution in a pure statement of "Kernighan's lever".

Also seemingly ignored in the poster's perspective is any acknowledgement that software is, in a huge chunk of scenarios, a team sport. It's all fine for you to get more clever by pushing yourself, but if you don't transfer that knowledge/cleverness to the broader development and support group, it isn't good for the organization, and perhaps not even for you, once your clever code hardens, stagnates, and gets refactored out.

(Of course, for some programmers, that's a virtue; write your code in an obscure language/style so that nobody else will take credit or touch it and mess it up. I literally had an acquaintance who, sensing in me a similar competence (or elitism?), boasted to me about his cleverness in doing this at his workplace. I was intrigued, but silently not impressed.)

dahart 12/22/2025||
It’s superficially easy to bike-shed on the use of clever code and debugging, but underneath there is an interesting and fundamentally difficult question about what to spend your limited time learning, and how to think about it.

Yes, Kernighan was trying to pass along advice to future programmers about what he thinks reduces unnecessary effort, and yes, at the same time, spending time debugging difficult code often/ideally increases your skill, and avoiding that effort might mean you miss out on developing those skills. Both things are true, so how does one decide which way to go? Of course it depends on your goals, but it’s also worth asking what the opportunity cost is. What if, instead of doing battle with complexity and debuggers, you could instead pick up different skills?

It is possible that people should deliberately ignore Kernighan’s advice, and use clever code in order to gain the skills and experience needed to see that Kernighan was right. ;) It’s also possible that spending that valuable time learning how to scale to larger systems would pay off. Or, for some people, spending less time coding and more time devoted to other pursuits.

The ‘stopped evolving’ comment seems like it might be designed to stir the pot, but the best programmers I’ve ever known tend to work hard at reducing complexity by thinking hard about dependencies and about how to write large systems. They don’t necessarily shy away from high performance tricks or hard problems. It’s possible that what a young programmer means by “clever” and what a very seasoned programmer means by “clever” aren’t the same thing at all. https://www.teamten.com/lawrence/writings/norris-numbers.htm...

zahlman 12/22/2025||
(2012)

> You effortlessly wield clever programming techniques today that would've baffled your younger self. (If not, then I'm afraid you stopped evolving as a programmer long ago.)

... Perhaps, if we allow that "clever techniques" can yield simpler results than my former self produced.

stodor89 12/22/2025|
My younger self effortlessly wielded clever programming techniques that continuously baffle my current self.
stodor89 12/22/2025||
[dead]
lupire 12/22/2025||
This article says nothing of substance.
Panzerschrek 12/22/2025|
If debugging is twice as hard as writing code, we have at least two choices. One is to write simpler code. The other is to not debug at all, which may be achieved by using a programming language far better than C, one that allows catching (almost) all bugs at compile time.
uecker 12/22/2025||
There is no programming language better than C ;-) Some people are just not yet experienced enough to have learned this. (Just trolling you back)
Panzerschrek 12/22/2025||
50 years of widespread C usage has shown that simply trying to write error-free C doesn't work. But surprisingly, some people still believe it's possible.
lelanthran 12/22/2025|||
> 50 years of widespread C usage has shown that simply trying to write error-free C doesn't work.

Millions upon millions of lines of C code, over decades, controlled (and still control) things around you whose failure could kill you: cars, microwaves, industrial machinery, munitions, aircraft systems ... with so few errors attributable to C that I can only think of one prominent example.

So sure, you can get bugs in code written in C. In practice, the development process matters more for fault reduction than the language chosen. And yes, I speak from experience, having spent considerable parts of my career in embedded systems.

uecker 12/22/2025|||
Writing without errors in other languages doesn't work either. And if you go toward formal verification (which also does not completely avoid errors), C has good tools.
Panzerschrek 12/22/2025||
By using a better language you avoid the errors typical of C, which usually require debugging. Logical errors may still happen, but they are easy to identify without even running a debugger.
uecker 12/22/2025||
From your comments I get that you drank the Kool-Aid, but I see no argument.
zahlman 12/22/2025||
It's honestly strange to me that people still believe things like type systems, effect systems, and borrow checkers can actually do that, at least without spoiling the features that make compile-time detection preferable in the first place.