Posted by milanm081 1 day ago

Laws of Software Engineering (lawsofsoftwareengineering.com)
1115 points | 505 comments | page 5
Sergey777 1 day ago|
A lot of these “laws” seem obvious individually, but what’s interesting is how often we still ignore them in practice.

Especially things like “every system grows more complex over time” — you can see it in almost any project after a few iterations.

I think the real challenge isn’t knowing these laws, but designing systems that remain usable despite them.

0xpgm 15 hours ago||
An extension to Zawinski's Law, every web service attempts to expand until it becomes a social network.
bogomog 7 hours ago|
A modernization, really.
tfrancisl 1 day ago||
Remember, just because people repeated something enough times for it to make this list does not mean it's true. There may be some truth in most of these, but none of them are "Laws". They are aphorisms: punchy one-liners that try to distill something as complex as human interaction and software design.
macintux 1 day ago||
Some similarly-titled (but less tidily-presented) posts that have appeared on HN in the past, none of which generated any discussion:

* https://martynassubonis.substack.com/p/5-empirical-laws-of-s...

* https://newsletter.manager.dev/p/the-unwritten-laws-of-softw..., which linked to:

* https://newsletter.manager.dev/p/the-13-software-engineering...

kwar13 8 hours ago||
Half of these are not about software engineering at all; they're just general management principles.
galaxyLogic 20 hours ago||
The Law of Leaky Abstractions. What is a "leaky" abstraction? How does it "leak"?

I wonder if it should be called "Law of Leaky Metaphors" instead. Metaphor is not the same thing as Abstraction. I can understand a "leaky metaphor" as something that does not quite make it, at least not in all aspects. But what would be a good EXAMPLE of a Leaky Abstraction?
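One classic illustration (all names here are hypothetical, a sketch rather than a canonical definition): wrap a remote store so it looks like a plain mapping. The mapping abstraction promises only KeyError on failure, but the network underneath can surface a TimeoutError, so the hidden implementation detail "leaks" through the interface.

```python
class FlakyBackend:
    """Stand-in for a network-backed store (hypothetical)."""
    def __init__(self, data, down=False):
        self.data = data
        self.down = down

    def fetch(self, key):
        if self.down:
            raise TimeoutError("network unreachable")
        return self.data[key]


class RemoteDict:
    """Presents the backend as an ordinary mapping -- the abstraction."""
    def __init__(self, backend):
        self._backend = backend

    def __getitem__(self, key):
        # The leak: callers written against a dict expect at worst
        # KeyError here, never TimeoutError.
        return self._backend.fetch(key)


d = RemoteDict(FlakyBackend({"a": 1}))
assert d["a"] == 1  # behaves exactly like a dict...

d = RemoteDict(FlakyBackend({"a": 1}, down=True))
leaked = False
try:
    d["a"]  # ...until the hidden network detail surfaces
except TimeoutError:
    leaked = True
assert leaked
```

The abstraction is fine right up until the moment a caller has to know about the network anyway — which is the point of the law.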

dgb23 1 day ago||
I like this collection. It's nicely presented and at least at a glance it adds some useful context to each item.

While browsing it, I of course found one that I disagree with:

Testing Pyramid: https://lawsofsoftwareengineering.com/laws/testing-pyramid/

I think this is backwards.

Another commenter, WillAdams, mentioned A Philosophy of Software Design (which should really be called A Set of Heuristics for Software Design), and one of its key concepts is small (general) interfaces with deep implementations.

A similar heuristic comes up in Elements of Clojure (Zachary Tellman), where he talks about "principled components and adaptive systems".

The general idea: You should greatly care about the interfaces, where your stuff connects together and is used by others. The leverage of a component is inversely proportional to the size of that interface and proportional to the size of its implementation.
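A minimal sketch of that shape (names hypothetical): one small public entry point, a larger private implementation behind it, and tests that touch only the surface.

```python
# A "deep" component: tiny interface, richer implementation behind it.

def normalize_phone(raw: str) -> str:
    """Public surface: raw input in, canonical form out."""
    digits = _strip_noise(raw)
    return _format(digits)

# -- implementation details; callers (and tests) need not know these --

def _strip_noise(raw: str) -> str:
    return "".join(ch for ch in raw if ch.isdigit())

def _format(digits: str) -> str:
    if len(digits) == 10:
        return f"({digits[:3]}) {digits[3:6]}-{digits[6:]}"
    return digits

# The test exercises only the interface, not _strip_noise/_format:
assert normalize_phone("555-867-5309") == "(555) 867-5309"
```

The internals can be reshuffled freely — merged, split, rewritten — and the test above never has to change, which is the leverage being described.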

I think the way that connects to testing is that architecturally granular tests (down the stack) are a bit like pouring molasses into the implementation, rather than focusing on what actually matters, which is what users care about: the interface.

Now of course we as developers are the users of our own code, and we produce building blocks that we then use to compose entire programs. Having example tests for those building blocks is convenient and necessary to some degree.

However, what I want to push back on is the implied idea of having to hack apart or keep apart pieces so we can test them with small tests (per method, function etc.) instead of taking the time to figure out what the surface areas should be and then testing those.

If you need hyper granular tests while you're assembling pieces, then write them (or better: use a REPL if you can), but you don't need to keep them around once your code comes together and you start to design contracts and surface areas that can be used by you or others.

nazgul17 1 day ago|
I think the general wisdom in that scenario is to keep them around until they get in the way. Let them provide a bit of value until they start being a cost.
nopointttt 22 hours ago||
The one I keep coming back to is "code you didn't write is code you can't debug." Every fancy dep I grabbed to save an afternoon ended up costing me weeks later when something upstream broke in some way I had no mental model for. LLM generated code has the same problem now. Looks fine until you hit a case it doesn't cover and you're trying to reverse engineer what you let it write.
noduerme 1 day ago|
I'd like to propose a corollary to Gall's Law. Actually, it's a self-proving tautology already contained within the term "lifecycle": any system that lasts longer than a single lifecycle oscillates between (reducing to) simplicity and (adding) complexity.

My bet is on the long arc of the universe trending toward complexity... but in spite of all this, I don't think all this complexity arises from a simple set of rules, and I don't think Gall's law holds true. The further we look at the rule-set for the universe, the less it appears to be reducible to three or four predictable mechanics.

jt2190 6 hours ago|
I think this site doesn’t capture Gall’s Law correctly, and your observations are closer to the original.

Gall notes that the universe naturally trends toward complexity and unintended consequences and therefore complex designs should be assumed to already be full of these unintended consequence “bugs”. He proposes that systems should be designed:

- with less scope to reduce unintended consequences,

- with less rigidity to allow for workarounds when unintended consequences arise, and

- to take advantage of “momentum” to reduce the energy required to use the system correctly. In other words, make the right thing the easy thing, remembering that the easiest thing to do is nothing, so systems will halt if operators get too busy with other tasks.
