Posted by carlos-menezes 4/12/2025
Some genius decided that, to make time input convenient, YAML would parse HH:MM:SS as SS + 60×MM + 60×60×HH. So you could enter 1:23:45 and it would give you the correct number of seconds in 1 hour, 23 minutes, and 45 seconds.
They neglected to put a maximum on the number of such sexagesimal places, so if you put, say, six numbers separated by colons like this, it would be parsed as a very large base-60 integer.
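PyYAML still implements the YAML 1.1 rules, so this is easy to reproduce (a minimal sketch; assumes `pip install pyyaml`):

```python
import yaml  # PyYAML implements YAML 1.1, including sexagesimal integers

# HH:MM:SS is parsed as a base-60 integer: 1*3600 + 23*60 + 45
print(yaml.safe_load("1:23:45"))      # 5025

# Nothing caps the number of colon-separated groups, so a longer
# chain just becomes a bigger base-60 number
print(yaml.safe_load("1:23:45:12"))   # 301512
```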
Imagine my surprise when, while working at a networking company, we had some devices which failed to configure their MAC addresses in YAML! After this YAML config file had been working for literal years! (I believe this was via netplan? It's been like a decade, I don't remember.)
Turns out, if an unquoted MAC address had even a single non-decimal hex digit, it would do what we expected (parse as a string). This is not only by FAR the more common case, but also we had an A in our vendor prefix, so we never ran into this "feature" during initial development.
Then one day we ran out of MAC addresses and got a new vendor prefix. This time it didn't have any letters in it. Hilarity ensued.
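A minimal reproduction with PyYAML (the MAC addresses here are made up; note that PyYAML's exact regex additionally requires every colon-separated group to be below 60 for the sexagesimal rule to fire):

```python
import yaml  # PyYAML, YAML 1.1 behavior; hypothetical MACs for illustration

# A hex letter anywhere keeps the scalar from resolving as a number -> string
print(yaml.safe_load("mac: 0a:20:30:40:50:12"))  # {'mac': '0a:20:30:40:50:12'}

# All-decimal groups match the sexagesimal integer rule instead
print(yaml.safe_load("mac: 10:20:30:40:50:12"))  # {'mac': 8041827012}
```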
(This behavior has thankfully been removed in more recent YAML standards.)
index.html.pl is where the problem started and the reason why the officially recommended file extension for Perl files used to be (still is?) *.plx.
I don't have the Camel book at hand, but Randal Schwartz's Learning Perl 5th edition says:
"Perl doesn't require any special kind of filename or extension, and it's better not to use an extension at all. But some systems may require an extension like plx (meaning PerL eXecutable); see your system's release notes for more information."
The YAML document from hell (566 points, 2023, 353 comments) https://news.ycombinator.com/item?id=34351503
That's a Lot of YAML (429 points, 2023, 478 comments) https://news.ycombinator.com/item?id=37687060
No YAML (Same as above) (152 points, 2021, 149 comments) https://news.ycombinator.com/item?id=29019361
I never liked the provisioning overlap Ansible has with Terraform, so it makes sense to me: provision servers with tf, configure them with another tool, whether it's ansible or pyinfra. Well, at least in theory.
Which Ansible is absolutely atrocious at, so that makes sense. Use the best tool for the job for infra (so Terraform, maybe Pulumi/tfcdk if you hate your future self/future teammates).
Even more fun is if you then run your Kubernetes cluster on top of a VM orchestrator such as vSphere, that way you have multiple layers of stateful control planes and compute orchestrators fighting each other.
If only they had had ⊥ and ⊤ somewhere on their keys to work with Booleans directly while designing the languages. In another branch of history, perchance.[1]
[1] https://en.wikipedia.org/wiki/APL_(programming_language)#/me...
Boolean and propositional logic are not the same.
It's not that bad, because you can explicitly turn that behavior off, but ask me how I know =(
"Yeah but it's so convenient"
"Yeah but the benefit of yaml is that you don't need quotes everywhere so that it's more human readable"
DON'T
00,01,02,03,04,05,06,07,OH SHIT
Ansible has a pretty common issue with file permissions, because pretty much every numeric representation of a file mode is a valid number in YAML - and most of them are not what you want.
Sure, we can open up a whole 'nother can of worms if we should be programming infrastructure provisioning in YAML, but it's what we have. Chef with Ruby had much more severe issues once people started to abuse it.
Plus, ansible-lint flags that reliably.
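A sketch of how this bites, using PyYAML's YAML 1.1 integer rules (the usual fix, which ansible-lint's risky-octal rule pushes you toward, is to quote the mode):

```python
import yaml  # YAML 1.1 integer rules bite file modes

# Leading zero -> octal: 0644 really is rw-r--r--
print(yaml.safe_load("mode: 0644"))    # {'mode': 420}

# No leading zero -> decimal 644, i.e. octal 1204: not what you meant
print(yaml.safe_load("mode: 644"))     # {'mode': 644}

# Quoting keeps it a string, which the consumer can interpret as octal
print(yaml.safe_load("mode: '0644'"))  # {'mode': '0644'}
```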
See also p95 but the same couple of users always see the p99 time, due to some bug.
I've only seen it used for configuration.
Don't ask me why, though; it might have something to do with how it's written like a Python file. No user would want to write their data in YAML format.
Robustness has a meaning and it refers to handling bad inputs gracefully. An example of a lack of robustness is allowing a malicious actor to execute arbitrary code by supplying a datum larger than some buffer limit.
Trying to make sense of invalid inputs and do something with them isn't robustness. It's just an example of making an extension to a spec. The extension could be robust or not.
Postel's Law amounts to "have extensions and hacks to handle incorrectly formatted data, rather than rejecting them." So, OK, yes, that entails being robust to certain bad inputs that are outside of the spec, but which land on one of the extensions. It doesn't entail being robust to inputs that fall outside of the core spec and all hacks/extensions.
Cherry picking certain bad inputs and giving them a meaning isn't, by itself, bona fide robustness; robustness means handling all bad inputs without crashing or allowing security to be compromised.
In a distributed non-adversarial setting, this is exactly what you want for robustness.
The problem, as we've come to realise in the time since Postel's law was formulated, is that there is no such thing as a distributed non-adversarial setting. So I get what you're saying.
But your definition of robustness is too narrow as well. There's more to robustness than security. When Outlook strips out a certificate from an email for alleged security reasons, then that's not robustness, that's the opposite, brokenness: You had one job, to deliver an attachment from A to B, and you failed.
Robustness and security can be at odds. It's quite OK to say, "on so and so occasion I choose to make the system not robust, because the robust solution would not be sufficiently secure".
Ouch, no. Dragons be there. Famous last words.
The only area in which it is acceptable to reason this way is graphical user interfaces. (And only if you've already provided an API for reliable automation, so that nobody has to automate the application through its GUI.) I say graphical because, no, this doesn't apply to command interfaces.
Even in the area of GUIs, new heuristics about intent cause annoyances to the users. But only annoyances and nothing more.
Like for example when you update your operating system, and now the window manager thinks that whenever you move a window so that its title bar happens to touch the top of the screen, you must be indicating the intent to maximize it.
I suppose the ship has sailed now that people are deploying LLMs in this way and that and those things intuit intent. They are like Postel's Law on amphetamines. There is a big cost to it, like warming the planet, and the systems become fragile for their lack of specification.
> When Outlook strips out a certificate from an email for alleged security reasons
I would say it's being liberal in what it accepts, if it's an alternative to rejecting the e-mail for security reasons.
It has taken a datum with a security problem and "fixed" it, so that it now looks like a datum without that security problem.
(I can't find references online to this exact issue that you're referring to, so I don't have the facts. Are you talking about incoming or outgoing? Is it a situation like an expired or otherwise invalid certificate not being used when sending an outgoing mail? That would be "conservative in what you send/do".)
Or to put it otherwise, Postel was right to begin with, albeit perhaps just a little too cryptic, and has been frequently misquoted and misinterpreted ever since.
This is especially ironic given that the constructive argument against Postel’s law is generally based on the value of a strict interpretation of specification. If you’re intentionally omitting half of the law, then you have an implementation problem.
Furthermore, none of this has much to do with YAML being a shitty design.
The world doesn't need principles that are half good.
That could undoubtedly lead to misapprehension. As the GP indicated, and as the word itself means, it references systemic stability and self-preservation behaviour. Reciprocally, however, the obligation to be liberal absolutely does not mean absolving faulty inputs of their flaws. For example, it would not excuse a dud response to an SSH handshake like trying to negotiate RC4. Both Steve Crocker and Eric Allman have been at pains to unpack the understanding of robustness, forgiveness, and format canonicalisation in a security context, and they're hardly wrong. It's also why I'm particularly an advocate of the "do", not the "send", formulation. This is a much more systemic and contextual verb in its consequences for implementation.
> staying away from generating inputs for other programs that exercise dark corner cases
This is exactly the kind of focus-solely-on-the-wire misdirection that I identified above as a common misinterpretation. Conforming to the most precise and unambiguous interpretation of a protocol, if there is one, in regards to what an implementation puts on the wire, can most certainly be a part of that, but that isn't always what being conservative looks like, and processing is equally if not more important.
The introduction of Explicit Congestion Notification (ECN), aka RFC 3168 (2001), springs to mind. RFC 791 (1981) defined bits 14 & 15 of the IPv4 header as "reserved for future use" and diagrammatically gave them as zero. RFC 1349 (Type of Service, 1992, now obsoleted) named them "MBZ" (Must Be Zero) bits but gave them to be otherwise ignored. RFC 2474 (DSCP, 1998) did much the same with what it termed the "Currently Unused field".

When ECN was introduced, making use of those bits as a supposedly backwards-compatible congestion signalling mechanism, we discovered that a significant proportion of IP implementations aboard endpoints, routers, and middleboxes were rejecting (by discard or reset) datagrams with nonzero values in those bits. Consequently, ECN has taken two decades to fully enable, and this is where both sides of the principle prove their joint and inseparable necessity: to this day many ECN-aware TCP/IP stacks are passive, stochastic, or incremental with their advertisement of ECN, and equally forgiving if the bits coming back don't conform, because an implementation that resets a connection under circumstances where the developer comprehends the impedance mismatch would be absurd.

Thus both sides of the maxim are fulfilled in order to promote systemic stability and practical availability, giving ECN a path to the widespread interoperability it has today.
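For concreteness, the two bits in question are the low bits of the old TOS octet; per RFC 3168 that octet splits into a six-bit DSCP field and a two-bit ECN field, which a sketch can pull apart like this:

```python
# The former TOS octet: top six bits are DSCP, bottom two are ECN (RFC 3168)
def split_tos(tos):
    dscp = (tos >> 2) & 0x3F  # differentiated services codepoint
    ecn = tos & 0x03          # 00=Not-ECT, 01=ECT(1), 10=ECT(0), 11=CE
    return dscp, ecn

print(split_tos(0b10111011))  # (46, 3): DSCP 46 (EF) with CE marked
```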
The exposition on page 13 of RFC 1122 (Requirements for Internet Hosts, 1989) broadly anticipated this entire scenario, even though the same section misquotes Postel (or, rather, uses the "send" restatement that I find too reductive).
The statement of the robustness principle is an integrated whole. A partial reading is, perhaps ironically, nonconformant; as with Popper's paradox of tolerance, one thing it cannot be liberal about is itself.
"Must be zero" means that when the datum is being constructed, they must be initialized to zero, not that when the datum is being consumed, they must also be validated to be zero.
Violating either rule will cause that implementation not to interoperate properly when it is still found deployed in a future in which the bits have now been put to use.
Rejecting must-be-initialized-to-zero fields for not being zero is not an example of a flaw caused by neglecting to be "liberal in what you accept". It's an example of failing to accept what is required: the requirement is to accept the datum regardless of what is in those reserved bits. It is arguably an instance of failing to "be conservative in what you do". If you are conservative, you stay away from reserved bits. You don't look at them, until such a time as when they have a meaning, accompanied by required (or perhaps optional) processing rules.
Now, I see the point that reading Postel's law might steer some designer away from putting in that harmful check for zero, specifically due to the "be liberal" part of it. But that's just a case of two wrongs accidentally making right. That same designer might refer to "be liberal" again in some other work, and do something stupid in that context.
The only thing that will really help is clear specs that spell out requirements like "the implementation shall not examine these bits for any purpose, including validating their value".
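Such a spec requirement might look like this in code: a hypothetical flags octet whose two low bits are reserved, zeroed on construction and never consulted on consumption:

```python
RESERVED_MASK = 0b0000_0011  # hypothetical reserved low bits of a flags octet

def build_flags(flags):
    """Conservative in what you do: always emit reserved bits as zero."""
    return flags & ~RESERVED_MASK & 0xFF

def parse_flags(octet):
    """Liberal where required: never reject based on reserved bits."""
    return octet & ~RESERVED_MASK & 0xFF  # mask them out; don't validate

print(bin(build_flags(0b1010_0111)))  # 0b10100100: reserved bits zeroed
print(bin(parse_flags(0b1111_1111)))  # 0b11111100: accepted, bits ignored
```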
Surely someone at some point thought it was obvious that “No” should mean “false”, and that’s why we’re now in this mess.
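That guess is YAML 1.1's boolean rule: `yes`/`no`/`on`/`off` in various capitalizations resolve to booleans, which gives you the classic "Norway problem" (the country code `NO` becomes false). Sketched with PyYAML:

```python
import yaml  # PyYAML, YAML 1.1: yes/no/on/off are booleans

print(yaml.safe_load("country: NO"))    # {'country': False}
print(yaml.safe_load("country: 'NO'"))  # {'country': 'NO'}
```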
If you accept crap, then eventually you will receive only crap.
> ... assume that the network is filled with malevolent entities that will send in packets designed to have the worst possible effect ...
[1] https://datatracker.ietf.org/doc/html/rfc761#section-2.10