Posted by nimbleplum40 4/3/2025
LLMs, in the most general case, do neither what you tell them nor what you want them to do. Surprisingly, this can be less infuriating, since it now feels like there is another actor to blame - even though an LLM is still mostly deterministic, and you can form a pretty good idea of what quality of response to expect for a given prompt.
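A minimal sketch of the "mostly deterministic" point, assuming a Hugging Face-style causal LM (the model name and prompt are placeholders, not anything from the comment): with sampling turned off, decoding is greedy, so a fixed prompt yields a fixed completion.

    # Hypothetical sketch: greedy decoding makes the same prompt produce the same output.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("gpt2")            # placeholder model
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    ids = tok("Requirements are not implementations because", return_tensors="pt").input_ids
    out1 = model.generate(ids, do_sample=False, max_new_tokens=20)  # greedy: no sampling randomness
    out2 = model.generate(ids, do_sample=False, max_new_tokens=20)
    assert (out1 == out2).all()   # same prompt, same model, same completion
    print(tok.decode(out1[0]))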
Programming is about iteratively expressing a path towards satisfying a set of goals.
What LLMs are doing now is converting "requirements" into "formalizations".
I don't think Dijkstra was wrong in saying that doing programming in plain language is a pretty weird idea.
We want to concretize ideas in formalisms. But that's not what any human (including Dijkstra) starts with... you start with some sort of goal, some sort of need, and requirements.
LLMs merely reduce the time/effort required to go from goals -> formalism.
TLDR: Requirements != Implementation
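To make that concrete, here is a toy sketch (the requirement and all names are invented for illustration): the plain-language requirement underdetermines the implementation, and the formalization is where the remaining decisions get made.

    # Requirement (plain language): "users should see their most recent orders first".
    # One possible formalization among many: the requirement says nothing about ties,
    # time zones, or pagination, so those decisions live in the implementation.
    from datetime import datetime
    from typing import Dict, List

    def most_recent_first(orders: List[Dict]) -> List[Dict]:
        # Interpretation chosen here: sort by the 'created_at' timestamp, newest first.
        return sorted(orders, key=lambda o: o["created_at"], reverse=True)

    orders = [
        {"id": 1, "created_at": datetime(2025, 4, 1)},
        {"id": 2, "created_at": datetime(2025, 4, 3)},
    ]
    print([o["id"] for o in most_recent_first(orders)])  # [2, 1]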
Seeing as transformers are relatively simple to implement…
It stands to reason he was, in some sense, right. LLMs are damn easy to use.
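As for what "relatively simple to implement" means here, a bare-bones sketch of the core operation (single-head self-attention in NumPy, with toy shapes and random weights; a real transformer adds multi-head projections, an MLP, residuals, and layer norm):

    import numpy as np

    def self_attention(x, w_q, w_k, w_v):
        """x: (seq_len, d_model); w_q/w_k/w_v: (d_model, d_head)."""
        q, k, v = x @ w_q, x @ w_k, x @ w_v        # project tokens to queries/keys/values
        scores = q @ k.T / np.sqrt(k.shape[-1])    # how strongly each token attends to each other
        w = np.exp(scores - scores.max(-1, keepdims=True))
        w /= w.sum(-1, keepdims=True)              # row-wise softmax
        return w @ v                               # each token becomes a weighted mix of values

    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 8))                    # 4 tokens, toy embedding size 8
    proj = [rng.normal(size=(8, 8)) for _ in range(3)]
    print(self_attention(x, *proj).shape)          # (4, 8)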
Wittgenstein argued that the differences between personal and cultural language make it impossible to agree on anything foundational to philosophy.
Gödel did something similar to discrete structures by hacking self-reference -- a kind of recursion with no exit state.
I think pair programming with an LLM is interesting. You get to compare personal context with aggregate context. Plus, the external feedback helps me break out of excessive personal self-reference.