Posted by nimbleplum40 4/3/2025
Or weak typing. How many languages thought that collapsing strings, integers, and other types into a single "scalar", and making any operation between any operands meaningful, would simplify the language? Yet every single one ended up becoming a total mess instead.
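To make the failure mode concrete, here is a minimal sketch (written as TypeScript with `any` standing in for a weakly typed language; the specific values are my own illustration):

```typescript
// Illustrative only: what "any operation between any operands is meaningful" buys you.
const a: any = "1";
const b: any = 1;

console.log(a + b);                      // "11"  -- '+' picks string concatenation
console.log(a - b);                      // 0     -- '-' coerces both operands to numbers
console.log(b + true);                   // 2     -- booleans silently become numbers
console.log(([] as any) + ({} as any));  // "[object Object]" -- both become strings
```

The bug then surfaces far from where the bad value was produced, which is exactly the "total mess" being described.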
Or constraint-based UI layout. Looks so simple, so intuitive on small examples, yet it totally fails to scale to even a dozen basic controls. Yet the idea keeps reappearing from time to time.
Or an attempt at dependency management by making some form of symlink to another repository, e.g. git submodules, or CMake's FetchContent/ExternalProject? Yeah, good luck scaling that.
Maybe software engineering should have some sort of "Hall of Ideas That Definitely Don't Work", so that young people entering the field could save the time they would otherwise spend implementing one more incarnation of an idea already known to be bad.
I'm deeply curious to know how you could easily and definitively work out which ideas "Definitely Don't Work".
Mathematics and Computer Science seem to be littered with unworkable ideas that have made a comeback when someone figured out how to make them work.
What this Hall could contain, for each idea, is a list of reasons why the idea has failed in the past. That would at least give future Quixotes something to measure their efforts by.
I can get behind that :)...
Flowchart-based programming scales badly. Blender's game engine (abandoned) and Unreal Engine's "blueprints" (used only for simple cases) are examples.
It doesn’t really get complicated, but you can very quickly end up with drawings with very high square footage.
As a tool for planning, it’s not ideal, because “big-picture” is hard to see. As a user following a DRAKON chart though, it’s very, very simple and usable.
Link for the uninitiated: https://en.m.wikipedia.org/wiki/DRAKON
Of course, it's best that such learning happens before one has the mandate to derail the whole project.
FWIW, neural networks would have been in that pool until relatively recently.
The Hall would then end up containing a spectrum ranging from useless ideas to hard problems. Distinguishing between the two based on documented challenges would likely be possible in many cases.
A native submodule approach would fail at link time or at runtime due to an attempt to mix incompatible files in the same build run, or, in some build systems, simply due to duplicate symbols.
That "just in a recursive way" addition hides a lot of important design decisions that separate having dependency manager vs. not having any.
Yet JavaScript and Python are the most widely used programming languages [1], which suggests your analysis is mistaken here.
[1] https://www.statista.com/statistics/793628/worldwide-develop...
Similarly, there's great demand for a typed layer on top of Javascript:
- Macromedia: (2000) ActionScript
- Google: (2006) GWT [Compiling Java to JS], and (2011) Dart
- Microsoft: (2012) TypeScript
(Both JavaScript and Python have dynamic typing; Python’s type declarations are a form of optional static type checking.)
Do not confuse these concepts.
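For concreteness, a minimal TypeScript sketch of that optional static layer (the `totalPrice` function is my own toy example): the annotations are checked before the code runs, and the emitted JavaScript stays dynamically typed.

```typescript
// Toy example: the type annotations exist only at compile time and are erased
// from the emitted JavaScript, so the runtime remains fully dynamic.
function totalPrice(unit: number, tax: number): number {
  return unit + tax;
}

totalPrice(10, 2.5);      // 12.5
// totalPrice("10", 2.5); // compile-time error: string is not assignable to number.
//                        // Untyped JavaScript would run it and produce "102.5".
```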
1. <https://www.destroyallsoftware.com/talks/wat>
2. <https://eqeq.js.org/>
All this went out of fashion, leaving behind some good stuff that was built at that time (the remaining 95% was crap).
Today's "vibe coding" ends when Chat GPT and alikes want to call on some object a method that does not exist (but existed in 1000s of other objects LLM was trained with, so should work here). Again, we will be left with the good parts, the rest will be forgotten and we will move to next big thing.
The question is "natural" to whom, the humans or the computers?
AI does not make human language natural to computers. Left to their own devices, AIs would invent languages that are natural with respect to their deep learning architectures, which is their environment.
There is always going to be an impedance mismatch across species (humans and AIs) and we can't hide it by forcing the AIs to default to human language.
If you still don’t want to do programming, then you need some way to instruct or direct the intelligence that _will_ do the programming.
And any sufficiently advanced method of instruction will look less like natural language, and more like an education.
Compare that to:
Found in about 9 seconds.
If I create a website with Node.js, I’m not manually managing memory, parsing HTTP requests byte-by-byte, or even attempting to fully grasp the event loop’s nuances. I’m orchestrating layers of code written by others, trusting that these black boxes will behave as advertised according to my best, but deeply incomplete, understanding of them.
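For instance, a minimal sketch using Node's built-in `node:http` module: the handful of lines I actually write sit on top of HTTP parsing, socket handling, and memory management I never see.

```typescript
// Minimal sketch: everything below this handler -- HTTP parsing, socket handling,
// garbage collection, the event loop -- is someone else's code I trust blindly.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end(`You asked for ${req.url}\n`);
});

server.listen(3000, () => console.log("Listening on http://localhost:3000"));
```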
I'm not sure what this means for LLMs doing the programming, but I already feel separated from the case Dijkstra lays out.
Difficult to sort this out with what follows.
Consider group theory. A group G is a set S with an operator * that is closed and associative, has an identity, and gives every element an inverse. With that abstraction comes a hefty amount of power. In some sense, a group is akin to a trait on some type, much like how a class in Java can implement or extend Collection. (Consider how a ring ‘extends’ a group.)
I’d posit frameworks and libraries are no different in terms of formal symbolism from the math structure laid out above. Maybe the interfaces are fuzzy and the documentation is shoddy, but there’s still a contract we use to reason about the tool at hand.
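To make the trait analogy concrete, here's a sketch in TypeScript (the `Group` interface and the integer-addition instance are my own illustration):

```typescript
// Sketch: a group as a "trait" that a carrier type can implement.
interface Group<T> {
  identity: T;              // e
  combine(a: T, b: T): T;   // the closed, associative operation *
  inverse(a: T): T;         // combine(a, inverse(a)) === identity
}

// The integers under addition form a group.
const additiveIntegers: Group<number> = {
  identity: 0,
  combine: (a, b) => a + b,
  inverse: a => -a,
};

// Code written against Group<T> works for every implementation -- the
// "hefty amount of power" the abstraction buys.
function repeat<T>(g: Group<T>, x: T, n: number): T {
  let acc = g.identity;
  for (let i = 0; i < n; i++) acc = g.combine(acc, x);
  return acc;
}

console.log(repeat(additiveIntegers, 5, 3)); // 15
```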
> I’m not manually managing memory, parsing HTTP requests byte-by-byte
If I don’t reprove Peano’s work, then I’m not really doing math?
https://githubnext.com/projects/speclang/
Funny coincidence!
I leave it here for the nice contrast it creates in light of the submission we're discussing.
The whole thing seems a step (or several steps) backwards in terms of UX as well. I mean, surely there was a reason why ls was named ls, and so forth?
A bonus point is that he also had something to say about a real or alleged degeneration of natural languages themselves.
I think people who think about this the way we do need to start building resilience for the very real possibility that in a couple of years we'll be the ones dealing with these awful LLM-generated code bases, fixing bad logic and bugs.