Posted by nimbleplum40 1 day ago
If you still don’t want to do programming, then you need some way to instruct or direct the intelligence that _will_ do the programming.
And any sufficiently advanced method of instruction will look less like natural language, and more like an education.
Or weak typing. How many languages thought that collapsing strings, integers, and other types into a single "scalar" type, and making any operation between any operands meaningful, would simplify the language? Yet every single one of them ended up a total mess instead.
Or constraint-based UI layout. It looks so simple, so intuitive in small examples, yet it totally fails to scale to even a dozen basic controls. Yet the idea keeps reappearing from time to time.
Or attempts at dependency management by making some form of symlink to another repository, e.g. git submodules or CMake's FetchContent/ExternalProject? Yeah, good luck scaling that.
Maybe software engineering should have some sort of "Hall of Ideas That Definitely Don't Work", so that young people entering the field could save themselves the time of implementing one more incarnation of an idea already known not to work.
I'm deeply curious to know how you could easily and definitively work out what is and is not an idea that "Definitely Doesn't Work".
Mathematics and Computer Science seem to be littered with unworkable ideas that have made a comeback when someone figured out how to make them work.
What this Hall could contain, for each idea, is a list of reasons why the idea has failed in the past. That would at least give future Quixotes something to measure their efforts by.
I can get behind that :)...
Flowchart-based programming scales badly. Blender's game engine (abandoned) and Unreal Engine's "blueprints" (used only for simple cases) are examples.
It doesn’t really get complicated, but you can very quickly end up with drawings with very high square footage.
As a tool for planning, it’s not ideal, because “big-picture” is hard to see. As a user following a DRAKON chart though, it’s very, very simple and usable.
Link for the uninitiated: https://en.m.wikipedia.org/wiki/DRAKON
FWIW, neural networks would have been in that pool until relatively recently.
The Hall would then end up containing a spectrum ranging from useless ideas to hard problems. Distinguishing between the two based on documented challenges would likely be possible in many cases.
Of course, it's best that such learning happens before one has the mandate to derail the whole project.
A native submodule approach would fail at link time or run time because it tries to mix incompatible files in the same build. Or, in some build systems, it would fail simply because of duplicate symbols.
That "just in a recursive way" addition hides a lot of important design decisions, the very ones that separate having a dependency manager from not having one.
Yet JavaScript and Python are the most widely used programming languages [1], which suggests your analysis is mistaken here.
[1] https://www.statista.com/statistics/793628/worldwide-develop...
(Both JavaScript and Python have dynamic typing; Python’s type declarations are a form of optional static type checking.)
Do not confuse these concepts.
1. <https://www.destroyallsoftware.com/talks/wat>
2. <https://eqeq.js.org/>
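To make the distinction concrete, here is a minimal sketch, written as TypeScript that deliberately opts out of the checker with `any` so the underlying JavaScript behaviour shows through. Dynamic typing means types live on values and are checked at run time; weak typing means the runtime coerces operands instead of raising an error.

```typescript
const n: any = 0;
const s: any = "";

console.log(typeof n, typeof s); // "number" "string" -- dynamic: types are known at run time
console.log(n == s);             // true  -- weak: "" is coerced to 0 before comparison
console.log(n === s);            // false -- strict equality skips the coercion
console.log(1 + ("1" as any));   // "11"  -- `+` prefers string concatenation

// Python is dynamically typed but not weak in this sense: 1 + "1" raises a TypeError.
```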
Similarly, there's great demand for a typed layer on top of JavaScript:
- Macromedia: (2000) ActionScript
- Google: (2006) GWT [Compiling Java to JS], and (2011) Dart
- Microsoft: (2012) Typescript
All of this went out of fashion, leaving behind some good stuff that was built at the time (the remaining 95% was crap).
Today's "vibe coding" ends when ChatGPT and its kind want to call a method on some object that does not exist (but existed on thousands of other objects the LLM was trained on, so it should work here too). Again, we will be left with the good parts, the rest will be forgotten, and we will move on to the next big thing.
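That failure mode is exactly what a typed layer turns into a compile-time error. A minimal sketch, with a hypothetical `User` type and method names chosen purely for illustration:

```typescript
interface User {
  name: string;
  serialize(): string; // this object exposes `serialize`...
}

const u: User = {
  name: "Ada",
  serialize: () => JSON.stringify({ name: "Ada" }),
};

// ...so the "plausible" call the model reaches for is rejected before it ever runs:
// u.toJSON(); // error TS2339: Property 'toJSON' does not exist on type 'User'.

console.log(u.serialize());
```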
If I create a website with Node.js, I’m not manually managing memory, parsing HTTP requests byte-by-byte, or even attempting to fully grasp the event loop’s nuances. I’m orchestrating layers of code written by others, trusting that these black boxes will behave as advertised according to my best, but deeply incomplete, understanding of them.
I'm not sure what this means for LLMs doing the programming, but I already feel far removed from the case Dijkstra lays out.
It's difficult to square this with what follows.
Consider group theory. A group G is a set S with an operator * that is closed over S, associative, and has an identity and inverses. With that abstraction comes a hefty amount of power. In some sense, a group is akin to a trait on some type, much like how a class in Java can implement or extend Collection. (Consider how a ring ‘extends’ a group.)
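A rough sketch of that analogy in TypeScript, with an illustrative `Group` interface of my own; note that the laws themselves (associativity, identity, inverses) still live outside what the type system can check:

```typescript
// The contract: a carrier type T plus the group structure on it.
interface Group<T> {
  identity: T;
  op(a: T, b: T): T;
  inverse(a: T): T;
}

// Integers under addition satisfy the contract (and, separately, the laws).
const additiveIntegers: Group<number> = {
  identity: 0,
  op: (a, b) => a + b,
  inverse: (a) => -a,
};

// Generic code can be written against the abstraction alone.
function combineAll<T>(g: Group<T>, xs: T[]): T {
  return xs.reduce((acc, x) => g.op(acc, x), g.identity);
}

console.log(combineAll(additiveIntegers, [1, 2, 3])); // 6
```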
I’d posit frameworks and libraries are no different in terms of formal symbolism from the math structure laid out above. Maybe the interfaces are fuzzy and the documentation is shoddy, but there’s still a contract we use to reason about the tool at hand.
> I’m not manually managing memory, parsing HTTP requests byte-by-byte
If I don’t reprove Peano’s work, then I’m not really doing math?
Compare that to:
13) Humans writing code is an inherently flawed concept. It doesn't matter what form the code takes: machine code, assembly language, C, Perl, or a ChatGPT prompt. It's all flawed in the same way. We have not yet invented a technology or mechanism that avoids it. And high-level abstraction doesn't really help; it hides problems only to create new ones, while other problems simply never go away.
21) Loosely coupled interfaces made our lives easier because they forced us to compartmentalize our efforts into something manageable. But it's hard to prove that this is a better outcome overall, as it forces us to solve problems in ways that still lead to worse results than if we had used a simpler [formal] logic.
34) We will probably end up pushing our technical abilities to the limit in order to design a superior system, only to find out in the end that simpler formal logic is what we needed all along.
55) We're becoming stupider and worse at using the tools we already have. We're already shit at using language just for communicating with each other. Assuming we could make better programs with it is nonsensical.
For a long time now I've been upset at computer science's lack of innovation in the methods we use to solve problems. Programming is stupidly flawed. I've never been good at math, so I never really thought about it before, but math is really the answer to what I wish programming was: a formal system for solving a problem, and a formal system for proving that the solution is correct. That's what we're missing from software. That's where we should be headed.
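For a taste of what that looks like in practice, here is a tiny Lean 4 sketch of my own (purely illustrative, not from the comment above): the program and a machine-checked statement about it sit side by side in the same formal system.

```lean
-- The "solution": swap the components of a pair.
def swap : Nat × Nat → Nat × Nat
  | (a, b) => (b, a)

-- The "proof that the solution is correct": swapping twice gives back the input.
-- Lean checks this proof; if the claim were false, the file would not compile.
theorem swap_swap : ∀ p : Nat × Nat, swap (swap p) = p
  | (a, b) => rfl
```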
https://githubnext.com/projects/speclang/
Funny coincidence!
I leave it here for the nice contrast it creates in light of the submission we're discussing.
Found in about 9 seconds.
The whole thing seems a step (or several steps) backwards in terms of UX as well. I mean, surely there was a reason why ls was named ls, and so forth?
A bonus point is that he also had something to say about the real or alleged degeneration of natural languages themselves.