Posted by teleforce 2 days ago
At the extremes, we call these paradigms: functional languages, object-oriented languages, array languages, etc. But every language exists to make some subset of "shapes" of solutions easier to read and write. Elixir encourages you to describe your program as a distributed system of processes, each running a series of transformations on data, thanks to its pipe operator, and to push your branching to the function-call level with pattern-matched dispatch across function clauses.
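Not Elixir, but a rough TypeScript sketch of that shape (the types and names below are made up for illustration): the pipeline is just data flowing through small transformations, and the branching lives in dispatch on the shape of the value rather than in nested if/else.

    // A "pipeline of transformations" over a value, sketched in TypeScript.
    // Each step is a pure function; branching happens by matching on the
    // shape of the data instead of inside one big conditional.

    type Event =
      | { kind: "signup"; email: string }
      | { kind: "purchase"; amount: number };

    const normalize = (e: Event): Event =>
      e.kind === "signup" ? { ...e, email: e.email.toLowerCase() } : e;

    // "Dispatch": one clause per shape of data.
    const describe = (e: Event): string => {
      switch (e.kind) {
        case "signup":   return `welcome ${e.email}`;
        case "purchase": return `charged ${e.amount}`;
      }
    };

    // The pipeline itself: data flowing through a series of transformations.
    const handle = (e: Event): string => describe(normalize(e));

    console.log(handle({ kind: "signup", email: "Ada@Example.com" }));
    // -> "welcome ada@example.com"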
Java encourages you to write your application as a collection of things that know stuff and do stuff, including telling other things to do stuff based on things they know.
C and Go encourage you to write your programs as an ordered series of atomic tasks for the computer to complete.
SQL has you describe what you want your output to look like.
Etc, etc. There are inherent trade-offs between languages, because what you make inelegant to express, or even inexpressible, carries value, too.
Many DSLs can be bolted onto an existing language that supports compiler extensions. This approach offers more flexibility, but it often leads to fragmentation and poor interoperability in the language ecosystem.
There is a third approach, established by a group in Minnesota [1], which is to design languages and tools that are modular and extensible from the get-go, so that extensions are more interoperable. They research how to make this work using attribute grammars.
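For flavor, here's a hand-rolled toy of the attribute-grammar idea in TypeScript (not the Minnesota group's tools or syntax): each node kind gets an equation per attribute, with `value` synthesized from the children and the `depth` parameter inherited from the parent. The part those tools actually solve, adding attributes and productions modularly across independent extensions, is exactly what plain TypeScript doesn't give you here.

    // Toy flavor of attribute grammars over an expression tree.

    type Expr =
      | { kind: "num"; value: number }
      | { kind: "add"; left: Expr; right: Expr };

    // Synthesized attribute: computed bottom-up from the children.
    const value = (e: Expr): number =>
      e.kind === "num" ? e.value : value(e.left) + value(e.right);

    // Inherited attribute: the depth is passed from parent to child.
    const maxDepth = (e: Expr, depth = 0): number =>
      e.kind === "num"
        ? depth
        : Math.max(maxDepth(e.left, depth + 1), maxDepth(e.right, depth + 1));

    const tree: Expr = {
      kind: "add",
      left: { kind: "num", value: 1 },
      right: { kind: "add", left: { kind: "num", value: 2 }, right: { kind: "num", value: 3 } },
    };

    console.log(value(tree), maxDepth(tree)); // 6 2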
If the host language has a sufficiently expressive type system, you can often get away with writing a fluent API [2] or a type-safe embedded DSL. But designing languages and type systems with good support for metaprogramming is also an active area of research [3, 4].
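As a minimal sketch of that style in TypeScript (hypothetical names, not taken from the cited paper): a tiny query-builder DSL where the type of each intermediate object controls which methods you can call next, so misuse fails at compile time instead of at runtime.

    // A small type-safe fluent/embedded DSL: the return type of each step
    // determines what you are allowed to call afterwards.

    type Filtered = { build(): string };
    type Selected = { where(cond: string): Filtered; build(): string };

    const select = (...cols: string[]) => {
      const from = (table: string): Selected => ({
        where: (cond: string): Filtered => ({
          build: () => `SELECT ${cols.join(", ")} FROM ${table} WHERE ${cond}`,
        }),
        build: () => `SELECT ${cols.join(", ")} FROM ${table}`,
      });
      return { from };
    };

    const q = select("id", "name").from("users").where("age > 21").build();
    console.log(q); // SELECT id, name FROM users WHERE age > 21

    // select("id").where("age > 21")   // does not type-check: no `where` before `from`
    // select("id").from("a").from("b") // does not type-check: `from` can only appear once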
If none of these options work, the last resort is to start from a tabula rasa and write your own parser, compiler, and developer tools. This offers the most flexibility, but it requires an enormous amount of engineering and is generally not recommended in 2026.
[2]: https://arxiv.org/pdf/2211.01473
You mean like this?
https://github.com/williamcotton/webpipe
https://github.com/williamcotton/webpipe-lsp
https://github.com/williamcotton/webpipe-js
It's less effort if you, well, you know where I'm going with this...
MLIR [1] has entered the chat :P
I know, I know, MLIR is an IR and not a programming language, but MLIR gives me the feeling of "an IR to rule them all" (as long as you're OK with SSA representation). The IR itself is quite high-level, to the point that it almost feels like an actual programming language: for example, you can write an MLIR program that compiles to C using the EmitC dialect, and it feels a lot like writing C.
There was a time in my life when I designed languages and wrote compilers. One kind of language I've always thought about, one that could be made more approachable to non-technical users, is an outline-like language with English-like syntax. Being a DSL, the shape of the outline would be very much fixed and kept on guardrails; it couldn't express arbitrary instructions like a normal programming language, but an escape hatch (to a more expressive language) could be provided for advanced users. Areas where this DSL could be used include common admin-portal app generation and workflow automation.
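Purely as a guess at the shape being described (every name below is hypothetical): the guardrail could be a small closed set of outline node kinds, with one of them serving as the escape hatch to a real language.

    // Hypothetical model of a guardrailed outline DSL in TypeScript.
    // Users can only nest a few node kinds; "escape" is the advanced exit.

    type OutlineNode =
      | { kind: "page"; title: string; children: OutlineNode[] }
      | { kind: "form"; entity: string; fields: string[] }
      | { kind: "list"; entity: string; columns: string[] }
      | { kind: "escape"; language: "javascript"; code: string };

    const adminPortal: OutlineNode = {
      kind: "page",
      title: "Customers",
      children: [
        { kind: "list", entity: "customer", columns: ["name", "email"] },
        { kind: "form", entity: "customer", fields: ["name", "email"] },
        { kind: "escape", language: "javascript", code: "audit.log('opened')" },
      ],
    };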
That said, with the advent of AI assistants, I’m not sure if there is still room for my DSL idea.
The structure of a language matters to the ease and feel of its use, even when two formalisms are logically equivalent. One parallel would be the syntactic benefits of something like Hintikka's independence-friendly logic over first-order logic.
The sentiment shared here is that we should sacrifice benefits to the next generation to make our own lives easier. This is a common sentiment, but a sad one. The goal should be a natural-language-based programming language that everyone can use, alongside a technical programming language that makes the interface between the language and the machine unambiguous.
Everyone seems to endorse their happy medium, and those languages are also perfectly fine.
Why yes, it can and has been done: https://www.dreamsongs.com/Files/ECOOP.pdf
I follow new language developments with keen interest, but few of them will ever reach the level of maturity to be considered serious candidates for adoption. It's also risky to adopt a language that you cannot easily hire developers for, for example.
Libraries are great, but there is only so much they can address, and that depends on the language, too, as the article correctly points out. And there are two kinds of libraries: tool libraries and frameworks. Someone once said it nicely: "Frameworks are like Hollywood - 'You don't call us, we call you!'" Frameworks often require you to submit to their control flow rather than the other way round; that's why I prefer tool libraries.
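A minimal sketch of that inversion of control, with a made-up `app` object standing in for the framework: with a tool library you call in and keep the control flow; with a framework you register functions and it decides when to call you.

    // Tool-library style: you drive, you call it, you get a value back.
    const parsed = JSON.parse('{"name":"Ada"}');
    console.log(parsed.name);

    // Framework style (hypothetical `app`): it drives, it calls you.
    type Handler = (req: { path: string }) => string;
    const routes: Array<[string, Handler]> = [];
    const app = {
      on: (path: string, h: Handler) => { routes.push([path, h]); },
      run: () => routes.forEach(([p, h]) => console.log(h({ path: p }))),
    };
    app.on("/users", (req) => `listing users for ${req.path}`); // you register...
    app.run();                                                  // ...it calls you back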
I’ve been experimenting with a small defeasible-logic core (argumentation semantics + priorities + conflict detection) that stays stable, while most of the real meaning lives in extension points. https://github.com/canto-lang/canto-lang
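Not canto-lang's actual API, just a toy TypeScript illustration of the "priorities + conflict detection" part: two rules reach conflicting conclusions, the conflict is surfaced, and the higher-priority rule wins.

    // Toy defeasible reasoning: detect conflicting conclusions, then let
    // priority decide which rule's conclusion stands.

    type Rule = { name: string; priority: number; concludes: string };

    const fired: Rule[] = [
      { name: "birds_fly",     priority: 1, concludes: "flies" },
      { name: "penguins_dont", priority: 2, concludes: "not flies" },
    ];

    const conflicts = fired.filter((a) =>
      fired.some((b) => b.concludes === `not ${a.concludes}`));
    const winner = [...fired].sort((a, b) => b.priority - a.priority)[0];

    console.log("conflicting rules:", conflicts.map((r) => r.name)); // ["birds_fly"]
    console.log("conclusion:", winner.concludes);                    // "not flies"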