Posted by birdculture 1 day ago
In Clojure, there's no appreciable compilation time. During my work week I barely, if ever, restart the application I'm working on: I'm working inside of it via a REPL connection.
It's an entirely different ball game and if you just compare language features you are missing out on an incredible interactive coding experience.
Your editor is also directly connected to your running application and can compile individual functions or pieces of code to update the application without a full edit/compile/run/test loop. It happens way faster and more interactively while you code. You can directly inspect the running program and its data within that same framework and with the exact same tools and functions you use to code.
To answer another way, you don’t code your application in a REPL line by line, but with all the tools you’d expect (editor or IDE, git, etc.) PLUS that live connection to your software.
You kinda have to keep track of the state, but the latter can be easily inspected so there's no need to keep it in your head.
Then you see the result in another pane of your editor.
It's pretty revelatory. You're molding your program while it's running.
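For anyone who hasn't tried it, here's a rough sketch of what that feels like (the function and numbers are made up for illustration): you send individual forms from your editor to the connected REPL and redefine things inside the live process:

    ;; A function already loaded in the running app:
    (defn price-with-tax [price rate]
      (* price (+ 1 rate)))

    ;; From the editor, evaluate a form against the live process:
    (price-with-tax 100 0.2)   ;=> 120.0

    ;; Redefine it and the running application picks up the new version
    ;; immediately - no restart, no rebuild:
    (defn price-with-tax [price rate]
      (Math/round (* price (+ 1 rate))))

    (price-with-tax 100 0.2)   ;=> 120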
I'm curious about that, can you elaborate? I'm a beginner in Clojure and I only know a few concepts about Rust, but it seems to me that they solve (at least, currently) very different problems...
I only saw Rust being used in places where C or C++ would be used in the past (Linux kernel, CLI apps, desktop apps), while I only saw Clojure being used as a modern and functional Java and JS replacement.
Not to mention how different they are as languages (static vs dynamic typing, native vs jvm, borrow checker vs garbage collection)
Rust is interesting because it solves the problem of shared mutable state, while allowing sharing, and allowing mutability, just not at the same time. State can be mutated until it is shared. While it is shared, it cannot be mutated. This is the goal of the ownership system in Rust.
I would argue that avoiding _unrestricted_ shared mutable state is a means for Rust, not a goal. The main goal would be to provide a way to write safe, fast, non-garbage-collected programs, which doesn't seem at all like what Clojure is aiming for.
The best part of Rust is that it caters to a crowd that won't, even at gunpoint, consider touching anything that might resemble any kind of automatic resource management.
Hence why I consider articles like "How We Saved 70% of CPU and 60% of Memory in Refinery’s Go Code, No Rust Required" [0] interesting to read and worth raising awareness of, even if Go isn't one of the languages I happen to be enthusiastic about.
[0] - https://www.honeycomb.io/blog/how-we-saved-70-cpu-60-memory-...
The problem is that shared mutable state is incredibly hard to get correct, especially when concurrency enters the picture.
Elixir and Clojure sidestep the problem by making copying data so cheap that instead of sharing references to mutable data, you make all data immutable and copy it whenever you want to change it in some way. (Which is a classic technique: don’t like the problem? Solve a different problem that sidesteps the original problem in a creative way)
So you have a lot of functions of roughly the shape `f -> f` (return a new thing) instead of `f -> ()` (mutate a thing in place).
This is possible at all thanks to the clever use of some novel immutable data structures, like HAMTs, that are able to approximate the performance of traditional mutable data structures like hashmaps or arrays while presenting an immutable API.
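To make that concrete, here's a minimal Clojure sketch of the `f -> f` style (names invented for the example): "updates" return a new value, the original is untouched, and the persistent structures share most of their contents under the hood:

    (def order {:id 1 :items ["book"] :status :open})

    ;; "Updating" returns a brand-new map; `order` itself never changes.
    (def shipped (assoc order :status :shipped))

    order    ;=> {:id 1, :items ["book"], :status :open}
    shipped  ;=> {:id 1, :items ["book"], :status :shipped}

    ;; The unchanged parts (like the :items vector) are structurally shared
    ;; between the two maps rather than copied, which is what keeps this cheap.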
As it turns out, this is a much easier programming model for most of us to get correct in practice (especially in the presence of concurrency) than sharing mutable state in an imperative programming language like C or Java or Python.
The tradeoff is that the immutable functional data structures actually do have some performance overhead. In most domains it's not large enough to matter, but in some domains it is, and in those domains you really do need mutable state to eke out that last bit of performance.
Which is where Rust comes in.
In Rust's model, data can either be mutable or shared, but not both at the same time. The Rust compiler computes the lifetimes of data throughout your program to ensure that this invariant is upheld, so mutable data is not shared, and shared data is not mutated.
The downside of this is that you have to internalize these rules and program to them. They can be tough to learn and tough to program with even after you have learned them, though it does get much easier with experience, I will say.
The upside of Rust's model is that you can have your cake and eat it too: you can keep the high performance ceiling of a true mutable data model while maintaining memory and resource safety.
In short, you can think of Clojure/Elixir as sidestepping the shared mutability problem at the cost of some runtime performance (though again in practice it is smaller than you would think), and Rust as tackling the shared mutability problem head on by transferring the cost of solving the problem to a more complicated compiler and a harder-to-learn programming model.
These are just my opinions having used both Rust and the immutable-data functional programming stuff in anger. I'm not trying to present one as being better than the other; the key point is that they're both better than what came before.
While for many applications Clojure's performance is good enough, it's not anywhere near what you can achieve with Rust. I once did a small game in Clojure, trying to be very clever to eke out every last bit of performance, and still didn't hit an acceptable frame rate. I made a very naive reimplementation in Rust that involved copying the entire state every frame and it ran buttery smooth.
If there is a task for which persistent data structures are the most performant solution, it should be easy enough to implement and use them in Rust too. Probably someone has already done that.
Clojure is my default programming language but if I want performance (or static types) I reach for Rust.
Clang and GCC have a pretty solid suite of static checks that they can enforce if you enable them. They catch most of the common footguns.
It’d be of great help if you could share an example of this, along with an explanation of why it's impossible in a different language, say one of Java/C++/Go.
You should absolutely try out F#. :) It's a great language.
Also, if you're looking for the magic of Lisp/Scheme, I think you might really enjoy Elixir/Erlang. Elixir has macros, purely immutable data (there is _no_ way whatsoever to get mutable data, unlike in F#, Racket, Clojure, etc.), live code updates, and the BEAM VM, its process framework, and OTP are quite magical.
When I first learned Erlang, I felt I had come home. I mainly used Elixir though due to the available packages and jobs.
Joe was indeed a great guy. I was lucky enough to spend some private time with him when he was visiting Chicago to give a talk, a true renaissance man with wide-ranging interests.
Particularly as a noob, babashka is SUCH a good way to learn the language. I’ve written myself all sorts of fun utilities with babashka and have learned so much clojure along the way. Even got my coworkers to use my bb scripts!
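If anyone is curious what those utilities look like, here's the rough shape of one (a throwaway example, not one of my actual scripts): a self-contained file you run with `bb`:

    #!/usr/bin/env bb
    ;; count-lines.bb - print a line count for each file passed on the command line
    (require '[clojure.java.io :as io])

    (doseq [f *command-line-args*]
      (with-open [r (io/reader f)]
        (println f (count (line-seq r)))))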
If you're going to learn a niche Lisp, you might as well learn Common Lisp or Scheme which have well-specified standards, have stood the test of time and will still be around for decades to come.
And it’s not tied to the JVM per se; look at ClojureScript (and derivatives) or the upcoming jank.
It’s far from dead. As much as I like CL, the ecosystem is a bit of a desert compared to the jvm.
Anyway, I came here to say Clojure also targets JavaScript and could target more like ClojureCLR https://clojure.org/about/clojureclr
Here, have another karma point!
I also forgot the very solid ClojureDart.
How do you debug ClojureScript? Can you modify the source code while in the debugger? That is a huge time-saver: you debug, see a typo, and fix it right away. My preferences are influenced by my background in Smalltalk's "live" environment: you see the variables and the stack, and can change anything without having to stop the debugging session, go back to the "editor", locate the place you (now know) you want to modify, and start again.
Since Clojure is so REPL-heavy I haven’t felt like I’m missing a debugger too much. But the Smalltalk live environment sounds ridiculously cool. I do end up using REPLs into remote programs (even prod lol) pretty often, which is pretty crazy for me coming from a Node background.
All the best finding a Clojure job though.
I'm guessing they pay all that much, while simultaneously cursing themselves for not using Python instead, and swearing never to use Clojure again.
I know that as I have seen people do and say similar things about Perl and Erlang in the last decade.
Having said that, I don't think I'd pick Clojure for unpaid (hobby) projects. The JVM is such a hog and I don't like anything related to the Java culture...
It sees plenty of use as Clojure/ClojureScript and Babashka (and other niche variants). Jank is shaping up to be real nice too.
I think Rich even alludes to this in one of his talks: running Ruby/Python/Rust or whatever would be disallowed, but if it's Java then it's a known entity.
There's another, orthogonal, aspect: I half-jokingly suggested Clojure as a substitute for one of the DSLs we use at work, and my idea was shot down by a teammate based on the reasoning that Clojure is hard and most of the Java devs (not my team, of course) in the shop are not up to learning it or coding in it. So that's another thing to consider: when you leave the company, who else will be up to maintaining what you produce?
Both of these considerations weigh strongly against attempts to "sneak Clojure in through the back door".
The amount of R&D that has gone into making it execute with good performance, and its overall stability...
Yeah, it's got the curse of being boring.
I do think it is perhaps unfortunate that Clojure is tied so heavily to the JVM, because I actually don't think it gains much from that ecosystem... but it's a product of the time it was first written.
Actually hell. I'm between jobs, I like Lisp, and I miss the JVM. I've never worked in Clojure, but does anybody want to hire me to work in it? :-)
When I was doing more Clojure, I loved that it was on the JVM because it meant I got to use every Java library under the sun. There are tons of battle tested Java libraries that didn't have to be rewritten in Clojure, and getting to use them for approximately zero financial and runtime cost was a HUGE benefit of Clojure compared to other niche FP languages.
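For those who haven't seen it, the interop is about as low-friction as it gets; a made-up example calling plain JDK classes straight from Clojure:

    ;; Plain JDK classes - no wrapper library, nothing rewritten in Clojure.
    (import '(java.time LocalDate) '(java.util UUID))

    (defn order-stamp []
      {:id   (str (UUID/randomUUID))   ; java.util.UUID, straight from the JDK
       :date (str (LocalDate/now))})   ; java.time, same deal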
Lol, only dying/dead in the febrile imagination of some HN commenters. The JVM has had some of the most explosive feature activity in the last several years. Java had several million greenfield projects in 2024-25 - among the top 6 greenfield programming languages according to the GitHub Octoverse.
In my opinion Lisp is too flexible. I think the ideal use of Lisp is one or a few talented developers exploring the problem space and creating the MVP. Then a follow on team to reimplement it in a mainstream language that’s more maintainable by “mere mortals”.
IME it’s similar to the fact that projects implemented in statically typed languages are easier to maintain than ones in dynamically typed languages. Lisp is so flexible that even lexical scoping (of variables, for example) is a choice. Not what I’d care to have juniors or run-of-the-mill seniors responsible for!
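Clojure shows that flexibility in miniature: binding is lexical by default, but any var can opt into dynamic scoping, so the discipline is a convention rather than a guarantee. A toy example (names made up):

    (def ^:dynamic *greeting* "hello")   ; opts this var out of purely lexical behavior

    (defn greet [name] (str *greeting* ", " name))

    (greet "Rich")                        ;=> "hello, Rich"

    (binding [*greeting* "yo"]            ; dynamically rebound for this scope only
      (greet "Rich"))                     ;=> "yo, Rich"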
When doing joint debugging with teammates, I've seen so many of them randomly add or remove & and * from C++ statements trying to get their code to compile, without bothering to reason through the issue. I suspect this stochastic approach to code development is pretty common. That is not going to unlock the benefits of metaprogramming either, where you have to deliberately build up the language you want to have.
Metaprogramming is extremely powerful but hard to use, especially for novices. I also think there is a general lack of education about programming languages and compilers at play here. So much of Lisp's power comes from knowing that.
Pretty accurate foresight for 1980: in the "Mysteries and other Matters" section, McCarthy predicts declarative textual descriptions replacing Lisp as a higher-level programming language, basically describing today's LLMs and agentic coding.
It seems like a stretch to say that's what McCarthy was thinking about regarding declarative facts and goals driving a program.
To me, that sounds more like Prolog than agentic coding.
I understand what you're trying to say, but I don't think LLMs were created as some replacement for Lisp. I don't think they've replaced any programming language, but they do help quite a bit with autogeneration of Python & Javascript in particular.
The math behind transformers is deterministic, so LLMs could be treated as compilers (putting aside intentionally added temperature and the non-determinism due to current internal GPU scheduling). In the future I imagine we could declare a dependency on a model, hash its weights in a lockfile, and the prompt/spec itself would be the code, which corresponds to that insight.
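Purely hypothetical, but in deps.edn-ish terms it might look something like this (every key and value here is invented; no tool actually supports this today):

    ;; Entirely hypothetical sketch of a "model lockfile".
    {:model  {:name           "some-model"
              :version        "1.2.0"
              :weights-sha256 "<hash of the exact weights>"}  ; pinned like a lockfile entry
     :prompt "spec.md"              ; the prompt/spec is the source artifact
     :params {:temperature 0.0}}    ; zero temperature, aiming for reproducible output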
What I've understood from discussions on HN is that LLMs are non-deterministic. Am I right? So the same prompt when executed again could produce a different answer, a different program every time.
That would mean the prompt is not a great "high-level language"; it would get compiled into a different Lisp program depending on the time of day?
Lisp was ideal for reasoning systems, its homoiconic and meta-programmable nature is perfect for manipulating symbolic structures and logic. But when AI shifted toward numerical learning with neural networks, tensors, and GPU computation, Lisp’s strengths mattered less, and Python became the new glue for C/CUDA libraries like NumPy, PyTorch and TensorFlow.
Still, nothing prevents Lisp from coming back. It would actually fit modern deep learning well if a "LispTorch" with a CUDA FFI existed. We would have macros for dynamic graph generation, functional composition of layers, symbolic inspection, interactive REPL exploration, automatic model rewriting etc.
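To gesture at what "functional composition of layers" could look like in plain Clojure (no real tensor library here, just toy functions invented for the example):

    ;; Toy "layers" are just functions from vector to vector.
    (defn dense [weights]
      (fn [x] (mapv (fn [row] (reduce + (map * row x))) weights)))

    (defn relu []
      (fn [x] (mapv #(max 0.0 %) x)))

    ;; A "network" is nothing more than function composition.
    (def net (comp (relu) (dense [[1.0 -1.0] [0.5 0.5]])))

    (net [2.0 3.0])   ;=> [0.0 2.5]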
We almost had it once: Yann LeCun’s SN (the first CNN) was built on a C core with a Lisp interpreter on top to define, develop and inspect the network. It eventually evolved into Lush, essentially "Lisp for neural networks", which in turn inspired Torch and later PyTorch.
https://x.com/ylecun/status/1944504502260003296?lang=en
So Lisp didn't die in AI, it's just waiting for the right people to realize its potential for modern neural networks and bring it back. Jank in particular will probably be a good contender for a LispTorch.
I don’t think "doesn’t work for teams of 5+" is a fair generalization. There are production Clojure and Emacs (Lisp) codebases with far more contributors than that.
Language adoption is driven less by inherent team-size limits and more by social and practical factors. Some students probably don't like Lisp because most people naturally think in imperative/procedural terms. SICP was doing a great job teaching functional and symbolic approaches; I wish they hadn't shifted their courses to Python, since that increases the gravitational pull toward cognitive standardization.
So modern AI is mostly C or even Fortran, often driven from something more pedestrian, like Python.
Lisp languages are great for these manipulations, since the AST being manipulated is the same data structure (a list) as everything else. In other words, genetic programming can lean into Lisp's "code is data" paradigm.
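A toy illustration of that in Clojure (everything here is made up for the example, not a real GP system): a candidate program is just a list, so "mutating" it is ordinary list manipulation and running it is just eval:

    (require '[clojure.walk :as walk])

    ;; A candidate "program" is plain data: a list we can inspect and rewrite.
    (def candidate '(+ (* x x) 3))

    ;; A crude "mutation": swap the top-level operator for a random one.
    (defn mutate [expr]
      (cons (rand-nth '[+ - *]) (rest expr)))

    ;; Run a candidate on an input by substituting x and calling eval.
    (defn run [expr x]
      (eval (walk/postwalk-replace {'x x} expr)))

    (run candidate 2)            ;=> 7
    (run (mutate candidate) 2)   ;=> 7, 1, or 12, depending on the operator chosen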
As others mentioned, today everything is based on neural networks, so people aren't learning these other techniques.
In fact, the first edition of AIMA even had an NN and Perceptron implementation in Common Lisp. (https://github.com/aimacode/aima-lisp/blob/master/learning/a...)
Also this video was interesting, Oral History of John McCarthy: https://www.youtube.com/watch?v=KuU82i3hi8c&t=1564s
But with Lisps, isn't it the case that statement 1 can be immutable while the next statement 2 is mutable? In other words, it mixes mutable and immutable calls in the same program file without really isolating them from each other in any way. And that means if one part of the program is "impure", so is all of it. You can't isolate those parts except by following a convention, but conventions are not something the language enforces. And you can't know which parts of your program are pure except by reading the code in detail. Am I right?
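To make the question concrete with a Clojure-flavored toy example (all names invented): these two definitions can sit right next to each other, and nothing in the language itself marks which one is pure - only the naming convention with the trailing `!`:

    ;; Pure: same input, same output, touches nothing outside itself.
    (defn add-item [order item]
      (update order :items conj item))

    ;; Impure: mutates an atom and prints, but the compiler doesn't
    ;; distinguish it from the pure function above - only the naming
    ;; convention hints at the difference.
    (def orders (atom []))

    (defn save-order! [order]
      (swap! orders conj order)
      (println "saved" (:id order)))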
Ultimately I think it might just be fads. Object oriented programming came along at the same time as the web, when the demand for programmers grew dramatically. That may have crystallized OO and imperative languages as the "default" style. Would be interesting to see the alternate universe where JavaScript actually was a Lisp.