
Posted by ravenical 1/1/2026

Rust--: Rust without the borrow checker (github.com)
139 points | 260 comments
dataflow 1/1/2026|
This can't possibly be guaranteed to work just by disabling the checker, can it? If Rust optimizes based on borrow-checker assumptions (which I understand it can and does) then wouldn't violating them be UB, unless you also mess with the compiler to disable those optimizations?
masklinn 1/1/2026||
> This can't possibly be guaranteed to work just by disabling the checker, can it?

It works in the sense that the borrow checker stops bothering you and the compiler will compile your code. It will even work fine as long as you don't write code which invokes UB (which does include code which would not pass the borrow checker, as the borrow checker necessarily rejects valid programs in order to forbid all invalid programs).

dataflow 1/1/2026||
> It will even work fine as long as you don't write code which invokes UB (which does include code which would not pass the borrow checker, as the borrow checker necessarily rejects valid programs in order to forbid all invalid programs).

To be clear, by "this" I meant "[allowing] code that would normally violate Rust's borrowing rules to compile and run successfully," which both of us seem to believe to be UB.

masklinn 1/1/2026||
Not quite: there is code which fails borrow checking but is safe and sound.

That is part of why a number of people have been waiting for Polonius and/or the tree borrows model. The most classic cases are relatively trivial "check then update" patterns which fail to borrow-check but are obviously non-problematic, e.g.

    use std::collections::HashMap;

    pub fn get_or_insert(
        map: &'_ mut HashMap<u32, String>,
    ) -> &'_ String
    {
        if let Some(v) = map.get(&22) {
            return v;
        }
        map.insert(22, String::from("hi"));
        &map[&22]
    }
Though ultimately, even if either or both efforts bear fruit, they will still reject programs which are well formed. That is essentially the halting problem: a compiler can either reject all invalid programs or accept all valid programs, but it cannot do both. The former is generally considered more valuable, so in order to reject all invalid programs, compilers will necessarily reject some valid programs.
dataflow 1/2/2026|||
I don't feel you're quite following what I'm saying, unfortunately. In your specific example: couldn't the optimizer just optimize out all the mutations you've written, under the assumption that such a program would not have passed borrow-checking and thus isn't a case it needs to handle? Wouldn't this mean that if you disabled borrow checking, you would get incorrect codegen vs. what you intended? This seems like an entirely legal and sane optimization; I'm not sure why you're assuming something like this is outside the realm of possibility.

You seem to be operating on some (generous) underlying assumptions about what a borrow-checker-violating Rust program really means and what optimizations the compiler has the liberty to make even when borrow-checker assumptions are violated. But are you sure that assumption is well-founded? What is it formally based on?

masklinn 1/2/2026||
> I don't feel you're quite following what I'm saying unfortunately.

Disagreeing with your plainly incorrect assertion is not "not following" what you're saying.

> In your specific example: couldn't the optimizer just optimize out all the mutations you've written, under the assumption that such a program would not have passed borrow-checking and thus isn't a case it needs to handle?

No. The borrow checker ensures specific rules are followed; it is not the rules themselves, and the optimisations are based on the underlying rules, not on the borrow checker.

The program above abides by the underlying rules; it's literally the sort of example used by people working on the next-generation borrow checker, but the current borrow checker is not able to understand that.

> This seems like an entirely legal and sane optimization

It's neither of those things.

> You seem to be operating on some (generous) underlying assumptions about what a borrow-checker-violating Rust program really means and what optimizations the compiler has the liberty to make even when borrow-checker assumptions are violated. But are you sure that assumption is well-founded? What is it formally based on?

They are not assumptions, or generous. They are an understanding of the gap between the capabilities of the NLL borrow checker and "Behavior considered undefined".

They are what anyone working on the borrow checker sees as limitations of the borrow checker (some fixable, others intrinsic).

dataflow 1/2/2026||
> Disagreeing with your plainly incorrect assertion is not "not following" what you're saying.

I'm sorry, but that particular comment wasn't disagreeing so much as missing my point entirely. It gave an example that would still have suffered from the same problem I was talking about if the optimizer relied on the same borrowing assumptions (hence my subsequent comment clarifying this), and it diverted the discussion toward explaining the basics of incompleteness and the halting problem to me. Neither of those indicated a following of my point at all, and both indicated a misunderstanding of where my (mis)understanding was.

But your new comment tracks it now (thanks).

>> This seems like an entirely legal and sane optimization

> It's neither of those things.

Thanks for clarifying.

> They are what anyone working on the borrow checker sees as limitations of the borrow checker (some fixable, others intrinsic).

I understand this, but it (again) doesn't contradict my point. Just because something is a known limitation of an earlier stage like the borrow checker doesn't mean that relaxing it would never require a change to later stages of the compiler, which never had to consider other possibilities before. Just as limitations of the type checker don't imply that relaxing them would automatically cause the backend to generate correct code. It depends on how the compiler is written, what the underlying rules and assumptions are for the later stages, and what's actually tested in practice, hence this entire question.

Heck, weren't there literal bugs discovered in LLVM during Rust development (was it noalias?) simply because those patterns weren't seen or tested much prior to Rust, despite being intended to work? It feels quite... optimistic to just change the constraints in one stage and assume they will work correctly for the later stages of a compiler with zero additional work. It makes sense if there's already active work to ensure this in the later stages, and I don't know if that's the case here or not, but if there isn't, then it feels risky.

> the optimisations are based on the underlying rules not on the borrow checker

Is there a link I can follow to these underlying rules so I can see what they are?

jojomodding 1/2/2026||
https://plf.inf.ethz.ch/research/pldi25-tree-borrows.html

(Note that I am an author of that paper, and also that this is just a proposal of the rules and not yet adopted as normative.)

What you seem to be forgetting in this discussion is that unsafe code exists. The example above does not pass the borrow checker, but with a small amount of unsafe code (casting a reference to a pointer and back to erase the lifetime constraints) you can make it compile. But of course with unsafe code it is possible to write programs that have undefined behavior. The question is whether this specific program has undefined behavior, and the answer is no.

Since it does not have undefined behavior, the rest of the compiler already has to preserve its semantics. So one could also tweak the borrow checker to accept this program.

TL;DR: unsafe code exists, so you can't just say all programs not passing the borrow checker are UB.

dataflow 1/2/2026||
Thanks for the link. Like you said, that's not normative, so it doesn't really dictate anything about what the compiler would currently do if you violated borrow checking, right?

> What you seem to be forgetting in this discussion is that unsafe code exists. (...) unsafe code exists and so you can't just say all programs not passing the borrow checker are UB.

Unsafe code does not turn off the borrow-checker though? So I don't see how its existence implies the opposite of what I wrote.

Moreover, my entire concern here is about violating assumptions in earlier stages of the compiler that later stages don't already see violated (and thus might be unprepared for). Unsafe is already supported in the language, so it doesn't fall in that category to begin with.

TruePath 1/3/2026||||
That gets a bit tricky in terms of what you mean by valid programs. I presume what you mean is that you can't write a compiler that accepts every function which always returns the borrowed reference and rejects every piece of code which fails to do so.

Though it's technically a bit different from the halting problem, as the issue remains even if you assume the function terminates: you only want to show that the reference is returned assuming the code terminates, and if it isn't returned because the code enters an infinite loop, that's not a leak.

hyghjiyhu 1/2/2026|||
Imo if you run into the halting problem it's because you are trying to do too much. In particular, I think what you actually want is to check soundness based on the "shape" of the code, rather than reason about which variables can have which values and what that means for soundness.

    let flag = false;
    if flag {
        use_after_free();
    }
can be rejected with a clear conscience.
Sytten 1/1/2026|||
Correct. I was reading a very interesting blog post [1] on how the Rust compiler sets LLVM annotations such as noalias for mutable references. This changes the generated machine code a lot. Disabling the borrow checker won't enable those LLVM flags.

[1] https://lukefleed.xyz/posts/who-owns-the-memory-pt1/

CGamesPlay 1/1/2026|||
Yes. An analog would be uninitialized memory. The compiler is free to make optimizations that assume that uninitialized memory holds every value and no value simultaneously (because it is undefined behavior to ever read it).

In the following example, z is dereferenced only once and the loaded value is assigned to both x and y; but if z and x were aliased, this would be an invalid optimization.

    fn increment_by(x: &mut i32, y: &mut i32, z: &i32) {
        *x = *z;
        *y = *z;
    }
https://rust.godbolt.org/z/Mc6fvTzPG
OptionOfT 1/2/2026||
> Yes. An analog would be uninitialized memory. The compiler is free to make optimizations that assume that uninitialized memory holds every value and no value simultaneously (because it is undefined behavior to ever read it).

Even casting a MaybeUninit<i32>::uninit() to i32 is UB, even though every bit pattern in that memory space is a valid i32.
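For instance, a minimal sketch of the sound pattern (hypothetical example: write first, then assume_init):

```rust
use std::mem::MaybeUninit;

// Sound pattern: initialize before assume_init. Calling assume_init
// on a fresh uninit() would be UB even though every bit pattern is a
// valid i32 value: the bytes are uninitialized, which is its own kind
// of invalid state.
fn init_slot() -> i32 {
    let mut slot = MaybeUninit::<i32>::uninit();
    slot.write(42);
    unsafe { slot.assume_init() }
}

fn main() {
    assert_eq!(init_slot(), 42);
}
```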

What's interesting is that your code example is solved in Rust. By preventing a shared reference and a mutable reference from coexisting, all of a sudden the code becomes easier to reason about. No need for special attributes: https://www.lysator.liu.se/c/restrict.html#comparison-with-n...
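A small sketch of what that looks like in practice (hypothetical caller; the signature mirrors the example upthread):

```rust
// The signature alone rules out aliasing between the &mut and &
// arguments, so a "load *z once" optimization is valid for every
// caller that compiles.
fn assign_both(x: &mut i32, y: &mut i32, z: &i32) {
    *x = *z;
    *y = *z;
}

fn main() {
    let mut a = 1;
    let mut b = 2;
    let c = 3;
    assign_both(&mut a, &mut b, &c); // disjoint: compiles and works
    assert_eq!((a, b), (3, 3));
    // assign_both(&mut a, &mut b, &a); // rejected: cannot borrow `a`
    //                                  // as immutable while it is
    //                                  // also borrowed mutably
}
```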

MangoToupe 1/1/2026|||
> If Rust optimizes based on borrow-checker assumptions

This is a binary assumption, which you can understand to evaluate to "true" in the absence of a borrow checker. If it is "false", it halts the compiler.

djdjfljfgddfg 1/2/2026|||
I guess this is like putting an unsafe { } around all your code...
Narishma 1/2/2026|||
No, unsafe doesn't disable the borrow checker.
djdjfljfgddfg 1/2/2026||
lol, that'll teach me :D
aw1621107 1/2/2026|||
Not really. Unsafe blocks don't change the semantics of Rust code or disable Rust's normal checks, so if you have something that doesn't compile due to a borrow checker error adding an unsafe block around that code will do precisely nothing to get around that error.
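For instance (a hypothetical minimal example):

```rust
fn main() {
    let mut s = String::from("hi");
    let r = &s;
    // Wrapping a conflicting mutation in unsafe changes nothing:
    // adding `unsafe { s.push('!'); }` here would produce the same
    // E0502 borrow-check error as it would without the unsafe block.
    println!("{r}");
    s.push('!'); // fine here: the shared borrow `r` has ended
    assert_eq!(s, "hi!");
}
```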
tialaramex 1/1/2026||
If you write correct Rust code, it'll work; the borrowck is just that, a check. If the teacher doesn't check your homework where you wrote that 10 + 5 = 15, it's still correct. If you write incorrect code that breaks Rust's borrowing rules, it'll have unbounded Undefined Behaviour; unlike actual Rust, where that'd be an error, this thing will just give you broken garbage, exactly like a C++ compiler.

Evidently millions of people want broken garbage, Herb Sutter even wrote a piece celebrating how many more C++ programmers and projects there were last year, churning out yet more broken garbage, it's a metaphor for 2025 I guess.

ozgrakkurt 1/1/2026|||
I have been using kde for years now without a single problem. Calling cpp garbage sounds wrong.
doodlesdev 1/1/2026|||
KDE is a great desktop environment, but it's also notorious for being a buggy and unpolished DE [1]. It's good your experience wasn't like that, but it's certainly not how the software is generally perceived.

[1]: Of course, different versions have different levels of stability. Also, some of these bugs and problems wouldn't be prevented by using an alternative language such as Rust.

danudey 1/1/2026||||
Well FWIW, the original poster's anti-C++ statements aside, removing the borrow checker does nothing except allow you to write thread-unsafe (or race condition-unsafe) code. Therefore, the only change this really makes is allowing you to write slightly more ergonomic code that could well break somewhere at some point in time unexpectedly.
tialaramex 1/2/2026||
Nope. Anything which wouldn't pass the borrowck is actually nonsense. This fantasy that magically it will just lose thread safety or have race conditions is just that, a fantasy.

The optimiser knows that Rust's mutable references have no aliases, so it needn't safeguard mutation, but without borrow checking this optimisation is incorrect and arbitrary undefined behaviour results.

LtWorf 1/1/2026|||
People who can't do something, sometimes assume nobody else possibly could.
3836293648 1/1/2026|||
People hate C because it's hard, people hate C++ because it truly is rubbish. Rubbish that deserved to be tried but that we've now learned was a mistake and should move on from.
zzrrt 1/1/2026|||
I’m sure some people could tiptoe through minefields daily for years, until they fail. Nobody is perfect at real or metaphorical minefields, and hubris is probably the only reason to scoff at people suggesting alternatives.
LtWorf 1/1/2026||
Just FYI rust projects have CVEs as well.
zzrrt 1/2/2026||
Of course. My sense is there are a lot fewer out-of-bounds accesses and use-after-frees. Maybe a world-class programmer can go several decades without writing a memory error in C/C++, but they will probably eventually falter; meanwhile the other 99.9% of programmers fail more often. Why would you decline a compiler's help eliminating certain types of bugs almost entirely?
Spivak 1/1/2026||||
How are we still having the same trade-off discussion, argued in such black-and-white terms, when reality has shown that both options are preferred by different groups?

Rust says that all incorrect programs (in terms of memory safety) are invalid but the trade is that some correct programs will also be marked as invalid because the compiler can't prove them correct.

C++ says that all correct programs are valid but the trade is that some incorrect programs are also valid.

You see the same trade being made with various type systems and people still debate about it but ultimately accept that they're both valid and not garbage.

Maxatar 1/1/2026||
>C++ says that all correct programs are valid but the trade is that some incorrect programs are also valid.

C++ does not say this, in fact no statically typed programming language says this, they all reject programs that could in principle be correct but get rejected because of some property of the type system.

You are trying to present a false dichotomy that simply does not exist and ignoring the many nuances and trade-offs that exist among these (and other) languages.

Spivak 1/1/2026|||
I knew I should have also put the (in terms of memory safety) on the C++ paragraph but I held off because I thought it would be obvious both talking about the borrow checker and in contrast to Rust with the borrow checker.

Yes, when it comes to types C++ will reject theoretically sound programs that don't type correctly. And different type system "strengths" tune themselves to how many correct programs they're willing to reject in order to accept fewer incorrect ones.

I don't mean to make it a dichotomy at all, every "checker", linter, static analysis tool—they all seek to invalidate some correct programs which hopefully isn't too much of a burden to the programmer but in trade invalidate a much much larger set of incorrect programs. So full agreement that there's a lot of nuance as well as a lot of opinions when it goes too far or not far enough.

tialaramex 1/1/2026|||
Nope. C++ really does deliberately require that compilers will in some cases emit a program which does... something even though what you wrote isn't a C++ program.

Yes, that's very stupid, but they did it with eyes open; it's not a mistake. In the C++ ISO document the words you're looking for are roughly (exact phrasing varies from one clause to another) Ill-formed No Diagnostic Required (abbreviated as IFNDR).

What this means is that these programs are Ill-formed (not C++ programs) but they compile anyway (No diagnostic is required - a diagnostic would be an error or warning).

Why do this? Well because of Rice's Theorem. They want a lot of tricky semantic requirements for their language but Rice showed (back in like 1950) that all the non-trivial semantic requirements are Undecidable. So it's impossible for the compiler to correctly diagnose these for all cases. Now, you could (and Rust does) choose to say if we're not sure we'll reject the program. But C++ chose the exact opposite path.

Maxatar 1/1/2026||
I'm not sure what you're replying to, but it can't be my comment, because what you're saying has absolutely nothing to do with it.

But kudos to you on writing an irrelevant wall of text.

1718627440 1/1/2026||
It does. The UB cases are false positives to the question "Is this a valid program?".
Maxatar 1/1/2026||
No one disputes that C++ accepts some invalid programs, I never claimed otherwise. I said that C++'s type system will reject some programs that are in principle correct, as opposed to what Spivak originally claimed about C++ accepting all correct programs as valid.

The fact that some people can only think in terms of all or nothing is really saying a lot about the quality of discourse on this topic. There is a huge middle ground here and difficult trade-offs that C++ and Rust make.

1718627440 1/1/2026||
Sorry, then I misunderstood you, do you have an example, of a correct rejected C++ program?
joshuamorton 1/1/2026||
Many cases that require any kind of cast are like this.
ada0000 1/1/2026||||
herb sutter and the c++ community as a whole have put a lot of energy into improving the language and reducing UB; this has been a primary focus of C++26. they are not encouraging people to “churn out more broken garbage”, they are encouraging people to write better code in the language they have spent years developing libraries and expertise in.
lordgroff 1/1/2026||
And for which there's often no serious alternative to in many domains anyway.
nemetroid 1/1/2026|||
Yes, many or even most domains where C++ sees a large market share are domains with no other serious alternative. But this is an indictment of C++ and not praise. What it tells us is that when there are other viable options, C++ is rarely chosen.

The number of such domains has gone down over time, and will probably continue to do so.

pron 1/1/2026||
The number of domains where low-level languages are required, and that includes C, C++, Rust, and Zig, has gone down over time and continues to do so. All of these languages are rarely chosen when there are viable alternatives (and I say "rarely" taking into account total number of lines of code, not necessarily number of projects). Nevertheless, there are still some very important domains where such languages are needed, and Rust's adoption rate is low enough to suggest serious problems with it, too. When language X offers significant advantages over language Y, its adoption compared to Y is usually quite fast (which is why most languages get close to their peak adoption relatively quickly, i.e. within about a decade).

If we ignore external factors like experience and ecosystem size, Rust is a better language than C++, but not better enough to justify faster adoption, which is exactly what we're seeing. It's certainly gained some sort of foothold, but as it's already quite old, it's doubtful it will ever be as popular as C++ is now, let alone in its heyday. To get there, Rust's market share will need to grow by about a factor of 10 compared to what it is now, and while that's possible, if it does that it will have been the first language to ever do so at such an advanced age.

tialaramex 1/1/2026||
> When language X offers significant advantages over language Y

So e.g. the silver bullet characteristics reported by Google among others in "Move Fast and Fix Things"?

https://security.googleblog.com/2025/11/rust-in-android-move...

There's always resistance to change. It's a constant, and as our industry itself ages it gets a bit worse. If you use libc++, did you know your sort didn't have O(n log n) worst-case performance until partway through the Biden administration? A suitable sorting algorithm was invented back in 1997, and those big-O bounds were finally mandated for C++ in 2011, but it still took until a few years ago to actually implement it for Clang.

pron 1/2/2026||
Except, as you say, all those factors always exist, so we can compare things against each other. No language to date has grown its market share by a factor of ten at such an advanced age [1]. Despite all the hurdles, successful languages have succeeded faster. Of course, it's possible that Rust will somehow manage to grow a lot, yet significantly slower than all other languages, but there's no reason to expect that as the likely outcome. Yes, it certainly has significant adoption, but that adoption is significantly lower than all languages that ended up where C++ is or higher.

[1]: In a competitive field, with selection pressure, the speed at which technologies spread is related to their relative advantage, and while slow growth is possible, it's rare because competitive alternatives tend to come up.

tialaramex 1/2/2026||
This sounds like you're just repeating the same claim again. It reminds me a little bit of https://xkcd.com/1122/

We get it: if you squint hard at the numbers you can imagine you're seeing a pattern, and if you're wrong, well, just squint harder and a new pattern emerges. It's foolproof.

pron 1/2/2026||
Observing a pattern with a causal explanation - in an environment with selective pressure things spread at a rate proportional to their relative competitive advantage (or relative "fitness") - is nothing at all like retroactively finding arbitrary and unexplained correlations. It's more along the lines of "no candidate has won the US presidential election with an approval of under 30% a month before the election". Of course, even that could still happen, but the causal relationship is clear enough so even though a candidate with 30% in the polls a month before the election could win, you'd hardly say that's the safer bet.
tialaramex 1/2/2026||
You're basically just re-stating my point. You mistakenly believe the pattern you've seen is predictive and so you've invented an explanation for why that pattern reflects some underlying truth, and that's what pundits do for these presidential patterns too. You can already watch Harry Enten on TV explaining that out-of-cycle races could somehow be predictive for 2026. Are they? Not really but eh, there's 24 hours per day to fill and people would like some of it not to be about Trump causing havoc for no good reason.

Notice that your pattern offers zero examples and yet has multiple entirely arbitrary requirements, much like one of those "No President has been re-elected with double digit unemployment" predictions. Why double digits? It is arbitrary, and likewise for your "about a decade" prediction, your explanation doesn't somehow justify ten years rather than five or twenty.

pron 1/2/2026||
> You mistakenly believe the pattern you've seen is predictive

Why mistakenly? I think you're confusing the possibility of breaking a causal trend with the likelihood of doing that. Something is predictive even if it doesn't have a 100% success rate. It just needs to have a higher chance than other predictions. I'm not claiming Rust has a zero chance of achieving C++'s (diminished) popularity, just that it has a less than 50% chance. Not that it can't happen, just that it's not looking like the best bet given available information.

> Notice that your pattern offers zero examples

The "pattern" includes all examples. Name one programming language in the history of software that's grown its market share by a factor of ten after the age of 10-13. Rust is now older than Java was when JDK 6 came out and almost the same age Python was when Python 3 came out (and Python is the most notable example of a late bloomer that we have). Its design began when Java was younger than Rust is now. Look at how Fortran, C, C++, and Go were doing at that age. What you need to explain isn't why it's possible for Rust to achieve the same popularity as C++, but why it is more likely than not that its trend will be different from that of any other programming language in history.

> Why double digits? It is arbitrary, and likewise for your "about a decade" prediction

The precise number is arbitrary, but the rule is that any technology (or anything in a field with selective pressure) spreads at a rate proportional to its competitive advantage. You can ignore the numbers altogether, but the general rule about the rate of adoption of a technology, or any ability that offers a competitive advantage in a competitive environment, remains. The rate of Rust's adoption is lower than that of Fortran, Cobol, C, C++, VB, Java, Python, Ruby, C#, PHP, and Go, and is more-or-less similar to that of Ada. You don't need numbers, just comparisons. Are the causal theory and historical precedent 100% accurate for any future technology? Probably not, as we're talking statistics, but at this point it is the bet that a particular technology will buck the trend that needs justification.

I certainly accept that the possibility of Rust achieving the same popularity that C++ has today exists, but I'm looking for the justification that that is the most likely outcome. Yes, some places are adopting Rust, but the number of those saying nah (among C++ shops) is higher than that of all programming languages that have ever become very popular. The point isn't that bucking a trend with a causal explanation is impossible. Of course it's possible. The question is whether it is more or less likely than not breaking the causal trend.

tialaramex 1/7/2026||
Your hypothetical "factor of ten" market share growth requirement means it's literally impossible for all the big players to achieve this, since they presumably have more than 10% market share, and such a "factor of ten" increase would mean they somehow had more than the entire market. When declaring success for a model because it predicted that a literally impossible thing wouldn't happen, I'd suggest that model is actually worthless. We all knew that literally impossible things don't happen; confirming that doesn't validate the model.

Let's take your Fortran "example". What market share did Fortran have, according to you, in say 1959? How did you measure this? How about in 1965? Clearly you're confident, unlike Fortran's programmers, users and standards committee, that it was all over by 1966. Which is weird (after all, that's when Fortran 66 comes into the picture), but I guess once I see how you calculate these outputs it'll make sense, right?

pron 1/8/2026||
> means it's literally impossible for all the big players to achieve this

Only because they've achieved that 10% in their first decade or so, but what I said is the case for all languages, big and small alike (and Rust doesn't have this problem because it needs a 10x boost to approach C++'s current market share, which is already well below its peak). But the precise numbers don't matter. You can use 5x and it would still be true for most languages. The point is that languages - indeed, all technologies, especially in a competitive market - reach or approach their peak market share relatively quickly.

You make it sound like a novel or strange theory, but it's rather obvious when you look at the history. And the reason is that if a technology offers a big competitive advantage, it's adopted relatively quickly as people don't want to fall behind the competition. And while a small competitive advantage could hypothetically translate to steady, slow growth, what happens is that over that time, new alternatives show up and the language loses the novelty advantage without ever having gained a big-player advantage.

That's why, as much as I like, say, Clojure (and I like it a lot), I don't expect to see much future growth.

> Clearly you're confident, unlike Fortran's programmers

Yes, because I have the benefit of hindsight. Also, note that I'm not saying anything about decline (which happens both quickly and slowly), only that technologies in a competitive market reach or approach their peak share quickly. Fortran clearly became the dominant language for its domain in under a decade.

But anyway, if you think that steady slow growth is a likelier or more common scenario than fast growth - fine. I just think that thesis is very hard to support.

tialaramex 1/13/2026||
> The point is that languages - indeed, all technologies, especially in a competitive market - reach or approach their peak market share relatively quickly.

This predicts nothing in particular, for any outcome we can squint at this and say it was fulfilled, so in this sense it's actually worse than the XKCD cartoon.

It's not that it's a novel or strange theory, it's just wrong.

> if a technology offers a big competitive advantage, it's adopted relatively quickly as people don't want to fall behind the competition

Yeah, no. See, humans have a strong preference for the status quo, so it isn't enough that some technology "offers a big competitive advantage"; they'd usually just rather not, actually. Lots of decision makers read Google's "Move Fast and Fix Things" and went "Yeah, that's not applicable to us [for whatever reason]" and moved on. It doesn't matter whether they were right to decide it wasn't applicable; it only matters whether their competitors reach a different conclusion and execute effectively.

pron 1/13/2026||
> It's not that it's a novel or strange theory, it's just wrong.

Okay. Can you provide an example of a language that steadily and gradually grew in popularity over a long time (well over a decade), where that slow growth was the lion's share of its market-size growth? You say "it's just wrong", but I think it applies in 100% of cases; and if you want to be specific about numbers, even the set of languages whose market share has grown by a factor of 5 after age 10 is a small minority, and even a factor of 2 is a minority.

> Yeah, no. See, humans have a strong preference for the status quo so it isn't enough that some technology "offers a big competitive advantage", they'd usually just rather not actually.

Except, again, all languages, successful and unsuccessful alike, have approached their peak market share in their first decade or so. You can quibble over what I mean by "approach" but remember that Rust, at age 10+, needs to grow its market share by a factor of 10 to even match C++'s already-diminished market share today.

ada0000 1/1/2026|||
even when there are alternatives, sometimes it makes sense to use a library like Qt in its native language with its native documentation rather than a binding - if you can do so safely
lordgroff 1/1/2026||||
The attitude expressed here and that tends to surface in any Rust discussion is the reason I completely lost interest in the language.
maxbond 1/1/2026|||
Rust isn't a one true language, no one necessarily needs to learn it, and I'm sure your preferred language is excellent. C and C++ are critical languages with legitimate advantages and use cases. Don't learn Rust if you aren't interested.

But Rust, its community, and language flame wars are separate concerns. When I talk shop with other Rust people, we talk about our projects, not about hating C++.

Maxatar 1/1/2026||||
So don't use it. Rust is not intended to be used by everyone. If you are happy using your current set of tools and find yourself productive with them then by all means be happy with it.
tempodox 1/1/2026|||
You’re expressing the same attitude here, just in reverse. Some users not thinking highly of C++ doesn’t make Rust a worse or less interesting language.
jjgreen 1/1/2026||||
It is possible to like something without hating people who like something else, can't people just live and let live?
tialaramex 1/1/2026||
Did I write that I hated somebody? I don't think I wrote anything of the sort. I can't say my thoughts about Bjarne for example rise to hatred, nobody should have humoured him in the 1980s, but we're not talking about what happened when rich idiots humoured The Donald or something as serious as that - nobody died, we just got a lot of software written in a crap programming language, I've had worse Thursdays.

And although of course things could have been better they could also have been worse. C++ drinks too much OO kool aid, but hey it introduced lots of people to generic programming which is good.

jjgreen 1/1/2026||
Correct me if I'm wrong, but I don't think you believe that C++ programmers actually want to write "broken garbage", so when you say "millions of people want broken garbage" the implication is that a) they do write broken garbage, and b) they're so stupid they don't even know that is what they are doing. I can't really read it any other way than in the same vein as an apartheid-era white South African statement starting "all blacks ...", i.e., an insult to a large class of people simply for their membership in that class. Maybe that's not your intent, but that's how it reads to me, sorry.
tialaramex 1/1/2026|||
I can't help how you feel about it, but what I see is people who supposedly "don't want" something to happen and yet take little or no concrete action to prevent it. When it comes to their memory safety problem WG21 talks about how they want to address the problem but won't take appropriate steps. Years of conference talks about safety, and C++ 26 is going to... encourage tool vendors to diagnose some common mistakes. Safe C++ was rejected, and indeed Herb had WG21 write a new "standing rule" which imagines into existence principles for the language that in effect forbid any such change.

Think Republican Senators offering thoughts and prayers after a school shooting, rather than Apartheid era white South Africans.

3836293648 1/1/2026||||
Are you seriously comparing discrimination based on factors no one can control to a group literally defined by a choice they made? And you think that's a good-faith argument?
dminik 1/1/2026|||
Considering how many people will defend C++ compilers bending over backwards to exploit some accidental undefined behaviour with "but it's fast though" then yeah, that's not an inaccurate assessment.
amelius 1/1/2026||||
People don't want garbage. But in any case, they don't want straightjackets like the borrow checker.

Hence, they use GC'd languages like Go whenever they can.

eru 1/1/2026|||
Straightjackets can be very useful.

Haskell (and OCaml etc) give you both straightjackets and a garbage collector. Straightjackets and GC are very compatible.

Compared to C, which has neither straightjackets nor a GC (at least not by default).

qsera 1/1/2026|||
>Haskell (and OCaml etc) give you both straightjackets..

Haskell's thing with purity and IO does not feel like that. In fact, Haskell does it right (IO is reflected in the type). And rust messed it up ("safety" does not show up in types).

You want a global mutable thing in Haskell? just use something like an `IORef` and that is it. It does not involve any complicated type magic. But mutations to it will only happen in IO, and thus will be reflected in types. That is how you do it. That is how it does not feel like a straight jacket.

Haskell as a language is tiny. But Rust is really huge, with an endless set of behaviors and expectations to keep in mind, for some idea of safety that only matters for a small fraction of programs.

And that is why I find that comment very funny. Always using rust is like always wearing something that constrains you greatly, for some idea of "safety", even when it does not really matter. That is insane..

gpm 1/1/2026|||
> "safety" does not show up in types

It does in rust. An `unsafe fn()` is a different type than an (implicitly safe, by the lack of the keyword) `fn()`.

The difference is that unsafe fn's can be encapsulated in safe wrappers, where as IO functions sort of fundamentally can't be encapsulated in non-IO wrappers. This makes the IO tagged type signatures viral throughout your program (and as a result annoying), while the safety tagged type signatures are things you only have to think about if you're touching the non-encapsulated unsafe code yourself.
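A minimal sketch of what that encapsulation looks like (the names here are mine, purely illustrative):

```rust
// An unsafe fn: callers must uphold a contract the compiler can't check.
// Here the contract is that `ptr` is valid, aligned, and points at an i32.
unsafe fn read_first(ptr: *const i32) -> i32 {
    unsafe { *ptr }
}

// A safe wrapper: the slice type guarantees the pointer is valid,
// so the unsafety is encapsulated and callers never write `unsafe`.
fn first(xs: &[i32]) -> Option<i32> {
    if xs.is_empty() {
        return None;
    }
    // SAFETY: we just checked the slice is non-empty, so as_ptr()
    // points at at least one valid, aligned i32.
    Some(unsafe { read_first(xs.as_ptr()) })
}

fn main() {
    let v = vec![10, 20, 30];
    assert_eq!(first(&v), Some(10));
    assert_eq!(first(&[]), None);
    println!("ok");
}
```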

qsera 1/1/2026||
>The difference is that unsafe fn's can be encapsulated in safe wrappers

This is the koolaid I am not willing to drink.

If you can add safety very carefully on top of unsafe stuff (without any help from compiler), why not just use `c` and add safety by just being very careful?

> IO tagged type signatures viral throughout your program (and as a result annoying)..

Well, that is what good type systems do. Carry information in the types "virally". Anything short of that is a flawed system.

ninkendo 1/1/2026|||
> If you can add safety very carefully on top of unsafe stuff (without any help from compiler), why not just use `c` and add safety by just being very careful?

Y'know people complain a lot about Rust zealots and how they come into discussions and irrationally talk about how Rust's safety is our lord and savior and can eliminate all bugs or whatever...

But your take (and every one like it) is one of the weakest I've heard as a retort.

At the end of the day "adding safety very carefully atop of unsafe stuff" is the entire point of abstractions in software. We're just flipping bits at the end of the day. Abstractions must do unsafe things in order to expose safe wrappers. In fact that's literally the whole point of abstractions in the first place: They allow you to solve one problem at a time, so you can ignore details when solving higher level problems.

"Hiding a raw pointer behind safe array-like semantics" is the whole point of a vector, for instance. You literally can't implement one without being able to do unsafe pointer dereferencing somewhere. What would satisfy your requirement for not doing unsafe stuff in the implementation? Even if you built a vector into the compiler, it's still ultimately emitting "unsafe" code in order to implement the safe boundary.
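As a toy illustration of that boundary (this is not how std's Vec is actually written, just a sketch of the shape): a fixed-capacity buffer doing raw allocation and pointer writes inside, with a bounds-checked safe interface outside.

```rust
use std::alloc::{alloc, dealloc, Layout};

// A toy buffer of i32s: raw pointers inside, safe API outside.
struct TinyVec {
    ptr: *mut i32,
    cap: usize,
    len: usize,
}

impl TinyVec {
    fn with_capacity(cap: usize) -> Self {
        assert!(cap > 0);
        let layout = Layout::array::<i32>(cap).unwrap();
        // SAFETY: layout has non-zero size because cap > 0.
        let ptr = unsafe { alloc(layout) as *mut i32 };
        assert!(!ptr.is_null());
        TinyVec { ptr, cap, len: 0 }
    }

    fn push(&mut self, x: i32) {
        assert!(self.len < self.cap, "toy vec is full");
        // SAFETY: len < cap, so this write stays inside the allocation.
        unsafe { self.ptr.add(self.len).write(x) };
        self.len += 1;
    }

    fn get(&self, i: usize) -> Option<i32> {
        if i < self.len {
            // SAFETY: i < len <= cap, so the read is in bounds and initialized.
            Some(unsafe { self.ptr.add(i).read() })
        } else {
            None
        }
    }
}

impl Drop for TinyVec {
    fn drop(&mut self) {
        let layout = Layout::array::<i32>(self.cap).unwrap();
        // SAFETY: ptr was allocated with this exact layout.
        unsafe { dealloc(self.ptr as *mut u8, layout) };
    }
}

fn main() {
    let mut v = TinyVec::with_capacity(4);
    v.push(1);
    v.push(2);
    assert_eq!(v.get(0), Some(1));
    assert_eq!(v.get(5), None);
    println!("ok");
}
```

Every `unsafe` block is local and auditable; code that only uses `push`/`get` can't misuse the pointer.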

If you want user-defined types that expose things with safe interfaces, they have to be implemented somehow.

As for why this is qualitatively different from "why not just use c", it's because unsafety is something you have to opt into in rust, and isn't something you can just do by accident. I've been developing in rust every day at $dayjob for ~2 years now and I've never needed to type the unsafe keyword outside of a toy project I made that FFI'd to GTK APIs. I've never "accidentally" done something unsafe (using Rust's definition of it.)

It's an enormous difference to something like C, where simply copying a string is so rife with danger you have a dozen different strcpy-like functions each of which have their own footguns and have caused countless overflow bugs: https://man.archlinux.org/man/string_copying.7.en

qsera 1/2/2026||
I think it comes down to

1. In `c`, one has to remember a few fairly intuitive things, and enforce them without fail.

2. In rust, one has to learn and remember an ever-increasing number of things and constantly deal with non-intuitive borrow-checker shenanigans that can hit your project at any point of development, forcing you to re-architect it, despite doing everything to ensure "safety". But the borrow checker can't be convinced.

I have had enough of 2. I might use rust if I wanted to build a critical system with careless programmers, but who would do such a thing? For open-source dependencies, one will have to go by community vouching or audit them oneself. Can't count something as "safe" just because it is in rust, right? So what is the point? I just don't see it. I mean, if you look a bit deeper, it just does not make any sense.

adastra22 1/2/2026||
Do you have any examples of that?
qsera 1/2/2026||
What is the point. If I share something, someone is going to come along and say. That is not how you are "supposed" to do it in rust.

And that is exactly my point. You need to learn a zillion rust-specific patterns for doing every little thing to work around the borrow checker, and you end up kind of unable to come up with your own designs with the trade-offs that you choose.

And that becomes very mechanical and hence boring. I get that it would be safe.

So yes, if I am doing brain surgery, I would use tools that prevent me from making quick arbitrary movements. But for everything else a glove would do.

adastra22 1/2/2026||
To learn something is generally the point. Either me, or you. I’ve been developing in rust for half a decade now and genuinely do not know what you’re talking about here. I haven’t experienced it.

So either there are pain points that I’m not familiar with (which I’m totally open to), or you might be mistaken about how rust works. Either way, one or both of us might learn something today.

qsera 1/2/2026||
All lessons are not equally valuable. Seemingly arbitrary reasoning for some borrow checker behavior is not interesting enough for me to learn.

In the past, I would come across something, look it up, and the reasoning for it would often be "What if another thread does blah blah blah", but my program is single-threaded.

adastra22 1/2/2026||
Borrow checker issues do not require multiple threads or async execution to be realized. For example, a common error in C++ is to take a reference/iterator into a vector, then append/push onto the end of that vector, then access the original reference. If that causes reallocation, the reference is no longer valid and this is UB. Rust catches this because append requires a mutable reference, and the borrow checker ensures there are no other outstanding references (read-only or mutable) before taking the &mut self reference for appending.
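Transliterated into Rust, the pattern looks like this (the commented-out line marks the call the borrow checker would reject):

```rust
fn main() {
    let mut v = vec![1, 2, 3];

    let first = &v[0];  // shared borrow into the vector's buffer
    // v.push(4);       // ERROR if uncommented: cannot borrow `v` as
                        // mutable while `first` is still live
    println!("{first}");

    // The accepted ordering: finish using the reference, then mutate,
    // then re-borrow. A dangling reference is ruled out by construction.
    v.push(4);
    let first_again = &v[0];
    assert_eq!(*first_again, 1);
    assert_eq!(v.len(), 4);
}
```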

This is generally my experience with Rust: write something the way I would in C++, get frustrated at borrow checker errors, then look into it and learn my C++ code has hidden bugs all these years, and appreciate the rust compiler’s complaints.

qsera 1/2/2026||
>If that causes reallocation, the reference is no longer valid

Doesn't the append/push function return a pointer in that case? At least in `c` there are special functions that reallocate, and it is not done implicitly (but I understand someone could write a function that does it).

Thus it appears that the borrow checker's behavior is guided by bad designs in other languages. When bad design is patched with more design, the latter often becomes non-intuitive and restricting. That seems to have happened with rust's borrow checker.

adastra22 1/3/2026||
In C++? No. The vector container is auto resizing. When it hits capacity limits it doubles the size of the allocation and copies the contents to the new memory. An insertion operation will give you an iterator reference to the newly inserted value, but all existing references may or may not remain valid after the call.

This isn’t “guided by bad design.” The borrow checker wasn’t written to handle this one use case. It was designed to make all such errors categorically impossible.

qsera 1/3/2026||
I got how c++ vectors work. Rust `Vec`s work similarly IIRC. But the problem in C++ is that it allowed you to make a `Vec` from a raw pointer.

Anyway, IMHO rust has thrown the baby out with the bathwater. To make such errors (i.e. bad design) categorically impossible, it also made a huge class of valid programs impossible. And in turn, to patch that, it gave yet another bunch of stuff (IIRC `Rc`, `RefCell`) that people are supposed to learn (they have horrible interfaces IMHO) with which some of those programs could be implemented.

I think someone else should give it another shot. Maybe they can come up with a better solution than this "borrow checker"..

gpm 1/1/2026|||
> This is the koolaid I am not willing to drink.

> If you can add safety very carefully on top of unsafe stuff (without any help from compiler), why not just use `c` and add safety by just being very careful?

There is help from the compiler - the compiler lets the safe code expose an interface that creates strict requirements about how it is called and interacted with. The C language isn't expressive enough to define the same safe interface and have the compiler check it.

You can absolutely write the unsafe part in C. Rust is as good at encapsulating C into a safe rust interface as it is at encapsulating unsafe-rust into a safe rust interface. Just about every non-embedded rust program depends on C code encapsulated in this manner.

> Well, that is what good type systems do. Carry information about the types "virally". Anything short is a flawed system.

Good type systems describe the interface, not every implementation detail. Virality is the consequence of implementation details showing up in the interface.

Good type systems minimize the amount of work needed to use them.

IO is arguably part of the interface, but without further description of what IO it's a pretty useless detail of the interface. Meanwhile exposing a viral detail like this as part of the type system results in lots of work. It's a tradeoff that I think is generally not worth it.

qsera 1/1/2026||
>the compiler lets the safe code expose an interface that creates strict requirements about how it is being called with and interacted with..

The compiler does not and cannot check if these strict requirements are enough for the intended "safety". Right? It is the judgement of the programmer.

And what is stopping a `c` function with such requirements from being wrapped in some code that actually checks these requirements are met? The only thing the rust compiler enables is a way to mark a specific function as unsafe.

In both cases there is zero help from the compiler to actually verify that the checks that are done on top are sufficient.

And if you want to mark a `c` function as unsafe, just follow some naming convention...

>but without further description of what IO it's a pretty useless detail of the interface..

Take a look at effect-system libraries which can actually encode "What IO" at the type level and make it available everywhere. It is a pretty basic and widely used thing.

gpm 1/1/2026||
> The compiler does not and cannot check if these strict requirements are enough for the intended "safety". Right? It is the judgement of the programmer.

Yes*. It's up to the programmer to check that the safe abstraction they create around unsafe code guarantees all the requirements the unsafe code needs are upheld. The point is that that's done once, and then all the safe code using that safe abstraction can't possibly fail to meet those requirements - or in other words, any safety-related bug is always in the relatively small amount of code that uses unsafe and builds those safe abstractions.

> And what is stopping a `c` function with such requirements from being wrapped in some code that [doesn't] actually check these requirements are met?

Assuming my edit to your comment is correct - nothing. It's merely the case that any such bug would be in the small amount of clearly labelled (with the unsafe keyword) binding code instead of "anywhere".

> The only thing that the rust compiler enables is to include a feature to mark a specific function as unsafe.

No, the rust compiler has a lot more features than just a way to mark specific functions as unsafe. The borrow checker and its associated lifetime constraints, enforcing that variables that are moved out of (and aren't `Copy`) aren't used again, is one obvious example.

Another example is marking how data can be used across threads with traits like `Send` and `Sync`. Another - when compared to C anyways - is simply having a visibility system so that you can create structs with fields that aren't directly accessible via other code (so you can control every single function that directly accesses them and maintain invariants in those functions).
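A small sketch of that last point: private fields force all access through invariant-preserving functions (the `Even` type here is my own toy example, not from the thread).

```rust
mod even {
    // `value` is private: outside this module, the only way to construct
    // or change an Even is through methods that preserve the invariant
    // "value is always an even number".
    pub struct Even {
        value: i64,
    }

    impl Even {
        pub fn new(v: i64) -> Option<Even> {
            if v % 2 == 0 { Some(Even { value: v }) } else { None }
        }
        pub fn get(&self) -> i64 {
            self.value
        }
        // Sum of two even numbers is even, so the invariant holds.
        pub fn add(&mut self, other: &Even) {
            self.value += other.value;
        }
    }
}

fn main() {
    let mut a = even::Even::new(4).unwrap();
    let b = even::Even::new(10).unwrap();
    a.add(&b);
    assert_eq!(a.get(), 14);
    assert!(even::Even::new(3).is_none());
    // even::Even { value: 3 }  // would not compile: `value` is private
    println!("ok");
}
```

In C, any code holding a pointer to the struct can scribble over its fields; here the compiler limits direct access to one module.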

> In both cases there is zero help from the compiler to actually verify that the checks that are done on top are sufficient.

Yes and no; "unsafe" in rust is synonymous with "the compiler isn't able to verify this for you". Typically rust docs do a pretty good job of enumerating exactly what the programmer must verify. There are tools that try to help the programmer do this, from simple things like a lint that checks that every time you write unsafe you leave a comment saying why it's ok, and that you actually wrote something the compiler couldn't verify in the first place, to complex things like a (very slow) interpreter that carefully checks that in at least one specific execution every required invariant is maintained (with the exception of some FFI stuff that it fails on, as it is unable to see across language boundaries sufficiently well).

The rust ecosystem is very interested in tools that make it easier to write correct unsafe code. It's just rather fundamentally a hard problem.

* Technically there are very experimental proof systems that can check some cases these days. But I wouldn't say they are ready for prime time use yet.

eru 1/2/2026||||
> You want a global mutable thing in Haskell? just use something like an `IORef` and that is it. It does not involve any complicated type magic. But mutations to it will only happen in IO, and thus will be reflected in types. That is how you do it. That is how it does not feel like a straight jacket.

Haskell supports linear types now. They are pretty close in spirit to Rust's borrowing rules.

> Haskell as a language is tiny.

Not at all. Though much of what Haskell does can be hand-waved as sugar on top of a smaller core.

elbear 1/1/2026|||
When I started learning Haskell, it did feel like coding with a straightjacket.
qsera 1/1/2026||
I think that is because when you start learning Haskell, you are not typically told about state monads, `IORef`s and the like that enable safe mutability.

It might be because monads involve a tad more advanced type machinery. But `IORef`s are straightforward; one typically just does not come across them until a bit too late in their Haskell journey.

reactordev 1/1/2026||||
>Straitjackets can be very useful.

Only if you’re insane.

qsera 1/1/2026|||
Damn! This is the funniest HN comment that I have ever come across...
imtringued 1/1/2026||||
The meaning of straightjacket here is inherently subjective and not meant to be taken literally.
josephg 1/1/2026|||
How dare you. C is a fine language.

Just don't accidentally step on any of these landmines and we'll all get along great.

reactordev 1/1/2026||
Not to mention your sidearm is a Sig P365. We like to call them footguns.
127 1/1/2026||||
You call it a straightjacket, I call it a railroad track for reliably delivering software.
pron 1/1/2026|||
For all its faults, and it has many (though Rust shares most of them), few programming languages have yielded more value than C++. Maybe only C and Java. Calling C++ software "garbage" is a bonkers exaggeration and a wildly distorted view of the state of software.
ViewTrick1002 1/1/2026||
For everyone unaware, this repo is a meme:

https://www.reddit.com/r/rust/comments/1q0kvn1/corroded_upda...

As a follow on to the corroded meme crate:

https://github.com/buyukakyuz/corroded

> What Is This

> The rust compiler thinks it knows better than you. It won't let you have two pointers to the same thing. It treats you like a mass of incompetence that can't be trusted with a pointer.

> We fix that.

amluto 1/1/2026|
It does seem like satire. The very first example is:

    fn main() {
        let a = String::from("hello");
        let b = a;
        println!("{a}");  // Works! Prints: hello
    }
This is not “I have correct code but Rust can’t tell it’s correct.” This is “wow, this code is intentionally outrageously wrong, obviously dereferences a pointer that is invalid, and happens to work anyway.”
1718627440 1/1/2026||
> this code is intentionally outrageously wrong

Can you explain why? Why can't both a and b point at the same string object? Does `let b = a;` do something like a destructive move?

tredre3 1/1/2026|||
The rust way isn't intuitive if you're coming from C, but b = a does indeed transfer ownership to b, and a is now invalid/unusable. You would need references (or a clone) if you want two variables that point to the same object.

    error[E0382]: borrow of moved value: `a`
     --> main.rs:4:16
      |
    2 |     let a = String::from("hello");
      |         - move occurs because `a` has type `String`, which does not implement the `Copy` trait
    3 |     let b = a;
      |             - value moved here
    4 |     println!("{a}");  // Works! Prints: hello
      |                ^ value borrowed here after move
      |
      = note: this error originates in the macro `$crate::format_args_nl` which comes from the expansion of the macro `println` (in Nightly builds, run with -Z macro-backtrace for more info)
    help: consider cloning the value if the performance cost is acceptable
      |
    3 |     let b = a.clone();
      |              ++++++++
aw1621107 1/1/2026||||
> Does `let b = a;` do something like a destructive move?

Yes. Semantically, Rust performs destructive moves by default, and as a result using `a` after `let b = a;` would normally result in a hard error [0].

The way destructive moves are (currently?) actually implemented, however, is as a shallow memcpy of the value in question, coupled with compiler checks that the moved-from thing isn't used. As a result, if you disable the compiler check, simple uses of the moved-from value immediately after the move can still work, since the compiler takes no explicit steps to invalidate the moved-from memory.

[0]: https://rust.godbolt.org/z/Wdr6G1GsK

gpm 1/1/2026|||
I'm not 100% sure the semantics here are nailed down - but I think there's no guarantee that `a` continues to exist after assignment to `b`. The value in it has been moved out of it after all... The memory which was used for the variable `a` can probably be re-used for something else, e.g. for some inlined variable used by `println!`...

In normal rust `let a = b` where the variable is of a non-Copy type (including String) is "destructive" in the sense that you can no longer use b.

The question about semantics in normal rust turns to "so if I have a raw-pointer to a hanging around and use unsafe code to copy the value out of it what do I get" and I'm not 100% sure... but I think the answer is probably it's a use after free and you get undefined behavior. The rust-- version is basically just this except you don't have to explicitly make that raw pointer to read the old memory.

corrode2711 1/1/2026||
I'm the author of this repo. I see some really angry comments, some of them even personal. Obviously I didn't think that just by tinkering with a compiler, I'd get personally attacked, but anyway, fair enough.

For those of you confused: yes, this started as a satirical project with the corroded lib. Then I thought "why not just remove the borrow checker?" without any real motivation. Then I just went ahead and did it. To my surprise, it was really simple and easy. I thought it would be heavily tangled into the rustc compiler, but once I figured out where the error emitter is, it was pretty straightforward.

I'm not sure about my long-term goals, but besides the joke, I genuinely think for debugging and prototyping purposes, I'd like the borrow checker to shut up. I'm the kind of guy that prints everything while debugging and prototyping. Maybe you're using a debugger, okay, but I don't. I don't like debuggers. It's just more convenient for me. So what constantly happens is I run into issues like: does this implement Debug? Can I print this after it moved? The borrow checker won't let me access this because of some other borrow. Stuff like that.

Another point is, as you guys are well aware, the borrow checker will reject some valid programs in order to never pass any invalid program. What if I'm sure about what I'm doing and I don't want that check to run?

In the repo there's a doubly linked list example. Without the borrow checker it's fairly simple and easy to implement. With it, you know how complicated and ugly it gets.

Anyway, have a good new year, and don't get angry over compilers, you know.

CodeMage 1/1/2026||
> Then I thought "why not just remove the borrow checker?" without any real motivation.

Reminds me of a chemistry kit I had as a kid. None of this tame, safe stuff you can buy these days. Mine was a gift from my dad and I never thought of asking him where he dug it up, but it had stuff like pure sulfuric acid in it.

One day, when I was done with all of the experiments I had planned to do, I decided to mix a few things and heat them up, just for fun, without any real motivation other than "let's see what happens".

Let's just say I was lucky we only had to replace some of the clothes my mom had left out for me to put away. ;)

> Another point is, as you guys are well aware, the borrow checker will reject some valid programs in order to never pass any invalid program. What if I'm sure about what I'm doing and I don't want that check to run?

Then you do it using the "unsafe" keyword, and you think long and hard about how to design and structure the code so that the unsafe code is small in scope, surface, and blast radius.

That's precisely what unsafe code is for: to get around the borrow checker and assert you know what you're doing. Of course, if you're wrong, that means your program will blow up, but at least you know that the culprit is hiding in one of those unsafe areas, rather than literally anywhere in the whole codebase.

Alternately, you can switch to a language with a different ethos.

The ethos of Rust is caring for memory safety so much that you willingly limit yourself in terms of what kind of code you write and you only step out of those limits reluctantly and with great care. That's something that resonates with a lot of people and Rust has been built on top of that for years.

If you suddenly take the product of those years of hard work, strip out the foundation it has been built on, and unironically offer it as a good idea, a lot of people won't like it and will tell you so. Mind, I'm not excusing the personal attacks, I'm just explaining the reaction.

corrode2711 1/1/2026||
Anything fun is dangerous. Or anything dangerous is fun. Something like that.
ethin 1/4/2026|||
I ran into the borrow checker so many times when doing OS dev, which is why I hold the position that Rust is not a suitable language for OS development/kernels/embedded systems. In such scenarios, multithreading/multiprocessing is something you explicitly have to opt into: although most systems that aren't x86 may start all processors simultaneously, your bootstrap code should explicitly park all other cores before continuing. But the problem I always ran into was that Rust wanted me to do something weird for a system that was not multi-threaded. As in, I didn't even support threads of any kind. Or coroutines. But I still had to wrap statics in Lazy<T> or Rc<T> or something because "uwuw this is unsafe oh the world is going to end!" And all I wanted was to tell the borrow checker to shut up for once, because yes, I actually did know what I was doing, thank you very much.

I get the idiom of "trust the compiler, it knows your code better than you". But there are also instances where the compiler just needs to trust the programmer because the programmer is, in fact, smarter than the compiler in those instances.

whatshisface 1/1/2026|||
I think there is probably a way to do what you're doing with unsafe. You could write a library that copies handles and can dump potentially freed memory afterwards.
g-mork 1/1/2026|||
Some kind of cargo plugin that transforms all references in the project into pointers and casts prior to feeding to rustc would probably be the best practice and highly maintainable route I'd go. like "cargo expand" but with a fancy catchier name that encourages new users to rely on it. "cargo autofix" might work
jvanderbot 1/1/2026|||
There's definitely a way to do it without unsafe! It just isn't as simple as dropping in one println, so... let's alter the compiler?

I gotta applaud that level of my-way-or-the-highway

zaptheimpaler 1/2/2026|||
This is incredible! I honestly think Rust is an awesome language because it has a lot of high-level niceties like HOF, powerful libraries but can simultaneously let you manage memory manually or put assembly next to your regular code - the borrow checker is an annoyance in a lot of cases like prototyping as you said. I would loove if this was an official mode that Rust supported, Rust could really work in any niche with that capability.
bkovitz 1/2/2026|||
Thanks for explaining why you made Rust--! I figured it was just whimsical play, and I find it delightful. I'm especially delighted to hear that it wasn't even hard to disable the borrow checker.

I hope you will answer the question here from @dataflow about whether, without the borrow checker, compiler optimizations will emit incorrect code. Did you have to make further modifications to disable those optimizations?

oblio 1/1/2026|||
You, sir, might one day belong to the Computing Hall of Fame, together with the creators of Brainf*k, Visual Basic, PHP, Javascript, ColdFusion, etc. :-p
npalli 1/1/2026|||
You just need to master one package manager in depth and you will get what you really want with Modern C++.
linolevan 1/1/2026||
[dead]
xlii 1/1/2026||

    > In addition to meeting the Open Source Definition, the following standards apply to new licenses:
    > (...) The license does not have terms that structurally put the licensor in a more favored position than any licensee.
    https://opensource.org/licenses/review-process

That's a fun fact I learned from an IP lawyer when discussing the possibility of an open-source but otherwise LLM-exempt license. If there is an exemption (even one limited to LLMs), such a license is most likely OSI-incompatible.
QuaternionsBhop 1/1/2026||
Fighting the borrow checker is something you do when you're learning Rust. After you learn how to design things that way in the first place, it's just there to keep you honest.
tinco 1/1/2026|
To be fair, working on the Rust compiler is also something you do when you're learning Rust. I guess this person is killing two birds with one stone.
raluk 1/1/2026||
What are the potential issues with the compiler when just disabling the borrow checker? If I recall correctly, some compiler optimisations for Rust cannot be done in C/C++ because of restrictions implied by the borrow checker.
0xdeafbeef 1/1/2026||
Rust can apply `restrict` semantics to all references, because of the "one mutable xor many shared refs" rule. The borrow checker enforces this. https://en.wikipedia.org/wiki/Restrict
imtringued 1/1/2026||
The crazy part about this is that (auto) vectorization in Rust looks something like this: iter.chunks(32).map(vectorized)

Where the vectorized function checks whether the chunk has length 32: if yes, run the algorithm; if not, run the exact same algorithm.

The compiler knows that the chunk has a fixed size at compile time in the first block, which means it can now attempt to vectorize the algorithm with a SIMD size of 32. The else block handles the scalar case, where the chunk is smaller than 32.

collinvandyck76 1/4/2026||
Hah I love things like this, where the compiler leaks out.
vanviegen 1/1/2026||
Without the borrow checker, how should memory be managed? Just never deallocate?
masklinn 1/1/2026|||
The borrow checker does not deal with ownership, which is what rust’s memory management leverages. The borrow checker validates that borrows (references) are valid aka that they don’t outlive their sources and that exclusive borrows don’t overlap.

The borrow checker does not influence codegen at all.

flohofwoe 1/1/2026||||
It would be the same as in any language with manual memory management, you'd simply get a dangling pointer access. The 'move-by-default' semantics of Rust just makes this a lot trickier than in a 'copy-by-default' language though.

It's actually interesting to me that the Rust borrow checker can 'simply' be disabled (e.g. no language- or stdlib-features really depending on the borrow checker pass) - not that it's very useful in practice though.

eptcyka 1/1/2026|||
The same as C++, destructors get called when an object goes out of scope.
sakisv 1/1/2026||
As someone who's only done a couple of small toy projects in Rust, I was never annoyed by the borrow checker. I find it nothing but a small mental shift, and I kinda like it.

What I _do_ find annoying, though, and cannot wrap my head around, are lifetimes. Every time I think I understand them, I end up getting it wrong.

Maxatar 1/1/2026||
Lifetimes are the input to the borrow checker, so it doesn't make much sense to say you have never been bothered by the borrow checker but you are bothered by lifetimes.
ViewTrick1002 1/1/2026|||
Due to lifetime elision you can mostly skip lifetimes if you leave a bit of performance on the table.
bluehex 1/2/2026||
How does lifetime elision affect performance? I thought the compiler just inferred lifetimes that you would have had to manually annotate. Naively, it seems to me that the performance should be identical.
steveklabnik 1/2/2026||
Strictly speaking, elision just adds lifetimes based on common patterns, so yes, it wouldn't directly affect performance.

I believe your parent is implying that if you skip using a lifetime and do something else instead to make it easier, that may be less performant.

ViewTrick1002 1/2/2026|||
Exactly.

Cloning values, collecting iterators into Vecs and then continue the transformation rather than keeping it lazy all the way through. Skipping structs/enums with references.

adastra22 1/2/2026|||
I think his point is the lifetime you’d put there is identical to the lifetime that is inferred/elided. So there is literally no difference.
steveklabnik 1/3/2026||
I thought they meant the case where you go "ugh, I don't want to write a lifetime here" and then change your code, because you have to. If you don't have to, then yes, there's literally no difference.
sakisv 1/3/2026|||
Ah, that's a fair point. In that case then yes, I have been bothered by the borrow checker very much indeed lol.
culebron21 1/4/2026||
Somehow at some point lifetimes started clicking for me. I think the key was when I understood that in a function, not everything needs the same lifetime. e.g.

    struct MyRef<'a> { item: &'a i64 }
    struct MyType { items: Vec<i64> }
    impl MyType {
        fn get_some_ref(&self, key: &usize) -> MyRef<'a> {
        MyRef { item: &self.items[*key] }
        }
    }
the function written this way is incorrect. When I was a beginner, I would have thought that every & in the signature needed a lifetime. In slightly complicated code, the borrow checker then demanded lifetimes everywhere; they'd infect everything, but the code would still never compile.

Then I understood that not everything needs to be tied to the same lifetime. In this example, MyRef should be tied to the MyType instance (`self`), but not to `key`:

        fn get_some_ref<'a>(&'a self, key: &usize) -> MyRef<'a>
lamontcg 1/1/2026||
The more I write code in other languages where I think hard about ownership ("does this method ultimately grab the object and throw a ref onto some long-lived data structure somewhere? Then it owns it, so I'd better clone it"), the more robust my code in those languages generally gets. Same with mutation: it's generally better to make a copy of something, mess with it, and throw it away than to try to mutate-then-unmutate or something like that, even though the latter might in principle be nanoseconds faster. This eliminates loads of spooky-action-at-a-distance bugs where things get mutated in one spot and used in another, when there should have been a copy in between.
dmitrygr 1/1/2026|
> Generally better to make a copy of something

> Eliminate loads of spooky-action-a-distance bugs

This line of thinking so sickens me. Many things are not easy when done right. That is no excuse to avoid understanding how to do them right. Sure, making endless copies is easier. But this is why machines now need 16GB of ram and four cores to run the calculator.

LexiMax 1/1/2026||
> But this is why machines now need 16GB of ram and four cores to run the calculator.

This has more to do with the fact that the web stack has become the de-facto app development platform, and thus we inherit the bloat and optimization oversights of that platform.

You're not going to make a 16GB calculator just because you personally prefer copies over shared ownership in a language that gives you the tools to avoid bloat in a myriad of ways.

andrewshadura 1/1/2026||
I don’t have the slightest idea why anyone would want this. Borrow checking is one of the greatest benefits of Rust.
Someone1234 1/1/2026|
It is funny.
zemo 1/1/2026|
way too many people in this thread are taking this project seriously
More comments...