Posted by iliketrains 12/28/2025

Unity's Mono problem: C# code runs slower than it should (marekfiser.com)
279 points | 181 comments
boguscoder 12/29/2025|
Good article, but it seems strange that the author benchmarked debug builds first; that's a huge "no-no" in any perf tweaking, and it's clear that the author knows this well
iliketrains 12/29/2025||
From my experience, performance gains seen in Debug builds in Unity/C#/Mono nearly always translate into gains in Release mode. I know that this is not always true, but in this context that's my experience.

Setting up release benchmarks is much more complex and we develop the game in Debug mode, so it is very natural to get the first results there, and if promising, validate them in Release.

Also, since our team works in Debug mode, even gains that only speed things up in Debug mode are valuable for us, but I haven't encountered a case where I would see 20%+ perf gain in Debug mode that would not translate to Release mode.

mort96 12/29/2025||
I agreed with you initially, but is it really as big of a deal in C#? I'm used to compiled languages where "debug build" means "no compiler optimisations", aka every operation done with a variable is a memory load + store, trivial functions aren't inlined so even trivial accessors called in a loop carry function call overhead, etc etc. But this is C#, so the JIT presumably optimises just as much in a debug build as in a release build?

So in C++ terms, it's really just benchmarking "-O2" instead of "-O2 -DNDEBUG". This seems fine.

MindSpunk 12/29/2025||
CoreCLR doesn't help on console platforms because you can't ship the JIT runtime. To my knowledge CoreCLR's AOT solution won't work because of SDK and build requirements for shipping builds on consoles. I believe some consoles require that all shipped native code must have been compiled with the SDK compiler. Even if you fork the CoreCLR AOT system so you can build for the consoles (the code can't be open because of NDAs) you wouldn't be allowed to ship the binary. IL2CPP is the only path forward there. CoreCLR is only viable on PC.
CyanLite2 12/29/2025||
Simply not true, this info is outdated by a decade.

CoreCLR NativeAOT is already shipping real games on Nintendo, PS5, and Xbox.

JIT isn't allowed on iPhones either, and this is what NativeAOT solves. Also, .NET is moving WASM support to CoreCLR (rather than mono) in an upcoming version as well.

MindSpunk 12/29/2025||
Do you have examples? As far as I'm aware based on current info there's at least one current console vendor that requires all native code to be generated by their SDK.
neonsunset 12/29/2025||
Just don't ship to PlayStation, and discourage others from doing so, until Sony changes (or is forced to change) its policy.
pjmlp 12/29/2025||
Capcom does, and they are quite happy with it.
pjmlp 12/29/2025||
Yes, it does, Capcom is using it for their Playstation 5 games, like Devil May Cry.

"RE:2023 C# 8.0 / .NET Support for Game Code, and the Future"

https://www.youtube.com/watch?v=tDUY90yIC7U

As always, it is a matter of having the skill to deliver, instead of GC phobia.

MindSpunk 12/29/2025||
If I'm interpreting that correctly they're using an IL2CPP compilation system that hooks into Roslyn and not using .NET Core's AOT technology. It's possible to ship C# on consoles, obviously, because Unity already does it with their own IL2CPP backend that's stuck on the old .NET versions. My point is that CoreCLR can't be used because of console certification requirements. I certainly wasn't commenting on C# as language for games. I think C# is, as of late, becoming a very powerful language for games with all the Span and similar tools to minimize GC pressure.
Rochus 12/28/2025||
That's interesting. I made measurements with Mono and CoreCLR some years ago, but only with a single thread, and I came to the conclusion that their performance was essentially the same (see https://rochus.hashnode.dev/is-the-mono-clr-really-slower-th...). Can someone explain what benchmarks were actually used? Was it just the "Simple benchmark code" in listing 1?
to11mtm 12/28/2025||
I think a lot of the devil is in the details, especially when we look at NET8/NET10 and the various other 'boosts' they have added to code.

But also, as far as this article goes, it's noting a more specific use case that is fairly 'real world': reading a file (I/O), doing some form of deserialization (likely with a library unless the format is proprietary), and whatever 'generating a map' means.

Again, this all feels pretty realistic for a use case so it's good food for thought.

> Can someone explain what benchmarks were actually used?

This honestly would be useful in the article itself, as well as, per above, some 'deep dives' into where the performance issues were. Was it in file I/O (possibly Interop related?) Was it due to some pattern in the serialization library? Was it the object allocation pattern (When I think of C# code friendly for Mono I think of Cysharp libraries which sometimes do curious things)? Not diving deeper into the profiling doesn't help anyone know where the focus needs to be (unless it's a more general thing, in which case I'd hope for a better deep dive on that aspect.)

Edited to add:

Reading your article again, I wonder whether your compiler is just not doing the right things to take advantage of the performance boosts available via CoreCLR?

E.g. can you do things like stackalloc temp buffers to avoid allocation, and does the stdlib do those things where it is advantageous?

Also, I know I vaguely hit on this above, but I'm also wondering whether the IL being generated is just 'not hitting the pattern': a lot of CoreCLR will do its best magic if things are arranged a specific way in IL, based on how Roslyn outputs it, but even for the 'expected' C# case, deviations can break the opt.

Rochus 12/29/2025|||
The goal of my compiler is not to get maximum performance out of either CoreCLR or Mono. Just look at it as a random compiler which is not C#, and especially not MS's C#, which is highly in sync with and optimized for specific features of the CoreCLR engine (features which might appear in a future ECMA-335 standard). So the test essentially was to see what both CoreCLR and Mono do with non-optimized CIL generated by a compiler that is not their own. This is a legal test case, because ECMA-335 and its compatible CLRs were built exactly for this use case. Yes, the CIL output of my compiler could be improved a lot, and I could get even more performance out of e.g. CoreCLR by using specific knowledge of the engine (which you cannot find in the standard), knowledge which the MS C# compiler also uses. But that was not my goal. Both engines got the same CIL code and I just measured how fast it ran on each engine on the same machine.
WorldMaker 12/29/2025|||
> Reading your article again, I wonder whether your compiler is just not doing the right things to take advantage of the performance boosts available via CoreCLR?

> E.g. can you do things like stackalloc temp buffers to avoid allocation, and does the stdlib do those things where it is advantageous?

The C# standard lib (often called the base class library or BCL) has seen a ton of Span<T>/Memory<T>/stackalloc internal usage adoption in .NET 6+, with each release adding more of them. Things like File IO and serialization/deserialization particularly see a lot of notable performance improvements just from upgrading each .NET version. .NET10 is faster than .NET9 with a lot of the same code, and so forth.

Mono still benefits from some of these BCL improvements (as more of the BCL is shared than not these days, and Blazor WASM for the moment is still more Mono than CoreCLR so some investment has continued), but not all of them and not always in the same ways.
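As a rough illustration of the pattern being discussed (a hypothetical sketch with made-up names, not actual BCL code), the stackalloc-plus-Span idiom uses a stack buffer for small temporaries and only falls back to the heap for large inputs:

```csharp
using System;

static class Hex
{
    // Format a byte span as lowercase hex without allocating a temporary
    // array: small inputs use a stack buffer, large ones fall back to the heap.
    public static string ToHex(ReadOnlySpan<byte> data)
    {
        Span<char> buffer = data.Length <= 128
            ? stackalloc char[data.Length * 2]   // no GC allocation
            : new char[data.Length * 2];         // heap fallback for big inputs

        for (int i = 0; i < data.Length; i++)
            data[i].TryFormat(buffer.Slice(i * 2, 2), out _, "x2");

        return new string(buffer);
    }
}
```

The conditional stackalloc assignment to a `Span<char>` requires C# 8+; the BCL applies this kind of trick internally in many formatting and I/O paths, which is part of why upgrading the .NET version alone yields speedups.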

to11mtm 12/29/2025||
> The C# standard lib (often called the base class library or BCL) has seen a ton of Span<T>/Memory<T>/stackalloc internal usage adoption in .NET 6+, with each release adding more of them. Things like File IO and serialization/deserialization particularly see a lot of notable performance improvements just from upgrading each .NET version. .NET10 is faster than .NET9 with a lot of the same code, and so forth.

I worded my reply poorly; I mostly meant 'if Oberon has its own stdlib, is it doing the modern performant practice?'

eterm 12/28/2025|||
What's going on with the Mandelbrot result in that post?

I don't believe such a large regression from .NET Framework to CoreCLR.

to11mtm 12/28/2025|||
NGL would be nice if there was a clear link to the cases used both for OP as well as who you are replying to... Kinda get it in OP's case tho.
Rochus 12/29/2025||
I measured the raw horsepower of the JIT engine itself, not the speed of the standard library (BCL). My results show that the Mono engine is surprisingly capable when executing pure IL code, and that much of the 'slowness' people attribute to Mono actually comes from the libraries, not the runtime itself.

In contrast, the posted article uses a very specific, non-standard, "apples-to-oranges" benchmark. It is essentially comparing a complete game engine initialization against a minimal console app (as far as I understand), which explains the massive 3x-15x differences reported. The author is actually measuring "Unity Engine Overhead + Mono vs. Raw .NET", not "Mono vs. .NET" as advertised. The "15x" figure very likely comes from the specific microbenchmark (struct heavy loop) where Mono's optimizer fails, extrapolated to imply the whole runtime is that much slower.
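For context, a "struct heavy loop" microbenchmark of the kind described above typically looks something like this (a hypothetical sketch, not the article's actual benchmark code):

```csharp
using System;
using System.Diagnostics;

struct Vec3
{
    public float X, Y, Z;
    public Vec3(float x, float y, float z) { X = x; Y = y; Z = z; }
    public static Vec3 operator +(Vec3 a, Vec3 b)
        => new Vec3(a.X + b.X, a.Y + b.Y, a.Z + b.Z);
}

static class StructLoop
{
    // A tight loop over small value types. A strong JIT keeps `acc` in
    // registers; an optimizer that fails on this pattern copies the
    // struct through memory on every operator call, which is how
    // double-digit slowdown factors can appear on the same code.
    public static Vec3 Sum(Vec3[] items)
    {
        var acc = new Vec3(0, 0, 0);
        for (int i = 0; i < items.Length; i++)
            acc += items[i];
        return acc;
    }

    public static double TimeMs(int n)
    {
        var data = new Vec3[n];
        for (int i = 0; i < n; i++)
            data[i] = new Vec3(1, 2, 3);

        var sw = Stopwatch.StartNew();
        Sum(data);
        sw.Stop();
        return sw.Elapsed.TotalMilliseconds;
    }
}
```

Measuring only a loop like this and extrapolating to the whole runtime is exactly the kind of apples-to-oranges step being criticized here.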

eterm 12/29/2025||
Can we reproduce your results for Mandelbrot?
Rochus 12/29/2025||
You can find all necessary information/data in the article (see references). Finding the same hardware that I used might be an issue, though. Concerning Mandelbrot, I wouldn't spend too much time on it, because the runtime was so short for some targets that it likely has a big error margin compared to the other results. For my purpose this is not critical because of the geometric mean over all factors.
to11mtm 12/29/2025||
I think we are trying to find something like 'can we pull this branch/commit/etc and build it to reproduce'.
Rochus 12/29/2025|||
The Mono and .Net 4 times were too short; the true time is unknown. I only left the Mandelbrot result because I got a decently looking figure for CoreCLR, but the actual factor to Mono is unreliable. If the Mono result was 1, the factor would still be seven. I have no idea why it is that much faster.
LeFantome 12/29/2025||
I think the “some years ago” is pretty relevant.

.NET has heavily invested in performance. If I understand your article correctly, you tested .NET 5 which will be much slower at this point than .NET 10 is.

I also think it matters what you mean by “Mono”. Mono, the original stand-alone project has not seen meaningful updates in many years. Mono is also one of the two runtimes in the currently shipping .NET though and I suspect this runtime has received a lot of love that may not have flowed back to the original Mono project.

calebh 12/28/2025||
Will the move to CoreCLR give any speed ups in practice if the release build is compiled with IL2CPP anyway? On all the games that I've worked on, IL2CPP is one of the first things that we've enabled, and the performance difference between the editor and release version is very noticeable.
Rohansi 12/28/2025|
Editor is slower than Mono release builds. You'll need to compare Mono release vs. IL2CPP release to see the actual difference.
calebh 12/29/2025||
I guess it would be good to also see a comparison between IL2CPP and Core CLR by the post author!
pwdisswordfishy 12/29/2025||
The author (probably unknowingly) glosses over a lot in these sentences of the "How did we get here" section:

> Unity uses the Mono framework to run C# programs and back in 2006 it was one of the only viable multi-platform implementations of .NET. Mono is also open-source, allowing Unity to do some tweaks to better suit game development. [...] An interesting twist happened nearly 10 years later.

Not mentioned is that Mono itself of course improved a lot over the years, and even prior to MS's about-face on open source, it was well known that Unity was hindered by sticking with an old and out-of-date Mono, and they were very successful at deflecting the blame for this by throwing the Mono folks under the bus. Any time complaints about game developers' inability to use newer C# features came up, Mono/Xamarin would invariably receive the ire* because Unity couldn't come to an agreement with them about their license and consulting fees. (Mono was open source under LGPL instead of MIT licensed at the time, and Unity had initially bought a commercial license that allowed them exemptions from even the soft copyleft provisions in the LGPL, but the exemption was limited and not, for example, for any and all future versions, too, indefinitely.) Reportedly, they were trying to charge too much (whatever that means) for Unity's attempts to upgrade to the modern versions.

It's now 10+ years later, and both Mono (after being relicensed under MIT) and .NET's CoreCLR are available to Unity at no cost, but despite this it still took Unity years before they upgraded their C# language support and moved to a slightly more modern runtime.

Something else to note: Although, LGPL isn't inherently incompatible with commercial use or even use in closed source software, one sticking point was that some of the platforms Unity wished to be able to deploy have developer/publisher restrictions that are incompatible with the soft copyleft terms in the LGPL that require that users (or in this case game developers) be allowed to relink against other versions of the covered software (including, for example, newer releases). Perversely, it's because Unity sought and obtained exemptions to the LGPL that both end users and game developers were hamstrung and kept from being able to upgrade Mono themselves! (It wouldn't have helped on, say, locked down platforms like Nintendo's for example, but certainly would have been viable on platforms without the first-party restrictions, like PC gaming or Android.)

By now, Unity has gone on to pull a lot of other shenanigans with their own pricing that seems to have sufficiently pissed off the game development community, but it should still not be forgotten when they were willing to pass the blame to an open source project over the development/support that the company felt it was entitled to for a price lower than they were told it would cost, and that they themselves were slow to make progress on even when the price of the exemption literally became $0.

* you can find threads with these sorts of comments from during this period right here on HN, too, but it was everywhere

littlecranky67 12/29/2025|
> it was well known that Unity was hindered by sticking with an old and out-of-date Mono, and they were very successful at deflecting the blame

So much this. According to a 2023 blog article from Unity [0], Unity uses the Boehm GC. But Mono itself introduced another, generational GC called SGen [1] more than 10 years ago, which became the default at some point. It is just that Unity is stuck on old Mono versions, missing out on all the changes and improvements that went into Mono after their fork, essentially.

[0]: https://unity.com/blog/engine-platform/porting-unity-to-core... [1]: https://www.mono-project.com/docs/advanced/garbage-collector...

rincebrain 12/29/2025||
A sibling comment [1] remarks that they play games with raw pointers that are incompatible with the newer GC, so it's not "just" an older runtime that's biting them in the ass.

[1] - https://news.ycombinator.com/item?id=46415568

enbugger 12/29/2025||
Not to mention the Hot Reload which comes out of the box.
LarsDu88 12/29/2025||
Ah I wonder if this could've saved me countless hours of optimizing my VR game Rogue Stargun for the Quest 2, particularly the final battle in the game
KronisLV 12/29/2025||
It's cool to see detailed traces and flame graphs be used more often! A lot of different problems could be detected if they were available for pretty much any language, with enough details and tooling to be useful. Heh, I remember also using VisualVM for finding issues with a web app HTTP thread pool and later with the SQL queries being executed (and also the DB pooling solution).
viktorcode 12/29/2025||
I wonder why the author doesn't use IL2CPP and sticks to Mono. IL2CPP does produce much faster code, making Mono builds obsolete. This should be the very first step you do if you care at all about performance in Unity.
NooneAtAll3 12/29/2025|
my main problem with Unity games is never the speed - it's the outrageous RAM usage