
Posted by jnord 1 day ago

Ubuntu now requires more RAM than Windows 11(www.howtogeek.com)
141 points | 187 comments
senfiaj 1 day ago|
From my understanding this is an official statement, not a benchmark result.

> The change isn't about the core operating system becoming resource-hungry. Instead, it reflects the way people use computers today—multiple browser tabs, web apps, and multitasking workflows, all of which demand additional memory.

So it's more about third-party software than the OS or desktop environment. And nowadays 8+ GB of RAM is the common recommendation regardless of OS.

I just checked the memory usage on Ubuntu 24.04 LTS after closing all the browser tabs. It's about 2GB of 16GB total RAM. 26.04 LTS might have higher RAM usage but it seems unlikely that it will get anywhere close to 6GB.
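For anyone who wants to reproduce that check, here's a rough sketch. It approximates "used" the way free(1) does (total minus available); the only assumption is the standard /proc/meminfo format with values in kB:

```python
def parse_meminfo(text):
    """Parse /proc/meminfo-style text into {field: value_in_kB}."""
    fields = {}
    for line in text.splitlines():
        key, _, rest = line.partition(":")
        if rest.strip():
            fields[key.strip()] = int(rest.split()[0])  # values are in kB
    return fields

def used_gib(fields):
    # "used" here means total minus available, roughly what free(1) reports
    return (fields["MemTotal"] - fields["MemAvailable"]) / (1024 * 1024)

# Usage (on a Linux box):
#   info = parse_meminfo(open("/proc/meminfo").read())
#   print(f"{used_gib(info):.1f} GiB used of {info['MemTotal'] / 1024**2:.1f} GiB")
```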

HauntingPin 1 day ago||
Also, the Windows 11 requirements are ludicrous.

https://www.microsoft.com/en-us/windows/windows-11-specifica...

4GB of RAM? What? I guess if your minimum is "able to start Windows and eventually reach the desktop", sure? I wouldn't even use Windows 11 with 8GB even though it would theoretically be okay.

mpyne 1 day ago|||
> 4GB of RAM? What? I guess if your minimum is "able to start Windows and eventually reach the desktop", sure? I wouldn't even use Windows 11 with 8GB even though it would theoretically be okay.

Not okay as soon as you throw on the first security tool, lol.

I work in an enterprise environment with Win 11 where 16 GB is maxed out instantly as soon as you open the first browser tab, thanks to the background security scans and patch updates. This is even with memory compression turned on.

ralferoo 12 hours ago||
I was about to rush to the defence of Windows 11, thinking it couldn't possibly be that bad, so I just checked mine. I booted a couple of hours ago and have done nothing apart from running Chrome and PuTTY, plus whatever runs on startup.

Apparently 13.6GB is in use (out of 64GB), and of that 4.7GB is Chrome. Yeah, I'm glad I'm not running this on an 8GB machine!

winrid 1 day ago||||
Win11 IoT runs great on 4GB, if that matters :) I have a few machines in the field running it and my Java app, usually with over a gig still free.
lousken 21 hours ago|||
Yeah, Windows requirements are a meme. Maybe it could barely work with IoT LTSC for non-interactive tasks, but definitely not with the regular versions. Even Windows 10 would just barely hold up. Same with HDD space.

Current minimum specs should be more like

2 cores, 2GHz min with SSE4.2

128GB SSD

8GB RAM

mpol 1 day ago|||
It's not just the applications; the installer doesn't even start up with 1GiB of memory. With 2GiB of memory it does. You could (well, I would :) ) blame it on the Gnome desktop, but it is very different from what I would have expected.

I just tested this with 25.10 desktop, default gnome. With 24.04 LTS it doesn't even start up with 2GiB.

senfiaj 1 day ago||
So you mean that with 2GiB of RAM the installer started up on 25.10 but not on 24.04? And were you then able to install and boot the installed Ubuntu?
panarky 1 day ago|||
If you run Windows 11 with Microsoft Teams and Microsoft Outlook on a 4GB machine you're gonna have a bad day.
lousken 21 hours ago||
If you run it on an 8GB machine you have a bad day as well; no room left for Chrome.
CoolGuySteve 1 day ago|||
No, because as far as we know 26.04 won't enable zswap or zram, whereas Windows and macOS both have memory compression technology of some sort. So Ubuntu will use significantly more memory for most tasks when facing memory pressure.

Apparently it's still under discussion, but it's April now, so it seems unlikely.

Kind of weird how controversial it is considering DOS had QEMM386 way back in 1987.
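You don't have to wait on Canonical's decision, though; a minimal sketch using the zram-tools package (the package name, config keys, and service name here are from Debian/Ubuntu's zram-tools as I remember them — treat them as assumptions and check your release):

```
# sudo apt install zram-tools
# then edit /etc/default/zramswap:
ALGO=zstd      # compression algorithm for the zram device
PERCENT=50     # size the zram swap at 50% of RAM
# sudo systemctl restart zramswap
```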

functional_dev 23 hours ago|||
Interesting, I never realized macOS compresses memory in place before swapping. Linux just dumps straight to disk; no wonder it freezes.

Found a good visual explainer on this: https://vectree.io/c/linux-virtual-memory-swap-oom-killer-vs...

cogman10 1 day ago||||
Zswap is a no-brainer. I have to wonder why the hesitancy.
bzzzt 1 day ago|||
QEMM386 for DOS did not have a memory compression feature. Only one of the later versions for Windows 3.1 did.
roryirvine 1 day ago||
CPUs really weren't up to the job in the pre-Pentium/PowerPC world. Back then, zip files used to take an appreciable number of seconds to decompress, and there was a market for JPEG viewers written in hand-optimised assembly.

That's why SoftRAM gained infamy - they discovered during testing that swapping was so much faster than compression that the released version simply doubled the Windows swap file size and didn't actually compress RAM at all, despite their claims (and they ended up being sued into oblivion as a result...)

Over on the Mac, RAM Doubler really did do compression, but it a) ran like treacle on the '030, b) needed a bunch of kernel hacks, so it had compatibility issues with the sort of "clever" software that actually required the most RAM, and c) PowerMac users tended to have enough RAM anyway.

Disk compression programs did a bit better: DiskDoubler, Stacker, DoubleSpace et al. ISTR that Microsoft managed to infringe on Stacker's patents (or maybe even the copyright?) in MS-DOS 6.2, and had to hastily release DOS 6.22 with a rewritten version, free of charge, as a result. These tools were more successful partly because they coincided with a general reduction in HDD latency happening at roughly the same time.

Lerc 1 day ago|||
I know 2GB isn't very heavy in OS terms these days, but it's still enough to hold nearly 350 uncompressed 1080p 24-bit images.

There's rather a lot of information in a single uncompressed 1080p image. I can't help but wonder what it all gets used for.
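That "nearly 350" figure is easy to sanity-check (plain arithmetic, assuming 3 bytes per pixel for 24-bit color):

```python
# One uncompressed 1080p frame at 24-bit color (3 bytes per pixel)
frame_bytes = 1920 * 1080 * 3              # 6,220,800 bytes, ~5.9 MiB

# How many such frames fit in 2 GiB?
frames_in_2gib = (2 * 1024**3) // frame_bytes
print(frames_in_2gib)                      # 345
```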

array_key_first 1 day ago|||
A lot of it is optimizing applications for higher-memory devices. RAM is completely worthless if it's not used, so ideally you should be running your software with close to maximum RAM usage for your device. Of course, the software developer doesn't necessarily know what device you will be using, or how much other software will be running, so they aim for averages.

For example, Java applications will claim much more memory than they need for the heap. Most of that memory will be unused, but it's necessary to have a faster running application. If you've ever run a Java app at consistently 90% heap usage, you know it grinds to an absolute halt with constant collection.

The same is true for caching techniques. Reading from storage is slow, so it often makes sense to put stuff in RAM even if you're not using it very often.
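That storage-vs-RAM tradeoff is literally one decorator in Python; a generic sketch (`load_blob` and its byte payload are hypothetical stand-ins for a slow disk read):

```python
from functools import lru_cache

calls = 0  # counts how often we actually hit "storage"

@lru_cache(maxsize=256)  # trade RAM for speed: keep up to 256 results resident
def load_blob(key):
    # Hypothetical stand-in for a slow storage read; the result
    # lives in RAM after the first call.
    global calls
    calls += 1
    return b"x" * 1024  # pretend this came off disk

load_blob("icon.png")
load_blob("icon.png")  # served from RAM, no second "disk" hit
print(calls)           # 1
```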

senfiaj 1 day ago||||
I also believe this memory usage could be decreased significantly, though I don't know by how much (or how much it would be worth it). Some RAM usage is useful, such as caching or graphics-related buffers. Some is cumulative bloat in applications, caused by not caring much about optimization or by duplication of bundled libraries.

But I remember that in 2016, a decade ago, Fedora GNOME consumed about 1.6GB of RAM on my PC with 2GB of RAM. Considering that a decade later the standard Ubuntu GNOME consumes only about 400MB more, and that my new laptop has 16GB of RAM (the system may use more RAM when more is installed), I think the increase is not that bad for a decade. I thought it would be much worse.

jonhohle 1 day ago|||
But why that much? The first computer I bought had 192MB of RAM, and I ran a 1600x1200 desktop with 24-bit color. When Windows 2000 came out, all of the transparency effects ran great. Office worked fine, and so did Visual Studio and 1024x768 gaming (I know that's quite a step down from 1080p).

What has changed? Why do I need 10x the RAM to open a handful of terminals and a text editor?

Someone 1 day ago|||
> and I ran a 1600x1200 desktop with 24-bit color

> What has changed? Why do I need 10x the RAM to open a handful of terminals and a text editor?

It’s not a factor of ten, but a 4K monitor has about four times as many pixels. Cached font bitmaps scale with that, photos take more memory, etc.

> When Windows 2000 came out

In those times, when part of a window became uncovered, the OS would ask the application to redraw that part. Nowadays, the OS knows what’s there because it keeps the pixels around, so it can bitblit the pixels in.

Again, not a factor of ten, but it contributes.

The number of background processes likely also increased, and chances are you used to run fewer applications at the same time. Your handful of terminals may be a bit fuller now than it was back then.

Neither of those really explains why you need gigabytes of RAM nowadays, though, but then they didn't explain why Windows 2000 needed whatever it needed at its time, either.

The main real reason is “because we can afford to”.

senfiaj 1 day ago||||
Partly because we have more layers of abstraction. Just an extreme example, when you open a tiny < 1KB HTML file on any modern browser the tab memory consumption will still be on the order of tens, if not hundreds of megabytes. This is because the browser has to load / initialize all its huge runtime environment (JS / DOM / CSS, graphics, etc) even though that tiny HTML file might use a tiny fraction of the browser features.

Partly because increased RAM usage can sometimes improve execution speed / smoothness or security (caching, browser tab isolation).

Partly because developers have less pressure to optimize software performance, so they optimize other things, such as development time.

Here is an article about bloat: https://waspdev.com/articles/2025-11-04/some-software-bloat-...

tosti 1 day ago||||
Two programmers sat at a table, a youngster and an older guy with a large beard. The old guy was asked: "You. Yeah, you. Why the heck did you need 64K of RAM?". The old man replied: "To land on the moon!". Then the youngster was asked: "And you, why oh why did you need 4 gigs?". The youngster replied: "To run MS Word!"
winrid 1 day ago|||
Higher res icons probably add a couple hundred megs alone
Lerc 17 hours ago||
Well if you have a 512x512 icon uncompressed it is an even megabyte, so that makes the calculations fairly easy.

But raw imagery is one of the few cases where you can legitimately require large amounts of RAM, because area scales quadratically. You only need that raw form in the limited number of situations where you are actually manipulating the pixel data, though. If you are dealing with images without descending to the pixel level, there's pretty much no reason to keep it all floating around in that form. You generally don't have more than a hundred icons onscreen, and once you start fetching data from the slowest RAM in your machine, you get pretty decent speed gains from decompressing on the fly rather than moving the uncompressed form around.

KronisLV 1 day ago|||
I remember running Xubuntu (XFCE) and Lubuntu (LXDE, before LXQt) on a laptop with 4 GB of RAM and it was a pretty pleasant experience! My guess is that the desktop environment is the culprit for most modern distros!
abenga 1 day ago||
Gnome 50 and its auxiliary services use maybe 400MB on my machine.

The culprit is browsers, mostly.

adgjlsfhk1 1 day ago|||
Well, to start, you likely have two screen-sized buffers, for the current and next frames. The primary code footprint is drivers, since the modern expectation is that you can plug in pretty much anything and have it work automatically.
Lerc 1 day ago||
How often do you plug in a new device without a flurry of disk activity occurring?
rdsubhas 1 day ago||
That's subjective, and I would be more comfortable if that were called recommended memory, not minimum memory.

Minimum memory, as used in this change, sets a completely different expectation.

goalieca 1 day ago||
I hear from a lot of Linux users who found the GTK 2 era on X11 pretty close to perfect. I know that when I ran Ubuntu back then, it used far less than 1GB after boot. The desktop experience was perhaps even slightly more polished than what we have today. Not much has fundamentally changed except the bloat, plus a regression in UX when they started chasing fads.

I suppose the biggest change in RAM usage is Electron and the bloated world of text editors and other simple apps written in it.

john01dav 1 day ago||
Just stick XFCE on a modern minimal-ish distribution (meaning not Ubuntu, mainly) and you'll have this with modern compatibility. Debian and Fedora are both good options. If you want something more minimal as your XFCE base, there are other options too.
mrob 1 day ago|||
XFCE is saddled with its GTK requirement, and GTK gets worse with every version. Even though XFCE is still on GTK3, that's a big downgrade from GTK2, because it forces you to run Wayland if you don't want your GUI frame rate arbitrarily capped at 60 fps.

For people wanting the old-fashioned fast and simple GUI experience, I recommend LXQt.

jstanley 1 day ago||
What use is there in display frame rates above 60 fps?
mrob 1 day ago|||
It makes it easier to treat the computer as part of your own body, allowing operation without conscious thought, as you would a pencil or similar hand tool.
tuetuopay 1 day ago||||
Outside of gaming, not much. However, now that I'm used to a 144Hz main monitor, there is no world in which I would go back. You just feel the difference.

So basically: no use if you've never tasted a 120+Hz display. And don't, because once you do, you won't go back.

bogwog 1 day ago||
I have a 165hz display that I use at 60hz. Running it at max speed while all I'm doing is writing code or browsing the web feels like a waste of electricity, and might even be bad for the display's longevity.

But for gaming, it really is hard to go back to 60.

tuetuopay 1 day ago||
Mine supports variable refresh rate, which means for most desktops tasks (I.e when nothing is moving), it runs at 48Hz.

Incredibly, Linux has better desktop support for it than Windows: DWM runs at full blast, while sway supports VRR on the desktop. Windows will only enable it for games (and only games that support it). Disclaimer: a Wayland compositor is required.

It’s not enabled by default on e.g. sway because on some GPU and monitor combos, it can make the display flicker. But if you can, give it a try!

jasomill 1 day ago|||
Windows 11 idles at around 60 Hz in 120 Hz modes on my VRR ("G-SYNC Compatible") display when the "Dynamic refresh rate" option is enabled, and supports VRR for applications other than games (e.g., fullscreen 24 FPS video playback runs at 48 Hz* via VRR rather than mode switching, even with "Dynamic refresh rate" disabled).

* The minimum variable refresh rate my display (LG C4) supports is 40 Hz.

tuetuopay 1 day ago||
Ah, then Windows 11 has at least one improvement compared to Windows 10 as far as user experience goes. Good to know, as I never made the jump
bogwog 1 day ago|||
I use KDE + Nvidia, and last I looked into it, it only worked if you had one monitor enabled. That's fine for gaming, not for working.

But it has been a while since I've tried it, maybe I should look into it again

tuetuopay 1 day ago||
AFAIK it’s very compositor-dependent. Works fine with disparate monitors on Sway + NVIDIA (1x144 VRR, 2x60 no VRR + 1x120 VRR when the TV is on).

The state of Wayland compositors moves fast, so support may be there by now. The last thing I'm waiting for is sway 1.12, which will bring HDR.

TacticalCoder 1 day ago||||
> What use is there in display frame rates above 60 fps?

On a CRT monitor, the difference between running at 60Hz and even a just slightly better 72Hz was night and day: unbearable flickering vs. a much better experience. I remember having some little utility for Windows that allowed the refresh rate to be 75Hz (not 72, but 75). Under Linux I was writing modelines myself (those were the days!) to get the refresh rate and screen size (in pixels) I liked: I ran "weird" resolutions like 832x604 @ 75Hz instead of 800x600 @ 60Hz, just to gain a little more screen real estate and a better refresh rate.

Now that monitors are flat panels, I sure as heck have no idea whether 60 vs. 120 fps changes anything for "desktop" usage. I don't think the problem CRTs had of the image fading too quickly at 60Hz is still present, but I'm not sure about it.

jasomill 1 day ago||
120 FPS vs 60 FPS is definitely noticeable for desktop use. Scrolling and dragging are night and day, but even simple mouse cursor movement is noticeably smoother.
Tade0 1 day ago|||
I, for one, lose track of the mouse way less often at 165Hz.
jstanley 1 day ago|||
I lose track of the mouse less often at 1024x768!
M95D 1 day ago|||
You need a bigger cursor.
Imustaskforhelp 1 day ago|||
MX Linux is really great for something like XFCE, and I really loved its snapshotting feature too. Highly recommended.
imcritic 1 day ago||
You spelled Debian wrong.
okeuro49 1 day ago|||
I used GTK2; it was OK, but I preferred Ubuntu's Unity interface when it came out.

Gnome 3 seems similar to Unity nowadays, and it is pretty good.

I find it much easier to use than Windows or Mac, which is credit to the engineers who work on it.

synergy20 1 day ago|||
It's always the browser; each tab is at least 100MB, and Electron is also a browser. GTK or whatever is nothing next to the browser.
shevy-java 1 day ago|||
The whole linux stack got bigger though - just look at what you need now to compile stuff, cmake, meson/ninja, mesa, llvm and so forth. gtk2 was great; GTK is now a GNOMEy-toolkit only, controlled by one main corporation. Systemd increased the bloat factor too - and also gathers age data of users now (https://github.com/systemd/systemd/pull/40954).

I guess one of the few smaller things would be wayland, but this has so few features that you have to wonder why it is even used.

curt15 1 day ago|||
>The whole linux stack got bigger though - just look at what you need now to compile stuff, cmake, meson/ninja, mesa, llvm and so forth

Those are all development tools. Has the runtime overhead grown proportionally, and what accounts for the extra weight?

array_key_first 1 day ago||
Runtime-wise, we use more garbage-collected languages now. Java and such are great and can be very high-performance; the real cost, though, is memory. GC languages need much more memory for bookkeeping, but they also need much more memory to be performant. Realistically, a Java app needs roughly 10x the memory of a similar C++ application to get good performance, because GC languages only perform well when most of their heap is unused.

As a side note, that's how GC languages can perform so well in benchmarks. If you run benchmarks that generate huge amounts of garbage or consistently run the heap at 90%+ usage, that's when you'll see an order-of-magnitude slowdown.

Oh also containers, lots more containerized applications on modern Linux desktops.

jasomill 1 day ago||
Programs that manually allocate and deallocate memory to store "huge amounts of garbage" can easily incur more memory management overhead than programs using garbage collection to do the same.

If a Java application requires an order of magnitude more memory than a similar C++ application, it's probably only superficially similar, and not only "because GC".

array_key_first 1 day ago||
Well, no. In a manually memory-managed language, if you allocate one object, destroy it, and then allocate another, you've used one object's worth of memory.

In Java, that's two objects, one of which will be collected later.

What this means is that a C++ application running at 90% memory usage does about the same amount of work per allocation/deallocation as it would at 10% usage. The same IS NOT true for GC languages.

At 90% usage, each allocation and deallocation will be much more work, and will trigger collections and compactions.

It is absolutely true that GC languages can perform allocations cheaper than manual languages. But, this is only true at low amounts of heap usage. The closer you get to 100% heap usage, the less true this becomes. At 90% heap usage, you're gonna be looking at an order of magnitude of slowdown. And that's when you get those crazy statistics like 50% of program runtime being in GC collections.

So, GC languages really run best with more memory. Which is why both C# and Java pre-allocate much more memory than they need.

And keep in mind I'm only referring to the allocation itself, not the allocation strategy. GC languages also have very poor allocation strategies, particularly Java, where everything is boxed and stored separately.

goalieca 1 day ago||||
I've been using CMake since the early 2000s, when I was hacking on the VTK/ITK toolkits. Compiling a C++ program hasn't gotten any better or worse. FWIW, I always used the curses interface for it.
ScislaC 1 day ago|||
Is the option of legal compliance a bad thing? They have corporate customers.

If there's no opt-out, that's a different story.

GrayShade 1 day ago||
It's plain FUD. systemd always had fields for the full name, email address and location. They were optional, just like the date of birth. Bad systemd!
anthk 1 day ago||
It's not FUD; the full name, email, and the rest were not mandated by Meta and other corporations, which are lobbying for it so they can earn money with users' preferences. Take your spyware somewhere else.

If Meta's business model is not lucrative, that is not my problem.

gruez 1 day ago||
>which are lobbying for it so they can earn money with users' preferences

Given it's a field where you can put absolutely anything (and probably randomize it, if you want), how is this different from the situation today, where random sites ask you for your birthday (also unverified)? Moreover, Meta already has your birthday. It's already mandated for account creation, so claims of "so they can earn money with users' preferences" don't make any sense.

anthk 1 day ago||
Keep gaslighting:

https://www.theregister.com/2026/03/24/foss_age_verification...

Good luck when most libre users toss RH/Debian because of this and embrace GNU.

gruez 1 day ago||
>Keep gaslighting:

This is against HN guidelines: " Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."

>The contents of the field will be protected from modification except by users with root privileges.

So... most users?

superkuh 1 day ago|||
Yep. I still develop GTK2 applications today. It's a very snappy, low-resource-usage toolkit aimed entirely at desktop computers. None of that "mobile" convergence. I suppose you could put GTK2 applications into containers of some sort, but since GTK2 has (luckily) been left alone by GNOME for decades, it's a stable target (the way the NES or N64 is a stable target) and there's no need for it.

Most of the bloat these days is from containers, and Canonical's approach to Ubuntu since ~2014 has been very heavy on using upstream containers so they don't have to actually support their software ecosystem themselves. This has led to severe bloat and bad graphical theming and file-system access.

WD-42 1 day ago||
Can you point us to some of these gtk2 applications that you’ve been writing recently?
superkuh 1 day ago||
Sure, one is connmapperl. It is a server/client application where the server is a GUI map of the world showing all the established IP connections collected from the various clients, located via (local) geoip lookup. It stores everything in an sqlite db and has a bunch of config/filtering options: http://superkuh.com/connmapperl.html Technically it's a fork of the X11 connmap that I made because I couldn't get the original to run on my old X11, but with many, many more features (like offline whois from raw RIR dumps, the db, the hilbert mapping, replays of connection history, etc.).

Another one is memgaze, a program to visualize Linux process virtual memory spaces as RGB images and explore them using various binary visualization and sonification tools. I.e., you can click a hilbert map of all processes, then in the new window click around inside the image of that particular process' virtual RAM, and then listen to it interpreted as an 8-bit wav, or find and extract images, for example. Or search for strings, run digraph analysis, etc. http://superkuh.com/memgaze-page.html

Or feeed.pl, my very quick, low-resource-usage feed reader for 1000+ feeds, written in Perl/Gtk2 and text-only (no HTML, no images, etc.). It is really handy for loading .opml files and finding and fixing broken feeds, using the heuristics I hard-coded for locating feed URLs. http://superkuh.com/blog/2025-09-13-2.html

These are a few I made 2025-26 that other people might care to use. But I have a lot more that just scratch my own particular itches. Like a Perl/Gtk2 version of MS Paint that interprets arbitrary loaded and painted images as sound, or the things that I use to monitor my ISP uptime/speed, etc.

WD-42 1 day ago||
These look really cool, thanks for sharing.
IshKebab 1 day ago||
That's rose-tinted. I remember specifically switching to KDE because GTK apps of the day segfaulted all the time. Unfortunately KDE then screwed things up massively with Plasma (remember the universally loathed kidney bean?) and it's really only recovered recently.

And to say the desktop experience was more polished than what we have now is laughable. I remember that you couldn't have more than one application playing sound at the same time. At one point you had to manually configure Xfree86 to be aware that your mouse had a middle button. And good luck getting anything vaguely awkward like WiFi or suspend-to-ram working.

The Linux desktop is in a vastly better position now, even taking the Wayland mess into account.

intothemild 1 day ago||
Two things

First, it sounds like this 6GB requirement is more of a suggestion/recommendation than a hard requirement. I'm also curious whether it actually actively uses all 6GB. From my own usage of Linux over the years, the OS itself isn't using that much RAM; the applications are, and it's almost always the browser.

Secondly, I haven't used Ubuntu desktop in years, so I have no real idea if this is something specific to them. But I do use Fedora, and I would imagine the memory footprint can't be too different. While I could easily get away with <8GB of RAM, you really don't want to if you're going to do anything heavier than web browsing or editing documents: dev work, CAD, design, etc. But this isn't unique to Linux.

heelix 1 day ago||
Ubuntu just raised the minimum RAM requirement from 4GB to 6GB. While it might have been possible to run anything with a GUI on 4GB, I can't imagine that was a good experience.

When they turned Centos into streams, I cut my workstation over to Ubuntu. It has been a reasonable replacement. Only real issues were when dual booting Win10 horked my grub and snap being unable to sort itself on occasion. When they release 26 as an LTS, I'm planning to update. You are spot on - the desktop itself is reasonably lean. 100+ tabs in Firefox... less so. Mind you, the amount of RAM in the workstations I'm using could buy a used car these days.

hhh 1 day ago|||
I don't really get it. I have run fleets of thousands of devices running Chrome in a container on Ubuntu Server, and it's a nice experience. It took a lot to make it nice, but once it was there, it was rock solid. This was with 1GB of RAM on a Pi 3. When we swapped to the Pi 4, we suddenly had thousands of gigabytes of RAM and thousands of CPU cores sitting unused.
bee_rider 1 day ago|||
Does Firefox really not unload the tabs in that case?
foepys 1 day ago||
It does. You can also do it by hand via the tab's right-click menu.
sunshine-o 1 day ago||
I happened to install Fedora Silverblue on a computer a few days ago and looked quickly at the memory usage after boot: it was about 6GB! I usually run Alpine or FreeBSD, so I thought: great, this thing consumes 10x the RAM.

I believe Fedora and Ubuntu use about the same set of technologies: systemd, wayland, Gnome, etc. so it is about the same.

Apart from working out of the box, I do not really know what those distros have that I don't. I just have to admit that managing network interfaces is really easy in Gnome.

With the skyrocketing price of RAM this might finally be the year of the Linux desktop. But it is not gonna be Gnome I guess.

whatevaa 1 day ago||
Win11 barely works with 4GB. You can have a browser with YouTube open and that's it: 90%+ memory usage. I know because that's one of my media PCs (used instead of a smart TV).

I can't move it to Linux because it's an Intel Atom, and the Intel P-state driver for it is borked and was never fixed.

oreally 1 day ago||
Today's browsers tend to be huge memory hogs too. Software's attitude of "there's always more memory" is coming back to bite it as RAM prices increase.
senfiaj 1 day ago||
IMHO, browsers may deliberately prioritize execution speed over memory. There's a Pareto-style tradeoff: you can't optimize every parameter at once; improving one usually means sacrificing others. Also, higher memory consumption (unlike CPU usage) doesn't hurt power efficiency much, so using more memory might even help efficiency by reducing CPU usage through caching.
oreally 1 day ago||
TBH your comments come off as either very misleading or just uneducated on the nature of performance. Troubling indeed.
senfiaj 1 day ago||
Can you enlighten me why it's misleading or uneducated?
oreally 1 day ago||
[flagged]
senfiaj 1 day ago|||
You just read my comment very literally, or carelessly. I mean cases where it increases performance. It's not always true that more memory = less CPU usage, but in many situations there is a tendency. For example, on older Windows versions such as 98 or XP, applications drew directly to the screen and had to redraw the exposed parts of their UI when windows were dragged (BTW, this is why many people, including myself, remember that famous artifact effect when applications were unresponsive on older Windows versions). When memory became cheaper, Vista switched the rendering model to compositing, where applications render into private off-screen buffers. That is why moving windows became smoother, even though memory use went up. So there is some memory/performance tradeoff, though not always.
dijksterhuis 1 day ago||
> on older Windows, such as 98 or XP, applications had to redraw the parts of the exposed UI when windows were dragged (BTW, this is why many people, including me, remember that famous cascading effect when applications were unresponsive on older Windows versions)

i remember this and had no idea that's why it would be doing that. thanks, i learned something today.

Tade0 1 day ago|||
You won't get cache misses just because an application is using a lot of memory, if the reason is that garbage collection runs less frequently than it could.

That is the case with every mainstream JS engine out there and is one of the many tradeoffs of this kind.

bityard 1 day ago||
Since the dawn of time, Microsoft has published the minimum system requirements needed to run Windows, not what you need to actually do something useful with it.
reilly3000 1 day ago||
> Linux's advantage is slowly shrinking

This is garbage writing. Linux's advantages are numerous and growing, and Ubuntu ≠ Linux. WRT RAM requirements, Win 11's 4GB requirement isn't viable for daily use and doesn't represent any practical machine configuration that has the requisite TPM 2.0 module. On the other side, the Linux ecosystem offers a wide variety of minimal distributions that can run on ancient hardware.

Maybe I'm just grouchy today, but I would flag this content if sloppy MS PR were a valid reason.

leni536 1 day ago||
FWIW I find even KDE plasma on wayland perfectly viable on a 4 GiB budget notebook. Windows runs horribly on the same hardware.
osigurdson 1 day ago|||
Agree. I'm able to do development, run multiple containerized services (including Postgres, NATS, etc), have 10 browser tabs open, all on an 8 GiB laptop running Arch. I have a desktop with 64GiB as well but realized there is no point using it most of the time.
jasomill 1 day ago|||
Yes, and to the extent that you can do the same thing with Windows, it tends to be either unsupported (field-stripping desktop Windows to its core) or not viable ([legally] using embedded or server versions of Windows as desktop OSes).
listless 1 day ago||
I agree. And even on Ubuntu, the performance vs same specs on Windows is ridiculously better.

Apps are still a huge gap on Linux, but as an OS, I choose it every time over Windows and MacOS.

duckmysick 1 day ago||
For comparison, here are the official hardware recommendations for Debian: https://www.debian.org/releases/stable/amd64/ch03s04.en.html

"With Desktop" has 1GB minimum and 2GB recommended - along with Pentium 4, 1GHz cpu.

NekkoDroid 1 day ago|
> "With Desktop" has 1GB minimum and 2GB recommended - along with Pentium 4, 1GHz cpu.

This seems like a recommendation for just reaching the desktop itself, plus maybe some light usage. Anything more than that and the "recommendation" is fairly useless, given the memory-hogging apps that are commonly used.

chulcoop 7 hours ago||
I have the state-of-the-art Windows 11 Canary Version 29560.1000 (Beta) running on a PC with 4GB of RAM, which is not even officially supported. Ironically, Microsoft themselves overrode their own requirements for me: I was one of the volunteers from the Windows 10 Dev Channel days, and they wanted those who helped them create Windows 11 to see the final result as a "thank you", even though the hardware was not officially good enough.

I don't know how they've done it, but with the latest updates to Windows 11 (in the Canary Channel, optional 29000 series) it is VERY fast in Chrome, even on a PC with just 4GB of RAM.

So all those mocking and laughing, saying you might just barely get a window up, haven't a clue what they are talking about. It WAS terrible, but now it is much, much better. I don't know how they have done it, but they have. It might be due to Microsoft's effort to rewrite parts of the kernel in Rust, among other things.

laweijfmvo 1 day ago||

> Linux's advantage is slowly shrinking
Ubuntu is not Linux. Also, I would love to see Windows actually running on 4GB.
wrxd 1 day ago|
The framing of the article is very odd.

It says that Ubuntu increased the requirements not because of the OS itself but to give a better user experience when people have many browser tabs open. Then it compares this to Windows, which has lower nominal requirements but higher requirements in practice for a passable user experience.

More comments...