Posted by joexbayer 5 days ago

Building my childhood dream PC (fabiensanglard.net)
210 points | 82 comments
scns 1 day ago|
Even though it's off-topic, my favourite case is from a build by the same guy. Nowadays it would work with just a Ryzen 9700X:

https://fabiensanglard.net/the_beautiful_machine/index.html

[Edit] Maybe not completely off-topic, since it would be my dream PC.

wdfx 1 day ago||
I bought this case a couple of years ago after this article was linked here.

I love it. It's beautifully engineered. Top quality. It sits at the corner of my desk proudly silent.

I'm likely about to upgrade the PC within, but the case will remain a strong feature of my desk.

rkomorn 1 day ago||
Do you use it as a gaming PC (or for other high GPU load activities)? And if so, what's your take on noise under load?

Edit: I guess this is a senseless question if the case really only uses passive cooling. I was assuming there would still be fans somewhere.

I despise my current PC's fan noise and I'm always on the lookout for a quieter solution.

wdfx 1 day ago|||
It's a dev workstation for me.

Currently inside is an i7-9600, which I limit to 3.6 GHz, and a cheap 1050 Ti.

The CPU is technically over the TDP limit of the case, but with the frequency limit in place I never exceed about 70 °C, and due to my workloads I'm rarely maxing the CPU anyway.

There is zero noise under any load. There are no moving parts inside the case at all: no spinning HDD, no PSU fan, no CPU fan, no GPU fan.

scns 1 day ago||||
> I guess this is a senseless question if the case really only uses passive cooling.

Are there senseless questions?

It can be used for gaming if your demands are met by an Nvidia 1650.

MonsterLabo built passive cases that could cool hotter components, but the company seems to be defunct now, sadly.

danparsonson 1 day ago|||
Did you have no success upgrading your fans (Noctua etc)? Still too loud? How about water cooling?
rkomorn 1 day ago|||
It's an HP OEM (because I moved countries during the pandemic and getting parts where I settled was ridiculously more expensive).

The CPU cooler is an AIO (and its radiator fans are loud). The GPU has very loud fans too, but is not an AIO.

It's four years old at this point and I might just build something else rather than try to retrofit this one to sanity (which I doubt is possible without dumping the GPU anyway).

vel0city 1 day ago||
I bought my current gaming desktop off a friend when I was looking for an upgrade, as he didn't need it anymore. It had an AIO cooler. The pump made so much noise, and it seemed like I had to fiddle with fan profiles forever to get sane cooling. I swapped it for a $30 Cooler Master Hyper 212 and a Noctua case fan. It cools well enough for the CPU to stay above stock speeds pretty much all the time, and it's much quieter than the AIO was. I'm not suggesting this CPU cooler is the best one out there, just pointing out that it's not like one needs to spend $100+ on a cooler to get pretty good performance.

The GPU still gets kind of loud during intense graphics gaming sessions but when I'm not gaming the GPU fans often aren't even spinning.

rkomorn 1 day ago||
Honestly at this point it's not so much about money as it is about whether or not this particular case/setup/components combo is salvageable with minimal effort.

The CPU fan is rarely an issue (it mostly just goes bananas when IntelliJ gets its business on with Gradle on a new project XD).

The GPU is the main culprit and I'm not sure there's any solution there that doesn't involve just replacing it.

vel0city 1 day ago||
Depending on the fans it may be possible to re-oil the bearings.
rkomorn 1 day ago||
Interesting idea. I feel like the fan noise from my GPU is just air moving, but maybe not.
diggan 1 day ago|||
Just last week I moved from a Noctua NH-U12S cooling my 5950X to an ARCTIC Liquid Freezer III Pro 360 AIO liquid cooler (my first time using liquid cooling), and while I expected the difference to be big, I didn't realize how big.

Now my CPU usually idles at ~35 °C, which is just 5 degrees above the ambient temperature (because of summer...), and hardly ever goes above 70 °C even under load, while staying super quiet. I realize now I should have done the upgrade years ago.

Now if only I could get water cooling for the GPU I'm using. Unfortunately no water blocks are available for it (yet), but I can't wait to change that too; it should have a huge impact as well.

keeganpoppen 22 hours ago||
oh man i saw this case years ago and have tried to find it multiple times to no avail; amazing!
vunderba 1 day ago||
I love the attention to detail in this post. I've thought about picking up one of those Vortex86-based ITX boards like the ITX-Llama [1], since you get the joy of running on real hardware but don't have to worry about tracking down a Sound Blaster card, network cards, etc. Assuming that they ever come back in stock, that is.

[1] https://retrodreams.ca/products/itx-llama-mainboard

asukachikaru 1 day ago||
Past threads

https://news.ycombinator.com/item?id=44021824 May, 2025 (86 comments)

https://news.ycombinator.com/item?id=44023088 May, 2025 (0 comments)

https://news.ycombinator.com/item?id=44026363 May, 2025 (1 comment)

jll29 1 day ago||
This post is meticulous documentation; its attention to detail is remarkable, and it all looks so clean. How much did your project cost in total, out of curiosity (tools and failed attempts included)?

Reading through the post, sadly nothing worked the first time round (bravo to the poster for his perseverance), and while things got slightly better, IT "stuff" is still surprisingly fiddly and fragile.

Build quality and the technical detail of the handbooks are areas where things have gotten remarkably worse: how could we let that happen? How can children learn how stuff works without schematics of the devices they own and love?

intrasight 1 day ago||
Honest question: Will building a high-end PC still be a thing in 10 years? I've built all of mine in the last 20 years. Just finished my first AMD build. But I don't think it'll be possible or allowed after a few more CPU iterations. Sure, you'll be able to do builds with the CPU tech available up to when it stops, but I seriously doubt that the cutting-edge chip tech ten years hence will be available to hobbyists. Tell me why I'm wrong.
zamadatix 1 day ago||
I think this hinges on what one considers "cutting edge CPU tech": is it "newer and better CPU tech than before" or "the highest end CPU tech of the particular day"?

If the latter ("the highest end CPU tech of the particular day"), I think it's going to keep getting harder and harder, with more top-end options like the M4 Max being "prebuilt only", but I don't think the options will drop to zero in as little as 10 years from now.

If the former ("newer and better CPU tech than before"), I think it'll last even longer than the above, if not indefinitely: technology will likely keep improving consistently enough that serving even a small niche better than before will remain a reasonable target market, regardless of what is considered mainstream.

dingnuts 1 day ago|||
No, you tell us why you think the next ten years will be different from the last thirty.
endgame 1 day ago|||
One possible reason: to achieve performance improvements, we are seeing more integrated and soldered-together hardware, limiting later upgrades. The Framework Desktop, from the modular, user-upgradeable laptop company, has soldered-on memory because they had to "choose two" between memory bus performance, system stability, and user-replaceable memory modules.

If the product succeeds and the market starts accepting this for desktops, I could see more and more systems going that way to get either maximum performance (in workstations) or space/power optimisation (e.g. N100-based systems). Other manufacturers not optimising for either of these might then start shipping soldered-together systems just to get BoM costs down.

scns 1 day ago|||
> The Framework Desktop, from the modular, user-upgradeable laptop company, has soldered-on memory because they had to "choose two" between memory bus performance, system stability, and user-replaceable memory modules.

No need to pick on Framework here; AMD could not make the chip work with replaceable memory. How many GPUs with user-replaceable (slotted) memory are there? Zero snark intended.

Aurornis 1 day ago|||
That’s a laptop. It’s soldered for space constraints.

There are high-speed memory module form factors; they just add thickness and cost, and they're not widely available yet.

Most use cases need the high speed RAM attached to the GPU, though. Desktop CPUs are still on 2-channel memory and it’s fine. Server configs go to 12-channel or more, but desktop hasn’t even begun to crack the higher bandwidth because it’s not all that useful compared to spending the money on a GPU that will blow the CPU away anyway.

bestouff 1 day ago|||
I'm pretty sure the "Framework Desktop" is a desktop, not a laptop.
pja 1 day ago|||
The Framework Desktop is not a laptop. The clue is in the name...

https://frame.work/gb/en/desktop

whatever1 1 day ago||||
The only market for desktops is gaming. Hence Nvidia will just slap a CPU on their board and use the unified memory model to sell you an all-in-one solution. Essentially a desktop console.

Maybe some modularization will survive for slow storage, but other than that, demand for modular desktops is dead.

Cases will probably survive since gamers love flashy rigs.

barrkel 1 day ago|||
There are a handful of professional uses for a workstation that are hard to beat with a laptop.

If you're compiling code, you generally want as much concurrency as you can get, as well as great single-core speed when the task parallelism runs out. There aren't really any laptops with high core counts, and even when you have something with horsepower, you run into thermal limits. You can try to make do with remoting into a machine with more cores, but then you're not really using your laptop; it might as well be a Chromebook.

intrasight 1 day ago|||
> There are a handful of professional uses for a workstation

I've historically built my own workstations. My premise is that my most recent build may be my last or second to last. In ten years, I will still have a workstation - but not one that I build from parts.

barrkel 23 hours ago||
I also built my own, since the late 90s. But I'm not building my newest: a 96-core Threadripper with 768 GB of RAM. I went with a specialist to ensure everything works together. I expect it to last me a good few years, and I don't really anticipate replacing it with anything too similar.
whatever1 1 day ago|||
All of these can be done much better in the cloud (I can spawn as big a machine as my pocket can afford). And with today's tooling (VS Code and JetBrains remote development), you don't even notice that you're developing on a remote machine rather than a local one.

So the desktop developer market is for those who are not willing to use the cloud, and this is a very small minority.

(FYI, I am not endorsing cloud over local development; I'm just stating where the market is.)

tekne 1 day ago|||
Much of my PhD thesis was/is being done while traveling in places with poor, poor Internet. Currently on my laptop in rural Calabria, where I pull a blazing-fast 60 kbps, sometimes. It would be very irritating waiting for the compiler/theorem prover to go brr remotely… I can hardly edit a Google Doc out here!

This doesn’t contradict your minority point, but it really does make me appreciate local-first.

simgt 1 day ago||
CS thesis that requires traveling, tell us more! What's the topic? :)
jll29 1 day ago||
Perhaps the Italian girlfriend was not where the mainframe running the theorem prover was? ;)

If I had been in Italy, perhaps my Ph.D. would never have been finished...

LtdJorge 1 day ago||||
Yes, until the day you get attacked by a North American Fiber-Seeking Backhoe, losing your gigabit+ connection and your entire set of tools with it.
whatever1 1 day ago||
I mean, there are also preppers with power generators, solar panels, and dry food and water tanks waiting for the apocalypse to happen. Again, this is a very small minority.
barrkel 23 hours ago||||
Indeed, I use such a machine in my day job: 64 slow Epyc cores, presumably power-efficient. But even on that machine, builds are slower than they could be, and distributed builds are the way.
hulitu 1 day ago|||
> All of these can be done much better on the cloud

If you forget about the latency, yes. Suddenly your "cc a.c -o a.o" becomes: issue the command, wait for the server to start it, ping-pong between your terminal and the server for messages, and the final file is available on the cloud.
intrasight 1 day ago||||
>The only market for desktops is gaming.

I disagree. My premise isn't that desktops are going away; it's that DIY custom-built desktops are destined for the trash heap of history, since you'll no longer be able to buy CPUs and memory separately. We will be buying desktops like the HP Z2 Mini Workstation, or whatever its equivalent is 10 years from now.

>Cases will probably survive since gamers love flashy rigs

But only as a retro theme? Would enthusiasts just put a Z2 Mini, for example, inside the case, wire up the lights, and call it a day?

zokier 1 day ago|||
There is still a lot of productivity work that benefits from the power of desktops: engineering (Ansys etc.), local AI development, 3D modeling, working with large C++/Rust codebases, scientific computing, and so on. Related to gaming, there is of course the huge game developer market too. There is a reason Nvidia and AMD still make workstation-class GPUs for big bucks.
KeplerBoy 1 day ago||
But all of that hinges on fast off-chip memory. If manufacturers agree that this memory and the SoC need to be soldered together, there's not much left to swap out except PCIe boards.
blackoil 1 day ago|||
If the processor comes with a built-in GPU, NPU, and RAM, will you really be building the system?
hombre_fatal 1 day ago|||
Sure. Building a PC already is barely building anything. You buy a handful of components and click them into each other.
lizknope 1 day ago|||
While that is mostly true, there is a large variety of motherboards. It took me a while to find one with the SATA and PCIe slots that I wanted, but after that it is just a screwdriver and some cable ties.
navigate8310 1 day ago|||
A lot of flexibility still exists
danparsonson 1 day ago||||
RAM? Are we expecting on-chip RAM any time soon?
theodric 1 day ago||
Apple has done it since 2020. Intel was planning to, but walked it back. It dramatically increases performance, allows vendors to sell you RAM at 8x the market price, and requires you to replace your entire computer to upgrade it, thereby inducing you to overspend on RAM from the outset so that you don't have to spend even more replacing the entire system later.

There's literally no reason for shareholders not to demand this from every computer manufacturer. Pay up, piggie.

intrasight 1 day ago|||
Exactly. Better performance and higher profits. Seems inevitable to me.
hulitu 1 day ago|||
> allows vendors to sell you RAM at 8x the market price, and requires you to replace your entire computer to upgrade it,

Good luck then. Some of us build our own computers to be upgradeable.

theodric 7 hours ago||
Same, but about ten years after the last modular-RAM computers go on the market, you and I are both going to have a Problem.

Incidentally, this is why I bought a hot air rework station and a heater. Won't help me if they blow an efuse on the package to permanently set the RAM spec, or use part pairing For Your Security And Protection, but maybe they won't all be that evil.

charcircuit 1 day ago|||
Yes, as that's already the case with phones. There is more to a phone than the SOC.
fourthark 1 day ago||
Who builds phones?
charcircuit 1 day ago||
It's mostly factory workers, but hobbyists could do so too if they wanted. Most people just want to buy something that works out of the box, so it's not a popular option.
pjc50 1 day ago||
You're going to have to unpack "allowed". Are you saying that the Apple model will win so heavily that separate parts will not be available? What change are you expecting?

Nvidia not selling cutting-edge parts except in bulk is a phenomenon of the AI bubble, which will eventually deflate. (I'm not saying it will go away, just that the massive training investments are unsustainable without revenue eventually catching up.)

nsavage 1 day ago||
This is pretty common behaviour. My dad has been buying both his dream Amigas and his dream car, a Triumph TR6. I bought my dream childhood console, a Game Boy Advance SP (I only had a regular Game Boy Advance).
ferguess_k 1 day ago||
I also bought a few consoles (GB, NES, N64, PS2) that I was never allowed to own or play, except for the NES, which I didn't own but did play thanks to its popularity. My parents were pretty strict about my studies and piano practice, so I didn't even get much time with TV, and games were considered not only wasteful but also evil.

The thing is, I never played those consoles after purchasing them. I don't have any nostalgic feelings towards them, except for the NES. I actually felt sorry for myself when I tried to wake up my inner kid and discovered he had died a long time ago.

I'll probably give them to a friend's kid if he so wishes, or donate them to some local museum.

raffael_de 1 day ago||
And the common realization then is: what did I find so interesting and special about this (as a child)?
erinnh 1 day ago||
Cannot confirm.

I often look fondly at the hardware I have.

I recently built one PC for each PC generation of the 90s (486, Pentium 1-2, Athlon).

Still love them even after having built them.

Getting back into DOS is quite interesting, since it's so different from PCs today.

sandermvanvliet 1 day ago||
Looking at those pictures made me realise I could _smell_ it…

Got kicked right in the nostalgia I guess

layer8 1 day ago||
This is nice, but without a CRT monitor (he's using an IPS) it's not quite the real thing regarding the actual on-screen experience.
mrob 1 day ago||
It's much less important for VGA games than for console games. Most used 320x200 resolution, which was line-doubled to 320x400 then displayed on a monitor capable of at least 640x480, so you had distinct and moderately sharp pixels. The monitor was natively progressive scan, so you didn't get the exaggerated spacing between scan lines that you got on consoles using non-standard field timing to force 240p on a 480i TV. And the refresh rate at this resolution was 70Hz, but very few games ran at 70fps, so you lost most of the benefit of the low persistence of CRTs.
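The scan doubling described above can be sketched in a few lines (a toy illustration only; on real VGA hardware the CRTC repeats each scanline during scan-out, so this is not actual driver code, and `line_double` is a hypothetical helper name):

```python
# Toy sketch of VGA mode 13h scan doubling: the 320x200 framebuffer
# is scanned out with every row repeated, so the monitor receives
# 400 visible lines (on a display capable of at least 640x480).
def line_double(rows):
    """Emit each scanline twice: 200 rows in, 400 rows out."""
    doubled = []
    for row in rows:
        doubled.append(row)  # original scanline
        doubled.append(row)  # repeated scanline
    return doubled

framebuffer = [[0] * 320 for _ in range(200)]  # mode 13h: 320x200
print(len(line_double(framebuffer)))  # 400
```

Each logical pixel ends up two scanlines tall, which is why 320x200 games look like distinct, moderately sharp blocks on a VGA monitor rather than the thin, widely spaced scanlines of a 240p console picture.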
theandrewbailey 1 day ago||
Without a CRT, the soul of retro computing doesn't glow as warm.
intrasight 1 day ago||
Typing this on a very old Model M keyboard :)
voidUpdate 1 day ago|
I always really liked the handle on my old upgraded Lenovo E73. It made it much easier to transport when going to university and back for the holidays, and I'm sad that most cases don't have one. Even a hinged one that sits flat against the top of the case when folded down would be awesome.