Posted by Bogdanp 2 days ago

Behind the scenes of Bun Install (bun.com)
422 points | 152 comments | page 2
atonse 1 day ago
I absolutely loved reading this. It's such an excellent example of a situation where Computer Science principles are very important in day-to-day software development.

So many of these concepts (Big O, temporal and spatial locality, algorithmic complexity, lower level user space/kernel space concepts, filesystems, copy on write), are ALL the kinds of things you cover in a good CS program. And in this and similar lower level packages, you use all of them to great effect.

epolanski 1 day ago
This is about software engineering, not computer science.

CS is the study of computations and their theory (programming languages, algorithms, cryptography, machine learning, etc).

SE is the application of engineering principles to building scalable and reliable software.

atonse 17 hours ago
Without getting bogged down in rigid definitions, do we both agree that this is about applying deeper technical concepts and algorithms (usually taught as part of a computer science curriculum) to real-world problems, rather than the normal “build this login form” or “write these 5 queries to generate this report that shows up in an HTML table” work that 75% of devs do daily?
RestartKernel 1 day ago
This is very nicely written, but I don't quite get how Linux's hardlinks are equivalent to macOS's clonefile. If I understand correctly, wouldn't the former unexpectedly update files across all your projects if you modify just one "copy"?
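
They aren't fully equivalent: a hardlink shares the underlying inode, so an in-place write through one path is visible through every other path, while a clone is copy-on-write and diverges on first write. Hardlink-based installs therefore rely on package files never being edited in place. A minimal sketch of the two operations via Node's fs API, with hypothetical paths:

    // A sketch only, not Bun's code; paths are hypothetical.
    import * as fs from "node:fs";

    // Hardlink: both paths point at the same inode, so writing through
    // either path mutates the shared bytes. Safe only if installed
    // files are treated as read-only.
    fs.linkSync("store/lodash/index.js", "project-a/node_modules/lodash/index.js");

    // Clone/reflink (clonefile on APFS, FICLONE on btrfs/XFS): data is
    // shared copy-on-write, so the first write to either path gets its
    // own private copy instead of touching the other.
    fs.copyFileSync(
      "store/lodash/index.js",
      "project-b/node_modules/lodash/index.js",
      fs.constants.COPYFILE_FICLONE, // falls back to a real copy if unsupported
    );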
1vuio0pswjnm7 1 day ago
https://github.com/oven-sh/bun/releases/expanded_assets/bun-...

progress: dynamically-linked musl binaries (tnx)

next: statically-linked musl binaries

wink 2 days ago
> Node.js uses libuv, a C library that abstracts platform differences and manages async I/O through a thread pool.

> Bun does it differently. Bun is written in Zig, a programming language that compiles to native code with direct system call access:

Guess what, C/C++ also compiles to native code.

I mean, I get what they're saying and it's good, and nodejs could have probably done that as well, but didn't.

But don't phrase it like it's inherently not capable. No one forced npm to use this abstraction, and npm probably should have been a Node.js addon in C/C++ in the first place.

(If any of this sounds like a defense of npm or Node, it is not.)

k__ 2 days ago
To me, the reasoning seems to be:

npm, pnpm, and yarn are written in JS, so they have to use Node.js facilities, which are based on libuv, which isn't optimal in this case.

Bun is written in Zig, so it doesn't need libuv and can do its own thing.

Obviously, someone could write a Node.js package manager in C/C++ as a native module to do the same, but that's not what npm, pnpm, and yarn did.

lkbm 2 days ago
Isn't the issue not that libuv is C, but that the thing calling it (Node.js) is JavaScript, so you have to switch modes each time you have libuv make a system call?
azangru 1 day ago
I am probably being stupid; but aren't install commands run relatively rarely by developers (less than once a day perhaps)? Is it such an important issue how long it takes for `x install` to finish?

Or is the concern about the time spent in CI/CD?

tuetuopay 10 hours ago
CI/CD is a major usage. But dependency version bumps are also a big part of it. In the Python ecosystem I’ve had Poetry take minutes to resolve the Ansible dependencies after bumping the version. And then you see uv take milliseconds to do a full install from scratch.
valtism 1 day ago
I had no idea Lydia was working for Bun now. Her technical writing is absolutely top notch
markasoftware 1 day ago
I'm pretty confused about why it's beneficial to wait for the whole compressed file before decompressing. Surely the benefit of beginning decompression before the download is complete outweighs having to copy the memory around a few extra times as the vector is resized?
Jarred 1 day ago
Streaming prevents many optimizations because the code can’t assume it’s done when run once, so it has to suspend / resume, clone extra data for longer, and handle boundary cases more carefully.

It’s usually only worth it after ~tens of megabytes, but the vast majority of npm packages are much smaller than that. So if you can skip it, it’s better.
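
To make the trade-off concrete, here is a rough sketch of the two shapes using Node's zlib; the helper names are assumptions for illustration, not Bun's actual code:

    // A rough sketch of the two shapes; names are illustrative,
    // not Bun's actual implementation.
    import { gunzipSync, createGunzip } from "node:zlib";
    import { Readable } from "node:stream";

    // Batch: wait for the complete download, then inflate in one shot.
    // One pass, no suspended decoder state to carry between calls.
    function inflateBatch(whole: Buffer): Buffer {
      return gunzipSync(whole);
    }

    // Streaming: inflate chunks as they arrive. Decoder state must
    // survive across chunks, and partial-frame boundaries need handling.
    async function inflateStreaming(chunks: AsyncIterable<Buffer>): Promise<Buffer> {
      const pieces: Buffer[] = [];
      const gunzip = Readable.from(chunks).pipe(createGunzip());
      for await (const piece of gunzip) pieces.push(piece as Buffer);
      return Buffer.concat(pieces);
    }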

yencabulator 1 day ago
Streaming decompression with a large buffer size handles everything in a single batch for small files.
tracker1 2 days ago
I'm somewhat curious how Deno stands up with this... also, not sure what packages are being installed. I'd probably start a vite template project for react+ts+mui as a baseline, since that's a relatively typical application combo for tooling. Maybe hono+zod+openapi as well.
tracker1 2 days ago
For my own curiosity, on a React app on my work desktop:

    - Clean `bun install`, 48s - converted package-lock.json
    - With bun.lock, no node_modules, 19s
    - Clean with `deno install --allow-scripts`, 1m20s
    - with deno.lock, no node_modules, 20s
    - Clean `npm i`, 26s
    - `npm ci` (package-lock.json), no node_modules, 1m 2s (wild)
So, it looks like if Deno added a package-lock.json conversion similar to Bun's, the installs would be very similar all around. I have no control over the security software used on this machine; it was just convenient since I was in front of it.

Hopefully someone can put eyes on this issue: https://github.com/denoland/deno/issues/25815

steve_adams_86 2 days ago
I think Deno isn't included in the benchmark because it's a harder comparison to make than it might seem.

Deno's dependency architecture isn't built around npm; that compatibility layer is a retrofit on top of the core (which is evident in the source code, if you ever care to look). Deno's core architecture for dependency management uses a different, URL-based paradigm. It's not as fast, but... it's different. It also allows for improved security and cool features like the ability to easily host your own secure registry. You don't have to use npm or jsr. It's very cool, but different from what is being benchmarked here.
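
For concreteness, a short sketch of what that URL-based paradigm looks like in a Deno module; the specifiers below are illustrative, not taken from the benchmark:

    // Deno resolves imports via URLs or registry-prefixed specifiers,
    // cached globally, with no node_modules directory required.
    import { assertEquals } from "jsr:@std/assert@1";        // JSR registry
    import chalk from "npm:chalk@5";                         // npm compatibility layer
    import { camelCase } from "https://esm.sh/lodash-es@4";  // any HTTPS URL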

tracker1 2 days ago
All the same, you can run deno install in a directory with a package.json file and it will resolve and install to node_modules. The process is also written in compiled code, like bun... so I was just curious.

edit: replied to my own post... looks like `deno install --allow-scripts` is about 1s slower than bun once deno.lock exists.

rtpg 1 day ago
bun installs are fast, but I think they might be so fast and concurrent they cause npm to really get confused sometimes.

I end up hitting 500s from npm from time to time when installing with bun, and I just don't know why.

Really wish the norm was that companies hosted their own registries for their own usage, so I could justify the expense and effort instead of dealing with registries being half busted kinda randomly.

randomsofr 2 days ago
Wow, crazy to see yarn being so slow when it used to beat npm by a lot. At a company I was at, we went from npm to yarn to pnpm and back to npm. Nowadays I try to use Bun as much as possible, but Vercel still doesn't use it natively for Next.
chrisweekly 2 days ago
why leave pnpm?