
Posted by onlyspaceghost 14 hours ago

The three pillars of JavaScript bloat (43081j.com)
391 points | 228 comments | page 3
skydhash 13 hours ago|
Fantastic write-up!

And we're seeing Rust happily going down the same path, especially with the micro-packages.

chrismorgan 2 hours ago||
Rust is not going down the same path, and it’s ludicrous to suggest it is. Almost none of the first and third pillars are even possible in Rust, and to the extent they are, they’re not a problem in practice. As for the second, “atomic architecture”, it’s not taken anywhere near the extreme it frequently is with npm. Not many micro-packages actually get used; where they are, they mostly make more sense than they did in npm, cost far less than they do there, and can have some concrete advantages.
vsgherzi 8 hours ago|||
Yeah, I’m in the same boat here: I really don’t like the dependency sprawl of Rust. I understand there are tradeoffs, but I really wanna make sure we don’t end up like npm.
CoderLuii 11 hours ago|||
The Docker side of this is painful too. Every extra dependency in any language means a bigger image, more layers to cache, more things that can break during a multi-arch build. I've been building images that are 4GB because of all the Node and Python tooling bundled in. Micro-packages make it worse because each one adds metadata overhead on top of the actual code.
cute_boi 12 hours ago||
Rust is different as there is no runtime.
wiseowise 7 hours ago|||
Yes; instead we pay with requiring supercomputers and 10-hour compile times to process billions of those “atomic architecture” pieces.
b00ty4breakfast 9 hours ago||||
I'm not very familiar with rust but I'm pretty sure it has a runtime. Even C has a runtime.

Unless you're talking about an "environment", e.g. Node or the like

embedding-shape 6 hours ago||
Indeed Rust has a runtime. I'm not sure where the whole "Rust has no runtime" idea comes from; I keep seeing it repeated from time to time, but I can't find its origin, and I don't think it's ever been true.
onlyspaceghost 11 hours ago||||
but it still increases compile time, attack surface area, bandwidth use, etc.
vsgherzi 8 hours ago|||
I’m assuming you’re referring to an async runtime like Tokio. In my opinion the dependency problem exists with or without Tokio. Tokio is probably one of the best dependencies.
IAmLiterallyAB 10 hours ago||
For the old-version support: why not do some compile-time #ifdef SUPPORT_ES3? That way library writers can support it, and if the user doesn't need it they can disable it at compile time and all the legacy code will be removed.
sgbeal 4 hours ago||
> Why not do some compile time #ifdef SUPPORT_ES3?

Rather unfortunately, JS has no native precompiler. For the SQLite project we wrote our own preprocessor to deal with precisely that type of thing (not _specifically_ that thing, but filtering code based on, e.g., whether it's vanilla, ESM, or "bundler-friendly" (which can't use dynamically-generated strings because of castrated tooling)).
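For what it's worth, a common approximation of this today is a bundler's define/replace step rather than a true preprocessor. A hedged sketch (the `SUPPORT_ES3` flag and `flatten` function are hypothetical; esbuild's `--define` is one real mechanism for constant replacement):

```javascript
// Hypothetical compile-time flag. A bundler define step
// (e.g. esbuild --define:SUPPORT_ES3=false) replaces it with a
// constant, and the minifier then strips the dead branch.
const SUPPORT_ES3 = false;

function flatten(arr) {
  if (SUPPORT_ES3) {
    // legacy fallback for engines without Array.prototype.flat
    return arr.reduce((acc, v) => acc.concat(v), []);
  }
  return arr.flat();
}

console.log(flatten([[1, 2], [3]])); // [1, 2, 3]
```

Unlike a real preprocessor, this only works for code that can be expressed as a branch on a constant, and it relies on the minifier to actually delete the dead path.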

Griffinsauce 8 hours ago|||
Two problems:

- people would need to know how to include dependencies in a way that allows them to be tree-shaken; that's a fragile setup

- polyfills often have quirks and extra behaviours (e.g. the extra functions on early promise libraries come to mind) that people start relying on, making the switch to built-in not so easy

Also, how is this going to look over time with multiple ES versions?

sgbeal 4 hours ago||
> people would need to know how to effectively include dependencies in a way that allows them to be tree shaken

Is the need for tree-shaking not 100% a side-effect of dependency-mania? Does it not completely disappear once one has one's dependencies reduced to their absolute minimum?

Maybe i'm misunderstanding what tree-shaking is really for.

ascorbic 8 hours ago||
It'll still install the dependencies, which is what this is about
casey2 9 hours ago||
There is a clear and widespread cultural problem with javascript. Sites should think seriously hard about server side rendering, both for user privacy (can't port the site to i2p if you drop 5MB every time they load a page) and freedom. Even this antibloat site smacks you with ~100KB and links to one that smacks you with ~200KB. At this rate if you follow 20 links you'll hit a site with 104 GB of JS.
g947o 3 hours ago||
> Sites should think seriously hard about server side rendering

You think the average site owner, plus Wix/Squarespace, is going to spend a lot of money beefing up their CPU and RAM to marginally "improve user experience" when they could offload rendering to the client side, and have been doing so all these years?

sgbeal 4 hours ago||
> Sites should think seriously hard about server side rendering...

The rise of AI crawlers makes that ever less appetizing. Moving the workloads to the client is, among other things, a form of DoS mitigation.

sheept 13 hours ago||
I wonder if this means there could be a faster npm install tool that pulls from a registry of small utility packages that can be replaced with modern JS features, and skips installing them.
seniorsassycat 12 hours ago|
Not sure about faster, but you could do something with overrides, especially pnpm overrides, since they can be configured with plugins. Build a list of packages that can be replaced with modern stubs.

It couldn't inline them, but it could replace ponyfills with wrappers around native implementations and drop the fallback. It could provide simple modern implementations of is-string, and dedupe multiple major versions, though that raises the question of what breaking change led to a new major version, and why.
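A minimal sketch of what such a stub could look like; `is-string` is a real package, but the file layout and override wiring here are illustrative:

```javascript
// stubs/is-string/index.js - hypothetical drop-in stub for the
// `is-string` micro-package, using only modern built-ins.
// Illustrative wiring via pnpm overrides in package.json:
//   "pnpm": { "overrides": { "is-string": "file:./stubs/is-string" } }
'use strict';

function isString(value) {
  return typeof value === 'string' || value instanceof String;
}

module.exports = isString;
```

The stub keeps the micro-package's contract (boxed `String` objects included) so dependents don't notice the swap, while dropping the legacy fallback code.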

turtleyacht 13 hours ago||
It would be interesting to extend this project where opt-in folks submit a "telemetry of diffs," to track how certain dependencies needed to be extended, adapted, or patched; those special cases would be incorporated as future features and new regression tests.

Someday, packages may just be "utility-shaped holes" which are filled in and published on the fly. Package adoption could come from 80/20 agents [1] exploring these edges (security notwithstanding).

However, as long as new packages inherit dependencies according to a human author's whims, that "voting" cycle has not yet been replaced.

[1] https://news.ycombinator.com/item?id=47472694

stephenr 11 hours ago||
The primary cause of JS bloat is assuming you need JS or that customers want whatever you're using it to provide.

For $client we've taken a very minimal approach to JavaScript, particularly on customer-facing pages. An upcoming feature finally replaces the last jQuery (+ plugin) dependent component on the sales page with a custom implementation.

That change shaved off ~100K (jQuery plus a plugin removed), and for most projects now that probably seems like nothing.

The sales page after the change is now just 160K of JS.

The combination of not relying on JS for everything and preferring use-case-specific implementations where we do, means we aren't loading 5 libraries and using 1% of each.

I'm aware that telling most js community "developers" to "write your own code" is tantamount to telling fish to "just breathe air".

CoderLuii 11 hours ago|
160K total is impressive. Most landing pages I see are shipping 2-3MB of JS before the first paint. The "write your own code" approach gets laughed at, but when you actually do it the result is faster and easier to debug, and you don't wake up one morning to find out one of your 200 dependencies got compromised.
stephenr 5 hours ago||
Wait till I tell those people we keep all our dependencies (js and backend) in our own git repo.

Updating dependencies is a task a person does, followed by committing the changes to the repo.

I am aware a lot of these ideas are heretical to a lot of software developers these days.

grishka 9 hours ago||
Yes, of course the tiny packages cause some of the bloat. As mainly a Java developer being pretty paranoid about my dependency tree (I'm responsible for every byte of code I ship to my users, whether I wrote it or not), I'm always blown away by JS dependency trees. Why would you reach for a library for this three-line function? Just write it yourself, ffs.

But the real cause of JS bloat is the so-called "front-end frameworks". Especially React.

First of all, why would you want to abstract away the only platform your app runs on? What for? That just changes the shape of your code but it ends up doing the same thing as if you were calling browser APIs directly, just less efficiently.

Second of all, what's this deal with mutating some model object, discarding the exact change that was made, and then making the "framework" diff the old object with the new one, call your code to render the "virtual DOM", then diff that, and only then update the real DOM tree? This is such an utterly bonkers idea to me. Like, you could just modify your real DOM straight from your networking code, you know?

Seriously, I don't understand modern web development. Neither does this guy who spent an hour and some to try to figure out React from the first principles using much the same approach I myself apply to new technologies: https://www.youtube.com/watch?v=XAGCULPO_DE

padjo 7 hours ago||
> you could just modify your real DOM straight from your networking code

You can also use your underparts as a hat. It doesn't mean it's a good idea.

grishka 4 hours ago||
You imply that you somehow get a visibly different end result if you touch DOM directly. Except to me, using React instead of a simple assignment to e.g. update the text on a button feels like taking several long flights that complete a lap around the world just to get from LA to SF, instead of the 1-hour direct flight.
padjo 2 hours ago|||
It's a case of Chesterton's fence. Having built complex apps pre-react, I wouldn't be in a hurry to go back to that approach because I have first hand experience of running into the problems it solves.
skydhash 3 hours ago|||
React is a paradigm change (from imperative to functional) that makes sense in a large UI project. React itself is fairly small in terms of deps.

The main issue is the tooling. JSX is nice enough (not required, though) to make you want a transpiler, which will also bundle your app. It's from that point that things get crazy. They want the transpiler to also be a bundler so that it manages their CSS as well. They also want it to do minification and dead code elimination. They want it to support npm dependencies, etc.

This is how you get weird ecosystems.

ascorbic 8 hours ago|||
That's like asking "why would you use Swing when you can use Graphics2D". Sometimes you want something higher level. The DOM is great and very powerful, but when you're building a highly interactive web app you don't want to be manually mutating the DOM every time state changes.

I am a core maintainer of Astro, which is largely based around the idea that you don't need to always reach for something like React and can mostly use the web platform. However even I will use something like React (or Solid or Svelte or Vue etc) if I need interactivity that goes beyond attaching some event listeners. I don't agree with all of its design decisions, but I can still see its value.

srdjanr 7 hours ago|||
Regarding tiny packages, I don't think they affect the size of the shipped bundle at all. They only bloat your local dev environment.
wiseowise 7 hours ago|||
> Second of all, what's this deal with mutating some model object, discarding the exact change that was made, and then making the "framework" diff the old object with the new one, call your code to render the "virtual DOM", then diff that, and only then update the real DOM tree? This is such an utterly bonkers idea to me. Like, you could just modify your real DOM straight from your networking code, you know?

https://youtu.be/Q9MtlmmN4Q0?t=519&is=Wt3IzexiOX4vMPZf

Also, why do you use SQL and databases? Couldn’t you just modify files on the filesystem?

grishka 4 hours ago|||
Yes, I don't understand the "declarative" approach at all; it seems too wasteful and roundabout to me. You want to change something? You go and change it. That simple. I hate it when callbacks are abstracted away from me. Abstractions over callbacks always feel like they're getting in the way, not helping me.

> Also, why do you use SQL and databases? Couldn’t you just modify files on the filesystem?

Anyone can read a MySQL data file. IIRC the format is pretty straightforward. The whole point of doing it through the real MySQL server is to make use of indexes, the query optimizer, and proper handling of concurrency, at least. Sure you can reimplement those things, but at this point congrats, you've just reimplemented the very database system you were trying to avoid, just worse.

thomasikzelf 6 hours ago|||
The declarative vs imperative example is strange here. Why is the imperative example so convoluted? This is what one could write in js:

  badge.textContent = count > 99 ? '99+' : count
  badge.classList.toggle('show', count > 0)
  paper.classList.toggle('show', count > 0)
  fire.classList.toggle('show', count > 99)

The declarative example also misses the 99+ case. I don't think this example describes the difference between imperative and declarative well.
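For contrast, a framework-free sketch of the "declarative" shape (the `badgeView` name is hypothetical): derive the whole view state from `count` in one pure function, including the 99+ case, and let a render step apply it to the DOM.

```javascript
// Hypothetical pure "view function": all display state is derived
// from `count` in one place instead of being toggled piecemeal.
function badgeView(count) {
  return {
    text: count > 99 ? '99+' : String(count),
    badgeShown: count > 0,
    paperShown: count > 0,
    fireShown: count > 99,
  };
}
```

The point of the declarative style is exactly this separation: the mapping from state to view is a single function you can read and test, and applying it to the DOM is a separate, mechanical step.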
panstromek 8 hours ago||
Yea, honestly, you probably just don't understand. FE frameworks solve a specific problem, and they don't make sense unless you understand that problem. That TSoding video is a prime example of that: it chooses a trivial instance of the problem and then acts like the whole problem space is trivial.

To be fair, React is an especially wasteful way to solve that problem. If you want to look at the state of the art, something like Solid makes a lot more sense.

It's much easier to appreciate that problem if you actually try to build complex interactive UI with vanilla JS (or something like jQuery). Once you have complex state dependency graph and DOM state to preserve between rerenders, it becomes pretty clear.

grishka 4 hours ago||
One of my projects does have a complex UI and is built with zero runtime dependencies on the front end. It doesn't require JS at all for most of its functionality.

I just render as much as possible on the server and return commands like "hide the element with that ID" or "insert this HTML after element with that ID" in response to some ajax requests. Outside of some very specific interactive components, I avoid client-side rendering.
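A hedged sketch of what such a command handler could look like on the client (the command shape and names are hypothetical; the element lookup is injected, so in the browser it would be `(id) => document.getElementById(id)`):

```javascript
// Hypothetical dispatcher for server-sent UI commands like
// { type: 'hide', id: 'cart' } or
// { type: 'insertAfter', id: 'row-3', html: '<tr>...</tr>' }.
function applyCommand(cmd, lookup) {
  const el = lookup(cmd.id);
  if (!el) return;
  switch (cmd.type) {
    case 'hide':
      el.hidden = true;
      break;
    case 'insertAfter':
      el.insertAdjacentHTML('afterend', cmd.html);
      break;
  }
}
```

The server stays the single source of truth for markup; the client is reduced to a small, dumb interpreter of a handful of commands.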

panstromek 2 hours ago|||
That's good and arguably the right default for most websites.
skydhash 3 hours ago|||
I agree with you. It's baffling to see websites (not web apps) refusing to show anything if you disable JS. And a lot of such web apps don't need to be SPAs (GitHub,…)

SPAs were meant for UIs that rely mostly on client state, not on server data (Figma and other kinds of online editors).

sylware 3 hours ago||
It is not JavaScript itself (provided the interpreter is written in plain and simple C or similar); it is the abomination of the web engine, one of the 2.5 from the WHATWG cartel.
wonnage 8 hours ago|
An underappreciated source of bloat is module duplication stemming from code splitting. SPAs have a bad rep because you don't expect to download an entire app just to load one page on the web. You can solve this by code splitting. But if you just naively split your app by route, you'll end up with duplicate copies of every shared module.

Bundlers handle this by automatically creating bundles for shared modules. But if you optimize to avoid all shared modules, you end up with hundreds of tiny files. So most bundlers enforce a minimum size limit. That's probably fine for a small app. But one or more of these things happens:

1. Over time, everybody at the company tends to join one giant SPA because it's the easiest way to add a new page.

2. Code splitting works so well you decide to go ham and code split all of the things: modals, below-the-fold content, tracking scripts, etc.

Now you'll run into situations where 20 different unrelated bundles happen to share a single module, but that module is too small for the bundler to split out, and so you end up downloading it N times.
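The knob involved looks roughly like this in webpack (a sketch; `minSize` is the real option, and 20000 bytes is webpack 5's documented default):

```javascript
// webpack.config.js (sketch): splitChunks decides when a shared
// module gets its own chunk. A shared module smaller than minSize
// is duplicated into every bundle that imports it instead.
module.exports = {
  optimization: {
    splitChunks: {
      chunks: 'all',
      minSize: 20000, // bytes; below this threshold, duplication wins
    },
  },
};
```

Lowering `minSize` trades the duplication described above for more (and smaller) HTTP requests, which is why bundlers don't split everything by default.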
