Posted by 0xblinq 1 day ago
I write pretty lean HTML/vanilla JS apps on the front end and C#/SQL on the backend, and have had great customer success on mobile with a focus on a lot of the metrics the author hammers home.
As shown in the article, you can build an app ONCE that loads in milliseconds, and reach any potential customer just by providing a URL. It works on mobile and on desktop, on any operating system.
The native alternative requires:
- Separate development for each platform you target (to be widely used you need *at least* iOS, Android, macOS and Windows).
- Customers are required to download and install something before using your platform, creating additional friction.
And all of this to obtain at most 20-30ms better loading times?
There are plenty of cases where native makes sense and is necessary, but most apps have very little to gain at the cost of a massive increase in development resources.
Web deployment is easier, faster and cheaper.
Setting a header only works if you know exactly when you are going to update the file. Except for highly dynamic or sensitive resources, you almost never know that, so it's almost never correct.
You can add ?v=2 to each and every instance of a URL on your website. But then you have to update all the pages that reference it, which is preposterous and exactly what we didn't want. As a bonus, the ?v=1 copy is not evicted, which might also be just what you didn't want.
I never want to reload something until I do.
There are also other solutions that are cleaner than appending a query string if you need to preserve the URL, like ETags.
These are expensive hacks to work around a lack of basic functionality.
It reminds me of one kid taking something from another and refusing to give it back because they are larger.
People make websites, they think they control what goes on on the page. This isn't unreasonable to think. In fact, everything should be made to preserve that idea.
A situation where they just can't change the page shouldn't exist. Abstracting it away or otherwise working around it doesn't make it any less wrong.
Some browsers have a magic key combo to force reload. I suppose the solution is to put up a modal and ask the user to "reinstall" the web page.
I have a lot of static pages with minimal HTML/CSS that, thanks to lazy loading and caching, consume very little bandwidth. The technology is truly wonderful; clicking around feels like a desktop application.
Vite gives you that behaviour out of the box.
Can someone explain why? What precisely would make React sooo slow and big compared to other abstractions?
But this mostly applies to subsequent re-renders, while the things mentioned in the article are more about initial render, and I'm not exactly sure why React suffers there. I believe React can't skip the VDOM on the server, while Vue or Solid use compiled templates that allow them to skip that and render directly to a string, so maybe it's partially that?
The other thing is that React is too big in terms of kBs of JavaScript you have to download and then parse (and often, thanks to great React ecosystem, you use many other libraries). But that's just another trade-off: it's the price you pay for great backwards compatibility (e.g. you can still use React Class components, you don't have to use hooks, etc.).
That being said, React is slow. That is why you need useTransition, which is essentially manual scheduling (letting React know some state update isn't very important so it can prioritise other things), something you don't need to do in other frameworks.
useOptimistic does not improve performance, but perceived performance. It lets you show a placeholder of a value while waiting for the real computation to happen. Which is good, you want to improve perceived performance and make interactions feel instant. But it technically does not improve React's performance.
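The pattern behind useOptimistic can be sketched without any framework at all (names here are hypothetical, not React's API): show the expected value immediately, then reconcile when the server answers.

```javascript
// Framework-free sketch of the optimistic-update pattern: the UI state
// flips to the expected value synchronously, the real server result
// overwrites it later, and a failure rolls back to the previous value.
async function optimisticUpdate(state, optimisticValue, serverCall) {
  const previous = state.value;
  state.value = optimisticValue; // instant, "perceived" update
  try {
    state.value = await serverCall(); // reconcile with the real result
  } catch {
    state.value = previous; // roll back on failure
  }
  return state.value;
}

// Usage: the displayed value changes immediately; the server result lands later.
const state = { value: "old" };
const pending = optimisticUpdate(state, "saving…", async () => "saved");
// state.value is already "saving…" here, before the server responds
```

Nothing here is faster; the actual work takes just as long. It only moves the visible feedback off the network round-trip, which is exactly the "perceived performance" distinction above.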
So React has a performance issue in most places it is used, and probably in every project that lives long enough.
As to why it is slow, my knowledge isn't super up-to-date (I haven't kept up that well with recent updates), but in general the idea is:
- The React runtime itself is 40 kB, so before doing anything (before rendering in CSR, or before hydrating in SSR) you need to download the runtime first.
- Most frameworks have moved on to use signals to manage state updates. When state changes, observers of that state are notified and the least amount of code runs before updating the DOM surgically. React instead re-executes the code of entire component trees, compares the result with the current DOM and then applies changes. This is a lot more work and a lot slower. Over time, techniques have been developed in React to mitigate this (memoization, React Compiler, etc.), but it still does a lot more work than it needs to, and these techniques are often not needed in other frameworks because they do a lot less work by default.
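The signal model is small enough to sketch in a few lines. This is a toy illustration with made-up names, not any real framework's API, but it shows why a write can re-run only the code that actually reads that value:

```javascript
// Toy signal implementation: reads inside an effect register the effect
// as a subscriber; a write re-runs only those subscribers, instead of
// re-executing a whole component tree and diffing the output.
let currentEffect = null;

function createSignal(value) {
  const subscribers = new Set();
  const read = () => {
    if (currentEffect) subscribers.add(currentEffect); // track dependency
    return value;
  };
  const write = (next) => {
    value = next;
    subscribers.forEach((fn) => fn()); // notify only observers of THIS signal
  };
  return [read, write];
}

function createEffect(fn) {
  currentEffect = fn;
  fn(); // initial run registers which signals the effect reads
  currentEffect = null;
}

// Only the effect that reads `count` re-runs when `count` changes.
const [count, setCount] = createSignal(0);
const [name] = createSignal("app");
let runs = 0;
createEffect(() => { runs++; count(); });
createEffect(() => { name(); }); // never re-runs on count updates
setCount(1);
setCount(2);
// runs is now 3: one initial run plus one per count update
```

In the React model, both "effects" would correspond to components that re-execute on every ancestor update unless you memoize; in the signal model the dependency graph makes that skipping automatic.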
The js-framework-benchmark [1] publishes benchmarks testing hundreds of frameworks for every Chrome release if you're interested in that.
[1]: https://krausest.github.io/js-framework-benchmark/2025/table...
You're not answering my question, just adding some more feelings.
> The React runtime itself is 40 kB
React is < 10 kB compressed https://bundlephobia.com/package/react@19.2.0 (add react-dom to it). That's not really significant according to the author's figures; the header speaks about "up to 176.3 kB compressed".
> Most frameworks have moved on to use signals to manage state updates. When state changes
This is not about kilobytes or initial render times, but about rendering performance in a highly interactive application. Signals would not impact rendering a blog post, only rendering a complex app's UI. The original blog post does not measure this; it's out of scope.
Well you seemed surprised by this fact, even though it's a given for most people working in front-end frameworks.
> React is < 10 kb compressed https://bundlephobia.com/package/react@19.2.0 (add react-dom to it).
I don't know how Bundlephobia calculates package size; let me know if you're able to reproduce its numbers in a real app. The simplest Vite + React app with only a single "Hello, World" div and no dependencies (other than react and react-dom), no hooks used, ships 60+ kB of JS to the browser (when built for production, minified and gzipped).
Now, the blog post is not just using React but Next.js, which will ship even more JS because it includes a router and other things that are not part of React itself (which is just the component framework). There are leaner and more performant React meta-frameworks than Next.js (Remix, TanStack Start).
> This is not kilobytes or initial render times, but performance in rendering in a highly interactive application
True, but it's another area where React is a (relative) catastrophe.
The large bundle size on the other hand will definitely impact initial render times (in client-side rendering) and time-to-interactive (in SSR), because it's so much more JS that has to be parsed and executed for the runtime before even executing your app's code.
EDIT: It also does not have to be a highly interactive application at all for this to apply. If you change only a single value that is read in a component deep within a component tree, you will definitely feel the difference, because that entire component tree is going to execute again. Even though the resulting diff will show that only that deeply nested div needs to be updated, React has no way of knowing that beforehand, whereas signal-based frameworks do.
And finally I want to say I'm not a React hater. It's totally possible to get fast enough performance out of React. There are just more footguns to be aware of.
First, there's the separation between the generic cross-platform `react` package and the platform-specific reconcilers like `react-dom` and `react-native`. All the actual "React" logic is built into the reconciler packages (i.e., each contains a complete copy of the actual `react-reconciler` package plus all the platform-specific handling). So, bundle size has to measure both `react` and `react-dom` together.
Then, the contents of `react-dom` have changed over time. In React 18 they shifted the main entry point to be `react-dom/client`, which then ends up importing the right dev/prod artifacts (with `react-dom` still supported but deprecated):
- https://app.unpkg.com/react-dom@18.3.1/files/cjs
Then, in React 19, they restructured it further so that `react-dom` really only has a few utils, and all the logic is truly in the `react-dom/client` entry point:
- https://app.unpkg.com/react-dom@19.2.0/files/cjs/react-dom.d...
- https://app.unpkg.com/react-dom@19.2.0/files/cjs/react-dom-c...
So yes, the full prod bundle size is something like 60K min+gz, but it takes some work to see that. I don't think Bundlephobia handles it right at all - it's just automatically reading the main entry points for each package (and thus doesn't import `react-dom/client`). You can specify that with BundleJS, though:
- https://bundlejs.com/?q=react%2Creact-dom%2Fclient&treeshake...
> Bundle size is 193 kB -> 60.2 kB (gzip)
Now, let's talk about the comments, particularly the top comment. I have to say I find the kneejerk backlash against "AI style" incredibly counter-productive. These comments are creating noise on HN that greatly degrades the reading experience, and, in my humble opinion, these comments are in direct violation of all of the "In Comments" guidelines for HN: https://news.ycombinator.com/newsguidelines.html#comments
Happy to change my mind on this if anyone can explain to me why these comments are useful or informative at all.
As ever on mobile it's latency, not bandwidth, that's the issue. You can very happily transfer a lot of data, but if that network is in your interactive hot path then you will always have a significant delay.
You should use the available bandwidth to work around the latency issues, after FCP: preload as much data as possible so that navigations are instant.
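A rough sketch of what that can look like with standard resource hints (the paths here are made up for illustration):

```html
<!-- After first paint, hint the browser to warm the cache for likely
     next navigations. preload fetches for the current page at high
     priority; prefetch fetches for future navigations at low priority,
     off the interactive hot path. -->
<link rel="preload" href="/css/app.css" as="style">
<link rel="prefetch" href="/likely-next-page.html">
<link rel="prefetch" href="/api/dashboard.json">
```

The point is that the prefetches spend spare bandwidth in the background, so when the user clicks, the round-trip latency has already been paid.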
No, it is an excuse not to invest money in places where users won't pay.
For questions about mobile: yeah, we get requests for showing it on mobile, but an app in the app store is a hard requirement, because of discoverability. People know how to install an app from the app store, and then they have an icon. Making a PWA icon is still too much work for normal people.
I would need an "add to home screen" button on my website, so that the user could create the icon with a single click; then I could go with a PWA.
I'm not overly familiar with it, but we use it at work. I've no idea if I should expect it to be quicker or slower than something like Next.