
Posted by 0xblinq 10/28/2025

I built the same app 10 times: Evaluating frameworks for mobile performance (www.lorenstew.art)
237 points | 161 comments
gethly 10/28/2025|
I prefer to use whatever I'm most comfortable with over whatever is measurably the fastest horse in the stable. Trading dev time, skill and comfort for a few kB of memory and a few ms of speed seems pointless to me.

By the way, my "horse" of choice is Quasar (based on Vue), and has been for years now.

grebc 10/28/2025||
Thanks for posting, a lot of effort went into that and I think the quality shines through in the write up.

I write pretty lean HTML/vanilla JS apps on the front end and C#/SQL on the backend, and have had great customer success on mobile with a focus on a lot of the metrics the author hammers home.

econ 10/28/2025||
I believe the biggest performance hit lives in the inability to force-reload a cached file with JS (or even HTML(!)).

Setting a header only works if you know exactly when you are going to update the file. Apart from highly dynamic or sensitive things, you almost never know that in advance.

You can add ?v=2 to each and every instance of a URL on your website, but then you have to update all the pages that reference it, which is preposterous and exactly what we didn't want. As a bonus, the ?v=1 copy isn't evicted from the cache, which might also be just what you didn't want.

I never want to reload something until I do.

mpeg 10/28/2025||
This is a solved problem. All modern JavaScript bundlers append a content hash to the filename, so even if files are cached indefinitely, the JS that reaches the browser updates whenever it changes, because the URL changes with it.

There are also other solutions if you need to preserve the URL that are cleaner than appending a query string, such as ETags.
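To make that concrete, here's a toy sketch of content-hash naming using only Node built-ins (the helper and file names are made up; real bundlers like Vite/Rollup/webpack do this for you at build time):

    // hash-name.mjs - toy version of what a bundler's content hashing does
    import { createHash } from 'node:crypto';
    import { readFileSync, copyFileSync } from 'node:fs';

    // Hypothetical helper: copy app.js to app.<hash>.js and return the new name.
    function emitHashed(srcPath) {
      const contents = readFileSync(srcPath);
      const hash = createHash('sha256').update(contents).digest('hex').slice(0, 8);
      const outPath = srcPath.replace(/\.js$/, `.${hash}.js`);
      copyFileSync(srcPath, outPath);
      return outPath; // e.g. "app.3f9c2a1b.js", safe to cache forever
    }

    console.log(emitHashed('app.js')); // reference this name in your HTML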

econ 10/29/2025||
I want clean URLs. I don't want to update every page or dynamically stitch them together for each page view, and I don't want a server cache for dynamically stitched-together static content.

These are expensive hacks to work around a lack of basic functionality.

It reminds me of one kid taking something from another and refusing to give it back because they are larger.

People make websites, they think they control what goes on on the page. This isn't unreasonable to think. In fact, everything should be made to preserve that idea.

A situation where they just can't change the page shouldn't exist. Abstracting it away or otherwise working around it doesn't make it any less wrong.

Some browsers have a magic key combo to force reload. I suppose the solution is to put up a modal and ask the user to "reinstall" the web page.

I have a lot of static pages with minimal HTML/CSS that, thanks to lazy loading and caching, consume very little bandwidth. The technology is truly wonderful; clicking around feels like a desktop application.

mpeg 10/29/2025||
I'm not entirely sure what your point is. You don't want the complexity of a server cache, but you want "clean urls". Having a hash in the filename does not necessarily make it worse; in fact it's closer to a pure URL, since one of the principles of hypermedia is that each URL should point to a unique resource.

URLs like /v2/yourfile.js are probably closer to that philosophy. Or /[hash]/yourfile.js.

econ 10/30/2025||
Turns out we can both be wrong (especially me): fetch() does have various cache options.
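For reference, these are the standard Request cache modes (run inside an async context):

    // Bypass the HTTP cache entirely, refetch, and update the cache:
    const fresh = await fetch('/app.js', { cache: 'reload' });

    // Revalidate: send a conditional request (If-None-Match / If-Modified-Since)
    // and only download the body if it actually changed:
    const revalidated = await fetch('/app.js', { cache: 'no-cache' });

    // Prefer whatever is cached, even if stale; only hit the network on a miss:
    const cached = await fetch('/app.js', { cache: 'force-cache' });
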
econ 10/31/2025||
To needlessly elaborate a bit more.

Any query string may prevent caching in some browsers (not sure which, or whether they still do).

File paths, in my view, are for organizing files hierarchically, not for hashes or version numbers.

HTML like <img src="logo.jpg"> looks neat and sophisticated. You can teach it in 5 seconds. If more characters are needed, I expect something huge in return. For example, styling it individually or as a group of things is a huge benefit. Lazy loading is also HUGE.

mpeg 10/31/2025||
I've given you a list of options that you can use to cache your simple logo.jpg URL: use a bundler that automatically appends a hash (it can be a query string if you want, e.g. logo.jpg?h=xxxxx), use ETags so that the client checks with the server whether it has the latest version of a resource, or use the Last-Modified header to instruct the client to send an If-Modified-Since.

I'll give you an extra one I learnt about recently: you can use a custom compression dictionary (only available in Chrome right now), which means that even when a file needs to be redownloaded, the transfer is tiny, because it's compressed against a dictionary built from a previous version of the file.
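For anyone following along, a minimal sketch of the ETag/304 round trip with Node's built-in http module (file name and port are just examples):

    import { createServer } from 'node:http';
    import { readFileSync } from 'node:fs';
    import { createHash } from 'node:crypto';

    createServer((req, res) => {
      const body = readFileSync('./logo.jpg'); // assumed to exist
      const etag = '"' + createHash('sha1').update(body).digest('hex') + '"';

      if (req.headers['if-none-match'] === etag) {
        res.writeHead(304); // client's cached copy is still valid, no body sent
        return res.end();
      }
      res.writeHead(200, { ETag: etag, 'Cache-Control': 'no-cache', 'Content-Type': 'image/jpeg' });
      res.end(body);
    }).listen(8080);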

iainmerrick 10/28/2025||
The standard solution is to have small top-level HTML files with short expiration (or no caching at all), then all the other assets (CSS, JS, images) have content-hashed filenames and are cached indefinitely.

Vite gives you that behaviour out of the box.
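For reference, a rough sketch of the equivalent explicit config (Vite already does this by default, so you normally don't write it):

    // vite.config.js
    import { defineConfig } from 'vite';

    export default defineConfig({
      build: {
        rollupOptions: {
          output: {
            // content-hashed asset names, safe for "cache forever" headers
            entryFileNames: 'assets/[name]-[hash].js',
            chunkFileNames: 'assets/[name]-[hash].js',
            assetFileNames: 'assets/[name]-[hash][extname]',
          },
        },
      },
    });

The remaining piece is the server config: the HTML gets Cache-Control: no-cache (or a short max-age), the hashed assets get max-age=31536000, immutable.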

econ 10/29/2025||
I design and architect things specifically for the purpose they serve. What you describe, while popular and workable, is putting the cart before the horse.
maelito 10/28/2025||
If I trust this article, React is a (relative) catastrophe.

Can someone explain why? What precisely would make React sooo slow and big compared to other abstractions?

panstromek 10/28/2025||
I'm not exactly sure about "big", but it's slow because it has a worse change-tracking and rendering model, which requires it to do more work to figure out what needs to be updated, unless you manually opt out where you know it's safe. Solid, Vue and other signals-based frameworks have granular change tracking, so they can skip a lot of that work.

But this mostly applies to subsequent re-renders, while the things mentioned in the article are more about the initial render, and I'm not exactly sure why React suffers there. I believe React can't skip the VDOM on the server, while Vue or Solid use compiled templates that let them render directly to a string, so maybe it's partially that?

rk06 10/31/2025|||
When React came out, it was much faster than its competition, but then others learned from React and improved upon it.

React, however, didn't copy from the others, so it ended up slower than the "competition".

simjnd 10/28/2025|||
It is pretty established at this point that React has (relative) terrible performance. React isn't successful because it's a superior technology; it's successful despite being an inferior one. It's just really difficult to beat an extremely established technology: React has a huge ecosystem, so many companies depend on it that the job market for it is huge, etc.

As to why it is slow, my knowledge isn't super up to date (I haven't kept up that well with recent releases), but in general the idea is:

- The React runtime itself is 40 kB so before doing anything (before rendering in CSR or before hydrating in SSR) you need to download the runtime first.

- Most frameworks have moved on to using signals to manage state updates. When state changes, observers of that state are notified and the least amount of code possible runs before the DOM is updated surgically. React instead re-executes the code of entire component trees, compares the result with the previous render's virtual DOM, and then applies the changes to the real DOM. This is a lot more work and a lot slower (rough sketch after this list). Over time, techniques have been developed in React to mitigate this (memoization, React Compiler, etc.), but it still does a lot more work than it needs to, and these techniques are often not needed in other frameworks because they do a lot less work by default.
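A contrived sketch of the difference (imports shown for clarity; the two components would of course live in separate React and Solid projects):

    import { useState } from 'react';
    import { createSignal } from 'solid-js';

    // React: every click re-runs the whole component function, then diffs.
    function ReactCounter() {
      const [count, setCount] = useState(0);
      console.log('ReactCounter re-executed'); // logs on every update
      return <button onClick={() => setCount(c => c + 1)}>{count}</button>;
    }

    // Solid: the component runs once; only the expression reading count()
    // is re-evaluated when the signal changes.
    function SolidCounter() {
      const [count, setCount] = createSignal(0);
      console.log('SolidCounter ran once'); // logs a single time
      return <button onClick={() => setCount(c => c + 1)}>{count()}</button>;
    }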

The js-framework-benchmark [1] publishes benchmarks testing hundreds of frameworks for every Chrome release if you're interested in that.

[1]: https://krausest.github.io/js-framework-benchmark/2025/table...

maelito 10/28/2025||
> It is pretty established at this point that React has (relative) terrible performance.

> it is slow

You're not answering my question, just adding some more feelings.

> The React runtime itself is 40 kB

React is < 10 kb compressed https://bundlephobia.com/package/react@19.2.0 (add react-dom to it). That's not really significant compared to the author's figures; the header speaks of "up to 176.3 kB compressed".

> Most frameworks have moved on to use signals to manage state updates. When state change

This is not kilobytes or initial render times, but performance in rendering in a highly interactive application. It would not affect rendering a blog post, only a complex app's UI. The original blog post does not measure this; it's out of scope.

simjnd 10/28/2025||
> You're not answering my question, just adding some more feelings.

Well, you seemed surprised by this fact, even though it's a given for most people working with front-end frameworks.

> React is < 10 kb compressed https://bundlephobia.com/package/react@19.2.0 (add react-dom to it).

I don't know how Bundlephobia calculates package size; let me know if you're able to reproduce its numbers in a real app. The simplest Vite + React app, with only a single "Hello, World" div, no dependencies other than react and react-dom, and no hooks used, ships 60+ kB of JS to the browser (when built for production, minified and gzipped).

Now, the blog post is not using just React but Next.js, which ships even more JS because it includes a router and other things that are not part of React itself (React being just the component framework). There are leaner and more performant React meta-frameworks than Next.js (Remix, TanStack Start).

> This is not kilobytes or initial render times, but performance in rendering in a highly interactive application

True, but it's another area where React is a (relative) catastrophe.

The large bundle size, on the other hand, will definitely impact initial render times (in client-side rendering) and time-to-interactive (in SSR), because it's that much more JS that has to be downloaded, parsed and executed for the runtime before your app's code even runs.

EDIT: It also does not have to be a highly interactive application at all for this to apply. If you change a single value that is read in a component deep within a component tree, you will definitely feel the difference, because that entire component tree is going to execute again (even though the resulting diff will show that only one deeply nested div needs to be updated, React has no way of knowing that beforehand, whereas signal-based frameworks do).

And finally I want to say I'm not a React hater. It's totally possible to get fast enough performance out of React. There are just more footguns to be aware of.

acemarke 10/28/2025||
React's bundling system and published packages have gotten noticeably more complicated over time.

First, there's the separation between the generic cross-platform `react` package and the platform-specific reconcilers like `react-dom` and `react-native`. All the actual "React" logic is built into the reconciler packages (i.e., each contains a complete copy of the actual `react-reconciler` package plus all the platform-specific handling). So, bundle size has to measure both `react` and `react-dom` together.

Then, the contents of `react-dom` have changed over time. In React 18 they shifted the main entry point to be `react-dom/client`, which then ends up importing the right dev/prod artifacts (with `react-dom` still supported but deprecated):

- https://app.unpkg.com/react-dom@18.3.1/files/cjs

Then, in React 19, they restructured it further so that `react-dom` really only has a few utils, and all the logic is truly in the `react-dom/client` entry point:

- https://app.unpkg.com/react-dom@19.2.0/files/cjs/react-dom.d...

- https://app.unpkg.com/react-dom@19.2.0/files/cjs/react-dom-c...

So yes, the full prod bundle size is something like 60K min+gz, but it takes some work to see that. I don't think Bundlephobia handles it right at all; it just automatically reads the main entry point of each package (and thus doesn't import `react-dom/client`). You can specify that with BundleJS, though:

- https://bundlejs.com/?q=react%2Creact-dom%2Fclient&treeshake...

> Bundle size is 193 kB -> 60.2 kB (gzip)
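For reference, that measurement roughly corresponds to nothing more than the minimal entry point:

    // main.jsx - smallest realistic React 19 browser entry
    import { createRoot } from 'react-dom/client';

    createRoot(document.getElementById('root')).render(<h1>Hello, world</h1>);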

maelito 10/30/2025||
Thanks for the detailed explanations!
simjnd 10/31/2025||
You're welcome
jakubmazanec 10/28/2025|||
React isn't slow. It can be pretty fast (thanks to hooks like useTransition or useOptimistic, and now React Compiler, etc.) - it's just that it takes a lot of learning and work to use React correctly. Some people don't like that, and that's why other frameworks with different trade-offs exist.

The other thing is that React is too big in terms of kBs of JavaScript you have to download and then parse (and often, thanks to the great React ecosystem, you use many other libraries on top). But that's just another trade-off: it's the price you pay for great backwards compatibility (e.g. you can still use React class components, you don't have to use hooks, etc.).

simjnd 10/28/2025||
I want to preface this by saying I have nothing against React, I have used it professionally for a couple years and it's fine and perfectly good enough.

That being said, React is slow. That is why you need useTransition, which is essentially manual scheduling (letting React know some state update isn't very important so it can prioritise other things), something you don't need to do in other frameworks.

useOptimistic does not improve performance, but perceived performance. It lets you show an optimistic value while waiting for the real computation (or server round trip) to finish. Which is good: you want to improve perceived performance and make interactions feel instant. But it technically does not make React itself any faster.
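To make that concrete, a rough sketch of useTransition doing the manual scheduling (the component and data are made up):

    import { useState, useTransition } from 'react';

    function Search({ allItems }) {
      const [query, setQuery] = useState('');
      const [results, setResults] = useState(allItems);
      const [isPending, startTransition] = useTransition();

      function onChange(e) {
        setQuery(e.target.value); // urgent: keep the input responsive
        startTransition(() => {   // non-urgent: React may interrupt this work
          setResults(allItems.filter(item => item.includes(e.target.value)));
        });
      }

      return (
        <>
          <input value={query} onChange={onChange} />
          {isPending ? <p>Updating...</p> : <ul>{results.map(r => <li key={r}>{r}</li>)}</ul>}
        </>
      );
    }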

dominicrose 10/28/2025||
Honestly, I don't know about mobile apps, but React for a desktop website has no performance issues, provided that the code is of sufficient quality.
brazukadev 10/29/2025||
> provided that the code is of sufficient quality.

so React has a performance issue in most places it is used. And probably in every project that lives long enough.

alex-moon 10/28/2025||
I greatly appreciated this article and have found the data very useful - I have shared this with my business partner and we will use this information down the road when we (eventually) get around to migrating our app from Angular to something else. Neither of us were surprised to see Angular at the bottom of the league tables here.

Now, let's talk about the comments, particularly the top comment. I have to say I find the kneejerk backlash against "AI style" incredibly counter-productive. These comments are creating noise on HN that greatly degrades the reading experience, and, in my humble opinion, these comments are in direct violation of all of the "In Comments" guidelines for HN: https://news.ycombinator.com/newsguidelines.html#comments

Happy to change my mind on this if anyone can explain to me why these comments are useful or informative at all.

mrasong 10/28/2025||
This post made me open up the Svelte docs again.
gloosx 10/28/2025||
> when I built the first implementations and started measuring, something became clear: the issues I was seeing with Next.js weren’t specific to Next.js. They were fundamental to React’s architecture.

So here some obscure Next.js issues magically become fundamental React architecture issues. What are these? Skill issues?

brazukadev 10/29/2025|
That is a decision the React team made, and it alienated many of its users.
samwillis 10/28/2025||
This is a great comparison, but it depends so much on what sort of website or web app you are building. If you are building a content site where the majority of visitors arrive without a warm cache, bundle size is obviously massively important. But for a web app with users visiting regularly, it's somewhat less important.

As ever, on mobile it's latency, not bandwidth, that's the issue. You can happily transfer a lot of data, but if the network is in your interactive hot path, you will always have a significant delay.

You should optimise to use the available bandwidth to solve the latency issue after FCP: preload as much data as possible so that navigations are instant.
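A cheap version of that (the URLs are made up, and this assumes the responses are cacheable and the next navigations are predictable):

    // After load, warm the HTTP cache for likely next navigations,
    // keeping the work off the interactive hot path.
    window.addEventListener('load', () => {
      const idle = window.requestIdleCallback?.bind(window) ?? setTimeout;
      idle(() => {
        for (const url of ['/api/feed?page=2', '/img/next-article-hero.webp']) {
          fetch(url).catch(() => {}); // best-effort prefetch; errors are ignored
        }
      });
    });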

willsmith72 10/28/2025||
> 40ms round-trip time

In that case how can you possibly get 35ms FCP? Am I missing something?

ozim 10/28/2025|
> Let’s be honest: “desktop-only” is usually an excuse to skip performance discipline entirely

No, it is an excuse not to invest money in places where users won't pay.

As for mobile: yeah, we get requests for showing it on mobile, but an app in the app store is a hard requirement because of discoverability. People know how to install an app from the app store, and then they have an icon. Making a PWA icon is still too much work for normal people.

I would need an "add to home screen" button on my website, so that a user could create the icon with a single click; then I could go with a PWA.
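There is roughly that, though it's Chromium-only and gated on the usual PWA install criteria (manifest plus service worker). A sketch, assuming an installButton element exists on the page:

    let deferredPrompt;

    window.addEventListener('beforeinstallprompt', (e) => {
      e.preventDefault();           // suppress Chrome's own mini-infobar
      deferredPrompt = e;           // stash the event for our own button
      installButton.hidden = false; // "Add to home screen" button (assumed to exist)
    });

    installButton.addEventListener('click', async () => {
      deferredPrompt.prompt();      // show the native install dialog
      const { outcome } = await deferredPrompt.userChoice; // 'accepted' | 'dismissed'
      console.log('install prompt:', outcome);
      deferredPrompt = null;
    });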
