
Posted by 0xblinq 1 day ago

I built the same app 10 times: Evaluating frameworks for mobile performance (www.lorenstew.art)
228 points | 141 comments | page 2
grebc 1 day ago|
Thanks for posting, a lot of effort went into that and I think the quality shines through in the write up.

I write pretty lean HTML/vanilla JS apps on the front end & C#/SQL on the backend; and have had great customer success on mobiles with a focus on a lot of the metrics the author hammers home.

Akhu117 1 day ago||
Am I the only one shocked that there's no comparison, test, or even consideration of native development? Are web devs this closed off to other languages? I came here for that kind of comparison because of the article's headline.
gbalduzzi 1 day ago||
It's not about being closed to other languages, it's about being economically pragmatic in many, many cases.

As shown in the article, you can build an app ONCE that loads in milliseconds, just by providing a URL to any potential customer. It works on mobile and on desktop, on any operating system.

The native alternative requires:

- Separate development for every platform you target (to be widely used you need *at least* iOS, Android, macOS and Windows).

- Customers having to download and install something before using your platform, creating additional friction.

And all of this to obtain at most 20-30ms better loading times?

There are plenty of cases where native makes sense and is necessary, but most apps have very little to gain at the cost of a massive increase in development resources.

ale 1 day ago|||
Native to the web like web components or a native platform?
croes 1 day ago||
The problem of native apps isn't the language but the app stores.

Web deployment is easier, faster and cheaper.

econ 1 day ago||
I believe the biggest performance hit lies in the inability to force-reload a cached file with JS (or even HTML(!)).

Setting a header only works if you know exactly when you are going to update the file. Except for highly dynamic or sensitive things, this is never the case.

You can add ?v=2 to each and every instance of a URL on your website, but then you have to update all the pages, which is preposterous and exactly what we didn't want. As a bonus, ?v=1 is not erased, which might also be just what you didn't want.

I never want to reload something until I do.

mpeg 1 day ago||
This is a solved problem. All modern JavaScript bundlers append a hash to the filename, so even if cached indefinitely, the JS that hits the browser will update when it has changed, because the URL will change.

There are also other solutions if you need to preserve the url that are cleaner than appending a query string, like etags

econ 7 hours ago||
I want clean URLs; I don't want to update every page or dynamically stitch them together for each page view, and I don't want a server cache for dynamically stitched-together static content.

These are expensive hacks to work around a lack of basic functionality.

It reminds me of one kid taking something from another and refusing to give it back because they are larger.

People make websites, they think they control what goes on on the page. This isn't unreasonable to think. In fact, everything should be made to preserve that idea.

A situation where they just can't change the page shouldn't exist. Abstracting it away or otherwise working around it doesn't make it any less wrong.

Some browsers have a magic key combo to force reload. I suppose the solution is to put up a modal and ask the user to "reinstall" the web page.

I have a lot of static pages with minimal html/css that thanks to lazy loading and caching consume very little bandwidth. The technology is truly wonderful, clicking around feels like a desktop application.

iainmerrick 1 day ago||
The standard solution is to have small top-level HTML files with short expiration (or no caching at all), then all the other assets (CSS, JS, images) have content-hashed filenames and are cached indefinitely.

Vite gives you that behaviour out of the box.
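As a sketch of that policy (assuming hashed filenames like `app.3f2a9c1d.js`; `cacheHeaderFor` is a hypothetical helper, not part of Vite or any server framework):

```javascript
// HTML entry points always revalidate; content-hashed assets cache "forever".
const HASHED = /\.[0-9a-f]{8,}\.(js|css|png|svg|woff2)$/;

function cacheHeaderFor(path) {
  if (path.endsWith('.html') || path === '/') {
    return 'no-cache';                  // small entry point, always revalidate
  }
  if (HASHED.test(path)) {
    // URL changes when content changes, so indefinite caching is safe
    return 'public, max-age=31536000, immutable';
  }
  return 'public, max-age=3600';        // conservative default for the rest
}
```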

econ 7 hours ago||
I design and architect things specifically for the purpose they serve. What you describe, while popular and workable, is putting the cart before the horse.
maelito 1 day ago||
If I trust this article, React is a (relative) catastrophe.

Can someone explain why? What precisely would make React sooo slow and big compared to other abstractions?

panstromek 1 day ago||
I'm not exactly sure about "big", but it's slow because it has a coarser change-tracking and rendering model, which requires more work to figure out what needs to be updated, unless you manually opt out where you know better. Solid, Vue and other signal-based frameworks have granular change tracking, so they can skip a lot of that work.

But this mostly applies to subsequent re-renders, while the things mentioned in the article are more about the initial render, and I'm not exactly sure why React suffers there. I believe React can't skip the VDOM on the server, while Vue and Solid use compiled templates that let them render directly to a string, so maybe it's partially that?

jakubmazanec 1 day ago|||
React isn't slow. It can be pretty fast (thanks to hooks like useTransition or useOptimistic, and now React Compiler, etc.) - it's just that it takes a lot of learning and work to use React correctly. Some people don't like that, and that's why other frameworks with different trade-offs exist.

The other thing is that React is too big in terms of kilobytes of JavaScript you have to download and then parse (and often, thanks to the great React ecosystem, you use many other libraries). But that's just another trade-off: it's the price you pay for great backwards compatibility (e.g. you can still use React class components; you don't have to use hooks, etc.).

simjnd 1 day ago||
I want to preface this by saying I have nothing against React, I have used it professionally for a couple years and it's fine and perfectly good enough.

That being said React is slow. That is why you need useTransition, which is essentially manual scheduling (letting React know some state update isn't very important so it can prioritise other things) which you don't need to do in other frameworks.

useOptimistic does not improve performance, but perceived performance. It lets you show a placeholder of a value while waiting for the real computation to happen. Which is good, you want to improve perceived performance and make interactions feel instant. But it technically does not improve React's performance.

dominicrose 1 day ago|||
Honestly I don't know about mobile apps but React for a desktop website has no performance issue provided that the code is of sufficient quality.
brazukadev 13 hours ago||
> provided that the code is of sufficient quality.

so React has a performance issue in most places it is used. And probably in every project that lives long enough.

simjnd 1 day ago||
It is pretty established at this point that React has (relative) terrible performance. React isn't successful because it's a superior technology, it's successful despite being an inferior technology. It's just really difficult to beat an extremely established technology and React has a huge ecosystem, so many companies depend on it that the job market for it is huge, etc.

As to why it is slow, my knowledge isn't super up-to-date (I haven't kept up that well with recent updates), but in general the idea is:

- The React runtime itself is 40 kB, so before doing anything (before rendering in CSR, or before hydrating in SSR) you need to download the runtime first.

- Most frameworks have moved on to use signals to manage state updates. When state changes, observers of that state are notified and the least amount of code is run before updating the DOM surgically. React instead re-executes the code of entire component trees, compares the result with the current DOM, and then applies changes. This is a lot more work and a lot slower. Over time, techniques have been developed in React to mitigate this (memoization, React Compiler, etc.), but it still does a lot more work than it needs to, and these techniques are often not needed in other frameworks because they do a lot less work by default.
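To make the signals point concrete, here is a toy signal implementation (not any framework's real API) showing why a write re-runs only its dependents instead of re-rendering a whole tree:

```javascript
// Reads inside an effect register a dependency; a write re-runs only
// the effects that actually read that signal.
let currentEffect = null;

function createSignal(value) {
  const subscribers = new Set();
  const read = () => {
    if (currentEffect) subscribers.add(currentEffect); // track dependency
    return value;
  };
  const write = (next) => {
    value = next;
    subscribers.forEach((fn) => fn()); // surgical update: only dependents run
  };
  return [read, write];
}

function createEffect(fn) {
  currentEffect = fn;
  fn();                                // first run registers dependencies
  currentEffect = null;
}
```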

The js-framework-benchmark [1] publishes benchmarks testing hundreds of frameworks for every Chrome release if you're interested in that.

[1]: https://krausest.github.io/js-framework-benchmark/2025/table...

maelito 1 day ago||
> It is pretty established at this point that React has (relative) terrible performance.

> it is slow

You're not answering my question, just adding some more feelings.

> The React runtime itself is 40 kB

React is < 10 kB compressed (https://bundlephobia.com/package/react@19.2.0 - add react-dom to it). That's not really significant given the author's figures; the header speaks about up to 176.3 kB compressed.

> Most frameworks have moved on to use signals to manage state updates. When state change

This is not about kilobytes or initial render times, but about rendering performance in a highly interactive application. It would not impact rendering a blog post, only rendering a complex app's UI. The original blog post does not measure this; it's out of scope.

simjnd 1 day ago||
> You're not answering my question, just adding some more feelings.

Well you seemed surprised by this fact, even though it's a given for most people working in front-end frameworks.

> React is < 10 kb compressed https://bundlephobia.com/package/react@19.2.0 (add react-dom to it).

I don't know how bundlephobia calculates package size; let me know if you're able to reproduce those numbers in a real app. The simplest Vite + React app with only a single "Hello, World" div and no dependencies (other than react and react-dom), no hooks used, ships 60+ kB of JS to the browser (when built for production, minified and gzipped).

Now the blog post is not just using React but Next.js which will ship even more JS because it will include a router and other things that are not a part of React itself (which is just the component framework). There are leaner and more performant React Meta-Frameworks than Next.js (Remix, TanStack Start).

> This is not kilobytes or initial render times, but performance in rendering in a highly interactive application

True, but it's another area where React is a (relative) catastrophe.

The large bundle size on the other hand will definitely impact initial render times (in client-side rendering) and time-to-interactive (in SSR), because it's so much more JS that has to be parsed and executed for the runtime before even executing your app's code.

EDIT: It also does not have to be a highly interactive application at all for this to apply. If you change a single value that is read in a component deep within a component tree, you will definitely feel the difference, because that entire component tree is going to execute again. (Even though the resulting diff will show that only one deeply nested div needs to be updated, React has no way of knowing that beforehand, whereas signal-based frameworks do.)

And finally I want to say I'm not a React hater. It's totally possible to get fast enough performance out of React. There are just more footguns to be aware of.

acemarke 1 day ago||
React's bundling system and published packages have gotten noticeably more complicated over time.

First, there's the separation between the generic cross-platform `react` package and the platform-specific reconcilers like `react-dom` and `react-native`. All the actual "React" logic is built into the reconciler packages (i.e., each contains a complete copy of the `react-reconciler` package plus all the platform-specific handling). So, bundle size has to be measured across `react` and `react-dom` together.

Then, the contents of `react-dom` have changed over time. In React 18 they shifted the main entry point to be `react-dom/client`, which then ends up importing the right dev/prod artifacts (with `react-dom` still supported but deprecated):

- https://app.unpkg.com/react-dom@18.3.1/files/cjs

Then, in React 19, they restructured it further so that `react-dom` really only has a few utils, and all the logic is truly in the `react-dom/client` entry point:

- https://app.unpkg.com/react-dom@19.2.0/files/cjs/react-dom.d...

- https://app.unpkg.com/react-dom@19.2.0/files/cjs/react-dom-c...

So yes, the full prod bundle size is something like 60K min+gz, but it takes some work to see that. I don't think Bundlephobia handles it right at all - it's just automatically reading the main entry points for each package (and thus doesn't import `react-dom/client`). You can specify that with BundleJS though:

- https://bundlejs.com/?q=react%2Creact-dom%2Fclient&treeshake...

> Bundle size is 193 kB -> 60.2 kB (gzip)

alex-moon 1 day ago||
I greatly appreciated this article and have found the data very useful - I have shared this with my business partner and we will use this information down the road when we (eventually) get around to migrating our app from Angular to something else. Neither of us were surprised to see Angular at the bottom of the league tables here.

Now, let's talk about the comments, particularly the top comment. I have to say I find the kneejerk backlash against "AI style" incredibly counter-productive. These comments are creating noise on HN that greatly degrades the reading experience, and, in my humble opinion, these comments are in direct violation of all of the "In Comments" guidelines for HN: https://news.ycombinator.com/newsguidelines.html#comments

Happy to change my mind on this if anyone can explain to me why these comments are useful or informative at all.

samwillis 1 day ago||
This is a great comparison, but it depends so much on what sort of website or web app you are building. If you are building a content site, with the majority of visitors arriving without a hot cache bundle size is obviously massively important. But for a web app, with users regularly visiting, it's somewhat less important.

As ever on mobile it's latency, not bandwidth, that's the issue. You can very happily transfer a lot of data, but if that network is in your interactive hot path then you will always have a significant delay.

You should optimise to use the available bandwidth to solve the latency issues, after FCP. Preload as much data as possible such that navigations are instant.
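As a sketch of that idea (`prefetch`, `navigate` and `fetchPage` are hypothetical names): keep the prefetch promise keyed by URL, so a later navigation awaits data that is usually already in flight or done:

```javascript
// Use idle bandwidth after first paint to hide latency on navigation.
const pageCache = new Map();

function prefetch(url, fetchPage) {
  if (!pageCache.has(url)) {
    pageCache.set(url, fetchPage(url)); // store the promise, not the result
  }
  return pageCache.get(url);
}

async function navigate(url, fetchPage) {
  // Hits the cache if prefetched; otherwise pays the network latency now.
  return prefetch(url, fetchPage);
}
```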

mrasong 1 day ago||
This post made me open up the Svelte docs again.
ozim 1 day ago||
> Let’s be honest: “desktop-only” is usually an excuse to skip performance discipline entirely

No, it is an excuse not to invest money in places where users won't pay.

For questions about mobile: yeah, we get requests for showing it on mobile, but an app in the app store is a hard requirement because of discoverability. People know how to install an app from the app store, and then they have an icon. Making a PWA icon is still too much work for normal people.

I would need an "add to home screen" button on my website so that a user could create the icon with a single click; then I could go with a PWA.
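Something close to that already exists in Chromium browsers via the `beforeinstallprompt` event; a sketch (Chromium-only, and the wiring to an actual button element is left out):

```javascript
// Stash the install-prompt event, then trigger it from your own button.
let deferredPrompt = null;

function onBeforeInstallPrompt(event) {
  event.preventDefault();   // suppress the browser's mini-infobar
  deferredPrompt = event;   // keep the event so we can prompt later
}

async function onInstallClick() {
  if (!deferredPrompt) return null;
  deferredPrompt.prompt();                             // show native dialog
  const { outcome } = await deferredPrompt.userChoice; // 'accepted' | 'dismissed'
  deferredPrompt = null;
  return outcome;
}

if (typeof window !== 'undefined') {
  window.addEventListener('beforeinstallprompt', onBeforeInstallPrompt);
}
```

iOS Safari doesn't fire this event, though, so there the user is still stuck doing "Add to Home Screen" manually.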

ochre-ogre 19 hours ago||
Any reason the url is '10-kanban-boards'? I noticed this when I copied the url to share.
hi_hi 1 day ago|
I'd be interested in seeing React Native in this comparison.

I'm not overly familiar with it, but we use it at work. I've no idea if I should expect it to be quicker or slower than something like Next.

koolala 1 day ago|
What do you hope to see from the result of that comparison?
hi_hi 1 day ago||
To gauge where RN sits on the spectrum of fast to slow.