But WebGL comes with drawbacks:
- You need JS code running before anything shows up.
- Shaders can’t directly manipulate the DOM render. To make refraction work, you’d have to re-render everything into a canvas—which isn’t really “the web” anymore.
With the SVG/CSS approach, you can pre-render the displacement map (at build time or on the backend) and get the refraction visible on the very first frame. Plus, it integrates cleanly with existing, traditional UIs.
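For anyone curious, the core of that pattern is roughly this (the filter id, the map file name, and the scale value here are illustrative placeholders, not the exact ones from the article):

```html
<!-- a pre-rendered displacement map drives an SVG filter,
     which is then applied to the backdrop via CSS -->
<svg width="0" height="0">
  <filter id="glass">
    <!-- displacement-map.png is a placeholder for the pre-rendered map -->
    <feImage href="displacement-map.png" result="map"/>
    <!-- shift backdrop pixels by the offsets encoded in the map's R/G channels -->
    <feDisplacementMap in="SourceGraphic" in2="map"
                       scale="40" xChannelSelector="R" yChannelSelector="G"/>
  </filter>
</svg>

<style>
  .liquid-glass {
    /* Chromium-only today: an SVG filter referenced from backdrop-filter */
    backdrop-filter: url(#glass);
  }
</style>
```

Because the map is just an image, it can be generated at build time and shipped as a static asset, which is what makes the refraction visible on the very first frame.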
That said, this approach could definitely be improved. Ideally we’d have shader-like features in the SVG Filter spec (there was a proposal, but it seems abandoned). There are some matrix operations available in SVG Filters, but they’re limited—and for my first blog post I wanted to focus more on pedagogy, art, and technique than heavy optimization.
I planned to fix the performance issues before posting here (since I knew HN would be quick to point that out), but somebody posted it first. You’re absolutely right — it’s pretty slow right now and needs optimization.
And it’s not just the refraction/displacement map: plenty of other parts, like visualisations, aren’t optimized yet either.
It ran perfectly smoothly with no perf hit on a 2020 M1 MBA. There are no issues with this.
Man, the PTSD that AIs have given us from sentences like this.
(Safari still seems to be a bit slow to render SVGs.)
Anyway, I did not expect this blog post to be on HN, so there are still things to improve on it.
But I don’t think CSS can leverage the GPU in most (any?) cases. Apple has almost certainly baked something into the silicon to help handle the UI.
Chrome-only demo
The interactive demo at the end currently works in Chrome only (due to SVG filters as backdrop-filter).
You can still read the article and interact with the inline simulations in other browsers.
Dishonor on your WHOLE FAMILY! Dishonor on you, dishonor on your cow... Besides that, I'm very impressed by the article presentation.
PS: Neat website and explanations. But talking about liquid glass as a design principle in general, I would rather UI elements on a random website not use that much GPU for not-great reasons, but maybe that's my problem of not thinking different.
These are in the specification here: https://drafts.fxtf.org/filter-effects-1/#typedef-filter-url
And used by backdrop-filter here: https://drafts.fxtf.org/filter-effects-2/#BackdropFilterProp...
It works in Chromium-based browsers, but it does not look great; it probably needs some filtering.
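If it helps, one cheap thing to try (an untested guess on my part, with placeholder names and values) is softening the displacement map before it drives the displacement:

```html
<svg width="0" height="0">
  <filter id="glass-smooth">
    <!-- displacement-map.png is a placeholder for the pre-rendered map -->
    <feImage href="displacement-map.png" result="map"/>
    <!-- blur the map a little so neighbouring pixels get displaced
         by similar amounts, which hides banding and jaggies -->
    <feGaussianBlur in="map" stdDeviation="1.5" result="softMap"/>
    <feDisplacementMap in="SourceGraphic" in2="softMap"
                       scale="40" xChannelSelector="R" yChannelSelector="G"/>
  </filter>
</svg>
```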
The stuttering has already been pointed out here so I won’t pile on.
This is "just" a glass shader.
Here is an implementation: https://codepen.io/lenymo/pen/pJzWVy
One thing I'd say is to apply some anti-aliasing (MSAA, SMAA?)—even on a 4K display with a pixel density of 64.3 px/cm, the jaggies are visible, especially because of the extreme contrast of the caustics behind the dark background.
Regardless, this is a great writeup for changes I wish to never see in ordinary UI.
That said, I've seen many attempts to recreate the effect on the web, but you've outdone them all. The variety and mathematical modeling of edge shapes elevates this implementation above the rest.
If you decide to continue with this, I would love to see:
1. chromatic aberration along displaced areas (rough sketch below)
2. higher resolution in the refraction
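For (1), one way I could imagine doing it purely with SVG filters (an untested sketch; the scales and map name are made up) is to split the backdrop into R/G/B, displace each channel with a slightly different scale, and add them back together:

```html
<svg width="0" height="0">
  <filter id="glass-ca">
    <feImage href="displacement-map.png" result="map"/>

    <!-- keep only the red channel, displaced a bit less -->
    <feColorMatrix in="SourceGraphic" result="r" type="matrix"
      values="1 0 0 0 0  0 0 0 0 0  0 0 0 0 0  0 0 0 1 0"/>
    <feDisplacementMap in="r" in2="map" scale="36"
      xChannelSelector="R" yChannelSelector="G" result="rd"/>

    <!-- green channel at the base scale -->
    <feColorMatrix in="SourceGraphic" result="g" type="matrix"
      values="0 0 0 0 0  0 1 0 0 0  0 0 0 0 0  0 0 0 1 0"/>
    <feDisplacementMap in="g" in2="map" scale="40"
      xChannelSelector="R" yChannelSelector="G" result="gd"/>

    <!-- blue channel displaced a bit more -->
    <feColorMatrix in="SourceGraphic" result="b" type="matrix"
      values="0 0 0 0 0  0 0 0 0 0  0 0 1 0 0  0 0 0 1 0"/>
    <feDisplacementMap in="b" in2="map" scale="44"
      xChannelSelector="R" yChannelSelector="G" result="bd"/>

    <!-- sum the three displaced channels back into one image -->
    <feComposite in="rd" in2="gd" operator="arithmetic" k2="1" k3="1" result="rg"/>
    <feComposite in="rg" in2="bd" operator="arithmetic" k2="1" k3="1"/>
  </filter>
</svg>
```

It does triple the displacement work, so it would probably make the performance situation worse, but it's the obvious way to get colour fringing along the displaced edges.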
Many people are discussing performance issues, but this runs like butter on my M3 Pro.