
Posted by bentocorp 10/23/2024

How JPEG XL compares to other image codecs(cloudinary.com)
156 points | 71 comments
Theodores 10/27/2024|
Few realise that JPEG was designed for flickery low resolution analogue CRTs and slow CPUs. Nowadays we have digital high resolution screens and fast CPUs.

JPEG XL has been coming soon for as long as I can remember.

Nowadays I like to serve WebP from a CDN with filenames ending in .jpg even though the payload is WebP. This means listening to the request headers and serving whichever format the client supports.

If someone right-clicks and downloads, they get a genuine JPEG, not a WebP. Same if they use 'wget'. This is because those request headers are different: they don't advertise WebP support.

Browsers do not care about filename extensions; they go by the Content-Type header. Hence you can serve a WebP as 'example.jpg'.

The benefit of doing this, and of having the CDN forward the request headers to the origin server (so it can encode WebP on the fly), is that the rest of the team, including those who upload images, can work entirely in the JPEG space they know.
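The negotiation described above can be sketched in a few lines. This is a minimal illustration, not the commenter's actual CDN setup; the function name and file paths are hypothetical.

```python
# Sketch of Accept-header negotiation: serve WebP bytes under a .jpg URL
# when the client advertises WebP support, otherwise a genuine JPEG.

def choose_variant(accept_header: str) -> str:
    """Pick the on-disk variant for a request to /photo.jpg."""
    if "image/webp" in accept_header:
        return "photo.webp"  # browsers: WebP bytes, still /photo.jpg in the URL
    return "photo.jpg"       # wget / right-click save: a real JPEG

# A modern browser typically sends something like:
#   Accept: image/avif,image/webp,image/apng,*/*
print(choose_variant("image/avif,image/webp,*/*"))  # photo.webp
print(choose_variant("*/*"))                        # photo.jpg (wget's default)
```

Note that the response serving WebP must also carry `Content-Type: image/webp`, and `Vary: Accept` is needed so CDN and browser caches keep the two variants apart.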

The mozjpeg encoder is really neat but all browsers support webp these days. Mozjpeg is JPEG for digital hi-res screens and fast CPUs. Brilliant, but too late.

What I am interested in now is larger colour spaces than sRGB. This requires a whole toolchain that supports P3 (or whatever) too.

I tried AVIF but in practical testing, webP was simply better.

Jpegli from Google is what I want for the toolchain, with the CDN supplying WebP. In the vast majority of applications nobody cares about your images, so it is best to go for highly compressed WebP with artefacts just about to cut in. You also have retina screens nowadays, so using all those pixels, typically at 1.5x, is better than 1x: more pixels at lower quality is the better trade-off.
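The "more pixels at lower quality" trade-off can be made concrete with some back-of-the-envelope arithmetic. The bits-per-pixel figures here are assumed for illustration, not measurements from the thread:

```python
# A 1.5x "retina" image has 1.5**2 = 2.25x the pixels of the 1x version.
base_px = 800 * 600
retina_px = int(base_px * 1.5 ** 2)  # 2.25x the pixels

# If heavier compression halves the bits-per-pixel (hypothetical numbers),
# the retina file costs only ~12% more bytes than the 1x file.
bpp_1x, bpp_retina = 1.0, 0.5   # assumed bits per pixel
size_1x = base_px * bpp_1x / 8
size_retina = retina_px * bpp_retina / 8
print(round(size_retina / size_1x, 3))  # 1.125
```

So a modest file-size increase can buy a substantially sharper image on high-DPI screens, which is the trade-off being argued for.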

lifthrasiir 10/27/2024||
> Few realise that JPEG was designed for flickery low resolution analogue CRTs and slow CPUs.

To my knowledge, JPEG was initially considered slow to decode on contemporary CPUs. For example, a Usenet post from 1992 [1] stated that decoding "big" images like 800x600 took 1--2 seconds, which was considered fast at the time. JPEG was eventually used as the basis of the MPEG-1 video codec, so it couldn't have been too slow, but I'm not aware of other concrete performance concerns.

[1] https://groups.google.com/g/comp.compression/c/oabAk3UibhU/m...

SG- 10/27/2024|||
JPEG XL has been here a while, but Google has decided to avoid it even though Apple has adopted it. At the same file size, WebP has worse color banding, far more trouble keeping small details accurate, and compresses too aggressively.
mort96 10/27/2024|||
> JPEG XL has been coming soon for as long as I can remember.

You must not have a very long memory. The Joint Photographic Experts Group (JPEG) published a call for proposals for what would become JPEG-XL in 2018. That's not so long ago.

Are you sure you're not thinking about JPEG-2000?

Theodores 10/29/2024|||
Yes, I stand corrected!
jampekka 10/27/2024|||
JPEG-2000 may have poisoned the well. I still, for some irrational reason, get a flashback of it whenever JPEG XL is mentioned.

Maybe JPEG earned that by pushing a deliberately patent-infested codec with horrible proprietary implementations? And at a time when the god damn GIF patents were causing grief and the H.264 trolls were on the rise.

Mr_Minderbinder 10/28/2024|||
I do not consider JPEG 2000 to be a failure even if it did not displace the old JPEG in all applications. JPEG 2000 “succeeded”, just not on the Web. Most people have probably seen more JPEG 2000s than all other image formats combined: whenever you watch a movie in theatres you see 24 of them every second, roughly 173,000 for a two-hour feature. There were a few open implementations, like OpenJPEG, which is the one I use.
mort96 10/27/2024|||
Could it be argued that a format being patent-encumbered was less in focus back then; that "codecs are covered by patents and users of a codec must pay royalties to the patent holders" was just, more or less, the unquestioned way things were for groups like ISO? The original JPEG was also patent-encumbered, after all.

This is not a rhetorical question, I was way too young back then to have been aware of the conversations which were going on regarding JPEG-2000. I see that PNG was developed specifically to be unencumbered, which suggests that at least some groups were focused on it, to your point. But I can't tell if that was a mainstream concern at the time like it is now.

troupo 10/27/2024||
> Nowadays we have digital high resolution screens and fast CPUs.

The problem is that even fast CPUs may have their limits.

A MacBook with an M1 Pro spends up to a second decoding and rendering a HEIC image where comparable JPG and PNG images render instantly. And that is an 11-year-old codec.

greenavocado 10/27/2024||
I investigated using JPEG XL for high-speed applications, but encoding time was much slower than JPEG with libturbojpeg, even with encoder complexity reduced to a minimum.
SG- 10/27/2024||
it's much, much faster than the other alternatives tho. and staying with JPEG isn't really ideal, with its low quality and limited options going forward.
doublepg23 10/27/2024||
doesn't mozjpeg get parity with WebP? https://siipo.la/blog/is-webp-really-better-than-jpeg
lonjil 10/27/2024||
Not for a long time. The WebP encoder has improved a lot since MozJPEG was released. But these days we have Jpegli [1] which beats WebP at higher quality levels.

[1] https://github.com/google/jpegli/

dvhh 10/27/2024||
I have had a similar experience: JPEG and WebP encoding use far fewer computing resources than JPEG XL or AV1. I was curious what other people used (as I might be using the wrong library).
JyrkiAlakuijala 10/27/2024||
MozJPEG encoding times are not that great but decoding is still fast.

I believe with Jpegli you can have faster encoding than with MozJPEG.

hulitu 10/29/2024||
> The JPEG XL reference encoder (cjpegxl) produces, by default, a well-compressed image that is indistinguishable from (or, in some cases, identical to) the original. In contrast, other image formats typically have an encoder with which you can select a quality setting, where quality is not really defined perceptually.

We are the best, so fuck the rest. /s

Using vague language to claim superiority is not a sign of intelligence.