Posted by bentocorp 4 days ago
I believe Jpegli gives you faster encoding than MozJPEG.
JPEG XL has been coming soon for as long as I can remember.
Nowadays I like to serve WebP from a CDN with filenames ending in .jpg, even though the payload is WebP. This means listening to the request's Accept header and serving whatever format the client supports.
If someone right-clicks and downloads, they get a genuine JPEG, not WebP. Same if they do a 'wget'. This is because those requests send different headers, without WebP listed as supported.
Browsers do not care about filename extensions; they go by the Content-Type header. Hence you can serve WebP as 'example.jpg'.
The benefit of doing this, and of having the CDN forward the request headers to the origin server (so it can encode WebP on the fly), is that the rest of the team, including those who upload images, can work entirely in the JPEG space they know.
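For illustration, here is a minimal sketch of that negotiation in plain Python (stdlib only). The asset directory, filenames and port are assumptions; a real CDN/origin setup would add caching, ETags and error handling on top.

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from pathlib import Path

    ASSET_DIR = Path("assets")  # hypothetical: holds photo.jpg and photo.webp

    class ImageHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # The URL always says .jpg; the Accept header decides the bytes.
            wants_webp = "image/webp" in self.headers.get("Accept", "")
            stem = Path(self.path.lstrip("/")).stem  # "photo" from "/photo.jpg"
            ext, ctype = (".webp", "image/webp") if wants_webp else (".jpg", "image/jpeg")
            body = (ASSET_DIR / (stem + ext)).read_bytes()
            self.send_response(200)
            self.send_header("Content-Type", ctype)
            self.send_header("Content-Length", str(len(body)))
            # Vary tells caches to key on Accept, so wget (which does not
            # advertise image/webp) still gets the genuine JPEG bytes.
            self.send_header("Vary", "Accept")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8000), ImageHandler).serve_forever()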
The MozJPEG encoder is really neat, but all browsers support WebP these days. MozJPEG is JPEG tuned for hi-res digital screens and fast CPUs. Brilliant, but too late.
What I am interested in now is wider colour spaces than sRGB. That requires a whole toolchain that supports P3 (or whatever) too.
I tried AVIF, but in practical testing WebP was simply better.
Jpegli from Google is what I want for the toolchain, with the CDN supplying WebP. In the vast majority of applications nobody scrutinises your images, so it is best to go for highly compressed WebP with artefacts just about to cut in. You also have retina screens nowadays, so using all those pixels, typically at 1.5x density, beats 1x: more pixels at lower quality is the better trade-off.
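As a rough sketch of that "more pixels, lower quality" trade-off using Pillow (the CSS width, the 1.5x factor and the quality value are illustrative assumptions, not recommendations):

    from PIL import Image

    def encode_for_web(src_path, dst_path, css_width=800):
        img = Image.open(src_path)
        # Target 1.5x the CSS pixel width for retina screens.
        target_w = int(css_width * 1.5)
        target_h = round(img.height * target_w / img.width)
        img = img.resize((target_w, target_h), Image.LANCZOS)
        # Heavily compressed WebP: quality just above where artefacts cut in.
        img.save(dst_path, "WEBP", quality=50, method=6)

    encode_for_web("photo.jpg", "photo.webp")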
To my knowledge, JPEG was initially quite slow to decode on contemporary CPUs. For example, a Usenet post from 1992 [1] stated that decoding "big" images like 800x600 took 1-2 seconds, which was considered fast at the time. JPEG was eventually used as the basis of the MPEG-1 video codec, so it couldn't have been too slow, but I'm not aware of other concrete performance concerns.
[1] https://groups.google.com/g/comp.compression/c/oabAk3UibhU/m...
You must not have a very long memory. The Joint Photographic Experts Group (JPEG) published a call for proposals for what would become JPEG-XL in 2018. That's not so long ago.
Are you sure you're not thinking about JPEG-2000?
Maybe JPEG earned that by pushing a deliberately patent-infested codec with horrible proprietary implementations? And at a time when the goddamn GIF patents were causing grief and H.264 trolls were on the rise.
This is not a rhetorical question; I was way too young back then to have been aware of the conversations going on around JPEG-2000. I see that PNG was developed specifically to be unencumbered, which suggests that at least some groups were focused on patent freedom, to your point. But I can't tell whether that was a mainstream concern at the time the way it is now.
The problem is that even fast CPUs may have their limits.
A MacBook with an M1 Pro spends up to a second decoding and rendering a HEIC image where comparable JPEG and PNG images render instantly. And this is an 11-year-old codec.
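If you want to measure this yourself, here is a rough timing sketch (assumes 'pip install pillow pillow-heif'; the file names are hypothetical stand-ins for comparable images):

    import time
    from PIL import Image
    from pillow_heif import register_heif_opener

    register_heif_opener()  # lets Pillow open .heic via libheif

    def decode_time(path):
        start = time.perf_counter()
        with Image.open(path) as img:
            img.load()  # force the full decode, not just the header read
        return time.perf_counter() - start

    for path in ("photo.jpg", "photo.png", "photo.heic"):
        print(path, f"{decode_time(path):.3f}s")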