
Posted by bluedel 6 days ago

A new PNG spec (www.programmax.net)
667 points | 595 comments | page 2
LeoPanthera 6 days ago|
> I know you all immediately wondered, better compression? We're already working on that.

This worries me. Because presumably, changing the compression algorithm will break backwards compatibility, which means we'll start to see "png" files that aren't actually png files.

It'll be like USB-C but for images.

lifthrasiir 6 days ago||
Better compression can also mean a new set of filter methods or a new interlacing algorithm. But yeah, any of them would cause an instant incompatibility. As noted in the relevant issue [1], we will need a new media type at the very least.

[1] https://github.com/w3c/png/issues/39#issuecomment-2674690324

Arnt 6 days ago|||
We would need a new media type. But the actual new features don't need one, because they don't break compatibility.

https://svgees.us/blog/img/revoy-cICP-bt.2020.png uses the new colour space. If your software and monitor can handle it, you see better colour than I do; otherwise, you see what I see.

snvzz 6 days ago|||
I am hopeful that whatever better compression arrives doesn't end up multiplying memory requirements or increasing the burden on the CPU, especially during decompression.

Now, PNG datatype for AmigaOS will need upgrading.

Arnt 6 days ago|||
I don't see why? If your video output is plain old RGB (like the Amiga hardware), then an unmodified decoder will handle new files without a problem. You only need a new decoder if your video output can handle more vivid colours than RGB can express.
Findecanor 5 days ago||
An image decoded in the wrong colour space for the output will look wrong. It is not using extra bits to express the increased dynamic range: the existing numeric range is stretched and warped.
Arnt 5 days ago||
Yes. But how bad? AIUI the way it's done is more or less the best that can be done with old video hardware, like mine and like the Amiga.

It could be horrible in principle, but actually isn't.

clownpenis_fart 6 days ago|||
[dead]
Lerc 6 days ago|||
It has fields to say what compression is used. Existing software should handle a new compression form by recognizing the file as a valid PNG that it can't decompress.

The PNG format is specifically designed to allow software to read the parts they can understand and to leave the parts they cannot. Having an extensible format and electing never to extend it seems pointless.
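
A minimal sketch of that design in Python (assuming a well-formed file; "example.png" is a hypothetical name): the IHDR chunk carries a compression-method byte, so a decoder can tell "valid PNG, unknown method" apart from corruption:

    import struct

    PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

    def inspect_png(path):
        with open(path, "rb") as f:
            assert f.read(8) == PNG_SIGNATURE, "not a PNG"
            while True:
                header = f.read(8)
                if len(header) < 8:
                    break
                length, ctype = struct.unpack(">I4s", header)
                data = f.read(length)
                f.read(4)  # skip the CRC
                if ctype == b"IHDR":
                    # IHDR layout: width(4) height(4) bit depth(1)
                    # color type(1) compression(1) filter(1) interlace(1)
                    print("compression method:", data[10])  # 0 = DEFLATE today
                elif ctype == b"IEND":
                    break

    inspect_png("example.png")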

koito17 6 days ago|||
> Having an extensible format and electing never to extend it seems pointless.

This proves OP's analogy regarding USB-C. Having PNG as some generic container for lossless bitmap compression means fragmentation in libraries, hardware support, etc. The reason being that if the container starts to support too many formats, implementations will start restricting themselves to only the subsets the implementers care about.

For instance, almost nobody fully implements MPEG-4 Part 3; the standard includes dozens of distinct codecs. Most software only targets a few profiles of AAC (specifically, the LC and HE profiles), and MPEG-1 Layer 3 audio. Next to no software bothers with e.g. ALS, TwinVQ, or anything else in the specification. Even libavcodec, if I recall correctly, does not implement encoders for MPEG-4 Part 3 formats like TwinVQ. GP's fear is exactly this -- that PNG ends up as a standard too large to fully implement and people have to manually check which subsets are implemented (or used at all).

cm2187 6 days ago|||
But where the analogy with USB-C is very good is that just like USB-C, there is no way for a user to tell from the look of the port or the file extension what the capabilities are. Which even for a fairly tech savvy user like me is frustrating. I have a bunch of cables, some purchased years ago, how do I know what is fit for what?

And now think of the younger generation that has grown up with smartphones and has been trained to not even know what a file is. I remember a story about senior high school students failing their school tests during covid because the school software didn't support HEIF files, and they were changing the file extension to jpg in an attempt to convert them.

I have no trust that the software ecosystem will adapt. For instance, the standard libraries of the .NET Framework have been fossilised, multimedia-wise, since around 2008. I don't believe HEIF is even supported to this day. So that's a whole bunch of code which, unless the developers create workarounds, will never support a newer PNG format.

skissane 5 days ago||
> there is no way for a user to tell from the look of the port or the file extension what the capabilities are

But that's typical for file extensions. Consider EXE – it is probably an executable, but an executable for what? Most commonly Windows – but which Windows version will this EXE run on? Maybe this EXE only works on Windows 11, and you are still running Windows 10. Or maybe you are running x86-64 Windows, but this EXE is actually for ARM or MIPS or Alpha. Or maybe it is for some other platform which uses that extension for executable files – such as DOS, OS/2, 16-bit Windows, Windows CE, OpenVMS, TOPS-10, TOPS-20, RSX-11...

.html, .js, .css – suggest to use a web browser, but don't tell you whether they'll work with any particular one. Maybe they use the latest features but you use an old web browser which doesn't support them. Maybe they require deprecated proprietary extensions and so only work on some really old browser. Maybe this HTML page only works on Internet Explorer. Maybe instead of UTF-8 it is in some obscure legacy character set which your browser doesn't support.

.zip – supports extensible compression and encryption methods, your unzip utility might not support the methods used to compress/encrypt this particular zip file. This is actually normal for very old ZIP files (from the 1980s) – early versions of PKZIP used various deprecated compression mechanisms, which few contemporary unzip utilities support. The format was extended to 64-bit without changing the extension, there's still a lot of 32-bit only implementations out there. ZIP also supports platform-specific file attributes–e.g. PKZIP for z/OS creates ZIP files which contain metadata about mainframe data storage formats, unzip on another platform is going to have no idea what it means, but the metadata is actually essential to interpreting the data correctly (e.g. if RECFM=V you need to parse the RDWs, if RECFM=F there won't be any)

.xml - okay, it is XML – but that tells you nothing about the actual schema. Maybe you were expecting this xml file to contain historical stock prices, but instead it is DocBook XML containing product documentation, and your market data viewer app chokes on it. Or maybe it really is historical stock prices, but you are using an old version of the app which doesn't support the new schema, so you can't view it. Or maybe someone generated it on a mainframe, but due to a misconfiguration the file came out in EBCDIC instead of ASCII, and your app doesn't know how to read EBCDIC, yet the mainframe version of the same app reads it fine...

.doc - people assume it is legacy (pre-XML) Microsoft Word: every version of which changed the file format, old versions can't read files created with newer versions correctly or at all, conversely recent versions have dropped support for files created in older versions, e.g. current Office versions can't read DOC files created with Word for DOS any more... but back in the 1980s a lot of people used that extension for plain text files which contained documentation. And it was also used by incompatible proprietary word processors (e.g. IBM DisplayWrite) and also desktop publishing packages (e.g. FrameMaker, Interleaf)

.xmi – I've seen this extension used for both XML Model Interchange (XML-based standard for exchanging UML diagrams) and XMIT (IBM mainframe file archive format). Because extensions aren't guaranteed to be unique, many incompatible file formats share the same extension

.com - is it an MS-DOS program, or is it DCL (Digital Command Language)?

.pic - probably some obscure image format, but there are dozens of possibilities

.img – could be either a disk image or a visual image, either way dozens of incompatible formats which use that extension

.db – nowadays most likely SQLite, but a number of completely incompatible database engines have also used this extension. And even if it is SQLite, maybe your version of SQLite is too old to read this file because it uses some features only found in newer versions. And even if SQLite can read it, maybe it has the wrong schema for your app, or maybe a newer version of the same schema which your old version of the app doesn't support, or an old version of the schema which the current version of the app has dropped support for...
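
To make the ZIP point above concrete, a quick sketch with Python's stdlib ("archive.zip" is a hypothetical file): each entry records its compression method, so you can see exactly which entries your library can list but not extract:

    import zipfile

    # method IDs from the ZIP spec; zipfile only handles these four
    KNOWN = {0: "stored", 8: "deflate", 12: "bzip2", 14: "lzma"}

    with zipfile.ZipFile("archive.zip") as zf:
        for info in zf.infolist():
            label = KNOWN.get(info.compress_type, "unsupported by zipfile")
            print(f"{info.filename}: method {info.compress_type} ({label})")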

Calzifer 5 days ago|||
Just last week I again had some PDFs that Okular could not open because of some less common form features.
cmiller1 5 days ago|||
> Consider EXE – it is probably an executable, but an executable for what? Most commonly Windows

Has anyone ever used .exe for anything other than Windows?

skissane 5 days ago|||
Prior to Windows 95, the vast majority of PC games were MS-DOS exe files – so anyone who played any of those games (whether back in their heyday, or more recently through DOSBox) has run an MS-DOS exe. Most people who ever used Lotus 1-2-3 or WordPerfect were running an MS-DOS exe. Both products were eventually ported to Windows, but were far less popular under Windows than under DOS.

Under Windows 95/98/Me, most command line tools were MS-DOS executables. Their support for 32-bit Windows console apps was very poor, to the extent that the input and output of such apps was proxied through a 16-bit MS-DOS executable, conagent.exe

First time in my life I ever used GNU Emacs, it was an OS/2 exe. That's also true for bash, ls, cat, gcc, man, less, etc... EMX was my gateway drug to Slackware

cesarb 5 days ago||||
> Has anyone ever used .exe for anything other than Windows?

Did you know that Microsoft Windows originally ran on top of the much older MS-DOS, which used EXE files as one of its two executable formats? Most Windows users had lots and lots of EXE files which were not Windows executables, but instead DOS executables. And then came Windows 95, which introduced 32-bit Windows executables, but kept the same file extension as 16-bit Windows executables and 16-bit DOS executables.

asgerhb 5 days ago|||
Way back when, my prof was using his Linux machine to demonstrate how to use GCC. He called the end result .exe but that might have been for the benefit of the Windows users in the room. (Though Linux users being considerate to Windows users, or vice versa, is admittedly a rarity)
bayindirh 6 days ago||||
JPEG is no different. Only the decoder is specified. As long as the decoder turns what you give it into the image you wanted to see, you can implement anything. This is how imgoptim/squash/aerate/dietJPG work, by (ab)using this flexibility.

The same is also true for the most advanced codecs. The MPEG-* family and MP3 come to mind.

Nothing stops PNG from defining a "set of decoders", and let implementers loose on that spec to develop encoders which generate valid files. Then developers can go to town with their creativity.

cm2187 6 days ago||
Video files aren't a good analogy. Before God placed VLC and ffmpeg on earth, you had to install a galaxy of codecs on your computer to get a chance to read a video file and you could never tell exactly what codec was stored in a container, nor if you had the right codec version. Unfortunately there is no vlc and ffmpeg for images (I mean there is, the likes of imagemagick, but the vast majority of software doesn't use them).
bayindirh 5 days ago||
I lived through that era (first K-Lite Codec Pack, then CCCP came along), but still it holds.

Proprietary or open, any visual codec is a battleground. Even in commercial settings, I vaguely remember people saying they prefer the end result of one encoder over another, for the same video/image format, not unlike how photographers judge cameras by their colors.

So maybe this flexibility in PNG will enable or encourage people to write better, or at least unorthodox, encoders whose output can still be decoded by standard-compliant decoders.

fc417fc802 6 days ago||||
I honestly don't see an issue with the mpeg-4 example.

Regarding the potential for fragmentation of the png ecosystem the alternative is a new file format which has all the same support issues. Every time you author something you make a choice between legacy support and using new features.

From a developer perspective, adding support for a new compression type is likely to be much easier than implementing logic for an entirely new format. It's also less surface area for bugs. In terms of libraries, support added to a dependency propagates to all consumers with zero additional effort. Meanwhile adding a new library for a new format is linear effort with respect to the number of programs.

7bit 6 days ago|||
I never once in 25 years encountered an issue with an MP4 container that could not be solved by installing either the DivX or Xvid codec. And I extensively used MP4's metadata for music, even with esoteric tags.

Not sure what you're talking about.

Arnt 6 days ago||
He's saying that in 25 years, you used only the LC and HE profiles, and didn't encounter TwinVQ even once. I looked at my thousand-odd MPEG-4 files. They're overwhelmingly AAC LC, a little bit of AAC LC SBR, no TwinVQ at all.

If you want to check yours: mediainfo **/*.mp4 | grep -A 2 '^Audio' | grep Format | sort | uniq -c

https://en.wikipedia.org/wiki/TwinVQ#TwinVQ_in_MPEG-4 tells the story of TwinVQ in MPEG-4.

mort96 6 days ago||||
> Existing software should handle a new compression form by recognizing the file as a valid PNG that it can't decompress.

Yeah, we know. That's terrible.

pvorb 6 days ago||||
Extending the format just because you can – and breaking backwards compatibility along the way – is even more pointless.

If you've created an extensible file format, but you never need to extend it, you've done everything right, I'd say.

jajko 6 days ago||
What about an extensible format that would have, as part of the header, an algorithm (in some recognized DSL) describing how to decompress it (or any other step required for image manipulation)? I know it's not so much about PNG but some future format.

That's what I would call really extensible, but then there may be no limits, and hacking/viruses could easily have a field day.

lelanthran 6 days ago||
> What about an extensible format that would have, as part of the header, an algorithm (in some recognized DSL) describing how to decompress it (or any other step required for image manipulation)?

Will sooner or later be used to implement RCEs. Even if you could do a restriction as is done for eBPF, that code still has to execute.

Best would be not to extend it.

shiomiru 6 days ago||||
The difference between valid PNG you can't decompress and invalid PNG is fairly irrelevant when your aim is to get an image onto the screen.

And considering we already have plenty of more advanced competing lossless formats, I really don't see why "feed a BMP to deflate" needs a new, incompatible spin in 2025.

Arnt 6 days ago|||
It's a new and compatible spin. https://svgees.us/blog/img/revoy-cICP-bt.2020.png uses the important new feature and your old software can display it.

More generally, PNG has a simple feature to specify what's needed. A file consists of a number of chunks, and one bit in the chunk specifies whether that chunk is required for display. All of the extensions I've seen in the past decades set that bit to "optional".

For example, this update includes a chunk containing EXIF data. As you'd expect, the exif chunk sets that bit to "optional".
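
A sketch of that bit in Python (the chunk-type values are from the spec; the case of the first letter is the flag):

    def is_ancillary(chunk_type: bytes) -> bool:
        # bit 5 of the first byte, i.e. a lowercase first letter =>
        # ancillary: safe for old decoders to skip
        return bool(chunk_type[0] & 0x20)

    print(is_ancillary(b"IDAT"))  # False: critical, required for display
    print(is_ancillary(b"eXIf"))  # True: optional, like the new EXIF chunk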

fc417fc802 6 days ago|||
> plenty of more advanced competing lossless formats

Other than JXL which still has somewhat spotty support in older software? TIFF comes to mind but AFAIK its size tends to be worse than PNG. Edit: Oh right OpenEXR as well. How widespread is support for that in common end user image viewer software though?

chithanh 6 days ago||||
> Existing software should handle a new compression form

In an ideal world, yes. In practice however, if some field doesn't change often, then software will start to assume that it never changes, and break when it does.

TLS has learned this the hard way when they discovered that huge numbers of existing web servers have TLS version intolerance. So now TLS 1.2 is forever enshrined in the ClientHello.

HelloNurse 6 days ago||||
Extensibility of PNG has been amply used, as intended, for proprietary chunks that hold application specific data (e.g. PICO-8 games) without bothering other software.
Lerc 5 days ago||
Doesn't pico-8 store the data in the least significant bits of colour? Maybe it got updated to use chunks.
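If so, the trick looks roughly like this in Python with Pillow ("cart.p8.png" is a hypothetical cart file, and the channel ordering here is my assumption — check the wiki page linked in the reply for the real layout):

    from PIL import Image

    img = Image.open("cart.p8.png").convert("RGBA")
    data = bytearray()
    for r, g, b, a in img.getdata():
        # pack the two low bits of each channel into one byte (order assumed)
        data.append(((a & 3) << 6) | ((r & 3) << 4) | ((g & 3) << 2) | (b & 3))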
HelloNurse 5 days ago||
Yes, my mistake. I assumed common sense.

https://pico-8.fandom.com/wiki/P8PNGFileFormat

Actual cases of proprietary chunks include iDOT from Apple (apparently a performance optimization for plain images)

https://www.hackerfactor.com/blog/index.php?/archives/895-Co...

and the Macromedia Fireworks save files

https://stackoverflow.com/questions/4242402/the-fireworks-pn...

dooglius 6 days ago|||
> Having an extensible format and electing never to extend it seems pointless.

So then it was pointless for PNG to be extensible? Not sure what your argument is.

jillesvangurp 6 days ago|||
Old PNGs will work just fine. And forward compatibility is much less important.

The main use case for PNG is web browsers and all of them seem to be on board. Using old web browsers is a bad idea. You do get these relics showing up using some old version of internet explorer. But some images not rendering is the least of their problems. The main challenge is actually going to be updating graphics tools to export the new files. And teaching people that sRGB maybe isn't good enough any more. That's going to be hard since most people have no clue about color spaces.

Anyway, that gives everybody plenty of time to upgrade. By the time this stuff is widely used, it will be widely supported. So, you kind of get forward compatibility that way. Your browser already supports the new format. Your image editor probably doesn't.

hnlmorg 6 days ago|||
Browsers aren't the only software that work with PNGs. Far from it in fact.
whywhywhywhy 5 days ago||||
> The main use case for PNG is web browsers

It's not, most images you encounter on the web need better compression.

The main PNG use case is to store lossless images locally as master copies that are then compressed or in workflows where you intend to edit and change them where compressed formats would degrade the more they were edited.

AlienRobot 5 days ago||||
>The main use case for PNG is web browsers

This is news to me. I'm pretty sure the main use case for PNG is lossless transparent graphics.

asadotzler 5 days ago||
Depends on whose use cases you're considering.

There are about 3.6 billion people surfing the web and experiencing PNGs. That use case, consuming PNGs, seems to dwarf the perhaps 100 million (somewhat wild guess) graphic designers, web developers, and photo editing professionals who manipulate images for publishing (in any medium) or archiving.

If, on the other hand, you're considering the use cases envisioned by PNG's creators, or the use cases that interest the people processing or publishing images, yes, these people are focused on format itself and its capabilities.

I suspect this particular use of "use case" isn't terribly clear. Also these two considerations are not incompatible.

skywal_l 6 days ago|||
Can't you improve a compression algorithm and still produce valid input for existing decompressors? PNG is based on zip; there are certainly ways to improve zip without breaking backwards compatibility.

That being said, they also can do dumb things however, right at the end of the sentence you quote they say:

> we want to make sure we do it right.

So there's hope.

masklinn 6 days ago|||
> Can't you improve a compression algorithm and still produce valid input for existing decompressors? PNG is based on zip; there are certainly ways to improve zip without breaking backwards compatibility.

That's just changing an implementation detail of the encoder, and you don't need spec changes for that; e.g. there are PNG compressors which support zopfli for extra gains on the DEFLATE stream (at a not-insignificant encoding cost). This is transparent to the client, as the output is still just a DEFLATE stream.
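
A sketch of that encoder-side idea in Python, with zlib level 9 standing in for zopfli (the rebuilt file is still an ordinary PNG, just with a possibly smaller DEFLATE stream):

    import struct, zlib

    def make_chunk(ctype, body):
        crc = zlib.crc32(ctype + body) & 0xFFFFFFFF
        return struct.pack(">I", len(body)) + ctype + body + struct.pack(">I", crc)

    def repack(src, dst):
        raw = open(src, "rb").read()
        pos, chunks, idat = 8, [], b""
        while pos < len(raw):
            length, ctype = struct.unpack(">I4s", raw[pos:pos + 8])
            body = raw[pos + 8:pos + 8 + length]
            pos += 12 + length
            if ctype == b"IDAT":
                idat += body  # IDAT may be split; concatenate it
            else:
                chunks.append((ctype, body))
        idat = zlib.compress(zlib.decompress(idat), 9)  # zopfli would go here
        with open(dst, "wb") as f:
            f.write(raw[:8])
            for ctype, body in chunks:
                if ctype == b"IEND":
                    f.write(make_chunk(b"IDAT", idat))
                f.write(make_chunk(ctype, body))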

vhcr 6 days ago|||
That's what OptiPNG already does.
josefx 6 days ago||
Doesn't OptiPNG just brute force various settings and pick the best result?
colanderman 6 days ago|||
One could imagine a PNG file which contains a low-resolution version of the image with a traditional compression algorithm, and encodes additional higher-resolution detail using a new compression algorithm.
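A toy sketch of that layering in Python/numpy (grayscale, 2x factor; purely illustrative, nothing from the actual PNG work): the base could live in ordinary chunks, the residual in new ones, and together they reconstruct the original losslessly:

    import numpy as np

    def split_layers(img):  # img: 2D uint8 array
        base = img[::2, ::2]  # low-res layer, storable as a plain PNG
        up = np.kron(base, np.ones((2, 2), dtype=np.uint8))
        up = up[:img.shape[0], :img.shape[1]]
        residual = img.astype(np.int16) - up  # detail for a new chunk type
        return base, residual

    def merge_layers(base, residual):
        up = np.kron(base, np.ones((2, 2), dtype=np.uint8))
        up = up[:residual.shape[0], :residual.shape[1]]
        return (up.astype(np.int16) + residual).astype(np.uint8)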
mrheosuper 6 days ago|||
Does the USB-C spec break backward compatibility? A 2018 MacBook works perfectly fine with a 2025 USB-C charger.
danielheath 6 days ago|||
Some things don't work unless you use the right kind of USB-C cable.

E.g. your GPU and monitor both have a USB-C port. Plug them together with the right USB cable and you'll get images displayed. Plug them together with the wrong USB cable and you won't.

USB 3 didn't have this issue - every cable worked with every port.

mrheosuper 6 days ago||
That is not a backward compatibility problem. If a cable did 100W charging when using PD 2.0 but only 60W when used with a PD 3.1 device, then I would agree with you.
yoz-y 6 days ago||
The problem is not backward compatibility but labeling. A USB-C cable looks universal but isn’t. Some of them just charge, some do data, some do PD, some give you access to high speed. But there is no way to know.

I believe the problem here is that you will have PNG images that “look” like you can open them but can’t.

voidUpdate 6 days ago|||
That's not just an issue with USB-C. Normal USB A and B cables can have data or no data depending on how stingy the company wants to be, and you can't know until you test it.
Xss3 6 days ago||
You can get pretty good guesses just by feel and length. Tiny with a super thin cable? Probably charge only.
mystifyingpoi 6 days ago||||
Cable labeling could fix 99% of the issues with USB-C compat. The solution should never be blaming consumer for buying the wrong cable. Crappy two-wire charge-only cables are perfectly fine for something like a night desk lamp. Keep the poor cables, they are okay, just tell me if that's the case.
lelanthran 6 days ago|||
> Cable labeling could fix 99% of the issues with USB-C compat.

Labelling is a poor band-aid on the root problem - consumer cables which look identical and fit identically should work wherever they fit.

There should never have been a power-only spec for USB-C socket dimensions.

If a cable supports both power and data, it must fit in all sockets. If a cable supports only power it must not fit into a power and data socket. If a cable supports only data, it should not fit into a power and data socket.

It is possible to have designed the sockets under these constraints, with the caveat that they only go in one way. I feel that that would have been a better trade-off. Making them reversible means that you cannot have a design which enforces cable type.

Xss3 6 days ago|||
So since my vape (an example, I don't vape) has a power and data slot for charging and firmware updates, I should be limited to only using dual-purpose cables day to day rather than a power-only cable?
lelanthran 6 days ago||
> So since my vape (an example, I don't vape) has a power and data slot for charging and firmware updates, I should be limited to only using dual-purpose cables day to day rather than a power-only cable?

Well, yes.

Why can't you use a power+data cable for the vape (or whichever appliance takes both)? What's the deal-breaker here?

The alternative is labeling, or plugging cables in to see if they do what you want them to do.

Both are a poor user interface.

Xss3 5 days ago||
Is the same true for my laptop? Or my soldering plate? Both take over 150W of power. Buying a power-and-data cable is expensive compared to just power, and the length of cable is severely limited... or the data speed impaired significantly. How slow does the data have to be for it to be non-compliant?
mystifyingpoi 6 days ago|||
> If a cable supports only power it must not fit into a power and data socket

That's even more confusing than the current state of affairs. If my phone has a power and data socket, then I cannot use a power-only cable to just charge it? Presumably with a charger that has a power-only socket. So I need a cable with two different ends anyway. Just go micro-USB at this point :)

Funnily enough, there is a 100% overkill way to solve such issues. Just use super expensive certified TB cables. Well... plus an A-to-C adapter for noncompliant devices, I guess.

ay 6 days ago||||
Same thing with PNG. Just call the format with the new additions PNGX, so the user can clearly see that the reason their software can't display the image is not file corruption.

This is just pretending that if you have a cat and a dog in two bags and you call it “a bag”, it’s one and the same thing…

kevin_thibedeau 5 days ago|||
Two wire cables are not in the specification, just like A-to-A cables aren't. The whole charging above 100mA with resistor hacks wasn't in the standard either until they had to grandfather it in. The implementers forum isn't responsible for non-members breaking their spec.
mrheosuper 6 days ago||||
The parent said "changing the compression algorithm will break backwards compatibility", which I assume means something that works now won't work in the future. The USB-C spec intentionally tries to avoid that.
danielheath 6 days ago||
Today, I can save a PNG file off a random website and then open it.

If PNG gets extended, it's entirely plausible that someone will view a PNG in their browser, save it, and then not be able to open the file they just saved.

There are those who claim "backwards compatibility" doesn't cover "how you use it" - but roughly none of the people who now have to deal with broken software care about such semantic arguments. It used to work, and now it doesn't.

fc417fc802 6 days ago|||
The alternative is the website operator who wants to save on bandwidth instead adopts JXL or WEBP or what have you and ... the end user with old software still can't open it.

It's a dichotomy. Either the provider accommodates users with older software or not. The file extension or internal headers don't change that reality.

Another example, new versions of PDF can adopt all the bells and whistles in the world but I will still be saving anything intended to be long lived as 1/a which means I don't get to use any of those features.

mrheosuper 6 days ago||||
Which is what the USB-C spec has been avoiding so far. Even the USB4 spec repeatedly stresses that the new spec should be compatible with TB3 devices.

The USB-C spec is anything but backward incompatible.

johnisgood 6 days ago|||
This is what I fear, too.

Do they mention which C libraries use this spec?

globular-toast 6 days ago|||
Some aren't even USB. Thunderbolt and DisplayPort both use USB-C too.
Xss3 6 days ago||
Thunderbolt meets the USB-C specs (and exceeds them, afaik), so it is still USB...
mystifyingpoi 6 days ago||||
Yeah, I also don't think they've broken backwards compat ever. Super high end charger from 2024 can charge old equipment from 2014 just fine with regular 5V.

What was broken was the promise of a "single cable to rule them all", partly due to manufacturers ignoring the requirements of USB-C (missing resistors or PD chips to negotiate voltages, requiring workarounds with A-to-C adapters), and a myriad of optional stuff, that might be supported or not, without a clear way to indicate it.

techpression 6 days ago||||
I don’t know if it’s the spec or just a plethora of vendors ignoring it, but I have many things with a USB-C port that require USB-A as the source. USB-C to A to C works (yay dongles), but not just C to C. So maybe it’s not really breaking backwards compatibility, just a weird mix of a port and the communication being separate standards.
fragmede 6 days ago|||
It's vendors just changing the physical port but not updating the electronics. Specifically, 5.1 kΩ pull-down resistors on the CC1 and CC2 pins are needed on the device side for a C-to-C cable to work; a legacy USB-A port supplies VBUS unconditionally, which is why such devices charge only through A-to-C.
mrheosuper 6 days ago|||
Because those USB-C ports do not follow the spec. If they had followed the spec from day one, there would be no problem even now.
zirgs 6 days ago||||
Yeah - it's a mess. Some devices only charge with a charger that supports PD. Some other devices need a charger WITHOUT PD support.
mrheosuper 5 days ago||
If those devices followed the spec, they wouldn't need a charger without PD support.

If you don't follow the spec, you're on your own.

zirgs 20 hours ago||
Yes, but, unfortunately - devices like that exist.
ProgramMax 5 days ago|||
Worry not! (Well, worry a little.)

The first bit of our research is "What can we already make use of which requires no spec update? There are plenty of PNG optimizers. How much of that should go into the typical PNG libraries?"

Same with parallel encoding & decoding. An older image viewer will be able to decode it on one thread without ever knowing parallel decoding was an option.

Here's the worry-a-little part: Everybody immediately jumps to file size as to what image compression is better or worse. That isn't the best take, but it is what it is. So there is pressure to adopt newer technologies.

We often do have a way to maintain some degree of backwards compatibility even when we do this. For example, we can store a downsampled image for old viewers. Then extra, new chunks will know "Mix that with this full scale data, using a different compression".

As you can imagine, this mixing complicates things. It might not be the best option. Sooooo we're researching it :)

HexDecOctBin 5 days ago|||
Downsampling will make PNG not be a lossless format. Just leave it alone, and work on a separate PNG2 or PNGX or whatever.
ori_b 5 days ago|||
My strong vote is to just not touch it. Stability is a feature.
altairprime 6 days ago|||
They could, for example, use lossy compression for the compatibility layer and then fill it in the rest of the way to lossless using incompatible new compression objects. Legacy uses will see some fidelity degradation, but they are already being stuck with sRGB downmixes, so that’s fine — and those who are bothered by it can just emit a lossless-pixels (but lossy-color and lossy-range) compatibility layer and reserve the compression benefits for the color and dynamic range.

I’m not saying this is what will happen — but if I was able to construct a plausible approach to compression in ten minutes, then perhaps it’s a bit early to predict the doom of compatibility.

ajnin 6 days ago|||
What backward compatibility are we talking about here? Backwards compatibility of images will be fine, backwards compatibility of decoders might be impacted, but the article says the major image viewers (browsers) and image editors already support the 3rd version. Better compression is only planned for the 5th version of the spec.

Also if you forbid evolving existing formats, the only alternative to improve is to introduce a new format, and I argue that it would be causing even more fragmentation and be more difficult to adopt to. Look at all the drama surrounding JPEG XL.

bawolff 6 days ago|||
I don't think that will be much of an issue. How often has "progressive jpeg" ever caused problems? That's the same thing.
bmacho 6 days ago||
+1. Why not name it png4 or something? It's better if compatibility is obvious upfront.
josephg 6 days ago||
I think if they did that, nobody would use it. And anyway, from the article:

> Many of the programs you use already support the new PNG spec: Chrome, Safari, Firefox, iOS/macOS, Photoshop, DaVinci Resolve, Avid Media Composer...

It might be too late to rename png to .png4 or something. It sounds like we're using the new png standard already in a lot of our software.

adgjlsfhk1 6 days ago||
I'm very curious to see how this will end up stacking up vs lossless jpegxl
Simran-B 6 days ago||
I doubt it can get anywhere near. What is even the point of a new PNG version if there's something as advanced as JXL that is also royalty-free?
layer8 6 days ago||
Browser support for JPEG XL is poor (basically only Safari I think), while the new PNG spec is already supported by all mainstream browsers.
encom 6 days ago||
It's poor only because Google is using their stranglehold on browsers to push their own WebP trash. That company can't get broken up soon enough.
layer8 6 days ago||
Firefox also doesn’t support JPEG XL out of the box, and Chrome does support the new PNG, so ¯\_(ツ)_/¯.
trallnag 5 days ago|||
How about renaming JPEG XL to PNG or just merging the complete spec into PNG 3.0?
account42 5 days ago|||
Firefox is there to prevent/delay the forced breakup of Google's monopoly, not to provide any real competition; thanks for showing another example of that.
LoganDark 6 days ago||
For starters, you're actually able to use PNG.
iliketrains 6 days ago||
Official support for animations, yes! This feels so nostalgic to me; I wrote an L-system generator with support for exporting animated PNGs 11 years ago! They worked only in Firefox, and Chrome used to have an extension for them. Too bad I had to take the website down.

Back then, there were no libraries in C# for it, but it's actually quite easy to make an APNG from PNGs directly by writing chunks with correct headers, no encoders needed (assuming the input PNGs are already encoded).
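
It looks roughly like this in Python (chunk layouts per the APNG spec; simplified to same-sized, full-frame updates, no PLTE handling):

    import struct, zlib

    def chunk(ctype, body):
        crc = zlib.crc32(ctype + body) & 0xFFFFFFFF
        return struct.pack(">I", len(body)) + ctype + body + struct.pack(">I", crc)

    def chunks_of(png):  # chunk type -> concatenated data
        pos, out = 8, {}
        while pos < len(png):
            length, ctype = struct.unpack(">I4s", png[pos:pos + 8])
            out[ctype] = out.get(ctype, b"") + png[pos + 8:pos + 8 + length]
            pos += 12 + length
        return out

    def make_apng(frames, delay_num=1, delay_den=10):
        first = chunks_of(frames[0])
        w, h = struct.unpack(">II", first[b"IHDR"][:8])
        def fctl(seq):  # 26-byte frame control chunk
            return chunk(b"fcTL", struct.pack(">IIIIIHHBB", seq, w, h,
                         0, 0, delay_num, delay_den, 0, 0))
        out = b"\x89PNG\r\n\x1a\n" + chunk(b"IHDR", first[b"IHDR"])
        out += chunk(b"acTL", struct.pack(">II", len(frames), 0))  # 0 = loop forever
        out += fctl(0) + chunk(b"IDAT", first[b"IDAT"])  # frame 0 keeps plain IDAT
        seq = 1
        for frame in frames[1:]:  # later frames become fcTL + fdAT pairs
            out += fctl(seq)
            out += chunk(b"fdAT", struct.pack(">I", seq + 1) + chunks_of(frame)[b"IDAT"])
            seq += 2
        return out + chunk(b"IEND", b"")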

https://github.com/NightElfik/Malsys/blob/master/src/Malsys....

https://marekfiser.com/projects/malsys-mareks-lsystems/

chithanh 6 days ago|
> Official support for animations, yes!

While I welcome that there is now PNG with animations, I am less impressed about how Mozilla chose to push for it.

Using PNG's magic numbers and pretending to existing software that it is just a normal PNG? That is the same mindset that led to HTML becoming tag soup. After all, HTML with a <blink> tag is still HTML, no?

I think they could have achieved animated PNG standardization much faster with a more humble and careful approach.

369548684892826 5 days ago||
A fun fact about PNG: the correct pronunciation is defined in the specification.

> PNG is pronounced “ping”

See the end of Section 1 [0]

0: https://www.w3.org/TR/REC-png.pdf

gred 5 days ago||
That makes two image format names which I will refuse to pronounce correctly (the other being GIF [1]).

[1] https://edition.cnn.com/2013/05/22/tech/web/pronounce-gif

ziml77 5 days ago|||
The only logic I ever hear for using a hard G is because that's how Graphics is said. Yet I never hear people saying jay-feg.
gred 5 days ago|||
Also "gift".
account42 5 days ago|||
As with all other pronunciations the real reason is because it sounds better (more correct) to most people.
cmiller1 5 days ago|||
How do you pronounce PNG?
gred 5 days ago|||
Pee En Gee
kristopolous 5 days ago||||
I used to call them Nogs claiming the P was silent.

People believed me. Still funny.

illiac786 5 days ago||||
P&G, stands for Pee & Gloat.
gred 5 days ago||
Portable & Graphical
NoMoreNicksLeft 5 days ago|||
"Pong". Hate me, I don't care.
dspillett 5 days ago|||
Because the creator of gifs telling the world how he pronounced it made such a huge difference :)

Not sure I'll bother to reprogram myself from “png”, “pung”, or “pee-enn-gee”.

naikrovek 5 days ago|||
When someone makes a baby, you call that person by their real name with the correct pronunciation, don’t you?

So why can’t you do that with GIF or PNG? People that create things get to name them.

AllegedAlec 5 days ago|||
> People that create things get to name them.

And if they pick something dumb enough other people get to ignore them.

pixl97 5 days ago||||
Depends...

You'll commonly call someone by their preferred name out of respect, forced or given.

In a situation where someone does something really stupid or annoying and the forced respect isn't there, most people don't.

dspillett 5 days ago||||
There is a huge difference between inanimate objects/classes and babies. Don't personify inanimate objects, they hate that!

On inanimate objects: Aluminium was first ratified by the IUPAC as aluminium⁰, with the agreement of its discoverer Sir Humphry Davy¹, yet one huge nation calls it something else…

On people: nicknames are a thing, are you saying those are universally wrong? But yes, when a person tells me that they'd prefer their name pronounced a different way, or that they'd prefer a different name entirely, or that they don't like the nickname other use for them, you can bet your arse that I'll make the effort to use their preferred name+pronunciation in future.

------

[0] Though it should be noted that aluminum was, a few years after, officially accepted as an alternate form.

[1] He initially called it aluminum in the first paper.

naikrovek 4 days ago||
Let people name their offspring, both biological and technical.
eviks 5 days ago||||
First, it's not a baby, that's a ridiculous comparison.

But also, no, not universally even for babies, especially when the name is something ridiculous like X Æ A-Xii where even parents disagree on pronunciation, or when the person himself uses a "non-specced" variant

freeopinion 5 days ago||||
A parent may name their baby Elizabeth. Then even the parent might call them Liz or Beth or Betsy or Bit or Bee.
airstrike 5 days ago||||
Because PNGs won't answer back when I call them by some "correct" name.
account42 5 days ago|||
You are aware that kids generally don't get to pick the nicknames they end up being called and their parents definitely don't either.
LocalH 5 days ago|||
I've said "jif" for almost 40 years, and I'm not stopping anytime soon.

Hard-g is wrong, and those who use it are showing they have zero respect for others when they don't have to.

It's the tech equivalent to the shopping cart problem. What do you do when there is no incentive one way or the other? Do you do the right thing, or do you disrespect others?

pwdisswordfishz 5 days ago|||
Linguistic prescriptivism is wrong, and people who promote it are showing they have zero respect for others when they don't have to.
LocalH 5 days ago|||
I agree that language is fluid. However, when it comes to names, I think people should have enough respect to pronounce things how the creator (or owner, depending on the situation) of the name says it should be pronounced. Too often people will mispronounce someone's name as a sign of intentional disrespect (see Kamala Harris for a fairly recent prominent example) and I cannot get behind that. You see a similar disrespect in the hard-soft discourse around the pronunciation of GIF. A lot of people use the hard g and mock the creator for thinking that soft g should ever have been right.

Naming is probably one of the few language areas that I think should be prescriptive, even while language at large is descriptive.

Analemma_ 5 days ago|||
I don’t think technical standards merit the same level of “deference to the creator” as personal names. People are wrong about standards they created all the time (ask me what I think about John Gruber’s “stewardship” of Markdown) and should be corrected, a standard is meant for all. Obviously the pronunciation of an acronym isn’t anywhere near as important as technical details, but I think the principle holds.
asadotzler 5 days ago||
People are wrong about the children they create all the time too, and should be corrected.
LocalH 5 days ago||
A child is presumably a sentient being, and at some point in their life should gain control of their name. In fact, they do, to some large degree. There are means to change one's legal name, or one can diverge from their legal name and professionally/publicly use a completely different name.

A file format is not a sentient being. The creator's intent matters much more. If GIF had sentience and could voice a desire one way or the other, the whole discussion would be moot as it would clearly be disrespectful to intentionally mispronounce the name.

mandmandam 5 days ago|||
If the creator insists on a weird pronunciation, because of an inside joke most won't ever get, then I feel no responsibility in humoring them.

The G in gif is for graphics. Not 'giraffics'. And most people in the world have no idea what Jif even is, much less a particular catchphrase from an old ad campaign that barely even connects.

ziml77 5 days ago||
And the P in JPEG is for photographic, so you better be saying jay-feg if you want to rely on that logic.
joquarky 5 days ago||
If everyone conformed, then we would have no fun lively debates on things like this. That would be a boring world.
xdennis 5 days ago|||
Linguistic prescriptivism has nothing to do with it.

English has both pronunciations for "gi" based on origin. Giraffe, giant, ginger, etc from Latin; gift, give, (and presumably others) from Germanic roots.

Using the preferred one is just a matter of politeness.

Also, it's quite ironic to prescribe "linguistic prescriptivism" as wrong.

account42 5 days ago||
Insisting on one out of multiple possible pronunciations when most people naturally pick a different one is the definition of linguistic prescriptivism. Politeness doesn't have anything to do with it, people are not required to let individuals dictate how our collective language works.
bigfishrunning 5 days ago||||
pronounce the jraphics interchange format any way you want, everyone knows what you're talking about anyway -- try not to get so worked up. It's not the shopping cart problem, because no-one is measurably harmed by not choosing the same pronunciation as you.
LocalH 5 days ago||
i'll start using hard-g gif when you start saying "jfeg" ;)
npteljes 5 days ago||||
As much as I hate jif, thinking about it, "GPU" works the same - we say gee-pee-you and not gh-pee-you. Garbage Collection is also gee-cee. So it's only logical that jif is the correct one - even if it's not the widely accepted one.

Wrt communication, aside from personal preference, one can either respect the creator or the audience. If I stood in front of 10 colleagues, all 10 of them would not understand jif, or would only get it because this issue has some history now. Gif, on the other hand, has no friction.

Genghis Khan, for example, sounds very different from its original Mongolian pronunciation. And there are a myriad of others as well.

LocalH 5 days ago||
The whole debate seems to be a modern phenomenon to me - from my anecdotal experience back in the day, it was never questioned by computer enthusiasts that it was pronounced "jif".
eCa 5 days ago|||
I (as a non-native English speaker) have pronounced it with a hard g since I first saw it (mid ’90s), many years before I learned how the creator preferred it to be pronounced.

I continue to pronounce it how I prefer it, not as a slight, but most people here would be surprised by the soft g.

If I ever meet him I’ll attempt to pronounce it soft-g.

On the other hand, even though my name exists and is reasonably common in English, I’m fairly certain neither you nor the GIF creator would address me the way I pronounce my name. I would understand anyway, and wouldn’t care one bit.

npteljes 5 days ago|||
I have the same experience - but with gif. Mind you, me and my circle are not native English speakers.

The debate itself is old. "Since the 90s," Wikipedia says, and keep in mind the format is from 1987, so I would say the debate has been on from the get-go. Appropriate, too: thinking back, arguing about this kind of stuff was pretty common. Emacs vs vim, browser wars, different kinds of computers, tribalism everywhere.

https://en.wikipedia.org/wiki/Pronunciation_of_GIF

dspillett 5 days ago||
I think “since the 90s” here is “since the late 90s”. When I first was aware of gif files (in the early 90s IIRC) I only saw the name and meaning in print so went with the hard G to match the g's pronunciation in graphics, I don't think I was aware of the original intention to pronounce it jif until somewhere in the early 2000s, at which point the use of the hard g was almost ubiquitous and the soft g idea was presented as an interesting/amusing aside.
npteljes 4 days ago||
One of Wiki's sources date it back as far as 1994, and that's a news article, so the thing must have been going on for a while.

Thinking about it, I think I understand why hard G makes sense for people. With GPU, we pronounce the the individual letters, as it's clearly an abbreviation - as no sane English word starts with "gp". With GIF though, even though it's an abbreviation, it looks a lot like a normal word, "gift", and English also has "give", another one with a hard G, so it feels familiar to say. Moreover, the US, where GIF comes from, has Jif already established as a peanut butter brand, so it makes sense to not pronounce a newly invented, differently written word the same as an already established thing. Well, at least to some it makes sense!

i80and 5 days ago|||
Is this a bit?
LocalH 5 days ago||
Absolutely not. See my response to your sibling comment. Choosy nerds choose "GIF".
rhet0rica 5 days ago||
Unfortunately for your crusade, the soft-G pronunciation has essentially been typo-squatted since before typo-squatting was a thing.

https://file.org/extension/jif

https://fileinfo.com/extension/jiff

https://www.reddit.com/r/todayilearned/comments/4rirr8/til_t...

https://en.wikipedia.org/wiki/JPEG#JPEG_files

ProgramMax 5 days ago|||
Even though I know about this, I still pronounce it as letters. :)
account42 5 days ago|||
That's the correct pronunciation the same way the correct pronunciation for GIF is jiff. Human language is not something you can prescribe.
eviks 5 days ago|||
Ha, been doing it "wrong" my whole life!
yuters 5 days ago||
Pronouncing it like that would invite confusion as the word ping is often used in messaging.
nashashmi 5 days ago||
let's propose PENJ to avoid the confusion.
remram 5 days ago||
So what do we call it? PNG3? The spec is titled "Portable Network Graphics Specification (Third Edition)".

Surely they aren't releasing a new, incompatible version and expecting us to pretend it's the same format...?

> This updates the existing image/png Internet Media type

whyyyyyyy

ProgramMax 5 days ago|
New? Yes. Incompatible? No.

We went to pretty extreme lengths to make sure old software worked with the new changes. Effectively, the limit will be the software, not the image.

For example, you can imagine some old software that is unaware of color spaces and treats everything as sRGB. There is nothing we can do to make that software show a Display P3 correctly. However, we can still show the image well enough that a user understands "that is a red apple".

account42 4 days ago||
Having images show up in washed-out colors without any indication is not what I'd consider "working". This mistake has been made many times; please let's not make it again.
hrydgard 6 days ago||
What about implementations? libpng seems pretty dead, 1.7 has been in development forever but 1.6 is still considered the stable version. Is there a current "canonical" png C/C++ library?
vanderZwan 6 days ago|||
I mean, if the spec has been stable for two decades then maybe there just hasn't been much to fix? Especially since PNG is a relatively simple image format.
illiac786 5 days ago||
Seems that logic does not apply to jpeg though.
ethan_smith 5 days ago|||
For modern C/C++ PNG implementations, consider lodepng (header-only), stb_image/stb_image_write (single-file), or libspng (active fork focused on performance and security) as more actively maintained alternatives to libpng.
ProgramMax 5 days ago||
libpng updates are either already landed or nearly landed.
poisonborz 6 days ago||
Not backwards compatible. We just add it to that nice cupboard "great advanced image formats we will forget about".

Society doesn't need a new image format. I'd wager to say it doesn't need any new multimedia format at all. Big corporate entities do, and they have been churning them out at a steady pace.

Look at poor webp - a format pushed by the largest industry players - and the abysmal everyday use it gets, and the hate it generates.

lioeters 5 days ago||
> Not backwards compatible

They say it's technically compatible since older image decoders should recognize the PNG file is using a different compression algorithm than the default.

> Many programs already support the new PNG spec: Chrome, Safari, Firefox, iOS/macOS, Photoshop, DaVinci Resolve, Avid Media Composer...

This is intentionally ignoring the fact that there are countless PNG decoders out in the wild, many using libpng, the standard decoder last updated 6 years ago; and they will not be able to read the new PNG v2 files.

They should have used a different file extension, PNG2, to distinguish this incompatible format. Otherwise, users will be confused why their newly saved PNG file cannot be read by certain existing programs.

arp242 5 days ago|||
libpng seems to get regular updates? A release just a few days ago.

There's a PR for APNG: https://github.com/pnggroup/libpng/pull/706 – it seems there was some work for HDR in e.g. https://github.com/pnggroup/libpng/pull/635 as well. Related: https://github.com/pnggroup/libpng/issues/507

lioeters 5 days ago||
Oh cool! I was looking at this page, which looks official but apparently not up to date.

https://www.libpng.org/pub/png/libpng.html

Looks like this is the proper location for the project.

https://libpng.sourceforge.io/

JKCalhoun 5 days ago|||
[flagged]
colejohnson66 5 days ago|||
Those are indeed the "magic" bytes of PNG. It's a very clever choice meant to ensure the transport layer didn't mess with it.

To start, there's a byte with the upper bit set which ensures an "8-bit clean" transport. If it's stripped, it becomes a harmless tab. Then the literal "PNG" text so you can see it in a text editor. Then a CR-LF pair to check for CR-LF to LF translations. Then, a CTRL-Z to stop display on DOS-like systems. And finally, another LF to check for LF to CR-LF translations.

It's a clever "magic" that basically ensures a binary transport layer. Things that mattered back in 1996.

https://www.libpng.org/pub/png/spec/1.2/PNG-Rationale.html#R...
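
The eight bytes, annotated in Python to match that rationale:

    PNG_SIG = bytes([
        0x89,              # high bit set: detects transports that strip bit 8
        0x50, 0x4E, 0x47,  # "PNG" in ASCII, visible in a text editor
        0x0D, 0x0A,        # CR LF: catches CRLF -> LF translation
        0x1A,              # Ctrl-Z: stops 'type' output on DOS
        0x0A,              # LF: catches LF -> CRLF translation
    ])

    assert PNG_SIG == b"\x89PNG\r\n\x1a\n"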

account42 5 days ago||
It's clever but I'm not so sure it actually mattered - other formats have done just as well with simpler magic numbers. All it does in the end is that you get something that doesn't identify as a PNG file rather than a PNG file with bad data when a non-binary transport is used - both results are bad and immediately apparent.
colejohnson66 4 days ago||
Yes and no. It wasn't just about telling you a problem occurred, but failing early and being able to say exactly why. A "something in the chain is running in 7-bit mode" is more helpful than "CRC error in IDAT". Maybe the developers were being a bit too ambitious/hopeful, but an eight byte "magic" over a "simpler" four byte one isn't really worth crying over, even with 1996 download speeds.
ape4 5 days ago||||
The 50 4E 47 spells "PNG".
michaelmior 6 days ago|||
> and the abysmal everyday use it gets

Estimates are that 95% of Internet users have a browser that supports WebP and that ~25% of the top million websites serve WebP images. I wouldn't call that abysmal.

Geezus_42 6 days ago|||
Great, so I can download it, but then I have to convert it to a different format before half my apps will be able to use it.
PaulHoule 5 days ago|||
Blame Adobe. For what they charge for Creative Suite it ought to have supported it a long time ago.

My webcrawler sucks down a lot of WebP images, at least it did before it got the smackdown from Cloudflare.

martin_a 5 days ago||
Adobe Photoshop has support for WebP (through "Save as", not "Export") but I don't think WebP is important.
whywhywhywhy 5 days ago||
But it can’t open them
martin_a 5 days ago|||
Not sure if that's version specific, but my one can (version 26.7.0) without any issues or warnings. Tried with this sample file: https://www.gstatic.com/webp/gallery/1.webp
williamscales 5 days ago||||
Looks like Photoshop has supported it since v23.2, in 2022.
asadotzler 5 days ago|||
My PS can open them. Maybe update?
lizknope 5 days ago||||
I was about to write that Slack doesn't support webp but I just tested it and it does. For years I have been typing "convert file.webp file.jpg" and then posting that in slack but it looks like they have added support.
jeroenhd 5 days ago||||
Everything I've tried supports WebPs. It took Adobe a while but even Photoshop supports the format these days.

Hell, for some software features (like stickers in some chat apps), WebP is mandatory.

HEIF files, on the other hand...

BeFlatXIII 5 days ago||||
Or convert before you upload because the image host has delusions about fighting the Google monoculture by refusing WebP support. Even more of a head scratcher when WebM is their only video format.
wltr 5 days ago|||
Maybe the issue is with your operating system then?
jdiff 5 days ago|||
App support has very little to do with the operating system. OSes by and large will preview it just fine.
dinkblam 5 days ago|||
On the contrary: on macOS, apps don't have to support image (or movie) formats. It is done by the system and transparently handled by the APIs. Apps automatically gain new formats when the system adds them.
reaperducer 5 days ago||
The unfortunate side effect of this convenience is that apps automatically lose image support when macOS chooses to no longer support a format, too.

One example is Sony's SRF camera raw format.

Programs like Photoshop and Affinity have to bring their own decoders where previously none were required.

dspillett 5 days ago||
And having to bring in support for formats that the OS has deprecated, if they decide to keep supporting those formats because there is sufficient demand from their users, is worse than having to bring in support for all formats rather than relying on the OS?

Having asked that in a slightly confrontational way: one of the reasons I started using VLC all those years ago, and still use it to this day, was trouble with other media players that relied on OS support and failed to work well (or at all) with some codecs, while VLC brought support for them, and their dog, built-in and reliable. Dragging your own format-support libraries with you can be beneficial.

wltr 5 days ago|||
I meant Windows, as macOS and Linux are usually good with modern things. It’s trivial to add the support if you don’t have it. I have no idea about Windows, but I got this vibe of someone using Win7 in 2025 and complaining the world moved on and keeps moving on.
echelon 5 days ago|||
You can't use webp on Reddit, Instagram, and hundreds of other websites. Which is ironic because some of them serve images as webp.
socalgal2 5 days ago|||
Just tested Reddit. It works fine with .webp. I don't have an Instagram account.
echelon 5 days ago||
Try https://www.reddit.com/settings/profile

There are so many uneven areas of Reddit where WebP doesn't work. Old reddit, profile support, mod tools, etc.

kccqzy 5 days ago||
I'm convinced that this is because of the prevalent MVP culture in modern software engineering. Instead of holistically looking at a new feature request such as "support webp images" we break it down into parts (e.g. "serve webp" "accept webp upload here" "accept webp upload there") and then we call it a MVP when only the highest priority items are done.
wltr 5 days ago|||
That doesn’t mean it’s dead, it rather shows sheer incompetence of the web dev departments of these wonderful companies for whom webp or avif aren’t images, I guess.
PaulHoule 5 days ago||
Instagram's image uploading interface is clunky compared to Mastodon, which is entirely unfunded.
echelon 5 days ago||
This shows the unfortunate power of distribution.

It doesn't matter if the alternative is technically superior once the majority use the mainstream thing.

whywhywhywhy 5 days ago||||
Completely fails the second you want to do anything more than load it on a webpage.

Photoshop still won’t open it; macOS Preview opens it but then demands to convert it to TIFF when you try to edit it.

asgerhb 5 days ago|||
Maybe using VLC Media Player from an early age has left me with too high expectations. But if I have a program designed to view or edit a certain class of file, and it doesn't support a certain file format, I will blame that program.
account42 5 days ago|||
GIMP and Gwenview have supported webp (the latter via platform image plugins that add support to other applications as well) since before you encountered them online. Maybe choose better tools.
AlienRobot 5 days ago||||
You can't even upload WebP to Instagram.
bastawhiz 5 days ago||
Which makes sense for an app made for photos: why would you capture a photograph to disk in a format made for distributing on the web?
jdiff 5 days ago|||
Indeed, why might one upload a photo to the web in a format made for distributing images on the web?
bastawhiz 5 days ago||
I could save my photos as BMPs like early digital cameras did but that doesn't make it practical or reasonable. My camera takes pictures as RAW or HEIF files. Why would I save my photos to a primarily lossy codec that's optimized and designed for distribution rather than preserving fidelity?

We used to do this with JPEG, in fact. And that's why many pictures on Facebook from pre-2018 or so all have a distinctive grainy look: it's artifacts on top of artifacts. Storage on phones isn't tight anymore; we don't need to store photos in a format meant to minimize bytes at the expense of quality.

jdiff 5 days ago||
There's more on Instagram than photos. Lotta meme pages, lots of people just uploading random screenshots and photos they downloaded that have been passed around a million times. Heck, all it takes is someone downloading their own photo from SocialMediaX to reupload on SocialMediaY, or just uploading the WebP they exported for their website.
Sharlin 5 days ago|||
Instagram hasn't even been primarily or even secondarily about photos for a long time. Indeed, trying to "just" upload a photo is made super inconvenient these days.
sunaookami 5 days ago|||
Tangentially related, but Instagram is really the worst platform for photos. I don't understand why they crop and downsize (!) pictures. Not even Twitter does this; it's unironically a better photo platform.
bastawhiz 5 days ago|||
Unless you're uploading memes you've downloaded from elsewhere, this strictly isn't true. I'd consider myself an Instagram power user, and the only thing that I and all the people I interact with post is photos and videos. None of those are WebP, or would have been worthwhile to save as WebP as an intermediate format.
hsbauauvhabzb 6 days ago||||
My file manager can’t handle them but my browser can.

Edit: and good luck uploading the format to the majority of web forms that aren't FAANG.

debugnik 5 days ago|||
Not even Google supports webp uploads in many of their web apps, and it's their format.
chillingeffect 5 days ago||
Could it be a lack of resources? Or some missing expertise? Maybe they could find some interns who are familiar with it? Maybe the entire world is so obsessed with AI that we don't even care about image formats anymore.
pixl97 5 days ago||
Honestly this kind of stuff happens all the time in large companies.

Interns won't want to work on a dead end like this. Moreover, they'd need to be supervised by someone who doesn't want to be removed for being in the lowest X% of usefulness at the company. So all these existing tools that aren't primary revenue generators just sit in coast mode.

chillingeffect 3 days ago||
Thank you for the nice response... in contrast to the downvoting I deserved. You're alright.
upcoming-sesame 5 days ago||||
If you are using an image optimization service like Imgix or Cloudflare Image Resizing then it doesn't really matter: the image can be uploaded in any supported format and will be served to the end user according to their "Accept" header.
hsbauauvhabzb 5 days ago||
If you’d like to go and implement that in all the millions of existing web apps, go ahead?

Let’s also not forget the dependency mess that leaves in applications before we do, though.

account42 5 days ago|||
Demand more from your file manager, then.
hsbauauvhabzb 4 days ago||
Sure, but then it's my image viewer, my phone's image viewer, the website I try to upload pictures to. This isn't a problem you can solve by patching one application, and it's not one the world as a whole cares about.

Better image formats serve entities who store images at scale, not end users.

dotancohen 6 days ago|||
5% of people can't view them, yet 25% of top websites use them?

In what other industry would it be considered acceptable to exclude 5% of visitors/users/clients?

mlok 6 days ago|||
Maybe they offer alternatives to WebP for those 5%?

See CSS image-set: https://developer.mozilla.org/en-US/docs/Web/CSS/image/image...

pchangr 6 days ago||||
I can tell you: I have personally worked with a global corporation, and we estimated that for one of their websites, supporting the 3% we excluded by using “modern standards” would be more costly than the revenue they get from them. So in that case, it was a rational decision. And up to a 10% cut, management just didn’t want to make the extra investment. So if something falls below that 10% threshold, they just don’t care to get it fixed.
Aachen 5 days ago|||
> it was a rational decision. And up to the 10% cut, management just didn’t want to do the extra investment

Rational, or economical? I find it rational to help someone in need, since I'd want others to do the same for me, even if it's not financially profitable. IMO more factors flow into what's rational, but I understand what you mean by corporate greed working this way (less than 10% of people are blind, neither male nor female, run a free operating system, or can't afford a new computer, etc., so yep, they're not profitable groups and for-profits don't optimise for them).

majewsky 5 days ago||
You are using the notion of rationality wrong. Rational reasoning can only help you find how to achieve goals that align with your values. It is strictly worthless in choosing your values.

If a corporation has determined that profit maximization is their core tenet, excluding the needs of a minority of users can likely be deduced in a rational manner from that tenet. That is precisely why values need to be forced onto corporate actors through regulation, e.g. in this case through mandatory accessibility guidelines like EU directive 2019/882 that enters into force this very week.

account42 5 days ago||
Rational reasoning also takes into account long-term and second- and higher-order effects, which quarterly-profit-driven reasoning often ignores. If you support 95% of users and your competitor supports 100%, then that may help your competitor get 100% of them while you get none.
dotancohen 5 days ago||||
In my experience, accessibility features are needed by about 1.5% of users (E-commerce and some internal business tools). So by your logic, the rational choice is to exclude accessibility?

Or Linux users? Or even Firefox users in our market?

pchangr 3 days ago||
By that logic, yes. Thankfully, EU regulation forces them to implement accessibility even if it makes no sense by that specific logic.

As for Linux users… I do recall they were even less than the 3%. Firefox users were more, though.

In any case, I’m almost sure most Linux users were fine. We just didn’t want to support old browsers.

eviks 5 days ago||||
Something is off in this calculation: how did they get to such a high cost for something as simple as an alternative image format, when the web supports multiple?
dooglius 5 days ago|||
My guess would be that the users hitting different types of issues are mostly the same; someone who can't view an alternative image format is using an obscure old browser or obscure OS that will inevitably have a ton of other issues too, and fixing only a subset of the issues would not make much difference.
pchangr 3 days ago|||
No, this was not about the alternative image format. This was about the browsers and screen resolutions that we chose to fully support. We took the data directly from the website's visitor analytics. Basically, resolutions under 1024px and anything older than Edge 11 were left out of scope.
account42 5 days ago|||
Thanks for demonstrating why laws like the ADA are needed to force companies not to be bad citizens. We desperately need similar laws to force compatibility with older hardware; one could even champion it under environmental protection.
pchangr 3 days ago||
I absolutely agree. I’m very glad we have accessibility requirements in the EU.
0points 6 days ago||||
> 5% of people can't view them, yet 25% of top websites use them?

That's not how it works.

The server declares what versions of media it has, and the client requests a supported media format. The same trick has been used for audio and video for ages too.

Example:

    <picture>
        <source srcset="a.webp" type="image/webp">
        <img src="fallback.jpg">
    </picture>
vbezhenar 5 days ago||
This problem has been solved by HTTP since forever. The client sends an `Accept` header with its supported formats, and the server selects the appropriate content, indicated by the corresponding `Content-Type` header. You don't need any HTML tags for it.
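For illustration, a minimal sketch of that server side (assuming Flask; photo.webp and photo.jpg are hypothetical files on disk):

    from flask import Flask, request, send_file

    app = Flask(__name__)

    @app.route("/photo")
    def photo():
        # Pick the best format the client says it accepts.
        best = request.accept_mimetypes.best_match(
            ["image/webp", "image/jpeg"], default="image/jpeg")
        resp = send_file(
            "photo.webp" if best == "image/webp" else "photo.jpg",
            mimetype=best)
        # Caches must know the response varies by the Accept header.
        resp.headers["Vary"] = "Accept"
        return resp

One caveat: a negotiated response needs that Vary header, or a shared cache may hand WebP to a client that never asked for it.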
NorwegianDude 5 days ago|||
No, because that's just one of the features.

Images are often offered at different resolutions too; that way, depending on the pixel density of the device and the physical size, the browser can select a photo with high enough resolution, but not one that is needlessly large, while also selecting the preferred image format.

allendoerfer 5 days ago|||
What about file extensions?
georgyo 5 days ago|||
File extensions are just a hint about what the file might be and have nothing to do with what the file actually is. If the server sets the MIME type, the browser will use that as the hint.

But even beyond that, most file formats have a bit of a header at the start of the file that declares the actual format. Browsers can already understand that and use the correct renderer for a file without an extension.
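A minimal sketch of such sniffing (pure Python stdlib; the signature list is illustrative, not exhaustive):

    def sniff(path: str) -> str:
        # Identify the real format from the file's magic bytes,
        # ignoring whatever extension it happens to carry.
        with open(path, "rb") as f:
            head = f.read(12)
        if head.startswith(b"\x89PNG\r\n\x1a\n"):
            return "png"
        if head.startswith(b"\xff\xd8\xff"):
            return "jpeg"
        if head[:4] == b"RIFF" and head[8:12] == b"WEBP":
            return "webp"
        return "unknown"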

allendoerfer 5 days ago||
What if the user wants to use the file outside the browser, where they do not have access to the HTTP headers?
georgyo 4 days ago||
The same is true: if you rename a .png to .jpg and open it with an image viewer, it will still render.
jdiff 5 days ago|||
Sometimes respected, largely ignored. URLs very often don't map directly to files served.
allendoerfer 5 days ago||
Images almost always do.
jdiff 4 days ago||
I wish; that would make my job a good bit easier. Sometimes they don't even respect format query parameters and just use whatever's in your Accept headers.

Will say, though, that it's not universal; it depends heavily on the corner of the internet you're on.

sjsdaiuasgdia 6 days ago||||
Not all businesses are attempting to reach a market of "every internet user globally".
bawolff 6 days ago||||
Can the 5% view images at all? The number of web crawlers has exploded recently.
jdiff 5 days ago||
Yes, but it's 2% that are still using browsers without full support for WebP according to caniuse, which takes its numbers from StatCounter.

https://caniuse.com/webp

Note that I'm looking at "all tracked," which excludes the 2% of "other" browsers in the data whose feature set is not known.

pasc1878 5 days ago|||
Any industry.

E.g. cars - not everyone is physically able to drive; books - blind people can't read; music - deaf people can't hear.

It is a form of the 80/20 or 90/10 rule: the last small percentage costs as much as the majority.

danillonunes 5 days ago||
I agree with the point you're trying to make, but your examples are terrible. The music industry can't do much to help deaf people. It's not like they're deliberately making deaf-inaccessible music instead of relying on good old deaf-accessible music formats.

(Also, the parent comment's example is not so good because, as someone else pointed out, just because 25% of top websites are serving WebP doesn't mean they're not serving alternative formats for those who don't support it, as this is quite trivial to set up.)

Etheryte 6 days ago|||
I don't really think this is the case here. All major browsers already support the new spec, for example. This isn't a case of "oh, we'll have support for it eventually"; it's already there.
Hendrikto 5 days ago|||
> Momentum built, and additional parties became interested. […] we had representation from […] Adobe, Apple, BBC, Comcast / NBCUniversal, Google, MovieLabs, and […] W3C

> Many […] programs […] already support the new PNG spec: Chrome, Safari, Firefox, iOS/macOS, Photoshop, DaVinci Resolve, Avid Media Composer...

> Plus, you saw some broadcast companies in that list above. Behind the scenes, hardware and tooling are being updated to support the new PNG spec.

127 5 days ago|||
There's a big issue in that all the old popular image formats are 8-bit. 10 or even 12 bits would help a lot with storing more information and maintaining editability.
londons_explore 5 days ago|||
If adding more bits to an image format, please make it 'n-bit'. I.e. the file could be 8-bit, it could be 10, it could be 12, it could be 60-bit!

Whilst we're at it, please get rid of RGB and make it N channels too.

Libraries can choose to render that into a 3-channel, 8-bit buffer for legacy applications - but the data will be there for CMYK or HDR, or depth maps, or transparency, or focus stacking, or any other future feature!

Retr0id 5 days ago|||
PNG has supported 16 bits per channel since version 1.0 in 1996 (at least).
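The bit depth sits in the very first (IHDR) chunk, so it's easy to check; a quick stdlib sketch, assuming a local image.png:

    import struct

    with open("image.png", "rb") as f:
        assert f.read(8) == b"\x89PNG\r\n\x1a\n"    # PNG signature
        length, ctype = struct.unpack(">I4s", f.read(8))
        assert ctype == b"IHDR"                     # always the first chunk
        w, h, depth, color_type = struct.unpack(">IIBB", f.read(10))
        print(w, h, depth)                          # depth: 1, 2, 4, 8 or 16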
GuB-42 5 days ago|||
There are some applications for a new image format, but I agree that what we have is generally good enough.

We need good video formats, however. Video makes up most of global internet traffic, and probably accounts for a good part of global storage capacity too. Even slightly better compression will have a massive impact.

LocalH 5 days ago|||
I miss the days of old AmigaOS 3.x, where you had installable "DataTypes" that any program could make use of. If we had that, then all such programs could at least be updated to basic compatibility simply by updating the datatype.
account42 4 days ago||
All operating systems support some kind of shared-library and plugin architecture.
ProgramMax 5 days ago|||
It is very backwards compatible. I'm not sure why you thought that.

We jumped through quite a lot of hoops to make sure old software will be able to display new images. They simply won't display them optimally. But for the most part, that would be because the old software wouldn't display images optimally anyway. So the limit was the software, not the format.

What I mean by this is old software that treats everything as sRGB wouldn't correctly show a Display P3 image anyway. But we made sure it will still display the image as correctly as it could.

account42 5 days ago||
The sample HDR images don't show correctly in image viewers even though the colors used fit into the sRGB gamut (or at least have good approximations in there). That's not really backwards compatibility.
dev_l1x_be 5 days ago|||
> Look at poor webp

What about it?

"Lossless WebP is typically 26% smaller than PNG, while lossy WebP can be 25-34% smaller than JPEG at equivalent quality levels"

This literally saves hundreds of thousands in cost, bandwidth, and electricity every month on the internet. In fact, I strongly believe that this is one of the greatest contributions from Google to society, just like ZSTD from Facebook.

https://developers.google.com/speed/webp/docs/webp_study

Timwi 5 days ago|||
I don't think the commenter you replied to disagrees with any of that. They were talking about poor rates of adoption, not its feature set.
dev_l1x_be 5 days ago||
The biggest driver of adoption is features.

"WebP is used by 16.7% of all websites. This means that while it's a popular image format, it's not yet the dominant format, with JPEG still holding the majority share at 73.0%, according to W3Techs. However, WebP offers significant advantages in terms of compression and file size, making it a preferred choice for many web developers. "

Mr_Minderbinder 5 days ago||||
Those numbers are from Google. Third parties have not found WebP to be as good as Google claims.
account42 5 days ago||||
> equivalent quality levels

Therein lies the lie.

Image and video compression comparisons are like statistics: with the right corpus and evaluation criteria you can support whatever narrative you want to push.

poisonborz 5 days ago|||
Society wholeheartedly thanks Google for saving costs for Google
dev_l1x_be 4 days ago||
It saved money for our company too.

¯\_(ツ)_/¯

Retr0id 5 days ago||
Which aspects are not backwards compatible?

You'll never be able to faithfully represent an HDR image on a non-HDR system, but you'll still see an image.

account42 4 days ago||
The problem is when "HDR" images that would fit perfectly into the sRGB color space are not rendered correctly on non-HDR systems. This PNGv2 fails at that, which means it isn't really any more useful than one of the existing (and much better) HDR-supporting formats like JPEG-XL or the video-codec-based ones pushed by the big guys.
Retr0id 4 days ago||
If your image fits in sRGB colour space then why not just use sRGB?
account42 4 days ago||
Someone who is aware of this issue can do that. Someone who just uses the standard export might not notice that their files are not backwards compatible. It might also be that most of the image fits into sRGB but a few highlights do not - having the whole image washed out because of those is also not good.
Retr0id 3 days ago||
Software tends to default to sRGB export, because it's the safe default.
bartwe 5 days ago||
I'm worried that, by supporting too many encodings and color spaces, this will hamper adoption and lead to unexpected unsupported files. Perhaps this is more of an encoder/decoder library issue; hopefully libraries will give us Rec. 2020 rgb32/rgb10a2 encode/decode APIs so we can simply use them without having to know so many details.
4ad 4 days ago||
HDR is about, well, high-dynamic-range images, usually expressed with at least 10 bits of precision (although it can also be float, etc.), and often, but not always, encoding scene-referred data instead of image-referred data (originally it was supposed to encode only scene-referred data, but then other competing formats ignored that). It has nothing to do with the gamut or the color primaries, although in practice HDR images use a large color space.

But you can absolutely have an SDR image encoded using a large color space. So I am not sure why the author talks about color primaries when trying to justify HDR… I still don’t know what kind of HDR images this new PNG variant can encode.
