
It's not just Google; Mozilla has no desire to introduce a barely supported massive C++ decoder for marginal gains either:

https://github.com/mozilla/standards-positions/pull/1064

avif is just better for typical web image quality: it produces better-looking images and its artifacts aren't as annoying (smoothing instead of blocking and ringing around sharp edges).

You also get it basically for free because it's just an AV1 keyframe. Every browser already needs an AV1 decoder unless it's willing to forego users who would like to be able to watch Netflix and YouTube.



I don't understand what you're trying to say. Mozilla said over a year ago that they would support JXL as soon as there's a fast, memory-safe decoder that will be maintained.

Google, on the other hand, never expressed any desire to support JXL at all, regardless of the implementation. Only now, after the PDF Association announced that PDF would be using JXL, did they decide to support JXL on the web.

> avif is just better for typical web image quality: it produces better-looking images and its artifacts aren't as annoying (smoothing instead of blocking and ringing around sharp edges).

AVIF is certainly better for the level of quality that Google wants you to use, but in reality, images on the web are much higher quality than that.

And JXL is pretty good if you want smoothing; in fact, libjxl's defaults have recently gotten so overly smooth that it's considered a problem, one they're in the process of fixing.


> I don't understand what you're trying to say. Mozilla said over a year ago that they would support JXL as soon as there's a fast, memory-safe decoder that will be maintained.

Did they actually say that? All the statements I've seen from them have been much more guarded and vague. More of a "maybe we will think about it if that happens."


> If they successfully contribute an implementation that satisfies these properties and meets our normal production requirements, we would ship it.

That's what they said a year ago. And a couple of Mozilla devs have been in regular contact with the JXL devs ever since then, helping with the integration. The patches to use jxl-rs with Firefox already exist, and will be merged as soon as a couple of prerequisite issues in Gecko are fixed.


Their standards position is still neutral[1]; what changed a year ago was that they said they would be open to shipping an implementation that met their requirements. The tracking bug hasn't been updated.[2] The patches you mention are still part of the intent to prototype (behind a flag), similar to the earlier implementation that was removed from Chrome.

They're looking at the same signals as Chrome: a format that's actually getting used, has a memory-safe implementation, and will stick around for decades, enough to justify adding it to the web platform. All of those signals have looked more and more positive since 2022.

[1] https://mozilla.github.io/standards-positions/#jpegxl

[2] https://bugzilla.mozilla.org/show_bug.cgi?id=1539075


I disagree about the image quality at typical sizes - I find JPEG-XL is generally similar to or better than AVIF at any reasonable compression ratio for web images. See this for example: https://tonisagrista.com/blog/2023/jpegxl-vs-avif/

AVIF only comes out as superior at extreme compression ratios, at much lower bitrates than are typically used for web images, and the images generally look like smothered messes at those extreme ratios.


Not everything in the world is passive end-of-the-line presentation. JPEG-XL is the only one that tries to be a general-purpose image format.


If that's the case, let it be a feature of image editing packages that can output formats that are for the web. It's a web standard we're talking about here, not a general-purpose image format, so asking browsers to carry that big code load seems unreasonable when existing formats do most of what we need and want for the web.


People generally expect browsers to display general-purpose image formats. It's why they support formats like classical JPEG, instead of just GIF and PNG.

Turns out people really like being able to just drag-and-drop an image from their camera into a website - being forced to re-encode it first isn't exactly popular.


> Turns out people really like being able to just drag-and-drop an image from their camera into a website - being forced to re-encode it first isn't exactly popular.

That’s a function of the website, not the browser.


> That’s a function of the website, not the browser.

That's hand-waving away quite a lot. The task changes from serving a copy of a file on disk, as with every other image format in common use, to needing a transcoding pipeline more akin to sites like YouTube. Technically possible, but lots of extra complexity in return for what gain?


Even though AVIF decoding support is fairly widespread by now, it is still not ubiquitous like JPEG/PNG/GIF. So services typically store or generate the same image in multiple formats, including AVIF for bandwidth optimization and JPEG for universal client support. Browser Accept headers help to determine compatibility, but it's still fairly complicated to implement, and users also end up having to deal with different platforms supporting different formats when they are served WebP or AVIF and want to reupload an image somewhere else that does not like those formats. As far as I can tell, JXL solves that issue for most websites since it is backwards-compatible: a JPEG-sourced JXL can be converted back to JPEG when a client does not support JXL. I would happily give up a few percent in compression efficiency to get back to a single all-purpose lossy image format.
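
To illustrate the negotiation half of that (a minimal sketch only; the file names and the bare-bones WSGI setup are hypothetical, not anyone's production code), a server can pick the best pre-generated variant per request from the Accept header and fall back to JPEG:

```python
# Minimal sketch of Accept-header content negotiation (hypothetical paths/names).
# Assumes pre-generated variants of the same image already exist on disk.
from wsgiref.simple_server import make_server

VARIANTS = [                      # preferred formats first
    ("image/jxl", "photo.jxl"),
    ("image/avif", "photo.avif"),
    ("image/webp", "photo.webp"),
    ("image/jpeg", "photo.jpg"),  # universal fallback
]

def pick_variant(accept_header: str):
    """Return (mime, path) for the first variant the client says it accepts."""
    accepted = {part.split(";")[0].strip() for part in accept_header.split(",")}
    for mime, path in VARIANTS:
        if mime in accepted:
            return mime, path
    return VARIANTS[-1]           # JPEG for clients that don't advertise support

def app(environ, start_response):
    mime, path = pick_variant(environ.get("HTTP_ACCEPT", ""))
    with open(path, "rb") as f:
        body = f.read()
    # Vary: Accept so caches keep separate copies per advertised format support.
    start_response("200 OK", [("Content-Type", mime), ("Vary", "Accept")])
    return [body]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```

The point isn't the code itself but that every image now needs this dance (plus the pre-generated variants), which a single universally supported format would make unnecessary.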


Even Google Photos does not support AVIF.

It's almost as if Google had an interest in increased storage and bandwidth. Of course they don't, but as a paying Drive user I'm overcharged for the same thing.


> Even Google Photos does not support AVIF

I have no previous first-hand knowledge of this, but I vaguely remember discussions of AVIF in Google Photos on Reddit a while back, so FWIW I just tried uploading some AVIF photos and it handled them just fine.

They're listed as AVIF in the file info and download as the original file, though inspecting the network tab in the web frontend shows it serving versions as JPG and WebP, so there's obviously still transcoding going on.

I'm not sure when they added support; the consumer documentation seems to be more landing site than docs, unless I'm completely missing the right page, but the API docs list AVIF support[1], and according to the Wayback Machine, "AVIF" was added to that page some time between August and November 2023.

[1] https://developers.google.com/photos/library/guides/upload-m...


You are correct that it is possible to upload AVIF files into Google Photos. But you lose the view and of course the thumbnail, defeating the whole purpose of putting them into Photos.

Given it's an app, they didn't even need Google Chrome to add support. AVIF is supported natively on Android.


> You are correct that it is possible to upload AVIF files into Google Photos. But you lose the view and of course the thumbnail.

I'm not sure what you mean. They appear to act like any other photo in the interface. You can view them and they're visible in the thumbnail view, but maybe I'm misinterpreting what you mean?


Or perhaps I don't see what you see.

I take a photo; the format is JPEG. It backs up to Google Photos, and the Google Photos app on Android renders the photo just fine.

I then convert that photo (via a local converter) to AVIF. Google backs it up, and I can see the file in Google Photos on Android, but it doesn't render the image, whether at full size or as a thumbnail; all I get is a grayed square. So I concluded the app doesn't support AVIF rasterizing.

I then gave up on the automation that converted all my JPEGs into AVIF, which would have saved hundreds of gigabytes given I have 10 years' worth of photos.

The experiment was done about 3 months ago; as of 2025, the latest version of Google Photos on Android would not render my AVIF photos.


Some years ago, the Google Photos team asked the Chrome team to support JXL, so that they could use it for Photos. The request was ignored, of course.


They could have added support to the app themselves, as it doesn't use the WebView.


Google Photos isn't just the app


See the cousin comment: it accepts AVIF files, so at least they could render in the app, which would be enough for many. As it stands, it accepts the format and renders nothing at all.


The killer feature of JXL is that most websites already have a whole bunch of images in JPEG format, and converting those to JXL shrinks them by about 30% without introducing any new artifacts.
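
For anyone curious what that looks like in practice, here's a minimal sketch assuming the libjxl command-line tools (cjxl/djxl) are installed; the file names are made up. By default, cjxl recompresses an existing JPEG losslessly, and djxl can reconstruct the original JPEG from the .jxl file:

```python
# Sketch: losslessly recompress an existing JPEG to JXL and measure the savings.
# Assumes the libjxl CLI tools (cjxl, djxl) are on PATH; file names are examples.
import os
import subprocess

src = "photo.jpg"   # existing web asset (hypothetical)
out = "photo.jxl"

# For JPEG input, cjxl defaults to lossless recompression of the JPEG data,
# so the original file can later be reconstructed exactly.
subprocess.run(["cjxl", src, out], check=True)

jpeg_size = os.path.getsize(src)
jxl_size = os.path.getsize(out)
print(f"JPEG: {jpeg_size} bytes, JXL: {jxl_size} bytes "
      f"({100 * (1 - jxl_size / jpeg_size):.1f}% smaller)")

# Reconstruct the original JPEG for clients that don't support JXL.
subprocess.run(["djxl", out, "photo.reconstructed.jpg"], check=True)
```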


> Mozilla has no desire to introduce a barely supported massive C++ decoder for marginal gains

On a slightly related note, I wanted to have an HDR background image in Windows 11. Should be a breeze in 2025, right?

Well, Windows 11 only supports JPEG XR[1] for HDR background images. And my commonly used tools either did not support JPEG XR (GIMP, for example) or did not work correctly (ImageMagick).

So I had a look at the JPEG XR reference implementation, which was hosted on Codeplex but has been mirrored on GitHub[2]. And boy, I sure hope that isn't the code that lives in Windows 11...

OK, most of the gunk is in the encoder/decoder wrapper code, but still, for something that's supposedly still in active use by Microsoft... Though the fact that they don't even host their own copy of the reference implementation is telling enough, I suppose.

[1]: https://en.wikipedia.org/wiki/JPEG_XR

[2]: https://github.com/4creators/jxrlib


Another JPEG XR user is Zeiss. It saves both grayscale and color microscope images with JPEG XR compression in a container format. Zeiss also released a C++ library (libczi) that uses the reference JPEG XR implementation to read/write these images. However, Zeiss is moving away from JPEG XR - the newer version of its microscope control software saves with zstd compression by default.


"Marginal Gains"

Generation Loss – JPEG, WebP, JPEG XL, AVIF : https://www.youtube.com/watch?v=w7UDJUCMTng


Marginal gains over AVIF.

(Also I am highly skeptical of the importance of these generation loss tests.)


Very nice in video workflows, where it's common to write out image sequences to disk.


Social media exists


> avif is just better for typical web image quality

What does "typical web image quality" even mean? I see lots of benchmarks with very low BPPs, like 0.5 or even lower, and that's where video-based image codecs shine.

However, I just visited CNN.com and these are the BPPs of the first 10 images my browser loaded: 1.40, 2.29, 1.88, 18.03 (PNG "CNN headlines" logo), 1.19, 2.01, 2.21, 2.32, 1.14, 2.45.

I believe people are underestimating the BPP values that are actually used on the web. I'm not saying that low-BPP images don't exist, but clearly it isn't hard to find examples of higher-quality images in the wild.
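
For reference, here's a minimal sketch of how such a bits-per-pixel figure is computed (the file name is a made-up example); it's just the compressed file size in bits divided by the pixel count:

```python
# Sketch: compute bits per pixel (BPP) for an image file.
# Requires Pillow (pip install pillow); the file name is a hypothetical example.
import os
from PIL import Image

def bits_per_pixel(path: str) -> float:
    """Compressed size in bits divided by the number of pixels."""
    size_bits = os.path.getsize(path) * 8
    with Image.open(path) as img:
        width, height = img.size
    return size_bits / (width * height)

print(f"{bits_per_pixel('headline.jpg'):.2f} bpp")  # e.g. ~1.4 for a typical news photo
```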


Can AVIF display 10-bit HDR with the larger color gamut that any modern phone nowadays is capable of capturing?


> Can AVIF display 10-bit HDR with the larger color gamut that any modern phone nowadays is capable of capturing?

Sure, 12-bit too, with HDR transfer functions (PQ and HLG), wide-gamut primaries (BT.2020, P3, etc.), and high-dynamic-range metadata (ITU/CTA mastering metadata, content light level metadata).

JPEG XL matches or exceeds these capabilities on paper, but not in practice. The reality is that the world is going to support the JPEG XL capabilities that Apple supports, and probably not much more.
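
For the curious, here's a minimal sketch of the PQ (SMPTE ST 2084) transfer function mentioned above; the constants come from the standard, while the function name and example values are mine:

```python
# Sketch: the PQ (SMPTE ST 2084) EOTF, mapping a normalized signal in [0, 1]
# (e.g. a 10-bit code value divided by 1023) to absolute luminance in nits.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """PQ-encoded signal (0..1) -> display luminance in cd/m^2 (nits)."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

print(pq_eotf(1.0))    # 10000 nits, the PQ peak
print(pq_eotf(0.58))   # ~200 nits, roughly HDR reference white
```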


if you actually read your parent comment: "typical web image quality"


Typical web image quality is the way it is partly because of the lack of support. It's literally more difficult to show a static HDR photo than a whole video!


PNG supports HDR with up to 16 bits per channel, see https://www.w3.org/TR/png-3/ and the cICP, mDCV and cLLI chunks.


With incredibly bad compression ratios.


HDR should not be "typical web" anything. It's insane that websites are allowed to override my system brightness setting through HDR media. There's so much stuff out there that literally hurts my eyes if I've set my brightness such that pure white (SDR FFFFFF) is a comfortable light level.

I want JXL in web browsers, but without HDR support.


There's nothing stopping browsers from tone mapping[1] those HDR images using your tone mapping preference.

[1]: https://en.wikipedia.org/wiki/Tone_mapping
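
For illustration, here's a minimal sketch of one way such a preference could work (the function and its parameters are hypothetical, not any browser's actual behavior): luminance up to SDR white passes through unchanged, and anything brighter is rolled off toward a user-chosen peak.

```python
# Sketch of a user-controlled highlight rolloff, in linear light.
# 1.0 == the brightness of SDR #ffffff; names and defaults are illustrative only.
def tone_map(l_in: float, user_peak: float = 1.0) -> float:
    """Leave SDR-range luminance alone; compress anything brighter toward user_peak.

    With user_peak == 1.0, no pixel ever exceeds the brightness of SDR white.
    """
    if l_in <= 1.0:
        return l_in
    excess = l_in - 1.0
    headroom = user_peak - 1.0               # extra brightness the user allows
    return 1.0 + headroom * excess / (1.0 + excess)

print(tone_map(5.0, user_peak=1.0))  # 1.0: HDR highlights clamp to SDR white
print(tone_map(5.0, user_peak=2.0))  # 1.8: a little headroom, never above 2.0
```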


What does that achieve? Isn't it simpler to just not support HDR than to support HDR but tone map away the HDR effect?

Anyway, which web browsers have a setting to tone map HDR images such that they look like SDR images? (And why should "don't physically hurt my eyes" be an opt-in setting anyway instead of just the default?)


> What does that achieve?

Because then a user who wants to see the HDR image in all its full glory can do so. If the base image is not HDR, then there is nothing they can do about it.

> And why should "don't physically hurt my eyes" be an opt-in setting anyway instead of just the default?

While I very much support more HDR in the online world, I fully agree with you here.

However, I suspect the reason will boil down to what it usually does: almost no users change the default settings ever. And so, any default which goes the other way will invariably lead to a ton of support cases of "why doesn't this work".

That said, web browsers are already dark-mode aware; they could be HDR-aware too and do what you prefer based on that.


What user wants the web to look like this? https://floss.social/@mort/115147174361502259


That video is clearly not encoded correctly. If it were, the levels would match the background, given there is no actual HDR content visible in that video frame.

Anyway, even if the video was of a lovely nature scene in proper HDR, you might still find it jarring compared to the surrounding non-HDR desktop elements. I might too, depending on the specifics.

However, like I said, it's up to the browser to handle this.

One suggestion I saw mentioned by some browser devs was to make the default to tone map HDR if the page is not viewed in fullscreen mode, and switch to full HDR range if it is fullscreen.

Even if that doesn't become the default, it could be a behavior the browser could let the user select.


> That video is clearly not encoded correctly.

Actually I forgot about auto-HDR conversion of SDR videos which some operating systems do. So it might not be the video itself, but rather the OS and video driver ruining things in this case.


Ideally, browsers should just not support HDR.


Well I strongly disagree on that point.

Just because we're in the infancy of wide HDR adoption and thus experience some niggling issues while software folks work out the kinks isn't a good reason to just wholesale forego the feature in such a crucial piece of infrastructure.

Sure, if you don't want HDR in the browser I do think there should be a browser option to let you achieve that. I don't want to force it on everyone out there.

Keep in mind that the screenshot you showed is how things looked on my Windows machine until I changed the auto-HDR option. It wasn't the browser that did it; the browser was completely innocent.

It was just so long ago I completely forgot I had changed that OS configuration.


If you want to avoid eye pain then you want caps on how much brightness can be in what percent of the image, not to throw the baby out with the bathwater and disable it entirely.

And if you're speaking from iPhone experience, my understanding is that the main problem there isn't extra-bright things in the image; it's the renderer ignoring your brightness settings when HDR shows up, which is obviously stupid and not a problem with HDR in general.


If the brightness cap of the HDR image is full SDR brightness, what value remains in HDR? As far as I can see, it's all bath water, no baby


> If the brightness cap of the HDR image is full SDR brightness, what value remains in HDR?

If you set #ffffff to be a comfortable max, then that would be the brightness cap for HDR flares that fill the entire screen.

But filling the entire screen like that rarely happens. Smaller flares would have a higher cap.

For example, let's say an HDR scene has an average brightness that's 55% of #ffffff, but a tenth of the screen is up at 200% of #ffffff. That should give you a visually impressive boosted range without blinding you.
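
A minimal sketch of the kind of policy being described (the curve, numbers, and names are entirely hypothetical; no browser implements this): the allowed peak shrinks as the bright region covers more of the screen, so full-screen content can never exceed the user's SDR white.

```python
# Sketch of an area-dependent brightness cap (hypothetical policy, not a real API).
# Brightness is relative to the user's comfortable SDR white (1.0 == #ffffff).
def allowed_peak(bright_area_fraction: float, exponent: float = 0.3) -> float:
    """Peak brightness allowed for a region covering that fraction of the screen.

    Full-screen content (fraction 1.0) is capped at the user's SDR white;
    smaller regions get progressively more headroom.
    """
    f = min(max(bright_area_fraction, 0.001), 1.0)  # clamp to avoid a runaway cap
    return f ** -exponent

print(allowed_peak(1.0))   # 1.0  -> a full-screen flash never exceeds SDR white
print(allowed_peak(0.1))   # ~2.0 -> a tenth of the screen may reach ~200% of white
print(allowed_peak(0.01))  # ~4.0 -> tiny specular highlights get the most headroom
```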


Oh.

I don't want the ability for 10% of the screen to be so bright it hurts my eyes. That's the exact thing I want to avoid. I don't understand why you think your suggestion would help. I want SDR FFFFFF to be the brightest any part of my screen goes to, because that's what I've configured to be at a comfortable value using my OS brightness controls.


I strongly doubt that the brightness it takes to hurt your eyes is the same for 10% of the screen as for 100% of the screen.

I am not suggesting eye hurting. The opposite really, I'm suggesting a curve that stays similarly comfortable at all sizes.


I don't want any one part of my screen to be a stupidly bright point light. It's not just the total amount of photons that matters.


It is not just the total amount.

But it's not the brightest spot either.

It's in between.


I just don't want your "in between" "only hurt my eyes a little" solution. I don't see how that's so hard to understand. I set my brightness so that SDR FFFFFF is a comfortable max brightness. I don't understand why web content should be allowed to go brighter than that.


I'm suggesting something that WON'T hurt your eyes. I don't see how that's so hard to understand.

You set a comfortable max brightness for the entire screen.

Comfortable max brightness for small parts of the screen is a different brightness. Comfortable. NO eye hurting.


It's still uncomfortable to have 10% of the screen get ridiculously bright.


Yes, it's uncomfortable to have it get "ridiculously" bright.

But there's a level that is comfortable that is higher than what you set for FFFFFF.

And the comfortable level for 1% of the screen is even higher.

HDR could take advantage of that to make more realistic scenes without making you uncomfortable - if it was coded right to respect your limits. Which it probably isn't right now. But it could be.


I severely doubt that I could ever be comfortable with 10% of my screen getting much brighter than the value I set as max brightness.

But say you're right. Now you've achieved images looking completely out of place. You've achieved making the surrounding GUI look grey instead of white. And the screen looks broken when it suddenly dims after switching tabs away from one with an HDR video. What's the point? Even ignoring the painful aspects (which is a big thing to ignore, since my laptop currently physically hurts me at night with no setting to make it not hurt me, which I don't appreciate), you're just making the experience of browsing the web worse. Why?


In general, people report that HDR content looks more realistic and pretty. That's the point, if it can be done without hurting you.


Do they? Do people report that an HDR image on a web page that takes up roughly 10% of the screen looks more realistic? Do they report that an HDR YouTube video, which mostly consists of a screen recording with the recorded SDR FFF being mapped to the brightness of the sun, looks pretty? Do people like when their light-mode GUI suddenly turns grey as a part of it becomes 10x the brightness of what used to be white? (see e.g https://floss.social/@mort/115147174361502259)

Because that's what HDR web content is.

HDR movies playing on a livingroom TV? Sure, nothing against that. I mean it's stupid that it tries to achieve some kind of absolute brightness, but in principle, some form of "brighter than SDR FFF" could make sense there. But for web content, surrounded by an SDR GUI?


> when their light-mode GUI suddenly turns grey as a part of it becomes 10x the brightness of what used to be white

I don't know why you're asking me about examples that violate the rules I proposed. No I don't want that.

And obviously boosting the brightness of a screen capture is bad. It would look bad in SDR too. I don't know why you're even bringing it up. I am aware that HDR can be done wrong...

But for HDR videos where the HDR actually makes sense, yeah it's fine for highlights in the video to be a little brighter than the GUI around them, or for tiny little blips to be significantly brighter. Not enough to make it look gray like the misbehavior you linked.


It actually is somewhat of an HDR problem, because the HDR standards made some dumb choices. SDR standardizes relative brightness, but HDR uses absolute brightness, even though that's an obviously dumb idea and in practice no one with a brain actually implements it.


In a modern image chain, capture is more often than not HDR.

These images are then graded for HDR or SDR. I.e., sacrifices are made on the image data such that it is suitable for a display standard.

If you have an HDR image, it's relatively easy to tone-map that into SDR space, see e.g. BT.2408 for an approach in Video.

The underlying problem here is that the Web isn't ready for HDR at all, and I'm almost 100% confident browsers don't do the right things yet. HDR displays have enormous variance, from "slightly above SDR" to experimental displays at Dolby Labs. So to display an image correctly, you need to render it properly to the display's capabilities. Likewise if you want to display an HDR image on an SDR monitor. I.e., tone mapping is a required part of the solution.

A correctly graded HDR image taken of the real world will have something like 95% of the pixel values falling within your typical SDR (Rec.709/sRGB) range. You only use the "physically hurt my eyes" values sparingly, and you take the room conditions into consideration when designing the peak value. As an example: cinemas using DCI-P3 peak at 48 nits because the cinema is completely dark. 48 nits is more than enough for a pure white in that environment. But take that image and put it on a display sitting indoors during the day, and it's not nearly enough for white. Add HDR peaks into this, and it's easy to see that in a cinema you probably shouldn't peak at 1000 nits (which is about 4.x stops of light above the DCI-P3 peak). In short: rendering to the display's capabilities requires that you probe the light conditions in the room.
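
For reference, the "4.x stops" figure is just the base-2 logarithm of the luminance ratio:

```python
# Quick check of the "about 4.x stops" figure: stops are log2 of a luminance ratio.
import math

dci_p3_peak_nits = 48    # cinema reference peak white mentioned above
hdr_peak_nits = 1000     # the hypothetical HDR peak from the example

print(math.log2(hdr_peak_nits / dci_p3_peak_nits))  # ~4.38 stops
```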

It's also why you shouldn't be able to manipulate brightness on an HDR display. We need that to be part of the image rendering chain such that the right decisions can be made.



How about websites just straight up aren't allowed to physically hurt me, by default?


Web sites aren’t made for just you. If images from your screen are causing you issues, that is a you / your device problem, not a web site problem.


I agree, it's not a web site problem. It's a web standards problem that it's possible for web sites to do that.


Note the spec does recommend providing a user option: https://drafts.csswg.org/css-color-hdr-1/#a11y


You asked “which web browsers have a setting to tone map HDR images such that they look like SDR images?”; I answered. Were you not actually looking for a solution?


I was looking for a setting, not a hack.



