The past, present and future of image formats
zokier 2021-08-19 10:29:55 +0000 UTC [ - ]
Notably, Canon and Sony support HEIC natively in their cameras. It probably makes a lot of sense to them, as they need H.265 encoders anyway for video capture, so they get HEIC pretty much for free.
Also Android supports HEIC these days.
Together with Apple also supporting HEIC, it pretty much guarantees that HEIC is here to stay.
Meanwhile I haven't heard of any camera manufacturer supporting either AVIF or JPEG XL. Even Android doesn't support AVIF yet. So outside the web bubble, it's AVIF that has little to no support.
Youden 2021-08-19 11:01:45 +0000 UTC [ - ]
The situation isn't quite so simple. Android devices are only required to support HEIC if they also support HEVC decoding [0]. Interestingly, this seems to be less strict than the Android 10 CDD [1].
[0]: https://source.android.com/compatibility/11/android-11-cdd#5...
[1]: https://source.android.com/compatibility/10/android-10-cdd#5...
ksec 2021-08-19 15:37:20 +0000 UTC [ - ]
Yes. A lot of the conversation talks about image and video as if the web were the only platform. Just like how AV1 is supposed to be everything because of YouTube, while completely neglecting the rest of the video industry, from broadcasting to movies, etc.
skyfaller 2021-08-19 12:52:02 +0000 UTC [ - ]
instead of requiring us to store many copies of the same image for responsive design, that would be a godsend and would immediately convert me to JPEG XL for graphics on websites. I don't want the complication of generating, storing, and serving the right image to the right client; there should be one image, and clients should download only the amount they want.
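With a progressive codec, "download only the amount you want" can literally be an HTTP Range request: the client fetches the first chunk of the file and decodes a lower-resolution preview from just that prefix. A minimal sketch of building such a request (the URL, byte budget, and function name here are made up for illustration):

```python
from urllib.request import Request

def partial_image_request(url: str, byte_budget: int) -> Request:
    """Request only the first `byte_budget` bytes of a progressively
    encoded image; a client can stop there and still decode a usable
    lower-resolution version of the picture."""
    return Request(url, headers={"Range": f"bytes=0-{byte_budget - 1}"})

req = partial_image_request("https://example.com/photo.jxl", 64 * 1024)
print(req.get_header("Range"))  # bytes=0-65535
```

The server must support range requests (most static-file servers do), and the format must put coarse image data first, which is exactly what progressive encoding provides.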
Santosh83 2021-08-19 14:13:10 +0000 UTC [ - ]
edflsafoiewq 2021-08-19 13:21:42 +0000 UTC [ - ]
oenetan 2021-08-18 12:19:26 +0000 UTC [ - ]
For more info on jpegxl, go to https://jpegxl.info/
ksec 2021-08-19 17:01:17 +0000 UTC [ - ]
Correct me if I am wrong on the calculation.
The down-scaled PNG shows 5 images within 6106 x 1200 pixels. At 100 KB per image, that works out to a bpp of ~0.55. So despite the favourable results, I need to state again that JPEG XL isn't yet optimised for low bpp (below 1).
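For anyone checking the arithmetic, here is the bits-per-pixel estimate spelled out (the 6106 x 1200 canvas and ~100 KB per image come from the comment above; the even five-way split of the width is my assumption):

```python
# Estimate bits per pixel (bpp) for one of the five side-by-side images.
canvas_w, canvas_h = 6106, 1200   # down-scaled PNG dimensions
n_images = 5                      # five images shown side by side
bytes_per_image = 100 * 1024      # ~100 KB per compressed image

pixels_per_image = (canvas_w / n_images) * canvas_h
bpp = (bytes_per_image * 8) / pixels_per_image
print(f"{bpp:.2f}")  # ~0.56, i.e. well below 1 bpp
```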
Reposting my previous comment ( https://news.ycombinator.com/item?id=27577701 ) on JPEG XL from a few months ago.
Comment from Facebook Infra [1],
>Erik Andre from the Images Infra team at Facebook here. I'd like to share a bit of our view on JPEG XL in the context of new image formats (e.g. AVIF, JPEG XL, WEBP2, ...) and how browser adoption will let us move forward with our plans to test and hopefully roll out JPEG XL.
>After spending the last 5 months investigating and evaluating JPEG XL from both a performance and quality point of view, it's our opinion that JPEG XL has the most potential of the new generation of image formats that are trying to succeed JPEG.
>This opinion is based on the following findings:
>JPEG XL encoding at speed/effort 6 is as fast as JPEG encoding (using MozJPEG with Trellis encoding enabled). This means that it's practical to encode JPEG XL images on the fly and serve them to clients. This can be compared with the encoding speed of AVIF, which necessitates offline encoding and thus offers much less flexibility when it comes to delivering dynamically sized and compressed content.
>Depending on the settings used, JPEG XL can also be very fast to decode. Our mobile benchmarks show that we can reach parity with JPEG when using multiple threads to decode. This matches, and in many cases surpasses, the decoding performance of the other new image formats.
>The JPEG XL format supports progressive decoding, offering gains in perceived image load performance similar to those we already benefit from with JPEG. This is a feature lacking in the other new image formats, which are all derived from video codecs, where such a feature makes little sense to support.
>Having browser support from all the major browsers is going to make our lives a lot easier in upcoming WWW experiments and ensure that we can deliver a consistent experience across platforms and browsers.
Blink tracking bug [2], currently behind a flag; Firefox on both [1] and [3], currently in about:preferences#experimental on Firefox Nightly. If I remember correctly it is supported in Edge behind a parameter as well. I thought it had all gone very quiet after the standard was published; it turns out both Chrome and Firefox intend to support it.
AFAIK, neither WebKit nor Safari has any plan or intention to support JPEG XL. I think (someone correct me if I am wrong) Safari uses the macOS image decoding library, so JPEG XL support may have to come from an OS update rather than the browser?
Finally: an open standard, royalty-free, with an open-source reference implementation, and it is better than nearly all the alternatives. As an image format for the web it is quite possibly close to perfect [4]. It is exciting and I hope JPEG XL will succeed.
[1] https://bugzilla.mozilla.org/show_bug.cgi?id=1539075
[2] https://bugs.chromium.org/p/chromium/issues/detail?id=117805...
[3] https://bugzilla.mozilla.org/show_bug.cgi?id=1707590
[4] I remember from the conversation a little more than 6 months ago that the current encoder is not optimised for image quality below 1.0 bpp; that will be the focus once all the initial reference encoder/decoder standards and issues are done. So in case anyone is wondering why it doesn't look as good as the other competitors (though still a lot better than JPEG): those improvements are coming later.
imaginariet 2021-08-19 10:10:11 +0000 UTC [ - ]
> It would be great if web browsers would accept in an <img> tag all the video codecs they can play in a <video> tag, the only difference being that in an <img> tag, the video is autoplayed, muted, and looped. That way, new and masterful video codecs like VP9 and AV1 would automatically work for animations, and we can finally get rid of the ancient GIF format.
Of course, the typical rebuttal on HN will be "I hate autoplay, nothing should autoplay without an explicit click", which is why we'll be stuck with GIF for another 10 years.
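The workaround today is to serve the animation as a muted, looping <video> instead of a GIF, which is essentially what the quoted proposal wants <img> to do natively. A small sketch of generating that markup (the function name and the MP4/GIF pairing are my own illustration):

```python
def gif_to_video_tag(gif_url: str, video_url: str) -> str:
    """Build the <video> markup commonly used as a GIF replacement:
    autoplayed, muted, looped, and played inline, with the original
    GIF kept as fallback content for browsers without video support."""
    return (
        '<video autoplay muted loop playsinline>'
        f'<source src="{video_url}" type="video/mp4">'
        f'<img src="{gif_url}" alt="">'  # fallback for very old browsers
        '</video>'
    )

print(gif_to_video_tag("cat.gif", "cat.mp4"))
```

The video version is typically an order of magnitude smaller than the GIF, which is why sites like Twitter already transcode uploaded GIFs to video behind the scenes.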
gardaani 2021-08-19 10:48:56 +0000 UTC [ - ]
[1] https://developer.apple.com/documentation/webkit/delivering_...
[2] https://bugs.chromium.org/p/chromium/issues/detail?id=791658...
acdha 2021-08-19 12:27:40 +0000 UTC [ - ]
pwdisswordfish8 2021-08-19 13:56:34 +0000 UTC [ - ]
acdha 2021-08-19 15:16:14 +0000 UTC [ - ]
Zababa 2021-08-19 11:41:06 +0000 UTC [ - ]
Why? If you believe nothing should autoplay, GIFs are also part of the problem.
rhdunn 2021-08-19 15:08:38 +0000 UTC [ - ]
chrismorgan 2021-08-19 12:30:01 +0000 UTC [ - ]
idoubtit 2021-08-19 14:22:20 +0000 UTC [ - ]
chrismorgan 2021-08-19 15:16:58 +0000 UTC [ - ]
magicalhippo 2021-08-19 10:30:29 +0000 UTC [ - ]
I'm generally against autoplaying, but I do accept that in some cases it is a nice thing.
So this proposal seems reasonable, if browsers also get a separate option to disable this behavior.
It's important, though, that JavaScript can't override this behavior, to prevent abuse (say, by unmuting), so browsers would have to take some precautions against that.
ThePadawan 2021-08-19 13:43:59 +0000 UTC [ - ]
[0] https://developer.mozilla.org/en-US/docs/Web/Media/Autoplay_...
vbezhenar 2021-08-19 11:50:10 +0000 UTC [ - ]
unwind 2021-08-19 11:58:33 +0000 UTC [ - ]
At least I would be fine with that, and I can be quite annoyed with auto-playing video.
Santosh83 2021-08-19 12:37:57 +0000 UTC [ - ]
Edge mysteriously does not yet support AVIF despite being built on the same Chromium base that does support it, which means the AVIF code is deliberately compiled out by MS for some reason.
Safari is a different kettle of fish. It supports neither format yet, but there seem to be vague indications on the issue tracker that one day it may land AVIF support. For that matter, even WebP support is confined to Safari on Big Sur and later, according to caniuse. It truly is shaping up to take over from IE as the browser for which one has to go to great lengths to code fallbacks.
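Those fallbacks usually end up as a <picture> element listing formats from newest to oldest, with a universally supported JPEG as the last resort. A sketch of generating one (the helper name, file-naming scheme, and format ordering are my own assumptions):

```python
def picture_markup(stem: str, formats=("avif", "webp")) -> str:
    """Emit <picture> markup offering modern formats first; the browser
    picks the first <source> whose MIME type it supports and otherwise
    falls back to the JPEG in the inner <img>."""
    sources = "".join(
        f'<source srcset="{stem}.{fmt}" type="image/{fmt}">' for fmt in formats
    )
    return f'<picture>{sources}<img src="{stem}.jpg" alt=""></picture>'

print(picture_markup("hero"))
```

The cost the commenter is complaining about is exactly this: every image now has to be encoded, stored, and served in three formats instead of one.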
IvanK_net 2021-08-19 12:11:50 +0000 UTC [ - ]
Here is the GIF version of their final image (249 kB) https://i.imgur.com/8yt7NS0.gif
kettleballroll 2021-08-19 12:00:19 +0000 UTC [ - ]
ggm 2021-08-19 11:12:31 +0000 UTC [ - ]
londons_explore 2021-08-19 10:03:08 +0000 UTC [ - ]
Of that small proportion, the CPU time spent rendering the images, re-laying out the page for changed image dimensions, and re-rendering the image and all overlapping text and layers each time a bit more of it loads is more substantial than the network delay of loading the data.
The TL;DR is that optimising images isn't the place to start if you want to make most websites load faster, especially on mobile.
Obviously if CPU speeds increase faster than network speeds, image size may matter again in the future. But today the size is mostly irrelevant as long as you aren't serving your photographs as multi-megapixel images or gifs.
cryptonym 2021-08-19 11:37:44 +0000 UTC [ - ]
What if your business relies heavily on images and you can save 30% of your bills for image storage and bandwidth? And on your camera, maybe you can take more pictures without switching SD cards...
imaginariet 2021-08-19 10:14:06 +0000 UTC [ - ]
And way more images are viewed on Instagram than on mobile websites.
nyx-aiur 2021-08-19 10:53:26 +0000 UTC [ - ]
q-rews 2021-08-19 13:00:10 +0000 UTC [ - ]
That clearly isn’t the case in any of the tests that I’ve seen so far. Without proof of these statements, this article is garbage and should be ignored.
pornel 2021-08-19 14:55:31 +0000 UTC [ - ]
Proper comparisons need to control for visual quality, but this is very hard to do properly, because the human range of "looks the same to me" is bigger than the compression-efficiency difference between WebP and JPEG. So you need either synthetic "objective" metrics that are not exactly human judgement (problematic, because e.g. PSNR likes blur more than people do), or tests with lots of images and lots of people.
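For reference, PSNR (the "objective" metric mentioned) is just log-scaled mean squared error, which is part of why it rewards blur: averaging the per-pixel errors hides exactly the kind of localized detail loss humans notice. A minimal sketch on flat pixel sequences:

```python
import math

def psnr(original, compressed, max_val=255):
    """Peak signal-to-noise ratio between two equal-length 8-bit
    pixel sequences, in decibels. Higher means a closer match;
    identical inputs give infinity."""
    mse = sum((a - b) ** 2 for a, b in zip(original, compressed)) / len(original)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(max_val ** 2 / mse)

print(round(psnr([100, 120, 140], [101, 119, 141]), 1))  # 48.1
```

Metrics like SSIM and Butteraugli were designed precisely because this per-pixel averaging correlates poorly with perceived quality.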
Anyway, WebP also has clear limitations compared to JPEG. For example, the 4:2:0 chroma subsampling of VP8 means it automatically loses whenever an image needs sharp color edges, because it just can't represent them, but JPEG can. VP8 also uses "studio range", so it has less than 8 bits of color precision (220 intensity levels instead of 256).
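The "studio range" limitation can be illustrated with the standard full-range-to-limited-range luma mapping (the BT.601-style 16-235 window; this is my own illustration of the precision loss, not WebP's actual conversion code):

```python
def to_studio_range(y: int) -> int:
    """Map a full-range 8-bit luma value (0-255) into the studio
    range (16-235), as limited-range video pipelines store it."""
    return 16 + round(y * 219 / 255)

# 256 distinct inputs collapse into at most 220 distinct outputs,
# so some neighbouring full-range intensity levels become
# indistinguishable after the round trip.
levels = {to_studio_range(y) for y in range(256)}
print(len(levels))  # 220
```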
dusted 2021-08-19 10:32:40 +0000 UTC [ - ]
Well, sure, right now, but not in the future. Point: in 1996 my hard drive had a formatted capacity just shy of 1 GiB. Now my copy of the 1979 movie "Alien" takes up 16 GiB, or 16 times the size of that entire 1996 drive.
So sure, clever (lossy) image compression has its place in history, but it is going to fade; just as with audio, a lossless compression format will be the future.
kettleballroll 2021-08-19 11:59:43 +0000 UTC [ - ]
At least in my bubble, the standard for audio is still lossy: I'm fairly sure 99.9% of all audio I consume uses something lossy: streamed over Spotify, in movies on Netflix, clips on YouTube, video chats... Is that just my impression, and technology has moved to FLAC without me noticing?
yitchelle 2021-08-19 10:37:15 +0000 UTC [ - ]
I think we have reached a point where it is fine to consume media as it is now; personal view.
mchusma 2021-08-19 13:50:25 +0000 UTC [ - ]
But there has been good research on using neural nets for compression and decompression, and I expect the next generation of codecs to show massive improvements over these formats.
dusted 2021-08-19 10:42:42 +0000 UTC [ - ]
dangravell 2021-08-19 10:51:50 +0000 UTC [ - ]
hulitu 2021-08-19 15:27:46 +0000 UTC [ - ]
hulitu 2021-08-19 15:25:00 +0000 UTC [ - ]
imaginariet 2021-08-19 12:35:14 +0000 UTC [ - ]
Not even they can afford lossless capture, because while storage capacities do increase, so do video resolution, bit depth, framerate, and the number of takes.
jl6 2021-08-19 10:42:26 +0000 UTC [ - ]
So I believe we must still strive for a degree of storage efficiency.
blitzar 2021-08-19 12:02:23 +0000 UTC [ - ]
edflsafoiewq 2021-08-19 12:33:32 +0000 UTC [ - ]
To put it differently, you'll never optimize your way ahead of ten million people watching cat videos.
nyx-aiur 2021-08-19 10:49:19 +0000 UTC [ - ]
maskros 2021-08-19 11:14:19 +0000 UTC [ - ]
Not even remotely the reason. Lack of high-quality decoders and absolutely terrible performance are what held it back.
The format is ridiculously complicated (design by committee meant it had to include every single bad idea everyone could ever come up with) and there were no good open source implementations. Even the best commercial implementations have decoding performance that is orders of magnitude slower than JPEG. I don't think it's possible to decode JPEG2000 with good performance even if you tried.
JPEG2000 is plain and simple a terrible format. Adobe shoe-horned JPEG2000 into PDF and tried to make it common, so all PDF viewers have to support it. Nobody with half a brain likes to use JPEG2000 images in PDF though, since all files with embedded JPEG2000 images are ten times slower to open and use ten times as much memory as the equivalent PDF with JPEG images would.
acdha 2021-08-19 12:17:12 +0000 UTC [ - ]
I think the quote is also more correct than you give it credit for because IP concerns kept it out of browsers. I don’t think you can get traction for a new image format without browser support — _maybe_ that could be a robust WASM polyfill now but I really think anyone making a new format needs to focus how to get over that hump.
Performance is complicated: it’s good enough to be used in digital cinema systems but it’s definitely trading CPU for size, just as we see in the modern video codecs which are also orders of magnitude slower than JPEG. The one area where it can be faster is progressive decoding when you don’t want the entire image resolution - I work with systems which serve lots of thumbnails & similar lower-resolution derivatives and this is the main selling point for the format. I’m still not convinced it’s worth the hassle but I do look for it in other formats.
ZeroGravitas 2021-08-19 12:22:45 +0000 UTC [ - ]
This ironically demonstrates the absurd nature of patents in networked contexts, because not having patents made those things more valuable (to the world at large) since most of the value comes from the adopters and the ecosystem, not the specific patented spec.
JP2 would have had some niches that it could have grown from, scanned books for example.
pornel 2021-08-19 14:33:45 +0000 UTC [ - ]
It uses wavelets, which overfit the PSNR metric. In purely numeric benchmarks (without humans judging the images) that makes JPEG2000 seem way better than it really is. Wavelets may look impressive when you compress an image to blurry death, but that's a niche use case. They fall apart in the more relevant high and nearly-lossless quality ranges: JPEG2000 handles sharp edges very poorly and struggles to preserve textures of things like skin or wood — either everything looks like plastic, or you get poor compression.
DCT-based codecs can generate realistic-looking textures with fewer bits. For web use, even the de-blocking filters of WebP and AVIF are not that useful, because people want to use qualities at which blocking isn't a problem in the first place.
veltas 2021-08-19 12:12:49 +0000 UTC [ - ]
Isn't that also a symptom of lack of adoption?