Why JPEGs still rule the web (2024) (spectrum.ieee.org)
197 points by purpleko 1 day ago | 355 comments
carra 7 hours ago [-]
We keep hearing different variants of "webp should take over because now it has good browser support". But that is not nearly enough for an image format to reach widespread adoption.

People want to be able to open the images anywhere (what about viewing photos on a smart TV? or an old tablet? what about digital picture frames?). They want to edit the images in any program (what about this FOSS editor? what about the many people stuck on pre-subscription Photoshop versions?).

They also want to ensure far future access to their precious photos, when formats like JPEG2000 or even WebP might be long gone. I mean, webp was made by Google and we know how many of their heavily promoted creations are dead already...

account42 2 hours ago [-]
It also doesn't help that most people's experience with webp is force-recompressed versions of images that were originally jpeg, with relatively low quality settings.
schmidtleonard 1 hour ago [-]
I'm not sure people consciously make that association, but you know which association they absolutely do make? The one where Google Images started using webp at the same time as they were locking it down. At the time, ecosystem support was practically nonexistent, so it functioned as "soft DRM", and among people who know the term "webp" at all that's by far the #1 association.
jagermo 6 hours ago [-]
I don't like working with webp, especially since Chrome pushes it on everything but Google's other tools like Slides don't support it. Super annoying, especially since Google developed it.
Velocifyer 2 hours ago [-]
JPEG XL is better than webp
jokoon 5 hours ago [-]
This reminds me of Rust: even if Rust becomes widespread in 10 or 20 years, it will not really displace C++.

Although I need an engineering explanation as to why COBOL is still alive after all these years, because no tech can live forever.

fsloth 4 hours ago [-]
Can’t live forever?

Latin is still going strong, as are water pipes (the oldest being several millennia old).

Hard to predict which innovations remain resilient. The longer they stick around the more "Lindy-proof" they are.

jt2190 3 hours ago [-]
The explanation for COBOL is not an engineering one, but an economics one: “It’s cheaper to train a programmer to use COBOL than it is to rewrite the codebase in <language>.” (Perhaps LLMs might change the economics here.)
0points 5 hours ago [-]
> COBOL

Was popular in the 60s in fintech, so banks, ATMs and related systems went digital using it.

Those systems are still running.

graealex 1 hour ago [-]
Although you can do a bit of Ship-of-Theseus philosophy on those COBOL systems. After every software component has been rewritten multiple times and all the hardware has died and subsequently been replaced, all that's left is the decision to stick with COBOL, not the fact that it's a legacy system built in the 60s.
arp242 2 hours ago [-]
Last I checked you couldn't even upload .webp images to GitHub or Telegram. Well, for GitHub you can cheat by renaming it to .png and GitHub's content detection will make it work regardless, but meh.
whyever 6 hours ago [-]
> I mean, webp was made by Google and we know how many of their heavily promoted creations are dead already...

I don't understand this argument. WebP is an algorithm, not a service. You cannot kill it once it's published.

jacobgkau 6 hours ago [-]
JPEG XL is similarly an algorithm that's been published, but Google removed it from their browser and Mozilla followed suit, which effectively killed its usefulness as a web-friendly (and, more generally, usable-anywhere) format.
Vinnl 5 hours ago [-]
Some nuance there — it's not dead yet: https://github.com/mozilla/standards-positions/pull/1064
GCUMstlyHarmls 1 hour ago [-]
> the reference decoder, which weighs in at more than 100,000 lines of multithreaded C++.

Wow! I have never written a compression codec implementation, but that's kind of staggering.

eviks 4 hours ago [-]
Thanks, do you know the status of said safe decoder?
carra 6 hours ago [-]
Fair enough. What I meant by this is that, in the end, most software that decides to add webp support is doing it because of the huge push by Google to do so. But if they suddenly change that push to something else then webp might find itself growing more irrelevant.
nottorp 4 hours ago [-]
I didn't know webp was pushed by Google. They should publicize that fact more so people know to avoid the format entirely.

What Google pushes is in their self interest and has nothing to do with the good of the unwashed masses.

graealex 1 hour ago [-]
WebP is basically a single intra-frame from VP8, the codec behind WebM, which Google literally developed to avoid paying licensing costs for H.264. For which they had great incentive.

WebP is to WebM what HEIC is to HEVC.

You can argue that using free codecs is a collateral benefit here, even though Google did it for selfish reasons. It is not detrimental to the public or the internet.

prmoustache 5 hours ago [-]
I am pretty sure webp is supported everywhere nowadays. I think it is just inertia.
chownie 5 hours ago [-]
It isn't universal: my phone gallery doesn't support webp at all by default, and the Windows gallery only supports non-animated webp from what I can tell.
xeromal 2 hours ago [-]
When I download a photo to send to my family webp always causes some kind of problem so I end up screenshotting it
schmidtleonard 1 hour ago [-]
Always always always, and it's often multiple problems: the filesystem preview generators don't support it, or don't support it over a network, or the social media used by the other person doesn't support it (often egregiously so, where an unrecognized drop bubbles up to browser scope and blasts the page you were on), or there's a weird problem with a site/app that is supposed to support it, such as it turning into a black box.

Support for webp is still so rough that I have to wonder what one's ecosystem must look like for it to be seamless. Maybe if you are a Googler and your phone/computer/browser use entirely Google software, and ditto for your friends and your friends' friends and your spouse's? Maybe?

graealex 41 minutes ago [-]
To my knowledge, not even every Google product supports it, but I have not verified support myself.

I blame Google for pushing it, but I also blame every third-party product for not supporting it, when it is mostly free to do so (I'm sure all of them internally use libraries to decode images instead of rolling their own code).

7bit 4 hours ago [-]
My smartphone camera does not output webp and neither does my professional Nikon.

As long as these two major sources of pictures stay on JPEG, I will too. Simply because of that, for entirely subjective and completely debatable reasons.

prmoustache 4 hours ago [-]
To me what cameras support as an output is irrelevant. On a pro camera you generally use the raw files as a default format. What is important is the formats you can export to / manipulate afterwards for publication/exchange.
Blahagun 5 hours ago [-]
Of course there are tons of better formats than JPEG, but it needs to be understood that the most important feature of JPEG is being exchangeable. It doesn't matter what your shiny new web browser supports; JPEG is considered supported everywhere, and editorial outlets actually refuse to accept anything other than JPEG. You can't just break everyone's workflow because Google decided to force WebP (for purely selfish reasons, of course). The web browser is actually one of the least important platforms for JPEG. To this day JPEG still executes its mission perfectly, and with huge bandwidth increases it doesn't even matter how large the file is.
dingdingdang 2 hours ago [-]
This article* makes the case for mozjpeg cleanly beating webp when we are above 500x500px image sizes. So there's a lot more performance/compression to be gained within the JPEG format than people generally argue.

* https://siipo.la/blog/is-webp-really-better-than-jpeg

edit: https://opensource.googleblog.com/2024/04/introducing-jpegli... is likely the real GOAT when it comes to modern JPEG encoders, in that it effectively breaks the 8-bit color "ceiling" within the format!
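If anyone wants to try them, roughly (assuming the stock tool names: mozjpeg's cjpeg wants PPM/BMP input, while libjxl ships the jpegli encoder as cjpegli, which takes PNG directly):

    # mozjpeg: quality ~80 is a common web setting
    cjpeg -quality 80 -outfile photo.jpg photo.ppm

    # jpegli: same idea, PNG in, JPEG out
    cjpegli photo.png photo.jpg -q 80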

sumtechguy 2 hours ago [-]
The same thing sort of happened in the compression realm. ZIP is pretty much the de facto winner for that type of file movement. Oh, people use others in a lot of cases. But even when PKZIP 2.04 came out there were better ones. Yet here we are, 30 years on, still using zip. Heck, the ZIP algorithm (DEFLATE) is even used in many picture formats.
wongarsu 57 minutes ago [-]
It helps that in most cases when people make a zip the compression is a secondary feature. What people want 90% of the time is a container format to put multiple files or a whole folder structure into one file. .zip or .tar.gz do this just fine, even if their compression factor and speed aren't that great by modern standards.

Same with jpeg: most people just want to encode images; reducing file size by 10% is a negligible win for them.
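You can even get the container role without the codec; zip happily stores with no compression at all:

    zip -0 -r bundle.zip project/    # -0 = store only, no compression
    tar -czf bundle.tar.gz project/  # tar is the container, gzip the compression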

hulitu 4 hours ago [-]
> Of course there are tons of better formats than JPEG

and webp is not one of them.

imageformatssux 23 hours ago [-]
How in the world do people store images / photos nowadays?

Just as there is a clear winner for video - av1 - there seems to be nothing in the way of "this is clearly the future, at least for the next few years" when it comes to encoding images.

JPEG is... old, and it shows. The filesizes are a bit bloated, which isn't really a huge problem with modern storage, but the quality isn't great.

JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome, which pretty much makes it dead in the water (don't you just love monopolies making decisions for you?)

HEIC is good, as long as you pinky promise to never ever leave Apple's ecosystem, ie HEIC sucks.

AVIF seems computationally expensive and the support is pretty spotty - 8bit yuv420 might work, but 10b or yuv444 often doesn't. Windows 10 also chokes pretty hard on it.

Alternatives like WebP might be good for browsers but are nigh-unworkable on desktops, support is very spotty.

PNG is cheap and support is ubiquitous but filesizes become sky-high very quick.

So what's left? I have a whole bunch of .HEIC photos and I'd really like if Windows Explorer didn't freeze for literal minutes when I open a folder with them. Is jpeg still the only good option? Or is encoding everything in jpeg-xl or avif + praying things get better in the future a reasonable bet?

OneDeuxTriSeiGo 18 hours ago [-]
> JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome, which pretty much makes it dead in the water (don't you just love monopolies making decisions for you?)

It's worth noting that Firefox is willing to adopt JPEG-XL[1] as soon as the Rust implementation[2] is mature. And that Rust impl is a direct port of the reference C++ implementation[3]. macOS and Safari already support JPEG-XL[4]. And recently Windows picked up JPEG-XL support[5]. The only blockers at this point are Firefox, Chromium, and Android. If/when Firefox adopts JPEG-XL, we'll probably see Google follow suit, if only out of pressure from downstream Chromium platforms wanting to adopt it to maintain parity.

So really if you want to see JPEG-XL get adopted, go throw some engineering hours at the rust implementation [2] to help get it up to feature parity with the reference impl.

-----

1. https://github.com/mozilla/standards-positions/pull/1064

2. https://github.com/libjxl/jxl-rs

3. https://github.com/libjxl/libjxl

4. https://www.theregister.com/2023/06/07/apple_safari_jpeg_xl/

5. https://www.windowslatest.com/2025/03/05/turn-on-jpeg-xl-jxl...

Liquix 17 hours ago [-]
g**gle is hellbent on killing JPEG-XL support in favor of WebP. assuming they'll capitulate to downstream pressure is a stretch. this article [0] sums it up nicely:

> What this [removal of support for JPEG-XL in Chromium] really translates to is, “We’ve created WebP, a competing standard, and want to kill anything that might genuinely compete with it”. This would also partly explain why they adopted AVIF but not JPEG XL. AVIF wasn’t superior in every way and, as such, didn’t threaten to dethrone WebP.

[0] https://vale.rocks/posts/jpeg-xl-and-googles-war-against-it

OneDeuxTriSeiGo 16 hours ago [-]
I'm not assuming they'll capitulate under just pressure. Rather I'm assuming they'll capitulate if a majority of, or even all of, the big third-party Chromium browsers push for adding it to mainline Chromium.

This is less blind pressure and more the risk that Google becomes seen as an untrustworthy custodian of Chromium, and that downstreams start supporting an alternate upstream outside of Google's control.

JXL is certainly a hill that Google seems intent to stand on, but I doubt it's one they'd choose to die on. Doubly so given the ammo it'd give in the ongoing Chrome antitrust lawsuits.

frollogaston 16 hours ago [-]
How is Google so intent on webp winning? They don't even support it in their own products besides Chrome.
Spooky23 14 hours ago [-]
Chrome is like a different company. They do weird shit.
ethbr1 12 hours ago [-]
You either die a hero or live long enough to become the MS Office team.
melagonster 7 hours ago [-]
But the MS Office team is easier to understand: they want money.
tonyhart7 2 hours ago [-]
They don't in the Google Chrome team???
arp242 2 hours ago [-]
The risks and downsides of exposing an image decoder to the entire web are very real, especially a relatively new/untested one written in a language like C++. There have been vulnerabilities in pretty much every other image decoder and I fully expect jpeg-xl to be no different. You can't just brush that aside. Hell, the article doesn't even acknowledge it. Google has no real stake in webp vs. jpeg-xl either. You may disagree with the decision, but this kind of stuff doesn't make much sense.
throw0101c 14 hours ago [-]
> HEIC is good, as long as you pinky promise to never ever leave Apple's ecosystem, ie HEIC sucks.

HEIC was developed by the MPEG folks and is an ISO standard, ISO/IEC 23008-12:2022:

* https://www.iso.org/standard/83650.html

* https://en.wikipedia.org/wiki/High_Efficiency_Image_File_For...

An HEIC image is generally a still frame from ITU-T H.265† (HEVC):

* https://www.geeky-gadgets.com/heif-avc-h-264-h-265-heic-and-...

OS support includes Windows 10 v1803, Android 10+, Ubuntu 20.04, Debian 10, Fedora 36. Lots of cameras and smartphones support it as well.

There's nothing Apple-specific about it. Apple went through the process of licensing H.265, so they got HEIC 'for free' and use it as the default image format because, unlike JPEG, it supports HDR, >8-bit colour, etc.

†Similarly, WebP is essentially an image/frame from a VP8 video.

socalgal2 12 hours ago [-]
> HEIC was developed by the MPEG folks

And the MPEG folks were so cool with video, all that licensing BS. Sounds great. No thanks!

culturestate 7 hours ago [-]
Confusingly, there are two different MPEGs in this context.

MPEG the standards group is organized by ISO and IEC, along with JPEG.

The one you’re thinking of - MPEG LA, the licensing company - is a patent pool (which has since been subsumed by a different one[1]) that’s unaffiliated with MPEG the standards group.

1. https://en.wikipedia.org/wiki/Via-LA

nottorp 4 hours ago [-]
So what good is it to have a separate entity doing the standard when the standard is unaffordable outside the top 500?
ralfd 8 hours ago [-]
Aren't all MPEG patents expired?
culturestate 7 hours ago [-]
No, they’re in a patent pool. There’s what looks like a relatively up-to-date list at https://en.wikipedia.org/wiki/Via-LA#H.265/HEVC_licensors
TeMPOraL 6 hours ago [-]
5203 active patents for HEVC, 1480 for H.264. That's just plain insane! I get that video formats are complex, but complex enough to consist of 5000+ distinct, nontrivial inventions?
j16sdiz 3 hours ago [-]
Many of those are just marginally related, and might not apply to the actual standard.
throw0101c 4 hours ago [-]
> And the MPEG folks were so cool with video, all that licensing BS. Sounds great. No thanks!

Not wrong, but this is a different topic/objection than the GP's 'being locked into Apple's ecosystem'.

And as the Wikipedia article for HEIC shows, there's plenty of support for the format, even in open source OSes.

* https://en.wikipedia.org/wiki/High_Efficiency_Image_File_For...

leguminous 3 hours ago [-]
As far as I know, that's only support for the container format. You can't actually decode HEIC without also installing libde265, which you are supposed to have a license for. I'm not even sure how you'd go about getting an individual license.
throw0101d 2 hours ago [-]
> You can't actually decode HEIC without also installing libde265, which you are supposed to have a license for. I'm not even sure how you'd go about getting an individual license.

Debian doesn't have a problem with it:

* https://packages.debian.org/search?keywords=libde265

* https://packages.debian.org/search?keywords=libheif
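In practice (assuming the usual package names), decoding is then one tool away:

    sudo apt install libheif-examples   # pulls in libheif and libde265
    heif-convert IMG_0001.heic IMG_0001.jpg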

jiggawatts 6 hours ago [-]
> OS support includes Windows 10 v1803

Ba-ha-ha... ha-ha... no.

Support is virtually non-existent. Every year or so, I try to use my Windows PC to convert a RAW photo taken with a high-end Nikon mirrorless camera to a proper HDR photo (in any format) and send it to my friends and family that use iDevices.

This has been literally impossible for the last decade, and will remain impossible until the heat death of the universe.

Read-only support is totally broken in a wide range of apps, including Microsoft-only apps. There are many Windows imaging APIs, and I would be very surprised if more than one gained HEIC support. Which is probably broken.

Microsoft will never support an Apple format, and vice versa.

Every single new photo or video format in the last 25 years has been pushed by one megacorp, and adoption outside of their own ecosystem is close to zero.

JPEG-XL is the only non-megacorp format that is any good and got multi-vendor traction, which then turned into "sliding backwards on oiled ice". (Google removed support from Chromium, which is the end of that sad story.)

throw0101c 4 hours ago [-]
>> OS support includes Windows 10 v1803

> Ba-ha-ha... ha-ha... no. […]

Feel free to hit "Edit" on the Wikipedia page and correct it then:

* https://en.wikipedia.org/wiki/High_Efficiency_Image_File_For...

> Microsoft will never support an Apple format, and vice versa.

Once again, it's not an Apple format: it was developed by MPEG and is published by ISO/IEC, just like H.264 and H.265.

Or do you think H.264 and H.265 are an "Apple format" as well?

jiggawatts 3 hours ago [-]
> It's not an Apple format

Create an HDR HEIC file on anything other than an Apple device.

Upload it to an Apple Device.

Now use it in any way: forward it, attach it to a message, etc...

This won't work.

It won't ever work because the "standard" is not what Apple implements. They implement a few very specific subsets that their specific apps produce, and nothing else.

Nobody else implements these specific Apple versions of HEIC. Nobody.

For example, Adobe Lightroom can only produce a HEIC file on an Apple device.

My Nikon camera can produce an HDR HEIC file in-body, but it is useless on an Apple device because it's too dark, and if forwarded in an iMessage... too bright!

It's a shit-show, comparable to "IPv6 support" which isn't.

graealex 37 minutes ago [-]
That's not an argument. HEIC is to HEVC what WebP is to WebM. The lack of support in other products is due to developers not picking up the pace and sticking with "GIF, JPEG and PNG is good enough".
glitchc 20 hours ago [-]
> Just as there is a clear winner for video - av1 - there seems to be nothing in the way of "this is clearly the future, at least for the next few years" when it comes to encoding images.

Say what? A random scan across the internet will reveal more videos in MP4 and H.264 format than av1. Perhaps streaming services have switched, but that is not what regular consumers usually use to make and store movies.

CharlesW 20 hours ago [-]
New compressed media formats always travel a decade-long path from either (1) obscurity → contender → universal support or (2) obscurity → contender → forgotten curiosity. AV1 is on one path, WebP is on another.
Andrex 19 hours ago [-]
As someone who doesn't follow this stuff, which is on which path?
CharlesW 19 hours ago [-]
WebP remains fairly esoteric after 15 years, has always been a solution in search of a problem, and isn’t even universally supported in products by its owner.

AV1 was created and is backed by many companies via a non-profit industry consortium, solves real problems, and its momentum continues to grow. https://bitmovin.com/blog/av1-playback-support/

graealex 2 hours ago [-]
Funnily enough, JPEG 2000 support was eventually removed everywhere. I assume the only reason this didn't happen with WebP as well is Google pushing it and keeping it in Chrome.
wongarsu 54 minutes ago [-]
Also, Google's Lighthouse benchmark pushes WebP recommendations, and people listen to it because of SEO concerns.
graealex 35 minutes ago [-]
True. I mentioned JPEG2000 because it had a similar fate, in particular no real reason to use it in the first place.
modeless 19 hours ago [-]
AV1 is on the path to universal support and WebP is on the path to obscurity.
tedunangst 18 hours ago [-]
Apple CPUs have AV1 support in hardware.
zimpenfish 9 hours ago [-]
Only support for decoding and from A17 Pro and M3 onwards, I believe? Going to be a few years before that's commonly available (he says from the work M1 Pro.)

[edit: clarify that it's decoding only]

consp 16 hours ago [-]
So does every modern GPU. This is nothing special.
airstrike 12 hours ago [-]
I think you're arguing the same point—that there's plenty of support and it's arguably growing.
Izkata 20 hours ago [-]
Yeah, I think I only just found out about AV1 a few weeks ago with a video file that wouldn't play. Thought it was corrupted at first; it's been so long since I saw something like that.
ksec 15 hours ago [-]
And H.264 is about to be patent free this year in many places.
userbinator 11 hours ago [-]
I suspect there are even more H.265 videos than AV1.
jandrese 19 hours ago [-]
From what I've seen WebP is probably the strongest contender for a JPEG replacement. It's pretty common in the indie game scene, for example, to re-encode a game's JPEGs to WebP for better image quality and often a significant (25% or more) savings on installer size. Support is coming, albeit somewhat slowly. It was pretty bad in Ubuntu 22, but several apps have added support in Ubuntu 24. Windows 11 supports WebP in Photos and Paint, for another example.
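The batch re-encode itself is a few lines of shell with libwebp's cwebp (quality 80 picked arbitrarily here):

    # re-encode every JPEG under assets/ to WebP alongside the original
    find assets -name '*.jpg' -print0 |
        while IFS= read -r -d '' f; do
            cwebp -quiet -q 80 "$f" -o "${f%.jpg}.webp"
        done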
0x20cowboy 19 hours ago [-]
I hate webp. Not for any legitimate technical reason, but I often just want to download an image from the web for an image board, or drop it in a diagram or a ppt, or use it for a joke, and nothing works with that format. Nothing. Google image search is useless because of it.

Cmd+shift+4 is now the only way to grab an image out of a browser. Which is annoying.

It has made my life needlessly more complicated. I wish it would go away.

Maybe if browsers auto-converted when you dragged an image out of the browser window I wouldn’t care, but when I see webp… I hate.

cosmic_cheese 18 hours ago [-]
Webp images are right up there with the fake transparent PNGs you come across in Google Images.
frollogaston 16 hours ago [-]
Even if webp got better support later, I want it deprecated just as revenge for previously wasting my time.
socalgal2 12 hours ago [-]
That's true of any new format. Until everything supports it, it's not so great. iPhone saves .HEIC which I have to convert to something else to be useful. It's not everywhere (not sure it ever will be).

Windows didn't use to show .jpgs in Windows Explorer. I know because I wrote a tool to generate thumbnail HTML pages to include on archive CDs of photos.

To solve this problem, some format has to "win" and get adopted everywhere. That format could be webp, but it will take 3-10 years before everything supports it. It's not just the OS showing it in its file viewer. It's its preview app supporting it. It's every web site that lets you upload an image (gmail/gmaps/gchat/facebook/discord/messenger/slack/your bank/apartment-rental-companies, etc. etc. etc.). It just takes forever to get everyone to upgrade.

bapak 11 hours ago [-]
When does a format stop being new? WebP was introduced fifteen years ago.
graealex 32 minutes ago [-]
When it's widely adopted.

WebP gets pushed into your series of tubes without your consent, and the browser that you're most likely to use to view them just happens to be made by the same company that invented the codec. It's DivX and Real Media all over again.

nyanpasu64 18 hours ago [-]
My working model is that WebP images are generally a lossy copy of a PNG or a generation-loss transcoding of a JPG image. I know that lossless WebP technically exists but nobody uses it when they're trying to save bandwidth at the cost of the user.
jandrese 17 hours ago [-]
Worst case you can open it up in Paint and save as JPEG.

Also, I just checked and Powerpoint has no problem dropping in a webp image. Gimp opens it just fine. You are right that web forums are often well behind the times on this.

harry8 13 hours ago [-]
Total agreement from me, I use this:

bin/webp2png:

    #!/bin/bash
    # decode a .webp to a .png with the same basename (dwebp ships with libwebp)
    dwebp "$1" -o "${1%.webp}.png"
nntwozz 18 hours ago [-]
I use ThumbsUp a free utility from https://www.devontechnologies.com/apps/freeware to convert webp/heic or whatever inconvenient format.

Just drop the offending image onto the icon in the dock.

nottorp 4 hours ago [-]
Pretty sure I've managed to configure my Firefoxes to act as if webp does not exist...
graealex 30 minutes ago [-]
It's a constant battle though to keep those browser extensions updated, especially since Google decided that extensions cut into their profits and they essentially made them useless.
voidUpdate 7 hours ago [-]
If you want to do that so badly and hate webp so much, why not screenshot it? Then you don't have to care what format it's in on the browser
encom 17 hours ago [-]
Often (in my experience) WebP is served as a bait-and-switch, even if the link ends with .jpg. So I use curl to fetch the file, and since curl doesn't send "Accept: image/webp" unless you tell it to, the server just gives you what you ask for.

I once edited Firefox config to make it pretend to not support WebP, and the only site that broke was YouTube.
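Something like this (example URL, obviously); with no Accept: image/webp in the request, content-negotiating servers fall back to the real JPEG:

    curl -H 'Accept: image/jpeg' -o photo.jpg 'https://example.com/photo.jpg'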

MiddleEndian 15 hours ago [-]
lol I installed the firefox extension "Don't Accept image/webp" but I assume a lot of sites just ignore it
adzm 16 hours ago [-]
Photoshop has native WebP support now too!
wang_li 18 hours ago [-]
On the classic Macs there was a program called DropDisk. You could drag a disk image to it and it auto-mounted it. That suggests a tool for you: make a desktop app that you can drag and drop images onto that converts them to JPEG and saves them in a folder.
msephton 16 hours ago [-]
You can create this using Automator in a minute.
jamiek88 14 hours ago [-]
thumbsup app does exactly this.
Salgat 18 hours ago [-]
Exactly, part of being a "superior format" is adoption. Until then, it's just another in a sea of potential.
graealex 1 hour ago [-]
> It's pretty common in the indie game scene

That's such a weak argument. If I was an indie game developer, I would use whatever obscure format would offer me the most benefit, since I control the pipeline from the beginning (raw TIFF/TGA/PNG/... files) to the end (the game that needs to have a decoder and will uncompress it into GPU memory). 20 minutes extra build-time on the dev machine is irrelevant when I can save hundreds of MBs.

However, that is not the benchmark for a format widely used on the internet. Encoding times multiply, as does the need to search for specialized software, and literally everyone else needs to support the format to be able to view those files.

Sammi 19 hours ago [-]
Also webp support in browsers is looking pretty good these days: https://caniuse.com/webp

The last major browser to add support was Safari 16 and that was released on September 12, 2022. I see pretty much no one on browsers older than Safari 16.4 in metrics on websites I run.

userbinator 11 hours ago [-]
People want to use images outside of browsers too.
prmoustache 5 hours ago [-]
What apps are you using in 2025 that handle images but don't support webp?

I can't think of any on my Fedora desktop for instance.

wongarsu 52 minutes ago [-]
Lots of websites that expect me to upload images only accept jpeg and png.

Another one I recently interacted with is video backgrounds for Zoom. Those apparently can only be jpeg, not even png.

jcynix 2 hours ago [-]
Luminar Neo, for example, doesn't handle webp. And there's more than just Fedora, IIRC.
out_of_protocol 5 hours ago [-]
.. or you can go directly to avif - https://caniuse.com/avif (93%) instead of webp (95% support).
turnsout 16 hours ago [-]
Yeah, after seeing the logs I made the switch to webp earlier this year. As much as I hate to admit it (not a fan of Google), it’s a pretty big bandwidth savings for the same (or better) quality.
hombre_fatal 16 hours ago [-]
I switched to webp on my forum for avatars and other user image uploads.

With one format you get decent filesize, transparency, and animation which makes things much simpler than doing things like conditionally producing gifs vs jpegs.

hengheng 20 hours ago [-]
Normal people use jpeg. It's good enough, much like mpeg-2 was good enough for DVDs. Compatibility always beats storage efficiency.

Photography nerds will religiously store raw images that they then never touch. They're like compliance records.

munchler 20 hours ago [-]
I think most photography nerds who want to save edited images to a lossless format will use TIFF, which is very different from the proprietary "raw" files that come straight out of the camera.
IAmBroom 19 hours ago [-]
You'd be wrong in my experience.

No photog nerd wants EVEN MORE POSTPROCESSING.

munchler 17 hours ago [-]
I don't understand. You've got to save the edited result in a file somehow. What format do you use?
msephton 15 hours ago [-]
The file as it comes out of the camera, so-called raw, is a family of formats. Usually such files are kept untouched and any edits are saved in a lightweight settings file (in the format of your editing app) alongside the original.
ksec 15 hours ago [-]
And a lot of RAW formats are adopting or considering adopting JPEG Lossless as a codec.
nottorp 4 hours ago [-]
Is that like how JavaScript was named so as to imply a connection with Java, in spite of there being none?

JPEG is the ur-example of lossy compression. JPEG Lossless can't have any connection with that.

inferiorhuman 11 hours ago [-]
Most raw files are TIFF with proprietary tags.
Gigachad 17 hours ago [-]
Normal people just use whatever the default on their phone is. Which for iPhone is HEIC, not sure about Android, AVIF?
kllrnohj 14 hours ago [-]
> not sure about Android, AVIF?

JPEG, or fancier jpeg: https://developer.android.com/media/platform/hdr-image-forma...

martin_a 23 hours ago [-]
> How in the world do people store images / photos nowadays?

Well, as JPEGs? Why not? Quality is just fine if you don't touch the quality slider in Photoshop or other software.

For "more" there's still lossless camera RAW formats and large image formats like PSD and whatnot.

JPEG is just fine.

afiori 20 hours ago [-]
I wonder how much of JPEG's good quality is that we are quite accustomed to its artefacts.
mrob 4 hours ago [-]
JPEG artifacts are less disturbing because they're so obviously artificial. WEBP and similar artifacts look more natural, which makes them harder to ignore.
BugsJustFindMe 20 hours ago [-]
At high quality, the artifacts are not visible unless you take a magnifying glass to the pixels, which is a practice anathema to enjoying the photo.
afiori 18 hours ago [-]
I am referring to highly compressed images or low-resolution ones; at high bitrates mostly all formats look the same.

What I mean is that JPEG's squarish artifacts look OK while AV1's angular artifacts look distorted.

Arainach 19 hours ago [-]
I've never seen JPEG artifacts on images modified/saved 5 or fewer times. Viewing on a monitor including at 100%, printing photos, whatever - in practice the artifacts don't matter.
somat 4 hours ago [-]
JPEG artifacts mainly show up on drawings, where they seriously degrade masking operations (a hobby of mine), so I always appreciate it when a drawing is a PNG rather than a bunch of JPEG goop.
a-french-anon 6 hours ago [-]
For non-photographic images, I'm horribly sensitive to the ringing artifacts. Thankfully, there's waifu2x (in denoise mode only) to remove those when textures don't confuse it too much, and I use MozJPEG to encode, which really improves the result.
whaleofatw2022 20 hours ago [-]
There's something to be said about this. A high-quality JPEG after cleanup can sometimes be larger than an ARW (Sony RAW) on export, and it makes no sense to me.
zuminator 9 hours ago [-]
People often forget that PNG images can be compressed in a lossy manner to keep the filesize down, not quite as well as jpegs but still quite substantially.

https://pngmini.com/lossypng.html

https://pngquant.org/

https://css-ig.net/pinga
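For example with pngquant (the quality flag takes a min-max range; 65-80 is just a reasonable starting point):

    # lossy palette quantization, often a substantial size reduction
    pngquant --quality=65-80 --output photo-small.png photo.png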

asielen 12 hours ago [-]
TIFF if you want to archive them and they started as raw or TIFF; jpeg for everything else. If the file is already jpeg, there is no point in converting it to a new, better-quality format: the quality won't get better than it already is.

It may be obsolete, but it is ubiquitous. I care less about cutting edge tech than I do about the probability of being able to open it in 20+ years. Storage is cheap.

Presentation is a different matter and often should be a different format than whatever your store the original files as.

acomjean 11 hours ago [-]
And jpg isn’t that bad when encoded at high quality, and not saved repeatedly.

I took a TIFF and saved it as a high-quality JPG. Loaded both into Photoshop and "diffed" them (basically subtracted both layers). After some level adjustment you could see some difference, but it was quite small.
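For anyone who wants to script the same experiment, ImageMagick (v7 syntax) can do it too, roughly:

    magick photo.tif -quality 95 photo.jpg                    # high-quality encode
    magick compare -metric RMSE photo.tif photo.jpg diff.png  # prints the error, writes a visual diff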

thisislife2 18 hours ago [-]
> How in the world do people store images / photos nowadays?

I had some high resolution graphic works in TIFF (BMP + LZW). To save space, I archived them using JPEG-2000 (lossless mode), using the J2k Photoshop plug-in ( https://www.fnord.com/ ). Saved tons of GBs. It has wide multi-platform support and is a recognized archival format, so its longevity is guaranteed for some time on our digital platforms. Recently explored using HEIF or even JPEG-XL for these but these formats still don't handle CMYK colour modes well.

djeastm 16 hours ago [-]
>are nigh-unworkable on desktops, support is very spotty

I use .webp often and I don't understand this. At least on Windows 10 I can go to a .webp and see a preview and double-click and it opens in my image editor. Is it not like this elsewhere?

hadlock 16 hours ago [-]
Try uploading one to any web service. Like imgur.
pjmlp 8 hours ago [-]
JPEG is old, and it works.

Images are sorted in folders, per year and some group description based on why they were taken, vacations, event, whatever.

Enable indexing on the folders, and usually there are no freezes to complain about.

twotwotwo 15 hours ago [-]
I recognize it as beating a dead horse now, but JPEG XL did what was needed to be actually adopted. AVIF has not been widely adopted given the difficulty of a leap to a new format in general and the computational cost of encoding AVIF specifically.

One of JPEG XL's best ideas was incorporating Brunsli, lossless recompression for existing JPEGs (like Dropbox's Lepton which I think might've been talked about earlier). It's not as much of a space win as a whole new format, but it's computationally cheap and much easier to just roll out today. There was even an idea of supporting it as a Content-Encoding, so a right-click and save would get you an OG .jpg avoiding the whole "what the heck is a WebP?" problem. (You might still be able to do something like this in a ServiceWorker, but capped at wasm speeds of course.) Combine it with improved JPEG encoders like mozjpeg and you're not in a terrible place. There's also work that could potentially be done with deblocking/debanding/deringing in decoders to stretch the old approach even further.

And JXL's other modes also had their advantages. VarDCT was still faster than libaom AVIF, and was reasonable in its own way (AVIFs look smoother, JXL tended more to preserve traces of low-contrast detail). There was a progressive mode, which made less sense in AVIF because it was a format for video keyframes first. The lossless mode was the evolution of FUIF and put up good numbers.

At this point I have no particular predictions. JPEG never stopped being usable despite a series of more technically sophisticated successors. (MP3 too, though its successors seemed to get better adoption.) Perhaps it means things continue not to change for a while, or at least that I needn't rush to move to $other_format or get left behind. Doesn't mean I don't complain about the situation in comments on the Internet, though.

todotask2 12 hours ago [-]
At least there's JPEG-XL support in recent Windows 11 updates.

I've found that sometimes WebP with lossless compression (-lossless) results in smaller file sizes for graphics than JPEG-XL and sometimes it's the other way around.

theandrewbailey 15 hours ago [-]
> JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome, which pretty much makes it dead in the water (don't you just love monopolies making decisions for you?)

I've done several tests where I lowered the quality settings (and thus, the resulting file size) of JPEG-XL and AVIF encoders over a variety of images. In almost every image, JPEG-XL subjective quality fell faster than AVIF, which seemed mostly OK for web use at similar file sizes. Due to that last fact, I concede that Chrome's choice to drop JPEG-XL support is correct. If things change (JPEG-XL becomes more efficient at low file sizes, gains Chrome support), I have lossless PNG originals to re-encode from.

SAI_Peregrinus 22 hours ago [-]
I store Raw + PSD with edits/history + whatever edited output format(s) I used.
PaulHoule 20 hours ago [-]
I've done a few shootouts at various times in the last 10 years. I finally decided WebP was good for the web maybe two years ago; that is, I have 'set it and forget it' settings and get a good quality/size result consistently. (JPEG has the problem that you really need to turn the knob yourself, since a quality level good for one image may not be good for another one.)

I don't like AVIF, at least not for photos I want to share. I think AVIF is great for "a huge splash image for a web page that nobody is going to look at closely" but if you want something that looks like a pro photo I don't think it's better than WebP. People point out this example as "AVIF is great"

https://jakearchibald.com/2020/avif-has-landed/demos/compare...

but I think it badly mangles the reflection on the left wing of the car and... it's those reflections that make sports cars look sexy. (I'll grant that the 'acceptable' JPEG has obvious artifacts whereas the 'acceptable' AVIF replaced a sexy reflection with a plausible but slightly dull one.)

nvch 15 hours ago [-]
RAW? Storage is becoming cheaper; why discard the originals?

When looking for a format to display HQ photos on my website I settled on a combination of AVIF + JPG. Most photos are AVIF, but if AVIF is too magical compared to JPG (like 3x-10x smaller) I use a larger JPG instead. "Magic" means that fine details are discarded.

WebP discards gradients (like sunset, night sky or water) even at the highest quality, so I consider it useless for photography.

mrheosuper 12 hours ago [-]
Not all storage is created equal: a 1TB HDD is dirt cheap, while 1TB of cloud storage is expensive af.
zeroq 19 hours ago [-]
For video it's not as easy, as it takes way more compute and requires hardware support.

You can take any random device and it will be able to decode h264 at 4k. h265, not so much.

As for av1: my Ryzen 5 5500GT released in 2024 does not support it.

7speter 19 hours ago [-]
I think the only CPUs with AV1 support right now, whether encode, decode or both, are the tile-era Meteor/Lunar/Arrow Lake CPUs from Intel.
Jap2-0 17 hours ago [-]
Actually, it goes back to Tiger lake (2020; 2021 for desktop). [0]

Addendum: AMD since RDNA2 (2020-2021-ish) [1], NVIDIA since 30 series (2020) [2], Apple since M3? (2023).

Note: GP's processor released in 2024 but is based on an architecture from 2020.

[0] https://en.wikipedia.org/wiki/Intel_Quick_Sync_Video#Hardwar...

[1] https://en.wikipedia.org/wiki/Video_Core_Next#Feature_set

[2] https://developer.nvidia.com/video-encode-and-decode-gpu-sup...

sgerenser 16 hours ago [-]
Apple M3 and newer CPUs and A17Pro and newer mobile CPUs also have hardware AV1 decode.
esperent 6 hours ago [-]
> How in the world do people store images / photos nowadays?

PNG where quality matters, JPG where size matters.

alistairSH 14 hours ago [-]
HEIC for photos taken by my iPhone. Apple stuff seems to do a mostly ok job auto-converting to JPG when needed (I assume, since I haven’t manually converted one in ages).

And JPG for photos taken on a “real” camera (including scanned negatives). Sometimes RAW, but they’re pretty large so not often.

rr808 13 hours ago [-]
I found that if you plug an iPhone into a Windows PC and copy the photos off, it will convert them to jpg. However it makes copying very slow, and the quality is worse, so I'd advise turning off the setting on the phone (I think it's called compatibility mode or similar).
MallocVoidstar 22 hours ago [-]
AV1 is not the clear winner for video. Currently-existing encoders are worse than x265 for high-bitrate encodes.
CharlesW 20 hours ago [-]
AV1's advantage narrows to ~5% over H.265 at very high data rates, in the same way that MP3 at 320 kbps is competitive with AAC at 320 kbps. But AV1 is never worse than H.265 from a VMAF/PSNR perspective at any bitrate, and of course H.265 is heavily patent encumbered in comparison. https://chipsandcheese.com/p/codecs-for-the-4k-era-hevc-av1-...
ksec 15 hours ago [-]
>AV1's advantage narrows to ~5% over H.265 at very high data rates.... But AV1 is never worse than H.265 from a VMAF/PSNR perspective at any bitrate,

There are whole discussions that modern codecs, especially AV1, simply don't care about psy (psychovisual) image quality. Hence most torrents are still using x265, because AV1 simply doesn't match the quality offered by other encoders / x265. Nor does the AOM camp care about it, since their primary usage is YouTube.

>in the same way that MP3 at 320 kbps is competitive with AAC at 320 kbps.

It is not. And never will be. MP3 has an inherent disadvantage and needs a substantially higher bitrate for quite a lot of samples, even at 320kbps. We went through this war for 10 years at Hydrogenaudio, with data to back this up. I don't know why in the past 2-3 years the topic has popped up once again.

MP3 is not better than AAC-LC in any shape or form, even at 25% higher bitrate. Just use AAC-LC, or specifically Apple's QuickTime AAC-LC encoder.

CharlesW 14 hours ago [-]
> There is a whole discussions that modern codec, or especially AV1 simply doesn't care about PSY image quality.

In early AV1 encoders, psychovisual tuning was minimal and so AV1 encodes often looked soft or "plastic-y". Today's AV1 encoders are really good at this when told to prioritize psy quality (SVT-AV1 with `--tune 3`, libaom with `--tune=psy`). I'd guess that there's still lots of headroom for improvements to AV1 encoding.

> And hence how most torrents are still using x265 because…

Today most torrents still use H.264, I assume because of its ubiquitous support and modest decode requirements. Over time, I'd expect H.265 (and then AV1) to become the dominant compressed format for video sharing. It seems like that community is pretty slow to adopt advancements — most lossy-compressed music <finger quotes>sharing</finger quotes> is still MP3, even though AAC is a far better (as you note!) and ubiquitous choice.

My point about MP3 vs. AAC was simply: As you reduce the amount of compression, the perceived quality advantages of better compressed media formats is reduced. My personal music library is AAC (not MP3), encoded from CD rips using afconvert.

shiroiuma 14 hours ago [-]
>Today most torrents still use H.264

That's not what I'm seeing for anything recent. x265 seems to be the dominant codec now. There's still a lot of support for h.264, but it's fading.

MallocVoidstar 19 hours ago [-]
I don't care about VMAF or PSNR, I care about looking with my eyes. With x265 on veryslow and AV1 on preset 0/1, and the source being a UHD BD I was downscaling to 1080p, AV1 looked worse even while using a higher bitrate than x265. Current AV1 encoders have issues with small details and have issues with dark scenes. People are trying to fix them (see svt-av1-psy, being merged into SVT-AV1 itself) but the problems aren't fixed yet.
ksec 14 hours ago [-]
>see svt-av1-psy, being merged into SVT-AV1 itself

Part of it being merged for now.

It is unfortunate this narrative hasn't caught on: actual quality over VMAF and PSNR. And we haven't had further quality improvements since x265.

I do get frustrated every time the topic of codecs comes up on HN. But then the other day I came to realise that I spent ~20 years on Doom9 and Hydrogenaudio; I guess I accumulated more knowledge than most.

eviks 9 hours ago [-]
Well, did your "eyes" care more about fidelity or appeal?

https://cloudinary.com/blog/what_to_focus_on_in_image_compre...

spookie 15 hours ago [-]
Yup, have had the same experience.
zmj 18 hours ago [-]
I recently reencoded my photography archive to webp. It's a static site hosted from S3. I was pretty happy with the size reduction.
esafak 10 hours ago [-]
Assuming you are asking about archiving: Use the original format it came in. If you're going to transcode it should be to something lossless like J2K or PNG.
tristor 23 hours ago [-]
For my extensive collection of photography, I export to JPEG-XL and then convert to JPEG for use online. Most online services, like Flickr, Instagram, et al., don't support JPEG-XL, but there's almost no quality loss converting from JPEG-XL to JPEG vs exporting to JPEG directly from your digital asset management system, and storing locally in JPEG-XL works very well. Almost all desktop tools I use support JPEG-XL natively already; conversely, almost nothing supports WebP.
Zardoz84 22 hours ago [-]
There is NO quality loss when converting from JPEG XL to JPEG and vice versa. It was done by design. Not an accident.
eviks 9 hours ago [-]
You're confusing jpg>jxl>jpg, which can be done losslessly via a special mode, and jxl > jpg, which can't (even ignoring all the extra features of jxl that jpg doesn't support)
adgjlsfhk1 20 hours ago [-]
this isn't true. there's no loss from jpeg to jpeg-xl (if you use the right mode), but the reverse is not true
Zardoz84 20 hours ago [-]
I'm sorry to say that you are wrong about this.

> Key features of the JPEG XL codec are:
> lossless JPEG transcoding,

> Moreover, JPEG XL includes several features that help transition from the legacy JPEG coding format. Existing JPEG files can be losslessly transcoded to JPEG XL files, significantly reducing their size (Fig. 1). These can be reconstructed to the exact same JPEG file, ensuring backward compatibility with legacy applications. Both transcoding and reconstruction are computationally efficient. Migrating to JPEG XL reduces storage costs because servers can store a single JPEG XL file to serve both JPEG and JPEG XL clients. This provides a smooth transition path from legacy JPEG platforms to the modern JPEG XL.

https://ds.jpeg.org/whitepapers/jpeg-xl-whitepaper.pdf

If you need more proof, you could transcode a JPEG to JPEG XL and convert it back to JPEG. The resulting image would be BINARY IDENTICAL to the original image.

However, perhaps you are talking about an image in JPEG XL, using features only in JPEG XL (24-bit, HDR, etc...) that obviously couldn't be converted in a lossless way to a JPEG.
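With the libjxl reference tools you can check the round trip yourself:

    cjxl photo.jpg photo.jxl        # lossless JPEG transcode (the default for JPEG input)
    djxl photo.jxl roundtrip.jpg    # reconstructs the original JPEG bitstream
    cmp photo.jpg roundtrip.jpg     # no output / exit 0 = byte-for-byte identical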

CorrectHorseBat 19 hours ago [-]
Yes, JPG to JPEG XL and back is lossless, but the reverse is nowhere mentioned.

Playing around with some jpg and jxl files, I cannot convert jxl losslessly to jpg files even if they are only 8-bit. The jxl files transcoded from jpg files show "JPEG bitstream reconstruction data available" with jxlinfo, so I think some extra metadata is stored when going from jpg to jxl to make the lossless transcoding possible. I can imagine not supporting the reverse (which is pretty useless anyway) allowed for more optimizations.

spider-mario 19 hours ago [-]
> However, perhaps are you talking about an image on JPEG XL, using features only in JPEG XL (24 bit, HDR, etc...) that obviously couldn't be converted in a lossless way to a JPEG.

A lot of those features (non-8×8 DCTs, Gaborish and EPF filters, XYB) are enabled by default when you compress a non-JPEG image to a lossy JXL. At the moment, you really do need to compress to JPEG first and then transcode that to JXL if you want the JXL→JPEG direction to be lossless.

gabrielhidasy 20 hours ago [-]
> However, perhaps are you talking about an image on JPEG XL, using features only in JPEG XL (24 bit, HDR, etc...) that obviously couldn't be converted in a lossless way to a JPEG.

So he was not wrong about this. You have perfect JPEG -> JPEG XL conversion, but not the other way around.

adgjlsfhk1 18 hours ago [-]
default jpeg-xl uses a different color space (XYB), bigger transforms (up to 256x256), rectangular transforms, etc. If you go from jpg to jxl, you can go back (but your jxl file will be less efficient), but if you compress directly to jxl, you can't losslessly go to jpg
tristor 21 hours ago [-]
That's good to know. I'm not an image format expert, but I couldn't see any loss that was visually discernible at any rate.
bilekas 19 hours ago [-]
You’re right for a lot of scenarios, which is exactly what a standard is there to do: encapsulate the broad strokes.

> Alternatives like WebP might be good for browsers but are nigh-unworkable on desktops, support is very spotty.

Right again, and WebP is the enrichment that goes with the backend when dealing with the web. I wouldn’t knock it for not being locally compatible; it was designed for the web first and foremost. I think it’s in the name.

troupo 4 hours ago [-]
> HEIC is good,

It's not. Support is still surprisingly patchy, and it takes a second or so to decode and show the image even on a modern M* Mac. Compared to instant PNG.

> I have a whole bunch of .HEIC photos and I'd really like if Windows Explorer didn't freeze for literal minutes when I open a folder with them.

Indeed.

redeeman 19 hours ago [-]
Just use JPEG XL. Works great on Linux. Pressure the software you use to support the proper formats.
gsich 17 hours ago [-]
>How in the world do people store images / photos nowadays?

With codecs built for that purpose, I hope. Intra-frame misconception "formats" should stay what they are: a curiosity.

dangus 20 hours ago [-]
> HEIC is good, as long as you pinky promise to never ever leave Apple's ecosystem, ie HEIC sucks.

Not really true in my experience, I have no problems using it in Windows 11, Linux, or with my non-Apple non-Google cloud photos app.

The iPhone using it in an incredibly widespread way has made it a defacto standard.

If you're having trouble in Windows, I wonder if you're running Windows 11 or 10? Because 11 seems a lot better at supporting "modern things" considering that Microsoft has been neglecting Windows 10 for 3 years and is deprecating it this year.

munchler 20 hours ago [-]
My problem with HEIC is that if you convert it to another format, it looks different from the original, for reasons that I don't understand. I switched my iPhone back to JPEG to avoid that.
jorl17 20 hours ago [-]
Perhaps due to HDR handling?
spookie 15 hours ago [-]
Only Gwenview on the Linux side is able to render them properly somehow
heraldgeezer 19 hours ago [-]
>Just as there is a clear winner for video - av1

What?? Maybe I'm too much in aarrrgh circles but it's all H.264 / 265...

NoMoreNicksLeft 22 hours ago [-]
>JPEG-XL seemed like the next logical step until Google took their toys and killed it despite already having the support in Chrome,

Why would I ever care about Chrome? I can't use adblockers on Chrome, which makes the internet even less usable than it currently is. I only start up chrome to bypass cross-origin restrictions when I need to monkey-patch javascript to download things websites try to keep me from downloading (or, like when I need to scrape from a Google website... javascript scraper bots seem to evade their bot detection perfectly, just finished downloading a few hundred gigabytes of magazines off of Google Books).

Seriously, fuck Chrome. We're less than 2 years away from things being as bad as they were in the IE6 years.

codazoda 22 hours ago [-]
I think we're there, not 2 years away.

I have software that won't work quite right in Safari or Firefox through a VPN every single day. Maybe it's the VPN and maybe it's the browser, but it doesn't matter. We're at IE levels; it's just ever so slightly more subtle this time. I'm still using alternatives but it's a battle.

NoMoreNicksLeft 21 hours ago [-]
VPNs are layer 2... I suppose it could be resizing packets in such a way as to make it glitch out, but that just seems improbable.

Some of the warez sites I download torrents from have captchas and other javascripticles that only work on Chrome, but I've yet to see it with mainstream sites.

Fight the good fight.

frollogaston 16 hours ago [-]
The VPN could be on an IP address with a bad reputation that's getting soft-blocked by some stuff
crazygringo 17 hours ago [-]
> I can't use adblockers on Chrome

Why does this myth persist?

uBlock Origin Lite works perfectly fine on Chrome, with the new Manifest v3. Blocks basically all the ads uBlock Origin did previously, including YouTube. But it uses less resources so pages load even faster.

There's an argument that adblocking could theoretically become less effective in the future but we haven't seen any evidence of that yet.

So you can very much use adblockers on Chrome.

frollogaston 16 hours ago [-]
If uBO Lite is really better, why does uBO exist?
crazygringo 16 hours ago [-]
Because uBO Lite uses a newer Chrome function call (declarativeNetRequest) that didn't exist previously (original uBO was based on webRequest).

webRequest is slower because it has to evaluate JavaScript for each request (as well as the overhead of interprocess communication), instead of the blocking being done by compiled C++ code in the same process like declarativeNetRequest does.

uBO also has a bunch of extra features like zapping that the creator explicitly chose not to include in uBO Lite, in the interests of making the Lite version as fast and resource-light as possible. For zapping, there are other extensions you can install instead if you need that.

They're two different products with two different philosophies based on two different underlying architectures. The older architecture has now gone away in Chrome, but the new one supports uBlock Origin Lite great.
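For the curious, a declarativeNetRequest ruleset is just static JSON that the browser matches in native code; a minimal sketch (hypothetical filter, and the extension's manifest.json must declare the declarativeNetRequest permission):

    cat > rules.json <<'EOF'
    [{
      "id": 1,
      "priority": 1,
      "action": { "type": "block" },
      "condition": { "urlFilter": "||ads.example.com^", "resourceTypes": ["script", "image"] }
    }]
    EOF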

pikelet 8 hours ago [-]
I think you're overstating it a bit. There were definitely features that couldn't be implemented due to MV3 limitations rather than because the developer chose to leave them out.

https://github.com/uBlockOrigin/uBOL-home/wiki/Frequently-as...

frollogaston 15 hours ago [-]
Thanks, I'll try it out then.
bob1029 23 hours ago [-]
Beyond the compression (which is amazing), JPEG is also extremely fast when implemented well. I'm not aware of any other image format that can encode at 60fps+ @ 1080p on a single CPU core. Only mild SIMD usage is required to achieve this. With dedicated hardware, the encode/decode cost quickly goes to zero.

I struggle to understand the justification for other lossy image formats as our networks continue to get faster. From a computation standpoint, it is really hard to beat JPEG. I don't know if extra steps like intra-block spatial prediction are really worth it when we are now getting 100 Mbps to our smartphones on a typical day.
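That claim is easy to sanity-check with libjpeg-turbo's bundled benchmark, assuming you have a 1080p PPM test frame lying around:

    # measures single-threaded JPEG encode/decode throughput at quality 90
    tjbench frame_1080p.ppm 90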

MrDOS 23 hours ago [-]
https://news.ycombinator.com/item?id=44298656

You might be getting 100 Mbps to your smartphone; many people – yes, even within the United States – struggle to attain a quarter of that.

bob1029 23 hours ago [-]
What is the likelihood of experiencing precisely marginal network conditions wherein webp improves the user experience so dramatically over jpeg that the user is able to notice?

If jpeg is loading like ass, webp probably isn't going to arrive much faster.

MrDOS 23 hours ago [-]
I'm sorry, I misunderstood your doubt of the usefulness of other lossy formats as criticism of using lossy formats in general in the face of higher bandwidth. Reading too fast, never mind me... :)
GuB-42 15 hours ago [-]
If you have slow internet on your smartphone, chances are you also have a slow smartphone, so decoding performance matters. It may also save you a bit of battery life for the same reason, which can be important in places with little internet coverage.

You have to find a balance, and unless (still) pictures are at the center of what you are doing, it is typically only a fraction of the bandwidth (and a fraction of the processing power too).

We are not talking about 100 Mbps; we downloaded JPEGs over dialup connections, you know. You don't even need to go into the Mbps unless you are streaming MJPEG (and why would you do that?).

tossaway0 10 hours ago [-]
25 Mbps is extremely fast relative to the benefit that image compression options better than JPEG bring when browsing the web.
Lammy 18 hours ago [-]
> I struggle to understand the justification for other lossy image formats as our networks continue to get faster.

Because Google's PageSpeed and Lighthouse both tell people to use WebP, and a large percentage of devs will do anything Google says in the hopes of winning better SERP placement:

- https://web.dev/articles/serve-images-webp

- https://developer.chrome.com/docs/lighthouse/performance/use...

illiac786 19 hours ago [-]
That's why I am confident LLMs won't change as much as some may think: after 20+ years of search engines, some people still can't be bothered to do a simple search. (Either that or you're trolling; I can't decide, I have to say.) Hence, we can wait another 20 years and some will still not use LLMs for everyday questions.

To answer your (false?) question: there's a long list of benefits, but I'd say HDR and storage efficiency are the two big ones. The storage savings in particular are massive, especially with large images.

redeeman 19 hours ago [-]
transparency? HDR? proper support for lossless? there are many things lacking in JPEG
wizardforhire 23 hours ago [-]
Exactly! It’s like asking why we still use wheels when hovercrafts exist.

If humans are still around in a thousand years they'll be using JPEGs, and they'll still be using them a thousand years after that. When things work, they have a pernicious tendency to stick around.

dsr_ 18 hours ago [-]
Wheels continue to support a load without power.

Wheels are vastly superior to hover technologies in the crucial areas of steering and controlled braking. (For uncontrolled braking, you just cut the power to your hover fans and lift the skirts...)

It turns out to be remarkably difficult to get a hovercraft to go up an incline...

Wheels are both suspension and traction in one system.

There's no particular physical advantage to JPEG over the others mentioned; it's just currently ubiquitous.

pbhjpbhj 23 hours ago [-]
Can JPEG do 3D somehow (I'm thinking VR/AR)? DVDs lasted well, until the medium itself moved to cheap NAND flash and then various SSD technologies.

When/if simple screens get usurped then we'll likely move on from JPEG.

I'm sure you were being a little flippant but your last sentence shows good insight. Someone said "we just need it to work" to me the other day and the "if it works there will be little impetus to improve it"-flag went off in my brain.

adgjlsfhk1 20 hours ago [-]
depends what you mean by 3d. jpeg-xl does let you add arbitrary channels, so you could add a depth channel, but it's not going to do a good job for full 3d (e.g. light field/point cloud).

one place I think jxl will really shine is PBR and game textures. for cases like that, it's very common to have color+transparency+bump map+normal map, and potentially even more. bundling all of those into a single file allows for way better compression

wizardforhire 22 hours ago [-]
Thanks, that's a great insight!

Idk about 3D, but I'll assume someone has probably taped something together out of necessity if they haven't already.

…and yes, very flippant! But not without good reason. If we are to extrapolate: the popularity of JPEG, love it or hate it, will invariably necessitate its continued compatibility, which supports my previous statement. That compatibility will in turn produce plausible circumstances where future developers, out of laziness, ignorance, or just plain conformity to norms, keep choosing and using it, perpetuating the cycle. So, short of a radical extinction-level event brought about by massive, widespread technological adoption such as what you describe, I don't see it going away anytime soon. Not to say it couldn't happen; I just feel it's highly improbable because of the contributing human factors.

That JPEG gets so many complaints is, I feel, for two reasons: one, its ubiquity, and two, that we actually see it! Some similar situations that don't get nearly as much attention but are far more pervasive are TCP/IP, bash, ntpd, ad nauseam. All old, pervasive protocols so embedded as to be taken for granted, and also not able to be seen.

I’ll leave with this engineering truism that I feel should be more widely adhered to in software development, especially by UI designers: if it ain’t broke don’t fix it!

tehjoker 20 hours ago [-]
JP3D can do 3D, but it is not well supported. It is an extension to the JPEG 2000 specification, IIRC.
eviks 8 hours ago [-]
Besides the awful wheel comparison, there are dozens of formats that worked and stuck around until they got replaced, so this also tells us nothing on such a huge timescale
GuB-42 17 hours ago [-]
Because JPEG just works. It is not the best by far, but everybody supports it, and it is usually not worth saving a few bytes in exchange for worse support, added complexity and extra processing.

Lossy compressed images are usually not the most significant consumers of bandwidth and disk space. Videos are. That's why there has been a lot more focus on video formats than anything else; there is a lot to gain there, not so much with still images.

JPEG-XL is super-complicated because it supports plenty of things most people don't really need.

WebP is somewhat better supported because it is backed by Google. It is also essentially a single-frame video, so if you did the hard work on video (where it matters), you get images almost for free. It also saves Google a tiny bit of bandwidth, and "a tiny bit" is huge at Google scale.

We are seeing the same thing with audio. MP3 (1991) is still extremely popular, and the rest is mostly M4A/AAC (2001). We pretty much have the perfect audio format now, which is Opus (2012), and yet we don't even use it that much, because the others are good enough for what we make of them.

Gigachad 17 hours ago [-]
Opus does get used behind the scenes though. IIRC Discord uses Opus, and YouTube uses some pretty exotic audio codecs. No one saves Opus files to their computer though.
maeln 3 hours ago [-]
It's also used in video games. Ogg/libvorbis was already pretty popular (probably because it was patent-free), and Opus has seen some use.
RadiozRadioz 16 hours ago [-]
Opus is too complex. IIRC there is only one full implementation.
donatj 46 minutes ago [-]
We started using AVIFs recently in picture elements with fallback to JPEG, and I'm a pretty big fan. The compression is good, the tooling is decent. They're supported by most things natively.

Browser support for AVIF is nearly good enough that you might not need the fallback in reality. The only real problem I have encountered is that animated AVIFs are super stuttery in Safari for some reason.

reddalo 23 hours ago [-]
The article only briefly mentions the real problem: outside of browsers, proper support for .webp files is very, very low. That's why JPEG is still king and probably will be for a long time.
AshleysBrain 23 hours ago [-]
WebP seems pretty widely supported to me - on Windows at least, Explorer shows thumbnails for them, Paint can open them, other editors like Paint.NET have built-in support... I haven't come across software that doesn't support WebP for a while.
frollogaston 22 hours ago [-]
Google Docs, of all things, does not support webp. Preview on Mac can open it but not edit. Those are my two most common use cases.
coryrc 20 hours ago [-]
I celebrated the anniversary of the (internal) bug asking for SVG support in Google slides. I think it's up to 15 years now?

So, uh, don't get your hopes up.

frollogaston 20 hours ago [-]
Well, SVGs I understand being harder to support; those aren't really images. And various anti-injection security rules treat them as untrusted HTML code.
pimlottc 17 hours ago [-]
There is a workaround for using SVG in Google Slides by using Google Drive to convert to EMF (a format I’ve never heard of anywhere else). It’s a pain, though.

https://graphicdesign.stackexchange.com/questions/115814/how...

frollogaston 16 hours ago [-]
Huh, first I've heard of EMF too.
Koffiepoeder 8 hours ago [-]
It seems so strange to me that this is so hard to add: I recently wrote a userscript to extract titles from slides, and if I recall correctly, Google Slides already renders in an SVG format. Wondering what's going on here.
jhoechtl 22 hours ago [-]
Right, same on Linux/KDE.

Is missing WebP support a meme?

freedomben 20 hours ago [-]
Yep, on GNOME we have both eog and GIMP, which support webp completely and have for many years. I don't think I've even tried other apps, but I haven't needed to. I didn't even realize this was a problem on some platforms.
CM30 22 hours ago [-]
Case in point, DaVinci Resolve. Incredibly popular with people creating videos for YouTube and TikTok, still doesn't support webp in 2025.

This becomes an issue if you're creating content about trending topics, since lots of marketing sites love using webp for every image.

nemomarx 23 hours ago [-]
If I even want to download a webp and look at the file, I need to convert it. It's barely functional in basic image galleries outside of mobile.
k__ 23 hours ago [-]
Sometimes you upload a jpeg, they convert it to webp, and then don't allow uploading webps.
palmfacehn 23 hours ago [-]
I had uploaded lossless webp images; the 3rd-party site cached the images from my server and re-encoded them in a lossy format with a higher file size and lower fidelity.
edflsafoiewq 23 hours ago [-]
They'll do that to large PNGs too.
throawayonthe 23 hours ago [-]
[dead]
Acrobatic_Road 23 hours ago [-]
Also missing from popular browsers is support for the new JPEG XL format.
pbhjpbhj 23 hours ago [-]
https://caniuse.com/jpegxl

Looks like a mixture of runtime and compiler flags are needed except for Safari.

JacobiX 23 hours ago [-]
I loved the article, but it overlooks one important point: although the JPEG format is frozen, encoders are still evolving! Advances such as smarter quantization, better perceptual models, and higher-precision maths enable us to achieve higher compression ratios while sticking to a format that's supported everywhere :)
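As a small illustration of that encoder-side headroom (a sketch assuming Pillow and a hypothetical input file photo.jpg; mozjpeg and jpegli push much further in the same spirit), even just optimized Huffman tables and a progressive scan order shave bytes with zero compatibility cost:

    # Same JPEG format, more encoder effort: any JPEG decoder can still
    # read the result.
    import io

    from PIL import Image

    img = Image.open("photo.jpg").convert("RGB")  # hypothetical input

    def encoded_size(**opts):
        buf = io.BytesIO()
        img.save(buf, format="JPEG", quality=85, **opts)
        return len(buf.getvalue())

    baseline = encoded_size()
    tuned = encoded_size(optimize=True, progressive=True)
    print(f"baseline {baseline} B, tuned {tuned} B "
          f"({100 * (baseline - tuned) / baseline:.1f}% smaller)")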
cogman10 23 hours ago [-]
This is true, but there are limits. It's a little bit like DEFLATE. Sure, very advanced compressors like Zopfli exist which can get better compression ratios. But then there's also just Zstd, which will trivially get a better compression ratio and compression speed.
edflsafoiewq 23 hours ago [-]
I guess you're thinking of jpegli? Do you know how big a difference this actually makes?
qingcharles 8 hours ago [-]
Jpegli is designed from the ashes of JPEG-XL (same author), both from Google. IIRC he also had a hand in the PNG format?
ksec 23 hours ago [-]
Anywhere from 5-15% if I remember correctly, depending on source material. I was at one point thinking this would make JPEG XL and AVIF moot because all of a sudden JPEG became good enough again. But the author of JPEG XL suggests there is still a lot the JPEG XL encoder can do to further optimise bits / quality, especially in the range below 1.0 bpp.
JacobiX 23 hours ago [-]
MozJPEG, Guetzli and also Jpegli
whizzter 2 hours ago [-]
While the article is mostly good, one glaring technical error annoys me.

"The stronger the cosine transformation, the more compressed the final result" is simply wrong.

The DCT (and the inverse DCT) are transforms between the "sample" and "frequency" domains; they are well defined and should be perfectly reversible, with no compression involved (IIRC the one in JPEG should be lossless given as many bits as the samples themselves).

The trick of DCT-based compression is that humans don't notice when information disappears from the higher frequencies (also, in _natural images_ there is often little data in the high frequencies, often lots of zeros that can be cut immediately).

So harder compression means removing more high frequency data from storage without it being too noticeable when reconstructing samples from the frequency domain at decompression.

Conversely, however, if you have "sharp edges" in the sample data, you need more high frequencies to reproduce the sharp edges without "ringing" artefacts (this is why you will see noisy blocks around text in highly compressed JPEGs: the encoder runs out of bandwidth to adjust).

The frequency-domain values, and how compression removes various frequencies (black and white in the filter images), are illustrated in the Wikipedia filter comparison example image below (low frequencies are in the upper-left corner of the filter and spectrum images, while higher horizontal frequencies are to the right and higher vertical frequencies are towards the bottom).

https://en.wikipedia.org/wiki/File:DCT_filter_comparison.png

https://en.wikipedia.org/wiki/Discrete_cosine_transform (Mainly "Example of IDCT" section towards the bottom but also the preceding ones).
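A tiny sketch of the same point (assuming numpy and scipy are available): the transform itself round-trips exactly, and the loss only appears once high-frequency coefficients are thrown away:

    import numpy as np
    from scipy.fft import dctn, idctn

    rng = np.random.default_rng(1)
    block = rng.integers(0, 256, (8, 8)).astype(float)  # one 8x8 sample block

    coeffs = dctn(block, norm="ortho")       # sample -> frequency domain
    roundtrip = idctn(coeffs, norm="ortho")  # frequency -> sample domain
    print(np.allclose(block, roundtrip))     # True: the DCT alone is lossless

    # "Harder" compression: zero out everything but the low-frequency
    # upper-left corner, a crude stand-in for JPEG's quantization tables.
    keep = np.add.outer(np.arange(8), np.arange(8)) < 4
    lossy = idctn(coeffs * keep, norm="ortho")
    print(np.abs(block - lossy).max())       # nonzero: information is gone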

legitster 23 hours ago [-]
This takes me back to when the goal of a webpage was to be less than 1 MB. At the time, the only reason to use PNG (such luxury) was when you needed transparency.

The variable compression of JPEG was very important. In Photoshop you could just grab an image and choose the file size you needed and the JPEG quality would degrade to match your design constraints.

modeless 19 hours ago [-]
The amazing thing about JPEG is people are still squeezing out backwards compatible compression improvements 30 years later. Mozjpeg is well known by now, but Jpegli was just published last year and does even better at high bitrates[1]. It's hard to want to adopt newer formats when they keep getting squeezed by further improvements to good old JPEG.

[1] https://opensource.googleblog.com/2024/04/introducing-jpegli...

bonoboTP 3 hours ago [-]
It would be interesting to see optimization of compression that's targeted at computer vision processing instead of human visual preferences. What I mean is that instead of making the image look as nice as possible at a certain file size, the goal would be to improve computer vision task performance scores, like running a classifier or segmentation model on that image and evaluating the correctness of the model prediction. A compression algo is better if the model accuracy is higher at a certain file size.
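A minimal sketch of such an evaluation loop (assuming Pillow and torchvision; the file name and quality sweep are illustrative, and a real study would aggregate over a whole dataset):

    import io

    import torch
    from PIL import Image
    from torchvision import models

    weights = models.ResNet18_Weights.DEFAULT
    model = models.resnet18(weights=weights).eval()
    prep = weights.transforms()  # the preprocessing these weights expect

    def predict(img):
        with torch.no_grad():
            return model(prep(img).unsqueeze(0)).argmax(1).item()

    original = Image.open("cat.jpg").convert("RGB")  # hypothetical test image
    reference = predict(original)

    for q in (90, 50, 20, 10, 5):
        buf = io.BytesIO()
        original.save(buf, format="JPEG", quality=q)
        degraded = Image.open(buf).convert("RGB")
        same = predict(degraded) == reference
        print(f"q={q:2d}: {len(buf.getvalue()) / 1024:6.1f} KiB, "
              f"prediction unchanged: {same}")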

Edit: yes, this has been done https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123...

Jinyoung Choi and Bohyung Han, Task-Aware Quantization Network for JPEG Image Compression. ECCV 2020

no_wizard 23 hours ago [-]
EDIT: I was wrong, and had the PNG and JPEG formats backward. As others correctly pointed out, PNG is lossless whereas JPEG is lossy. PNG is better for marketing / UI / artistic imagery and JPEG for photographs, given how well photographs tolerate JPEG's lossy encoding; that seems to be the generally accepted opinion now.

Regardless, since the picture tag[0] was introduced I've used that for most image media by default, with relevant fallbacks and WebP as the default. It also allows loading appropriately sized images based on media queries, which is a nice bonus.

[0]: https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...

chowells 23 hours ago [-]
They're just for different jobs.

JPEG is lossy, in ways that were initially optimized for photographs. The details it loses are often not details photographs are good at providing in the first place. In exchange for losing some data, it gets to pick which data it compresses, and it chooses in a way that minimizes the size of the compressed output.

PNG is a lossless format. It's practically mandatory when you need 100% fidelity, as with icons or other graphics that are intended to have high contrast in small areas. It's able to optimize large areas of the same color very well, but suffers when colors change rapidly. It's especially unsuitable for photographs, as sensor noise (or film grain, if the source was film) create subtle color variations that are very difficult for it to encode efficiently.

You basically never have a situation where either one would do equally well. They are for different things, and should be used as such.

MrDOS 23 hours ago [-]
PNGs for line art and text, JPEGs for photorealistic images.

> When I came up in the IE era

In the IE era I recall, the battle was between GIF and JPEG because IE supported alphatransparent PNGs very poorly :)

> if I recall correctly JPEG as a format can encode an image with a higher fidelity than PNG

The other way around: JPEGs are “lossy” – they throw away visual information to save file size. PNGs, on the other hand, are “lossless”, and decode back to exactly the same pixels that were fed into the encoder.

no_wizard 23 hours ago [-]
It’s been so darn long since I looked into these formats I got it backward. Thank you!
Tobani 23 hours ago [-]
They're different beasts.

JPEGS are great for photographs, where lossy compression is acceptable.

PNGs can have transparency and lossless compression, so they're great for things like rasterized vector graphics, UI-element parts of a web page, etc.

kevingadd 23 hours ago [-]
PNGs are also ideal for color accuracy, since one of the things you lose quickly when converting to JPEG is the ability to have an exact RGB value flow from input to output, even at a high quality level. So if you want, e.g., a banner graphic to blend seamlessly into your site's background color, JPEG is worse for that.
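This is easy to see for yourself (a sketch assuming Pillow; exactly which values come back depends on the encoder, which is the point):

    import io

    from PIL import Image

    color = (34, 177, 76)    # arbitrary "site background" color
    img = Image.new("RGB", (64, 64), color)

    def roundtrip(fmt, **opts):
        buf = io.BytesIO()
        img.save(buf, format=fmt, **opts)
        return Image.open(buf).convert("RGB").getpixel((32, 32))

    print(roundtrip("PNG"))  # (34, 177, 76), exactly, always
    # Even at max quality with chroma subsampling off, the RGB -> YCbCr -> RGB
    # trip through JPEG usually lands slightly off the original value.
    print(roundtrip("JPEG", quality=100, subsampling=0))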
martin_a 22 hours ago [-]
Sorry, but that sounds more like a problem with your color management systems and workflows and not so much with JPEG itself.
kevingadd 18 hours ago [-]
JPEGs are intrinsically YCbCr not RGB, so I don't know how you would avoid some precision loss when you combine that colorspace conversion with things like the quality level and (potentially) subsampling.
qingcharles 8 hours ago [-]
Are they? I thought JPG could switch? I know jpegli uses XYB colorspace in JPGs.
gen2brain 4 hours ago [-]
JPEG can be YCbCr, RGB or even CMYK (usually from Photoshop or similar software). There can also be unusual subsample ratios for YCbCr, such as 4:2:1. I created some WASM Go bindings for jpegli, and I recall that such images cannot be represented in Go; therefore, for these images, I force RGBA output.

Also, XYB is an option; I use the adaptive quantisation option from jpegli (with progressive level 2) to get smaller files. I never bothered with XYB, as it looked complicated.

roelschroeven 23 hours ago [-]
It depends on the use case.

For archival purposes, where not losing detail matters more than image size, PNG is better (though TIFF is often used for that use case). For images with large blocks of solid colors and sharp edges (text, line drawings), PNG is arguably better (though JPEG can be acceptable if you're careful with quality settings). If you need alpha support, go for PNG, since JPEG doesn't support that.

For photograph-like images, where image size is important, JPEG is preferred over PNG.

uses 23 hours ago [-]
PNG should be used for some types of graphics, like whenever you have big areas of solid color (like logos) or any time you need translucency / transparency. Although, nowadays you can and should use SVG in most of those cases.

JPEG should be used for everything else.

adgjlsfhk1 20 hours ago [-]
if your software supports it, lossless jpeg-xl supports all of these and should give you better compression
ghssds 23 hours ago [-]
JPEG and PNG don't work the same way. JPEG is lossy - it removes information from the original image - and PNG is lossless. For distribution, like on a website, with JPEG you can compress the image much more than with PNG without users noticing, IF the image is photographic. If the JPEG represents a drawing or a screenshot, or contains labels, the lossy compression will be noticeable and PNG is much more appropriate. As a master copy for further modification and compression, JPEG is a bad idea, again because it is lossy. You can't ever encode an image with higher fidelity in JPEG vs. PNG, but both are useful.
sapiogram 23 hours ago [-]
> (if I recall correctly JPEG as a format can encode an image with a higher fidelity than PNG, at least in some circumstances)~

PNG is a lossless format, so I don't think that's possible, unless there's some specific feature that is not available in PNG.

addaon 23 hours ago [-]
> if I recall correctly JPEG as a format can encode an image with a higher fidelity than PNG, at least in some circumstances

24-bit color PNGs are lossless, to the extent that the input image is encodable in 24 bits of RGB (which is pretty much everything that's not HDR). There's no higher fidelity available for normal input images. If file size limits would force palettized PNGs, it's quite possible for a JPEG at the same file size to have higher fidelity (since it makes a different set of trade-offs, keeping color resolution but giving up spatial resolution instead); but this isn't really a common or particularly valid comparison in the PNG era, and was more of an issue when comparing to GIFs.

tl;dr: Nope, PNG is perfect. JPEG can approach perfect, but never get there. Comparison is only interesting with external (e.g. file size) constraints.

vikingerik 23 hours ago [-]
There is a lossless JPEG spec and format, though use of it never caught on much: https://en.wikipedia.org/wiki/Lossless_JPEG
adgjlsfhk1 20 hours ago [-]
hopefully lossless jpeg-xl actually gets used
RugnirViking 4 hours ago [-]
The single biggest things that could be done to remedy this, in order:

1) snipping/clipping/screenshot tool outputs webp

2) convert to webp on ctrl-c ctrl-v from browsers

3) whatsapp/messenger/discord support. People will say these work fine; in my experience it's a gamble, which it shouldn't be. It should be seamless, with literally no edge cases.

wolf550e 19 hours ago [-]
The reason JPEGs still rule is because Google Chrome removed support for JPEG-XL, the actually better photo format, because the Google guys who did AVIF decided they don't want competition.
ethan_smith 8 hours ago [-]
Chrome's JPEG-XL removal was officially due to low usage metrics and prioritization concerns, not just competitive motives - Google's own engineers were divided on the decision, with many supporting JXL's technical merits.
hungryhobbit 23 hours ago [-]
I love how the article implies there's something flawed about webp at the end ... but if you click the link the only "flaw" reported is that webp isn't ubiquitous enough yet, so some sites don't support it

Perfect logic: let's not switch to webp because it's bad. Why is it bad? Not everyone has switched to it yet.

Lammy 18 hours ago [-]
Being a much more complicated format than JPEG (WebP is based on the VP8 video codec) invites a huge attack surface because it requires so much more code to support. On top of that, the fact that 99% of people (even Apple's ImageIO!) use Google's libwebp means any exploit can hit almost everyone all at once. This has actually happened:

- https://nvd.nist.gov/vuln/detail/CVE-2023-41064

- https://nvd.nist.gov/vuln/detail/CVE-2023-41061

- https://nvd.nist.gov/vuln/detail/CVE-2023-4863

- https://citizenlab.ca/2023/09/blastpass-nso-group-iphone-zer...

frollogaston 22 hours ago [-]
The lack of support makes me suspicious of it. If even Google Docs finds it too difficult to prioritize webp support, idk if there's some hidden problem with ease of implementation.
msabalau 22 hours ago [-]
As an end user, I hate, hate, hate webp, because I can't easily use the images in a wide range of ways.

Maybe it's vaguely more flexible and compresses well. I don't care. If someone uses it, I despise them.

userbinator 11 hours ago [-]
I have written a JPEG, GIF, and PNG decoder. All easily weekend projects. As for the other two I've attempted the same for:

JPEG2000: insanely complex and nonintuitive, especially the edge-cases and overly flexible encoding decisions

WebP: also complex, and effectively Google-proprietary

panja 10 hours ago [-]
Thoughts on jxl?
NooneAtAll3 10 hours ago [-]
have you tried QOI? ;)
fastball 23 hours ago [-]
> These days, the format is similar to MP3 or ZIP files—two legacy formats too popular and widely used to kill.

While killing MP3 might be difficult, the vast majority of people aren't handling audio files themselves these days, so it probably wouldn't be hard to phase it out fairly rapidly.

frollogaston 22 hours ago [-]
I do handle files myself, but almost nothing keeps me on MP3. The only time I've even seen an MP3 in the past few years was when trying to get music into an old car system, which also sucked so much (e.g. shuffle feature with replacement) that I went back to just using the aux.
fastball 12 hours ago [-]
Yeah, most audiophiles/people I know who haven't moved entirely to Spotify/YouTube/Apple Music/etc. are not storing files in mp3. FLAC or something else lossless is the obvious preference.
19 hours ago [-]
Krasnol 23 hours ago [-]
The Western World perspective on this platform generates quite funny statements sometimes. Outside of it, mp3 is still quite popular and normal.

Even within the Western World, there are many people who like to own their digital music.

geerlingguy 23 hours ago [-]
I still encode everything in MP3. The files work on my 20-year-old SanDisk player, my original iPod, my iPhone, Mac, Chromebook, Windows laptop, MP3 CDs in our '08 minivan...

It's nice to have that consistent ubiquity, something very hard to find these days. Especially if your entire audio library (audio books, podcasts, songs) comes from some streaming service that requires an app!

ryandrake 19 hours ago [-]
MP3 is still "good enough." I wasn't smart enough to keep FLACs around, and I'm not going to go back through my hundreds of CDs in boxes in my attic somewhere and re-encode every single one to a "modern" format for slight but noticeable quality gains. As you say, they also work everywhere, including my 16 year old car.
frollogaston 22 hours ago [-]
The poor quality is a dealbreaker for me. Yes it's fine with a high enough bitrate, but there isn't ubiquitous support for that.
voidUpdate 7 hours ago [-]
What settings are you using with MP3 for it to be low quality enough to be a dealbreaker? I save all my music in MP3 and it all sounds perfectly fine to me, and have done forever
encom 17 hours ago [-]
What decoders don't support VBR MP3 at this point? You would have to go back at least 20 years to find software that breaks on VBR. Maybe some very very terrible hardware decoder chokes on it?

Incidentally, breakage on VBR bitstreams is buggy behaviour, because some lazy developers assumed frame sizes would never change. VBR is completely within spec, so decoders shouldn't need to do anything special to support it.

Lastly, a note on bitrate: 320 kbps CBR (the max allowed in spec) is often wasteful and pointless. In many cases, an encoder will pad out frames to conform to the requested bitrate. Indeed, tools exist that will losslessly re-encode a CBR file to VBR by removing the padding, producing a smaller file. MP3 (as good as it is) has certain problem samples that aren't fixed by throwing more bits at them. A competent encoder with proper settings, like LAME, which defaults to -V4, is transparent on most samples to most people. If you disagree, you should double-blind test yourself.

frollogaston 16 hours ago [-]
I'll have to check again, but my issue wasn't VBR, it was CBR above 128kbit/s I believe? 2012 Maserati GranTurismo, which has notoriously not-so-good electronics.
fastball 12 hours ago [-]
People that like to own their digital music in the Western World are generally not storing the files in mp3.

Where outside the West are you seeing many people that are specifically storing audio files in mp3 (vs just streaming/ storing in a better digital format/storing physical media)?

I live in SEA and most people here are not storing their own mp3s. Most people don't have computers at all – they have budget Android phones that don't have much built-in storage. What they do have is cheap internet, so they are either using Spotify (Free)/YouTube/etc. Many people still use CDs (mostly in cars) but those aren't mp3 either.

conradfr 23 hours ago [-]
Like when someone says spending $1000 to learn to code effectively using Claude or any other AI is nothing.
lucgommans 17 hours ago [-]
> It’s been difficult to remove [old JPEG] from its perch. [...] the formats AVIF and HEIC, each developed by standards bodies, have largely outpaced [JPEG]

I'm currently sticking to JPEG because, last time I tried, JPEG came out as the best format. Referencing my memory at https://chaos.social/@luc/113615076328300784

- JPEG has two advantages on slow connections: the dimensions are stored up front so the layout doesn't jump (or maybe the renderer is just better), and it loads a less-sharp version first and progressively gets sharper

- JPEG was way faster when compressing and decompressing

- on the particular photo I wanted to optimise in this instance, JPEG was also simply the best quality for a given filesize which really surprised me after 32 years of potential innovation

Regarding AVIF, my n=1 experience was that it "makes smooth gradients where jpeg degrades to blotchy pixels, but at decent quality levels, jpeg preserves the grain that makes the photo look real". Gradients instead of ugliness at really small sizes can be perfect for your use-case, but note that it's also ~80 times slower at compression (80s vs. <1s)

JPEG XL isn't widely supported in browsers yet, so I couldn't use it.

> These days, the [JPEG] format is similar to MP3

The difference with MP3 is that Opus is sometimes a bit better and sometimes much better, but it's always noticeably better.

You can save ~half the storage space. For speech (audio books) I use 40 kbps, and for music maybe 128 kbps, which is probably overkill. And I delete the originals without even checking anymore whether they really sound the same; I've noticed that I simply can't tell the original apart in a blind test, no matter what expensive headset setup I try.

TFA attributes it to a simple "they were first" advantage, but I think this is the real answer to "Why JPEGs still rule the web": no file format is better than JPEG in the way Opus is better than MP3, where you don't have to think about it anymore and it's always a win in either filesize or quality.

That said, Opus is also annoyingly hard to get into people's minds, but I've done it, and you also see major platforms, from compress-once-serve-continuously video (e.g. YouTube) to VoIP (e.g. WhatsApp), switching over for all their audio applications.

alex77456 6 hours ago [-]
I'm surprised some multi-encoder container format hasn't taken over by now, seeing as there is no one-size-fits-all clear winner so far.
ksec 23 hours ago [-]
I want to add a slightly off topic point here.

This submission was originally shown as [dead]. I have no idea why; I read some of the content and it seems decent enough, especially in the current state of things where JPEG XL is blocked because of AOM / Google Chrome. I vouched for it and upvoted, and then somehow it was on the front page.

I wonder if dead means someone flagged it. If so, then why? If not, why was it dead?

carlosjobim 19 hours ago [-]
Almost every new submission to Hacker News gets [dead] and [flagged] for no reason, and somebody has to vouch for it. I don't know if it's an automated system or some kind of activists who are stalking the New section.
encom 16 hours ago [-]
Flagging seems to be used like a super downvote on both submissions and comments. Admins appear okay with it.
scrapheap 7 hours ago [-]
I'm sure I'm not the only person here who is reminded by this that, over 30 years ago, they were amazed to be viewing JPEGs on their Amiga using HAM mode...
poisonborz 21 hours ago [-]
The actual issue is that this is not an issue for users, only for global providers, so they were the ones pushing "obscure" solutions. Music fidelity was more of a consumer problem, so formats like FLAC found a foothold beside the go-to one.
t1234s 19 hours ago [-]
I notice webp produces images about 20% smaller than MozJPEG while appearing slightly sharper. I use a <picture> element to offer webp versions first, but I always include jpeg versions for future-proofing.
karim79 16 hours ago [-]
I've no single clue as to the future of image formats. All of them (or almost all) have merits and stuff they suck at. I've been building a service which makes no judgement about it but tries to provide all the choices. JXL will be added soon:

https://kraken.io

MallocVoidstar 23 hours ago [-]
JPEG: No active patents, universal support, good enough.

HEIC: Have fun licensing this.

WebP: Slightly better than JPEG maybe, but it only supports 1/4 chroma resolution in lossy mode, so some JPEGs will always look better than the equivalent WebP (see the sketch after this list).

AVIF: Better than JPEG probably, but encoders for AV1 are currently heavily biased towards blurring, even at very high bitrates. Non-Chrome browser support took a while.
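The chroma point is easy to demonstrate (a sketch assuming Pillow; a 1-pixel red/blue stripe pattern is pure chroma detail, the worst case for 4:2:0):

    import io

    import numpy as np
    from PIL import Image

    # Alternating 1px red/blue columns: color edges JPEG can keep at 4:4:4,
    # but which lossy WebP must average, since it is always 4:2:0.
    arr = np.zeros((64, 64, 3), dtype=np.uint8)
    arr[:, ::2] = (255, 0, 0)
    arr[:, 1::2] = (0, 0, 255)
    img = Image.fromarray(arr)

    def mean_error(fmt, **opts):
        buf = io.BytesIO()
        img.save(buf, format=fmt, **opts)
        out = np.asarray(Image.open(buf).convert("RGB"), dtype=float)
        return np.abs(out - arr).mean()

    print("JPEG 4:4:4:", mean_error("JPEG", quality=95, subsampling=0))
    print("lossy WebP:", mean_error("WEBP", quality=95, lossless=False))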

dylan604 23 hours ago [-]
Sometimes a format is simply good enough. The gains that WebP may or may not have definitely do not get it over the hump of being so tied to a browser. AVIF is still very new, but other than the animation stuff, is it enough of an improvement to get people to switch? And that brings me to the entire corpus of existing files. A JPEG decoder will always be necessary given the number of preexisting files requiring it. If you have to support JPEG for those, you might as well keep making new files in the same format as well.
frollogaston 19 hours ago [-]
"Do I look like ah know what a web-P is? I just want a jaypeg of a gosh darn hotdog!" is my attitude on this
llm_nerd 23 hours ago [-]
JPEG XL: Better than JPEG in every way aside from legacy support, while being royalty free and open.

You can even "losslessly" compress existing JPEGs to JPEGXL.

JPEG XL is the natural replacement for JPEG and it is perverse that Google backtracked on supporting it.

addaon 23 hours ago [-]
> JPEG XL: Better than JPEG in every way aside from legacy support, while being royalty free and open.

And decoder complexity. A software JPEG decoder is a weekend project. A hardware JPEG decoder not much more. Doing the same for arbitrary JPEG XL files is much, much more complicated. In any world where any of development cost, implementation complexity, expected code quality (especially when using first-order assumptions like constant number of defects per line of code), or decoder resources (especially for hardware implementations) are important, JPEG has serious advantages.

eviks 8 hours ago [-]
> A software JPEG decoder is a weekend project.

How many weekend project decoders are used in real apps?

addaon 2 hours ago [-]
Seems like a hard thing to know. I’ve only written and shipped one (MJPEG w/ Speex audio slipstreamed in COM blocks), but I’m only one person… so based on that sample, somewhere between zero and the number of software engineers?
llm_nerd 20 hours ago [-]
If we were talking about some abstract hypothetical format, this is entirely reasonable. Only there are a number of extremely high quality JPEG XL decoders. There is zero need for hardware assistance for decoding JPEG or JPEG XL, so that difference is a non-difference.

Every device in the world with iOS 17 or macOS 14 or better has JPEG XL support across the system.

This is a complete and utter non-issue. Google had even added JPEG XL support to Chromium, and then bizarrely removed it (not long before Apple fully supported JXL across all their platforms which invariably would have pushed it over the top), presumably to try to anoint WebP as the successor. Only WebP has so many disadvantages that all they did was entrench classic JPEG.

JPEG XL is unquestionably the best current next gen format for images.

addaon 18 hours ago [-]
> There is zero need for hardware assistance for decoding JPEG or JPEG XL, so that difference is a non-difference.

This depends on the system requirements, doesn't it? Suppose you're compositing a low-safety-impact video stream with (well, under) safety-impacting information in an avionics application, and you're currently using a direct GMSL link. There's an obvious opportunity to cost-down and weight-down the system by shifting to a lightly compressed stream over an existing shared Ethernet bus, and MJPEG is a reasonable option for this application (as is H.264, and other options -- trade study needed). When considering replacing JPEG with JPEG XL in this implementation, what's your plan for providing partitioning between the "extremely high quality" but QM software implementation and the DAL components? Are you going to dedicate a core to avoid timing interference? At that point you're spending more silicon than a dedicated JPEG decoder would take. You likely already have an FPGA in the system for doing the compositing itself, but what's the area trade-off between an existing "extremely high quality" JPEG XL hardware decoder and the JPEG one that you've been using for decades?

I don't doubt that in a world where everything is an iPhone (with a token nod to Android), "someone already wrote the code once and it's good enough" is sufficient. But there's a huge field of software engineering where complexity and quality drive decision making, and JPEG XL really is much more complex than JPEG Classic Flavor.

23 hours ago [-]
_kidlike 23 hours ago [-]
what about PNG?
kccqzy 23 hours ago [-]
Apples and oranges. PNGs aren't used for photographic images. It's good for line art, certain cartoons, pixel art and the like.
zargon 23 hours ago [-]
Entirely different category.
suspended_state 6 hours ago [-]
It's the VHS of digital image compression and storage.
greenavocado 23 hours ago [-]
DO NOT USE WEBP

JPEG-XL is the superior format. The only reason WebP exists is not because of natural selection, but because of nepotism (Google Chrome).

https://www.reddit.com/r/AV1/comments/ju18pz/generation_loss...

cogman10 23 hours ago [-]
JPEG-XL is supported by exactly one browser. WebP and AVIF are supported by just about every browser.

I'd love to use JPEG-XL, but I'm guessing the only way to do that is also bringing along a WASM decoder everywhere I want to use it.

wpollock 19 hours ago [-]
This situation might change if Google is forced to divest itself of Chrome. This is currently in the US courts, but it might take a while.
22 hours ago [-]
caycep 23 hours ago [-]
due to above said nepotism...

standards require some politicking and money I suppose

Arnt 22 hours ago [-]
Standards do require that. Someone has to show up at the right meetings and conferences, and it's often necessary to contribute code too.

"Build it and they will come" doesn't work for products, and it doesn't work for standards either.

Arnt 21 hours ago [-]
... and I want to add that there's no need to assume nepotism.

If you get code merged into something like Chrome, and it's big and goes unused for a few years, at some point some security-minded person will come along and argue that your code is an unused attack surface and should be removed again.

simoncion 3 hours ago [-]
> ... and I want to add that there's no need to assume nepotism.

Sure, and while this is true:

> If you get code merged into something like Chrome, and it's big and goes unused for a few years [it's likely to get removed.]

it's also true that Google could have pushed JPEG XL instead of pushing WebP, which would have massively increased the usage stats of the JPEG XL code and saved it from removal. But (for whatever reason) Google decided to set things up to push folks to use WebP at every turn, and here we are.

ericmcer 21 hours ago [-]
Standardization is the most important feature a technology can have. Look at JS.
frollogaston 17 hours ago [-]
Standardization is a big feature of JS, but it's also a surprisingly good language for its use cases. There was even good reason to port it over to the backend (NodeJS).
josefx 20 hours ago [-]
> Look at JS.

Given how much duct tape it took at times to get various browsers to behave, I would say JS is proof of the opposite. It succeeded in an environment where standards were a mere suggestion.

frollogaston 17 hours ago [-]
Don't they all run JS ES5 itself the same way? It's more that each has different feature sets (HTML5 stuff, webrtc, wasm, etc), which are callable from JS but would've been a problem regardless of the language.
17 hours ago [-]
NoMoreNicksLeft 22 hours ago [-]
There are only two browsers, Safari and Firefox.
Acrobatic_Road 22 hours ago [-]
only Nightly
whywhywhywhy 23 hours ago [-]
Upload a JPEG and it gets converted to WebP. The next person downloads the WebP and converts it to JPEG to actually edit it in software, because even things like Photoshop/macOS Preview can't edit one natively, then saves to JPEG and uploads, and it gets converted to WebP again.

The next person downloads the now JPEG'd, WebP'd, JPEG'd, WebP'd image.

Fast forward a decade and the original-quality versions of the images no longer exist, just these degraded copies.

blooalien 22 hours ago [-]
Even if you have image editing software that directly supports the WebP format for input / output, you still have the exact same problem, because (like JPEG) it's a lossy format which loses fidelity with each successive load / save generation over time. If you intend to edit an image, then the original (if possible) and its edits should both be saved in a lossless format, even if the final published output is saved in a lossy format. If no lossless version of the original is available, then the highest-quality version of the original should be converted to a lossless format for "archival" before editing, and that copy should always be used for editing purposes, rather than piling loss upon loss by editing the lossy format and re-saving. It's kinda like copies of copies of copies of MP3 audio. Eventually it becomes a soupy mess not worth using.
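A generation-loss sketch along those lines (assuming Pillow and a hypothetical lossless master original.png):

    import io

    import numpy as np
    from PIL import Image

    img = Image.open("original.png").convert("RGB")  # hypothetical master
    reference = np.asarray(img, dtype=float)

    # Re-save as JPEG ten times, measuring the drift from the master.
    for gen in range(1, 11):
        buf = io.BytesIO()
        img.save(buf, format="JPEG", quality=75)
        img = Image.open(buf).convert("RGB")
        err = np.abs(np.asarray(img, dtype=float) - reference).mean()
        print(f"generation {gen:2d}: mean abs error {err:.2f}")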
whywhywhywhy 12 hours ago [-]
It doubles the speed of the degradation, because the image is re-encoded on upload regardless of size, and the encoding artefacts stack twice over.
vunderba 20 hours ago [-]
Well... kinda. WebP is a bit of an unusual format in that it supports both lossy and lossless forms of compression.

Most decent image editing software (Photoshop, Pixelmator, etc) will let you choose what you want.

https://www.adobe.com/creativecloud/file-types/image/raster/...

But if you're not a professional it would be easy to mix up the two and slowly end up with VHS level degradation.

xeonmc 20 hours ago [-]
relevant xkcd: https://xkcd.com/1683/
dragonwriter 23 hours ago [-]
So, what you are saying is that JPEG-XL is superior, as long as your use case is insensitive to whether the majority of web users can view your content?
alwillis 22 hours ago [-]
I get it but 2+ billion Apple device users is not nothin’.

They can render JPEG-XL; everything else will render the fall back format like JPEG or WebP.

throw_m239339 22 hours ago [-]
Teams & developers will likely only chose a single format if they can, the one that most browsers support, because doing some content negotiation is more code, more work. It doesn't take away anything from the parent point though, if JPEG-XL is more performant it could reduce bandwidth requirements.
alwillis 19 hours ago [-]
> because doing some content negotiation is more code, more work

It's actually not more work. The user's browser automatically handles the content negotiation and only downloads the image format it understands:

    <picture>
      <source srcset="photo.jxl" type="image/jxl">
      <source srcset="photo.webp" type="image/webp">
      <img src="photo.jpg" alt="Product photo" loading="lazy">
    </picture>
macOS, iPadOS and iOS get the JPEG-XL image, devices that can handle WebP get that, and everything else gets JPEG.

There are several image delivery services that will provide the best image format depending on the device that's connecting.

Zardoz84 22 hours ago [-]
Just fucking use a polyfill to add support for JPEG XL. Or store JPEG XL and convert on the fly to JPEG for browsers that don't support JPEG XL.
MrDOS 22 hours ago [-]
No need to bring JavaScript into it.

In HTML, use `<picture>`: https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...

In CSS, use `image-set()`: https://developer.mozilla.org/en-US/docs/Web/CSS/image/image....

throw_m239339 22 hours ago [-]
> just fuking use a poly fill to add support of JPEG XL. Or store JPEG XL and convert on the fly to JPEG to supply it to browsers that don't support JPEG XL.

Doesn't a polyfill imply more Javascript running on the device?

BugsJustFindMe 22 hours ago [-]
Only until the browser gets updated and then the polyfill stops being invoked automatically. It's self-healing.
Zardoz84 20 hours ago [-]
It's WASM, and it really isn't very big. So if you need to store and serve many image files, potentially big images as in preservation & diffusion software (like the stuff that I'm working on), the extra few KiBs for that WASM felt affordable to me.
Spivak 23 hours ago [-]
I mean, if you define superior not in terms of its technical merits but in terms of its blessed status from Google. It's not a terrible format, but it is very much "what is the worst quality we can reasonably get away with to save on bandwidth?" Such a thing does have its uses.

At some point you have to be pragmatic and meet users where they are, but that doesn't mean you have to like that Google threw their weight around in a way that only they really can.

dragonwriter 23 hours ago [-]
> I mean if you define superior not in terms of its technical merits

“Technical merits” are rarely, for anything, the sole measurement of fitness for purposes.

Even for purely internal uses, internal social, cultural, and non-technical business constraints often have a real impact on what is the best choice, and when you get out into wider uses with external users, the non-technical factors proliferate. That's just reality.

I understand the aesthetic preference to have decisions only require considering a narrow set of technical criteria which you think should be important, but you will make suboptimal decisions in the vast majority of real-world circumstances if you pretend that the actual decision before you conforms to that aesthetic ideal.

palmfacehn 23 hours ago [-]
I enjoy webp lossless mode
frollogaston 22 hours ago [-]
I'm not going to use webp or jpeg-xl.
RankingMember 20 hours ago [-]
Yeah they're both a pain in that there's friction at all to open/use them.
gjsman-1000 23 hours ago [-]
JPEG XL is superior... just like how Betamax was visually superior.
greenavocado 23 hours ago [-]
The main problem with your argument is that people want JPEG-XL. The main reason it is not in Chrome is purely Jim Bankoski's limited judgement, intelligence, and foresight.

https://groups.google.com/a/chromium.org/g/blink-dev/c/WjCKc...

gjsman-1000 23 hours ago [-]
Just like how some people wanted Betamax. Politics drives adoption; not technical superiority.
izacus 22 hours ago [-]
"People" here are a tiny minority of techies that have emotional connection to the library they built and a group of OSSers which will take anything that bashes Google as a gospel.

Everyone else... is fine with JPEG. And occasionally shares HEIC pictures from their iPhones.

greenavocado 23 hours ago [-]
Thankfully, unlike people and physical hardware, software and algorithms can lie in wait in perpetuity until they are resurrected when the political winds finally change course or favorable conditions for their spread emerge.
throw_m239339 22 hours ago [-]
> Thankfully, unlike people and physical hardware, software and algorithms can lie in wait in perpetuity until they are resurrected when the political winds finally change course or favorable conditions for their spread emerge.

XHTML 2 is waiting on that one... Oh well...

gsich 17 hours ago [-]
>just like how Betamax was visually superior.

Only for a very brief period. (Beta 1)

nottorp 4 hours ago [-]
> Thirty-eight years later, we’re still using the GIF—but it never rose to the same prevalence of JPEG.

> The GIF was a de facto standard. The JPEG was an actual one

I know this is IEEE but seriously? JPEG won because it was a formal standard?

Didn't JPEG win because it supported more than 256 colours and, with the lossy compression, greatly reduced file size and bandwidth needs for our cat photo and porn collections?

While proposed replacements ... solve what problems?

tsoukase 14 hours ago [-]
There are only two kinds of image file formats: the ones people complain about and the ones nobody uses.
77pt77 23 hours ago [-]
How come jpeg 2000 never became popular?
izacus 22 hours ago [-]
In 99% cases of "Why wasn't this (better) format adopted?" the answer is:

* It's not actually better.

* It's patented/requires license/is owned by someone who wants a lot of royalties.

JPEG2000 is of the second variety.

SimplyUnknown 23 hours ago [-]
Multiple reasons: while technically better, with more benign compression artifacts, it is computationally more expensive, offers limited quality improvements, was encumbered by patents, and has a poor metadata format and poor colorspace support... In the end, the benefits aren't great enough compared to JPEG to change the default format.
AshleysBrain 23 hours ago [-]
IIRC JPEG2000 was never supported by any browser other than Safari, and even Safari recently gave up and removed support (around the same time they added support for JPEG XL). As to why other browsers never supported it, I'm not sure.
Maken 23 hours ago [-]
JPEG 2000 used to be a patent minefield.
meindnoch 23 hours ago [-]
Greed. Wavelets used to be heavily patented.
llm_nerd 23 hours ago [-]
Orthogonal, but one fun thing about JPEG 2000 is that when you watch a movie at movie theatres now, odds are overwhelming that you are watching a sequence of JPEG 2000 encoded images.
77pt77 22 hours ago [-]
So like MJPEG but for JPEG 2000?

That doesn't even make much sense, because you lose inter-frame compressibility.

meindnoch 20 hours ago [-]
Every frame is a keyframe in digital cinema. It's literally a bunch of JPEG2000 files, wrapped in an MXF container. A typical 2hr movie is in the 300-400GB ballpark.
martin_a 22 hours ago [-]
As others have pointed out, JPEG is just fine. It's "enough" in the best sense of the word and gets its job done. It's supported on every device, in every browser, image viewer and whatnot. It. Just. Works.

Maybe there are formats that compress better or losslessly, but thanks to advancements in disk space and transfer rates (I know, not everywhere, but penetration and improvement will happen...) the disadvantages of JPEG can be handled and we can just enjoy a very simple file format.

In an era where enshittification lingers around every corner, I'm just happy that I don't need to think about whether I have to convert _every digital picture I've ever taken_ into some next-gen format because some license runs out or whatnot. It just works. Let's enjoy that and hope it sticks around for 30 more years.

politelemon 23 hours ago [-]
The last part of the article, about JPEG being outpaced, seems anecdotal or unsubstantiated; it needs context.
hoseja 9 hours ago [-]
There is a photo example for quality levels throughout the article, but instead of classical JPEG compression artifacts it just seems to get progressively more posterized. What's up with that?
jmyeet 23 hours ago [-]
So there are three use cases for public image dissemination:

1. Lossy: JPEG fills this role;

2. Lossless: this was GIF but is now PNG; and

3. Animated: GIF.

So for a format to replace JPEG, it must bring something to the table for lossy compression. Now that JPEG is patent-free, any new format must be similarly unencumbered. And it's a real chicken-and-egg problem in getting support on platforms such that people will start using it.

I remember a similar thing happening with YouTube adding VP9 (IIRC) support as an alternative to H.264, which required an MPEG LA patent license. MPEG LA also tried to cloud VP9 by saying it infringed on their patents anyway. No idea if that's true or not, but nobody wants that uncertainty.

Anyway, without total support for VP9 (which Apple devices didn't have, for example) Youtube would need to double their storage space required for videos by having both codecs. That's really hard to justify.

Same goes for images. You then need to detect and use a supported image format... or just use JPEG.

cadamsdotcom 14 hours ago [-]
“Because it has a 30 year head start. And there are better, new, formats coming all the time; but everyone supporting them hates each other.”
shmerl 17 hours ago [-]
I'm trying to replace jpeg with avif in my use cases, but some sites still don't handle avif.

Even GitHub! Though the latter doesn't support IPv6 either.

23 hours ago [-]
echelon 23 hours ago [-]
Nothing supports WebP.

Most websites break with WebP. Desktop tools choke on WebP.

It sucks, because it's a good format.

JonnyReads 23 hours ago [-]
I've recently been battling with GitHub not having support for WebP. Seems odd for a website not to support an image format when the browser does.
afavour 23 hours ago [-]
> Most websites break with WebP

That part at least isn't true.

edflsafoiewq 23 hours ago [-]
Probably they mean uploading WebPs where an image is expected often doesn't work.
echelon 22 hours ago [-]
Yes, I was referring to the backends.

A funny case in point: Sora.com produces WebP outputs, but you can't turn around and use them as inputs. (Maybe they've fixed that?)

Smaller websites almost always reject them. Even within big websites, support is fractured within the product surface area. You can't use them as Reddit profile icons, for instance.

One of the most apparent issues is that a lot of thumbnailing and CDN systems don't work natively with WebP, so you have to reject WebP outright until broader support is added.

Once the WebPs are in your systems, you have to make sure everything downstream can support them... it's infectious.

Really unfortunate that we haven't been able to move past this.

martin_a 22 hours ago [-]
You can think what you want about WordPress, but AFAIK it does not allow uploading WebP files to its media gallery without additional plugins, while being said to run large parts of the internet...
celsoazevedo 12 hours ago [-]
Just tested on my self-hosted WordPress blog and it works without any plugins.

WebP support was added to WordPress in 2021 with version 5.8: https://wordpress.org/documentation/wordpress-version/versio...

AVIF support in 2024, with v6.5: https://make.wordpress.org/core/2024/02/23/wordpress-6-5-add...

martin_a 6 hours ago [-]
Ok, will have to check again, thanks for the info!
20 hours ago [-]
neuroelectron 16 hours ago [-]
I think it's about time we switched from JPEGs to Rust.