
Firefox could (should?) be better in several respects, but it seems excessive to say it is pretty irrelevant.

It has 4.5% market share in Europe, 9% in Germany (statcounter numbers).

It is the browser that got the Google Labs folks to write a Rust jxl decoder for it, and now, thanks in part to that, Chrome is re-adding support for jxl.

You can be unhappy with Firefox (I often am myself), and Firefox HAS lost relevance, but can you really say it has become pretty irrelevant?


I didn’t pay close attention to the domain and I thought it was the other one:

https://moderncss.dev/

One of the best educational resources for modern CSS.

BTW, one of the reasons I love modern CSS is front-end performance. Among other things, it allows you to make smaller DOMs.

I talk about a modern CSS technique that does that here:

https://op111.net/posts/2023/08/lean-html-markup-with-modern...

It is an idea I started playing with when custom properties landed in browsers, around 2016 or 2017? Around 2021 I started using the technique in client sites too.

Now I want to write a 2026 version of the post that talks about container queries too. The technique becomes more powerful if you can rely on container queries and on the cqw unit. (You cannot always. That stuff is still new.)

For an example of the convenience cqw offers if you can rely on it, see the snippets I have in this:

https://omnicarousel.dev/docs/css-tips-know-your-width/
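
To give a rough idea here too, a simplified sketch (made-up class names; not the actual snippets from the docs):

    /* The carousel element is a size container, so cqw units
       inside it resolve against its inline size. */
    .carousel {
      container-type: inline-size;
    }

    /* Slide width computed entirely in CSS: no JavaScript
       measurements and no extra wrapper elements in the DOM. */
    .carousel .slide {
      --slides-visible: 3;
      --slide-gap: 1rem;
      width: calc(
        (100cqw - (var(--slides-visible) - 1) * var(--slide-gap))
          / var(--slides-visible)
      );
    }

Without container-query support you would need JavaScript measurements or extra wrappers for the same result, which is exactly the DOM weight the technique avoids.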


I don’t believe resips (residential proxy IPs used for crawling) will be with us for long, at least not to the extent they are now. There is pressure and there are strong commercial interests against the whole thing. I think the problem will solve itself, at least in part.

Also, I always wonder about Common Crawl:

Is there something wrong with it? Is it badly designed? What is it that all the trainers cannot find there, so that they need to crawl our sites over and over again for the exact same stuff, each on its own?


Many AI projects in academia and research get all of their web data from Common Crawl -- in addition to the many non-AI uses of our dataset.

The folks who crawl more appear mostly to be people doing grounding or RAG, and AI companies who think they can build a better foundation model by going big. We recommend that all of these folks respect robots.txt and rate limits.


Thank you!

> The folks who crawl more appear mostly to be people doing grounding or RAG, and AI companies who think they can build a better foundation model by going big.

But how can they aspire to do any of that if they cannot build a basic bot?

My case, which I know is the same for many people:

My content is updated infrequently. Common Crawl must have all of it. I do not block Common Crawl, and I see it (the genuine one from the published ranges; not the fakes) visiting frequently. Yet the LLM bots hit the same URLs all the time, multiple times a day.

I plan to start blocking more of them, even the User and Search variants. The situation is becoming absurd.
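
For anyone who wants to do the same, the robots.txt side is simple. A sketch with the user-agent tokens the big vendors currently document (check their docs for the up-to-date lists before copying):

    # Training, grounding, and search crawlers from the big LLM vendors.
    User-agent: GPTBot
    User-agent: ChatGPT-User
    User-agent: OAI-SearchBot
    User-agent: ClaudeBot
    User-agent: Claude-User
    User-agent: Claude-SearchBot
    User-agent: PerplexityBot
    Disallow: /

    # Common Crawl stays welcome.
    User-agent: CCBot
    Allow: /

Of course this only helps against the bots that honor robots.txt; the fakes need blocking at the server level.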


Well, yes, it is a bit distressing that ill-behaved crawlers are causing a lot of damage -- and collateral damage, too, when well-behaved bots get blocked.

I published some benchmarks recently:

https://op111.net/posts/2025/10/png-and-modern-formats-lossl...

I compare PNG and the four modern formats, AVIF, HEIF, WebP, JPEG XL, on tasks/images that PNG was designed for. (Not on photographs or lossy compression.)


It seems like the natural categories are (1) photographs of real things, (2) line art, (3) illustration-style images, (4) text content (e.g., from a scanned document).

Is there a reason you used only synthetic images, i.e., nothing from group 1?


Hey, tasty_freeze!

The motivation behind the benchmarks was to understand what the options are today for optimizing the types of image we use PNG for, so I used the same set of images I had used previously in a comparison of PNG optimizers.

The reason the set does not have photographs: PNG is not good at photographs. It was not designed for that type of image.

Even so, the set could do with a bit more variety, so I want to add a few more images.


Would be nice to also see decompression speed and maybe a photo as a bonus round.


Yeah.

Numbers for decompression speed are one of the two things I want to add.

The other is a few more images, for more variety.


Max memory required during decompression is also important. Thanks for sharing this research.


https://op111.net - My blog

https://omnicarousel.dev - Docs and demos site for Omni Carousel, a library I wrote recently


I did that recently for a couple of personal projects, and I like it. I think I will start doing it for client sites too.

https://omnicarousel.dev

The main navigation menu is just above the site footer in the HTML document.

Question for people who know that stuff:

What is the recommended way of hiding features that require JavaScript on browsers that do not support JavaScript, e.g., on w3m?


"What is the recommended way of hiding features that require JavaScript on browsers that do not support JavaScript, e.g., on w3m?"

You can try the <noscript> tag.
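
For example (made-up class name):

    <!-- In the document head: -->
    <noscript>
      <style>
        /* Scripts are off: hide anything that needs them. */
        .needs-js { display: none; }
      </style>
    </noscript>

    <!-- In the body: -->
    <button class="needs-js" type="button">Next slide</button>

The inverse also works: hide such features by default and have the script that powers them reveal them.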


> The main navigation menu is just above the site footer in the HTML document.

Just letting you know, that stuff is a bit confusing to screen reader users.

Though I really wish we standardized on putting content first, like mobile apps do. At least we wouldn't have to explain to new screen reader users why getting to the f???ing article is so damn hard if you don't know the right incantations to do it quickly.


Thank you!

Would a “Jump to navigation” link next to “Skip to content” make this arrangement better for screen reader users?
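
Something like this is what I have in mind (a sketch; the ids are made up):

    <body>
      <a href="#content">Skip to content</a>
      <a href="#site-nav">Jump to navigation</a>

      <main id="content">
        <!-- page content -->
      </main>

      <nav id="site-nav">
        <!-- main navigation menu -->
      </nav>
    </body>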


Do you know what user agent the browsers send?

I tried with Windows 7 (Firefox 115) and it reports Windows 7.

It seems though that it cannot distinguish between Windows 10 and Windows 11, so, without looking further, I suppose the detection is based on the User-Agent string? (The OS version browsers report on Windows is frozen, so Windows 10 and Windows 11 have the same version there.)
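
For reference, this is roughly what Firefox sends on both Windows 10 and Windows 11 (the platform token is frozen at “Windows NT 10.0”):

    Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:115.0) Gecko/20100101 Firefox/115.0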


I was working on a carousel library a few months ago. I had made a few stress-test demos so that I could catch obvious issues while I was adding things and tweaking things.

One carousel there had 16K slides.

On Windows both Chrome and Firefox managed that fine. They scrolled from start to end and back without issue, and I think you could see all the frames on my 60 Hz screen.

On GNOME and X11 (dual boot, so same hardware) Chrome was fine but there were issues with Firefox. I was curious so I logged out and logged in with Wayland. On Wayland Firefox was fine too, indistinguishable from Chrome.

I don’t understand hardware, compositors, etc., so I have no idea why that was, but it was interesting to see.


Firefox remains very conservative on enabling modern features on X11. Some distributions force them on, but otherwise it's up to the user to figure out how to do that.

It's likely that some hwaccel flag in about:config wasn't turned on by default. Similarly, if you want smooth touchpad scrolling, you need to set MOZ_USE_XINPUT2.
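
For example, to test it for a single session, or to make it stick:

    # one-off test from a terminal
    MOZ_USE_XINPUT2=1 firefox

    # or persist it, e.g. in ~/.profile
    export MOZ_USE_XINPUT2=1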


Oh! That’s interesting. Thank you.

My main Firefox in that setup is from the Mozilla repos, rather than the ESR version that is the default in Debian stable. So, it could very well be that. I will have to check to see what the ESR Firefox from the Debian repos does.


> Firefox remains very conservative on enabling modern features on X11.

So, old-school throttling if you don't use the "right" version (Apple batterygate, Microsoft wordperfectgate). They could blame it on testing, though (we only use Wayland and we are too lazy to test the X11 version).


The standard video element is really nice:

https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...

I have used it on a couple of client sites, and it works really well.

You can even add a thumbnail that shows before the video starts downloading/playing (the poster attribute). :-)
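
A minimal example (file names are placeholders):

    <video controls preload="none"
           poster="clip-poster.jpg"
           width="1280" height="720">
      <source src="clip.mp4" type="video/mp4">
      Sorry, your browser does not support embedded videos.
    </video>

With preload="none" plus a poster, nothing from the video itself is downloaded until the visitor presses play.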


Author here.

I ran benchmarks comparing PNG, AVIF, HEIF, JPEG XL, and WebP for lossless compression of graphics images. Tested with 14 images.

The results are also available in a TXT file:

https://op111.net/files/2025/10/op111-20251015-png-modern-fo...

...and in a Google Sheets document:

https://docs.google.com/spreadsheets/d/1mwaHeIsDrNhE3NTKtszK...
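
If anyone wants to run something similar, the lossless invocations look roughly like this (simplified; not necessarily the exact settings I used):

    cjxl -d 0 in.png out.jxl                 # JPEG XL, distance 0 = lossless
    cwebp -lossless in.png -o out.webp       # WebP
    avifenc --lossless in.png out.avif       # AVIF
    heif-enc --lossless -o out.heic in.png   # HEIF (libheif)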

Cheers!


The takeaway I get from that is "AVIF sux", which is my general feeling about AVIF.

My own interest is in publishing images that I took with my DSLR and having them look like images I took from my DSLR. People like to show me

https://jakearchibald.com/2020/avif-has-landed/

to prove I'm wrong. It's true that the AVIF image is small and doesn't have the obvious blocking artifacts that JPEG and WebP do, but if you look really closely at the reflections on the upper wing of the car, it looks like AVIF just made up some probable-looking blobs of light that don't look that much like the original.

The thing is, a good video codec doesn't have to be good for still images. For instance, a single frame of VHS video looks atrocious, but an actual video on VHS isn't that bad.

When I tried to use AVIF to make files of the quality I wanted, I didn't see a clear benefit over WebP; on the contrary, I came to the conclusion that WebP was a good drop-in replacement for JPEG for my application. Though if I wanted a big splash image for my web site that didn't have to hold up to close inspection, AVIF's compression ratio is really high.


:-)

My interest in doing the benchmarks was the other thing:

Seeing what the options are these days for the types of image PNG was designed for.

As the results started accumulating, I wasn’t sure whether I should include all formats in the post, the TXT file, and the spreadsheet, because testing them at what they were not designed for did not seem fair.

Do you think I should add something stronger or more prominent to my intro to explain this?


Here's my take on that article, as a web developer who primarily cares about formats widely supported by web browsers.

For most purposes where I would have used a PNG, I might use lossless WebP now, because lossless WebP seems to beat PNG pretty solidly. My take also is that WebP is a good JPEG replacement.

JPEG XL usually does better, but practically that doesn't matter much, because I think the only web browser that supports it is Safari:

https://caniuse.com/?search=jpeg+xl

Of course there is a lot of politics around JPEG XL: specifically, Google doesn't want us to have it, and they're a monopolist, so we can't have it. If there is any chance we're going to change that, we're going to have to document that JPEG XL really is better than the alternatives, and your article does that.
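
Until that changes, you can at least ship JPEG XL progressively; browsers skip source entries whose type they don't support (file names made up):

    <picture>
      <source srcset="figure.jxl" type="image/jxl">
      <source srcset="figure.webp" type="image/webp">
      <img src="figure.png" alt="description of the figure"
           width="800" height="600">
    </picture>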

