
> The actions of this administration are positively communist (in the most cynical, fear-mongered notion about communism), from enlisting tech execs in the military, to demanding complete control over all commerce (including demanding ownership stakes), to absolute chilling control over speech.

It is more fascist than communist. The communist way would have been to make the state take control of companies and then put apparatchiks at the top. This is the other way around, with companies taking over the state instead. But yeah, it’s nitpicking at this point.


Bzip2 is slow. That’s the main issue. Gzip is good enough and much faster. Also, the fact that you cannot get a valid bzip2 file by cat-ing 2 compressed files is not a deal breaker, but it is annoying.

Gzip is woefully old. Its only redeeming value is that it's already built into some old tools. Otherwise, use zstd, which is better and faster, both at compression and decompression. There's no reason to use gzip in anything new, except for backwards compatibility with something old.

> Otherwise, use zstd, which is better and faster

Yes, I do. Zstd is my preferred solution nowadays. But gzip is not going anywhere as a fallback because there is a surprisingly high number of computers without a working libzstd.
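That fallback pattern is easy to sketch in Python. This is a minimal illustration, assuming the third-party `zstandard` package as the zstd binding; `make_codec` is a made-up helper name, not anyone's actual API:

```python
import gzip


def make_codec():
    """Prefer zstd when a binding is available, fall back to gzip otherwise."""
    try:
        import zstandard  # third-party; may be missing on older systems
        compressor = zstandard.ZstdCompressor()
        decompressor = zstandard.ZstdDecompressor()
        return compressor.compress, decompressor.decompress
    except ImportError:
        # gzip is in the standard library, so it is always there
        return gzip.compress, gzip.decompress


compress, decompress = make_codec()
payload = b"some repetitive payload " * 100
assert decompress(compress(payload)) == payload
```

The caller gets the same two-function interface either way, which is roughly how "zstd with a gzip fallback" tends to look in practice.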


One other redeeming quality that gzip/deflate does have is that its low memory requirements (~32 KB per stream). If you're running on an embedded device, or if you're serving a ton of compressed streams at the same time, this can be a meaningful benefit.
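For illustration, deflate's window is at most 32 KiB and can even be shrunk further via zlib's `wbits` parameter. A quick sketch in Python; the 512-byte window here is just an example value:

```python
import zlib

data = b"the quick brown fox jumps over the lazy dog " * 200

# Raw deflate with a 2**9 = 512-byte window instead of the default 32 KiB
# (negative wbits means "no zlib header", absolute value sets the window size).
co = zlib.compressobj(level=6, wbits=-9)
compressed = co.compress(data) + co.flush()

# The decompressor must be told to use the same raw format and window.
do = zlib.decompressobj(wbits=-9)
assert do.decompress(compressed) == data
```

A smaller window hurts the compression ratio on data with long-range redundancy, but it is one of the knobs that makes deflate viable on tiny embedded targets.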

The claim that zstd is "better and faster", without additional qualifications, is false and misleading.

Indeed for many simple test cases zstd is both better and faster.

Despite that, it is possible to find input files for which zstd is either worse or slower than gzip, for any zstd options.

I have tested zstd a lot, but eventually I have chosen to use lrzip+gzip for certain very large files (tens of GB file size) with which I am working, because zstd was always either worse or slower. (For those files, at the same compression ratio and on a 6-year old PC, lrzip+gzip has a compression speed always greater than 100 MB/s, while zstd only of 30 to 40 MB/s.)

There are also other applications, where I do use zstd, but no compressing program is better than all the others in ALL applications.


bzip2 is particularly slow because the transform it depends on (the BWT, or Burrows–Wheeler transform) is "intrinsically slow": it depends on cache-unfriendly operations with long dependency chains, preventing the CPU from extracting any parallelism:

https://cbloomrants.blogspot.com/2021/03/faster-inverse-bwt....


> the fact that you cannot get a valid bzip2 file by cat-ing 2 compressed files

TIL. So that's why gzip has a file header! But tar.gz compresses even better; that's probably why concatenating gzip files hasn't caught on.


tar packs multiple files into one. If you concatenate two gzipped files and unzip them, you just get a concatenated file.

Ah okay, I thought gzip would support decompressing multiple files that way.

How it works is, if you have two files foo.gz and bar.gz, and cat foo.gz bar.gz > foobar.gz, then foobar.gz is a valid gzip file and uncompresses to a single file with the contents of foo and bar.

It’s handy because it is very easy to just append stuff at the end of a compressed file without having to uncompress-append-recompress. It is a bit niche but I have a couple of use cases where it makes everything simpler.
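The behaviour described above is easy to check with Python's standard `gzip` module (a quick sketch, not tied to any particular tool):

```python
import gzip

foo = gzip.compress(b"foo")
bar = gzip.compress(b"bar")

# The byte-level concatenation of two gzip files is itself a valid
# gzip file consisting of two members...
combined = foo + bar

# ...and decompressing it yields the concatenation of the two payloads.
assert gzip.decompress(combined) == b"foobar"
```

This is exactly the append-without-recompressing trick: compress just the new data and append the resulting member to the existing `.gz` file.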


tar supports that type of concatenation, so you can concatenate tar.gz files and unpack them all into separate files

I know, but I've been always confused why a gzip file would have a filename field in its header if it's supposed to contain only one file. Obviously it's good to keep a backup of original filename somewhere, but it's confusing nonetheless.
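For the curious, that name lives in the optional FNAME field of the gzip header (RFC 1952). A small sketch of reading it back with Python's standard library; the `stored_name` helper is made up for illustration:

```python
import gzip
import io
import struct


def stored_name(blob: bytes):
    """Return the FNAME field of a gzip header, if present (RFC 1952)."""
    assert blob[:2] == b"\x1f\x8b", "not a gzip stream"
    flg = blob[3]
    pos = 10                       # fixed-size part of the header
    if flg & 4:                    # FEXTRA: skip the extra field
        xlen = struct.unpack_from("<H", blob, pos)[0]
        pos += 2 + xlen
    if flg & 8:                    # FNAME: zero-terminated original name
        end = blob.index(0, pos)
        return blob[pos:end].decode("latin-1")
    return None


buf = io.BytesIO()
with gzip.GzipFile(filename="report.txt", mode="wb", fileobj=buf) as f:
    f.write(b"hello")
assert stored_name(buf.getvalue()) == "report.txt"
```

So the field really is just a convenience backup of the original name, for tools that want to restore it on decompression.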

the catting issue might be more a problem with the bzip2 program's implementation than with the algorithm (it could expect an array of compressed streams). that would only be impossible if the program cannot work out the length of the data from the file header, which again is technically not about the compression algo but rather the file format it's carried in.

that being said, speed is important for compression, so for systems like webservers etc. it's an easy sell ofc. very strong point for gzip (and for smarter implementations in programs)


Bzip2 is great for files that are compressed once, get decompressed many times, and the size is important. A good example is a software release.

So is xz, or zstd, and the files are smaller. bzip2 disappeared from software releases when xz was widely available. gzip often remains, as the most compatible option, the FAT32 of compression algorithms.

Huh? Only if it gets decompressed a few times, I would say, because it's so extremely slow at it

Like a software installation that you do one time. I'd not even want it for updates if I need to update large data files. The only purpose I'd see is the first-time install where users are okay waiting a while, and small code patches that are distributed to a lot of people

(Or indeed if size is important, but then again bzip2 only shines if it's text-like. I don't really find this niche knowledge for a few % optimization worth teaching tbh. Better teach general principles so people can find a fitting solution if they ever find themselves under specific constraints like OP)


> the catting issue might be more an implementation of bzip program problem than algorithm (it could expect an array of compressed files). that would only be impossible if the program cannot reason about the length of data from file header, which again is technically not something about compression algo but rather file format its carried through.

Long comment to just say: ‘I have no idea about what I’m writing about’

These compression algorithms do not have anything to do with filesystem structure. Anyway the reason you can’t cat together parts of bzip2 but you can with zstd (and gzip) is because zstd does everything in frames and everything in those frames can be decompressed separately (so you can seek and decompress parts). Bzip2 doesn’t do that.

So like, another place bzip2 sucks ass is working with large archives, because you need to read through the entire archive before you can decompress it, and it makes situations without parity data way more likely to cause data loss of the whole archive. Really, don't use it unless you have a super specific use case and know the tradeoffs; for the average person it was great back when we would spend the time compressing to save the time sending over dialup.


> zstd does everything in frames and everything in those frames can be decompressed separately (so you can seek and decompress parts). Bzip2 doesn’t do that.

This isn't accurate.

1) Most zstd streams consist of a single frame. The compressor only creates multiple frames if specifically directed to do so.

2) bzip2 blocks, by contrast, are fully independent - by default, the compressor works on 900 kB blocks of input, and each one is stored with no interdependencies between blocks. (However, software support for seeking within the archive is practically nonexistent.)


So... it's actually a reasonable objection over bzip2? I mean, you explained why it does not work with bzip2.

I think their argument is sound, and it makes using bzip2 less useful in certain situations. I was once saved in resolving a problem we had when I figured out that concatenating gzipped files just works out of the box. If not, it would have meant a bit more code, lots of additional testing, etc.


bzip and gzip are both horrible, terribly slow. Wherever I see "gz" or "bz" I immediately rip that nonsense out for zstd. There is such a thing as a right choice, and zstd is it every time.

> Wherever I see "gz" or "bz"

That should not happen too often, considering that IIRC bzip lasted only a couple of months before being replaced by bzip2.


lz4 can still be the right choice when decompression speed matters. It's almost twice as fast at decompression with similar compression ratios to zstd's fast setting.

https://github.com/facebook/zstd?tab=readme-ov-file#benchmar...


pigz is damn fast at compressing. Also, a VAX with NetBSD can run gzip. So there it is. Go try these new fancy formats on a VAX, I dare you.

And, yes, I prefer LZMA over the obsolete bzip2 any day, but gzip is like the ZIP of free formats, modulo packaging, which is TAR's job.


> It's not enshrined in some document they got together and wrote down like the US constitution

It’s also very brittle and one charismatic populist away from unraveling like the American government. Too much depends on gentlemen agreements and people trusting other people to do the right thing. It works in a stable environment, but shatters the moment someone with no shame and no scruples shows up.


There's really no way around the possibility that whatever you've written down in your constitution will be ignored in the heat of the moment, or become degraded over time.

But you don't need to put the military under the direct command of the civilian president like the US does. If parliament can take military action against the civilian president and civilian action against the military leader, then it has ways to deal with both.

The American president is too powerful to deal with, since he controls both the civilian and the military side.


This is the one argument left for monarchy: that the military in the UK (and technically Australia) swear loyalty to the monarch, not the Prime Minister. In the event of an obviously-lunatic elected official ordering the troops into civilian areas to "pacify" civilian populations, the monarch could (in theory) countermand that order.

Isn't that worse? You don't even get to elect the commander in chief; it's just some random guy who was born into it?

The monarch being Commander in Chief is ceremonial. Everything is done on the advice of the Prime Minister and their cabinet.

The chance of the monarch overriding said request is less than 1%.

Even then, parliament is sovereign. While the logistics are complicated due to how things are introduced to the house, if parliament says no to a prime minister's decision, that overrides the prime minister, who has no absolute power the way a president does.


Monarchists can't have it both ways, though. Making him a ceremonial CiC isn't going to provide you with much of a bulwark against abuse of power by parliament. Or he isn't ceremonial and he could become a threat himself.

I didn't say it was a good argument ;)

Personally I love the idea that the codes for nukes are surgically implanted in a volunteer, and in order to issue the order to fire the nukes, the CIC must personally carve the codes from that person's chest with a knife, killing them in the process. Or the variant on that idea, that the codes are implanted in their own forearm, and to order the nukes they must cut the codes from out of their own flesh.

We could do the same for all military deployment orders.


There's a mechanism by which Congress can remove the president if he gets out of control.

This just happened.

The government unilaterally, against the country's prevalent feelings towards this illegal war of aggression, permitted the USA to use British bases, and if I'm not mistaken, without so much as a parliamentary vote.


Most western democracies have exactly the same fault; maybe having unscrupulous, shameless legislators is the end state of the current models of democracy being practiced.

While no democratic system is completely protected from tyrants, at least the UK (and the Commonwealth nations who inherited its principles) uses the living tree doctrine in its courts, which means that the written text is not sacrosanct and intention and usage are to be considered. On top of that, unwritten tradition has the force of law and can be challenged in court; look at the reversal of Boris Johnson's prorogation as an example.

> It’s also very brittle and one charismatic populist away from unraveling

All sufficiently large governments (really all organizations of any kind) are necessarily like this, from the most successful attempts at open societies to the most autocratic. They all require constant vigilance both to perform their intended function and to preserve themselves into the future.


Strong disagree. It's uncontested that supreme authority lies with parliament, not with the leader of the day. PM can't do shit if parliament doesn't want him to, because they can always simply change the rules on him.

Constitution and laws are just pieces of paper. They only matter if the population acts as if they matter. Liberia has the same Constitution as the US.

But they're cycled through much more rapidly, and seem generally more vulnerable than the dictators in the U.S. or elsewhere. A small concession, to be sure.

It seems like a fundamental failure of government that in many cases, there are no consequences for deliberately or accidentally screwing your people. You either get murdered eventually or the country is just left to fix itself later, which disproportionately affects people with little resources.


Being able to vote in a strong leader to fix things directly is a feature. Democracy is not always the answer and when it is it can be too slow when time matters.

For “fixing things” there are well-defined mechanisms such as a state of emergency or a declaration of war.

Those give the leader extra powers, but do not give them carte blanche to do as they wish. Those extra powers are limited to handle specific problems.


I think it's quite obvious now how this leads to the erosion of rights and laws, looking at the current state of the affairs.

That's the point? Adding laws and rights is not necessarily a good thing. People tried to work towards a local maximum, but it turns out that the approach is no good, so it needs to be torn down and another direction of hill climbing needs to be tried. Or the circumstances where a law made sense are no longer the same. Problems that the lawmakers did not foresee may come into the picture.

Britain's problems are due to uncharismatic Blairite socialists.

This comment may or may not be wrong but it is quintessentially low effort.

The point of HN is to discuss, not to tweet about your political enemies.


I'm parroting back the opposite of the original reply, which was upvoted

That leaves me to conclude HN is a left leaning circle jerk echo chamber, much like reddit. With any dissent to the right triggering the non-hateful liberal lefties.


A populist far-right racist would fix all the potholes and bring HS2 in under budget? Got it.

You don't understand the core issue at heart in Britain.

The real distraction is the economic argument. The truth of the matter is that natives feel like strangers in their own country. I say this as someone who is mixed race and 2nd gen, before you try and label me a racist. Yawn.


You need to change your social media algorithm.

All of them? Hmmm.

I don't know much about UK politics but I definitely know enough to know that there's no such thing as a "Blairite socialist".

> You could argue the House of Lords did the same

It can still do the same thing without hereditary peers. A slow-moving, conservative (in the classical sense) upper chamber is a classic in bicameral systems, it is not specific to the House of Lords.


It’s an interesting observation, but I think you have it backwards. The examples you give are all using discrete symbols to represent something real and communicating this description to other entities. I would argue that all your examples are languages.

> The fact that models aren't continually updating seems more like a feature.

I think this is true to some extent: we like our tools to be predictable. But we’ve already made one jump by going from deterministic programs to stochastic models. I am sure the moment a self-evolutive AI shows up that clears the "useful enough" threshold we’ll make that jump as well.


Stochastic and unpredictability aren't exactly the same. I would claim current LLMs are generally predictable even if it is not as predictable as a deterministic program.

No, but my point is that to some extent we value determinism. By making the jump to stochastic models we already move away from the status quo; further jumps are entirely possible. Depending on use case we can accept more uncertainty if it comes with benefits.

I also don’t think there is a reason to believe that self-learning models must be unpredictable.


> @dang can you revert to actual title please?

This does not work. From the guidelines:

> Please don't post on HN to ask or tell us something. Send it to hn@ycombinator.com.

https://news.ycombinator.com/newsguidelines.html


We must limit the problem, then adapt and mitigate. Some damage is irreversible, it does not mean that it’s a good idea to stop trying to understand what will happen. You don’t stop weather forecasts when a hurricane touches land just because it’s going to happen anyway.

Reality is not binary. There’s a whole spectrum of situations between "everything gets back to normal and all is well" (which was never on the cards after the 1980s) and "all humans die within a century". And the nuances in between still affect billions of people.


From what I understood, we are closer to "all humans will die within a century", and if that is the case, then what's the point of doing anything? Does it matter if our effort just delays it by, let's say, 50 years? The radical change we'd need to make if we are serious is going back to caves. Zero consumption. I highly doubt anyone will do that.

> This ship has sailed, warming is irreversible.

Nobody who understands the subject claims that it is reversible on a human life scale. In the realistic best cases, it’d stabilise in a couple of decades and slowly decrease from there.

The real question is not whether it is reversible, but how high it will go and how we are going to deal with it.


That is completely wrong.

First, the West and particularly the US are still well ahead of China regarding both historical total emissions and per-capita annual emissions. And regardless of what China does in the future we still need to get our acts together domestically.

Also, China is aggressively pushing low-carbon energy sources on all fronts. Where they are now is not necessarily an indication of where they will be in a decade or two.

A large part of their emissions is the result of stuff they make for us. If we are serious about climate policy, we have to set up trade barriers proportional to greenhouse gases emissions to limit this effect. These policies must be informed by climate science.

Finally, regardless of what the rest of the world does, mitigation depends only on us and how well prepared we are.

Really, there is absolutely no scenario in which it is not a good idea to understand what the hell is going on with our climate.


If the United States stopped polluting 100%, global pollution would decrease by only about 10%. Probably even less, because much of that pollution would simply be exported through outsourcing. So what happens next?

>If we are serious about climate policy, we have to set up trade barriers proportional to greenhouse gases emissions to limit this effect.

Consumption economies can incentivize production economies to emit less.


Please remember that we are not talking about stopping climate-related policies. The point here is climate-related science. And even if you are an arch-conservative and you assume (despite all observations) that you can do fuck all about it, knowing how things are going to be is very useful if you intend to survive, never mind thrive.

You're arguing a hypothetical where the US stopped all emissions 100% and the rest of the world isn't doing anything.

The reality is that China is aggressively pushing solar and electric vehicles, and the West is complaining about it. Meanwhile the current US president's maxim is "drill baby drill".

I mean, if we don't need to stick to facts, let's discuss the hypothetical scenario where I am a powerful wizard, and when I say a magic word I can halve the total amount of CO2 in the atmosphere?

OK, now where's my Nobel peace prize dammit??


So we do not need to worry. China and India will cut their emission a lot and we US just need to cut a little. Problem solved. /s

From 2022 to 2023 (latest report), China increased their emissions from 11.9 Gt to 12.6 Gt. The US decreased from 4.79 to 4.68 Gt. So together, the US and China increased emissions by about 0.6 Gt.

So we (the world) are polluting more and more and you are telling me that we are on a great trajectory.


Bear with me, I'm adding some nice pictures to this thread.

https://www.theguardian.com/us-news/gallery/2026/mar/05/pict...

