
Well, not to be completely dismissive here... It's clearly a prototype project to try and make quadratic probing a thing.

I'm not convinced this methodology is better than linear probing (which can then be optimized easily into Robin Hood hashes).

The only line I see about linear probing is:

> Linear jumps (h, h+16, h+32...) caused 42% insert failure rate due to probe sequence overlap. Quadratic jumps spread groups across the table, ensuring all slots are reachable.

Which just seems entirely erroneous to me. How can linear probing fail? Just keep jumping until you find an open spot. As long as there is at least one open spot, you'll find it in O(n) time because you're just scanning the whole table.

Linear probing has a clustering problem. But IIRC modern CPUs have these things called L1 Cache/locality, meaning scanning all those clusters is stupidly fast in practice.
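The "just keep jumping until you find an open spot" argument, in code form (a toy open-addressing table for illustration, not the project's implementation):

```python
# Toy open-addressing hash table with linear probing.
# As long as one slot is free, insert always succeeds: the probe
# sequence h, h+1, h+2, ... (mod n) visits every slot in the table.
class LinearProbeTable:
    def __init__(self, capacity=8):
        self.slots = [None] * capacity

    def insert(self, key, value):
        n = len(self.slots)
        h = hash(key) % n
        for i in range(n):                  # scan at most the whole table: O(n)
            idx = (h + i) % n
            if self.slots[idx] is None or self.slots[idx][0] == key:
                self.slots[idx] = (key, value)
                return True
        return False                        # only fails if table is 100% full

    def get(self, key):
        n = len(self.slots)
        h = hash(key) % n
        for i in range(n):
            idx = (h + i) % n
            if self.slots[idx] is None:
                return None                 # empty slot ends the probe chain
            if self.slots[idx][0] == key:
                return self.slots[idx][1]
        return None

t = LinearProbeTable(capacity=4)
for k in ["a", "b", "c", "d"]:
    assert t.insert(k, k.upper())   # fills to 100% load, never fails
assert t.get("c") == "C"
```

Note this probes slot by slot; the article's "h, h+16, h+32" scheme jumps in group-sized strides, which is where its probe sequences can fail to cover the table.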


Linear probing could get pretty nasty corner cases in a concurrent system. Particularly one where the table is “warmed up” at start so that 80% of the eventual size shows up in the first minute of use. If that table is big enough then pressure to increase the load factor will be high, leading to more probing.

If you have ten threads all probing at the same time then you could get priority inversion and have the first writer take the longest to insert. If they hit more than a couple collisions then writers who would collide with them end up taking their slots before they can scan them.


That's surely true of quadratic probing though?

The comments don't make sense to you because you know what you are talking about; Claude does not, and this code was all written by Claude.

Hmmm. That makes me sad but it does explain the uneasy feeling I got when reading the GitHub page

> Anywhere near the coast of China, a warship is within range of truck-mounted anti-ship missiles.[2] Lots of them.

Yes, which is why the DDG(X) class has loads of stealth built in, to make it harder for those missiles to lock on.

One of the most important tools for fighting missiles is... an aircraft carrier. Early warning air systems (E2 Hawkeye), interceptors (F35), mostly for blowing up scouting craft.

Missiles can only home in on what they can detect and see. Blow up their eyes (RADAR systems) and they are flying blind. It's a lot of ocean out there and the horizon is surprisingly short.

Flight is your best way to cover a lot of ocean and find an enemy, but anything flying should be taken out by an F35.

--------

I'm not so against a rail gun or any of these future weapons per se. IIRC Japan has deployed a rail gun and they are an ally, with the right R&D team / licensing we might be able to get a working design.

But you know, that depends on how well Japan's railgun works. Ditto with laser systems and whatnot: as long as we test the crap out of them it's fine to deploy.

> The sinking of the Moskva was the first demonstration of this, and Ukraine has since taken out about eight more Russian warships and many smaller craft, using various missiles and drones.

Moskva is barely comparable to a singular US Destroyer, let alone a cruiser or larger boat.

And USA deploys large teams of Destroyers to help watch each other (and protect the carrier at the core of their fleet).

I'd expect that a drone being launched at a US Carrier strike group would simply be gunned down by the machine guns of an F35, long before they get close to the fleet.

-----

The sinking of the Moskva is also a Russian error. We all know that the Moskva's RADAR system could see the drones. The sad truth is that the Moskva's sailors were themselves unready to watch a RADAR screen for hours, days, months. They likely got fatigue and sounded the alarm too late vs the aerial threat.

Or maybe command was not notified quickly enough. Who knows? Communication error? There's a whole slew of chain of command issues that could have happened.

But we all know that the Moskva has good enough RADAR to see all of those drones. Even in the storm they were in. So it's most likely some kind of human error along the way.

USA, and other NATO forces, have anti-fatigue measures (better software, better training). Furthermore, we run missions vs Houthis and gain battle experience, or also shoot down Iranian missiles on their way to Israel. These missions (exercises??) will keep our sailors in better shape than the awful training the Russians have.


Last government was breaking up Google.

This one is protecting it. How much of a reversal do you need before you can tell the difference?

Did you follow the Google Antitrust case at all?


How is that even relevant

The current US government is reversing literally everything like a small child out of spite. That doesn't mean HN mods are "grifters" who are "in bed" with them


> but I'm not even sure it's that much more friendly than previous gov

Do you take this line back, or did you simply forget you said this? Or is there some fine detail you wish to clarify here.

Because one admin literally suing for the breakup of Google, and the next admin giving the case up with no argument is about as opposite as you can get. The facts simply do not match your assertions


Why would I take it back? Did I lie in it?

Should I assume that breaking up google is bad for YC? And not breaking up google is good for YC? Why? Like having Google around would increase business opportunities for YC funded startups? And therefore HN mods are grifters in bed with trump? Sorry my friend but this is simply insane.

And even if breaking up google actually hurts YC, if one admin is reversing literally everything from the other admin out of principle and this accidentally does some good for you it doesn't mean it's friendly to you.

Sometimes I can't tell here who's more crazy US righties or US lefties...


> Why would I take it back? Did I lie in it?

Just say that the Trump admin is friendlier to Google. Can you do that for me?

Or are you unable to even do that?

> How is that even relevant

If it's not relevant, you should be able to admit it yourself without any issue. So take the statement 'This admin is friendlier to Google than Biden admin was', and make sure your statements are logically consistent with said statement.

That's all I'm asking you to do.

> Sometimes I can't tell here who's more crazy US righties or US lefties...

Don't make this a left or right thing. I'm just seeing if you can admit to the differences between 2024 and 2025 with respect to Google and the antitrust case. It's not like the fine issues of antitrust cases are an issue that the left or right really rallies behind.


"trump is accidentally friendlier to google because he likes to reverse things from biden" doesn't mean trump is friendlier to tech overall.

for example most of what Trump claims he invested in chip production was actually invested by Biden. Trump gives more promises and less actual money.

https://itif.org/publications/2020/09/28/trump-vs-biden-comp...


And with the antitrust case lifted we now see Skydance/Paramount merger. And Netflix/WB merger. And TikTok.

That's a lot of friendly to tech things Trump is doing.

The previous admin likely would have kept the companies separate / brought up antitrust issues.


except for TikTok none of them are really "tech"

I want to say Trump seems a bit friendlier to corporate. But the pro corporate guy would not take from corporate their best cheapest workforce. Maybe his friendly tax policy compensates for his strict immigration policy idk


Larry Ellison literally is stepping into Paramount and providing guarantees for $40,000,000,000+ to try and push Paramount + WB Merger. Literally Larry Ellison.

Are you unaware of any of the tech CEOs who are involved in any of the stuff I was discussing?


And we know it would not happen if it is democrat gov because we have access to alternative histories?

Because under a Democrat Government, TikTok was getting banned.

Under Trump, he kept it long enough for TikTok to be bought by Republican-allies.

This is just stuff from last year. Have you paid attention to any of the companies I've brought up at all? Are you unaware of all the massive $Million++ dinners that every tech bro attended for Trump? Do you know the connections of JD Vance to Ellison and other tech bros?

Have you watched cryptocoins suddenly surge into mainstream with the Trump presidency as Trump literally pardoned those criminals?

What part of this administration is not pardoning cyber criminals or tech bros or otherwise sending money to them?


I can give you that trump is 100% more friendly to crypto. that's for sure

Yes but Spain, England, and France all had decade-long declines that reversed. Except you know, at the end. When it didn't reverse.

We are witnessing the end of... something. Is it the end of the Roman Republic or is this the end of the Roman Empire?

Two very different situations despite being so politically fraught and full of change.


Because of conversion losses, I have to imagine this is subtly very bad.

Every form of lossy compression deletes data. Yes, AV1 is more efficient, but only when working off of high quality originals.

H265 already deleted a ton of data. It can never recover the quality loss. Compressing even further can only worsen the image.


While I agree with you, I find that sometimes the “experience” can improve.

The most common “artifact” of AV1 is to make things slightly more blurry for example. A common H.265 artifact is “blockiness”. I have re-encoded H.265 to AV1 and not only gotten smaller files that playback better on low-end hardware but also display less blockiness while still looking high-resolution and great colour overall.

I always encode 10 bit colour and fast-decode for re-encoding to AV1, even if coming from an 8 bit original.


But then you look at flashback scenes and wonder where the noise has gone.

A lot of movies have purposeful noise, blurriness, snow, and fake artifacts to represent flashback scenes. One level of compression often keeps them okay-ish (like you can tell side by side that it's different, but only when you know what to look for). But these are the scenes that get especially ruined by two layers of compression.


What's the optimal strategy then ? 50 GB Blu-ray remux => 3 GB AV1 ?

50GB gives assurances that the BluRays are high quality (but not always. I've seen some horrible BluRay encodings...)

As long as you are going from high quality sources, you should be fine. The issue is that each transcoding step is a glorified loop: find something we think humans can't see, and delete it.

In other words: the AV1 encoder in your example works by finding 47GBs of data TO DELETE. It's simply gone, vanished. That's how lossy compression works, delete the right things and save space.

In my experience, this often deletes purposeful noise out of animation (there are often static-noise / VHS-like effects in animation and film to represent flashbacks; these lossy encoders think it's actual noise and just delete it all, changing the feel of some scenes).

--------

More importantly: what is your plan with the 50GB BluRays? When AV2 (or any other future codec) comes out, you'll want to work off the 50GB originals and not off the 3GB AV1 compressed copies.

IMO, just work with the 50GB originals. Back them up, play them as is.

I guess AV1 compression is useful if you have a limited bandwidth (do you stream them out of your basement, across the internet and to your phone or something? I guess AV1 is good for that) But for most people just working with the 50GB originals is the best plan


> In other words: the AV1 encoder in your example works by finding 47GBs of data TO DELETE.

With that reasoning, lossless compression of .wav to .flac destroys >50% of data.

In actuality, you can reconstruct much of the source even with lossy compression. Hell, 320kbps mp3 (and equivalent aac, opus, etc) are indistinguishable from lossless and thus aurally transparent to humans, meaning as far as concerns us, there is no data loss.
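The lossless half of that comparison is easy to verify; here zlib stands in for FLAC's lossless coding (same principle, different codec):

```python
import zlib

# Lossless compression: the file shrinks, yet *every* original byte
# is recoverable. No data is destroyed, only redundancy is removed.
data = b"the same byte pattern repeated over and over " * 100
packed = zlib.compress(data)

assert len(packed) < len(data)              # smaller on disk...
assert zlib.decompress(packed) == data      # ...but a perfect round-trip
```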

Maybe one day we'll get to the point where video compression is powerful enough that we get transparent lossy compression at the bit rates streaming services are offering us.

> In my experience, this often deletes purposeful noise out of animation

AV1 specifically analyzes the original noise, denoises the source then adds back the noise as a synthetic mask / overlay of sorts. Noise is death for compression so this allows large gains in compression ratio.


> AV1 specifically analyzes the original noise, denoises the source then adds back the noise as a synthetic mask / overlay of sorts. Noise is death for compression so this allows large gains in compression ratio.

If said noise still exists after H265.

And there's no guarantee that these noise detection algorithms are compatible with H264, H265, AV1, or future codecs H266 or AV2.


AV1 is not about throwing away more data that the human can’t see. It’s about having better tools.

1. the prediction tools of AV1 are better than those of h265. Better angular prediction, better neighboring pixels filtering, an entirely new chroma from luma prediction tool, an intra-block copying tool, more inter prediction tools, non-square coding units.

2. If the prediction is better, the residuals will be smaller.

3. Those residuals are converted to frequency domain with better tools for AV1 as well (more options than just DCT), so that you have a better grouping of coefficients close to the DC component. (Fewer zeros interleaving non-zero values.)

4. Those coefficients compress better, with a better entropy coding algorithm too.

You can have exactly the same video quality for h265 and AV1 yet still have a lower bitrate for the latter and with no additional decision made to “find out what humans can’t see.” The only place in the process where you decide to throw away stuff that humans can’t see is in the quantization of the frequency transformed residuals (between step 3 and 4) and the denoising before optional film grain synthesis.
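That quantization step, the one lossy decision, can be sketched with toy numbers (the coefficients and step size below are made up for illustration, not from any codec):

```python
# Toy version of the lossy step: uniform quantization of
# frequency-domain residual coefficients (between steps 3 and 4).
coeffs = [103.7, 41.2, -8.9, 3.1, -1.4, 0.6, 0.2, -0.1]

def quantize(coeffs, qstep):
    # Divide by the step and round: this rounding is the data loss.
    return [round(c / qstep) for c in coeffs]

def dequantize(levels, qstep):
    return [l * qstep for l in levels]

levels = quantize(coeffs, qstep=4.0)
# Small coefficients collapse to zero -> long zero runs for the
# entropy coder; the rounding error is the (only) quality loss.
recon = dequantize(levels, 4.0)
errors = [abs(a - b) for a, b in zip(coeffs, recon)]
assert max(errors) <= 2.0          # error bounded by qstep / 2
assert levels[-3:] == [0, 0, 0]    # tail zeros compress very well
```

Better prediction (steps 1-2) means smaller residuals going into this quantizer, so AV1 can hit the same quality with coarser data to store: no extra "decide what humans can't see" step needed.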

To be clear: you can of course only go down or stay equal in quality when you transcode, due to rounding errors, incompatible prediction modes etc. That’s not under discussion. I’m only arguing about the claim that AV1 is better in general because you throw away more data. That’s just not true.


Thank you for the detailed answer!

Yes, in general you find the best high quality source you can get your hands on and then compress that. For us lay people, that would currently be any 4k videos with a high bitrate. In such cases, it doesn't matter much that it is already compressed with AVC or HEVC. Sure, when you compress that again at a lower bitrate, there will be some loss of data or quality. But honestly, it doesn't make a discernible difference (after all, you decide what is the video quality acceptable to you by choosing how much more to compress). Ideally, if DVD and Blu-Rays lasted long, we would all just be saving our videos on it. (Assuming there will be any Blu-Ray readers, 10+ years down the line).

Well it's sure gonna get the filesize down though, great HEVC -> AV1 transcoding success..

Yes. Deleting data does wonders for the filesize. The question I'm bringing up is one of quality.

If you must delete, delete starting from the 50GB+ original BluRays if at all possible, or some other very high quality source. That way the compression algorithm has the best chance of saving the important scene data.

And keep an eye on the known hard to encode scenes. A lot of the typical shots of a movie are handled well on one set of settings, but suddenly screw up on other scenes (or other animation styles. Anime vs Cartoons vs 3D vs Live Action can have subtle differences leading to quality issues).

It's not easy, and AV1 is our best bet at doing this well so far. But when the future algorithms come out, you need to start over from the best sources if you want AV2 to have a chance.

You should *Never* double compress. (Blu-ray -> H265 -> AV1). This is horrible for the quality. You'll get better results from BluRay -> AV1 by a large margin.
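The generational-loss argument can be sketched with a toy quantizer; the made-up coefficients and the two mismatched step sizes below stand in for two different codecs' quantizers (e.g. H265 then AV1):

```python
# Toy demo of generation loss: two quantization passes with
# mismatched step sizes lose more than one pass from the source.
source = [103.7, 41.2, -8.9, 3.1, -1.4, 0.6]

def requantize(vals, qstep):
    # Round each value to the nearest multiple of qstep.
    return [round(v / qstep) * qstep for v in vals]

one_pass = requantize(source, 3.0)                     # source -> AV1
two_pass = requantize(requantize(source, 4.0), 3.0)    # source -> H265 -> AV1

def total_error(vals):
    return sum(abs(a - b) for a, b in zip(vals, source))

# Each pass rounds to a different grid, so the errors compound.
assert total_error(two_pass) >= total_error(one_pass)
```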


I continuously get Spanish advertisements despite being unable to speak Spanish. (The only foreign language media I visit on YouTube is Anime/Japanese and German stuff).

There are surely large numbers of dumb cases of being mispredicted out there.


On a foreign language scale, Bluey and Peppa Pig are around B1- or A2+.

Or in other words: a typical adult needs about one year of self study (or nearly 6 months of more focused intensive study) before they can fully understand a show like Bluey or Peppa Pig.

And maybe half that for substantial understanding. (3 months intensive, 6 months typical self study to reach A2+ / watch Bluey with substantial understanding but not complete understanding).

If I were to guess at Mickey Mouse clubhouse, it's damn near A1 or A0+, it's so repetitive and slow that you can learn some words from it.

Yeah, that's a lot more boring than the 'advanced' shows like Bluey or Peppa Pig.

Also note that children are not aware of tools (ie hammers or screwdrivers) yet. So simple learning exercises to know that hammer hammers nail but not screws is the kind of thing needed at pre-school level.

I'd imagine that the appropriate age for Mickey Mouse clubhouse is under 3. Bluey/Peppa Pig are closer to 6 or 7+ year old material.

Or in foreign language levels: B1-ish / 2+ on the American scale.

------

Seriously. Just switch the shows to a different language and the level gap becomes blatantly clear.

In perhaps more Techie terms: Mickey Mouse Clubhouse level of understanding is achievable with Duolingo. Peppa Pig / Bluey (and similar level shows) are so far beyond Duolingo that I bet most Duolingo users will NEVER be able to achieve Bluey-level understanding in a foreign language (and that deep textbook + 1000ish vocab study memorization needs to be done before Bluey can be understood).

------

Maybe the vocab estimate is easiest to understand. Bluey feels like a show that uses 1000 words with mastery (and maybe 2000 hard words as learning exercises in the show).

Mickey Mouse clubhouse uses maybe 250 words with mastery and maybe uses the top1000 list as learning/teaching words.

How (and why??) does Mickey Mouse clubhouse make an ENTIRE song consisting of a single word? (hotdog?) Because it's written for people where 'Hot dog' is a difficult word and needs repetition.


This hashtable implements a multiset. Not (merely) a simple set.


Oscillators are hard because our specs on oscillators are absurd.

Let's take the common watch circuit. Conceptually it's just like 5 components: crystal, a few capacitors, a NOT gate/transistor/amplifier. Introduce 180-degree phase shift at the 32.768k target frequency and bam, oscillator.

Except not really. A clock is expected to have drift in the region of 10 to 20 ppm (0.001% to 0.002% error), or lose a second or two per day. That's the hard part, building something that accurate and consistent.
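The ppm-to-drift arithmetic, as a quick sanity check:

```python
# Parts-per-million frequency error -> seconds of drift per day.
SECONDS_PER_DAY = 24 * 60 * 60   # 86400

def drift_per_day(ppm):
    return SECONDS_PER_DAY * ppm / 1_000_000

# A 20 ppm crystal (a typical 32.768 kHz tuning-fork spec) drifts
# under 2 seconds per day; a 100 ppm clock would lose ~8.6 s/day.
assert abs(drift_per_day(20) - 1.728) < 1e-9
assert drift_per_day(100) > 8
```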

There are also startup specs, power specs (less power is harder. More power helps startup....) etc. etc.

-------

If you just need something to oscillate back and forth randomly, try making a noise generator lol, it will oscillate wildly at many frequencies, one of which might have been the frequency you wanted.

---------

The 555 timer is perhaps the easiest oscillator for a beginner if you are willing to put up with +/-10% drift. It's honestly good enough for far more applications than you might expect.

Even without the premade chip, a 555 timer is just 2x comparators (the analog version of an If statement), a 33%/33%/33% voltage divider, and a capacitor. If the capacitor voltage > 66% of the input voltage, discharge the capacitor. If it's < 33% of the input voltage, charge it. Bam, you now have an oscillator that is accurate to the +/-10% capacitance values of your electronics kit.
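That charge/discharge cycle gives the textbook astable approximation f ≈ 1.44 / ((R1 + 2·R2)·C); a quick sketch with made-up part values:

```python
import math

# Standard 555 astable approximation: the capacitor charges through
# R1 + R2 and discharges through R2, bouncing between 1/3 and 2/3 Vcc.
def astable_freq(r1_ohms, r2_ohms, c_farads):
    return 1.44 / ((r1_ohms + 2 * r2_ohms) * c_farads)

# Hypothetical parts: R1 = 10k, R2 = 68k, C = 100 nF -> ~99 Hz nominal
f = astable_freq(10e3, 68e3, 100e-9)
assert math.isclose(f, 98.6, rel_tol=0.01)

# A +/-10% capacitor alone shifts the frequency by ~10%, which is
# why the 555 is a "good enough", not a precision, oscillator.
f_hi = astable_freq(10e3, 68e3, 110e-9)
assert math.isclose(f_hi / f, 1 / 1.1, rel_tol=1e-6)
```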

Alas, modern circuits need to be faster and more accurate than the humble 555. But a beginner article about oscillators should be about the 555, rather than about opamp or transistor based oscillators.


Well yeah. No one is saying that China cannot do that. Just that the political calculus is that it's better for China to spend their resources on that, rather than building up troops and warships.

Force China's growth to be more expensive. It has nothing to do with not believing China can do it; it's about slowing them down in a task we believe they can do.


> Just that the political calculus is that it's better for China to spend their resources on that, rather than building up troops and warships.

Note that this calculus only makes sense if you invade China while they are busy with the EUV machines, otherwise they catch up technologically and then build all the scary military.

Of course, the calculus doesn't make sense at all, because the obvious order when you can't do both is you build enough military to feel safe first, then you try for the tech race.


Their plan was to buy those chips and equipment and have the troops/ships/weapons sooner.

Now China has to build EUV themselves, then mass produce chips. It slows them down regardless and costs them resources.

Cut off the market before it becomes a problem.

---------

Militarily, delaying China into the 2040s after the USA has stealth destroyers of our own (beginning production in the late 2020s, mass production in the 2030s) means China has to fight vs 2030s-era tech instead of our 1980s-era Arleigh Burke DDGs.

What, do you want to have the fight in the late 2020s or would you rather have the war in the late 2030s? There is a huge difference and the USA's production schedule cannot change. But we can change China's production schedule.


> the obvious order when you can't do both is you build enough military to feel safe first, then you try for the tech race

Literally zero actual wars with a technological component have progressed like this. (The first tradeoff to be made is the one Russia is making: sacrificing consumption for military production and research. Guns and butter.)


That's not true. Mass/quantity can still resist/delay/push back until you're exhausted and done.

We're not anymore in the swords vs guns era. We're talking about hypersonic missiles vs super intelligent hypersonic missiles. Still, all it takes is 1 dumb missile to pass through the defenses and an entire city can be wiped out. At the end of the day, they don't care if a missile didn't reach the precise target. As you can see in Ukraine, Russia is bombing all types of buildings, they don't give a damn about schools, kindergartens or so.

The tech component is not everything.


> We're not anymore in the swords vs guns era. We're talking about hypersonic missiles vs super intelligent hypersonic missiles

These are still hypotheticals. Every war since the Civil War has had a decisive technological component. If the model doesn't apply there, this time probably ain't different.


Like the Vietnam War? Or the wars in Afghanistan...?


> Like the Vietnam War?

Yes. Concern around Soviet space and missile capabilities overtaking America's directly led to Kennedy changing his mind on no boots on the ground.

(The Vietnam War started with America betting on BVR, with the long-seeing but minimally-agile F-4 Phantom. Soviet MiG-21s, on the other hand, blended into civilian traffic. This led to disaster. When the MiG-25 rolled out, we countered with the F-15 Eagle. But it came too late, which meant we couldn't establish air superiority with long-range aircraft alone.)

Note: I’m not saying this was the decisive component. It was one among many, and not the most important. But if we had F-15s at the outset, when the Soviets had MiG-21s, there is a better chance the skirmish would have stayed in the skies and Vietnam would have stalemated like Korea.


> it's about slowing them down in a task we believe that they can do.

But it's not slowing them down. It's forcing them to accelerate development ( aka investing more into the sector ). Has china invested more or less? It's amazing how blind people are to this counterintuitive fact.


Oh, and your plan is to just give them the chips they want directly?

Of course investing into chip development is slowing China down. It's slower to build their own than for us to give them those chips.


> Oh, and your plan is to just give them the chips they want directly?

"Give them"? I love sneaky propagandists. No, make them pay for it. It's what we do to our "allies" so that they are dependent on american tech.

> Of course investing into chip development is slowing China down.

From a myopic narrow point of view. But viewed more broadly, it has accelerated china's tech development.

> Its slower to build their own than for us to give them those chips.

In the short term, but not the long term. Just like banning china from participating in the international space station forced china to accelerate their development of their space program.


> From a myopic narrow point of view. But viewed more broadly, it has accelerated china's tech development.

Yes. I'm fine with this.

Weakening China in the short term means pushing the Taiwan war timeline back by years. Years that we will spend building up the DDG(X).

As I said before and I'll say again: USA is weak in 2020s but strong in the 2030s. We only need to delay China by a few years and the DDG(X) changes everything.

----------

You need to understand that I make my view based on the perceived strength of the US Navy. The US Navy is getting huge upgrades and a few years of delay makes an incredible difference.


"We"? Okay buddy.

> USA is weak in 2020s but strong in the 2030s.

The US is the largest economy with an unparalleled military at the moment. What are you talking about?

> The US Navy is getting huge upgrades and a few years of delay makes an incredible difference.

For what? The US Navy will play no role in a war between china and taiwan.

No offense, but who gives a shit about taiwan? Not americans. Only chinese people care about taiwan.


> For what? The US Navy will play no role in a war between china and taiwan.

Uhhhhh, Taiwan is an island dude. That's either Marines or Navy. I'm betting Navy will do the heavy lifting given that China is missile heavy.

Marines might be used to shore up anti-landing defenses if China decides to send boots on the ground. But ideally the US Navy prevents the landing entirely.

Said war taking place while we have 1980s-era Arleigh Burke Destroyers would be an attack while our Navy is at its weakest. Anything we can do to delay said war until after the DDG(X) upgrade is to our advantage.

> No offense, but who gives a shit about taiwan? Not americans. Only chinese people care about taiwan.

I'm American and I care? That's why I'm arguing on this point.

Current wargames suggest that USA will be willing to dedicate like 2 carrier strike groups for the defense of Taiwan. I'm not sure if it's enough (especially with the aging Arleigh Burke destroyers), but that's the level of commitment mostly assumed in this scenario if not more.

We have like 14 Carrier strike groups for a reason. We can spare two of them to this task, maybe more.


> Oh, and your plan is to just give them the chips they want directly?

Yes! Remove the impetus for them to innovate and make them reliant on our exports.

