These are available on Weatherbell[1] (which requires a subscription), except for the AIGEFS ensemble model, which I'm guessing will be added later. AIGFS is on Tropical Tidbits, which should be free for some products[5]. I believe some of the research behind this is covered in these two videos[2][3] from the NOAA Weather Partners site, which also discuss other advances in weather model research.
One of the big benefits of both the single-run (AIGFS) and ensemble (AIGEFS) models is speed: they require far less computation time. Weather modeling is hard, and these models should be used as complements to conventional physics-based models, since each has its own strengths and weaknesses. They run at the same 0.25-degree resolution as ECMWF's AIFS models, which were introduced earlier this year and have been successful[4].
Edit: Spring 2025 forecasting experiment results are available here[6].
Really exciting to see NOAA finally make some progress on this front, but the AIGFS suite likely won't outperform ECMWF's AIFS suite any time soon. The underlying architectures of AIFS and GraphCast/AIGFS are pretty similar (both GNNs), so a model-level improvement is unlikely. Most of ECMWF's edge lies in its superior 4DVar data assimilation process. As far as I understand it, AIGFS is still initialized from NOAA's hybrid 4DEnVar assimilation process, which unfortunately still isn't as good as straight-up 4DVar.
There are multiple efforts, and a good number of VC-backed companies, working on AI DA systems. DA is fundamentally a hand-crafted optimization process, much like a NN. I once reimplemented an EnKF in PyTorch and it was amazingly fast. But our observations are so dirty and sparse, and ECMWF has tuned its system extremely well. NOAA definitely has the potential to be even better, but I see no hope of that any time soon, IMHO.
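For anyone curious what an EnKF actually does, here's a minimal sketch of the stochastic (perturbed-observation) analysis step for a scalar state, in plain Python rather than PyTorch. The function and variable names are mine for illustration; a real DA system works on huge state vectors with localization and inflation, none of which is shown here:

```python
import random
import statistics

def enkf_analysis(ensemble, obs, obs_var, rng):
    """Stochastic EnKF analysis step for a scalar state.

    ensemble: list of prior state samples
    obs: observed value; obs_var: observation error variance
    """
    prior_var = statistics.variance(ensemble)
    gain = prior_var / (prior_var + obs_var)  # Kalman gain K = P / (P + R)
    # Perturbed-observation update: each member assimilates a noisy copy
    # of the observation, which keeps the posterior spread statistically right.
    return [x + gain * (obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

# Toy demo: prior ensemble centered near 10, observation at 12.
rng = random.Random(1)
prior = [10.0 + rng.gauss(0.0, 1.0) for _ in range(500)]
posterior = enkf_analysis(prior, obs=12.0, obs_var=1.0, rng=rng)
print(statistics.mean(prior), statistics.mean(posterior))
```

The posterior mean lands between the prior mean and the observation, weighted by the Kalman gain, and the ensemble spread shrinks; every operation is an elementwise array op, which is why a GPU tensor-library port of this runs so fast.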
The browser isn't exposing it to websites. Playing media is simply what causes it to lower the minimum timer resolution on Windows. If I remember correctly, in the past it would also do this when just scrolling, among other things; I'm not sure if it still does.
Firefox uses a different method that doesn't require lowering the minimum timer resolution.
Either way the global behavior of this is no longer true on modern Windows 10/11 machines (as of Windows 10 2004) as each process must now call timeBeginPeriod if it wants increased timer resolution:
https://randomascii.wordpress.com/2020/10/04/windows-timer-r...
They've improved things in Windows 10[1] with driver support(?) apparently although I have no experience with this so I can't say how this affects things practically.
They've been saying that they improved things in WASAPI with every version since Vista, but it still doesn't perform as well as ASIO (and ASIO is much simpler for the audio application developer than the WASAPI hellhole...)
I can't speak to Google forcing DNS, I don't know if this is true. However, it will respect the private DNS option (DNS-over-TLS) which you can point to whatever you want (like NextDNS for filtering).
It's not really recent (2018); it's just still being exploited after all this time. If you're using a default config on the home router you're basically fine already (aside from changing the default login).
> Very concerned about the recent microtik CVE, as that is going to make for some very large botnets.
To be pedantic, there is technically no recent MikroTik CVE with respect to Meris. It was patched in 2018(?), shortly after discovery.
From their response to the Meris botnet[1]:
> As far as we have seen, these attacks use the same routers that were compromised in 2018, when MikroTik RouterOS had a vulnerability, that was quickly patched.
> Unfortunately, closing the vulnerability does not immediately protect these routers. If somebody got your password in 2018, just an upgrade will not help. You must also change password, re-check your firewall if it does not allow remote access to unknown parties, and look for scripts that you did not create.
The blog post goes into more detail on further checking and hardening the device. A lot of issues stem from having Winbox or other admin access open to the world instead of properly firewalled off. The power you have with these devices is a blessing and a curse, I guess.
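For what it's worth, the usual hardening steps look something like the RouterOS config fragment below. This is a sketch, not an exact reproduction of MikroTik's advice; the subnet and interface name (`192.168.88.0/24`, `ether1`) are placeholders you'd replace with your own:

```
# Change the admin password
/user set admin password="use-a-long-unique-passphrase"

# Restrict Winbox/WebFig/SSH to a trusted management subnet (placeholder)
/ip service set winbox address=192.168.88.0/24
/ip service set www address=192.168.88.0/24
/ip service set ssh address=192.168.88.0/24

# Disable services you don't use
/ip service disable telnet,ftp,api,api-ssl

# Drop unsolicited new connections arriving on the WAN interface
/ip firewall filter add chain=input in-interface=ether1 \
    connection-state=new action=drop comment="drop unsolicited WAN input"

# Look for scheduler entries and scripts you did not create
/system scheduler print
/system script print
```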
> Specifically, Maxwell and Pascal use tile-based immediate-mode rasterizers that buffer pixel output, instead of conventional full-screen immediate-mode rasterizers.
The parent article already discusses that article, noting that those GPUs stop behaving like a TBR once the primitive count gets too high:
> Another class of hybrid architecture is one that is often referred to as tile-based immediate-mode rendering. As dissected in this article[1], this hybrid architecture is used since NVIDIA’s Maxwell GPUs. Does that mean that this architecture is like a TBR one, or that it shares all benefits of both worlds? Well, not really…
> What the article and the video fails to show is what happens when you increase the primitive count. Guillemot’s test application doesn’t support large primitive counts, but the effect is already visible if we crank up both the primitive and attribute count. After a certain threshold it can be noted that not all primitives are rasterized within a tile before the GPU starts rasterizing the next tile, thus we’re clearly not talking about a traditional TBR architecture.
Classic TBDRs typically require multiple passes on tiles with large primitive counts as well. Each tile's buffer containing binned geometry generally has a max size, with multiple passes required if that buffer size is exceeded.
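The multi-pass behavior on bin overflow can be sketched in a toy binner. Everything here (the tile size, the capacity, the flush-on-full policy) is an illustrative assumption, not how any particular GPU actually works:

```python
# Toy sketch of tile binning with a bounded per-tile parameter buffer.
TILE = 32          # tile size in pixels (assumed)
BIN_CAPACITY = 4   # max primitives buffered per tile before a flush (assumed)

def tiles_covered(bbox):
    """Yield (tx, ty) tile coords overlapped by a primitive's bounding box."""
    x0, y0, x1, y1 = bbox
    for ty in range(y0 // TILE, y1 // TILE + 1):
        for tx in range(x0 // TILE, x1 // TILE + 1):
            yield (tx, ty)

def bin_and_rasterize(primitives):
    """Bin primitives into tiles; flush a tile's bin whenever it fills up.

    Returns a log of (tile, pass_index, prims_in_pass) flush events, showing
    that a tile holding many primitives needs multiple rasterization passes.
    """
    bins, passes, log = {}, {}, []

    def flush(tile):
        n = passes.get(tile, 0)
        log.append((tile, n, list(bins[tile])))
        passes[tile] = n + 1
        bins[tile] = []

    for prim_id, bbox in enumerate(primitives):
        for tile in tiles_covered(bbox):
            bins.setdefault(tile, []).append(prim_id)
            if len(bins[tile]) == BIN_CAPACITY:
                flush(tile)   # buffer full: rasterize this tile now
    for tile, pending in bins.items():
        if pending:
            flush(tile)       # end of frame: rasterize whatever is left
    return log

# Six small quads all landing in tile (0, 0): with capacity 4 the tile is
# rasterized in two passes (4 primitives, then 2).
log = bin_and_rasterize([(0, 0, 8, 8)] * 6)
print(log)
```

The key observation matches the quoted article: once the per-tile buffer overflows, primitives for one tile are no longer all rasterized before moving on, whether that flush happens mid-frame (as above) or via a second binning pass as in classic TBDRs.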
Having watched the video, I'm fairly certain what is being observed is not really tiled.
I'm not sure, however, what a "tile-based immediate-mode rasterizer that buffers pixel output" actually is, but I think that's enough qualifications to make it somewhat meaningless. All modern GPUs dispatch thread groups that could look like "tiles" and have plenty of buffers, likely including buffers between fragment output and render target output/color blending. But that doesn't make them tiled/deferred renderers.