einr's comments | Hacker News

This argument would sound nearly identical if you made it in the 70s or early 80s about mainframes and personal computers.

It's not that mainframes (or supercomputers, or servers, or the cloud) stopped existing; it's that there was a "good enough" point where the personal computer was powerful enough to do all the things that people care about. Why would this be different?*

And aren't we all paying for a bunch of silicon that sits mostly unused? I have a full modern GPU in my Apple SoC capable of throwing a ridiculous number of polygons per second at the screen and I'm using it to display two terminal emulator windows.

* (I can think of a number of reasons why it would in fact turn out different, but none of them have to do with the limits of technology -- they are all about control or economic incentives)


It’s different because of the ubiquity of the internet and the financial incentives of the companies involved.

Right now you can get 20TB hard drives for cheap and set up your own NAS, but way more people spend money every month on Dropbox/iCloud/OneDrive - people value convenience and accessibility over “owning” the product.

Companies also lean into this. Just consider Photoshop. It used to be a one-time purchase, then it became a cloud subscription, and now virtually every new AI feature uses paid credits. Despite that fast SoC of yours, Photoshop will still throw your request to their cloud and charge you for it.

The big point still remains: by the time you can run that trillion-parameter model at home, it’s old news. If the personal computer of the 80s was good enough, why’s nobody still using one? AI on edge devices will exist, but will forever remain behind data center AI.


> Right now you can get 20TB hard drives for cheap and set up your own NAS, but way more people spend money every month on Dropbox/iCloud/OneDrive - people value convenience and accessibility over “owning” the product.

Yes, this is a convenience argument, not a technical one. It's not that your PC doesn't (or couldn't) have more than enough storage -- it likely does -- it's that there are other factors that make you use Dropbox.

So now the question becomes: do we believe that personal devices will never become good enough to run a "good enough" LLM (a technical barrier), or do we believe that other factors will make it seem less desirable to do so (a social/financial/legal barrier)?

I think there's a very decent chance that the latter will be true, but the original argument was a technical one -- that good-enough LLMs will always require so much compute that you wouldn't want to run one locally even if you could.

> If the personal computer of the 80s was good enough, why’s nobody still using one?

What people want to do changes with time, so your PC XT will no longer hack it in the modern workplace, but the point is that once a personal computer of any kind was good enough, people kept using personal computers. The parallel argument here would be: if there is a plateau where LLM improvement slows and converges with the ability to run something good enough on consumer hardware, why would people not just keep running those good-enough models on their own hardware? The models would get better with time, sure, but so would the hardware running them.


The original point that I was making was never purely a technical one. Performance, economics, convenience, and business trends all play a part in what I think will happen.

Even if LLM improvement slows, we’ll probably end up with the same treadmill effect we see in other software.

Consider MS Office, Adobe Creative (Cloud), or just about any pro-level software. The older versions aren’t really used any more, for various reasons: performance, features, compatibility, etc. Why would LLMs, which seem to be on an even faster trajectory than conventional software, be any different? Users will want to keep upgrading, and in the case of AI, that’ll mean continuing to access the latest cloud model.

No doubt someone will be able to run gpt-oss-120b on-device five years from now, but outside of privacy, why would they, when they can get a faster, smarter answer (likely for free) from a service?


The benchmarks are not invented by the LLM; they come from an issue where Scott Shambaugh himself suggests this change as low-hanging (but low-importance) perf-improvement fruit:

https://github.com/matplotlib/matplotlib/issues/31130


Ah, fair enough. But then it seems the bot completely ignored the discussion in question; there's a reason they spent time evaluating and discussing it instead of just making the change. Having a bot push on an issue the humans are already well aware of is just as bad behaviour.

This is how it always is, until suddenly one day it isn't. Linux didn't play in the same league as serious and commercial UNIX systems until one fateful day it killed them all dead forever.

> until suddenly one day it isn't

Thread OP here. This is exactly my point: today isn't that day. I need it to be.

Some future server that I can migrate my community to isn't useful.


Not everything needs to have “traction”, “excitement”, or the biggest community. D is a useful, well-designed programming language that many thousands of people in this vast world enjoy using, and if you enjoy it too, you can use it. Isn't that nice?

Oh, a programming language certainly needs to have traction and a community to succeed, or to be a viable option for serious projects.

You can code your quines in whatever you'd like, but a serious project needs good tooling, good libraries, a proven track record & devs that speak the language.


"Good tooling, good libraries, a proven track record" are all relative concepts; they're not things you either have or don't have.

There are serious projects being written in D as we speak, I'm sure, and the language has a track record of having been consistently maintained and improved since 2001, with some very good libraries and tooling (a very nice standard library, three independent and supported compiler implementations!). It does not have good libraries and tooling for all things; integrations with other libs and systems certainly often lag behind more popular languages, but no programming language is suitable for everything.

What I'm saying is that there's a big world out there: not all programmers are burdened with having to care about CV-maxxing, community, or the preferences of other devs; some of them can just do things in the language they prefer. And therefore, not everything benefits from being written in Rust or whatever the top #1 Most Popular! Trending! Best Choice for System Programming 2026! programming language of the week happens to be.


D has three high-quality compiler implementations. It has been around for ages, is very stable, and has a proven track record.

Zig has one implementation and constant breaking changes.

D is the far more pragmatic and safer choice for serious projects.

Not that Zig is a bad choice, but to say that an unstable lang in active development like Zig would be a better choice for "serious projects" than a very well-established but less popular lang shows the insanity of hype-driven development.


> Oh, a programming language certainly needs to have traction and a community to succeed, or to be a viable option for serious projects.

You are totally right, sir.

> You can code your quines in whatever you'd like, but a serious project needs good tooling, good libraries, a proven track record & devs that speak the language.

Now, you, HN user who calls yourself 4gotunameagain (which kind of name is already a red flag for those in the know, because it's usually used by trolls), do us a favour by posting a link to at least one of your so-called serious projects here. Put your money where your mouth is.


Yes. Had to look it up, but apparently it was developed by TCNA (Toyota Connected North America), which does car software and such.

They probably mean that port 25 is blocked on consumer ISPs/residential IP blocks to prevent malware from running an smtpd on an infected home computer or router (which used to happen a lot), but at a higher level, of course, no one blocks SMTP.
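If you want to see whether your own connection is affected, a minimal sketch along these lines checks whether outbound port 25 is reachable while the submission port 587 still is (smtp.gmail.com is just an example target; results will vary by ISP):

    # Rough check for outbound port 25 filtering: try to open a TCP
    # connection to a public mail host and compare with port 587.
    # smtp.gmail.com is only an example target, not a recommendation.
    import socket

    def reachable(host, port, timeout=5):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    for port in (25, 587):
        ok = reachable("smtp.gmail.com", port)
        print(f"port {port}: {'reachable' if ok else 'blocked or unreachable'}")

A timeout on 25 while 587 still connects is the classic signature of residential port 25 filtering.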

telnet hasn’t shipped with macOS since 10.13 High Sierra, nearly a decade ago.

Debian also stopped shipping telnet in the base install as of Debian 11.


Thanks, sounds like a recent development. I don't use macOS, but on other people's macOS computers it was always there, even when they were not developers. But it could very well be that these computers are ten years old.

I mean, technically MS Windows 10 is ten years old, but the big upgrade wave to 10 only happened like 4 years ago, which is quite recent. Maybe it's similar for macOS users, I don't know.


Noise depends a lot on the actual amount of light hitting the sensor per unit of time, which is not really a part of the simulation here. ISO 1600 has been quite usable in daylight for a very long time; at night it's a somewhat different story.
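To put a rough number on that first point: photon arrival is approximately Poisson, so the relative noise falls with the square root of the photon count, regardless of the gain (ISO) applied afterwards. A toy sketch (the photon counts are made-up illustrative values, not real sensor data):

    # Toy shot-noise demo: for Poisson photon arrival, SNR ~ sqrt(mean count),
    # so a pixel collecting ~10000 photons in daylight is far cleaner than one
    # collecting ~100 at night, even at the same ISO setting.
    import numpy as np

    rng = np.random.default_rng(0)
    for label, mean_photons in (("daylight", 10_000), ("night", 100)):
        samples = rng.poisson(mean_photons, size=100_000)
        print(f"{label:8s} SNR ~ {samples.mean() / samples.std():.1f}")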

The amount and appearance of noise also heavily depends on whether you're looking at a RAW image before noise processing or a cooked JPEG. Noise reduction is really good these days but you might be surprised by what files from even a modern camera look like before any processing.

That said, I do think the simulation here exaggerates the effect of noise for clarity. (It also appears to be about six years old.)


The kind of noise also makes a huge difference. Chroma noise looks like ugly splotches of colour, whereas luma noise can add positively to the character of the image. Fortunately, humans are less sensitive to chroma resolution, so denoising can be done more aggressively on the a/b channels of Lab space.

Yes, this simulation exaggerates a lot. Either that, or it contains a tiny crop of a larger image.
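For the curious, the chroma-only denoising idea can be sketched in a few lines with OpenCV. This is just an illustration of the approach (8-bit BGR input assumed, filter strengths arbitrary), not how any particular raw converter actually does it:

    # Sketch of aggressive chroma-only denoising: convert to Lab, smooth the
    # a/b (colour) channels while leaving L (luminance) untouched, then
    # convert back. Assumes an 8-bit BGR image; parameters are not tuned.
    import cv2

    def denoise_chroma(bgr):
        lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2Lab)
        L, a, b = cv2.split(lab)
        a = cv2.bilateralFilter(a, 9, 75, 75)
        b = cv2.bilateralFilter(b, 9, 75, 75)
        return cv2.cvtColor(cv2.merge((L, a, b)), cv2.COLOR_Lab2BGR)

    cv2.imwrite("chroma_denoised.jpg", denoise_chroma(cv2.imread("noisy.jpg")))

Even something this crude tends to knock out the colour splotches while leaving the grain-like luma noise alone.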


At least then you’re being honest about hating your intended audience, and not proudly posting the slop vomited forth from your algorithmic garbage machine as if it were something that deserved the time, thought, and consideration of your equals.

Why should anyone bother to read what nobody wrote?


This seems to be what is happening: bots are posting things and bots are reading them. It's a bit like how our wonderful document system (the WWW) turned into an application platform. We gained the latter but lost the former.
