jacobgorm's comments

Not if you’re a vaccine skeptic and personal friends with the CDC director https://www.statnews.com/2025/12/18/cdc-grant-controversial-...

Denmark has an official system for sending digital mail, which is how we receive letters from the government, bank statements, pay slips, and so on. Without this base load of paper mail, the economics of delivering paper mail stopped working.

SOTA on internal benchmark?


going to open source it soon :)


PHP and Python did. PHP was so much better for web development, especially faster because it was linked into the web server instead of running from cgi-bin, and Python was a much cleaner and easier-to-use scripting language for batch jobs etc.

I still have nightmares about the time we were trying to write the server part of a distributed filesystem (the precursor to Lustre) in Perl.


“like IKEA” sounds misleading at best.


Misleading how? That’s precisely how SMRs differ from traditional plants - they’re manufactured in a factory instead of being constructed on-site. That’s exactly like the difference between IKEA and constructing furniture from scratch using blocks of wood.


And the developer experience is horrible when working with AMD. They don’t even accept driver crash bug reports.


People say that as if the Nvidia experience is better. Nvidia also has a horrible developer experience.


YMMV, but I reported a crash in Nvidia's Vulkan driver and they responded promptly and fixed it.


Huh? I've been developing against the Nvidia ecosystem for years. Just build a container and you're done. They even provide base containers.

Anything specific related to DC-level computing?


What the blog post calls Vector Quantization is just per-component (scalar) quantization of the vector's numeric values. Vector quantization proper typically uses K-means to quantize the entire vector, or sub-vectors in the case of product quantization, into codebook indices.

https://en.wikipedia.org/wiki/Vector_quantization
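
For illustration, here is a rough NumPy sketch of my own (not from the post; the data, K=256, and variable names are arbitrary) contrasting per-component quantization with K-means vector quantization:

    import numpy as np

    rng = np.random.default_rng(0)
    vectors = rng.normal(size=(1000, 8)).astype(np.float32)

    # Scalar ("component") quantization: round each component to int8.
    scale = np.abs(vectors).max()
    sq = np.round(vectors / scale * 127).astype(np.int8)  # 8 bytes/vector

    # Vector quantization: map each whole vector to its nearest centroid,
    # learned with plain Lloyd's/K-means, and store only the index.
    K = 256
    centroids = vectors[rng.choice(len(vectors), K, replace=False)].copy()
    for _ in range(10):
        dists = ((vectors[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        idx = dists.argmin(1)
        for k in range(K):
            members = vectors[idx == k]
            if len(members):
                centroids[k] = members.mean(0)
    codes = idx.astype(np.uint8)  # 1 byte/vector plus a shared codebook

    # Product quantization would split each vector into sub-vectors and
    # run the same K-means per sub-vector, yielding one index per part.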


When I studied cognitive psychology, I remember one of the professors telling us how they had been playing with implementing neural nets on their PDP-11 back in the day. I remember thinking that had to have been a total waste of time. Silly me.


The first convolutional neural network, the Neocognitron, was AFAIK implemented on a PDP-11 as well: https://www.semanticscholar.org/paper/Neocognitron%3A-A-neur...

No backpropagation back then; it only appeared around 1986 with Rumelhart, probably running on VAX machines by that time.

The 11/34 was hardly a powerhouse (roughly a turbo XT), but it was sturdy, could handle sustained load, and its FPU made all the difference.


If I remember right, that FORTRAN IV compiler really sucked: it generated code for a stack machine, and that floating point accelerator "sucked" by normal standards but was actually 100% effective at accelerating that stack machine. The FORTRAN 77 compiler that came later was better.
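
To make the stack-machine point concrete, here's a toy Python sketch (my illustration, not the actual DEC code generator): an expression like a*b + c*d compiles to a flat postfix sequence of push/op primitives, so an accelerator that speeds up exactly those primitives speeds up every expression the compiler can emit.

    # Hypothetical illustration: interpret postfix "stack machine" code.
    def run_stack_code(code, env):
        stack = []
        for op in code:
            if op == "+":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "*":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
            else:
                stack.append(env[op])  # PUSH a variable's value
        return stack.pop()

    # a*b + c*d  ->  postfix: a b * c d * +
    print(run_stack_code(["a", "b", "*", "c", "d", "*", "+"],
                         {"a": 2.0, "b": 3.0, "c": 4.0, "d": 5.0}))  # 26.0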


Author here. They call it a FORTRAN IV compiler but it uses some F66 extensions, such as proper types and functions, although it lacks nicer constructs like F77's block If/Then/Else, which would have been handy.

Regarding floating point, I realized the code actually works fine without an FPU, so I assume it uses soft-float. There's no switch to enable the FP11 opcodes; maybe that was in their F77 compiler.

It's indeed rough and spartan, but using a 64KB optimizing compiler requiring just 8KB of memory was a refreshing change for me.


Yes! It took 73 years, but Fortran 77 was definitely better than Fortran IV.


Why 73 years?


Fortran IV -- released in 1904. Fortran 77: 1977.



> it used a stack machine

Do you have some reading for this? I've used that compiler but I never read the resulting assembly language.


I always found it annoying that Rumelhart and McClelland named their books with the acronym “PDP” - Parallel Distributed Processing. Now I know that they were probably aware of the name collision…


The writing has been on the wall since they de facto demoted him and undercut his open-source approach. End of an era, I'm afraid.


Don’t blame him at all considering he’s working under a guy that ran one of the worst data-labeling companies. Feels like a big misallocation of his talents.


I don't think it is only a matter of building a better world model. CCD sensors work very differently from human eyes and have problems such as over- and underexposure that preclude their use for safe driving in certain conditions. If you were to try driving a car with a VR headset fed from dual CCD sensors into a sunset as in https://www.bloomberg.com/features/2025-tesla-full-self-driv... you would get into trouble too.
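
As a back-of-the-envelope sketch (my own toy numbers, not measurements), the problem is dynamic range: no single exposure setting can encode both the sun and the shadowed road, so something always clips.

    import numpy as np

    scene = np.array([0.02, 0.5, 5000.0])  # shadowed curb, road, the sun

    def capture(luminance, exposure, full_well=1.0):
        # The sensor integrates light, saturates at its full-well capacity,
        # and quantizes the result to 8 bits.
        e = np.clip(luminance * exposure, 0.0, full_well)
        return np.round(e / full_well * 255).astype(np.uint8)

    print(capture(scene, exposure=0.2))  # [  1  26 255]: shadows crushed, sun clipped
    print(capture(scene, exposure=4.0))  # [ 20 255 255]: road blown out too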

