mioelnir's comments | Hacker News

I once talked to a mail admin from one of the companies involved in "Email made in Germany" about it, and his reply was basically that of course it was marketing, and of course the techs had had the idea of enabling TLS sitting in a drawer for years.

But at their scale, enabling TLS means a lot of additional compute power, and thanks to that marketing campaign they finally got the budget to install the additional hardware and enable it. Before, there was no business value that justified spending that much more to get a product that, to the outside observer, was unchanged.


BioNTech developed the vaccine now mass-produced by Pfizer. For that, they received $445m from the German government. So, as a tax-paying German citizen I can say, not only will I do that, but even better, I already did.

It also feels deeply wrong to pull the "but all the research efforts that did not lead anywhere" argument when Pfizer did not do the research in the first place. They should get compensation for organizing the huge trial, of course; that expertise was why they were on-boarded in the first place. And nobody expects them to manufacture the stuff at a loss or at cost. But we should not accept public money buying them a goose that lays golden eggs either.


Nobody's getting a goose that lays golden eggs from a COVID vaccine. It's a one-time, relatively low-cost vaccine that is going to go out of demand once the population is immunized.

Oh, and now there are multiple competitors in the market.


Only if the vaccine gives permanent immunity, and I haven't found any research suggesting that it will. In all likelihood, it will be necessary to give yearly booster shots, so the companies developing the vaccines will be able to sell vaccines every year.

The quantity of vaccines needed will make even a $0.50 markup worth billions of dollars every year.
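(Back of the envelope, assuming yearly boosters for most of the world's population: roughly 7 billion doses x a $0.50 markup ≈ $3.5 billion a year.)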


You can't measure long-term immunity on a virus that's only been in the wild for ~11 months. That said, all the recent studies I've seen show no signs that immunity is going to drop significantly after a year.

I hate to link to a YouTube video, but this doctor walking through the research in the first part of the video is honestly better than any news article I've seen: https://www.youtube.com/watch?v=gFeJ2BqCFY0


Thank you for the link. The study definitely suggests a best-case scenario is viable. I do, however, think that the sample population of 183 subjects is too small to support the conclusion in the paper. The study only got a single sample from the majority of the participants, and I couldn't find any indication of how many subjects from each location participated (or which locations were included outside of California), which makes me think the population used might not be representative.

I also noted that 40 of the participants were excluded because they had no PCR test done to confirm COVID and no antibodies were found in the assay. This is, in my opinion, a major flaw, as the subjects could have been infected but had no antibodies left when the blood sample was taken.

Finally, seven of the 18 authors declare competing interests, which might have affected the research.

Getting back to your post itself, I agree that you can't measure immunity for a longer period than the virus has been around, but that also means you can't say that there will only be a need for one round of vaccines, which is what I was disagreeing with.

It might very well turn out that you gain permanent immunity to a specific strain of the virus, but unfortunately that immunity also introduces selective pressure. Whether the virus is able to mutate in a way that bypasses existing antibodies in a subject is obviously still an unknown, but we have seen that it's able to jump to other species like mink, which caused the emergence of the Cluster-5 variant.

Since the virus is able to use other species as a reservoir and selective pressure is being introduced, I think it's reasonable to prepare for a scenario where a vaccine won't be a permanent fix.


Vaccines are not a great business in general. (Which is not the same as saying they're unprofitable.) But one reason vaccine makers are generally indemnified against lawsuits in the US is that at least some companies would probably pull out of making vaccines if they weren't.


They are not immutable in the 'chflags uchg,schg' sense.

Updating these files is not in any way, shape, or form a hassle anymore if you haven't changed them; etcupdate has that solved. Even mergemaster has specific options to handle an unchanged /etc that differs only in svn-id tags and the like.

But in my opinion the basic premise of the article is false. mergemaster and etcupdate with their diff and 3-way merge capabilities are there because these files are assumed to have local edits.

The startup procedure, scripts, and options are expected to be customized; thus the update procedure has handling in place to preserve local edits.
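As a minimal sketch of that workflow on a source-updated system (assuming the stock base-system tools; flags as documented in the FreeBSD man pages):

  # after installworld: three-way merge of /etc, preserving local edits
  etcupdate
  # review and settle any remaining merge conflicts interactively
  etcupdate resolve

  # rough mergemaster equivalent: -U auto-updates files without local
  # edits, -F installs files that differ only in version-control tags
  mergemaster -U -F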


FreeBSD has had fully integrated, working ZFS for over 10 years. You were so excited to deploy the BSD killer app that you forgot to do so for ten years.

I can only tell you that whatever scary differences you expect between Linux and FreeBSD are probably no bigger than those between any two Linux distros with different packaging systems.

Ten years ago, fresh out of a failed stint at university, I applied for a junior position at a Linux shop. It would nowadays probably be called junior systems engineer or some such. The night before the interview, I read around a bit in Stevens' TCP/IP Illustrated as well as Design and Implementation of BSD (to calm my nerves). I told them honestly that I had maybe 15 lifetime minutes on a Linux shell. But I had started with FreeBSD 4.4 and by then had roughly eight years of experience in general *nix administration.

I'm still there. Pivoted around and upward a couple of times internally. But I still run FreeBSD on my workstation to get things done. And we're still fundamentally a Linux shop.

But the root cause of my career is a friend at university handing me a FreeBSD 4.4 CD. It is a tremendous system if you want to learn about the services a kernel offers to its userland. If you care to make the dive, it not only tells you the what and how, but the why and all the compromises that had to be made along the way. And that understanding is universal.

FreeBSD may be well known as a solid production platform. Its true strength is as the foundation for not only a lifetime of learning, but a lifetime of understanding.


Good for you! FreeBSD taught you Unix, which, at least at that time, applied perfectly well to Linux too. That was a good bet! Much the same as how learning Slackware at the time would teach you Unix, while choosing another distro would be more limiting.


Or not. The Landesregierung BaWü (the state government of Baden-Württemberg), for example, distributes the daily COVID-19 news via Threema.


I didn't say this will always be the consequence, but it will be the typical result.


There is also another German word in the realm of centralization: Gleichschaltung. It is nowadays used exclusively to describe the centralization of government functions and bodies at the beginning of the Third Reich. Big central government offices, their tracking capabilities, and the emerging network effects were among the required precursors for what came later.

It is why many parts of the modern, post-war German bureaucracy were intentionally set up to be inefficient, and why we still dislike any "central databases" for anything, from law enforcement to tax collection. Slow, decoupled government systems slow down unwanted usage patterns much more than valid ones, are generally harder to game (try explaining your urgent need for some data to a rural Bavarian city hall employee), and are easier for citizens to monitor and verify.


By dropping transmutative alchemy from the sprint, allowing the implementation of a capability system on top of a now unforgeable subset of existing resources...


Sounds like the whitepaper for this week's hottest ICO!


Thank you so much. I'm on the verge of framing this and putting it on the wall. Cubicle wall, but still.


Simulated DC failure is more often than not just traffic flow engineering. It is more about testing the DC that takes over the traffic than it is about testing service restart in the inactive DC.

There is little to test about the introduction of a hard fault, but the service resumption in the other DC is full of data to analyze. Also, in such a setup, getting the faulted location running again is not on a hard clock, since it is about restoring redundancy rather than the service.


> But something about it is very compatible with the way my mind works.

Yes. For me it is channels. After nearly 20 years of UNIX-ish systems, pipes are a mental abstraction that I do not have to think about any more. And channels fit right in; they feel much closer to how a pipe is used on the CLI than a pipe or socketpair ever did in code.

For example, a range loop over a closed channel is, for me, piping things to xargs. It's easy to understand, reason about, and conceptualize because it feels familiar.
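A minimal Go sketch of that mental model (the names are illustrative):

  package main

  import "fmt"

  func main() {
      lines := make(chan string)
      go func() {
          defer close(lines) // closing the channel is the EOF on the pipe
          for _, l := range []string{"foo", "bar", "baz"} {
              lines <- l // the producer side, like a command writing to stdout
          }
      }()
      // range drains the channel until it is closed, much like
      // '... | xargs' consuming stdin until EOF
      for l := range lines {
          fmt.Println(l)
      }
  }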


This article lost me after just the first couple of paragraphs. If Leibniz is, as quoted in the text, "recognized for first formally proposing" the binary system, then what relevance does it have that someone had the idea earlier (and thought it was useless)? He is not recognized for having had the idea first; he is recognized for seeing an application of it and formally proposing and specifying it.

Roentgen wasn't the first to observe X-rays either; he was the first to perform extensive studies on them and publish the results.

