
It is used for tracking; that's the whole point of the header. "Who's sending me all of this traffic" is a useful, non-invasive thing for websites to have access to. You can use rel="noreferrer" on a link to disable the header on a specific link, as well as the `Referrer-Policy` header and `<meta name="referrer" />` to have some additional control (the 'origin-when-cross-origin' value can be useful in some cases, so destination sites can attribute what origin traffic came from, but not the specific page, while still being able to track it on your own origin - I believe browsers now default to the similar 'strict-origin-when-cross-origin').

It’s a bit different, but if you compare it to solutions from the likes of Oracle, SAP, etc., it’s significantly less awkward to develop for.

"Simple" VPS providers like DigitalOcean, etc. really need to get the hell onboard with network virtualization. It's 2026, I don't want to be dealing with individual hosts just being allocated a damned /64, either. Give me a /48, attach it to a virtual network, let me split it into /64s and attach VMs to it - if I want something other than SLAAC addresses (or multiple per VM) then I can deal with manually assigning them.
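The arithmetic behind the ask is simple - a /48 holds 65,536 /64s - which Python's `ipaddress` module can illustrate (2001:db8::/48 is just the documentation prefix, standing in for a provider allocation):

```python
import ipaddress

# A provider-allocated /48 can be carved into /64 networks,
# one per virtual network or VM segment.
prefix = ipaddress.ip_network("2001:db8::/48")
subnets = prefix.subnets(new_prefix=64)  # lazy generator of /64s

first = next(subnets)
print(first)  # 2001:db8::/64

# Number of /64s available in a /48: 2**(64-48) = 65,536
count = prefix.num_addresses // first.num_addresses
print(count)  # 65536
```

Each of those /64s is the unit SLAAC expects, which is why handing a host a single /64 leaves no room for further subnetting.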

To be fair, the "big" cloud providers can't seem to figure this shit out, either. It's mind-boggling. I won't pretend that banging out all the configuration to get FRRouting and my RouterOS gear happily doing the EVPN-VXLAN dance wasn't a headache; but I'm also not Amazon, Google, or Microsoft...


Do you think anything other than trivial internal networking is a common requirement on DO? I’m not saying it’s not, I really don’t know - I haven’t been in the production end of things for a while, and when I was, everyone was basically using AWS et al. for non-trivial applications. They make it easy enough to set up a private IPv4 subnet to connect your internal services. Does that not satisfy your use case, or are you just avoiding tooling that might be obsolete sooner than IPv6?

Are you talking about the model or their service? There's plenty of options for using their models other than the official DeepSeek API.

The problem with consumption taxes, as we have seen time and time again, is they disproportionately affect those with lower incomes.

You can alleviate this by taxing goods mainly consumed by lower-income workers less heavily, and by handing out social welfare.

Are people with lower incomes entitled to all the benefits of the state without paying a fair share of their own?

If anything, there’s plenty of literature showing that social programs and tax exemptions on the poor make underpaying them possible to begin with. Walmart couldn’t pay $12/hr. if tax exemptions and SNAP and other aid didn’t fill the gap.


We don't have to go to the extremes of employers that pay what are effectively poverty wages relative to cost of living.

The household that brings home $80K/yr will always spend a larger percentage of its income on taxable consumption than an executive who takes home multiple millions per year. Progressive income tax brackets are a better tool for making sure that those who are able to pay a larger share toward the common good do so.
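A quick sketch makes the regressivity concrete. The savings rates below are assumptions chosen for illustration, not empirical figures, and the 10% tax rate is hypothetical:

```python
RATE = 0.10  # hypothetical flat consumption tax rate

def effective_rate(income, spending):
    """Consumption tax paid as a share of total income."""
    return RATE * spending / income

# Illustrative households; the spending figures are assumptions.
household = effective_rate(80_000, 76_000)       # spends ~95% of income
executive = effective_rate(3_000_000, 900_000)   # spends ~30% of income

print(f"{household:.1%}")  # 9.5%
print(f"{executive:.1%}")  # 3.0%
```

Because the high earner saves or invests most of their income, the same flat rate on consumption works out to a far smaller share of what they make.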

Unfortunately, we still have not come up with a realistic way to deal with the hoarding of wealth - both by individuals and by corporations like Apple with massive war chests. Even some more broadly accepted ideas like an LVT (land value tax) have issues if the future really does trend towards "AI" displacing people from their jobs.

One way or another, the reality is that the tools we have right now have persisted because they do their job well when politicians act in good faith and don't implement poor fiscal policy emphasizing short-term gains that result in long-term pain. But, they're still fundamentally flawed, and something is going to have to change if we do see dramatic changes to society in the coming decade due to developing technologies.


> The household that brings home $80K/yr will always spend a larger percentage of its income on taxable consumption than an executive who takes home multiple millions per year. Progressive income tax brackets are a better tool for making sure that those who are able to pay a larger share toward the common good do so.

"Progressive income tax brackets" don't actually do this. The people with so much money they can't spend it all use various tax shelters as it is. They typically manage to not even pay tax on the amounts they do spend, because they borrow money and spend it instead of recognizing it as income first. So they would be paying more under a flat consumption tax than they do under the status quo. The "progressive income tax system" doesn't actually work the way it's claimed to.

On top of that, the problem is essentially fake. People absolutely can and do spend millions of dollars a year. Cardiologists making seven figures buy huge houses with multi-car garages full of exotic makes, etc. It's spending billions of dollars a year that nobody is really going to do, but that's such a tiny percentage of people that it's ridiculous to design a tax system being imposed on everybody else on the basis of that, and those are the exact people who aren't paying the high rates under the existing system anyway.

Here's a proposal: Have a flat consumption tax, and then have an income tax where the rate is 0% up to the 99.9th percentile income and only the top 0.1% even have to file a tax return. The latter is going to be avoided in the same ways it is now, but at least then you can't say the billionaires don't have a higher nominal rate, right?


The problem with a consumption tax is not the steady state, but the transition from the current state.

How do you take a retiree couple whose main income-earning days are far behind them, and ask them to pay 25% or more on their consumption?

Not an impossible problem, but it’s THE problem.


Is it though? Both social security and 401k withdrawals are taxed under the existing income tax, so they'd just be paying it as consumption tax instead.

Also, aren't people with an enormous amount of stored wealth "the rich"?


You don’t have to have an enormous amount of stored wealth to be on a livable fixed income (e.g. a municipal pension) and that income could be very lightly taxed today relative to a viable consumption tax.

Government pensions seem like the easy one. The state would be getting the revenue from when they spend the money, so they could use it to adjust the amount of the pension ("cost of living adjustment") and it would be revenue-neutral.

But also, government pensions tend to be, shall we say, unreasonably generous, because they live in that sour spot between "the legislature doesn't have to pay for this in the current year's budget" and "the union negotiates reasonable-seeming rules it knows it can game against public officials who are in their pocket or DGAF" e.g. pension is based on compensation in the last year before retirement and overtime is "awarded" based on seniority, so that people put in 80 hours of overtime every week in their last year. And then we're back to, aren't those the people we want to be taxing anyway?


Are state government pensions worse? Folks live and work for a state that provides a pension, e.g. Illinois, then retire and move out of the state, no longer contributing to that state's economy, just drawing on it. Thoughts?

> If anything, there’s plenty of literature showing that social programs and tax exemptions on the poor make underpaying them possible to begin with.

That literature is playing fast & loose with terminology to justify a preexisting conclusion.

Anyhow, we know what life was like before the Great Society programs, and it wasn't higher wages for the poor; we've just forgotten because it's been so successful. That memory hole oddly works in favor of both those who promote expanding welfare and those who oppose it.

> Walmart couldn’t pay $12/hr. if tax exemptions and SNAP and other aid didn’t fill the gap.

From a basic macroeconomic standpoint, most welfare programs push wages up by marginally reducing the labor pool. In a free market, how would Walmart be forced to pay a "livable wage" if entitlements didn't exist? Do you really think people would just choose not to work and starve if their wages didn't cover all their expenses? Out of spite? It doesn't make sense, and it certainly doesn't comport with history. It makes even less sense when people buy this argument yet also support minimum wage laws.

The counterexample is the Earned Income Tax Credit (EITC). EITC increases as your wages increase, theoretically incentivizing work, rather than diminishing as you earn more. This would increase labor supply. What tends to happen to prices (i.e., wages, the price of labor) when supply increases but not demand? Presumably the more cogent literature bemoaning Walmart's labor practices is primarily relying on EITC while hoping the reader glosses over the distinction.


> Anyhow, we know what life was like before Great Society programs, and it wasn't higher wages for the poor, we've just forgotten because it's been so successful.

That doesn't tell you the answer because the programs were instituted prior to the productivity increases in the 20th century. Are people better off now than they were before the general availability of electric light or mechanized transportation? Probably, but that doesn't mean you can trace the development of modern agriculture to the existence of SNAP.

> In a free market, how would Walmart be forced to pay a "livable wage" if entitlements didn't exist?

People frequently have choices between jobs that are easier or otherwise more pleasant and jobs that pay more. For example, long-haul truck drivers get paid significantly more than short-haul drivers, but they also sleep in their trucks and don't get to see their families most nights. Likewise, a lot of jobs require you to get a degree or certification, which can be a lot of work, which people may not be willing to do if they don't need to.

If you give them "benefits" then they take the easier job over the better paying one. Which allows the employer offering the easier job to pay less and still get applicants. It also creates a poverty trap if the benefits are contingent on not making more money, because then the compensation advantage of the higher-paying job is much smaller -- in some cases negative.

> EITC increases as your wages increase, theoretically incentivizing work, rather than diminishing as you earn more.

Except that it does diminish as you earn more, because it has an aggressive phase-out. For a single person with no dependents, the phase-out kicks in below full-time earnings at the federal minimum wage. If you had a minimum wage job at 30 hours a week and wanted to work 40 hours, increasing your hours would cause you to receive a smaller EITC.
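A stylized model shows the effect. The phase-in rate, maximum credit, and phase-out threshold below are rough illustrative approximations for a single filer with no dependents, not the statutory figures for any particular tax year:

```python
def eitc(earned, phase_in=0.0765, max_credit=600,
         phaseout_start=9_800, phaseout_rate=0.0765):
    """Stylized EITC: credit phases in with earnings, plateaus,
    then phases out. All parameters are illustrative assumptions."""
    credit = min(earned * phase_in, max_credit)
    reduction = max(0.0, (earned - phaseout_start) * phaseout_rate)
    return max(0.0, credit - reduction)

# Federal minimum wage ($7.25/hr), 52 weeks a year:
part_time = eitc(7.25 * 30 * 52)   # 30 hrs/week, ~$11,310 earned
full_time = eitc(7.25 * 40 * 52)   # 40 hrs/week, ~$15,080 earned
print(round(part_time), round(full_time))  # the credit shrinks with more hours
```

Under these assumptions, moving from 30 to 40 hours cuts the credit by more than half, so part of each additional hour's pay is clawed back.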

There is a reason the EITC represents ~0.1% of the federal budget, and it's not because it's a bad idea, it's because it's implemented in a way that prevents people from getting much from it.


> People frequently have choices between jobs that are easier or otherwise more pleasant and jobs that pay more. For example, long-haul truck drivers get paid significantly more than short-haul drivers, but they also sleep in their trucks and don't get to see their families most nights. Likewise, a lot of jobs require you to get a degree or certification, which can be a lot of work, which people may not be willing to do if they don't need to.

That's a sleight of hand. There's value in choice, and that value is being reaped by the worker precisely because poverty programs make it possible.

But let's go with that example. You're assuming the number of truckers and trucker-hours would remain constant. But they wouldn't. That's just not how dynamic systems work. There are other people for whom short-haul trucking is the less desirable choice than what they're doing now, or who work fewer hours than they're doing now. Without the welfare subsidies, the supply of short-haul trucking labor would likely increase--more people working more hours. Similarly, you're assuming the demand for short-haul trucking would remain the same at higher wages. But demand in economics is not the same thing as "I would like" or even "I need", and at higher wages the demand would likely diminish.

The whole argument is the economics equivalent of a perpetual motion machine, and it's sold by throwing contrived complexity at people and hoping they don't think it through. Like perpetual motion or free energy machines, at the most minuscule scale there are exceptions and caveats (maybe short-haul wages in particular would rise, especially after accounting for the totality of labor economy changes), but those exceptions don't scale to a systems level. That doesn't stop con artists from selling their Rube Goldberg machines, though, knowing the vast majority of people won't think it through.

What the rhetoric is trying to do is bolster support for a livable wage through radical policy changes by drumming up anti-corporate sentiment. It's in service of a normative argument (a "livable wage" is a reasonable social ask, IMO, notwithstanding its amorphous nature), but disguised as a scientific argument that can only result in failure by setting wrong expectations about how markets and policy operate, ultimately reinforcing cynicism.


> There's value in choice, and that value is being reaped by the worker precisely because poverty programs make it possible.

It seems like you're ignoring the same thing you're objecting to: It's a dynamic system.

If long-haul trucking companies offer less desirable but higher paying jobs and easier jobs aren't paying a living wage then people would pick the harder job that lets them not starve. Which means the easier jobs would have to pay more in order to attract workers, unless those workers can get government assistance. If they can, the easier jobs can get people to work without paying more, because the assistance programs let them pick the easier job even at lower pay. In other words, the subsidies were supposed to go to the poor and instead they went to the lower-paying employers.

In a dynamic system the long-haul companies would then have to respond if it became more desirable to work somewhere the pay is low enough to get government assistance, but the phase outs give the low-paying employers another advantage.

Say the undesirability of the job is good for $15k/year in additional compensation. However, if you got paid $15k more, you'd lose $10k to government benefit phase outs and additional taxes. To actually get paid $15k more, you'd have to "get paid" $45k more. Which is to say, the employer with the low-paying job can pay you $45k less.
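The arithmetic above can be sketched directly; the $15k and $10k figures come from the hypothetical in the paragraph, not from any actual benefit schedule:

```python
# With phase-outs and taxes taking $10k of every additional $15k,
# the effective marginal rate is 2/3 and the worker keeps 1/3.
marginal_loss = 10_000 / 15_000   # ~0.667 effective marginal rate
keep_rate = 1 - marginal_loss     # ~1/3 of each gross dollar kept

# To net the $15k premium the harder job is worth, gross pay must rise by:
target_net = 15_000
required_gross = target_net / keep_rate
print(round(required_gross))  # 45000
```

Which is the gap the lower-paying employer gets to pocket: they can offer $45k less in gross pay while leaving the worker's net position only $15k worse.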

But it's a dynamic system, so they might "only" pay you $35k less and then hire more people. The trucking companies would then have to pay $45k more than them when it used to be $15k. Even with Walmart paying less than before, their relative advantage has increased. And there are two ways to get something a long distance over land: A long-haul truck the whole way, or a short-haul truck to the rail yard, a freight train, and then another short-haul truck. So then instead of a truck driver getting higher pay per mile over 2000 miles of driving, a different one gets lower pay per mile over 60 miles of driving twice, and a rail company gets the rest.

So the low-wage subsidies cause the amount of higher-wage labor demand to go down by making it less competitive with non-labor alternatives to perform the same function, as labor is diverted to the lower-paying jobs even while enabling them to pay even less.

> There are other people for whom short-haul trucking is the less desirable choice than what they're doing now, or who work fewer hours than they're doing now.

All of that is already baked into the existing numbers; the long-haul drivers get paid more because fewer people want to do it.

> Like perpetual motion or free energy machines, at the most miniscule scale there are exceptions and caveats, but those exceptions don't scale to a systems level.

Only they're not exceptions. If you subsidize something you get more of it. What happens if you subsidize low-paying jobs but not higher-paying jobs?


This seems like an overly complicated phrasing of the much simpler "are progressive taxes and social safety nets actually bad?"

Yes, all members of the state should benefit from state programs. There is little value in adding an exclusionary check to state programs.

Yes, but ideas exist like FairTax which directly address this issue in some fashion. It's easy to come up with reasons why something won't work; it is a lot harder to find solutions.

Embraer has been working on their auto takeoff system, E2TS, for some time. While improved safety during a critical phase of flight is a goal, airlines are looking at the possibility that it allows increased performance (higher MTOW, shorter runways, less fuel burn).


PCIe already allows DMA between peers on the bus, but, as you pointed out, the traces for the lanes have to terminate somewhere. However, it doesn't have to be the CPU (which contains the PCIe root complex in modern systems) handling the traffic - a PCIe switch may be used to facilitate DMA between devices attached to it, if it supports routing DMA traffic directly.


That’s what happened in TFA.


You're right. Let me correct myself: a hobbyist-friendly hardware solution. Dolphin's PCIe switches cost more than eight RTX 3090s on a Threadripper machine.


Jeff forgot to mention that in his post!


> where allocating using malloc(3) in one DLL then freeing it with free(3) in another being a crash.

This can still happen all the time on UNIX systems. glibc's malloc implementation is a fine general purpose allocator, but there are plenty of times where you want to bring in tcmalloc, jemalloc, etc. Of course, you hope that various libraries will resolve to your implementation of choice when the linker wires everything up, but they can opt not to just as easily.


No, actually, this doesn't happen the same way on modern Unix. The way symbol resolution works is just not the same. A library asking for an extern called "malloc" will get the same malloc. To use those other allocators, you would typically give them a different symbol name, or make the whole process use the new one.

A DLL import on Windows explicitly names the DLL it comes from. Some DLLs in a process might ask for a different version of the Visual Studio runtime, or one built with different threading settings, release vs. debug, etc. A C extern asking simply for the name "malloc", with no other detail, will resolve to whichever runtime that DLL imported - possibly incompatible with another DLL in the same process, even though from the compiler's perspective it's just extern void *malloc(size_t), with no other decoration or renaming of the symbol. There is a rarely used symbol versioning mechanism to accomplish something similar on a modern gcc/clang/ELF setup, but it's not the way anybody does this.

I would argue that the modern Unix way, with these limitations, is better, by the way. Maybe some older Unix in the early days of shared libraries, early 90s or so, tried what Windows does, I don't know. But it's not common today.


> No actually, this doesn't happen the same way on modern Unix. The way symbol resolution works is just not the same. A library asking for an extern called "malloc" will get the same malloc. To use those other allocators, you would typically give them a different symbol name, or make the whole process use the new one.

This is, yes, the behavior of both the ELF specification as well as the GNU linker.

I'm not here to get into semantics of symbol namespaces and resolution though, I can just as easily link a static jemalloc into an arbitrary ELF shared object and use it inside for every allocation and not give a damn about what the global malloc() symbol points to. There's a half dozen other ways I can have a local malloc() symbol as well instead of having the linker bind the global one.

Which is the entire point I'm trying to make. Is this a bigger problem on Windows versus UNIX-like platforms due to the way runtime linker support is handled? Yes. Is it entirely possible to have the same issue, however? Yes, absolutely.


In about 27 years of using Linux and BSD I don't think I've seen it once. If you work professionally in C on Windows it is a practical concern, an everyday occurrence.

Another absurdly common issue is passing a FILE * across a DLL boundary. It is highly unlikely to work. I used to have to train new hires not to do this and tell partner teams working on C APIs to include I/O abstractions that don't involve FILE *, which would elicit a response as if I were an alien.


This will work for any application compiled against the Universal CRT (UCRT), which has been the default for 10 years now.

https://learn.microsoft.com/en-us/cpp/windows/universal-crt-...


While they're not "the same", classic COM (or OLE? the whole history is a mess) did actually have ProgIDs, and WinRT introduces proper "classes" and namespaces (having given up global registration for everything but system-provided APIs) with proper "names" (you can even query them at runtime with IInspectable::GetRuntimeClassName).

Microsoft tried to do a lot with COM when they first released it, it wasn't just a solution for having a stable cross-language ABI, it was a way to share component libraries across multiple applications on a system, and a whole lot more.

> but that also wouldn't have flown when COM was invented and resources overall were much more scarce than they are today.

And this ultimately is the paradox of COM. There were good ideas, but given Microsoft's (mostly kept) promise of keeping old software working the bad ones have remained baked in.


What do you mean by "owned" strings?

WinRT, which is ultimately just an evolution of COM, has HSTRING which can own the data inside it (as well as contain a reference to an existing chunk of memory with fast-pass strings).

