We allow people to self-identify their religious beliefs. The government doesn't declare you to be a Catholic or a Buddhist and force your driver's license to say such.
Unless you're proposing making the military- and the rest of society- radically gender neutral, then as soon as a person interacts with a driver's license, passport, social media account, or bathroom, they will have to adopt a gender identity. Why should a third party's "gender beliefs" about a person be elevated above their own?
(I also can't see a justification for denying hormone replacement therapy treatment- it's cheap and ridiculously beneficial for mental health. But, for the same reasons, I advocate its availability over-the-counter.)
The architectural solution to gender neutral bathrooms is floor to ceiling panels and doors that actually close. It's quite simple and extremely common in newer buildings in Europe.
Driver's licenses and passports record sex, and bathrooms and other spaces that are separated for women and men are separated on the basis of sex.
Being an objective measurement of material biological state, sex isn't the same type of thing as a gender identity belief, which is indeed self-declared like a religion is.
A PIER plan is not Equal Protection. For the last few years, a PIER plan has been explicitly about Un-equal Protection in the service of achieving "equitable" outcomes.
How do you explain the un-equitable outcomes that we have been getting? The whole point of DEI is that there are (largely unconscious) biases that have been producing these un-equitable outcomes, and we need some conscious effort to get rid of them.
If truly being race-blind (or any other way of dividing people that is not strictly merit) is really your goal, and you are aware of unconscious bias, then what are people really objecting to?
Exactly. This [1] headline from the NYT pretty much sums up the issue: "To Make Orchestras More Diverse, End Blind Auditions."
Different groups of people have different strengths and different weaknesses. This isn't some sort of problem that needs fixing.
And if one does want to 'fix' it, you need to start way earlier than at, e.g., the hiring point. Want to beat that kid who's been playing violin for hours a day since he was 6? OK, then your target demographic is 6-year-olds, not professional orchestras.
I can't read the article, so I don't really know what they are arguing for there. The headline sounds a bit click-baity, so I don't want to guess what the content is.
But looking at it from a sex standpoint, blind auditions are responsible for a sweeping change in the ratio of men and women in professional orchestras (men still outnumber women 60% to 40%, but that is down from nearly 100% a while ago). This is absolutely a case to look at to demonstrate what unconscious biases are, and a wonderfully effective way of fighting them in one specific place.
The article is what you'd expect from the headline.
The change in gender differences is easily explained by more girls pursuing music from an earlier age. More people (as a ratio) from a group competently doing something means you end up with more highly successful outliers from that group.
Unconscious biases are well studied, and a good presenter can show it live on stage.
Please explain to me how 85% of Fortune 500 CEOs are white men. I don't believe for a moment that white men (myself included) are genetically better at being CEOs, or at performing any of the roles that lead up to those positions.
The unconscious bias of people in those positions choosing successors who look like themselves is a strong explanation for this, and studies have shown that in controlled situations this sort of thing happens all the time, even among those who want things to be color-blind.
You can create toy experiments to 'prove' just about anything - this is why social psychology has a replication rate in the 20% range. Though it's really much worse since that's only independently repeating the same experiments, not adversarially challenging hypotheses, at which point the entire field looks about as reliable as astrology. Quite appropriately since astrology, which was also studied as a 'science' for centuries, was highly influential on the founding fathers of modern psychology - Jung in particular.
And group differences are not only genetic. Why do you think that, for instance, 70% of American football or NBA players are black, while only 8% of baseball players are?
Go look at an average MBA classroom and you know what you'll overwhelmingly see? Pretty much what a sampling of CEOs looks like, especially once you account for performance.
An inequitable outcome is not proof of bias or discrimination. Ask a group of high school kids to solve a quadratic equation, and there won't be an equitable distribution of correct answers.
Unconscious biases can be eliminated through anonymization or adopting objective evaluation criteria. E.g. having an orchestra audition behind a veil is a way to eliminate bias. By contrast, setting quotas on the number of each gender or race achieves equity, but it does so through explicit discrimination.
Some DEI policies act like the veil; others work like quotas. The recent executive orders ban the latter, not the former.
The US Constitution protects the people from excessive acts of government, not the other way around. It says nothing about "enforcement mechanisms" which, should any actually be enacted by the legislature or ordered by the executive, are subject to constitutional challenge by the people before the judiciary (unless rendered moot by a superseding legislative act or executive order).
Our constitutional republic is not merely "just blots of ink on paper" but rather alive and well.
Rustup downloads toolchains from third-party (to the distro) repositories; distros do not want to be in a position where they can no longer build packages because of an external service going down.
So, if you are developing something you want to see packaged in distros, it needs to be buildable with the tool versions in the distro's repositories.
(Not just rustup- Debian requires repackaging Cargo dependencies so that the build can be conducted offline entirely from source packages.)
You’re answering a slightly different question but to me that’s a Debian packaging problem to solve. It’s weird to me that QEMU devs take this problem seriously enough to be putting in all sorts of workarounds to support old versions of the toolchain in the tip of tree just to privilege Debian support.
This feels more like a CI thing for the QEMU project and I’m sure solvable by using rustup or a trusted deb repo that makes the latest tool chain available on older Debian platforms.
As for Debian itself, it really should do a better job of backporting more recent toolchain versions (not just Rust), or at least making them available to be installed. Debian's current policy is really difficult to work with and causes downstream projects to do all sorts of workarounds to make Debian builds work (not just for Rust, by the way - this applies to C++ as well). And it's not like this is something Debian is unfamiliar with - you can install multiple JVM versions in parallel and choose a different default.
It's not about our own CI -- we could easily use rustup as part of setting up the CI environment, and I think we might actually be doing exactly that at the moment.
Lots of QEMU users use it through their downstream distros. We even recommend that if you're using QEMU in a way that you care about its security then you should use a distro QEMU, because the distros will provide you timely security fix updates. Sure, we could throw all that cooperation away and say "tough, you need to use up-to-the-minute rust, if that's a problem for distro packagers we don't care". But we want to be a good citizen in the traditional distro packaging world, as we have been up til now. Not every open source project will want or need to cater to that, but I think for us it matters.
That doesn't mean that we always do the thing that is simplest for distros (that would probably be "don't use Rust at all"); but it does mean that we take distro pain into account as a factor when we're weighing up tradeoffs about what we do.
To be clear. I’m not criticizing the position the QEMU project is in. I recognize you have to work with Debian here. I’m more frustrated that Debian has such a stranglehold on packaging decisions and it effectively refuses to experiment or innovate on that in any way.
Out of curiosity though, have you explored having your own deb repo instead? I would trust QEMU-delivered security fixes on mainline far more than the Debian maintainers to backport patches.
I think that trust would be somewhat misplaced -- QEMU has historically not made particularly timely security fixes either on mainline or on branches. To the extent that our stable-branch situation is better today than it was some years ago, that is entirely because the person who does the downstream Debian packaging stepped up to do a lot more backporting work and stable-branch maintenance and releases. (I'm very grateful for that effort -- I think it's good for the project to have those stable branch releases but I certainly don't have time myself to do that work.)
As an upstream project, we really don't want to be in the business of making, providing and supporting binary releases. We just don't have the volunteer effort available and willing to do that work. It's much easier for us to stick to making source releases, and delegate the job of providing binaries to our downstreams.
> QEMU has historically not made particularly timely security fixes either on mainline or on branches
> It's much easier for us to stick to making source releases, and delegate the job of providing binaries to our downstreams
Am I correct that this is essentially saying "we're going to do a snapshot of the software periodically but end users are responsible for applying patches that are maintained by other users as part of building"? Where do these security patches come from and how do non-Debian distros pick them up? Are Arch maintainers in constant contact with Debian maintainers for security issues to know to apply those patches & rebuild?
Security patches are usually developed by upstream devs and get applied to mainline fairly promptly[1], but you don't want to run head-of-git in production. If you run a distro QEMU then the distro maintainers backport security fixes to whatever QEMU they're currently shipping and produce new packages. None of this is particularly QEMU specific. There's a whole infrastructure of security mailing lists and disclosure policies for people to tell distros about security bugs and patches, so if you're a distro you're going to be in contact with that and can get a headsup before public disclosure.
[1] and also to stable branches, but not day-of-cve-announcement level of urgency.
Sure, but then why does the mainline branch need to worry about supporting the Rust that's bundled with the last stable Debian release? By definition that's not going into a distro (or the distro is building mainline with Rust's latest release anyway).
Is it a precautionary concern that backporting patches gets more complicated if the vuln is in Rust code?
But then again, Rust code isn't even compiled by default, so I guess I'm not sure why you're bothering to support old versions of the toolchain in mainline, at least this early in the development process. Certainly not a two-year-old toolchain.
We already make an exception in that we don't support Debian bullseye (which is supported by the rest of QEMU until the April 2025 release), but not supporting Debian stable at all seemed too much.
That said we will probably switch to Debian rustc-web soon, and bump the lower limit to 1.75 or so.
I think I'm missing why you need to require using the toolchain bundled with the last stable Debian release vs having devs just rustup the latest version of the toolchain (or via a PPA [1] or however else they want to install it).
The current approach basically guarantees that you're always targeting a ~2-4 year old version of the toolchain and that feels like a particularly weird maintenance burden given how many workarounds you're putting in to do so.
Because it's not about devs, it's about distro packagers. They are an extremely important audience for QEMU. As pm215 said above, we don't want to make their lives unnecessarily harder. For example Debian has the backports repository, and CentOS Stream has QEMU and Rust updates done by different teams.
Anyhow starting April (August release) we will be able to target 1.75.0 while being consistent with QEMU's (C-targeted) distro support policies, which is not that bad. Maybe newer than that depending on what Ubuntu 22.04 does between now and August.
> I’m more frustrated that Debian has such a stranglehold on packaging decisions and it effectively refuses to experiment or innovate on that in any way.
What Debian has is not a "stranglehold" but an ideology, and Debian continues to matter to (some) upstream projects because lots of users identify with Debian's hyperconservative, noncommercial ideology.
Your complaint is basically, "it's too bad that the userbase not sharing my values is large enough to matter".
> What Debian has is not a "stranglehold" but an ideology, and Debian continues to matter to (some) upstream projects because lots of users identify with Debian's hyperconservative, noncommercial ideology.
> Your complaint is basically, "it's too bad that the userbase not sharing my values is large enough to matter".
Arch and rolling releases have about the same market share as Debian. Indeed, ironically, Debian's widespread adoption is seen primarily in the enterprise space, where its free-as-in-beer nature and peer adoption are a signal that it's a suitable free (as in beer) alternative to Red Hat. Without Ubuntu's popularity a while back making Debian not so crazy an idea, I think the "Debian" philosophy would not have anywhere near the adoption we see in commercial environments.
> I’m more frustrated that Debian has such a stranglehold on packaging decisions and it effectively refuses to experiment or innovate on that in any way.
Does Debian have a stranglehold? AFAIK every other distro does the same thing, and all of them for good reasons.
From what I've researched, Ubuntu and Arch have about equal desktop market-share penetration, with "other" and the Steam Deck being the main dominant categories. So, ignoring corp fleet deployments, I'd say Arch and NixOS have stolen quite a bit of market share from Debian-based systems in terms of end-user preference. But yes, Debian does still have a stranglehold, because corp $ are behind Debian-style deployments.
> As for Debian itself, for toolchains it really should do a better job back porting more recent versions of toolchains (not just Rust) or at least making them available to be installed.
Disagree. No stable release of Debian today supports C++23, which is coming up on two years old at this point (that corresponds to a major Rust edition, not the minor releases they publish on an ongoing basis every six weeks).
Java in Bookworm installs JDK 17 which is three years old at this point. Java itself is on 23 with 21 being an LTS release.
This means that upstream users intentionally maintain an old toolchain just to support Debian packaging or maintain their own deb repo with newer tools.
You're confusing cause and effect. People aren't migrating because Debian packaging lags so badly, not because there aren't improvements projects would otherwise love to use.
> Most toolchains don't have as much churn as Rust.
What churn? A release every six weeks? Unlike many other toolchains (I count Node.js and co. here), Rust only needs one toolchain, because the latest rustc is always able to compile older Rust code.
> latest rustc always able to compile older rust code.
That is not true. Adding any public method to any impl can cause existing working code not to compile, and Rust adds new methods to the stdlib all the time.
I don't believe this is correct. I have not once seen a new method being added to stdlib that causes code to not compile. And ABI concerns are a non-issue since Rust is statically linked.
The reason is that `get_or_insert_default` was added to the stdlib in 1.83, and takes a different number of arguments, so it clashes with the user-defined one here.
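To make the failure mode concrete, here's a minimal sketch; the extension trait and the call site below are made up for illustration, not the actual code from the report above. Rust's method resolution prefers inherent methods over trait methods, so once the standard library gains an inherent method with the same name, the call resolves to it and the argument counts no longer match.

```rust
// Illustrative only: a user-defined extension trait that happens to use the
// same method name the standard library later stabilized (1.83).
trait GetOrInsertDefaultExt<T> {
    fn get_or_insert_default(&mut self, fallback: T) -> &mut T;
}

impl<T> GetOrInsertDefaultExt<T> for Option<T> {
    fn get_or_insert_default(&mut self, fallback: T) -> &mut T {
        self.get_or_insert(fallback)
    }
}

fn main() {
    let mut x: Option<u32> = None;
    // On rustc < 1.83 this resolves to the trait method above. On >= 1.83,
    // the inherent Option::get_or_insert_default (which takes no arguments)
    // wins method resolution, so this line fails with "this method takes 0
    // arguments but 1 argument was supplied".
    let v = x.get_or_insert_default(7);
    println!("{v}");
}
```

The Rust API evolution guidelines explicitly allow this kind of stdlib addition, treating the occasional clash like this as acceptable breakage.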
For context: since a base64 character represents 6 bits, every block of three data bytes corresponds to a block of four base64-encoded characters. (8*3 == 24 == 6*4)
That means it's often convenient to process base64 data 4 characters at a time. (in the same way that it's often convenient to process hexadecimal data 2 characters at a time)
1) You use = to pad the encoded string to a multiple of 4 characters, adding zero, one, or two as needed to hit the next multiple-of-4.
So, "543210" becomes "543210==", "6543210" becomes "6543210=", and "76543210" doesn't need padding.
(You'll never need three = for padding, since one byte of data already needs at least two base64 characters)
2) Leftover bits should just be set to zero; the decoder can see that there aren't enough bits for a full byte and discard them.
3) In almost all modern cases, the padding isn't necessary; it's just convention. (A minimal encoding sketch follows below.)
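Putting points 1 and 2 together, here's a minimal hand-rolled sketch of how the final block gets encoded and padded (illustrative only; in practice you'd reach for an existing base64 library):

```rust
// Standard base64 alphabet.
const ALPHABET: &[u8; 64] =
    b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

/// Encode a final block of 1..=3 data bytes into exactly 4 output
/// characters, setting leftover bits to zero and padding with '='.
fn encode_last_block(block: &[u8]) -> String {
    assert!((1..=3).contains(&block.len()));
    // Pack up to 24 bits, left-aligned; unused low bits stay zero.
    let mut bits = 0u32;
    for (i, &b) in block.iter().enumerate() {
        bits |= (b as u32) << (16 - 8 * i);
    }
    // 1 byte -> 2 chars + "==", 2 bytes -> 3 chars + "=", 3 bytes -> 4 chars.
    let chars = block.len() + 1;
    (0..4)
        .map(|i| {
            if i < chars {
                ALPHABET[((bits >> (18 - 6 * i)) & 0x3f) as usize] as char
            } else {
                '='
            }
        })
        .collect()
}

fn main() {
    assert_eq!(encode_last_block(b"M"), "TQ==");
    assert_eq!(encode_last_block(b"Ma"), "TWE=");
    assert_eq!(encode_last_block(b"Man"), "TWFu");
}
```

The three asserts correspond to the one-, two-, and three-byte leftover cases: two, one, and zero = characters of padding respectively.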
If you want your function to block until the promise resolves, then the function calling it also has to block, and the function calling that has to block, and so forth.
At the top of the chain, this ultimately blocks the entire event loop (Javascript semantics are generally not concurrent), so no UI/network events can be processed until that promise resolves and the page/server is left non-responsive.
(And that's assuming you can somehow define clear semantics to run any Javascript code involved in resolving the promise; otherwise, you're deadlocked!)
It's not easy to divorce "culture war" from "problem solving", since the question of whether problems even _exist_ and need to be solved is frequently a culture-war issue.
For example, you can't pursue "energy modernization and decarbonization" in the USA without taking a side in the "does anthropogenic climate change exist?" culture war.
Any infrastructure plan, whether heavy-infrastructure or social-infrastructure, touches the culture-war questions of "should the government subsidize industry?" and "should the government subsidize the working class?"
Because much of the culture war stuff isn't substantive. It's theater. There aren't any numbers on it.
If you're concerned with, say, racial inequality - well then, good! Solve a practical problem by focusing your energy on poverty, which already selects for a racially diverse group.
End gerrymandering? That right there is the root of racial disparity in representation, and people hate it. Open primaries are the only thing Libertarians and Greens agree on - that should be an easy lift when 65% of America wants more options.
What Yang is saying is "Instead of 2 flavors of ice cream, how about 10!". Who the hell votes for only 2 flavors?
From the interwebs: "In the U.S., the money supply is influenced by supply and demand—and the actions of the Federal Reserve and commercial banks. ... More money flowing through the economy corresponds with lower interest rates, while less money available generates higher rates."
This macro lets you embed an entire folder of assets in your binary at compile time, to simplify distribution.
Taking the concept further, I could also imagine build macros that compile Typescript or SASS files at build time, or generate data structures from a Protocol Buffers definition file, or in general operations that ingest non-Rust source code and use tools outside the repository.
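Neither comment names the specific macro; assuming something like the `include_dir` crate for the asset-embedding case (added as a dependency), usage looks roughly like this. The `assets/` directory and `hello.txt` file are placeholder names, not anything from the post above.

```rust
// Rough sketch assuming the include_dir crate; assets/ and hello.txt are
// hypothetical names for whatever you actually embed.
use include_dir::{include_dir, Dir};

// Embeds every file under <crate root>/assets into the binary at compile time.
static ASSETS: Dir<'static> = include_dir!("$CARGO_MANIFEST_DIR/assets");

fn main() {
    let file = ASSETS.get_file("hello.txt").expect("asset missing");
    println!("{}", file.contents_utf8().unwrap());
}
```

Since the whole tree ends up inside the executable, there's nothing extra to ship alongside it, which is the distribution win being described.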
tl;dr: Hyperbeam is roughly a netcat tunnel that connects via a DHT topic instead of a network address; use it where you want a secure one-off tunnel (for example, to transfer a key for Wireguard).
Wireguard:
+ Can tunnel arbitrary IP traffic
~ Has stricter encryption, with full asymmetric keys (and optionally adding a symmetric key)
- Requires permissions to load a kernel module and configure the network stack
Hyperbeam:
+ Only needs userland UDP sockets, not a kernel module
~ Derives its keys from a passphrase, so does not require transferring a full cryptographic key between devices
- Is a single application-layer pipe, useful in shell workflows but not for transparently tunneling arbitrary applications