They aren't breaking up NOAA just for the sake of privatization; reliable weather reporting also makes it harder to ignore climate change.
From Project 2025 "[NOAA offices] form a colossal operation that has become one of the main drivers of the climate change alarm industry and, as such, is harmful to future U.S. prosperity."
> main drivers of the climate change alarm industry
This is clearly dog-whistle language and not intended to be taken literally, but it is starting to be a common trope, and it makes me very curious how this industry is supposed to operate. What's its main source of income, who benefits from it, and how? And what is its supposed goal?
Raising the alarm about a perceived threat could be a way to raise money for more research, which might indirectly benefit those scientists. But we haven't really seen a corresponding massive increase in scientists employed, and even if we had, they would have to find some way to siphon money from publicly funded research into their own private enterprises, because so far no one has suggested that we pay scientific researchers too much. The way to combat that would be to demand more transparency from universities, but they're already pretty good about that.
It also doesn't match very well with what those scientists are actually saying, which is mostly that the basic science has been indisputable for the past century and that what's needed now is action, not more research. Had climate scientists been siphoning public money through alarmist schemes, wouldn't they rather say that things are very dire but "don't touch! We need much more expensive research before we can give any concrete advice"?
I don't think the allegation is that the scientists themselves are who benefits.
Rather, I think the allegation is that it's those involved in renewable energy development schemes that result from the raised alarm, from product vendors to site developers to construction contractors to energy trading firms to... See also: politicians pushing Green New Deal type policies. The scientists are enablers, not the primary beneficiaries, at least as I understand the allegations.
That's a real issue with several real life examples, but not really related to the issue of climate change or climate alarmism, is it?
There have been plenty of circles around political interests that have lined their pockets in matters of alcohol and drug prevention, abuse prevention, and health care, but very few people take the logical next step of claiming that alcohol is actually good for you and that anyone who says otherwise should be labelled an alcohol alarmist. That reaction is pretty unique to climate research.
I seem to remember that there was an enormous backlash against CFC bans, with lots of talk about how they would lead to the spread of preventable illness and economic disaster, but it never reached anywhere near the same level of anti-scientific discourse we see today.
>This is clearly dog-whistle language and not intended to be taken literally, but it is starting to be a common trope and it makes me very curious as to how this industry operates?
It's clearly just dismissive language that doesn't have to make sense. The only purpose is to make climate change seem like a fake concept thought of by groups with nefarious intentions. It's like asking the author of a short story what the main character's favorite color is. That detail simply wasn't considered because it's not needed in the short story.
Gonna make one of those novelty clocks themed around the financial boom-bust cycle, where, to silence it, you have to get out of bed and literally kick a can.
The point is that this is a well-defined (no pun intended) behavior that exists within C, C++, Python, Ruby, and probably a handful more popular languages. This set constitutes something like 70%(?) of mainstream languages. Yet only JS gets shit thrown its way in this thread.
I somehow missed the comments on this post. I think I need to respond not just to this but to the other (completely valid) criticisms in the comments here. And because you bring up what I think is a more interesting problem, you're the comment I'm replying to.
The anger, I think, stewed for a long time, but it was there very nearly since the beginning of this project two years ago (I stopped working on it for quite a while).
First, my anger was directed at C++: std::map being effectively forced to be a red-black tree, for one. I had originally written a lot about that anger in the first draft. I had written about how, if this paper had baked in the oven a bit longer, they would've had C++11 to work with. I wanted to try to write a Modern C++ version of the code, but that goal was what kept me from touching the project for over a year. I had gotten sick of C++ as a whole, both at work and in my personal projects. And by the time I got back to it, the discussion on C++ no longer felt like it had a place in the post if it were to have the structure it ended up having.
On the rewrite, I found myself getting more and more frustrated by the original paper itself. The methodology felt flawed yet I felt I had to keep it. I said in the aside how the read of the paper gave this impression that they wanted to shill Go with this paper, like this was going to be its big debut as a language until it failed pretty much every test thrown at it. I don't know how this turned into Google Hate. Or at least, I don't remember. Maybe something about the current state of the world or an impression of Google company culture based on the tone of the paper. Maybe I just didn't want to be angry at Hundt specifically.
Much like the original paper, this post should've spent more time in the oven. But after spending so much time on it, it also felt like I just needed to get it out there so I could be done with it.
Someone else mentioned the acrylic; I'm going to mention Scarlett Sparks' Open Source Knitting Machine, if part of the fear is actually investing in the machine:
https://github.com/ScarlettSparks/KnittingMachine
Anecdote/Tangent: Roku TV wins an award for worst input switching UX. From Home, you have to go into Settings (the quick settings accessible from the remote aren't enough), then scroll down the Inputs menu and select your input. At least with LG, inputs are directly available from the Home menu.
Funny, I just commented that I like the Roku method. My inputs show up as tiles on the home screen alongside the other apps. You can show/hide the ones you want.
When I first got a Roku TV I had to uninstall a ton of auto-installed applications, which are treated just the same as your actual inputs on the home screen. Once I got that done, I later realized that removing an input device leaves its tile on the screen. You would think the OS could be smart enough to notice that the device hasn't been connected for a while and drop it off after some time. I find Roku TV kind of clunky.
Dropping access to disconnected inputs is extremely annoying. When connecting something new, you often want to change the TV to the input before powering on the device.
Because when you have a new device there might be boot messages that disappear faster than you can switch to the input after it detects a signal. So you want to be switched to that input before turning it on.
C++ has an absurd amount of UB, to the point where there's an entire thought exercise devoted to how malicious a compliant compiler can be (Hell++). There are things that are allowed and even common in C but UB in C++ (union type punning), and things that people might reasonably assume would work but are UB anyway (signed integer overflow). Then you have weird edge cases like binding the return value of a two-argument std::max call involving temporaries to a reference. There are so many UB footguns that no reasonable developer can be expected to keep track of them all.
When all hardware built for decades uses two's complement arithmetic and even the standards bodies have noticed this (e.g. https://www.open-std.org/jtc1/sc22/wg14/www/docs/n2218.htm) it's not remotely necessary to assume that overflowing a signed integer is undefined behaviour. It's totally defined exactly what that instruction is going to do on any hardware any electrical engineer is willing to build.
However, some benchmarks iterate on a signed integer, and assuming the loop terminates makes them slightly faster, so in order to retain that marginal advantage over other languages, signed iteration shall be assumed never to overflow.
That's not why it's UB. It's UB so that compilers are allowed to optimize x*2/2 down to just x. If you want overflow, you can use unsigned, which has had defined two's-complement-style wraparound semantics for quite some time.
> things that people reasonably can assume would work but are UB anyway (signed integer overflow)
No, they cannot. You don't have the right to make any assumption about integer overflow in C.
Most of the whining about UB is from people who still refuse to accept that you don't have the right to think about your processor family once you write in any language that is not assembly.
Cannot find any articles discussing Hell++. Is it something specific? The thought seems very interesting and I'd like to read about it if you have any links :-)
Thanks!
I'm not sure about Hell++, but DeathStation 9000 is a "common" jocular name for a hypothetical system that is as pathological as possible with respect to implementation-defined and undefined behavior.
Needed this post and this comment. And hopefully I'm not misreading either. Halfway through my greenfield redesign that I claim is catharsis because I was too scared of breaking things before they got better. I hope that I can put that redesign to rest and actually make progress with the original code.
Often when trying to improve a complex system, the best way to gain new insights and approaches is to just make a branch, or even a clean project, and try to redo a thing. Even if it doesn't end up in the project, you'll often get ideas and inspiration out of it.
Does reading a dozen papers on different algorithms to accomplish a specific goal count as scavenging? Asking for me, because I've gone down a rabbit hole with recent work. Not trying to be antagonistic or humblebrag either. Actually, I'm not sure what I'm trying to accomplish with this question.
I think most papers should effectively just be scavenged through. In (mathematics) grad school I’ve only had time to seriously study a single digit number of research papers, but I at least feel like I gained a lot from looking through many many more than that.
I don't think you're misunderstanding, it's a strange choice to make even if the example here loses context to make it easier to make the point. It wasn't code that anyone currently here wrote and because it worked with the old compiler, nobody really touched it.
I believe the nodes are pushed into the list before adding connections but I don't think that changes the point you're trying to make. That said, one's intuition is different from another's and I don't think it's unfair to assume that pushing into a vector of lists won't cause every existing list to be copied. For that to be the actual behavior is kind of disgusting.
It may make more sense to create pointers here but that's a larger change to deal with and ensure correctness versus just swapping the vector out for another list. I don't claim to like that solution but it seems to me like legacy C++ code in general is fragile enough as it is, the less I have to change to fix bugs the better.
> It may make more sense to create pointers here but that's a larger change to deal with and ensure correctness versus just swapping the vector out for another list. I don't claim to like that solution but it seems to me like legacy C++ code in general is fragile enough as it is,
That's more than fair enough. In such code, more often than not, you make what you think is a small change, and you end up with an entire cascade of unexpected side-effects that pop up because the original assumptions, and the testing scope of those assumptions, were lost. A can of worms best left unopened.
By the way, wouldn't such an error be something that gets detected by the "new" ASAN functionality that has been added to the newer MSVC toolchain you are using?
I mentioned it in a side note that I trimmed because there were so many that they spilled into the footer (faster to trim the article than to fix the CSS), but Microsoft's is the only implementation of the big three that doesn't mark the move constructor here as noexcept. The standard doesn't require it, so it's valid for MSVC to do things the way they do; it just creates problems like this, which would arguably be harder to track down if one had to build code for multiple platforms.
Right. My point is that 1) this is a quality-of-implementation issue in MSVC, 2) the standard should be phrased such that the MSVC implementation is illegal, and 3) the C++ standard library solves a lot more problems than it creates despite having warts like this and C++ having some unfortunate defaults (e.g. mutability by default).