I never understood the "git cli sucks" thing until I used jj. The thing is, git's great, but it was also grown, over time, and that means that there's some amount of incoherence.
Furthermore, it's a leaky abstraction: some commands only make sense if you grok the underlying model. See the perennial complaints about how 'git checkout' does more than one thing. It doesn't, but you only see that if you understand the underlying model; from a workflow perspective, it feels inconsistent. Hence newer commands (like git switch) speak to the workflow, not to the model.
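To make the "does more than one thing" complaint concrete, this is roughly how the same command straddles two workflows, next to the newer split (the branch and file names here are made up):

    # one command, two jobs
    git checkout feature-x          # move HEAD to another branch
    git checkout -- src/main.c      # discard local changes to a file

    # the newer, workflow-oriented commands
    git switch feature-x            # only switches branches
    git restore src/main.c          # only restores file contents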
Some features also just feel tacked on. Take stashing, for example: stashes are pseudo-commits that exist outside of your real commit graph. As a feature, it doesn't feel well integrated into the rest of git.
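You can see the pseudo-commit nature directly if you're curious; something like:

    git stash                          # squirrel away the working tree
    git log --oneline -1 refs/stash    # it's stored as a real commit object...
    git stash list                     # ...but in its own ref, outside your branch history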
Rebasing is continually re-applying `git am`. This is elegant in a UNIXy way, but annoying in a usability way. It's slow, because it goes through the filesystem to do its job, and it forces you to deal with conflicts right away, because git has no way of modelling conflicts in its data model.
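Spelled out by hand, the old 'am' backend of rebase amounts to roughly this (branch names and the patch directory are made up, and newer git defaults to the merge backend instead):

    git format-patch main..topic -o /tmp/patches   # turn each commit into a mail-style patch
    git checkout -b topic-rebased main             # start over from the new base
    git am /tmp/patches/*.patch                    # re-apply the patches one by one, through the working tree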
Basically, git's underlying model is great, but not perfect, and its CLI was grown, not designed. As such, it has weird rough edges. That doesn't mean it's a bad tool; it's a pretty darn good one. But you can say so while acknowledging that it has shortcomings.
It's more than that; it's also git's incredibly unfriendly way of naming things.
Take for example the "index" which is actually a useful thing with a bad name. Most tutorials start by explaining that the index is a staging area on which you craft your commit. Then why is it called index and not staging area? Incredibly bad name right there from the get go. If you ask what the word "index" means in computer science, people usually think of indices into an array, or something like a search index that enables faster searching. Git's index doesn't do any of that.
And git's model leaks so much implementation detail that many people mistake these details for essential concepts; there are people who will tell you that any version control system without an "index" is not worth using, because it supposedly doesn't let you craft beautiful commits. That's patently false, as shown by jj and hg. A useful concept with a bad name becomes one amorphous thing that people cannot see past.
For hobbyists that's enough; for engineers it's often okay (I find myself in that situation); but for scientists "good enough" means nothing.
Optical metrology relies on accurate equations for how a physical object maps to the image plane, so in that case analytical solutions are necessary for subpixel accuracy.
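For concreteness, the kind of analytical model in question is the standard pinhole projection (in homogeneous coordinates, with K the intrinsic matrix and [R|t] the camera pose):

    s [u, v, 1]^T = K [R | t] [X, Y, Z, 1]^T

Subpixel accuracy then comes down to how precisely you can calibrate and invert that mapping, plus lens distortion terms on top.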
I'm worried about how often kids these days discount precise mathematical models across the board. Sure, you get there most of the time, but ignore foundational math and physics at your own peril.
I don’t discount foundational math. I do a lot of DSP, and many things in audio can be very elegantly solved with math.
The point I was trying to make is that edge detectors and feature descriptors like SIFT and ORB were presented as nice mathematical solutions when in fact they just throw some intuitively helpful math at an ill-defined problem with an unknown underlying probability distribution. For these problems, NNs just perform much better, and the idea that handcrafted feature descriptors rest on a solid mathematical foundation is just false. They are mathematical tricks that approximate an unknown model.
I wonder how many "traditional" engineers are Mac coders. I work at HP, so there aren't many Macs around, but in our R&D, and especially in manufacturing and test & measurement, Mac-only would be a problem.
Everyone here seems to think GDB is awful. It's been a while for me, but I remember using front-ends like Eclipse's CDT or similar and didn't find that experience so bad. Do most people use GDB straight-up? I haven't done that in probably 15 years, although it's nice to have a lightweight command line on small embedded systems.
> I remember using front-ends like Eclipse's CDT or similar and didn't find that experience so bad. Do most people use GDB straight-up?
The issue is not with gdb's UX; you can build as many cool UIs as you want on top of it. The problem is that gdb will sometimes take 5 minutes to start debugging (and that's without debuginfod), straight up crash, be unable to resolve obvious symbols, etc., which are all back-end bugs. For me, developing a medium-sized C++ app, it's really hell to use, and printf debugging is usually MUCH faster, in the sense that I have time to find and fix my problem through an iterative recompile cycle sometimes before gdb has even finished parsing my binary.
Said binary is modified by me every time I change something while building (and I used to build with gdb-index, but recently this caused crashes with another build flag so I had to disable it... though maybe it's fixed in the latest gdb release?)
I do use GDB straight up for native programs and microcontrollers, mostly out of laziness. I haven't set up VS Code debugging at my current day job yet and it's been 6 months.
Out of the box, GDB is kinda awful, especially for C++ codebases. I should probably look into scripting it at some point, but meh. Even then, as far as I know I'm the best at using a debugger at work by a wide margin, but I attribute that more to my knowledge of low-level programming than to my ability to use anything beyond the basic GDB commands. Also, it tends to crash once a month or so.
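For what it's worth, even without scripting, a few stock settings take some of the edge off the defaults; these are plain GDB options, nothing exotic:

    # ~/.gdbinit
    set print pretty on      # indent and structure printed aggregates
    set print object on      # print the actual derived type of C++ objects, not the declared one
    set pagination off       # no more "--Type <RET> for more--" prompts
    set history save on      # keep command history across sessions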
If I needed to debug a userland program on a small Linux embedded system, I'd probably whip out gdbserver and attach gdb to the target remotely. I haven't done that in a while though.
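The setup is roughly this (binary name, port, and target IP are made up):

    # on the embedded target
    gdbserver :2345 ./myapp

    # on the development host
    gdb ./myapp
    (gdb) target remote 192.168.0.42:2345
    (gdb) break main
    (gdb) continue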
Oh yea, we used both SoftICE and Periscope. I loved the pushbutton to fire off an NMI directly on the ISA bus. It was sad when your device driver crashed but it was a fun experience to push that button.
I only used it a few times back then, but it was impressive (my previous experience was with Borland's Turbo debuggers, which were also nice, but NuMega was just in another class).
I grew up in Germany where iodized salt and fluoridated water aren't the norm. Now I have bad teeth and can't wrap my head around the latest changes in the C++ language...
That’s not true. Iodized salt is the norm in Germany. I couldn't find data more recent than 2018 but according to this paper, "iodized and fluoridated salt has a market share of 80-90% among salt sold to private households": https://jlupub.ub.uni-giessen.de/items/fcd6e613-49a9-414b-a4...
Iodized salt is very much the norm in Germany. You can buy the non-iodized version, which is better for bread baking, but most everyone I know has iodized salt at home.
Also doesn't water fluoridation cost a few IQ points? In the end I could imagine the effects cancelling each other out.
> Also doesn't water fluoridation cost a few IQ points?
No, it does not. This is speculation based on a poor understanding of the actual science.
Most of our fluoride exposure comes from eating normal food; water fluoridation is a small fraction of that. There are no measurable cognitive effects in the many developed parts of the world with natural fluoride levels far higher than those used in municipal fluoridation. Furthermore, there is no plausible mechanism of action for how this would cause a cognitive deficit. Fluoride toxicity is well understood because it has an unusually simple biochemical mechanism. Therefore it isn't surprising that the handful of low-quality studies that show a weak relationship to IQ loss don't replicate.
As someone who actually worked on fluorine chemistry, it is disappointing to see how credulous even many people with a STEM background are on this topic. The absence of a plausible mechanism of action alone should raise serious questions.
This is a fair question. It is based on the chemistry of fluorine.
Fluorine has the strongest electronegativity of any element in the periodic table. It requires extreme measures to muscle fluorine off a molecule. This is why it is used in non-stick surfaces like Teflon (nothing can “grip” the surface molecules because fluorine won’t let it) and why it is used in toothpaste (molecules that might attack the tooth surface chemically can’t compete with the fluoride that is already there). The dark side of this is that it is difficult to contain fluorine compounds; they have a tendency to attack most containers you can put them in that aren’t also fluorine based. Famously, they tend to eat glass, so you can’t store them in glass vessels.
The toxicity of fluorine flows from this. It has an insatiable appetite for Group II metals, notably the calcium and magnesium in the human body. If you are exposed to fluorine, it will go on a seek-and-destroy mission for these metal ions. A typical human body has a lot of calcium and magnesium circulating, so it can absorb exposure from diet, water, etc. The net effect is that some calcium and magnesium is removed from circulation and is no longer bio-available. Not a big deal. In extreme exposure cases, like an industrial accident, the way it kills you isn’t toxicity per se but the removal of all the calcium ions from your system. Your heart uses calcium ions for electrical signaling, so if those are all neutralized by ravenous fluorine, your heart stops.
The antidote for extreme fluorine exposure is to ingest a bunch of simple calcium and magnesium salts. The fluorine latches on to the surplus floating around and there is enough left for your heart to keep running.
This is where the mechanism-of-action question comes in. For fluorine to have biological effects on cognition, the body would have to be so devoid of neutralizing calcium and magnesium ions, which it strongly prefers as a matter of physics, that you’d already be dead. In extreme exposure cases (like getting a concentrated fluorine compound spilled on you) treated with the prophylactic calcium/magnesium antidote, it does really nasty damage to the bones, but there has never been a case of cognitive damage that I’ve seen mentioned in the safety literature.
Fluorine is a nasty element, and I don’t miss working with it, but it isn’t a serious threat in trace quantities because human bodies can easily absorb the loss of calcium and magnesium. Human bodies are tolerant of almost all elemental toxins at natural levels. The few for which there is no evidence of tolerance at even trace levels are elements like mercury. Even elements like arsenic and lead are believed to be required by human biology to some extent, and therefore the human body has some evolved tolerance for them. (These two are pub quiz material; most people are shocked to hear they might be necessary micronutrients.)
I appreciate all that, but I think I might not have been clear with my question.
If our understanding is complete (in other words, there is nothing else to know about fluorine), then not having a known mechanism of action would be a good proof. But when is our understanding ever complete about anything?
It seems a lot easier to know things like "in the presence of x, fluorine does y" because you can easily observe that isolated thing under test. It's harder to know "fluorine does not do x in any circumstance" because the tests for that are infinite. How do we know we just haven't tested the right case yet?
Again, I'm a layman, so I can only try to logic my way through this. I acknowledge that it may be a dumb question.
Also, even if it is beneficial, which I do not think is true, it doesn’t need to be added to water to help teeth. It’s in most toothpaste and should be a topical treatment, not ingested.
The combination of fluoridated water, toothpaste, and other sources can lead to levels that can cause problems, but it's not as simple as "fluoridated water bad".
"The NTP monograph concluded, with moderate confidence, that higher levels of fluoride exposure, such as drinking water containing more than 1.5 milligrams of fluoride per liter, are associated with lower IQ in children. "
So maybe not exactly "water fluoridation cost a few IQ points" in a broad sense, but close enough.
> such as drinking water containing more than 1.5 milligrams of fluoride per liter
> The PHS panel that provided the recommendation considered all sources of fluoride intake and recommended 0.7 mg/L as the concentration that maximizes fluoride's oral health benefits while minimizing potential harms, such as dental fluorosis.
1.5 mg/L was where effects could possibly start to be detected. That's over twice the recommended concentration.
> It is important to note that there were insufficient data to determine if the low fluoride level of 0.7 mg/L currently recommended for U.S. community water supplies has a negative effect on children’s IQ.
Any landlocked country without access to ocean water for irrigation and drinking. Heck, even the American Midwest is notorious for having little iodine in its soil.
Taylor KW, Eftim SE, Sibrizzi CA, et al. Fluoride Exposure and Children’s IQ Scores: A Systematic Review and Meta-Analysis. JAMA Pediatr. Published online January 06, 2025. doi:10.1001/jamapediatrics.2024.5542