To be fair, there were a number of people chiming in who were claiming to be experienced in that area (ex-Navy captains, people who did dredging ops, etc.). One of the big things that changed the timeline was that the highest tide of the month occurred yesterday; if not, I imagine that ship would have been stuck for another month!
Being an expert in an adjacent domain is sometimes worse than being clueless: the ratio of actual to assumed expertise gets worse. Navy captains vs. commercial shipping, geologists vs. climate scientists, programmers vs. CPU design, etc. You can very easily miss the subtleties, comment on a thing, and then people listen to you.
You went a bit too far. I'd presume a lot of programmers do know CPU architecture well. While it's not common among those who work on boring web platforms, some still do. Also, most CPU architects would be decent programmers to begin with.
Programming has not changed all that much, and it was not so long ago that programmers routinely knew assembly and how many cycles (and bytes) each opcode took... Nowadays it might be regarded as an arcane art by most, of course.
> Programming has not changed all that much, and it was not so long ago that programmers routinely knew assembly and how many cycles (and bytes) each opcode took...
Most programmers on Apple platforms don't actually think about execution order -- because they don't have to -- but also because Apple is actively using Clang to discourage assembly and writing for specific CPU architectures. It makes Apple's job of releasing new silicon that much easier if they don't have to worry about breaking existing software custom written for a previous architecture.
And this still assumes a one-to-one relationship between the code you're writing and the computer it's running on or designed for. When you get to the cloud, or cloud functions, that breaks down even further. If using Heroku, for example, you don't even have to consider how to deploy your code and you can make it pretty far running a production service.
It's possible for closely related fields to still have very large differences. Consider drivers and cars: the more automation is introduced, the less we might need to know about what the automation is doing for us under the hood. Anti-lock Braking Systems (ABS) might be a simple example where folks know about it because there's a light on the dash and instructions in driver's ed. But if we didn't have those indicators, how often would anyone know about it and other such features? Some technologies remain undocumented until discovered later by experimentation; the VW diesels come to mind. Specific chip designers likely know more than your average programmer, just as specific car manufacturers likely know more about their products than drivers do.
This is quite a blatant assumption in its own right (and very far from the truth). Programming itself has not changed, but of course modern hardware is not a simple von Neumann machine. Writing a lock-free data structure is not that different from ordinary programming; it requires a lot more attention and (possibly) experience, but the basic premise is still the same.
Understanding memory topology/hierarchy and latency, concurrency, branch (mis)prediction, and cache coherency should be the minimum for anyone who comments on CPU architecture. I did mention assembly, and without some knowledge of the target architecture it's rather pointless to comment, either.
I encourage most developers to at least understand that memory is not actually 'random access', which makes a dereference far from cheap -- but accessing data placed together is next to free, as it is likely to hit L1.
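A rough way to see this for yourself -- sizes and stride here are arbitrary, and exact numbers depend entirely on the machine:

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    int main(void) {
        enum { N = 1 << 24, STRIDE = 4096 };
        int *a = malloc((size_t)N * sizeof *a);
        if (!a) return 1;
        for (int i = 0; i < N; i++) a[i] = i;   /* touch every page first */
        long long sum = 0;
        clock_t t0 = clock();
        for (int i = 0; i < N; i++)             /* sequential: prefetcher-friendly */
            sum += a[i];
        clock_t t1 = clock();
        for (int s = 0; s < STRIDE; s++)        /* same reads, cache-hostile order */
            for (int i = s; i < N; i += STRIDE)
                sum += a[i];
        clock_t t2 = clock();
        printf("sequential %.2fs  strided %.2fs  (sum=%lld)\n",
               (double)(t1 - t0) / CLOCKS_PER_SEC,
               (double)(t2 - t1) / CLOCKS_PER_SEC, sum);
        free(a);
        return 0;
    }

Both loops do exactly the same N additions; the second one just visits memory in an order the cache hates.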
> discourage assembly and writing for specific CPU architectures
I found out that I could not reliably beat a standard compiler writing everyday assembly around the K6-2 years. Yet some inner loops can still be carefully hand-optimized. The point is that there are plenty of programmers who would be able to understand modern architecture, and to me a basic understanding is needed unless the job is just gluing code together.
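Something like this is what I mean by hand-optimizing an inner loop -- a sketch with SSE2 intrinsics (x86 only, function name is mine, and a modern compiler will usually autovectorize the plain loop just as well, which is rather the point):

    #include <emmintrin.h>  /* SSE2 */
    #include <stddef.h>
    #include <stdio.h>

    float sum_sse2(const float *a, size_t n) {
        __m128 acc = _mm_setzero_ps();
        size_t i = 0;
        for (; i + 4 <= n; i += 4)              /* four floats per iteration */
            acc = _mm_add_ps(acc, _mm_loadu_ps(a + i));
        float lanes[4];
        _mm_storeu_ps(lanes, acc);
        float s = lanes[0] + lanes[1] + lanes[2] + lanes[3];
        for (; i < n; i++)                      /* leftover elements */
            s += a[i];
        return s;
    }

    int main(void) {
        float a[] = {1, 2, 3, 4, 5, 6, 7};
        printf("%f\n", sum_sse2(a, 7));         /* 28.000000 */
        return 0;
    }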
In all of those examples, it's possible the person DOES have a good understanding of the adjacent domain. And in all of them, it's possible they will miss some subtleties, but people will still give their opinions a lot of weight.
Just as an example I see a lot: branch prediction. Some programmers don't know about it at all. Many do know about it, but think it still works something like "assume the branch will go the same way it did last time", which is how it worked in the 1990s. Then it evolved, and then it evolved two more times. Today there is something like a neural network that learns how the branches will go. (And careful, I'm a programmer, so I may be communicating some subtleties wrong here!)
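To make the effect concrete, here's the classic demo (sizes and the threshold are arbitrary; uncomment the qsort and the same loop runs much faster because the branch becomes predictable):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    static int cmp(const void *a, const void *b) {
        int x = *(const int *)a, y = *(const int *)b;
        return (x > y) - (x < y);
    }

    int main(void) {
        enum { N = 1 << 20 };
        static int data[N];
        for (int i = 0; i < N; i++) data[i] = rand() % 256;
        /* qsort(data, N, sizeof data[0], cmp); */  /* uncomment: predictable branch */
        (void)cmp;                                  /* unused while qsort is commented out */
        long long sum = 0;
        clock_t t0 = clock();
        for (int rep = 0; rep < 100; rep++)
            for (int i = 0; i < N; i++)
                if (data[i] >= 128)                 /* ~50/50 and patternless on random data */
                    sum += data[i];
        printf("sum=%lld  time=%.2fs\n", sum,
               (double)(clock() - t0) / CLOCKS_PER_SEC);
        return 0;
    }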
>Today there is something like a neural network that learns how the branches will go.
More like branch history, and where the call comes from.
Oddly enough, the price of a branch misprediction has become lower, as not the entire pipeline needs to be thrown away, but also because hyper-threading takes up the slack.
Side note: with the 'recent' development of Spectre, one would think branch prediction got into the limelight. Truth be told, though, not many would be able to write a constant-time 'fizz buzz' (try it on your own; bonus points for a constant-time int->string conversion).
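For anyone curious, the core trick is branchless selection via masks -- a minimal sketch (names are mine, and this is only the easy half; the constant-time int->string conversion is where it gets painful):

    #include <stdint.h>
    #include <stdio.h>

    /* x == 0 -> all-ones mask; x != 0 -> zero. No data-dependent branch. */
    uint32_t ct_is_zero(uint32_t x) {
        return (uint32_t)(((uint64_t)x - 1) >> 32);
    }

    /* Pick a or b depending on mask (all-ones or all-zero), branch-free. */
    uint32_t ct_select(uint32_t mask, uint32_t a, uint32_t b) {
        return (a & mask) | (b & ~mask);
    }

    /* One fizzbuzz step classified without branching:
       0 = number, 1 = fizz, 2 = buzz, 3 = fizzbuzz.
       (The modulo itself compiles branch-free on typical targets.) */
    uint32_t fizzbuzz_class(uint32_t n) {
        uint32_t fizz = ct_is_zero(n % 3) & 1;
        uint32_t buzz = ct_is_zero(n % 5) & 1;
        return fizz | (buzz << 1);
    }

    int main(void) {
        for (uint32_t n = 1; n <= 15; n++)
            printf("%u -> class %u\n", n, fizzbuzz_class(n));
        return 0;
    }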
And the aging of their experience matters. I mentioned upthread that I was trained as a merchant marine officer. However that was three decades ago and while a lot of my training will still apply, industry practices move on and a lot of the stuff I learned is long since outdated. A lot of times I start to type a reply to something relevant and have to smack myself into remembering that things are probably done differently in 2021 :-)
>Of course each meter of depth is a little harder, but not that much harder.
lol - cut and fill on a slope is not trivial. The amount of material you have to remove grows much faster than linearly with depth (roughly quadratically for a sloped cut, since the sides widen as you go down). They dodged a HUGE bullet with the highest tides happening this weekend. If they had gone beyond Tuesday, with the tides dropping each day as the moon got further away, it would have been sketchy whether they could have gotten ahead of the tide or not.
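Back-of-the-envelope, with made-up width and slope numbers, on why each extra meter costs disproportionately more:

    #include <stdio.h>
    #include <math.h>

    /* Trapezoidal cut: bottom width w, sides at angle theta from horizontal.
       Cross-section = w*d + d*d/tan(theta), i.e. quadratic in depth d. */
    int main(void) {
        const double pi = acos(-1.0);
        double w = 60.0;                     /* bottom width in m (hypothetical) */
        double theta = 30.0 * pi / 180.0;    /* side slope angle (hypothetical)  */
        for (int d = 1; d <= 5; d++)
            printf("depth %d m -> cross-section %.0f m^2 per m of channel\n",
                   d, w * d + (double)d * d / tan(theta));
        return 0;
    }

Each meter of depth adds more than the one before, because the sloped sides have to be cut back further every time.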
The timing of this couldn’t have been tighter. Thankfully they came out on the good side :)
Yeah, but on a ship this size that's 400 x 60 x 0.2 = 4,800 tons of difference! I'm somewhat exaggerating because the ship's hull isn't a cuboid, but it is still likely equivalent to removing around a hundred containers.
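For the skeptical, the arithmetic (hull treated as a box, as noted):

    #include <stdio.h>

    int main(void) {
        double length = 400.0, beam = 60.0;  /* rough Ever Given dimensions, m */
        double tide   = 0.2;                 /* extra water depth, m           */
        double rho    = 1.025;               /* seawater, tons per m^3         */
        printf("extra buoyancy: ~%.0f tons\n", length * beam * tide * rho);
        /* ~4900 tons; the 4800 above assumes fresh water and a box hull */
        return 0;
    }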
Only a portion of it was grounded, not the full length of the ship.
Someone linked a BBC article stating that they shifted 27,000 cubic meters of sand, so there you go: that's roughly a meter under the whole footprint (400 m x 60 m ≈ 24,000 m³), so they could likely remove a meter under the whole thing in a similar amount of time (probably longer, since covering the whole area is different from digging down in one spot, but that isn't what they would need to do anyway).
Yes, the high tide is a better opportunity to do it, but the tides over the next couple of weeks are still within 20-30 cm of it. The worst day in the next 30 days is only 60 cm below the highest.
But maybe the dredge only made a few centimeters of difference running for 5 days, who knows.