With SONAR, the transmitted sound is received back at the same point, which is what allows the measurement. This generalizes that method to a source being captured by a sink at any point in a predefined 3D space. I'd say it's far more flexible as an emerging technology than sticking with something like SONAR, which is primarily used for underwater applications.
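To make the source/sink idea concrete, here's a toy sketch of time-of-flight localization in 3D. Everything in it (the sink positions, the 343 m/s speed of sound, the `locate` helper) is a made-up illustration, not anything from the original comment: each receiver's arrival time gives a distance, and subtracting the resulting sphere equations pairwise turns the problem into a small linear system.

```python
# Toy 3D time-of-flight localization sketch (illustrative only).
# Four receivers ("sinks") at known positions hear a ping from an
# unknown source; each arrival time t_i gives a distance d_i = c * t_i.
import numpy as np

C = 343.0  # assumed speed of sound in air, m/s

def locate(sinks, times, c=C):
    """Estimate the source position from arrival times at >= 4 sinks."""
    p = np.asarray(sinks, dtype=float)
    d = c * np.asarray(times, dtype=float)
    # |x - p_i|^2 = d_i^2 minus the i = 0 equation gives, for each i > 0,
    # a linear equation 2 (p_i - p_0) . x = d_0^2 - d_i^2 + |p_i|^2 - |p_0|^2.
    A = 2.0 * (p[1:] - p[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2))
    return np.linalg.lstsq(A, b, rcond=None)[0]

# Synthetic check: place a source, simulate arrival times, recover it.
sinks = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (0, 0, 10)]
source = np.array([3.0, 4.0, 5.0])
times = [np.linalg.norm(source - np.array(s)) / C for s in sinks]
print(locate(sinks, times))  # approximately [3. 4. 5.]
```

Real systems would have to cope with noisy timestamps and unsynchronized clocks (hence TDoA methods), but the geometry is the same.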
The V sign with the palm facing the signer means something like 'fuck off' in Britain. All sorts of quite likely spurious history about archers insulting the French at Agincourt gets cited as a kind of etymology.
With the palm facing the viewer it now means peace, but it's rare to see it referred to as a V sign in the UK with this meaning.
In WWII Churchill used it to mean victory, not peace, but he apparently started out using the insulting version and had to have the difference explained to him. Not sure if that's true or just apocryphal.
There are many photos of Churchill performing the "incorrect" version, so there's at least a grain of truth in there.
No doubt the posh school he went to didn't have the grubby little oiks running round the playground flicking the V's everywhere. Or grubby little oiks for that matter. :)
I never even knew that it appears to be a peace sign in the US. In my (central European) country it is the victory sign for everyone I have talked to about it.
The "victory" meaning comes from WWII, not Pokemon.
The hippie counter-culture movement of the 1960s co-opted it as the "peace" sign, but nowadays I'd argue it doesn't have much of a meaning; it's just "something people do for photos" (especially in East Asia).
In no way am I qualified to talk about this with any sort of certainty, but I've been studying EE for the past two years and live with a physicist who is far better read on this topic. Classical computers are nice because they're predictable: if you order some product online, it wouldn't be very helpful if the bit stream coming into your computer were fuzzy and muddled, making it hard to tell a 0 and a 1 apart.

Quantum computing takes advantage of quantum properties such as the uncertainty principle and entanglement to hold a bit in a superposition of 0 and 1 until we "observe" (measure) it; these are quantum bits, or qubits. This allows certain algorithms and experiments to be run much faster than on any classical computer, since a register of qubits can represent far more states at once than the same number of classical bits. To minimize the fuzz (decoherence) in a quantum computer, they freeze the sh*t out of it to keep thermal energy at a minimum, along with using materials that excel at conducting heat away (like diamond).

I believe the spin of an electron is one way to realize a qubit, and when combined with entanglement, measuring one electron affects the others it's entangled with. My little knowledge ends here, but with that thought on entanglement you can see how correlated qubits allow more information to be handled with the same amount of "stuff" than in a classical computer.
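The superposition-until-measured idea above can be sketched numerically. This is a toy state-vector simulation, not anything to do with real hardware: a qubit is just a 2-vector of amplitudes, and measuring it yields 0 or 1 with probabilities given by the squared amplitudes (the Born rule). The names `ZERO`, `H`, and `measure` are my own illustrative choices.

```python
# Toy single-qubit sketch: a qubit is a 2-vector of amplitudes, not a
# definite 0 or 1; measurement samples an outcome with probabilities
# equal to the squared amplitudes (Born rule).
import numpy as np

rng = np.random.default_rng(0)

ZERO = np.array([1.0, 0.0])                    # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

def measure(state, shots=10_000):
    """Sample measurement outcomes from a qubit state vector."""
    probs = np.abs(state) ** 2
    return rng.choice([0, 1], size=shots, p=probs)

plus = H @ ZERO          # equal superposition of |0> and |1>
outcomes = measure(plus)
print(outcomes.mean())   # roughly 0.5: half the shots read 0, half read 1
```

Before measurement the state is genuinely both amplitudes at once; only the sampling step forces a definite 0 or 1, which is the "fuzziness" the comment is gesturing at.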
I do understand where the confusion stems from, but taking the time to properly learn the difference between Leibniz and Lagrange notation is very important. It makes a lot of sense to use one versus the other in specific cases. You might protest that there are certain cases where the differential operator is treated like a fraction, but I'd call that not only a rarity but also a cheat relative to what's really happening (the chain rule comes to mind). I do think it's cool to think of analogies, including notation changes, for students in early calculus classes though.
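The chain rule mentioned above is the classic case where Leibniz notation merely looks like fraction cancellation; here's the comparison spelled out (a standard textbook identity, with sin and x² as my own example):

```latex
% Leibniz notation makes the chain rule look like cancellation:
\frac{dy}{dx} = \frac{dy}{du} \cdot \frac{du}{dx}
% e.g. y = \sin u,\ u = x^2:
%   \frac{dy}{du} = \cos u,\quad \frac{du}{dx} = 2x,
%   \text{so } \frac{dy}{dx} = 2x \cos(x^2).
% Lagrange notation states the same fact with nothing to "cancel":
(f \circ g)'(x) = f'(g(x)) \, g'(x)
```

The "cancellation" works here because of how the chain rule is proved, not because dy/dx is literally a quotient, which is the "cheat" the comment refers to.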
Fixing the notation would be great, saving numerous future generations from confusion.
Getting agreement to do this is difficult. Short of laying it down in law as part of a treaty (throw it in with the WTO or Berne copyright?), there doesn't seem to be a way to make the change happen. Nobody wants to buy the calculus book with weird non-standard notation. It would be like a trigonometry book using the symbols Feynman invented in high school.
We're more likely to get serious mathematicians calling equations "math sentences".
I am in favor of notational changes, but calling this a 'flaw' is simply wrong, and the teacher profiled should correct the journal that allowed his idea to be misrepresented.