I mean, you certainly should learn arithmetic before using a calculator, right? But a calculator is more of a library than a framework, anyway, so your attempt at proof-by-analogy isn't at all resonating with me.
(edit: I think I misread the comment I replied to and am now in the process of maybe figuring that out downthread.)
Calculators do a lot more than arithmetic these days. Without knowledge of calculus, most people think of a calculator as a magic “black box” that outputs square roots and cosines.
FWIW, I am willing to believe that I have misunderstood the comment I responded to, and that they are not sarcastically arguing that people shouldn't have to learn calculus merely to use a tool like a calculator, but rather that they genuinely should, and I'd actually be willing to get on board with that? (But, as anti-framework as I am, I think that might be going a bit far.)
I frankly still can't tell, as that use of "fallacy" to describe "people picking up frameworks without fundamental understanding" is apparently extremely confusing to me, since it isn't in the form of a fallacy--like, is the fallacy that this happens? That this is a problem? Is it the thought process that leads people to do that? etc.--so I automatically added stuff to the sentence to make it work, and maybe I did it wrong.
Roots were being calculated at least as far back as 1800 BC, thousands of years before calculus existed. Trig functions were indirectly messed with from at least around 300 BC and were definitely in use among Indian mathematicians by around 600 AD. Meanwhile, calculus wasn't published until the late 1600s.
Those techniques were precursors of calculus. They're often taught in introductory calculus courses (at the university level). I consider them part of the field of mathematics we now call calculus. Indeed, if you study the history of mathematics you'll find that calculus isn't just something Newton invented out of whole cloth, as he himself was well aware:
"If I have seen further, it is by standing on the shoulders of giants."
You could argue that algebra was a precursor of calculus and that basic operations were precursors of algebra. This is a rather reductionist argument. They hadn't even invented zero or algebra when they found ways to produce square roots or discovered that you could make interesting ratios from the sides of a triangle (yes, I know about the history of zero, and that is a slight exaggeration, but no symbol for zero existed in Mesopotamia when they were calculating reciprocals and roots).
My real point is that these things don't require calculus (though calculus does require them). Most people who take trigonometry (or "precalculus" as they usually call it today) couldn't tell you anything about limits, let alone derivatives, integrals, and the fundamental theorem of calculus.
As to Newton inventing calculus, he was a vehement antagonist of Leibniz, claiming that his work on calculus had been stolen. That's not the attitude of a man who believed calculus was obvious from looking at previous works.
> Most people who take trigonometry (or "precalculus" as they usually call it today) couldn't tell you anything about limits let alone derivatives, integrals, and the fundamental theorem of calculus.
And to those people, the calculator is a magic “black box” that spits out cosines. My original point stands: if you are using a tool professionally you should understand how it works, at least on a basic level. For a calculator, that means calculus, Taylor series, Newton’s method, etc.
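(For the curious, here is roughly what those two techniques look like in code. This is only a hedged sketch in Python, not what any particular calculator actually runs: Newton's method for square roots and a truncated Taylor series for cosine.)

    import math

    def sqrt_newton(x, iterations=20):
        # Newton's method on f(g) = g*g - x; each step averages g with x/g.
        guess = x if x > 1.0 else 1.0
        for _ in range(iterations):
            guess = 0.5 * (guess + x / guess)
        return guess

    def cos_taylor(x, terms=12):
        # Truncated Taylor series: cos(x) = sum of (-1)^n * x^(2n) / (2n)!
        x = math.fmod(x, 2.0 * math.pi)  # keep the argument small so few terms suffice
        total, term = 0.0, 1.0
        for n in range(terms):
            total += term
            term *= -x * x / ((2 * n + 1) * (2 * n + 2))  # ratio of consecutive terms
        return total

    print(sqrt_newton(2.0), math.sqrt(2.0))       # both ~1.41421356...
    print(cos_taylor(math.radians(60.0)))         # ~0.5

Both reduce the problem to plain arithmetic: the iteration and the series need nothing but multiplication, division, and addition, which is exactly what the hardware can do.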
Did you know that lots of calculators (including the famous TI-83) actually use CORDIC and that the basics of that method predate calculus by a hundred years? Did you fully understand your tool? Did that keep you from using it?
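(For reference, a rough sketch of the CORDIC idea, written with Python floats rather than the fixed-point shift-and-add a real calculator would use; the function name and iteration count here are just illustrative.)

    import math

    def cordic_sincos(theta, iterations=32):
        # Approximate (sin(theta), cos(theta)) for theta in [-pi/2, pi/2] by
        # rotating the vector (1, 0) through fixed angles arctan(2^-i),
        # picking each rotation's direction so the remaining angle z -> 0.
        angles = [math.atan(2.0 ** -i) for i in range(iterations)]
        gain = 1.0
        for i in range(iterations):
            gain *= math.sqrt(1.0 + 2.0 ** (-2 * i))  # each raw rotation stretches the vector
        x, y, z = 1.0, 0.0, theta
        for i, a in enumerate(angles):
            d = 1.0 if z >= 0.0 else -1.0
            x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
            z -= d * a
        return y / gain, x / gain  # (sin, cos) after undoing the accumulated gain

    s, c = cordic_sincos(math.radians(30.0))
    print(s, c)  # ~0.5, ~0.8660

The point of the method is that in fixed-point hardware the multiplications by 2^-i become bit shifts, so the whole routine needs only shifts, adds, a small table of arctangents, and one constant for the gain.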
If you know the purpose of a trig function, it doesn't matter HOW the answer is calculated so much as that you know the answer is accurate. This doesn't require calculus.
I took calculus. As an engineering major, I actually had real-world applications of calculus across my coursework. How many times have I found calculus essential outside of college? Surprisingly few. Meanwhile, I've found a LOT of use for trig or linear algebra. There are things where the underlying theory is very important, but in my experience, this is not one of them.
My original point used calculus as an example. You turned this whole discussion into a referendum on calculus which I have no interest in continuing. Substitute linear algebra or even the basic theory of electronics and my point still stands, which you agree with. Furthermore, you studied calculus so you understand the principles behind the tools you are using, even if you aren’t using those principles directly, and that is valuable. People who don’t understand their tools risk being owned by them.
And sine, cosine, and tangent are just ratios between different parts of a triangle and can be derived relatively easily (just like how you can derive pi if you know the diameter and circumference of a circle).
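(As one concrete, pre-calculus illustration of "relatively easily", and only a sketch rather than a claim about what was meant: start from an angle whose cosine is known exactly from triangle geometry and bisect it with the half-angle identity, which needs nothing beyond square roots.)

    import math

    # Known exactly from an equilateral triangle: cos 60 degrees = 1/2.
    angle_deg, cos_a = 60.0, 0.5

    # Half-angle identity for angles between 0 and 180 degrees:
    # cos(a/2) = sqrt((1 + cos a) / 2).  Repeated bisection needs only square roots.
    for _ in range(10):
        cos_a = math.sqrt((1.0 + cos_a) / 2.0)
        angle_deg /= 2.0                    # 60 -> 30 -> 15 -> ... -> ~0.0586 degrees

    sin_a = math.sqrt(1.0 - cos_a * cos_a)  # Pythagoras gives the sine from the cosine
    print(angle_deg, sin_a, math.sin(math.radians(angle_deg)))  # the last two agree closely

Historical chord tables were built from essentially this kind of identity plus angle-addition, centuries before calculus, which is the sense in which the ratios are derivable without it.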
I’m not sure what you mean by relatively easily. Do you mean by using a protractor and straightedge? How would you calculate sine of 0.1 degrees that way?
This is a good example. You don't need to learn number theory and operations in n dimensions to be able to use arithmetic effectively in your daily life. A lot of CS courses feel like they teach stuff that's purely theoretical, for that purpose only.