I'm certainly not a real mathematician, though I was a math major in college. But I use math all the time. Meanwhile, my kids are taking math in school.
Practically everything I do with math is done at the computer. When I derive something by hand, it's with the knowledge that I'm just doing it for nostalgia's sake. I could, and probably should, use Jupyter / Python / Maxima for everything. And I'd enjoy learning how to use even more interesting tools such as a proof assistant, even if it would be purely recreational at this point in my career.
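To be concrete about what "use Jupyter / Python for everything" looks like, here's a rough sketch of the kind of notebook cell I'd write instead of grinding through a derivation by hand (the particular expressions are arbitrary examples, not anything from my actual work):

    # SymPy in a Jupyter cell, in place of a by-hand derivation.
    import sympy as sp

    x = sp.symbols('x')
    expr = sp.exp(-x**2) * sp.cos(x)

    # One line instead of a page of product/chain-rule scratch work.
    print(sp.diff(expr, x))

    # A definite integral I'd otherwise look up in a table.
    print(sp.integrate(sp.exp(-x**2), (x, -sp.oo, sp.oo)))  # sqrt(pi)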
Meanwhile, in their high school math classes, my kids will never touch a computer. Everything is done by hand, with occasional use of a graphing calculator (what an archaic device).
In a weird sense, not only are they learning history, but the entire curriculum is history.
I can't say if this is good or bad. Whatever I learned in high school must have paved the way for me to pick up more modern techniques fairly readily. Math really came alive for me when I began to learn abstract math, and was simultaneously introduced to computation at the front end of the microcomputer revolution. That's what made me want to be a math major.
When you say you should use a computer, you mean for arithmetic and calculations, right?
Also, calculators are hardly archaic. If I want to calculate something quickly I'll always go for my Casio FX-83GT, since I can type it in much faster. They are archaic in the sense that the number of terms you can enter can be limiting, though...
I think a good grasp of arithmetic is incredibly helpful in the real world, and is something that academics (like me, as a physicist) often lack, whereas "regular people" are much better at it. I also rarely bother with change; I pay with card when I can...
I think doing anything more complicated than basic calculations on paper is pointless, though.
I have a calculator too, though not a graphing one, and I use it for a similar reason: The keypad is convenient. But if I need to graph something, or do a repetitive calculation, then I'll turn to Jupyter. In fact, I have it on my tablet.
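By "repetitive calculation" I mean something like the following (a toy example, not from my actual work), where a few lines in a notebook cell beat punching the same expression into a keypad over and over:

    # Graphing / repetitive calculation in a Jupyter cell rather than on a keypad.
    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(-2 * np.pi, 2 * np.pi, 500)
    for k in (1, 2, 3):
        # The same expression evaluated over a whole family of parameters at once.
        plt.plot(x, np.sin(k * x) / k, label=f"sin({k}x)/{k}")
    plt.legend()
    plt.show()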
I graduated from high school just as graphing calculators were introduced, so it never became part of my experience.
What seems unfortunate about the graphing calculator is that its special symbiosis with K-12 math teaching limits the development of both. You can't add features to the calculator, or offer a free alternative as a phone app, without facilitating "cheating," and the textbooks can't introduce lessons requiring computational power beyond the capabilities of the calculator.
Not to mention the TI monopoly: every family has to shell out for one of those things.