I love this way of explaining it. I've been calling it the programmer's fallacy -- "anything you can do, you can do in a for loop."
I think in a lot of ways we all struggle with things changing their nature depending on context and scale. If you kill a Frenchman on purpose, that's murder. If you killed him because he attacked you first, it's self-defense. If you killed him because he was convicted of a crime, that's an execution. If you killed him because he's French, that's a hate crime. But if you're at war with France, it's killing an enemy combatant -- unless he's not in the military, in which case it's a civilian casualty, and if you do that a lot it becomes a war crime, and if you kill everyone who's French, it's genocide.
I don't think you see the problem with your own analogy...
Human-made ski runs will only use as much snow as they need, because snow is expensive. If ski runs were popular or profitable in proportion to their depth, I'm absolutely sure some greedy company would keep piling it up until disaster occurred (mining waste is another great example here).
So how much 'intelligence' is enough? How much capability is too much? How fast is too fast when thinking?
We will never stop improving our capabilities unless some natural law provides that limit. And we absolutely know the lower bound on peak intelligence is the smartest person alive. There is little reason for 'peak' intelligence to be limited to the level of humans and their power-restricted format.
A snowball probably isn't harmful unless you do something really dumb.
A snow drift isn't harmful unless you're careless.
An avalanche, well that gets harmful pretty damned quick.
These things are all snow, but suddenly at some point scale starts to matter.