I disagree (although I think I understand that people might find it strange).
TDD is in a way a gamification of programming. You cycle between being rewarded with a green test suite, challenging your implementation with another test, and doing "pinhole surgery" to improve the implementation. Finally you get the relief of being able to refactor your implementation without breaking the functionality.
So starting with stupid implementations delivers that first reward from the start and helps you find the rhythm of red-green-refactor.
Now of course TDD is also about making a correct implementation eventually, and usually you will get there after a handful of test cases.
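To make that rhythm concrete, here's a minimal sketch (my own example, assuming a hypothetical add function and pytest-style tests, not taken from any particular tutorial):

    # Green step 1: the deliberately "stupid" implementation that made the
    # first test pass was just the hard-coded answer:
    #
    #     def add(a, b):
    #         return 5
    #
    # Green step 2: the second test below broke that fake and forced the
    # general implementation.
    def add(a, b):
        return a + b

    # pytest-style tests, written one at a time (red before each green step)
    def test_add_two_numbers():    # test 1: drove the hard-coded version
        assert add(2, 3) == 5

    def test_add_other_numbers():  # test 2: forced the real a + b
        assert add(10, 4) == 14

Throwaway at this scale, obviously, but the cadence is the point: failing test, smallest change that goes green, then refactor.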
I get where this is coming from but it gets boring fast. That's not how flow works.
You don't get into a flow state because of the reward / regular feedback alone. The challenge must match, or slightly exceed, your skill. Otherwise, you get bored. Boredom kills productivity and, ultimately, defeats the point.
A competent programmer may as well just write out the addition implementation directly. I'd argue even that's too low a bar; in fact, I don't think adding two numbers deserves a test at all.
That leads me to my second gripe. A majority of TDD examples are trivial problems. A great many are so trivial they wouldn't even warrant the time and expense of testing in the real world, and time and expense are genuine factors. If you're spending half your time testing trivial shit that can be verified by sight, you're not spending that half testing the crucial things that are hard to verify without actually running the code.
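To put that contrast in code (a hypothetical example of my own, with a made-up parse_money helper): the first test below checks something you can verify by reading the one-line implementation; the second checks behaviour you'd actually have to run to trust.

    from decimal import Decimal

    def add(a, b):
        return a + b

    def parse_money(text):
        # sketch: strip thousands separators and parse as an exact Decimal
        return Decimal(text.replace(",", ""))

    # Trivial: verifiable by sight from the one-line implementation above.
    def test_add():
        assert add(2, 3) == 5

    # Worth the time: behaviour you can't confidently eyeball.
    def test_parse_money_handles_thousands_separators():
        assert parse_money("1,234.56") == Decimal("1234.56")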
You know that graph[1] with performance on the Y-axis and "arousal level" from bored -> eustress -> overwhelmed on the X-axis? I think that mental model is key to understanding why TDD tutorials start out with really dead-simple examples.
One primary cognitive demand when programming is holding many ideas in your head at once, many of which you're only really dealing with for the first time this week. In your day-to-day work, you move from subproject to subproject and have to quickly learn new parts of a codebase. As you do this, though, you are also holding in your head ideas that stay there: how the standard library is shaped, what the overall architecture of the project is, who on your team to ask for clarity, and how to use your testing framework. Because you're already familiar with these things, they are "chunked" in the same way that the area code of a phone number is, so they impose a lower cognitive cost to hold in your head.
But what if you are currently learning both the idea of TDD and the specific interface of a particular testing library? Then neither will be chunked, and you're going to need to devote a lot more brainspace to remembering that recently-learned knowledge. If you are writing complex tests while you do that, you are a lot more likely to be overwhelmed and perhaps give up on TDD. TDD tutorials are written for an audience for whom writing a basic test is not yet well-chunked enough to be boring.