Keras is quite popular as well. My point is, these courses already carry a high cognitive load even without an unfamiliar framework to deal with.
Don't get me wrong, it seems like a great course, and anyone interested in the field will certainly get a lot of mileage out of it. But by choosing a rather obscure framework it shoots itself in the foot compared to, say, the Fast.ai course, which is jam-packed with practical advice and uses PyTorch. Like it or not, there are two dominant frameworks right now: TF and PyTorch (the latter is my personal favorite by far). A more practical, and fairly low-effort, approach would therefore be to duplicate the code samples in PyTorch.
That said, Fast.ai shoots itself in the foot a little too by requiring Python 3.6 and up, which a lot of people don't have out of the box. I understand why they do it (type annotations), but still. They also hide PyTorch behind a rather large ball of Python with a cognitive load of its own.
Your point is understood, and it's reiterated in other posts here. I just wanted to add context: the course is presented by an author of MXNet, who is free to teach it in his own framework.
Anyway, there's no dearth of courses in the popular frameworks already. It doesn't really enrich the community if all educational resources teach only the current "market winner"; we might be missing other opportunities or ways of thinking. Imagine if all schools taught only JavaScript or Java!