word2vec and fastText are specialized tools for creating word embeddings; this is a more general-purpose library. It's more comparable to PyText, AllenNLP, or Flair, the main difference appearing to be that the other three use PyTorch rather than TensorFlow.
especially with the recent changes in TF 2.0. If you were to design something similar, would you use TF or PyTorch?
What I'm really trying to ask is: is TF 2.0 comparable to PyTorch when it comes to ease of use?
Probably not. It has an "imperative" mode, but it also drags in quite a bit of API baggage that PyTorch just doesn't have. It's not that PyTorch is ideal, but its main advantage is that it "feels like NumPy". Google also has a library that "feels like NumPy". In fact it kind of _is_ NumPy with hardware acceleration, though it seems to be in very early stages; I heard from insiders that there are only 2 people working on the project, if that. The project is JAX: https://github.com/google/jax. It's arguably a lower-level framework on top of which something like PyTorch could be built.
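To make the "feels like NumPy" point concrete, here's a minimal sketch (the toy loss function is my own invention, not from any of these libraries): you write plain NumPy-style array code via `jax.numpy`, then compose transforms like `grad` and `jit` on top of it.

```python
import jax
import jax.numpy as jnp

def loss(w):
    # A tiny least-squares loss, written exactly as one would in NumPy.
    x = jnp.array([1.0, 2.0, 3.0])
    y = jnp.array([2.0, 4.0, 6.0])
    return jnp.mean((w * x - y) ** 2)

# grad returns a new function computing d(loss)/dw;
# jit compiles it with XLA for hardware acceleration.
dloss = jax.jit(jax.grad(loss))

print(float(dloss(1.0)))  # gradient of the loss at w = 1.0
```

The contrast with TF 1.x is that there is no graph-construction API to learn: differentiation and compilation are function transformations applied to ordinary Python/NumPy code, which is roughly the surface a PyTorch-like framework would need underneath.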