I don't think it can go any smaller and still demonstrate its capabilities. Even GPT-2 mostly generates nonsensical sentences that merely resemble English.
You might be looking for an n-gram or Markov model if you only need simpler NLP capabilities.
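For a sense of scale, a word-level Markov chain generator fits in a few lines of Python. This is just a minimal sketch with a toy corpus and placeholder names, not a production implementation:

```python
import random
from collections import defaultdict

def build_bigram_model(text):
    """Map each word to the list of words that follow it in the corpus."""
    words = text.split()
    model = defaultdict(list)
    for current, following in zip(words, words[1:]):
        model[current].append(following)
    return model

def generate(model, start, length=20):
    """Walk the chain, sampling a random successor at each step."""
    word = start
    output = [word]
    for _ in range(length - 1):
        successors = model.get(word)
        if not successors:
            break
        word = random.choice(successors)
        output.append(word)
    return " ".join(output)

# Toy example; swap in a real corpus for anything useful.
corpus = "the cat sat on the mat and the dog sat on the rug"
model = build_bigram_model(corpus)
print(generate(model, "the"))
```

The output is locally plausible but has no long-range coherence, which is roughly the trade-off you accept when going that small.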