I’ve been playing with the Alpaca demo, and I’m really impressed! The outputs are generally excellent, especially for a model of that size, fine-tuned on a $100 (!!) compute budget.
If the cloud of uncertainty around commercial use of weights derived from LLaMA can be resolved, I think this could be the answer for a lot of domain-specific generative language needs: a model you can fine-tune on your own data, and which you host and control, rather than depending on a cloud service not to arbitrarily raise prices/close your account/apply unhelpful filters to the output/etc.