Has anyone tried out LLaMa 2? Would you build with it in its current state?
no title · 2 years ago
Llama 2 is a Meta product; what OpenAI API costs are you referring to?
no title · 2 years ago
What I meant is that using OpenAI models via the API costs a fortune at large scale. If you want to avoid that cost, it's often cheaper to self-host Llama 2 and get most of the same things done at a lower expense.
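The trade-off is basically pay-per-token vs. pay-per-GPU-hour. A rough back-of-the-envelope sketch of that comparison (every number below is a hypothetical placeholder, not an actual price or throughput figure):

```python
# Illustrative cost comparison: pay-per-token API vs. self-hosted Llama 2.
# All rates below are made-up placeholders for the sake of the arithmetic.

API_PRICE_PER_1K_TOKENS = 0.002    # hypothetical hosted-API rate (USD)
GPU_HOURLY_RATE = 1.20             # hypothetical cloud GPU rental (USD/hour)
TOKENS_PER_GPU_HOUR = 2_000_000    # hypothetical self-hosted throughput

def api_cost(tokens: int) -> float:
    """Cost of pushing `tokens` through a pay-per-token API."""
    return tokens / 1000 * API_PRICE_PER_1K_TOKENS

def self_hosted_cost(tokens: int) -> float:
    """Cost of the GPU hours needed to process `tokens` on your own box."""
    hours = tokens / TOKENS_PER_GPU_HOUR
    return hours * GPU_HOURLY_RATE

monthly_tokens = 1_000_000_000  # 1B tokens/month, i.e. "huge scale"
print(f"API:         ${api_cost(monthly_tokens):,.2f}")      # $2,000.00
print(f"Self-hosted: ${self_hosted_cost(monthly_tokens):,.2f}")  # $600.00
```

Whether self-hosting actually wins depends entirely on your real volume, GPU utilisation, and ops overhead; at low volume the API side of this calculation usually comes out cheaper.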
Yes, we recently tried this in a POC for one of our clients. It's quite useful if you are conscious of OpenAI API costs: it can be self-hosted and covers most common GenAI applications, such as chatbots, finding similar objects, and summarising text. Another benefit is that Llama 2 can easily be fine-tuned using widely available Python libraries, and you can get things done that OpenAI's hosted models would usually refuse to do.
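The "finding similar objects" use case typically boils down to comparing embedding vectors. A minimal sketch of that ranking step, with made-up 4-dimensional vectors standing in for real model-generated embeddings (in practice you'd embed your documents with Llama 2 or a dedicated embedding model):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical document embeddings (toy values, not real model output).
docs = {
    "invoice_q1": [0.9, 0.1, 0.0, 0.2],
    "invoice_q2": [0.8, 0.2, 0.1, 0.1],
    "holiday_memo": [0.0, 0.1, 0.9, 0.8],
}
query = [0.85, 0.15, 0.05, 0.15]  # embedding of the user's query

# Rank documents by similarity to the query vector.
ranked = sorted(docs, key=lambda name: cosine_similarity(query, docs[name]),
                reverse=True)
print(ranked[0])  # the invoice-like documents rank above the memo
```

A real deployment would store the embeddings in a vector index instead of a dict, but the ranking logic is the same.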