Has anyone tried out LLaMa 2? Would you build with it in its current state?

4.9k views · 3 Comments
Senior Vice President, Engineering in Software · 2 years ago

Yes, we recently tried this in a POC for one of our clients. It's quite useful if you're conscious of OpenAI API costs: Llama 2 can be self-hosted and covers most GenAI-style applications, such as chatbots, finding similar objects, and summarising text. Another benefit is that Llama 2 can be fine-tuned easily with widely available Python libraries, and you can get things done that OpenAI models would usually refuse to do.
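
To give a feel for what self-hosting involves, here is a minimal sketch assuming the gated Hugging Face checkpoint `meta-llama/Llama-2-7b-chat-hf` (the model name and loading code are illustrative, not from the original post). The prompt-formatting part follows Llama 2's documented `[INST]`/`<<SYS>>` chat template; the actual loading and generation step needs a GPU and accepted license, so it's shown commented out:

```python
# Sketch of self-hosting Llama 2 for chat. Checkpoint name is an
# assumption; weights are gated and require license acceptance.

def build_llama2_prompt(system: str, user: str) -> str:
    """Format a single-turn prompt in the Llama 2 chat template."""
    return (
        f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n"
        f"{user} [/INST]"
    )

prompt = build_llama2_prompt(
    "You are a concise assistant.",
    "Summarise this text: ...",
)

# Loading and generation (requires GPU and the gated weights):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf")
# model = AutoModelForCausalLM.from_pretrained(
#     "meta-llama/Llama-2-7b-chat-hf", device_map="auto")
# ids = tok(prompt, return_tensors="pt").to(model.device)
# out = model.generate(**ids, max_new_tokens=256)
# print(tok.decode(out[0], skip_special_tokens=True))
```

Once this runs locally, the per-request cost is just your own compute, which is where the savings over a metered API come from at scale.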

2 Replies
2 years ago

Llama 2 is a Meta product, what OpenAI API costs are you referring to?

2 years ago

What I meant was that using OpenAI models via the API costs a fortune at a huge scale. If you want to avoid that cost, it's relatively easy to self-host Llama 2 and get most of the same work done at a lower expense.
