What are the best on-premise alternatives to ChatGPT? We do not want our data to be shared online, so we are looking for a solution that can be hosted entirely in house.
Chief Technology Architect in Government · a year ago
There are a few ways. You can have a standard OpenAI agreement through a Microsoft Azure subscription, which guarantees that your private data will not be used for training. You then need to build your own chatbot or application on top of the OpenAI APIs.
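To make the "build your own application on top of the OpenAI APIs" part concrete, here is a minimal sketch of a chat-completions call against an Azure OpenAI deployment, using only the Python standard library. The endpoint, deployment name, and API version are placeholder assumptions; substitute your own resource's values.

```python
import json
import urllib.request

# Placeholder values -- replace with your own Azure OpenAI resource details.
ENDPOINT = "https://my-resource.openai.azure.com"   # assumption
DEPLOYMENT = "gpt-35-turbo"                         # assumption
API_VERSION = "2024-02-01"                          # assumption

def build_chat_request(api_key: str, messages: list) -> urllib.request.Request:
    """Build a chat-completions HTTP request for an Azure OpenAI deployment."""
    url = (f"{ENDPOINT}/openai/deployments/{DEPLOYMENT}"
           f"/chat/completions?api-version={API_VERSION}")
    body = json.dumps({"messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json", "api-key": api_key},
        method="POST",
    )

req = build_chat_request("YOUR_KEY", [{"role": "user", "content": "Hello"}])
# urllib.request.urlopen(req) would actually send it; not executed here.
print(req.full_url)
```

The same request shape works from any language; the point is that the data-handling guarantees come from the Azure agreement, not from anything in the client code.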
Alternatively, you can download a model onto your own hardware and run it there. To get a sense of how that works, see nomic-ai/GPT4All -- you can run it without an internet connection, on your local data. The flip side is that your hardware must be powerful enough to run the model and index data efficiently.
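For illustration, here is a sketch of fully offline inference with the `gpt4all` Python package (`pip install gpt4all`). The model filename is an example from the GPT4All catalogue; once the model file is on local disk, nothing leaves the machine. The import is guarded so the sketch degrades gracefully where the package is not installed.

```python
# Offline inference sketch with the gpt4all package; the model name
# below is an example -- any GGUF model from the GPT4All catalogue works.
try:
    from gpt4all import GPT4All
except ImportError:
    GPT4All = None  # package not installed; the sketch degrades gracefully

def ask_local(prompt: str,
              model_name: str = "orca-mini-3b-gguf2-q4_0.gguf") -> str:
    """Run one prompt against a locally stored model.

    No network is needed once the model file has been downloaded.
    """
    if GPT4All is None:
        return "(gpt4all not installed)"
    model = GPT4All(model_name)          # loads weights from local disk
    with model.chat_session():
        return model.generate(prompt, max_tokens=128)

# Example usage (commented out to avoid triggering a large model load):
# print(ask_local("Summarise our on-prem LLM options in one sentence."))
```

As the answer notes, the practical constraint is hardware: a quantized 3B-parameter model runs on a laptop, but larger models need a capable GPU or plenty of RAM.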
no title · a year ago
Thank you, Debasis.
Engineer · a year ago
I would recommend considering mistral.ai.
Check their documentation for self-deployment.
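As a rough sketch of what self-deployment can look like: Mistral's docs describe serving their open-weight models with an OpenAI-compatible inference server such as vLLM. The image tag, port, and model name below are assumptions to adapt to your own environment, not a definitive recipe.

```shell
# Sketch: serve Mistral 7B Instruct on your own GPU box with vLLM's
# OpenAI-compatible server. Model weights are pulled once, then
# inference runs entirely on your hardware.
docker run --gpus all -p 8000:8000 \
  vllm/vllm-openai:latest \
  --model mistralai/Mistral-7B-Instruct-v0.3
```

Internal clients can then talk to `http://localhost:8000/v1` using the standard OpenAI chat-completions API, with no traffic leaving your network.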
no title · a year ago
Thank you, Dmytro. I will check.
I recently worked with a local solution, LM Studio, to address the same data-security problem. The point was to evaluate a local application, installed on the user's PC, as a tool to enhance the user experience. It's really effective, even if it's a bit hardware-hungry, and now that Meta has released the Llama models as a community resource, you can access an incredibly effective, commercial-grade LLM without any concerns about your data security. LM Studio doesn't share any data online; everything stays strictly local. It's also a great way to test different LLMs.

You can also deploy a PrivateGPT (another community resource) server on premise to serve an LLM to your users without any external connection. When we discussed this with the enterprise architects in our company, we all agreed that the future of LLMs is on premise (or in a private landing zone) because of the data issue.
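To show how "strictly local" works in practice: LM Studio can run a local server that exposes an OpenAI-compatible API, by default at `http://localhost:1234/v1`. The sketch below builds such a request with only the standard library; the `"local-model"` name is a placeholder, since LM Studio serves whichever model you have loaded.

```python
import json
import urllib.request

# LM Studio's local server mode exposes an OpenAI-compatible API,
# by default at http://localhost:1234/v1 -- everything stays on the machine.
BASE_URL = "http://localhost:1234/v1"

def build_local_request(prompt: str) -> urllib.request.Request:
    """Build a chat request for the local server.

    No API key is needed, because nothing leaves localhost.
    """
    body = json.dumps({
        "model": "local-model",  # placeholder; the loaded model is used
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_local_request("Is any of this data sent online?")
# urllib.request.urlopen(req) would call the local server; skipped here
# so the sketch runs even when LM Studio is not running.
print(req.full_url)
```

Because the endpoint mimics the OpenAI API, existing tools and client libraries can usually be pointed at the local server just by changing the base URL, which makes it easy to compare models without touching application code.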