How are large professional services organisations deploying GPT-4 (or similar LLMs) internally, particularly when working with sensitive/confidential/PII data? For example, we are a UK-based accountancy firm, so we have created our own internal app based on Microsoft's Azure OpenAI GPT-4 model deployed to our private Azure hosting. The custom app is designed to "pre-programme" personas, i.e. pre-created prompts for different areas of the profession, so internal users can have GPT-4 act in different ways without prior knowledge of prompt engineering. I would be interested to see how others are building or buying solutions, particularly those based in the EU/UK who can't use OpenAI/ChatGPT with customer/confidential information.
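For illustration, a minimal sketch of the "pre-programmed persona" pattern described above, assuming the Azure OpenAI Python SDK (openai>=1.0). The endpoint, deployment name, and persona prompts are placeholders of my own, not the firm's actual implementation:

```python
import os
from openai import AzureOpenAI  # pip install openai>=1.0

# Hypothetical persona library: pre-created system prompts per service line.
PERSONAS = {
    "audit": "You are an assistant for UK audit professionals. Cite ISA (UK) standards where relevant.",
    "tax": "You are an assistant for UK tax professionals. Flag anything that needs partner review.",
}

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
    azure_endpoint="https://<your-resource>.openai.azure.com",  # private Azure deployment
)

def ask(persona: str, question: str) -> str:
    """Send a user question to the GPT-4 deployment with a pre-programmed persona."""
    response = client.chat.completions.create(
        model="gpt-4",  # the Azure *deployment* name, not the model family
        messages=[
            {"role": "system", "content": PERSONAS[persona]},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

# e.g. ask("tax", "Summarise the VAT registration threshold rules.")
```

The point of the wrapper is that users pick a persona from a menu; the system prompt and the model call stay hidden behind the internal app.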



CDO in Software, 10,001+ employees
One answer is to use GPT-4 through Microsoft Azure OpenAI if you want GPT-4-level performance. Another option is to host open-source models such as Llama 2 in a cloud or on-premise environment and serve them for inference/consumption. We see such requirements coming in from multiple clients. For an Indian government client focused on developer experience, we have hosted a code LLM and built a Visual Studio plugin for developers to consume it; for an Italian financial client we are about to do the same. An on-premise solution involves a one-time hardware cost; a cloud-based deployment carries recurring GPU usage costs based on the number of requests that need to be served.
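As a rough sketch of the self-hosted route mentioned above, assuming a Hugging Face transformers environment, access to the gated Llama 2 weights, and sufficient GPU memory (the model ID and prompt are illustrative):

```python
from transformers import pipeline  # pip install transformers accelerate

# Illustrative only: Llama-2-7b-chat requires accepting Meta's licence on Hugging Face
# and roughly 14-16 GB of GPU memory in fp16; larger variants need proportionally more.
generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-2-7b-chat-hf",
    device_map="auto",  # spread across available GPUs, or fall back to CPU
)

prompt = "[INST] Explain, in two sentences, what a deferred tax asset is. [/INST]"
result = generator(prompt, max_new_tokens=128, do_sample=False)
print(result[0]["generated_text"])
```

In practice a serving layer (vLLM, TGI, or similar) would normally sit in front of this for multi-user inference, but the loading and generation step is the same idea.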
Director of Other in Software, 501 - 1,000 employees
We are building solutions that enable our customers to bring their own LLMs or integrate with any open- or closed-source LLM. In cases like yours, we encourage our customers to integrate with Azure OpenAI to take advantage of Azure's enterprise security.
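To make the "bring your own LLM" idea concrete, here is a minimal sketch (my own illustration, not this vendor's product API) of the kind of narrow interface application code can target so the backing model stays swappable:

```python
from typing import Protocol

class LLMBackend(Protocol):
    """The narrow surface the application depends on."""
    def complete(self, system: str, user: str) -> str: ...

def draft_response(backend: LLMBackend, question: str) -> str:
    # Application logic only sees the protocol, so the same code can run against
    # Azure OpenAI, a self-hosted Llama 2 endpoint, or any other conforming backend.
    return backend.complete(
        system="You answer questions for an accountancy firm's internal users.",
        user=question,
    )
```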
Global Head of AI, Data & Analytics in Software, 10,001+ employees
I assume this question is a bit older, but you did outline the standard approach: host or access a model as a service internally through Azure OpenAI, Azure Machine Learning / a VM, or Amazon SageMaker / Amazon Bedrock.
ChatGPT Enterprise will now make things easier if you don't care about federating models.
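For completeness, a hedged sketch of the Amazon Bedrock route mentioned above, using boto3's bedrock-runtime client. The region and model ID are assumptions and depend on what your AWS account has been granted access to:

```python
import json
import boto3  # pip install boto3

# Assumed region and model ID -- substitute whatever Bedrock models your account can use.
bedrock = boto3.client("bedrock-runtime", region_name="eu-west-2")

body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 300,
    "messages": [
        {"role": "user", "content": "List three controls for handling client PII in an LLM app."}
    ],
})

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=body,
    contentType="application/json",
)
payload = json.loads(response["body"].read())
print(payload["content"][0]["text"])
```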
