Would you be willing to pay up to 10 times more for a dedicated, privacy-focused version of ChatGPT that runs on separate servers from other users (rumor is Microsoft is working on this)?
As Juan said, I am also not sure where the 10x figure comes from. But in general, to use this technology for certain use cases we need to incorporate more privacy, and we are doing that with Azure OpenAI, etc. Disconnecting the infrastructure is not the important part. What really matters, at least for certain use cases, is ensuring that the data you specify doesn't go into the training material, etc., as Microsoft promises.
Conceptually, pay more for more privacy, more security, more predictable performance? Yes, probably, but at the moment we haven't proven the business case for the free version yet. 10x zero = zero.
When the business value is clearer, we can debate value and cost.
Not likely at this time. We are currently exploring open-source alternatives such as Dolly-2, LLaMA, and others while running POVs.
We are already working on it using Azure OpenAI, but I'm not sure where the 10x figure came from; business cases are still to be built.
No, because it's so much cheaper to build that ourselves, with Azure OpenAI having done a lot of the heavy lifting.