Comment by lhousa
Rookie question: the OpenAI API endpoint costs extra, right? It's not something that comes with ChatGPT or ChatGPT Plus.
Please look into allowing it to connect to either an LM Studio endpoint or Ollama.
OpenAI does not train models on data that comes in through the API.
Assuming for the moment that they aren't saying that with their fingers crossed behind their back, it doesn't change the fact that they store the inputs they receive and promise to protect them (paraphrasing the Content section of the link above). Even if the data isn't fed back into the LLM, storing inputs anywhere for any period of time is a significant privacy risk; after all, a breach is a matter of "when", not "if".
Correct, but I'm going to look into a locally running LLM, so it would be free.
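For what it's worth, both Ollama and LM Studio expose OpenAI-compatible local servers, so an existing OpenAI-style client usually just needs its base URL pointed at localhost. A minimal sketch (assuming the default ports: 11434 for Ollama, 1234 for LM Studio; the `chat_request` helper here is hypothetical, shown only to illustrate the request shape):

```python
import json

# Default local endpoints (assumptions; configurable in each app):
OLLAMA_BASE = "http://localhost:11434/v1"   # Ollama's OpenAI-compatible API
LMSTUDIO_BASE = "http://localhost:1234/v1"  # LM Studio's local server

def chat_request(base_url, model, prompt):
    """Build the URL and JSON body for an OpenAI-style chat completion.

    Nothing is sent here; this only shows that the request format is the
    same as OpenAI's, so swapping the base URL is typically all that's
    needed to stay fully local (and free).
    """
    url = f"{base_url}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

url, body = chat_request(OLLAMA_BASE, "llama3", "Hello")
print(url)
```

Since the data never leaves your machine, this also sidesteps the storage concern raised above.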