Description
Use your own OpenAI-compatible API served locally by llama-server or Ollama inside LM Studio by fetching the server's model list.
Stats
23 Downloads
Last updated
Updated on January 18
Project Files
README
This plugin connects local server models such as llama.cpp and Ollama (any server with a base URL) to LM Studio. The original plugin does not let users add their own model names (as far as I know), so I could not load, unload, or query my models.
This plugin queries the server's /v1/models endpoint to list the available models and adds them to the model dropdown in the UI, overwriting the OpenAI and Anthropic defaults.
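The /v1/models lookup described above can be sketched roughly as follows. The response shape ({ "data": [{ "id": ... }] }) is the standard OpenAI-compatible format; the function names and base-URL handling here are illustrative assumptions, not the plugin's actual code.

```typescript
// Sketch (assumed names): turn an OpenAI-compatible GET /v1/models
// response into a list of model IDs suitable for a model dropdown.
type ModelsResponse = { data: { id: string }[] };

export function listModelIds(body: ModelsResponse): string[] {
  // Each entry's "id" is the model name a client can request.
  return body.data.map((m) => m.id);
}

// Fetch the list from a local server (e.g. llama-server or Ollama),
// given a base URL like "http://localhost:11434".
export async function fetchModelIds(baseUrl: string): Promise<string[]> {
  const res = await fetch(`${baseUrl}/v1/models`);
  if (!res.ok) throw new Error(`GET /v1/models failed: ${res.status}`);
  return listModelIds((await res.json()) as ModelsResponse);
}
```

With llama.cpp's llama-server the default base URL is typically http://localhost:8080; with Ollama it is http://localhost:11434.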
To do this: