openai-compat-endpoint

Use your own OpenAI-compatible API, served locally by llama-server or Ollama, in LM Studio by fetching its model list.

This plugin connects local server models (llama.cpp, Ollama, etc.) to LM Studio. Any server with a base URL will work. The original plugin does not allow users to add their own model names (as far as I know), so I could not load/unload or query my models.

This plugin queries the server's /v1/models endpoint to list the available models and adds them to the model dropdown in the UI, overwriting the OpenAI and Anthropic defaults.
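The /v1/models endpoint returns JSON in the OpenAI-compatible list format. A minimal sketch of how the model names can be pulled out of such a response (the model ids below are illustrative placeholders, not from any particular server):

```python
import json

# Example /v1/models response in the OpenAI-compatible list format.
# The model ids here are illustrative placeholders.
sample_response = json.loads("""
{
  "object": "list",
  "data": [
    {"id": "llama-3.1-8b-instruct", "object": "model", "owned_by": "local"},
    {"id": "qwen2.5-coder-7b", "object": "model", "owned_by": "local"}
  ]
}
""")

# Each entry's "id" field is what ends up in the model dropdown.
model_ids = [m["id"] for m in sample_response["data"]]
print(model_ids)  # → ['llama-3.1-8b-instruct', 'qwen2.5-coder-7b']
```

This is just the response shape; the plugin itself handles the actual fetch against your base URL.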

To set it up:

  • Install this plugin
  • Enter your base URL
  • Type something in the chat box and send the prompt
    • The plugin will then query /v1/models on your endpoint and fetch the available models.
  • Uninstall this plugin
  • Reinstall this plugin
    • Your dropdown will now be populated with your endpoint's model list.
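Before running through the steps above, it can help to confirm that your base URL actually serves /v1/models. A hypothetical helper (not part of the plugin) showing how the models URL is derived from a base URL, tolerating a trailing slash:

```python
def models_url(base_url: str) -> str:
    # Normalize so "http://localhost:11434" and "http://localhost:11434/"
    # both resolve to the same /v1/models endpoint.
    return base_url.rstrip("/") + "/v1/models"

# Ollama listens on port 11434 by default; llama-server defaults to 8080.
print(models_url("http://localhost:11434"))
# → http://localhost:11434/v1/models
```

You can hit the printed URL with your browser or an HTTP client; if it returns a JSON model list, the plugin should be able to populate the dropdown.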