Forked from tupik/openai-compat-endpoint

OpenAI-Compatible Endpoint Adapter for LM Studio

Use any OpenAI-compatible API with LM Studio by routing chat requests through a configurable base URL (default: OpenRouter). The plugin streams responses, forwards tool calls, and lets you choose any model ID supported by your provider.

Features

  • Works with OpenAI-compatible providers (OpenRouter by default).
  • Streaming chat responses.
  • Tool call passthrough when the provider supports tools.
  • Simple per-chat model selection.
  • Optional debug logging for request/response diagnostics.
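As background on the streaming feature: OpenAI-compatible endpoints stream chat completions as Server-Sent Events, one data: line per JSON chunk, terminated by data: [DONE]. The sketch below shows the general idea of extracting the text delta from each line; it is illustrative only, and the plugin's own parsing in src/generator.ts may differ.

```typescript
// Extract the incremental text from one SSE line of a streaming
// chat/completions response. Returns null for the [DONE] terminator,
// comments, and lines that carry no content delta.
function parseDelta(line: string): string | null {
  if (!line.startsWith("data:")) return null; // comments and blank lines
  const payload = line.slice("data:".length).trim();
  if (payload === "[DONE]") return null; // stream terminator
  try {
    const chunk = JSON.parse(payload);
    return chunk.choices?.[0]?.delta?.content ?? null;
  } catch {
    return null; // malformed chunk: skip rather than crash the stream
  }
}

// Example: assemble the full reply from a stream of lines.
const streamLines = [
  'data: {"choices":[{"delta":{"content":"Hel"}}]}',
  'data: {"choices":[{"delta":{"content":"lo"}}]}',
  "data: [DONE]",
];
const reply = streamLines
  .map(parseDelta)
  .filter((s): s is string => s !== null)
  .join(""); // "Hello"
```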

Requirements

  • Node.js + npm
  • LM Studio installed, with the lms CLI available (LM Studio must be run at least once before lms works).

Optional (Windows): bootstrap lms into PATH if needed:

cmd /c %USERPROFILE%/.lmstudio/bin/lms.exe bootstrap

Quick Start (Local Development)

npm install
npm run dev

npm run dev runs lms dev, which starts the plugin dev server, verifies manifest.json, installs deps if needed, and rebuilds on changes. Note: lms dev is part of LM Studio Plugins (beta).

Configuration

Configuration is done in LM Studio's plugin settings.

Global (shared across chats):

  • API Key: provider API key (e.g., sk-or-... for OpenRouter)
  • Base URL: provider base URL (default: https://openrouter.ai/api/v1)

Per-chat:

  • Model: any provider model ID, e.g. allenai/molmo-2-8b:free
  • Debug Logging: enable verbose request logging and status output
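Conceptually, the global settings and the per-chat model combine into a standard OpenAI-style chat-completions request. A hypothetical sketch of that combination follows; the function and field names are illustrative, not the plugin's actual API.

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Combine global settings (base URL, API key) with the per-chat model
// into the pieces of an OpenAI-compatible chat/completions request.
function buildChatRequest(
  baseUrl: string,
  apiKey: string,
  model: string,
  messages: ChatMessage[],
) {
  return {
    url: `${baseUrl.replace(/\/+$/, "")}/chat/completions`, // tolerate a trailing slash
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ model, messages, stream: true }),
  };
}

const req = buildChatRequest(
  "https://openrouter.ai/api/v1",
  "sk-or-example",
  "allenai/molmo-2-8b:free",
  [{ role: "user", content: "Hello" }],
);
// req.url === "https://openrouter.ai/api/v1/chat/completions"
```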

Usage

  • Set API Key and Base URL in global settings.
  • Set the Model in the chat/plugin settings.
  • Start a chat in LM Studio.

If no model is set, the plugin returns an error asking you to configure one.
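A guard of this kind can be sketched as follows; the function name and message text are hypothetical, not the plugin's actual code.

```typescript
// Validate the per-chat model setting before sending a request.
// Returns an error message for the chat, or null when the config is usable.
function validateModel(model: string | undefined): string | null {
  if (!model || model.trim() === "") {
    return "No model set. Choose a model ID in the plugin's per-chat settings.";
  }
  return null;
}

// validateModel(undefined)        → error message
// validateModel("qwen/qwen-turbo") → null
```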

Troubleshooting

  • 400 Provider returned error: the model ID may be invalid or the provider rejected the request. Enable Debug Logging to inspect the payload.
  • Invalid or missing API key: verify the key in global settings.
  • 429 Rate limit: try again later or use a paid key.
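For 429s, a common client-side mitigation is retrying with exponential backoff. This is a sketch of the idea, not part of the plugin; the attempt count, delay, and the assumption that the thrown error carries a numeric status field are all arbitrary choices here.

```typescript
// Retry an async operation when it fails with a rate-limit error (status 429),
// backing off exponentially between attempts. Any other error is rethrown.
async function withRetry<T>(
  op: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  for (let i = 0; ; i++) {
    try {
      return await op();
    } catch (err) {
      const status = (err as { status?: number }).status; // assumed error shape
      if (status !== 429 || i >= attempts - 1) throw err;
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
}

// Example with a stub that rate-limits twice, then succeeds.
let calls = 0;
const result = await withRetry(async () => {
  calls++;
  if (calls < 3) throw Object.assign(new Error("rate limited"), { status: 429 });
  return "ok";
}, 3, 1);
// result === "ok" after three calls
```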

Development Notes

  • Source code lives in src/.
  • Build output goes to dist/ (generated by TypeScript).
  • Key files: src/generator.ts, src/config.ts, manifest.json.

Publishing

npm run push

npm run push wraps lms push to upload the current folder as a plugin revision. lms push prompts for confirmation unless you pass -y.

If you have not authenticated with LM Studio Hub yet, run:

lms login

Security

Never hardcode API keys. Use the protected global config field instead.

License

ISC