LM Studio as a Local LLM API Server
Run an LLM API server on localhost with LM Studio
You can serve local LLMs from LM Studio's Developer tab, either on localhost or on the network.
LM Studio's APIs can be used through the REST API, through client libraries (lmstudio-js for TypeScript and lmstudio-python for Python), or through OpenAI-compatible and Anthropic-compatible endpoints.
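For instance, here is a minimal sketch using the lmstudio-python SDK; the model identifier is a placeholder for whatever model you have downloaded, and it assumes the server is running at its default localhost address:

```python
import lmstudio as lms

# Get a handle to a model by its identifier
# ("qwen2.5-7b-instruct" is a placeholder for illustration).
model = lms.llm("qwen2.5-7b-instruct")

# Ask the model for a completion and print the resulting text.
result = model.respond("What is the capital of France?")
print(result)
```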
Running the server
To start the API server, open the Developer tab in LM Studio and flip the "Start server" toggle.
Alternatively, you can use lms (LM Studio's CLI) to start the server from your terminal:
```bash
lms server start
```

API options

- LM Studio REST API
- TypeScript SDK - lmstudio-js
- Python SDK - lmstudio-python
- OpenAI-compatible endpoints (see the example below)
- Anthropic-compatible endpoints
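Because the compatibility endpoints mirror the OpenAI API surface, existing OpenAI-client code can target LM Studio by overriding the client's base URL. A minimal sketch, assuming the server's default port 1234 and a placeholder model name:

```python
from openai import OpenAI

# Point the official OpenAI client at LM Studio's local server.
# LM Studio ignores the API key, but the client requires one.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="qwen2.5-7b-instruct",  # placeholder: use a model loaded in LM Studio
    messages=[{"role": "user", "content": "Say hello from my local server."}],
)
print(response.choices[0].message.content)
```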