Using LM Link with Integrations
Use a remote device's model with coding tools like Claude Code and Codex via LM Link
With LM Link, your coding tools can run models on a remote device (like a dedicated LLM rig on your network) while you work from your laptop.
Use your integration as normal
Start LM Studio's server on your local machine and configure your tool to point to it. Model load requests are routed to the device where the model is already loaded, or to the preferred device if one is set.

Your local machine serves the API at localhost:1234, while inference runs on the device where the model is loaded.
```bash
lms server start --port 1234
```

Claude Code
```bash
export ANTHROPIC_BASE_URL=http://localhost:1234
export ANTHROPIC_AUTH_TOKEN=lmstudio
claude --model qwen3-8b
```

See the full Claude Code guide.
Codex
```bash
codex --oss -m qwen3-8b
```

See the full Codex guide.
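Whichever tool you use, it ultimately talks to the local server at localhost:1234. Before wiring up a coding tool, it can help to confirm the server is reachable and see which models it exposes. Below is a minimal sketch using LM Studio's OpenAI-compatible `/v1/models` and chat-completions endpoints; the port and the `qwen3-8b` model name come from the examples above, and the helper names are illustrative, not part of any API:

```python
# Sanity-check sketch for a local LM Studio server (assumes the default
# port 1234 used in the examples above; adjust BASE_URL if yours differs).
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"


def chat_payload(model: str, prompt: str) -> dict:
    """Build a minimal OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 32,
    }


def list_models(base_url: str = BASE_URL) -> list:
    """Return the IDs of models visible through the local server,
    including models loaded on remote LM Link devices."""
    with urllib.request.urlopen(f"{base_url}/models", timeout=5) as resp:
        data = json.load(resp)
    return [m["id"] for m in data.get("data", [])]


if __name__ == "__main__":
    # Show the request body a tool would send for the example model.
    print(json.dumps(chat_payload("qwen3-8b", "Say hello"), indent=2))
    try:
        print("Visible models:", list_models())
    except OSError:
        print("Server not reachable; run `lms server start --port 1234` first.")
```

If `list_models()` returns the model you expect, requests from Claude Code or Codex should route to the device that model is loaded on.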
Set a preferred device
To use a model on a specific remote device, set the device as the preferred device.
See set a preferred device for more details.
If you're running into trouble, hop onto our Discord.