Posted by yags 6/25/2025

MCP in LM Studio (lmstudio.ai)
240 points | 145 comments
jtreminio 6/26/2025|
I’ve been wanting to try LM Studio, but I can’t figure out how to use it over the local network. My desktop in the living room has the beefy GPU, but I want to use LM Studio from my laptop in bed.

Any suggestions?

numpad0 6/26/2025||

  [>_] -> [.* Settings] -> Serve on local network ( o)
Any OpenAI-compatible client app should work: use the IP address of the host machine as the API server address. The API key can be bogus or blank.
skygazer 6/26/2025||
Use an OpenAI-compatible API client on your laptop and LM Studio on your server, and point the client at your server. LM Studio's server can expose the loaded LLM on a desired port via the OpenAI-style chat completion API. You can also install Open WebUI on your server, connect to it via a web browser, and configure it to use the LM Studio connection for its LLM.
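A minimal sketch of the setup the two comments above describe, in Python with the openai package; the LAN address, port (LM Studio's default is 1234), and model identifier are placeholders to swap for your own values:

  # Minimal sketch: talk to LM Studio's OpenAI-compatible server from
  # another machine on the LAN. Assumptions: desktop at 192.168.1.50,
  # server on the default port 1234, and a model already loaded whose
  # identifier you copy from the LM Studio UI.
  from openai import OpenAI

  client = OpenAI(
      base_url="http://192.168.1.50:1234/v1",  # the desktop's LAN address
      api_key="anything",                      # LM Studio ignores the key
  )

  reply = client.chat.completions.create(
      model="your-model-identifier",  # the name LM Studio lists for the loaded model
      messages=[{"role": "user", "content": "Hello from the laptop in bed."}],
  )
  print(reply.choices[0].message.content)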
smcleod 6/26/2025||
I really like LM Studio, but their license / terms of use are very hostile. You're in breach if you use it for anything work-related, so just be careful, folks!
jmetrikat 6/26/2025||
Great! It's very convenient to try MCP servers with local models that way.

Just added the `Add to LM Studio` button to the Anytype MCP server; it looks nice: https://github.com/anyproto/anytype-mcp
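For anyone curious what the button effectively does: LM Studio registers MCP servers in an mcp.json file using the Cursor-style `mcpServers` layout. Below is a rough Python sketch of adding an entry by hand; the config path, package name, and env var are assumptions, so check the anytype-mcp README for the real launch command:

  # Rough sketch of registering an MCP server in LM Studio's mcp.json by hand.
  # Assumptions: the config lives at ~/.lmstudio/mcp.json and follows the
  # Cursor-style "mcpServers" layout; command/args/env below are placeholders.
  import json
  from pathlib import Path

  config_path = Path.home() / ".lmstudio" / "mcp.json"
  config = json.loads(config_path.read_text()) if config_path.exists() else {}
  config.setdefault("mcpServers", {})["anytype"] = {
      "command": "npx",                          # hypothetical launch command
      "args": ["-y", "anytype-mcp"],             # placeholder package name
      "env": {"ANYTYPE_API_KEY": "<your-key>"},  # placeholder credential
  }
  config_path.parent.mkdir(parents=True, exist_ok=True)
  config_path.write_text(json.dumps(config, indent=2))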

bbno4 6/26/2025||
Is there an app that runs locally, uses OpenRouter / Claude or something similar, and has MCP support?
cedws 6/26/2025||
I’m looking for something like this too. Msty is my favourite LLM UI (supports remote + local models) but unfortunately has no MCP support. It looks like they’re trying to nudge people into their web SaaS offering which I have no interest in.
eajr 6/26/2025|||
I've been considering building this. Haven't found anything yet.
cchance 6/26/2025||
VS Code with Roo Code... just use the chat window :S
squanchingio 6/25/2025||
It'd be nice to have the MCP servers exposed the same way as LM Studio's OpenAI-like endpoints.
zaps 6/25/2025||
Not to be confused with FL Studio
maxcomperatore 6/25/2025||
good.