Posted by yags 6 days ago

MCP in LM Studio (lmstudio.ai)
240 points | 144 comments
jtreminio 6 days ago|
I’ve been wanting to try LM Studio but I can’t figure out how to use it over local network. My desktop in the living room has the beefy GPU, but I want to use LM Studio from my laptop in bed.

Any suggestions?

numpad0 6 days ago||

  [>_] -> [.* Settings] -> Serve on local network ( o)
Any OpenAI-compatible client app should work - use the IP address of the host machine as the API server address. The API key can be bogus or blank.
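A minimal sketch with the openai Python package, assuming the desktop's LAN address is 192.168.1.50 and the server is on LM Studio's default port 1234 (both placeholders; the real address and port are shown in LM Studio's server settings):

  from openai import OpenAI

  # Placeholder host/port; use whatever LM Studio reports for its local server.
  client = OpenAI(base_url="http://192.168.1.50:1234/v1", api_key="not-needed")

  resp = client.chat.completions.create(
      model="local-model",  # placeholder; LM Studio serves whichever model is loaded
      messages=[{"role": "user", "content": "Hello from the laptop"}],
  )
  print(resp.choices[0].message.content)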
skygazer 6 days ago||
Use an OpenAI-compatible API client on your laptop and LM Studio on your server, and point the client at the server. LM Studio's server can serve an LLM on a chosen port via the OpenAI-style chat completion API. You can also install Open WebUI on the server, connect to it via a web browser, and configure it to use the LM Studio connection for its LLM.
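A rough sketch of the same OpenAI-style endpoints hit directly over HTTP (placeholder host/port again), which is roughly what a client like Open WebUI does when pointed at that connection:

  import requests

  # Placeholder address; substitute the server's LAN IP and chosen port.
  BASE = "http://192.168.1.50:1234/v1"

  # List the models the server exposes.
  print(requests.get(f"{BASE}/models").json())

  # OpenAI-style chat completion request.
  payload = {
      "model": "local-model",  # placeholder identifier
      "messages": [{"role": "user", "content": "Say hi"}],
  }
  print(requests.post(f"{BASE}/chat/completions", json=payload).json())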
smcleod 5 days ago||
I really like LM Studio, but their license / terms of use are very hostile. You're in breach if you use it for anything work-related, so just be careful, folks!
jmetrikat 5 days ago||
Great! It's very convenient to try MCP servers with local models that way.

Just added the `Add to LM Studio` button to the Anytype MCP server, looks nice: https://github.com/anyproto/anytype-mcp

bbno4 6 days ago||
Is there an app that uses OpenRouter / Claude or something locally but has MCP support?
cedws 6 days ago||
I’m looking for something like this too. Msty is my favourite LLM UI (supports remote + local models) but unfortunately has no MCP support. It looks like they’re trying to nudge people into their web SaaS offering which I have no interest in.
eajr 6 days ago|||
I've been considering building this. Haven't found anything yet.
cchance 6 days ago||
VS Code with Roo Code... just use the chat window :S
squanchingio 6 days ago||
It'd be nice to have the MCP servers exposed through LM Studio's OpenAI-like endpoints.
zaps 6 days ago||
Not to be confused with FL Studio
maxcomperatore 6 days ago||
good.