
Posted by gronky_ 3/26/2025

OpenAI adds MCP support to Agents SDK (openai.github.io)
807 points | 267 comments
johnjungles 3/26/2025|
If you want to try out mcp (model context protocol) with little to no setup:

I built https://skeet.build/mcp where anyone can try out mcp for cursor and now OpenAI agents!

We built this because of a pain point I experienced as an engineer: crummy MCP setup, lack of support, and the complexity of trying to stand up your own.

Mostly for workflows like:

* start a PR with a summary of what I just did
* Slack or comment to Linear/Jira with a summary of what I pushed
* pull this issue from Sentry and fix it
* find a bug and create a Linear issue to fix it
* pull this Linear issue and do a first pass
* pull in this Notion doc with a PRD, then create an API reference for it based on this code
* pull Postgres or MySQL schemas for rapid model development
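To make the workflows above concrete: each one maps naturally onto an MCP tool, i.e. a name, a description, a JSON Schema for the inputs, and a handler. Here is a minimal stdlib-only sketch of one of them ("find a bug and create a Linear issue"); the tool name, schema fields, and the stubbed return value are all hypothetical, and a real server would call the Linear API instead of returning a stub:

```python
# Hypothetical MCP-style tool definition for the "create a Linear issue"
# workflow. The shape (name / description / inputSchema) follows the MCP
# tool format; the handler is a stub standing in for a real Linear call.
CREATE_ISSUE_TOOL = {
    "name": "create_linear_issue",
    "description": "Create a Linear issue from a bug summary.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "title": {"type": "string"},
            "description": {"type": "string"},
        },
        "required": ["title"],
    },
}

def handle_tool_call(name: str, arguments: dict) -> dict:
    """Dispatch a tools/call request to the matching handler."""
    if name == "create_linear_issue":
        # Real code would hit the Linear API here; we return a stub result.
        return {"issue_id": "LIN-123", "title": arguments["title"]}
    raise ValueError(f"unknown tool: {name}")
```

An MCP server is essentially a registry of such definitions plus a dispatcher like this, exposed over stdio or SSE.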

Everyone seems to go for the hype, but ease of use, practical developer workflows, and high-quality, polished MCP servers are what we're focused on.

Lmk what you think!

polishdude20 3/27/2025||
This looks super cool and I can see myself using this!

Although the name is a bit unfortunate.

pfista 3/27/2025||
my favorite is implementing issues from linear
goaaron 3/27/2025||
Call it a protocol and suddenly it sounds like a foundational technology. Nah, it's just a fancy JSON schema that lets LLMs play hot potato with metadata.
rgomez 3/26/2025||
Shamelessly promoting here: I created an architecture that gives an AI agent those so-called "tools" locally (under the user's control), and it works with any kind of LLM and, in theory, any LLM server. I've been showing demos of it for months. It works as middleware, in-stream, between the LLM server and the chat client, and it works very well. The project is open source; the repo is outdated, but only because no one has expressed interest in looking into the code. Here it is: https://github.com/khromalabs/Ainara. There's a link to a video there. Just yesterday I recorded a video showcasing DeepSeek V3 as the LLM backend (but it could be any model from OpenAI, Anthropic, whatever).
nomel 3/26/2025|
The lack of interest may be from the crypto aspect:

> While the project will always remain open-source and aims to be a universal AI assistant tool, the officially developed 'skills' and 'recipes' (allowing AI to interact with the external world through Ainara's Orakle server) will primarily focus on cryptocurrency integrations. The project's official token will serve as the payment method for all related services.

rgomez 3/27/2025||
Thank you for the feedback... actually I need to update that: the crypto part of the project will be closed source (a specific remote server), but the idea behind the project itself has been universal and open since the very beginning. I've already developed dozens of skills, including a meta-search engine (it searches several engines at once and combines the results dynamically, all balanced by the AI), which are open source as well. Crypto just presented itself as a way of funding the project with no strings attached, and to this day no one else has shown up.
keyle 3/26/2025||
I'm new to "MCP"... It says here that even IDEs plug into this MCP server [1], as in, you don't edit files directly anymore but go through a client/server?

It wasn't bad enough that we now run servers locally to constantly compile code and tell us via json we made a typo... Soon we won't even be editing files on a disk, but accessing them through a json-rpc client/server? Am I getting this wrong?

[1] https://modelcontextprotocol.io/introduction

NicuCalcea 3/26/2025|
I think you are getting it wrong. Some IDEs like Cursor and VS Code extensions like Cline support MCP servers, meaning you can give them access to your databases, Jira tickets, Notion notes, Slack messages, etc.
larodi 3/26/2025||
Claude is like years ahead of everyone else in tools and agentic capabilities.
MoonGhost 3/26/2025|
Can't be. MCP "was launched by Anthropic back in November 2024."
senko 3/26/2025||
That's a few decades in AI time.
eigenvalue 3/27/2025||
This is great, I was debating whether I should do my latest project using the new OpenAI Responses API (optimized for agent workflows) or using MCP, but now it seems even more obvious that MCP is the way to go.

I was able to make a pretty complex MCP server in 2 days for LLM task delegation:

https://github.com/Dicklesworthstone/llm_gateway_mcp_server

chaosprint 3/26/2025||
claude needed these tools in 2024, so getting the community to contribute them for free was actually a smart move.

service providers get more traffic, so they’re into it. makes sense.

claude 3.5 was great at the time, especially for stuff like web dev. but now deepseek v3 (0324) is way better value. gemini's my default for multimodal. openai still feels smartest overall. i’ve got qwq running locally. for deep research, free grok 3 and perplexity work fine. funny enough, claude 3.7 being down these two days didn’t affect me at all.

i checked mcp since i contribute to open source, but decided to wait. few reasons:

- setup’s kind of a mess. it’s like running a local python or node bridge, forwarding stuff via sse or stdio. feels more like bridging than protocol innovation

- I think eventually we need all apps to be built on some kind of built-in, AI-first protocol. I think only Apple (and maybe Google) have that kind of influence. Think Lightning vs USB-C.

- performance might be a bottleneck later, especially for multimodal.

- same logic as no. 2, but the question is: do you really want every app to be AI-first?
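For context on the "local bridge" point above: a typical stdio MCP setup is just a client-side config entry telling the host app which subprocess to spawn and speak JSON-RPC with over stdin/stdout. A representative example (the server name, package, and path here are illustrative; exact config keys vary by client):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
    }
  }
}
```

The client launches that command on startup, which is exactly the "running a local node bridge" experience being described.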

main issue for me: tools that really improve productivity are rare. a lot of mcp use cases sound cool, but i'll never give full github access to a black box. same for other stuff. so yeah—interesting idea, but hard ceiling.

striking 3/26/2025|
You should take a look at how Claude Code does its permissioning. It's totally fine to connect it right up to your GitHub MCP server because it'll ask each time it wants to take an action (and you can choose "don't ask again for this tool" if it's an obviously safe operation like searching your PRs).
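The pattern described here (confirm each tool call, with a persistent "don't ask again for this tool" option) is easy to sketch generically. This is not Claude Code's actual implementation, just a minimal stdlib illustration of the same gating idea, with a pluggable `ask` callback standing in for the interactive prompt:

```python
# Generic sketch of per-tool-call permissioning with an "always allow"
# list, in the spirit of the UX described above (not Claude Code's code).
class PermissionGate:
    def __init__(self, ask):
        # ask(tool, args) -> "yes" | "no" | "always"; in a real client
        # this would be an interactive prompt shown to the user.
        self.ask = ask
        self.always_allowed = set()

    def check(self, tool: str, args: dict) -> bool:
        """Return True if this tool call may proceed."""
        if tool in self.always_allowed:
            return True  # user previously chose "don't ask again"
        answer = self.ask(tool, args)
        if answer == "always":
            self.always_allowed.add(tool)
            return True
        return answer == "yes"
```

Once a tool like "search PRs" is marked always-allowed, subsequent calls skip the prompt entirely, while destructive tools keep prompting every time.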
chaosprint 3/26/2025||
First, I don't want to pay for Claude Code at all...

Second, isn't that doable with plain API calling? :)

TIPSIO 3/26/2025||
I know Cloudflare has been talking about remote MCP for a while, does anyone have a solid example of this in practice?
avaer 3/26/2025||
Does anyone have any prior art for an MCP server "message bus" with an agent framework like Mastra?

E.g. suppose I want my agent to operate as a Discord bot listening on a channel via an MCP server subscribed to the messages, i.e. the MCP server itself is driving the loop, not the framework, with the agent doing the processing.

I can see how this could be implemented using MCP resource pubsub, with the plugin and agent being aware of this protocol and how to pump the message bus loop, but I'd rather not reinvent it.

Is there a standard way of doing this already? Is it considered user logic that's "out of scope" for the MCP specification?

EDIT: added an example here https://github.com/avaer/mcp-message-bus
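Absent standard prior art, the inversion being asked about (the event source pumps the loop and hands each message to the agent as a callback) can be sketched with a plain queue. All names here are hypothetical and no MCP or Mastra APIs are used; in the Discord scenario the MCP server's subscription handler would be the producer pushing onto the bus:

```python
import queue

STOP = object()  # sentinel the producer pushes to end the loop

def run_bus(bus, agent):
    """Drain events from the bus, letting the agent process each one.

    The bus (the MCP server / Discord listener in the scenario above)
    drives the loop; the agent is just a per-event callback.
    """
    replies = []
    while True:
        event = bus.get()
        if event is STOP:
            break
        replies.append(agent(event))
    return replies
```

A framework-native version would replace `agent` with the agent's run method and the queue with the MCP resource-subscription notifications, but the control flow is the same.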

tonyhb 3/27/2025|
https://inngest.com and AgentKit. Disclaimer: I work on it.

Does all of the event stuff and state stuff for you. Plus the orchestration.

punkpeye 3/27/2025||
Big fan of Inngest! Most of http://glama.ai/mcp logic is built on top of Inngest.
paradite 3/27/2025|
MCP is basically commoditizing SaaS and software by abstracting them away behind the AI agent interface.

It benefits MCP clients (ChatGPT, Claude, Cursor, Goose) more than the MCP servers and the service behind the MCP servers (GitHub, Figma, Slack).
