
Posted by mikeevans 17 hours ago

Codex is now in the ChatGPT mobile app(openai.com)
370 points | 180 comments
charlie90 10 hours ago|
Nice. The next step is giving Codex/Claude Code local device control. The problem is that current iOS/Android are so locked down that agents can't do much, but the space is so ripe for disruption that I bet we'll see AI-native devices coming out within the next few years that let agents interact with everything. I would be nervous if I were Apple right now.
shepherdjerred 9 hours ago||
I’d finally have a use case for my overpowered iPad if it could compile and run code
satvikpendem 9 hours ago||
Now how would Apple get that sweet Mac money if you could do everything from your iPad? And that's exactly why they artificially segment those devices.
adithyassekhar 10 hours ago||
Android can allow an app to control the device using accessibility permissions.
rexthonyy 6 hours ago||
Once the winner of the everything-app race is revealed, I foresee this feature being integrated into it: one platform to manage agents from any provider.
satvikpendem 9 hours ago||
Somewhat related: some of these AI remote-coding apps are iOS-only, so what are people using on Android? It looks like some people use terminal emulators to SSH into their machine and run the LLM CLIs, but that seems clunky.
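The SSH workflow described above can be made less clunky with a config entry on the phone; the host alias, user, and Tailscale address below are hypothetical placeholders, not anyone's actual setup:

```
# ~/.ssh/config on the phone (Termux or similar terminal emulator)
Host devbox
    HostName 100.x.y.z        # the workstation's Tailscale IP (placeholder)
    User me                   # hypothetical username
    ServerAliveInterval 30    # keep the session alive on flaky mobile networks
```

Then a single command attaches to (or creates) a persistent session running the agent CLI, so the run survives the phone locking: `ssh devbox -t 'tmux new -A -s agent'`.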
mrasong 5 hours ago||
Well, this just made it even easier to keep coding away from the desk.
mintflow 9 hours ago||
Oh no, I was just adding Codex integration to my app, with in-app Tailscale networking communicating with the Codex app server via websocket over Tailscale.

But I will still consider releasing it anyway.
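The setup described above can be sketched in miniature. The actual app-server protocol isn't public, so this stand-in uses newline-delimited JSON over a plain TCP connection to the machine's Tailscale IP instead of a real websocket; every name and field here is hypothetical:

```python
import asyncio
import json

# Hypothetical framing: one JSON object per line, standing in for what a
# websocket message would carry. Field names are made up for illustration.
def encode_msg(method: str, params: dict) -> bytes:
    """Serialize a request as one newline-delimited JSON line."""
    return (json.dumps({"method": method, "params": params}) + "\n").encode()

def decode_msg(line: bytes) -> dict:
    """Parse one newline-delimited JSON message."""
    return json.loads(line.decode())

async def send_prompt(host: str, port: int, prompt: str) -> dict:
    """Connect over the tailnet (host is the machine's Tailscale IP)
    and exchange a single request/response pair."""
    reader, writer = await asyncio.open_connection(host, port)
    writer.write(encode_msg("prompt", {"text": prompt}))
    await writer.drain()
    reply = decode_msg(await reader.readline())
    writer.close()
    await writer.wait_closed()
    return reply
```

Because Tailscale presents the remote machine as an ordinary IP on a private network, the client needs no auth or tunneling logic of its own; that's what makes this kind of mobile companion app feasible.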

Jimidesuu 10 hours ago||
So Codex is also heading toward 'portability', and I can see that here, but I bet it will take time before it's cleanly optimized for heavy mobile use.
iridione 15 hours ago||
This is neat! Now I'm curious: what's left to innovate in the coding-agent space? Sure, there are the usual suspects like maintenance, security, reliability, and other scalability improvements, and it looks like those will be addressed in the next year or two.
michelb 3 hours ago||
The entire UI/UX? We went back in time and basically have a text streamer in a '70s-style terminal, or an existing editor-like situation. If you want to read and (hand)write code, sure, you might be done and happy with this new variant of what you had decades ago.

A UI like Jira/Trello to stage features and see (agentic) team status. A Figma-like UX to actually build out the app/interface/features. A system that aids human review. There are tons of paradigms to explore and improve on.

thornewolf 15 hours ago|||
There is something "wrong" with the UX that is hard to pin down. These things generate even text summaries more rapidly than I can read them. I need a better method for dumping info into my brain, plus dynamic control (if necessary).
jpalomaki 13 hours ago|||
Tell it to create HTML summaries with diagrams and a sidebar for navigation.

Or ask Codex to create an image that explains xyz.

ssl-3 14 hours ago|||
When I take time to read all of the output, I often find that it's mostly noise. I don't like noise so I usually don't bother.

But a person can use subagents, if they want, to filter that down. This burns tokens in a big hurry, but I think subagents can be arbitrary local commands (e.g., a local LLM).

Or, you know: Just slow down. :) It doesn't always have to be a race, does it?
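The subagents-as-local-commands idea above is, mechanically, just a pipe: feed the verbose transcript to any stdin-to-stdout filter and read back the condensed version. A minimal sketch, using a trivial stand-in filter since any real local-LLM command name would be an assumption:

```python
import subprocess
import sys

def filter_transcript(text: str, cmd: list[str]) -> str:
    """Pipe a verbose agent transcript through a local command and
    return the condensed result. In practice cmd could be a local LLM
    CLI; any stdin->stdout tool works."""
    result = subprocess.run(cmd, input=text, capture_output=True,
                            text=True, check=True)
    return result.stdout

# Stand-in filter: keep only lines that look like decisions.
# (A real setup would swap in an actual summarizer command here.)
summarize = [sys.executable, "-c",
    "import sys; print('\\n'.join(l for l in sys.stdin.read().splitlines()"
    " if l.startswith('DECISION:')))"]
```

For example, `filter_transcript("noise\nDECISION: use sqlite\nmore noise\n", summarize)` returns just the `DECISION:` line. The filter runs locally, so unlike a model-backed subagent it burns no tokens at all.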

deadbabe 13 hours ago|||
Agent farms. Have agents make tons of random high-fidelity variations of the same app or feature around the clock from some vague ideas, then use each of them to see which one you like best and can productize, skipping the need for iterative prompts.
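A minimal sketch of the fan-out step described above, assuming the "vague idea" is expanded along explicit design axes into one concrete prompt per agent run (the function and field names are illustrative, not any vendor's API):

```python
import itertools

def fan_out(base_prompt: str, axes: dict[str, list[str]]) -> list[str]:
    """Expand one vague idea into concrete prompt variants by taking
    the cartesian product of design axes, one prompt per agent run."""
    keys = list(axes)
    prompts = []
    for combo in itertools.product(*(axes[k] for k in keys)):
        detail = ", ".join(f"{k}: {v}" for k, v in zip(keys, combo))
        prompts.append(f"{base_prompt} ({detail})")
    return prompts
```

For example, `fan_out("Build a note-taking app", {"style": ["minimal", "playful"], "storage": ["local", "sync"]})` yields four variant prompts to dispatch in parallel; the human's job shrinks to picking the winner.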
helsinki 11 hours ago||
Some of us pay by the token.
ukuina 13 hours ago||
Is texting your coding agent really the final form? Something that watches your interactions or process execution to surface improvements, or whips up prototypes while you brainstorm, seems like the next step.
satvikpendem 9 hours ago||
Not sure why this was flagged; it makes sense, but only if inference gets sufficiently cheap. It would be awesome to see a bunch of interactive prototypes and iterate on the UX before ever building the full app. Historically that's been somewhat difficult even with UX designers.
GTonehour 7 hours ago||
This feature could well be the reason OpenAI hired Peter Steinberger (OpenClaw).
andai 5 hours ago||
That was my thought too. Claude Code and Codex are very close to Claw already (general-purpose computer use) and are moving increasingly in that direction (mobile integrations, built-in memory features, etc.).

The main issue is reliability, so I think the corporations will take a much more gradual, piecemeal approach, and probably end up with something like Claw within a year.

sumedh 3 hours ago|||
They are just copying features from Claude Code.
rlt 7 hours ago||
No.
GanteRooibos 5 hours ago|
So we can finally stop doing tailscale + ssh + codex. Nice.