Posted by mfiguiere 4/16/2025

OpenAI Codex CLI: Lightweight coding agent that runs in your terminal (github.com)
516 points | 289 comments
brap 4/16/2025|
What's the point of making the GIF run so fast you can't even see shit?
sva_ 4/16/2025||
LLMs currently prefer to give you a wall of text in the hope that some of it is correct/answers your question, rather than giving a succinct, atomic, and correct answer. I'd prefer the latter personally.
mwigdahl 4/16/2025||
Try o3. My (very limited) experience with it is that it is refreshingly free of the normal LLM flattery, hedging, and overexplaining. It figures out your answer and gives it to you straight.
dheera 4/16/2025|||
People somehow seem to be averse to making the shift from GIF to H.264.
porphyra 4/16/2025||
To be fair, terminal output is one of the few things where GIF's LZW compression and limited color palette shine.
e12e 4/16/2025||
Not as much as https://asciinema.org/ - when you can use that...
porphyra 4/16/2025||
True, but embedding a GIF is way easier than using a JavaScript thing, which might not be allowed in most places.
dheera 4/16/2025||
Browsers just need to support <img src="foo.mp4" style="width:256px;"> already.

It should behave exactly like a GIF, loop by default, and be usable for emojis and everything.

There is absolutely ZERO reason we should be stuck at 256 colors for things like cat videos used as chat stickers. We have had 24-bit displays for ages.
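
In the meantime, the closest you can get is a muted autoplaying <video>, something like this (a rough sketch, not a drop-in <img> replacement):

  <!-- behaves roughly like a GIF: autoplays because it's muted, loops, no controls -->
  <video src="foo.mp4" autoplay loop muted playsinline
         style="width:256px;"></video>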

porphyra 4/16/2025||
Animated WebP has pretty good browser support by now [1], and Discord uses it by default to serve animated emojis and stickers.

However, many image hosting tools still don't let you upload webp.

[1] https://caniuse.com/webp

yablak 4/16/2025||
That's the model speed :)
brap 4/16/2025||
Not really, they don't even give you a second to read the output before it loops again.
jedberg 4/16/2025||
Apologies for breaking the HN rule against discussing the comments in the comments, but the voting behavior in this thread is fascinating to me. It seems like this is super controversial and I'm not sure why.

The top comments have a negative score right now, which I've actually never seen.

And also it's a top post with only 15 comments, which is odd.

It's all so fascinating how far outside the norm OpenAI is.

Dangeranger 4/16/2025|
People are getting fed up with comments hijacking the thread to promote a competing business or side project.
ChadMoran 4/16/2025||
Hey, I tried to solve that by building an upvote bot for the legit comments! Check out my GitHub!

/s

bigyabai 4/16/2025||

  RAM  4‑GB minimum (8‑GB recommended)
It's a CLI...
cryptoz 4/16/2025||
Possibly the heaviest "lightweight" CLI tool ever made haha.
ChadMoran 4/16/2025|||
Lightweight in its capability, I guess.
m00x 4/16/2025||
Which needs to fit all the code in memory + they're considering OS space, etc.
blt 4/16/2025||
Sorry for being a grumpy old man, but I don't have npm on my machine and I never will. It's a bit frustrating to see more and more CLI tools depending on it.
Dangeranger 4/16/2025||
You could just run it in a Docker container and not think about it much after that. Mount a volume into the container with the directory contents you want the agent to be able to edit.
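
Roughly something like this (untested sketch; package and env var names assumed):

  docker run -it --rm \
    -e OPENAI_API_KEY \
    -v "$PWD:/workspace" -w /workspace \
    node:22 \
    bash -c "npm install -g @openai/codex && codex"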

https://github.com/openai/codex/blob/main/codex-cli/scripts/...

John23832 4/16/2025|||
I asked the same question for Anthropic's version of this. Why is all of this in JS?
parhamn 4/16/2025|||
JS is the web's (and the "hip" developer's) Python, and in many ways it is better. Also the tooling is getting a lot better (libraries, TypeScript, bundling, packaging, performance).

One thing that could be cool: when Bun has sufficient Node.js compatibility, they should ship bun --compile builds so you don't need node/npm on the system.
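
Something like this should spit out a single self-contained binary (paths assumed):

  bun build ./src/cli.ts --compile --outfile codex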

Then it's arguably a "why not JS?"

throwaway314155 4/16/2025||
> and in many ways it is better

Right but is it worth having to write JS?

/s (kinda)

photonthug 4/16/2025||||
Tree-sitter related bits probably
emporas 4/16/2025||
tree-sitter is a C library, though. Only the grammars for each particular language are defined in JavaScript.
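
A grammar.js is roughly this shape (toy sketch), which tree-sitter then compiles down to a C parser:

  // grammar.js -- toy example, not a real grammar
  module.exports = grammar({
    name: "toy",
    rules: {
      source_file: $ => repeat($.definition),
      definition: $ => seq("def", $.identifier, ";"),
      identifier: $ => /[a-z]+/,
    },
  });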
AstroBen 4/16/2025||||
TypeScript is a pretty nice language to work with. Why not?
tyre 4/16/2025|||
this is a strong HN comment. lots of “putting a stick in my own bicycle wheel” energy

there are tons of fascinating things happening in AI and the evolution of programming right now. Claude and OpenAI are at the forefront of these. Not trying it because of npm is a vibe and a half.

schainks 4/16/2025|||
Why? I am not the biggest fan of needing a whole VM to run CLI tools either, but it's a low-enough friction experience that I don't particularly care as long as the runtime environment is self-contained.
sudofail 4/16/2025|||
Same, there are so many options these days for writing CLIs without runtime dependencies. I definitely prefer static binaries.
therealmarv 4/16/2025|||
It might shock you, but many of us use editors built on browsers for editing source code.

I think the encapsulation suggestion from another guy (Docker, or any other VM you like) might be your solution.

Vegenoid 4/16/2025|||
What package managers do you use, and what does npm do differently that makes you unwilling to use it?
teaearlgraycold 4/16/2025|||
Judge the packages on their dependencies, not on their package manager.
crancher 4/16/2025|||
What are your concerns?
jensenbox 4/16/2025||
The entire JS ecosystem.
ilrwbwrkhv 4/16/2025|||
Yep, this is another one of the reasons why all of these tools are incredibly poor. Like, the other day I was looking at the MCP spec from Anthropic, and it might be the worst spec I've ever read in my life. Enshittification at the level of an entire industry is happening.
meta_ai_x 4/16/2025||
If OpenAI had really smart models, they would have converted these TS/JS apps to Go or Rust apps.

Since they don't, AGI is not here.

terminaltrove 4/16/2025|
It's very interesting that both OpenAI and Anthropic are releasing tools that run in the terminal, especially with a TUI, which is what we showcase.

aider was one of the first tools we listed as terminal tool of the week (0) last year (1).

We recently featured parllama (2) (not our tool) if you'd like to run offline and online models in the terminal with a full TUI.

(0) https://terminaltrove.com/tool-of-the-week/

(1) https://terminaltrove.com/aider/

(2) https://terminaltrove.com/parllama/

sva_ 4/16/2025|
GitHub Copilot has had a tool that runs in the terminal for longer, I'm pretty confident. I can invoke it with the syntax "?? <question>" and it'll respond with a command, explaining the parameters. I've been using it quite a bit for stuff like ffmpeg invocations or writing bash one-liners.
tough 4/17/2025||
Yep, that was the alias during the beta.

Now it's just gh copilot.
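
For example:

  gh copilot suggest "re-encode input.mov as a web-friendly mp4 with ffmpeg"
  gh copilot explain "tar -xzvf archive.tar.gz"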

https://docs.github.com/en/copilot/using-github-copilot/usin...

https://github.com/github/gh-copilot