Ask HN: Codex is too slow. Is there any solution?
The Codex backend is of good quality, the frontend is average, but most importantly it is too slow. I wonder if OpenAI will improve it.
It seems to work with fewer issues than CC Opus.
I don’t mind if it takes longer as long as the answer is correct more often.
You can always be doing more work while one chat is working.
The new 0.47 has better performance now, imho.
This seems like a crazy idea, as the CLI client has nothing to do with how many tokens per second the API streams.
The API is rarely the bottleneck; what matters is how fast the CLI gathers context. Just by using ripgrep it will be faster than using grep, and on top of that, concurrent code search is faster than sequential search.
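To illustrate the concurrent-search point: below is a minimal sketch of fanning file scans out over a thread pool instead of reading files one by one. This is just an illustration of the general idea, not how Codex or any CLI actually implements its search; all names here (`search_file`, `concurrent_search`) are made up for the example.

```python
import concurrent.futures
import os

def search_file(path, needle):
    # Return (path, line_number, line) tuples for matches in one file;
    # skip files that cannot be read as text.
    try:
        with open(path, "r", errors="ignore") as f:
            return [(path, i, line.rstrip())
                    for i, line in enumerate(f, 1) if needle in line]
    except OSError:
        return []

def concurrent_search(root, needle, max_workers=8):
    # Searching is I/O-bound, so overlapping file reads with a thread
    # pool tends to beat scanning the tree sequentially.
    paths = [os.path.join(d, name)
             for d, _, files in os.walk(root) for name in files]
    hits = []
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        for result in pool.map(lambda p: search_file(p, needle), paths):
            hits.extend(result)
    return hits
```

Tools like ripgrep get a similar effect (plus much faster matching) natively, which is why swapping it in for plain grep speeds up context gathering.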
Sonnet and Gemini are good and fast. Can't speak for Grok.
Grok Turbo is fast.