Posted by mooreds 1 day ago

I don't think AGI is right around the corner (www.dwarkesh.com)
350 points | 417 comments
pablocacaster 1 day ago|
LLMs shit the bed in the real world. I have never seen them work as 'AGI'. Sorry, it's just transformers plus the extra sauce of the APIs, so much pollution for a thing that fails between 50% and 90% of the time.
tclancy 1 day ago||
Here I was worried.
incomingpain 1 day ago||
I think AGI already exists in multiple datacenters. It's the only justification for these huge moves being made for capacity that couldn't possibly be needed otherwise.
im3w1l 1 day ago||
I think current LLMs are smart enough to trigger the intelligence explosion. And that we are in the early stages of that.
aeve890 1 day ago|
An explosion by definition happens in a very short period of time. How long is this explosion we're supposedly already in going to last? It reads like a Tom Clancy book.
im3w1l 1 day ago||
Timeframes are hard to predict. I just notice these signs: that it can suggest and reason about strategies to improve itself, and break those strategies down into smaller goals; that it can integrate with tools to (to some extent) work on those goals.
m3kw9 1 day ago||
When someone talks about AGI and then there is a public discussion about it, it's very analogous to a cat talking to a duck. Everyone responds with a different fantasy version of AGI in their minds.

Just look at the discussion here: you would think the other person's AGI is the same as yours, but it most likely isn't, and it's comical when you look at it from this bird's-eye view.

m3kw9 1 day ago||
Nobody has agreed on any definition of AGI, there are plenty of “makes sense” definitions though.
j45 1 day ago||
Even if something like AGI existed soon, or already does privately, it likely demands very high horsepower and cost, limiting its general and broad availability and leaving it in the hands of the few vs. the many, and optimizing that may take its sweet time.
mythrwy 1 day ago||
No, of course not. But it doesn't need to be for profound effects to be realized.

LLMs don't model anything but are still very useful. In my opinion the reason they are useful (aside from encoding massive amounts of information) is that language itself models reality, so we see simulated modeling of reality as an artifact.

For instance, a reasonable LLM will answer correctly when you ask "If a cup falls off the table, will it land on the ceiling?". But that isn't because the LLM is able to model scenarios with known rules in the way a physics calculation, or even innate human instinct, might. Having AI do this sort of modeling effectively is much more complex than next-token prediction; even dividing reality into discrete units may be a challenge. But without this type of thinking I don't see full AGI arising any time soon.
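To make the contrast concrete, here is a minimal sketch (mine, not the commenter's) of what "modeling with known rules" looks like for the falling-cup question: an explicit step-by-step physics simulation, as opposed to predicting the likely next token of an answer.

```python
# Explicit rule-based model of the falling-cup scenario: apply gravity
# step by step and observe where the cup ends up. The answer "floor, not
# ceiling" falls out of the rules rather than out of text statistics.

G = -9.81  # gravitational acceleration, m/s^2 (negative = downward)

def fall(height_m: float, dt: float = 0.01) -> str:
    """Drop a cup from table height; return where it lands."""
    y, v = height_m, 0.0
    while y > 0.0:
        v += G * dt   # velocity update from gravity
        y += v * dt   # position update
    return "floor"    # y only ever decreases: the cup lands below, never above

print(fall(0.75))  # a 0.75 m table -> "floor"
```

The point of the sketch is that the outcome is entailed by the model's rules; an LLM instead reproduces the answer because text about falling objects overwhelmingly says "floor".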

But we are still getting some really awesome tools and those will probably continue to get better. They really are powerful and a bit scary if you poke around.

kachapopopow 1 day ago||
Honestly, an o3 pro with an actual 1M context window (every model right now drops off at around 128k) that's as fast and cheap as 4o is already good enough for me.
tedsanders 1 day ago||
o3 pro doesn't have a 1M context window, unfortunately. GPT-4.1 and Gemini 2.5 do, though.
kachapopopow 1 day ago||
That's why I said "if". And that's a lot more plausible than an AGI.
namenotrequired 1 day ago||
You didn’t say “if”
kachapopopow 19 hours ago||
It's kind of implied.
v5v3 1 day ago||
That's nice to know.

What's that got to do with this post, though?

kachapopopow 1 day ago||
I don't feel like the AGI people are talking about is necessary; something like that would at minimum require as much compute as the neurons and synapses of a teenager (minus the requirements of maintaining a body).
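The compute comparison can be put in back-of-envelope numbers. These are rough published estimates of human brain scale, not figures from the comment, and the "one operation per synaptic event" assumption is a crude lower bound:

```python
# Back-of-envelope sketch: compute implied by "as much compute as the
# neurons and synapses of a teenager". All numbers are rough estimates.

NEURONS  = 86e9   # ~86 billion neurons in a human brain
SYNAPSES = 1e14   # ~100 trillion synapses
FIRE_HZ  = 10     # rough average firing rate, spikes per second

# One multiply-accumulate per synapse per spike gives a floor on ops/sec:
ops_per_sec = SYNAPSES * FIRE_HZ
print(f"~{ops_per_sec:.0e} synaptic ops/sec")  # on the order of 1e15
```

Even this crude floor is around a petaOP per second of sustained, always-on compute, which gives a sense of the scale the comment is gesturing at.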

I think that with some (very difficult to achieve, but possible in the foreseeable future) tweaks, what we have right now can already deliver 95% of what an "AGI" could do: put most of the population out of jobs, work together and improve itself (to a limited degree), and cause general chaos.

v5v3 1 day ago||
It would put people out of their 'current jobs', which many of them hate and only do to pay the bills.

A lot of people would be far happier and would find something better to do with their day if universal income came along.

Take developers as an example: many don't enjoy the corporate CRUD apps they build.

sublinear 1 day ago|
Welly, welly, welly, welly, welly, welly, well! No fucking shit!