Posted by mooreds 1 day ago

I don't think AGI is right around the corner (www.dwarkesh.com)
353 points | 419 comments
deadbabe 1 day ago|
I’ve noticed it’s becoming a lot more popular lately for people to come out and say AGI is still very, very far away. Is the hype cycle ending somewhat? Have we passed peak LLM?

Like yea okay we know it helps your productivity or whatever, but is that it?

gjm11 1 day ago||
Patel isn't saying that AGI is still very, very far away. He says his best estimate is 2032 and "ASI" in 2028 is "a totally plausible outcome".

(He thinks it might be quite a long way away: "the 2030s or even the 2040s", and it seems to me that the "2040s" scenarios are ones in which substantially longer than that is also plausible.)

andy99 1 day ago||
Maybe - anecdotally, HN at least is not very receptive to the idea that transformers are not (nor, with more data, ever will be) sentient, and almost every post I see about this is followed up by the idiotic "how do we know human intelligence isn't the same thing?", as if there's some body of commentators whose personal experience with consciousness somehow leads them to believe it might be achievable with matrix math.

Anyway, I don't think we're over the peak yet; the tech-adjacent pseudo-intellectuals who feed these bubbles (VCs etc.) still very much think that math that generates a plausible transcript is alive.

oasisaimlessly 1 day ago||
> experience with consciousness somehow leads them to believe it might be achievable with matrix math

That's trivially true if you subscribe to materialism; QM is "just matrix math".

aeve890 1 day ago|||
>QM is "just matrix math".

Err no. You can solve QM without using matrices. Matrix math is just a tool.
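
For instance, a standard textbook illustration (stated here only as a sketch of the point, with L the well width and m the particle mass): the infinite square well is solved entirely in the position representation, as an ordinary differential equation with boundary conditions, no matrices anywhere.

% Time-independent Schrödinger equation inside the well (V = 0), with
% hard-wall boundary conditions:
\[
  -\frac{\hbar^{2}}{2m}\,\frac{d^{2}\psi(x)}{dx^{2}} = E\,\psi(x),
  \qquad \psi(0) = \psi(L) = 0 .
\]
% Solving the ODE and imposing the boundary conditions gives the
% familiar standing-wave eigenfunctions and discrete energies:
\[
  \psi_{n}(x) = \sqrt{\tfrac{2}{L}}\,\sin\!\left(\frac{n\pi x}{L}\right),
  \qquad
  E_{n} = \frac{n^{2}\pi^{2}\hbar^{2}}{2mL^{2}},
  \qquad n = 1, 2, 3, \dots
\]

The matrix (Heisenberg) picture is an equivalent representation of the same theory, not the theory itself.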

JohnKemeny 1 day ago|||
You're not making the point you think you're making.
beiconic 1 day ago||
[dead]
t-3 1 day ago||
AGI is never coming. It's too hard, too expensive, and there's absolutely no valid use case. Fulfilling the god-complexes and/or fetishes of tech moguls is not enough to make the effort worth it.
xboxnolifes 1 day ago||
No valid use-case? If AGI at a human level were to exist, and cost less than hiring an equivalent human, it could replace most/all knowledge workers.
t-3 1 day ago||
AGI would need the whole management infrastructure and bureaucracy that we use to manage humans. Specially intelligent rather than generally intelligent AI would be run-and-done. AGI is the worse option.
SamPatt 1 day ago||
How can you claim there's no valid use case for AGI?

We already have enormous adoption for near-AGI.

t-3 1 day ago||
That "near-AGI" isn't all that near to AGI, and yet it still does what's needed. A fully autonomous intelligence would lessen it's usefulness and lower the efficiency. Nobody wants their electronic slave to get its own ideas or slack off burning cycles on some other random shit like a human would.
alecco 1 day ago|
Am I missing something? Why is his opinion relevant? I'm not going to read all that unless there's some signal of some kind. Podcast bros and their hype cycles are tiresome.
JohnKemeny 1 day ago|
Whose opinion would you want to hear? The CEO of an AI company?
alecco 1 day ago||
A scientist with published works. Or at least someone who wrote a well-sourced book and asked many people.
JohnKemeny 1 day ago||
I'm a scientist with many published works, albeit in the theory of computation. I have many colleagues publishing in top AI/ML conferences and journals.

Let me tell you: nobody listens to us.