Welly, welly, welly, welly, welly, welly, well! No fucking shit!
grantcas 7/8/2025||
[dead]
beiconic 7/6/2025||
[dead]
t-3 7/6/2025||
AGI is never coming. It's too hard, too expensive, and there's absolutely no valid use case. Fulfilling the god-complexes and/or fetishes of tech moguls is not enough to make the effort worth it.
xboxnolifes 7/6/2025||
No valid use-case? If AGI at a human level were to exist, and cost less than hiring an equivalent human, it could replace most/all knowledge workers.
t-3 7/6/2025||
AGI would need the whole management infrastructure and bureaucracy that we use to manage humans. Specially-intelligent rather than generally-intelligent AI would be run-and-done. AGI is the worse option.
SamPatt 7/6/2025||
How can you claim there's no valid use case for AGI?
We already have enormous adoption for near-AGI.
t-3 7/6/2025||
That "near-AGI" isn't all that near to AGI, and yet it still does what's needed. A fully autonomous intelligence would lessen it's usefulness and lower the efficiency. Nobody wants their electronic slave to get its own ideas or slack off burning cycles on some other random shit like a human would.
alecco 7/6/2025|
Am I missing something? Why is his opinion relevant? I'm not going to read all that unless there's a signal of some kind. Podcast bros and their hype cycles are tiresome.
JohnKemeny 7/6/2025|
Whose opinion would you want to hear? The CEO of an AI company?
alecco 7/7/2025||
A scientist with published works. Or at least someone who wrote a well-sourced book and asked many people.
JohnKemeny 7/7/2025||
I'm a scientist with many published works, albeit in the theory of computation. I have many colleagues publishing in top AI/ML conferences and journals.