
Posted by pegasus 4 days ago

A definition of AGI (arxiv.org)
304 points | 507 comments
CaptainOfCoit 4 days ago|
> defining AGI as matching the cognitive versatility and proficiency of a well-educated adult

Seems most of the people one would encounter out in the world might not possess AGI. How are we supposed to train our electrified rocks to have AGI if that's the case?

If no one has yet created an online quiz called "Are you smarter than AGI?" based on the proposed "ten core cognitive domains", I'd be disappointed.

jimbohn 3 days ago||
I think "our" mistake is that we wanted to make a modern human first, while being unable to make an animal or even a caveman, and we lost something in the leap-frog. But we effectively have a database of knowledge that has become interactive thanks to reinforcement learning, which is really useful!
hardenedsecure 3 days ago||
A forecast by one of the authors of the paper: 50% chance that AGI is reached according to the definition by end of 2028, 80% by end of 2030. https://ai-frontiers.org/articles/agis-last-bottlenecks
habinero 3 days ago|
People say things like this all the time. It's as reliable as the latest prediction for the rapture and about as scientific.
joomla199 3 days ago||
All models are wrong, but some are useful. However, when it comes to cognition and intelligence we seem to be in the “wrong and useless” era, or maybe even “wrong and harmful” (history seems to suggest this is a necessary milestone… anyone remember “humorism”?)
jncfhnb 3 days ago||
Completely wrong direction. AGI will not emerge from getting smarter. It will emerge from being a stateful system in a real environment.

You need context from internal system state that isn’t faked with a giant context window.
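A minimal sketch of that distinction, in Python (the names and structure here are my own illustration, not anything from the paper): new information gets consolidated into persistent internal state across sessions, rather than re-fed as an ever-growing transcript.

    # Illustrative sketch: persistent internal state vs. replaying a transcript.
    import json
    from pathlib import Path

    class StatefulAgent:
        """Keeps compact internal state that survives across sessions,
        instead of re-feeding the full history on every call."""

        def __init__(self, state_path="agent_state.json"):
            self.state_path = Path(state_path)
            if self.state_path.exists():
                self.state = json.loads(self.state_path.read_text())
            else:
                self.state = {"facts": {}, "interactions": 0}

        def observe(self, key, value):
            # New information is integrated into state, not appended to a log.
            self.state["facts"][key] = value
            self.state["interactions"] += 1
            self.state_path.write_text(json.dumps(self.state))

        def recall(self, key):
            # Answers come from consolidated state, not a giant context window.
            return self.state["facts"].get(key)

    agent = StatefulAgent()
    agent.observe("user_name", "Ada")
    print(agent.recall("user_name"))  # "Ada", even in a later session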

tim333 3 days ago||
Maybe we need a new term. I mean, AGI just means artificial general intelligence, as opposed to specialised AI like chess computers, and it never came with a particular level attached. Most people think of it as human-level intelligence, so perhaps we should just call it that?
adamzwasserman 3 days ago||
I wish them luck. Any consensus at all, on any definition at all, would be a boon to mankind. Unfortunately, I am certain that all we have to look forward to is endless goalpost shifting.
giancarlostoro 3 days ago|
Maybe AGI should have levels or phases to work through on the way to 100%, or some maximum level?
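For what it's worth, here's a toy sketch of how phases could fall out of per-domain scores like the paper's ten cognitive domains (the domain names, values, and tier thresholds below are my own assumptions, not the paper's):

    # Toy sketch: coarse "phases" derived from per-domain scores (0-100).
    def agi_score(domain_scores):
        # Overall score as a plain average across domains.
        return sum(domain_scores.values()) / len(domain_scores)

    def agi_phase(score):
        # Hypothetical tier labels; the paper defines no such phases.
        if score >= 90:
            return "near-AGI"
        if score >= 50:
            return "partial"
        return "narrow"

    scores = {"memory": 0, "reasoning": 60, "perception": 40}  # made-up values
    print(round(agi_score(scores)), agi_phase(agi_score(scores)))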
l5870uoo9y 4 days ago||
Long-term memory storage capacity[1] scores 0 for both GPT-4 and GPT-5. Are there any workable ideas or concepts for solving this?

[1]: The capability to continually learn new information (associative, meaningful, and verbatim). (from the publication)
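One commonly discussed workaround is bolting an external memory onto a frozen model: write new facts to a store as they arrive and retrieve the most relevant ones at query time. A minimal sketch (naive keyword overlap stands in for real embedding retrieval; all names are illustrative). Note this only approximates verbatim and associative recall; the model's weights never change, so it isn't continual learning in the paper's sense.

    # Sketch of an external long-term memory for a frozen model.
    class ExternalMemory:
        def __init__(self):
            self.entries = []

        def write(self, fact):
            # "Learning" is just appending; model weights never change.
            self.entries.append(fact)

        def retrieve(self, query, k=3):
            # Rank stored facts by naive word overlap with the query.
            q = set(query.lower().split())
            ranked = sorted(
                self.entries,
                key=lambda e: len(q & set(e.lower().split())),
                reverse=True,
            )
            return ranked[:k]

    mem = ExternalMemory()
    mem.write("The user's cat is named Gustav.")
    mem.write("The project deadline moved to Friday.")
    print(mem.retrieve("what is the cat called?"))  # cat fact ranks first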

stephc_int13 3 days ago||
You need some expertise in a field to see past the amazing imitation capabilities of LLMs and get a realistic idea of how mediocre they are. The more you work with them, the less you trust them. This is not _it_.
sureglymop 4 days ago|
I think that's a good effort! I remember mentioning the need for this here a few months ago: https://news.ycombinator.com/item?id=44468198