Before defining AGI, I guess we need to define intelligence and align on whose definition of intelligence can be considered the ground truth. The theory of multiple intelligences (Howard Gardner) seems the most widely accepted today. Is there anything better?
Going with the assumption that there is no other, this definition of AGI only considers the following intelligences:

- Verbal-Linguistic
- Logical-Mathematical
- Visual-Spatial
- Musical
- Naturalist

It doesn't mention, or barely mentions:

- Bodily-Kinesthetic
- Interpersonal
- Intrapersonal
- Existential

So this definition feels incomplete.
Cattell-Horn-Carroll theory, like much psychometric research, is based on collecting large amounts of data and running factor analysis (or something similar) to look for axes that appear orthogonal.
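For context, that workflow looks roughly like the sketch below (Python with scikit-learn; the data, the shapes, and the three-factor choice are all made up for illustration):

```python
# Minimal factor-analysis sketch: recover latent "axes" behind observed test scores.
# The data here is random noise, purely to show the mechanics.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
scores = rng.normal(size=(500, 10))  # pretend: 500 people x 10 cognitive tests

fa = FactorAnalysis(n_components=3, random_state=0)
fa.fit(scores)

# Each row of components_ is one latent factor's loadings on the 10 tests;
# psychometricians interpret such factors as broad abilities.
print(fa.components_.shape)  # (3, 10)
```

The catch is that the axes you recover this way are a function of the data you collected, which in this case is all human.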
It's not clear that the axes are necessary or sufficient to define intelligence, especially if the goal is to define intelligence that applies to non-humans.
For example, reading and writing ability and visual processing both imply that the organism has light sensors, which it may not. Do all intelligent beings have vision? I don't see an obvious reason why they would.
Whatever definition you use for AGI probably shouldn't depend heavily on having analyzed human-specific data for the same reason that your definition of what counts as music shouldn't depend entirely on inferences from a single genre.
A team of humans can and will make a GPT-6. Can a team of GPT-5 agents make GPT-6 all on their own, if you give them the resources necessary to do so?
That 10-axis radial graph is very interesting. Do others besides this author agree with that representation?
The weak points are speed and long-term memory. Those are usually fixable in computing systems. Weak long-term memory suggests that a database needs to be bolted on somehow. I've seen at least one system, for driving NPCs, where, after something interesting has happened, the system is asked to summarize what it learned from that session. That summary is stored somewhere outside the LLM and fed back in as a prompt when needed.
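A minimal sketch of that bolted-on memory loop, assuming a hypothetical `call_llm` function standing in for whatever model API you use (none of these names come from a real system):

```python
# Summarize-and-recall memory loop for an LLM-driven NPC.
# `call_llm` is a hypothetical placeholder, not any particular product's API.
from typing import List

memories: List[str] = []  # the "database" bolted on outside the model

def call_llm(prompt: str) -> str:
    raise NotImplementedError("plug in your model API here")

def end_of_session(transcript: str) -> None:
    # After something interesting happens, ask the model to compress
    # the session into a durable note and store it outside the LLM.
    summary = call_llm(f"Summarize what you learned this session:\n{transcript}")
    memories.append(summary)

def respond(player_input: str) -> str:
    # Feed stored summaries back in as part of the prompt when needed.
    context = "\n".join(memories[-5:])  # keep the prompt small
    return call_llm(f"Relevant past notes:\n{context}\n\nPlayer: {player_input}")
```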
None of this addresses unstructured physical manipulation, which is still a huge hangup for robotics.
Like, in order for an LLM to come close to human proficiency on a topic, it seems to have to ingest a LOT more content than a human does.
This is a bad definition, because a human baby is already AGI when it's born and its brain is empty. AGI is the blank slate plus the ability to learn anything.
We are born with inherited "data": innate behaviors, basic pattern recognition, etc. Some even claim that we're born with a basic physics toolkit (things are generally solid, they move). We then build on that by being imitators, amassing new skills and methods simply through observation and search.
That's wrong. It knows how to detect and signal low carbohydrate levels in the blood, and it knows how to react to a perceived threat (the Moro reflex).
It knows how to follow solid objects with its eyes (once its visual system adapts); it knows that certain visual stimuli correspond to physical objects.
Could it be that your concept of "know" is just the common-sense one, i.e. "produces output in English/German/etc."?
I think it'll be a steep sigmoid function. For a long time it'll be a productivity booster, but won't have enough "common sense" to replace people. We'll all laugh about how silly it was to worry about AI taking our jobs. Then some AI model will finally get over that last hump, maybe 10 or 20 years from now (or 1,000, or 2), and it will be only a couple of months before everything collapses.
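To be concrete about the metaphor, a "steep sigmoid" is just a logistic curve with a large slope parameter (the notation here is mine, not the commenter's):

$$\text{capability}(t) = \frac{1}{1 + e^{-k(t - t_0)}}, \qquad k \gg 1$$

where $t_0$ is the takeoff date; the larger $k$ is, the more the transition from "productivity booster" to "replaces people" gets compressed into a short window.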
A specific key opens a subset of locks; a general key would open all locks. General intelligence, then, can solve all solvable problems. It's rather arrogant to suppose that we humans have it ourselves, or that we can create something that does.
If you think this is true, I would say you should leave artificial life alone until you can understand human beings better.
If the teacher was a robot, I don't think the piano would get practiced.
IDK how AI gains that ability. The requirement is basically "being human". And it seems like there's always going to be a need for humans in that space, no matter how smart AI gets.