So far, LLMs aren't even remotely close to this: they only do what they are told to do (directly or otherwise), they can't learn without a costly offline retraining process, they don't care in the slightest what they're tasked with doing or why, and they have nothing approximating a sense of self beyond what they're told to be.
- It's autonomous
- It learns (true learning, not offline retraining)
- By definition, some semblance of consciousness must arise
This is why I think we're very far from anything like this: easily multiple decades, if not far longer.
It is a valuable contribution, but the CHC (Cattell-Horn-Carroll) theory from psychology that it is based on is itself incomplete.
By commonsense physics, I mean something like simulating the interactions of living and non-living entities in 3D over time. That seems more complicated than the examples on the website and in most tests used in psychometrics.
Creative problem solving, with the cognitive leaps required for truly novel research and invention, could lie outside the rubrics as well. The criteria in CHC are essential but incomplete, I believe.
  > defining AGI as matching the cognitive versatility and proficiency of a well-educated adult.