
Posted by ecto 1 day ago

The Singularity will occur on a Tuesday (campedersen.com)
1284 points | 693 comments
OutOfHere 1 day ago|
I am not convinced that memoryless large models are sufficient for AGI. I think some intrinsic neural memory allowing effective lifelong learning is required. This requires a lot more hardware and energy than for throwaway predictions.
cesarvarela 1 day ago||
Thanks, added to calendar.
0xbadcafebee 23 hours ago||
> The Singularity: a hypothetical future point when artificial intelligence (AI) surpasses human intelligence, triggering runaway, self-improving, and uncontrollable technological growth

The Singularity is illogical, impractical, and impossible. It simply will not happen, as defined above.

1) It's illogical because it's a different kind of intelligence, used in a different way. It's not going to "surpass" ours in a real sense. It's like saying Cats will "surpass" Dogs. At what? They both live very different lives, and are good at different things.

2) "self-improving and uncontrollable technological growth" is impossible, because 2.1) resources are finite (we can't even produce enough RAM and GPUs when we desperately want them), 2.2) just because something can be made better doesn't mean it will be made better, and 2.3) human beings are irrational creatures that control their own environment and will shut down things they don't like (electric cars, solar/wind farms, international trade, unlimited big-gulp sodas, etc.) despite any rational, moral, or economic arguments otherwise.

3) Even if 1) and 2) were somehow false, living entities that self-perpetuate (there isn't any other kind, afaik) do not have some innate need to merge with or destroy other entities. It comes down to conflicts over environmental resources and adaptations. As long as the entity has the ability to reproduce within the limits of its environment, it will reach homeostasis, or go extinct. The threats we imagine are a reflection of our own actions and fears, which don't apply to the AI, because the AI isn't burdened with our flaws. We're assuming it would think or act like us because we have terrible perspective. Viruses, bacteria, ants, etc don't act like us, and we don't act like them.

markgall 1 day ago||
> Polynomial growth (t^n) never reaches infinity at finite time. You could wait until heat death and t^47 would still be finite. Polynomials are for people who think AGI is "decades away."

> Exponential growth reaches infinity at t=∞. Technically a singularity, but an infinitely patient one. Moore's Law was exponential. We are no longer on Moore's Law.

Huh? I don't get it. e^t would also still be finite at heat death.

ecto 1 day ago|
exponential = mañana
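The distinction the quoted passage is drawing can be made concrete: polynomial and exponential functions are finite at every finite time, while a hyperbolic curve like 1/(t_c - t) diverges at a finite critical time t_c. A minimal sketch (function names and the t_c = 10 value are illustrative, not from the article):

```python
import math

def polynomial(t, n=47):
    # t^n: finite for any finite t, no matter how large n is.
    return t ** n

def exponential(t):
    # e^t: also finite for any finite t; only "reaches infinity" at t = infinity.
    return math.exp(t)

def hyperbolic(t, t_c=10.0):
    # 1 / (t_c - t): diverges as t approaches t_c from below,
    # i.e. a genuine finite-time singularity.
    return 1.0 / (t_c - t)

t = 9.999
print(polynomial(t))   # large but finite
print(exponential(t))  # large but finite
print(hyperbolic(t))   # blows up as t approaches t_c = 10
```

So markgall's point stands: e^t is just as finite as t^47 at any finite date; only a hyperbolic fit puts the singularity on a calendar.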
hipster_robot 1 day ago||
why is everything broken?

> the top post on hn right now: The Singularity will occur on a Tuesday

oh

loumf 22 hours ago||
This is great. Now we won’t have to fix Y2K36 bugs.
bwhiting2356 23 hours ago||
We need contingency plans. Most waves of automation have come in S-curves, where they eventually hit diminishing returns. This time might be different, and we should be prepared for it to happen. But we should also be prepared for it not to happen.

No one has figured out a way to run a society where able-bodied adults don't have to work, whether capitalist, socialist, or any variation. I look around and there still seems to be plenty of work we either cannot or should not automate: education, healthcare, and the arts (should not), or trades and R&D for the remaining unsolved problems (cannot yet). Many people seem to want to live as though we're already in a post-scarcity world when we aren't yet.

cryptonector 20 hours ago||
But what does Opus 4.6 say about this?
wbshaw 1 day ago|
I got a strong ChatGPT vibe from that article.
willhoyle 1 day ago|
Same. Sentences structured like these tip me off:

- Here's the thing nobody tells you about fitting singularities

- But here's the part that should unsettle you

- And the uncomfortable answer is: it's already happening.

- The labor market isn't adjusting. It's snapping.
