Posted by ecto 1 day ago

The Singularity will occur on a Tuesday (campedersen.com)
1240 points | 675 comments
buildbot 20 hours ago|
What about the rate of articles about the singularity as a metric of the singularity?
aidenn0 15 hours ago|
That's approximately what TFA is about?
buildbot 11 hours ago||
In that case we must go deeper, and analyze the number of comments on articles on articles about the singularity.
moezd 13 hours ago||
I sincerely hope this is satire. Otherwise it's a crime in statistics: - You wouldn't fit a model where f(t) goes to infinity with finite t. - Most of the parameters suggested are actually a better fit for logistics curves, not even linear fits, but they are lumped together with the magic Arxiv number feature for a hyperbolic fit. - Copilot metric has two degrees and two parameters. dof is zero, so we could've fit literally any other function.

I know we want to talk about singularity, but isn't that just humans freaking out at this point? It will happen on a Tuesday, yeah no joke.
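
A minimal sketch of the zero-dof point (the numbers are made up for illustration, not taken from the article): with two observations and two free parameters, essentially any two-parameter family fits exactly, so a perfect hyperbolic fit tells you nothing.

    import numpy as np
    from scipy.optimize import curve_fit

    # Two hypothetical observations -- placeholders, not TFA's data.
    t = np.array([1.0, 4.0])   # years since 2020
    y = np.array([0.1, 0.4])   # some adoption metric

    # Three unrelated two-parameter families, each with a starting guess.
    models = [
        ("linear",      lambda t, a, b: a * t + b,         [1.0, 0.0]),
        ("exponential", lambda t, a, b: a * np.exp(b * t), [0.1, 0.5]),
        ("hyperbolic",  lambda t, a, b: a / (b - t),       [1.0, 10.0]),
    ]
    for name, f, p0 in models:
        params, _ = curve_fit(f, t, y, p0=p0)
        resid = np.abs(f(t, *params) - y).max()
        print(f"{name:12s} max residual: {resid:.1e}")
    # All three residuals are ~0: with dof = 0 a perfect fit is
    # guaranteed, whatever the functional form.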

banannaise 23 hours ago||
Yes, the mathematical assumptions are a bit suspect. Keep reading. It will make sense later.
witnessme 21 hours ago||
That would be 8 years after math + humor peaked in an article about the singularity.
jonplackett 23 hours ago||
This assumes humanity can make it to 2034 without destroying itself some other way…
MarkusQ 23 hours ago||
Prior work with the same vibe: https://xkcd.com/1007/
bawolff 17 hours ago||
Good news, we won't have to fix the y2k36 bug.
skulk 1 day ago||
> Hyperbolic growth is what happens when the thing that's growing accelerates its own growth.

Eh? No, that's literally the definition of exponential growth: d/dx e^x = e^x.
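
To make the distinction concrete (a minimal sketch; k and x0 are arbitrary constants chosen for illustration): dx/dt = k*x gives exponential growth, which stays finite for every finite t, while dx/dt = k*x^2 solves to x(t) = x0/(1 - k*x0*t), which blows up at the finite time t = 1/(k*x0). Hyperbolic growth needs the growth rate to scale superlinearly with the quantity, not just proportionally.

    import numpy as np

    # Arbitrary constants for illustration only.
    k, x0 = 1.0, 1.0
    t = np.linspace(0.0, 0.99, 5)

    exponential = x0 * np.exp(k * t)      # solves dx/dt = k*x
    hyperbolic = x0 / (1 - k * x0 * t)    # solves dx/dt = k*x**2

    for ti, e, h in zip(t, exponential, hyperbolic):
        print(f"t={ti:.2f}  exp={e:8.2f}  hyp={h:8.2f}")
    # exp stays below e ~ 2.72 on [0, 1); hyp diverges as t -> 1/(k*x0) = 1.0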

ecto 1 day ago|
Thanks. I dropped out of college
OutOfHere 23 hours ago||
I am not convinced that memoryless large models are sufficient for AGI. I think some intrinsic neural memory allowing effective lifelong learning is required, and that requires a lot more hardware and energy than throwaway predictions do.
0xbadcafebee 21 hours ago|
> The Singularity: a hypothetical future point when artificial intelligence (AI) surpasses human intelligence, triggering runaway, self-improving, and uncontrollable technological growth

The Singularity is illogical, impractical, and impossible. It simply will not happen, as defined above.

1) It's illogical because it's a different kind of intelligence, used in a different way. It's not going to "surpass" ours in any real sense. It's like saying cats will "surpass" dogs. At what? They both live very different lives and are good at different things.

2) "self-improving and uncontrollable technological growth" is impossible, because 2.1.) resources are finite (we can't even produce enough RAM and GPUs when we desperately want it), 2.2.) just because something can be made better, doesn't mean it does get made better, 2.3.) human beings are irrational creatures that control their own environment and will shut down things they don't like (electric cars, solar/wind farms, international trade, unlimited big-gulp sodas, etc) despite any rational, moral, or economic arguments otherwise.

3) Even if 1) and 2) were somehow false, living entities that self-perpetuate (there isn't any other kind, afaik) have no innate need to merge with or destroy other entities. It comes down to conflicts over environmental resources and adaptations. As long as an entity can reproduce within the limits of its environment, it will reach homeostasis or go extinct. The threats we imagine are a reflection of our own actions and fears, which don't apply to the AI, because the AI isn't burdened with our flaws. We assume it would think or act like us because we have terrible perspective. Viruses, bacteria, ants, etc. don't act like us, and we don't act like them.
