The Singularity - Official Thread

Talk about scientific and technological developments in the future
wjfox
Site Admin
Posts: 8663
Joined: Sat May 15, 2021 6:09 pm
Location: London, UK

The Singularity - Official Thread

Post by wjfox »

The Singularity

The Singularity has various definitions, but is generally viewed as a hypothetical point in the future when the pace and convergence of technology become so great that they profoundly disrupt human society. Artificial intelligence is the driving force behind this exponentially growing trend. It may lead to humans being left behind by the technology they have created, unless they merge with it and enhance their brains and bodies.

Many futurists believe this event is inevitable, while others are more sceptical. Most futurists seem to believe the Singularity will occur at some point during the next 100 years, based on current trends in computing power.

This thread is for discussions relating to the Singularity and its implications for humanity's future.

More reading:

https://en.wikipedia.org/wiki/Technological_singularity

https://www.penguinrandomhouse.com/book ... -kurzweil/

https://www.reddit.com/r/singularity/


funkervogt
Posts: 1165
Joined: Mon May 17, 2021 3:03 pm

Re: The Singularity - official thread

Post by funkervogt »

My official stance is that the technological singularity won't happen, because new developments in science, technology, and other domains will never become so rapid that unaugmented humans are unable to keep track of them. This will be the case even after artificial general intelligence is created. Put more simply, even if AGI accelerates the pace of advancement and change, it will never get so fast that a "rupture" in the timeline of civilization occurs, i.e. a particular date on which everything fundamentally changes from Condition A to a radically different Condition B.

That said, I believe we will achieve the high standards of living and high levels of science, technology, and power that most "singularitarians" envision, but getting from here to there will be a gradual process spanning about a century, and not marked by a singularity.

I agree with singularitarians that there will be a day when AGIs are so advanced and intelligent that we unaugmented humans will not be able to anticipate their thinking or keep up with them, and that even our smartest members might not be able to grasp some of the concepts they formulate. Additionally, though unaugmented humans will eventually lose control of Earth and of the civilization we created, the transition to posthuman and/or AGI control will be gradual, not abrupt as the Singularity concept envisions. For decades after the invention of the first AGI, humans will retain control over intelligent machines and over world events, but our power will slowly slip away.

There are several reasons for my belief, the most important of which are the complexity brake and limitations of how quickly energy and resources can be marshalled and expended productively.
Last edited by funkervogt on Tue May 18, 2021 2:37 pm, edited 1 time in total.
joe00uk
Posts: 121
Joined: Sun May 16, 2021 5:00 pm
Location: UK

Re: The Singularity - official thread

Post by joe00uk »

Yeah, the Singularity to me has always just sounded like evangelical fundamentalism dressed in a lab coat.
Yuli Ban
Posts: 4631
Joined: Sun May 16, 2021 4:44 pm

Re: The Singularity - official thread

Post by Yuli Ban »

And remember my friend, future events such as these will affect you in the future
Jakob
Posts: 72
Joined: Sun May 16, 2021 6:12 pm

Re: The Singularity - official thread

Post by Jakob »

funkervogt wrote: Mon May 17, 2021 7:39 pm My official stance is that the technological singularity won't happen, because new developments in science, technology, and other domains will never become so rapid that unaugmented humans are unable to keep track of them. This will be the case even after artificial general intelligence is created. Put more simply, even if AGI accelerates the pace of advancement and change, it will never get so fast that a "rupture" in the timeline of civilization occurs, i.e. a particular date on which everything fundamentally changes from Condition A to a radically different Condition B.

That said, I believe we will achieve the high standards of living and high levels of science, technology, and power that most "singularitarians" envision, but getting from here to there will be a gradual process spanning about a century, and not marked by a singularity.

I agree with singularitarians that there will be a day when AGIs are so advanced and intelligent that we unaugmented humans will not be able to anticipate their thinking or keep up with them, and that even our smartest members might not be able to grasp some of the concepts they formulate. Additionally, though unaugmented humans will eventually lose control of Earth and of the civilization we created, the transition to posthuman and/or AGI control will be gradual, not abrupt as the Singularity concept envisions. For decades after the invention of the first AGI, humans will retain control over intelligent machines and over world events, but our power will slowly slip away.

There are several reasons for my belief, the most important of which are the complexity brake and limitations of how quickly energy and resources can be marshalled and expended productively.
I mostly agree with this. Eventually machines may be running the show, but they'll be constrained by the laws of physics and by how much the people in power are willing to trust them, so it will take decades, maybe even generations, instead of happening overnight like singularitarians claim. And humans may well maintain their autonomy and current power structures, since most post-singularity beings would probably have better things to do than micromanage a bunch of apes.
funkervogt
Posts: 1165
Joined: Mon May 17, 2021 3:03 pm

Re: The Singularity - official thread

Post by funkervogt »

Jakob wrote: Sat May 22, 2021 5:52 pm And humans may well maintain their autonomy and current power structures, since most post-singularity beings would probably have better things to do than micromanage a bunch of apes.
Counterpoint: Once AGIs get smart enough and powerful enough, micromanaging X billion of us apes will be trivially easy for them. They could assign the interns to do it.

(I think the late 22nd century is the earliest this could be the case.)
Yuli Ban
Posts: 4631
Joined: Sun May 16, 2021 4:44 pm

Re: The Singularity - Official Thread

Post by Yuli Ban »

And remember my friend, future events such as these will affect you in the future
Ozzie guy
Posts: 486
Joined: Sun May 16, 2021 4:40 pm

Re: The Singularity - Official Thread

Post by Ozzie guy »

Unknown to most, we have entered a new small era on the path to the Singularity.

Yuli Ban
Posts: 4631
Joined: Sun May 16, 2021 4:44 pm

Re: The Singularity - Official Thread

Post by Yuli Ban »


Remember the hype over Atari-mastering AIs? I do. They're part of what made me become a Born Again Singularitarian back in 2014.
But even by 2018, it was obvious they weren't panning out. I remember when I made /r/MachinesPlay and went looking for videos of AI playing video games, and I felt a bit melancholic when I saw that hobbyist programmers had made AIs that did more impressive things than DeepMind, like playing Super Mario World.
An AI that can play multiple Atari games isn't unimpressive at all, and MuZero's generality is interesting because it isn't given the rules of the games it plays, but Atari games are simple by design. The fact that this is still the best DeepMind can do circa 2021 is... troubling, to say the least. I'm no expert (that's blatantly obvious), so I wouldn't know how difficult it would be to apply Q-learning to more complex games.
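
For anyone unfamiliar with the term, here is a minimal sketch of what tabular Q-learning looks like. It assumes the gymnasium package and its toy FrozenLake-v1 environment; DeepMind's Atari agents use deep Q-networks, which replace the table with a neural network, but the update rule is the same basic idea.

Code: Select all
# Minimal tabular Q-learning sketch (assumes gymnasium and FrozenLake-v1).
import numpy as np
import gymnasium as gym

env = gym.make("FrozenLake-v1", is_slippery=False)
q = np.zeros((env.observation_space.n, env.action_space.n))   # Q-table: states x actions
alpha, gamma, epsilon = 0.1, 0.99, 0.1                          # learning rate, discount, exploration

for episode in range(5000):
    state, _ = env.reset()
    done = False
    while not done:
        # Epsilon-greedy action selection: mostly exploit, occasionally explore.
        if np.random.rand() < epsilon:
            action = env.action_space.sample()
        else:
            action = int(np.argmax(q[state]))
        next_state, reward, terminated, truncated, _ = env.step(action)
        done = terminated or truncated
        # Q-learning update: nudge Q(s, a) toward reward + discounted best future value.
        q[state, action] += alpha * (reward + gamma * np.max(q[next_state]) - q[state, action])
        state = next_state

print("Greedy action per state:", np.argmax(q, axis=1))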
But I just feel like, for a company as heavily hyped and as flush with talent and money as DeepMind, they ought to have at least teased more impressive abilities years ago. StarCraft is one thing; I'm talking about doing for the NES, SNES, arcade titles, and maybe even early 3D games what they've done for the Atari 2600. I think they did something with DOOM or a DOOM-like game once, but little came of that.

Now we're on the verge of the next big retroactive disappointment, something which will also make language models MORE impressive: the fact that large language models can play games! So far it hasn't been DIGITAL games, but do recall that even GPT-2 could play chess. Basically, large language models and their successors will single-handedly accomplish everything that all these disparate developments of the 2010s struggled to do.
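
To make the GPT-2 chess point concrete, here is a rough sketch of the framing, assuming the Hugging Face transformers library. The plain gpt2 checkpoint isn't trained on chess in particular (the experiments people cite fine-tuned GPT-2 on PGN game records), so this untuned version will often produce illegal moves; it only illustrates "playing" as move-text completion.

Code: Select all
# Rough sketch: chess as text completion. Feed the game so far as PGN move text
# and let a causal language model propose a continuation.
# Assumes Hugging Face transformers; the stock "gpt2" checkpoint is NOT
# chess-tuned, so treat the output as illustrative rather than a legal move.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

game_so_far = "1. e4 e5 2. Nf3 Nc6 3. Bb5 a6 4."   # Ruy Lopez opening, White to move

inputs = tokenizer(game_so_far, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=6,                      # roughly one move's worth of tokens
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,   # avoids the missing-pad-token warning
)
continuation = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:])
print("Model's proposed continuation:", continuation.strip())

A model actually fine-tuned on large numbers of PGN games picks up the move grammar well enough to play plausible openings, which is essentially what the "GPT-2 plays chess" demos showed.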
And remember my friend, future events such as these will affect you in the future