Welcome to FutureTimeline.forum
214 replies to this topic

#201
bee14ish

bee14ish

    Psionic Reality-Warping God

  • Members
  • PipPipPipPipPip
  • 370 posts
  • LocationEarth

Well this is one of the oldest threads on the site, and the most timelessly relevant, so it might as well get bumped whenever possible.

I still can't believe April 2013 was so long ago... it feels like yesterday.

On topic, is it inevitable if we don't destroy ourselves? Just about. We will still try to stop it with a bit of spirituality here and there, but it's like a toy car vs. a freight train. Unless that toy car is as massive as a mountain, nothing's stopping the train.  For the first time ever, "No it's not" is actually the nutcase side.

But "who will benefit" is my real question. The rich or the poor? And those saying "everyone" reeeeally need to re-evaluate how class privilege works.

This is what I've been thinking about. How will the technology be distributed when it comes out? Will it be available to everyone or only the rich? How will people react? Those types of questions.



#202
AnimalsoftheFields

AnimalsoftheFields

    New Member

  • Members
  • Pip
  • 1 post

 

Well this is one of the oldest threads on the site, and the most timelessly relevant, so it might as well get bumped whenever possible.

I still can't believe April 2013 was so long ago... it feels like yesterday.

On topic, is it inevitable if we don't destroy ourselves? Just about. We will still try to stop it with a bit of spirituality here and there, but it's like a toy car vs. a freight train. Unless that toy car is as massive as a mountain, nothing's stopping the train.  For the first time ever, "No it's not" is actually the nutcase side.

But "who will benefit" is my real question. The rich or the poor? And those saying "everyone" reeeeally need to re-evaluate how class privilege works.

This is what I've been thinking about. How will the technology be distributed when it comes out? Will it be available to everyone or only the rich? How will people react? Those types of questions.

 

The question of whether new tech will be made available to everyone often comes up. I have seen it asked again and again during Q&As with Kurzweil, and the answer is the following:

 

Article 27.
  • (1) Everyone has the right freely to participate in the cultural life of the community, to enjoy the arts and to share in scientific advancement and its benefits.

 

http://www.un.org/en/documents/udhr/



#203
bee14ish

bee14ish

    Psionic Reality-Warping God

  • Members
  • PipPipPipPipPip
  • 370 posts
  • LocationEarth

 

 

Well this is one of the oldest threads on the site, and the most timelessly relevant, so it might as well get bumped whenever possible.

I still can't believe April 2013 was so long ago... it feels like yesterday.

On topic, is it inevitable if we don't destroy ourselves? Just about. We will still try to stop it with a bit of spirituality here and there, but it's like a toy car vs. a freight train. Unless that toy car is as massive as a mountain, nothing's stopping the train.  For the first time ever, "No it's not" is actually the nutcase side.

But "who will benefit" is my real question. The rich or the poor? And those saying "everyone" reeeeally need to re-evaluate how class privilege works.

This is what I've been thinking about. How will the technology be distributed when it comes out? Will it be available to everyone or only the rich? How will people react? Those types of questions.

 

The question of whether new tech will be made available to everyone often comes up. I have seen it asked again and again during Q&As with Kurzweil, and the answer is the following:

 

Article 27.
  • (1) Everyone has the right freely to participate in the cultural life of the community, to enjoy the arts and to share in scientific advancement and its benefits.

 

http://www.un.org/en/documents/udhr/

 

What the hell does that mean? Do you think anyone is actually going to follow that?



#204
JCO

JCO

    Member

  • Members
  • PipPipPipPipPipPipPip
  • 1,032 posts
  • LocationWA, USA

The concept of a soon-to-emerge singularity has several flaws. The first is that most calculations of when it will occur deal with when a computer core will have more transistors than the human brain has neurons. There are two flaws in this: the neuron is not a binary switch but more like a router. Each one connects to multiple other neurons, and the connections are not binary but analog. In addition, the architecture of the brain has been refined by evolution; even after a computer reaches our capacity, we will still be better at using that capacity. Next is software: it is a fact that it lags far behind hardware, and it will be a long time before the processing power is used efficiently. Next is that we work together. We have evolved to use our minds' abilities cooperatively, to let us as a group accomplish what no single person could. This applies not just to physical tasks but to mental ones too. Finally, by the time we get close to building computers that exceed our current brain capacity, we will have learned to extend it.
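For a sense of the raw numbers behind this comparison, here is a rough back-of-the-envelope sketch. The neuron and synapse figures are common textbook estimates; the transistor count is an illustrative order of magnitude for a large chip, not any specific product.

```python
# Back-of-the-envelope: raw component counts, brain vs. chip.
# These are rough, commonly cited estimates, not precise measurements.
NEURONS = 8.6e10             # ~86 billion neurons in a human brain
SYNAPSES_PER_NEURON = 7_000  # rough average; estimates vary widely
CHIP_TRANSISTORS = 5e10      # illustrative order of magnitude for a big chip

synapses = NEURONS * SYNAPSES_PER_NEURON
ratio = synapses / CHIP_TRANSISTORS
print(f"~{synapses:.0e} synapses vs ~{CHIP_TRANSISTORS:.0e} transistors "
      f"(about {ratio:,.0f}x), and each synapse is analog, not binary")
```

Even on raw counts, comparing transistors to synapses rather than neurons moves the goalposts by roughly four orders of magnitude, before the analog behavior described above even enters the picture.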


Confirmed Agnostic - I know that I don't know for sure and I am almost certain no one else does either.


#205
Ru1138

Ru1138

    Member

  • Members
  • PipPipPipPipPipPipPipPipPip
  • 3,190 posts
  • LocationIllinois

What the hell does that mean? Do you think anyone is actually going to follow that?

Considering that half of the politicians here think the U.N. is an evil conspiracy, the U.S. won't. :(


What difference does it make?


#206
shane_allen

shane_allen

    Member

  • Members
  • PipPipPipPipPip
  • 322 posts
  • LocationMinnesota

most calculations of when it will occur deal with when a computer core will have more transistors than the human brain has neurons.

 

I am not sure about that.


Check out /r/futuretimeline and voice your opinion on when various technologies will emerge.


#207
Raklian

Raklian

    An Immortal In The Making

  • Moderators
  • PipPipPipPipPipPipPipPipPipPip
  • 6,512 posts
  • LocationRaleigh, NC

The concept of a soon-to-emerge singularity has several flaws. The first is that most calculations of when it will occur deal with when a computer core will have more transistors than the human brain has neurons. There are two flaws in this: the neuron is not a binary switch but more like a router. Each one connects to multiple other neurons, and the connections are not binary but analog. In addition, the architecture of the brain has been refined by evolution; even after a computer reaches our capacity, we will still be better at using that capacity. Next is software: it is a fact that it lags far behind hardware, and it will be a long time before the processing power is used efficiently. Next is that we work together. We have evolved to use our minds' abilities cooperatively, to let us as a group accomplish what no single person could. This applies not just to physical tasks but to mental ones too. Finally, by the time we get close to building computers that exceed our current brain capacity, we will have learned to extend it.

 

You forgot quantum computing.


What are you without the sum of your parts?

#208
Troodon

Troodon

    Member

  • Members
  • PipPipPipPipPip
  • 238 posts
  • LocationUnited States

I wouldn't say it's definitely inevitable. I think it's very likely to occur, but not necessarily inevitable. Maybe the technology will get advanced enough for some kind of singularity to happen, but most people will just remain opposed to it. Who knows?



#209
La Bodysnatcher

La Bodysnatcher

    Member

  • Members
  • PipPipPipPipPip
  • 397 posts

Is a technological singularity inevitable?  Yes.  Are we gonna make it?  Maybe.  Probably.  Not necessarily.  But even if we didn't make it, another intelligent species would evolve and take over where we left off.  For all the debate about whether or not enough resources would remain on earth to fuel another "industrial revolution," the fact remains: barring cosmic catastrophe, Earth has at least 5 billion years left.  That is plenty of time for the oil wells to fill back up.

 

The problem is not the resources, but that the conditions that led to us were unique to begin with. The sun was at a nice temperature, the moon was in the right place, and certain events like super eruptions happened at the perfect times. Continents were in fabulous places, and the oceans were nice and mostly clean.

 

Within another few million years, when another primate or animal could possibly evolve, the sun will have started becoming too hot (by a very minuscule amount, but enough to throw this balance out of whack). So they'd have to get on the ball and be prepared to evolve right now, not dilly-dally.


When you say "this is overrated" do you mean "I think this sucks?"

When you say "the majority hate it" do you mean "I hate it?"


#210
La Bodysnatcher

La Bodysnatcher

    Member

  • Members
  • PipPipPipPipPip
  • 397 posts

Now to answer the thread

 

How do we know the Singularity isn't happening right now and we're just too dumb to perceive it?


When you say "this is overrated" do you mean "I think this sucks?"

When you say "the majority hate it" do you mean "I hate it?"


#211
CamGoldenGun

CamGoldenGun

    Member

  • Members
  • PipPipPipPipPipPip
  • 595 posts
  • LocationFort McMurray, AB. Canada.

Because the singularity is defined as a thing or event that triggers technological advancement so quickly it would be hard to measure (which in turn would accelerate social and economic advancement as well). We're still running at a pace defined by Moore's Law... so until that changes, we still haven't hit it.
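As a concrete reading of "a pace defined by Moore's Law": transistor counts doubling roughly every two years gives plain exponential growth, fast but still measurable. The baseline count below is illustrative, not tied to any real chip.

```python
def transistors(years_from_now: float, baseline: float = 1e10,
                doubling_period: float = 2.0) -> float:
    """Transistor count after `years_from_now` under Moore's-Law-style
    doubling. `baseline` is an illustrative starting count, not real data."""
    return baseline * 2 ** (years_from_now / doubling_period)
```

A post-singularity pace would look like the doubling period itself shrinking toward zero, which is exactly what would make the growth hard to measure.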



#212
StanleyAlexander

StanleyAlexander

    Saturn, the Bringer of Old Age

  • Members
  • PipPipPipPipPipPip
  • 975 posts
  • LocationPortland, Oregon, USA, Earth, 2063

Also because the singularity is a subjective event, with a definition that specifically refers to our ability to predict the future.  But if you zoom out a very little bit on the temporal scale--to the level of centuries, say--then the singularity is happening now.

 

It's as if the collective predictive power of humanity at a given distance into the future exists at a certain percentage.  For example:

 

In 5000 BCE, you could make general predictions about life 100 years into the future with 90% accuracy (please note that all of the numbers and measures I'm about to list are completely arbitrary; they are just examples by which I hope to indicate a speculative trend that I can use to help define the singularity). Still in 5000 BCE, you could make a general prediction about 30 years into the future with 95% accuracy, and about 10 years into the future with 99% accuracy.

 

In 1500 CE, you could make general predictions about life 100 years into the future with maybe 70% accuracy, 30 years ahead with 85% accuracy, and 10 years ahead with 95% accuracy.

 

At some point, probably around the Renaissance and its attendant revolutions, humanity's average predictive power 100 years into the future will have fallen below 50%, for various reasons (not least being that the actions of smarter individuals, and their consequences, are harder to predict). Farther along, maybe around the industrial revolution (and again, I'm completely speculating), the ability to predict beyond 30 years falls below 50%, while predictions 100 years out approach 0%. By the mid-twentieth century, 30-year predictions are approaching 0%. Today, 10-year predictions are approaching 0%.

 

I like to think of the singularity as the point where the ability of unaided human intelligence to make meaningful predictions about the immediate future falls to 0%.
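The speculative trend above can be captured in a toy formula (my own sketch, not an established model): accuracy decays exponentially with forecast horizon, at a rate that itself grows as change accelerates. Every constant here is as arbitrary as the percentages in the post.

```python
import math

def prediction_accuracy(year: int, horizon_years: float) -> float:
    """Toy accuracy in (0, 1] for a forecast made in `year` (negative =
    BCE) about life `horizon_years` ahead. All constants are arbitrary."""
    elapsed = year + 5000                  # years of history since 5000 BCE
    rate = 0.001 * 2 ** (elapsed / 1000)   # decay rate grows with progress
    return math.exp(-rate * horizon_years)

# The shape of the trend, not the exact numbers, is the point:
for year in (-5000, 1500, 2000):
    accs = [prediction_accuracy(year, h) for h in (10, 30, 100)]
    print(year, [f"{a:.0%}" for a in accs])
```

The singularity, on this definition, is the point where even the shortest-horizon curve flattens onto zero.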


  • Raklian, EVanimations, Casey and 3 others like this
Humanity's destiny is infinity

#213
jabaricruz80662

jabaricruz80662

    New Member

  • Members
  • Pip
  • 3 posts
In response to the article that caiman posted: it was quite interesting how you should combine the best of both worlds. Mr. Carmicheal's response about the need for an A.I. singularity was flawed; we don't need it. In that matter I agree with Azureous: society wants that type of singularity, but the question that keeps coming up is what the singularity is. As I agreed earlier, it's the point at which the rate of discoveries or advancement would reach infinity.

#214
Yuli Ban

Yuli Ban

    Nadsat Brat

  • Moderators
  • PipPipPipPipPipPipPipPipPipPipPip
  • 17,148 posts
  • LocationAnur Margidda

Also because the singularity is a subjective event, with a definition that specifically refers to our ability to predict the future.  But if you zoom out a very little bit on the temporal scale--to the level of centuries, say--then the singularity is happening now.

 

It's as if the collective predictive power of humanity at a given distance into the future exists at a certain percentage.  For example:

 

In 5000 BCE, you could make general predictions about life 100 years into the future with 90% accuracy (please note that all of the numbers and measures I'm about to list are completely arbitrary; they are just examples by which I hope to indicate a speculative trend that I can use to help define the singularity). Still in 5000 BCE, you could make a general prediction about 30 years into the future with 95% accuracy, and about 10 years into the future with 99% accuracy.

 

In 1500 CE, you could make general predictions about life 100 years into the future with maybe 70% accuracy, 30 years ahead with 85% accuracy, and 10 years ahead with 95% accuracy.

 

At some point, probably around the Renaissance and its attendant revolutions, humanity's average predictive power 100 years into the future will have fallen below 50%, for various reasons (not least being that the actions of smarter individuals, and their consequences, are harder to predict). Farther along, maybe around the industrial revolution (and again, I'm completely speculating), the ability to predict beyond 30 years falls below 50%, while predictions 100 years out approach 0%. By the mid-twentieth century, 30-year predictions are approaching 0%. Today, 10-year predictions are approaching 0%.

 

I like to think of the singularity as the point where the ability of unaided human intelligence to make meaningful predictions about the immediate future falls to 0%.

This sounds about right. I feel your statement on the ability of the ancients to predict the future was also accurate. The more I read up on ancient technology, the more I realize how far from stagnant it actually was back in the day. Oh sure, everyday life was still relatively static, but there were many innovations within single lifetimes. It just happened that a lot of these innovations were lost to time.

 

Now to answer the thread

 

How do we know the Singularity isn't happening right now and we're just too dumb to perceive it?

I've been wondering about this for the past few nights. There's something in my chest right now that's pounding just as furiously as my heart, telling me this moment is at hand.

 

No, capitalism is the main driver behind the rapid advancement of tech over the past 200 years. End that without a real replacement = an end to most advancement.

In particular, industrial capitalism: a drive for increased efficiency. As it happens, non-industrial capitalism (yes, it is a thing) would not have led us down the path we took. Even if we were to abandon capitalism in favor of socialism or feudalism, we would still see advancement if we kept the ideal of efficiency alive.

The problem with socialism and feudalism is that both are a-okay with "well enough". As long as something gets the job done, there's no point discarding what works. 

It's funny how so many authoritarian socialists lament the environmental devastation caused by industrial capitalism while conveniently ignoring how much worse authoritarian socialism was for the environment: if you research the ten worst environmental disasters in history, the majority of the list will have been caused by state-socialist regimes. This is because central planning does not require efficiency; it merely requires a baseline of success.

All things trend toward using the least amount of energy possible. This is true for everything from humans to non-human animals to plants, hell, even electrons and galaxies! Industrial capitalism is strange in that it temporarily abandons the lowest energy state in order to develop something that could potentially use less energy and return more. Central planning abandons that quirk and returns to aiming for the least energy expended at all times.

It makes me wonder just how inconceivably efficient decentralized markets would be.

 

I'm becoming more and more skeptical that we'll ever reach a Singularity. There are just too many things that could go wrong before it happens. Financial collapse, economic stagnation, the end of growth. Peak oil. Worsening climate change. Nuclear terrorism. A man-made bioweapon. Social/cultural upheavals and religious conflict. Maybe even World War 3.

Maybe once upon a time I would have subscribed to this, but I no longer do. In fact, I find it laughable to think that the Singularity won't happen. The only thing that could possibly prevent it at this point is an extinction or sub-extinction level event.

 

As to why I feel this way... I couldn't tell you off the top of my head. I could spend all night pointing to various developments in the field of artificial intelligence, but that's not it.

No, there's something else. Something I can't seem to put into words. 


  • StanleyAlexander likes this
Nobody's gonna take my drone, I'm gonna fly miles far too high!
Nobody gonna beat my drone, it's gonna shoot into the sky!

#215
Rusakov

Rusakov

    Member

  • Validating
  • PipPipPipPipPip
  • 339 posts
  • LocationIllinois

We won't know until the singularity actually happens.

 

Until then, a lot can destroy us.







Also tagged with one or more of these keywords: technological singularity, the singularity, Ray Kurzweil, strong ai, technology, trends, artificial intelligence, computers, brain
