
Welcome to FutureTimeline.forum


212 replies to this topic

#141
wekele0

    Member

  • Members
  • 12 posts
Personally, the prospect of World War 3 doesn't seem likely to me. Yes, there is tension in the Middle East with Iran at the moment, but a war with Iran doesn't constitute a world war. The major powers all have their own problems right now: the European Union with its debt crisis, most of Africa (fairly self-explanatory), Japan (though not as much of a power right now) recovering from the tsunami, and relations with Russia are relatively good. None of them want a world war. I may be wrong, since no one can perfectly predict the future, but it still feels unlikely as of now.

Also, though I don't believe the singularity will happen anytime soon, I do believe technology is going to increase exponentially. The energy crisis could possibly, possibly speed up the process, as it will push scientists to work harder and think more creatively to fix the problem, and that could lead to technology we haven't even thought of. Again, I'm not saying it will be an easy transition, but I think we'll come out of it within a decade or two, better than before, and from there, since energy won't be the issue anymore, we can focus on other, more important issues. It might even push countries to work together and create a growing sense of camaraderie as we all race against time. I think the civilized world is a little more intelligent than to just fall apart; at least I hope so.

#142
Logically Irrational

    For Lack of a Better Name

  • Moderators
  • 1,542 posts
  • LocationHoover Dam
I suppose innovation did occur in the previous economic models. Ideally, in a post-scarcity economy (or whatever you want to call what we're heading to), invention and innovation would form a cornerstone of the society, either out of necessity to make things easier or because humans are left to be creative by machine workers (as in "The Venus Project Route"). I don't really know for sure. There's no way innovation will stop, but the market economy will leave a void once it's gone.
Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn!

#143
Zeitgeist123

    Member

  • Members
  • 1,449 posts
I don't think World War 3 is far-fetched. One unexpected event would be enough to trigger it. Think 9/11, only this time it would be nuclear.
The right to be heard does not include the right to be taken seriously...

#144
GNR Rvolution

    Member

  • Members
  • 529 posts
  • LocationLondon
But look at the after-effects of 9/11: it galvanised even the UN, who can rarely fight their way out of a paper bag, into actually condoning the invasion of Afghanistan. Nobody even raised an eyebrow, so imagine if a nuclear attack were carried out by a terrorist group or nation state. Nobody would defend them, nobody could, and most countries would help bring them to justice. It would take a series of deteriorations and small events for the world to dip back into global conflict. Economic collapse is probably the most likely scenario, but even then I can't imagine it just yet...
  • wekele0 likes this
All right, brain. You don't like me and I don't like you, but let's just do this and I can get back to killing you with beer.

#145
wekele0

    Member

  • Members
  • 12 posts

I suppose innovation did occur in the previous economic models. Ideally, in a post-scarcity economy (or whatever you want to call what we're heading to), invention and innovation would form a cornerstone of the society, either out of necessity to make things easier or because humans are left to be creative by machine workers (as in "The Venus Project Route"). I don't really know for sure. There's no way innovation will stop, but the market economy will leave a void once it's gone.


That's true, because obviously a lot of scientific research today depends on funding. But you have to remember the creative minds in history who built amazing things with little to no funding: people building computers in their garages in the 1980s, or Thomas Edison inventing the light bulb. These were done (at least when started) with little or no money, just ingenuity. Also, the expansion of the internet, which I doubt even an economic collapse could stop, will lead to better education for those who currently have access to none, such as in developing countries, and thus to more innovation and creativity worldwide even while the economy is doing badly. Anyone with internet access can pull up articles on how things work, many websites teach even calculus-level math and science (some do cost money, though), and the increased interaction with others on the web generally fosters creativity (maybe not in the case of YouTube or parts of Reddit...). So even if the economy turns sour, there is still hope.

#146
SpazzyMcGee

    New Member

  • Members
  • 9 posts
World War III:
Whether or not the next conflict between major world powers drags the entire world into war is unimportant. A war with just two belligerents would be enough to destroy the civilized world if both sides had nuclear arsenals. There are 100 nuclear warheads between Pakistan and India. If geopolitical posturing turned into armed conflict, the Pakistani government would need either to surrender or to use its nuclear weapons, and if Pakistan dropped the bomb, India would follow suit. The effects of such an exchange were analyzed in a 2010 Scientific American article. Considering what a few financial missteps can do to the global economy, I'm inclined to believe a nuclear war and its geopolitical and environmental repercussions could bring the world as we know it to its breaking point.

The Singularity:
I agree with Mr. G, if I understood his position correctly. Barring our own extinction, or radical changes to our current course of development made in an effort to avoid it, the technological singularity is inevitable. Even if the current exponential growth of processor speed breaks down well before we match the human brain, we will eventually find another way to increase computational power. The human brain is itself a machine, after all, so we at the very least know it is possible to create human-level computers.

It seems far-fetched to claim that the human mind is the furthest boundary of computational power that can be created in this universe. It therefore stands to reason that we will one day create a mind that is more intelligent than any of us, and that that intelligence will in turn be better able to create an even greater one.

The singularity is therefore inevitable. When it will occur, however, remains to be seen, and whether it brings us utopia, extinction, or a little of both is unknowable by the very definition of the technological singularity.
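The claim that computational power keeps growing by some route can be put in perspective with a back-of-the-envelope doubling calculation. All three figures below (a brain estimate of 1e16 operations per second, a 1e13 ops/s starting machine, an 18-month doubling period) are rough assumptions chosen for illustration, not established values:

```python
import math

# Back-of-the-envelope sketch: years until hardware reaches a
# brain-scale operation rate under sustained exponential growth.
# All three figures below are rough assumptions, not measurements.
brain_ops = 1e16        # assumed operations/second of a human brain
start_ops = 1e13        # assumed ops/second of a circa-2011 supercomputer
doubling_years = 1.5    # assumed Moore's-law-style doubling period

doublings_needed = math.log2(brain_ops / start_ops)
years = doublings_needed * doubling_years
print(f"{doublings_needed:.1f} doublings, about {years:.0f} years")
# -> 10.0 doublings, about 15 years
```

The point of the sketch is only that, under exponential growth, even a thousand-fold gap closes in a handful of doubling periods; change any assumed figure and the horizon shifts, but not by much.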

Edited by SpazzyMcGee, 27 December 2011 - 02:54 AM.


#147
wjfox

    Administrator

  • Administrators
  • 4,600 posts
  • LocationLondon


#148
Craven

    Elephant in the forest

  • Members
  • 3,300 posts
  • LocationPoland, Cracow
Damn, this guy is creepy. But yeah. It's depressing to visit this site and other technology portals and blogs, and then turn on the TV or check the news and see just how much ignorance and plain bad will there is, pushing us toward collapse while a bright future is within reach.
"I walk alone and do no evil, having only a few wishes, just like an elephant in the forest."

"Laugh, and the world laughs with you. Weep, and you weep alone."

#149
SG-1

    Carpe Vitam

  • Members
  • 3,613 posts
  • LocationUS - Arkansas
It makes me mad. The only way to ensure a utopian future is to imperialize. Unfortunately, the only way to merge governments now is through war. And we have weapons of mass destruction that would almost certainly be used. Wars shouldn't involve citizens...

Mark your calendars

Never Yield


#150
shane_allen

    Member

  • Members
  • 322 posts
  • LocationMinnesotta

The only way to ensure a utopian future is to imperialize. Unfortunately the only way to merge governments now is through war.


I disagree. First, imperialism is probably not going to lead to a utopia, given that it demands inequality. Second, there's no reason to "merge" all the governments. People can live in a coalition. If there is peace and partnership between nations, then after a while the differences will be in name only. Support peace and moral governance everywhere and let the names sort themselves out.

Edited by shane_allen, 25 February 2012 - 11:29 PM.

Check out /r/futuretimeline and voice your opinion on when various technologies will emerge.


#151
SG-1

    Carpe Vitam

  • Members
  • 3,613 posts
  • LocationUS - Arkansas
I see things more pessimistically than that. The only way that would work is if, say, over 50% of the population in every country (or nearly every country) could communicate well with others (via the Internet or some other means), held non-racist views, and saw themselves as equals. Just supporting peace doesn't do anything when you are not in the majority or in a position of power. A one-world government would be a big step toward the singularity, since we wouldn't have wars looming over us. Maybe not; I just wanted to throw that out there. I don't believe it will happen before 2045, for sure.

Mark your calendars

Never Yield


#152
Ru1138

    Member

  • Members
  • 2,724 posts
  • LocationIllinois
It probably depends on whether raw computing power alone is what makes something intelligent. If that is true, then the singularity could happen. If it is something in addition to raw processing power but separate from it (and therefore not driven by Moore's law), then it probably won't happen, at least not in this century.

#153
Mr. G

    Member

  • Members
  • 50 posts
The Singularity depends on the creation of one ability: intelligence that recursively augments itself to the point where the normal human brain can no longer follow. This can happen in two ways: through the creation of strong AI, or through the augmentation of the human brain.

One could argue that this augmentation has already begun. Almost all of human knowledge is only a few clicks away; information on demand through the internet and mobile devices has already made the average person much smarter. However, this is only the beginning. We will become ever more integrated with our technology, to the point where the dividing lines eventually disappear. When that point comes, we will have arrived at the Singularity. Before that happens we might make an artificial superintelligence; if we allow it access to its own programming and hardware, it could potentially augment itself to orders-of-magnitude higher levels of intelligence.

It really does seem that the Singularity is inevitable. With Siri, it seems that speech recognition and voice-command technologies are beginning to mature. Google is rumored to be releasing "heads-up display" glasses later this year, and with the advancements in GPS technology, this heralds the beginning of augmented reality going mainstream. As our technological capabilities increase, so does our comfort level with using these technologies. It really seems to be only a matter of time.

Some people argue that environmental and social collapse could stop the Singularity from occurring. I think quite the opposite: necessity is the mother of invention, and great necessity will yield great invention. The scare of peak oil will stimulate better alternatives to oil, the scare of global financial collapse will stimulate better political systems, and the scare of climate change will stimulate clean energy.
Isn't it interesting that, historically, cultures that needed to adapt to cold climates were uniformly more advanced than cultures from warm climates?

G

Edited by Mr. G, 27 February 2012 - 06:24 AM.

  • GNR Rvolution likes this

#154
GNR Rvolution

    Member

  • Members
  • 529 posts
  • LocationLondon
I don't think a utopian future is a requirement for the singularity to occur; we are talking about a purely technological event that would happen regardless of whether humankind is all holding hands and singing songs around the campfire. I do think, though, that the singularity could be prevented, although it would take laws being passed outlawing AI to do so, and that's not an impossible act to achieve. Neither social nor climatological change would prevent it from happening; as Mr. G says, it could easily be argued that these would accelerate the timeline. Only global mass-extinction events (either natural or man-made) could otherwise impact it.

But, and I know this goes against the definition of the event itself, what happens once it is achieved? Will we simply carry on as before, or will it engender changes in humanity? It has been argued on here before that the event would most likely not be a specific point in time but an evolutionary shift, but what would the machine-gods think of us, once (or if) they can think? Will they be designed to be benevolent, will cold-blooded logic inform them that we are a threat, or will they simply ignore us as we would an insect on the floor?
All right, brain. You don't like me and I don't like you, but let's just do this and I can get back to killing you with beer.

#155
Mr. G

    Member

  • Members
  • 50 posts

I do think, though, that the singularity could be prevented, although it would take laws being passed outlawing AI to do so, and that's not an impossible act to achieve.


It is not impossible. But human nature has us always looking for an edge on the competition. The military, stock-market players, etc. would all love to utilize strong AI. If we are capable, I think the temptation will be too great.

The human augmentation route is something I agree would be controlled through laws and ethics.


But, and I know this goes against the definition of the event itself, what happens once it is achieved? Will we simply carry on as before, or will it engender changes in humanity? It has been argued on here before that the event would most likely not be a specific point in time but an evolutionary shift, but what would the machine-gods think of us, once (or if) they can think? Will they be designed to be benevolent, will cold-blooded logic inform them that we are a threat, or will they simply ignore us as we would an insect on the floor?


I think everything will change at a furious pace. Normal humans will just get in the way and will be lucky if they are kept as pets.

G

#156
GNR Rvolution

    Member

  • Members
  • 529 posts
  • LocationLondon


I do think, though, that the singularity could be prevented, although it would take laws being passed outlawing AI to do so, and that's not an impossible act to achieve.


It is not impossible. But human nature has us always looking for an edge on the competition. The military, stock-market players, etc. would all love to utilize strong AI. If we are capable, I think the temptation will be too great.

The human augmentation route is something I agree would be controlled through laws and ethics.


But, and I know this goes against the definition of the event itself, what happens once it is achieved? Will we simply carry on as before, or will it engender changes in humanity? It has been argued on here before that the event would most likely not be a specific point in time but an evolutionary shift, but what would the machine-gods think of us, once (or if) they can think? Will they be designed to be benevolent, will cold-blooded logic inform them that we are a threat, or will they simply ignore us as we would an insect on the floor?


I think everything will change at a furious pace. Normal humans will just get in the way and will be lucky if they are kept as pets.

G


Indeed, any law passed would need to be passed at a global level, and I don't think that is realistically possible, which is why I only said "could". You would have companies and rogue states trying to build one regardless of any law passed.

I'm not convinced that life will unalterably change, but I think a new class divide would begin to appear between those who integrate with the new world and those who are happy (or not) to be left behind.
All right, brain. You don't like me and I don't like you, but let's just do this and I can get back to killing you with beer.

#157
Lightblind

    New Member

  • Members
  • 8 posts
Well, here is my opinion on this topic: I think it is only raw processing power that makes our brains so "special". The brain is a remarkably simple device, which basically:

1. reduces raw data (abstraction)
2. compares abstracted information packages (association)
3. stores gathered connections (learning)

Abstraction (1) is done by some kind of algorithm. The simplest such algorithm would be the extraction of a random piece of data from a larger data set; a more complex one would be, for example, the identification of shapes in a picture. Both have in common that raw data is reduced while information is increased.

In the second step (2), this information is compared to existing packages. Abstracting raw data until it is identical to a stored package is the simplest form; the less abstraction is necessary, the closer the connection to the stored data. For example, a red circle is identical to a blue circle when the colour of both pictures is eliminated, and two circles of different sizes become identical when both pictures are scaled to the same size. There is an enormous number of possible associations, and so far we are only talking about pictures. Imagine the entirety of human experience.

Finally (3), the gathered connections are stored for later use. Thus new data can easily be integrated into a network of associations: the system learns what to do with, for example, pictures, and what circles mean in certain contexts.

With these basic steps every problem can be handled if enough input is available. The only thing missing is a motivation (wants and needs). Those seem to be hard-coded into the human brain: the need for food, rest, sex and so on. Now you have a feedback system that allows you to evaluate gathered information and to set a goal. Keep in mind that this is a simple outline of what I think is necessary for an (A)I.

It is striking that this can be achieved by highly parallel processes with an enormous combined processing power... such as our brains. Neurons have a huge advantage over silicon-based computers: they can replicate. They might, however, be much more primitive on their own.

So I think the only thing preventing us from creating a human-level AI is processing power. We are not able to use billions upon billions of highly interconnected small computers, but we can build microchips that are a billion times faster and thus compensate for that disadvantage. I guess the first computers with the necessary computing power might already have been built. However, they would have to learn as we humans do, and that would take them a lifetime (if programmed properly). In about ten years a supercomputer might be able to learn and process fast enough to see instant results of its programming.

By the way, the necessary input can be provided by the Internet. Even today the available data is sufficient to learn almost everything about the world.
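The three steps described above (abstract, associate, learn) can be sketched as a toy program. Everything here (the attribute representation, the order in which attributes are stripped, the exact-match rule) is an invented illustration of the idea, not a claim about how brains actually work:

```python
# Toy sketch of the three steps described above:
#   1. abstraction: strip attributes from a percept until it matches
#   2. association: the fewer strips needed, the closer the match
#   3. learning: store new percepts so later inputs can match them
# The percept representation and strip order are illustrative only.

STRIP_ORDER = ["colour", "size"]  # attributes abstracted away, in order

def associate(percept, memory):
    """Return (stored_item, strips_needed) for the closest match,
    abstracting one attribute at a time, or (None, None)."""
    p = dict(percept)
    for strips, attr in enumerate([None] + STRIP_ORDER):
        if attr is not None:
            p.pop(attr, None)  # abstraction: discard one attribute
        for stored in memory:
            projection = {k: v for k, v in stored.items() if k in p}
            if projection == p:  # association: identical after abstraction
                return stored, strips
    return None, None

def learn(percept, memory):
    """Step 3: store a percept so future inputs can associate with it."""
    memory.append(dict(percept))

memory = [{"shape": "circle", "colour": "red", "size": 3}]

# A blue circle matches the stored red circle once colour is ignored,
# exactly as in the red-circle/blue-circle example above.
item, strips = associate({"shape": "circle", "colour": "blue", "size": 3}, memory)
print(strips)  # -> 1
```

Fewer strips mean a closer association: a red circle of a different size needs two abstractions (colour, then size) before it matches, while a square never matches at all and would instead be stored via `learn`.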

#158
SG-1

    Carpe Vitam

  • Members
  • 3,613 posts
  • LocationUS - Arkansas
Humans are way more complex than just raw power. Humans reason, and a person gains that from experience. For AI to be as smart as a person, it will need complex reasoning skills that are extremely hard to create with just lines of code. Plus, if it were just lines of code mimicking human intelligence, it wouldn't really be smart, would it? This is one reason I don't think the singularity will happen this century.

Mark your calendars

Never Yield


#159
Lightblind

    New Member

  • Members
  • 8 posts

Humans are way more complex than just raw power. Humans reason, and a person gains that from experience.


Why? What is so special about human reason? It's just the ability to associate certain aspects of reality with other aspects of reality. Name one thing that is beyond such a simple mechanism. Note that extraordinarily complex processes can follow from a simple mechanism.

For AI to be as smart as a person, it will need complex reasoning skills that are extremely hard to create with just lines of code. Plus, if it were just lines of code mimicking human intelligence, it wouldn't really be smart, would it?


It does not matter what you think of it. You might even deny that it has intelligence or consciousness; the only thing that matters is what it can do! This is a philosophical question: is there a difference between the illusion of intelligence/consciousness and intelligence/consciousness itself?
The AI is as much an accumulation of lines of code as we are an accumulation of dull atoms.

#160
GNR Rvolution

    Member

  • Members
  • 529 posts
  • LocationLondon

Well, here is my opinion on this topic: I think it is only raw processing power that makes our brains so "special". It's a remarkably simple device, which basically:

1. reduces raw data (abstraction)
2. compares abstracted information packages (association)
3. stores gathered connections (learning)

Abstraction (1) is done by some kind of algorithm. The simplest kind of algorithm would be the extraction of a random piece of data from a larger data set. A more complex one would be (as an example) the identification of shapes in a picture. Both have in common that raw data is reduced while information is increased.
In a second step (2) this information is compared to existing packages. Abstracting raw data until it is !identical! to a stored package is the simplest form. The less abstraction is necessary the closer is the connection to the stored data. For example: a red circle is identical to a blue circle when the colour of both pictures is eliminated. Two circles of different sizes become identical when both pictures are scaled to the same size. There is an enormous number of possible associations and until now we are only talking about pictures. Imagine the entirety of human experience.
Finally (3) the gathered connections are stored for later use. Thus new data can be easily integrated into a network of associations. The system learns what to do, for example, with pictures and what circles mean in certain contexts.

With these basic steps every problem can be handled if enough input is available. The only thing missing is a motivation (wants and needs). It seems that those things are hard coded into the human brain. The need for food, rest, sex and so on. Now you have a feedback system that allows you to evaluate gathered information and to set a goal. Keep in mind that this is a simple outline of what I think is necessary for an (A)I.
It is striking that this can be achieved by highly parallel processes with an enormous combined processing power... such as our brains. Neurons have a huge advantage over silicon based computers: they can replicate. They might be, however, much more primitive on their own.

So I think that the only thing preventing us from creating a human-level AI is processing power. We are not able to use billions upon billions of highly interconnected small computers, but we can build microchips that are a billion times faster and thus compensate for such a disadvantage. I guess the first computers with the necessary computing power might already have been built. However, they would have to learn as we humans do and that would take them a lifetime (if programmed properly). In about ten years a supercomputer might be able to learn and process fast enough to see instant results of its programming.
By the way, the necessary input can be provided by the Internet. Even today the available data is sufficient to learn almost everything about the world.


You may be right, but there is a fundamental assumption here that consciousness is essentially an illusion, which is not yet proven. It may well be that everything we feel and do comes down purely to the way the brain is wired, but even so, some of these functions are going to be extremely difficult to replicate in a machine. Emotion, creative and abstract thought, even learning: coding a machine to do these is going to be very difficult even if the processing power is available. You could develop AI without them, but then it would not be the type of sentience you envisage, and much of how we have progressed as a race comes from the emotive and creative aspects of our intelligence.

Take day-dreaming, for example. We humans do this constantly, and although some may regard it as a waste of time, it allows us to process our thoughts and see patterns in things that a simple logical search might never turn up. Identifying relationships between things without context has helped us draw on different areas of expertise to form fused ideas that lead to innovation and to cultural and scientific breakthroughs. Of course, it also allows us to switch off and refresh ourselves, something a machine would never need to do, but that is not its only purpose.
All right, brain. You don't like me and I don't like you, but let's just do this and I can get back to killing you with beer.





