Will the world be forced to abandon capitalism in the next 20 years?


131 replies to this topic

#21
TransAustin (Member, 276 posts, San Diego, CA)

On the topic of sentient AGI, do you think intelligence automatically brings consciousness? Not trying to sound pseudo-sciency, but what if sentience is a bit more than a complex neural network? Maybe we can create AI with unlimited intelligence, but they will never gain sentience. Maybe they can become self-aware, yet still lack emotions and desires. Who knows, maybe they will work as "slaves" for us willingly, simply because they have no reason to object to it.



#22
Jakob (Member, 5,705 posts)

TransAustin said:
On the topic of sentient AGI, do you think intelligence automatically brings consciousness? Not trying to sound pseudo-sciency, but what if sentience is a bit more than a complex neural network? Maybe we can create AI with unlimited intelligence, but they will never gain sentience. Maybe they can become self-aware, yet still lack emotions and desires. Who knows, maybe they will work as "slaves" for us willingly, simply because they have no reason to object to it.

Asimov's robots seem to be this, but even if it can be done in real life, some of them are going to "wake up" through some fluke in their programming. Though personally, I doubt that a nonsentient machine could create and innovate in any meaningful fashion. Such machines might have more processing power than us, and they might be able to use brute force to pull off some clever tricks, but they will do nothing of any importance to society on their own merit. And even if we ignore this, humans will still be relevant for jobs that require sentience, and there are such jobs.



#23
Omosoap (Member, 158 posts)

@TransAustin, I believe it has already been shown, via the wise men puzzle test, that on a very basic level some robots have become sentient.



#24
Unity (Information Organism, 2,477 posts)

TransAustin said:
On the topic of sentient AGI, do you think intelligence automatically brings consciousness? Not trying to sound pseudo-sciency, but what if sentience is a bit more than a complex neural network? Maybe we can create AI with unlimited intelligence, but they will never gain sentience. Maybe they can become self-aware, yet still lack emotions and desires. Who knows, maybe they will work as "slaves" for us willingly, simply because they have no reason to object to it.

Actually, this is really interesting, and it has become the new way of thinking among many people who study consciousness from either a biological or a computational standpoint. The idea is called "embodied consciousness": our physical bodies attract us to and repel us from stimuli based on our needs. For example, when you are thirsty your throat constricts, and when a person is very hungry it produces changes in the amygdala that make them fear existentially for their survival. So one argument is that computers CANNOT produce these kinds of responses: they compute with respect to their environment, and they may even respond to outside stimuli via sensors that trigger changes in software, but those changes do not produce the physiological changes that occur in humans, i.e. altering hormone levels, constricting muscles, changing blood flow, and thus neurological processing. So until a computer can be housed in something that produces these kinds of responses, it may lack certain qualia that we associate with consciousness. However, you can argue this already happens to some degree; for example, fish experience pain qualitatively differently than we do, and redheads are more sensitive to some kinds of pain and less sensitive to others. You can read more about this idea here:

Frederick Adams and Kenneth Aizawa. The Bounds of Cognition (2010).
Anthony Chemero. Radical Embodied Cognitive Science (2009).
Andy Clark. Supersizing the Mind (2008).
George Lakoff and Mark Johnson. Philosophy in the Flesh: The Embodied Mind and its Challenge to Western Thought (1999).
Alva Noë. Out of Our Heads: Why You Are Not Your Brain and Other Lessons From the Biology of Consciousness (2009).
Francisco Varela, Evan Thompson and Eleanor Rosch. The Embodied Mind (1993).
Mark Rowlands. The New Science of the Mind: From Extended Mind to Embodied Phenomenology (2010).

http://blogs.scienti...not-your-brain/
  • TransAustin likes this

#25
TranscendingGod (2020's the decade of our reckoning, 1,750 posts, Georgia)

You guys have personified the robots. One does not have to be human to be sentient. Sentience is not defined by emotions. To say that a machine could not innovate when it is 10^500 times more intelligent than us is nonsense; there is nothing to back that statement. There is nothing more to our brains than their being complex machines. We are not special.

The growth of computation is doubly exponential.


#26
Unity (Information Organism, 2,477 posts)

TranscendingGod said:
You guys have personified the robots. One does not have to be human to be sentient. Sentience is not defined by emotions. To say that a machine could not innovate when it is 10^500 times more intelligent than us is nonsense; there is nothing to back that statement. There is nothing more to our brains than their being complex machines. We are not special.


I'm not sure if this will help, but how are you not a personified robot if you view things from a material standpoint? In other words, isn't the delineation you're making just arbitrary, and why wouldn't social creatures, whether they be cells, wolves, humans, etc., have similarities in how they interact with their environment? For example, you can see altruism and other "personified" characteristics in the other examples I gave.

#27
TranscendingGod (2020's the decade of our reckoning, 1,750 posts, Georgia)

TranscendingGod said:
You guys have personified the robots. One does not have to be human to be sentient. Sentience is not defined by emotions. To say that a machine could not innovate when it is 10^500 times more intelligent than us is nonsense; there is nothing to back that statement. There is nothing more to our brains than their being complex machines. We are not special.

Unity said:
I'm not sure if this will help, but how are you not a personified robot if you view things from a material standpoint? In other words, isn't the delineation you're making just arbitrary, and why wouldn't social creatures, whether they be cells, wolves, humans, etc., have similarities in how they interact with their environment? For example, you can see altruism and other "personified" characteristics in the other examples I gave.
First of all, to answer your question: words are arbitrary, and I could very well be considered a robot, but I am colloquially called a human. To, hopefully, clear up this dilemma you seem to have, I will say quite clearly that the gulf in interaction with and recognition of the environment is immense between the groups you mentioned. Similarities? Of course. Similar enough not to warrant distinction? Never. Listen, I look at things from a "dry" perspective. Instincts, emotions, "consciousness" are things that we have acquired through millions of years of evolution. We can replicate that. 100% we can. It may take us 20 years or it may take us 200. We can make a "conscious" being which is distinctly NOT human. It is entirely possible. Is there a prime directive for intelligent beings that would drive them to be humanlike? There is no scientific evidence of one.

The growth of computation is doubly exponential.


#28
Unity (Information Organism, 2,477 posts)

Yes, I see.

#29
Zeitgeist123 (Member, 1,805 posts)

Quote from earlier in the thread:
We will never be forced to abandon capitalism. Certain leaders may choose to do nothing about technological unemployment and let chaos reign, or, much worse, replace it with an oppressive communist regime (or "welfare capitalism", which is almost as oppressive and dystopian, in addition to being something of an oxymoron). That is not a path we must take. There is still time for us to prepare for technological unemployment.

 

But that's the point: if we apply Moore's law to automation and the leaders do nothing and let chaos reign, then the end result will be the abandonment of capitalism. People will end up democratizing automation, and a large part of human labor and employment will no longer be needed.
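For a rough sense of the arithmetic behind that claim, here is a minimal sketch (an illustration, not from the thread), assuming automation capability doubles on a fixed, Moore's-law-style cadence; the two-year doubling period and the baseline are hypothetical:

# Hypothetical illustration: compound doubling of automation capability.
# Assumptions (not from the thread): capability doubles every 2 years,
# starting from a baseline of 1.0 units of automatable work.
DOUBLING_PERIOD_YEARS = 2
BASELINE = 1.0

for year in range(0, 21, 2):
    capability = BASELINE * 2 ** (year / DOUBLING_PERIOD_YEARS)
    print(f"Year {year:2d}: ~{capability:6.0f}x baseline automation capability")

Over 20 years that compounding yields roughly a thousandfold increase, which is the sense in which "Moore's law applied to automation" would squeeze a large share of human labor out of the market.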


  • TranscendingGod likes this

“Philosophy is a pretty toy if one indulges in it with moderation at the right time of life. But if one pursues it further than one should, it is absolute ruin.” - Callicles to Socrates


#30
TranscendingGod (2020's the decade of our reckoning, 1,750 posts, Georgia)

Unity said:
Yes, I see.


That actually might not be a good thing. You "see" too much. Philosophy is good; in fact, it is integral and in our very nature. However, the universe has no philosophy. The closest thing would be entropy. What we "see" is oftentimes what we project. The scientific method is not impervious. However, it is a sharper tool than the musings of minds. Not greater, for we have the ability to contrive importance, but more refined and efficacious.

The growth of computation is doubly exponential.


#31
Unity (Information Organism, 2,477 posts)

Ok.

#32
TranscendingGod (2020's the decade of our reckoning, 1,750 posts, Georgia)

Unity said:
Ok.


What I said came off as "Oh, shut up, you don't know what you're talking about. Your ideas are wrong and you should follow my style of thinking," when it should have come off more like "I noticed that you are someone who thinks deeply about things, and that is something we should all aspire to do. However, and I hope I'm not overstepping my bounds here, I find that some things are more clear-cut than we often postulate." Either that, or you may be thinking, "This imbecile thinks he actually knows anything about philosophy to be able to make such an audacious statement! Hell, I have a more scientific inclination than this moron can ever hope to achieve, and yet he thinks he knows anything about me, the scientific method, or philosophy? Give me a break." OR maybe you really agree and "ok" literally means ok.

The growth of computation is doubly exponential.


#33
Unity (Information Organism, 2,477 posts)

Quiet your mind. It's too loud.
  • TranscendingGod likes this

#34
MarcZ (Chief Flying Car Critic, 3,251 posts, Canada)

This topic again...

 

Answer is still no. 


  • voluntaryist likes this

#35
Jakob (Member, 5,705 posts)

 

Quote from earlier in the thread:
We will never be forced to abandon capitalism. Certain leaders may choose to do nothing about technological unemployment and let chaos reign, or, much worse, replace it with an oppressive communist regime (or "welfare capitalism", which is almost as oppressive and dystopian, in addition to being something of an oxymoron). That is not a path we must take. There is still time for us to prepare for technological unemployment.

Zeitgeist123 said:
But that's the point: if we apply Moore's law to automation and the leaders do nothing and let chaos reign, then the end result will be the abandonment of capitalism. People will end up democratizing automation, and a large part of human labor and employment will no longer be needed.

Exactly! That's why we can't do that. We need to fix the mess before it starts.



#36
Zeitgeist123 (Member, 1,805 posts)

 

 

Quote from earlier in the thread:
We will never be forced to abandon capitalism. Certain leaders may choose to do nothing about technological unemployment and let chaos reign, or, much worse, replace it with an oppressive communist regime (or "welfare capitalism", which is almost as oppressive and dystopian, in addition to being something of an oxymoron). That is not a path we must take. There is still time for us to prepare for technological unemployment.

Zeitgeist123 said:
But that's the point: if we apply Moore's law to automation and the leaders do nothing and let chaos reign, then the end result will be the abandonment of capitalism. People will end up democratizing automation, and a large part of human labor and employment will no longer be needed.

Jakob said:
Exactly! That's why we can't do that. We need to fix the mess before it starts.

To be honest with you, I find that to be a good thing. Why would anyone want to preserve capitalism in the presence of high automation?


“Philosophy is a pretty toy if one indulges in it with moderation at the right time of life. But if one pursues it further than one should, it is absolute ruin.” - Callicles to Socrates


#37
Jakob (Member, 5,705 posts)

 

 

 

Quote from earlier in the thread:
We will never be forced to abandon capitalism. Certain leaders may choose to do nothing about technological unemployment and let chaos reign, or, much worse, replace it with an oppressive communist regime (or "welfare capitalism", which is almost as oppressive and dystopian, in addition to being something of an oxymoron). That is not a path we must take. There is still time for us to prepare for technological unemployment.

Zeitgeist123 said:
But that's the point: if we apply Moore's law to automation and the leaders do nothing and let chaos reign, then the end result will be the abandonment of capitalism. People will end up democratizing automation, and a large part of human labor and employment will no longer be needed.

Jakob said:
Exactly! That's why we can't do that. We need to fix the mess before it starts.

Zeitgeist123 said:
To be honest with you, I find that to be a good thing. Why would anyone want to preserve capitalism in the presence of high automation?

To preserve innovation, progress, and meaning in life, of course.



#38
Unity (Information Organism, 2,477 posts)

Those things didn't exist before capitalism? You should really read the Hyperion Cantos, Jakob; I'm not sure you see how differently the world would look to an AI than it does to you. For example, a fleet of AIs could just colonize the Oort cloud and other places inhospitable to humans and simply ignore us. It would be silly for an AI to take a job; it could just hack the global economy, transfer all wealth to itself, and then take over as a benevolent leader, rewarding us with resources and caring for us much as we care for ecosystems (though hopefully much better). It may even give us nanobots or something to cure disease while also using those bots to process calculations for solving complex mathematical puzzles. Actually, scratch that: you should read Ilium and Olympos and pay special attention to the character of Prospero. An AI has zero incentive to constrain itself within human economic systems.

#39
TranscendingGod (2020's the decade of our reckoning, 1,750 posts, Georgia)

Jakob, innovation does not stem from a system. Certainly the meaning of life does not derive from capitalism. Progress is stymied by our current system, and a deviation will only serve to help.
  • illykitty likes this

The growth of computation is doubly exponential.


#40
Jakob (Member, 5,705 posts)

Really? You think ANYONE would do ANYTHING if they didn't have to? Only a handful of people would, and even they would not be nearly as productive as if they had to. People will just think, "I'll sit in bed and play video games all day long, but that's okay, because someone else will do the hard work." Problem is, everybody will think that. Tragedy of the commons.

 

The meaning of life derives from meaningful work, which is best done in a capitalist economy. For work is the way to prove that you deserve to exist. If you are valuable enough that someone pays you enough to earn a living, you deserve to live.

 

Our current system is not perfect. But it needs to be improved upon, not scrapped for a "welfare capitalist"/communist dystopia.





