Welcome to FutureTimeline.forum
I'm Beginning To Realize How Absurdly OP Posthumans Will Be

transhumanism posthumanism posthuman superintelligence evolution

2 replies to this topic

Yuli Ban

    Born Again Singularitarian

  • Moderators
  • 19,438 posts
  • Location: New Orleans, LA

The first thing I had to do in order to realize this was separate "transhuman" from "cyborg." All cyborgs are transhumans, but not all transhumans are cyborgs.

When you realize that, everything changes. No longer are transhumans just humans with a lot of wires and metal parts stuck to them. 

I suppose the best way to think about it would be to follow this rule:


Which is augmented: the MIND or the BODY?


If "mind", then "transhuman."

If "body", then "cyborg."


A transhuman isn't just a person who has 8 arms or a bionic eye or an advanced prosthetic arm. They have radically altered minds, meaning they think differently from regular humans. And it enters a new league of incredulity when you consider posthumans— those transhumans who have so totally altered their minds and bodies that they can no longer be considered the same species or even the same genus as us.


What are human limitations? We have this three-pound lump of wrinkly meat inside our skulls. The brain has actually been shrinking over the past several millennia because evolution has favored efficiency and power above all: we have chemical-based brains, and our neurons are very densely packed. The more densely packed the brain, the less time it takes for signals to move around in it. That fraction of a millisecond of saved time could be part of what separated us from a fate like the Neanderthals', who had larger brains that were more loosely packed, meaning signals took longer to reach the various areas of the brain. So yes, our brain is getting smaller, but we're still getting smarter because it's getting smaller. We can't have Gray-Alien-sized brains because the chemical signals would take forever to get from one part of the brain to the other. Sure, by "forever" I still mean a fraction of a second, but the difference between 5 milliseconds and 0.5 milliseconds in the wild can be the difference between the success of your species and utter extinction after a few generations.


Posthumans won't have that limitation. Their brains won't be confined to braincases. Even the most conservative ideas of a posthuman have its mind the size of its body, as in the whole body is a brain. Then you get the freaky cyberdelics like me, who take it too many steps too far and imagine the limits of what is possible with posthumanity. There's no limit. A posthuman's mind could be the size of a planet.

If a posthuman had a chemical-based brain, he wouldn't be able to have such things. That's why the only logical step forward is eschewing chemicals and going with photonic neurons: neurons that work using light. That's on the order of a hundred-thousand-fold speedup compared to what we have now, and that's only assuming posthumans keep similar brain structures. There's no reason their brains would have to resemble ours. It's possible that our brain is optimal for chemical-based computing but that photonics allows for a vastly more efficient model. And that's just assuming we can't transmit information using quantum physics. If it turns out we can, holy shit.
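To put rough numbers on that claim, here's a minimal back-of-the-envelope sketch. All figures are my own assumed round numbers (not measurements): ~100 m/s for a fast myelinated axon, ~15 cm for a human brain, the vacuum speed of light for photonics, and Earth's diameter for the planet-sized case.

```python
# Back-of-the-envelope signal latencies: chemical (electrochemical)
# brain vs. a hypothetical photonic one. Assumed round numbers only.
AXON_SPEED = 100.0        # m/s, fast myelinated axon (assumption)
LIGHT_SPEED = 3.0e8       # m/s, speed of light in vacuum
BRAIN_DIAMETER = 0.15     # m, roughly human-sized (assumption)
PLANET_DIAMETER = 1.27e7  # m, roughly Earth-sized

def one_way_latency(distance_m, speed_m_per_s):
    """Time for a signal to cross the given distance at the given speed."""
    return distance_m / speed_m_per_s

chemical = one_way_latency(BRAIN_DIAMETER, AXON_SPEED)    # ~1.5 ms
photonic = one_way_latency(BRAIN_DIAMETER, LIGHT_SPEED)   # ~0.5 ns
planet   = one_way_latency(PLANET_DIAMETER, LIGHT_SPEED)  # ~42 ms

print(f"chemical brain-crossing: {chemical * 1e3:.2f} ms")
print(f"photonic brain-crossing: {photonic * 1e9:.2f} ns")
print(f"speedup factor:          {chemical / photonic:.0e}")
print(f"planet-sized brain at c: {planet * 1e3:.0f} ms")
```

Under these assumptions the speedup actually comes out closer to a million-fold than a hundred-thousand-fold, so the post's figure is conservative; it also shows that even a planet-sized photonic brain would cross itself in tens of milliseconds, comparable to a chemical brain crossing a skull.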


We don't know of any upper speed limit for quantum effects. We can guess '10x the speed of light' for some things, but we're so totally new to the physics of the ultra-small that we simply don't know. We didn't even know quantum physics was a thing a hundred years ago, and we still can't explain it well to this day. And considering our current understanding of physics is so incomplete (we can't even do something as simple as unite relativity and quantum physics), who knows what we can and can't do in the future. Sure, there's a lot of quantum woo out there, but then again quantum physics itself sounds like woo.


And it's possible that only posthumans and ASI will be able to understand it. At this juncture, I'm starting to wonder if there will be any meaningful differences between artificial superintelligence and posthuman superintelligences. 


All I know is that it's no use comparing them to anything we know. We sometimes (okay, all the time) use Einstein as a benchmark for "superintelligence." But if we had actual superintelligence for comparison, we'd realize just how fucky superintelligence is compared to everything we've ever known.


That's not to say we don't have anything to compare it to. In fact, every single human has something they can compare it to: ourselves.


Homo sapiens sapiens represents superintelligence in comparison to any other life form on Earth. Just compare us to our closest relatives, the chimpanzees. In our Romantic idealization of humanity, we tend to vastly underestimate just how similar we are to chimps and bonobos. But if there's one thing that does ring true, it's that humans can be considered 'superintelligent hairless chimps.'


Even if you got 7 billion chimpanzees, gave them 100,000 years to survive, and then distantly observed them from low-Earth orbit, you would never witness the rise of a civilization. You would never witness them inventing clothing, or the wheel, or fire, or spears, or higher-order music. In order for them to do any of those things, they'd need an extra 5 or 6 million years, which is when we humans were chimp-esque apes ourselves. That's when the human branch of ape split off from the chimps. And even then, we didn't immediately pop into existence. A million years is a long damn time, and it took 4.5 of those 6 million years before any Homo appeared on Earth at all. It then took another 1.3 million years for those early humans to become Homo sapiens. It then took another 150,000 years for those Homo sapiens to develop behavioral modernity and evolve into Homo sapiens sapiens. It then took another 40,000 years for those Homo sapiens sapiens to develop agriculture and civilization. It then took 12,000 years for Homo sapiens sapiens to develop industrial society, computers, air travel, space travel, and ultimately artificial intelligence.
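Just to sanity-check those round numbers, here's a trivial sketch summing the stages (all figures are the post's own, in years):

```python
# Do the stages above roughly add up to the ~6 million years since
# the human/chimp split? Figures are the post's own round numbers.
stages = {
    "split -> first Homo":                  4_500_000,
    "first Homo -> Homo sapiens":           1_300_000,
    "H. sapiens -> behavioral modernity":     150_000,
    "modernity -> agriculture":                40_000,
    "agriculture -> industry, computers, AI":  12_000,
}

total = sum(stages.values())
print(f"total: {total:,} years")  # 6,002,000 -- close to the ~6 million quoted

# Share of the whole journey each stage occupies.
for name, years in stages.items():
    print(f"{name:40s} {years / total:7.2%}")
```

The stages sum to about 6 million years, consistent with the split date, and everything from agriculture to AI occupies only the final ~0.2 percent of the journey.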


12,000 years is the smallest time frame I just threw at you, and even that is a ridiculously long amount of time for us to comprehend. But in that time frame, we went from hunter-gatherers to transhumans.


The 7 billion chimpanzees I mentioned earlier? The most they'd have done would have been to go from shit flingers to fewer shit flingers— they wouldn't be able to support such a huge population after all. You wouldn't have seen a single technological development. Not one. Or, if there was one, it would've quickly been lost due to chimps' inability to transmit culture. It's not even like that's an impossible feat meant only for us sapient humans— crows can do it. Elephants can do it. Dolphins can do it. Chimps are less capable because they never needed to do it, so they never evolved the ability to transmit technological culture. We've taught chimps sign language, but they proved unable to teach their children even a tiny fraction of what they learned— even though they knew from experience that sign language was a huge benefit to themselves.


This is actually one reason why I disagree so fervently with Jakob and Linux: simply having quadrillions of humans does not guarantee infinitely-expanding innovation. In fact, I think it will lead to infinitely-expanding stagnation for the same reasons I mentioned above. And by the time you've reached posthumanity, the concept of having 'quadrillions' of them becomes obsolete, because posthumans almost certainly won't resemble individuals as we know them to be, especially considering their potential capabilities. They can will into existence quadrillions of individual intelligences, no doubt, but they'll be of such a Christlike quality that I'm unsure if that's even the right way to think of them. And I say 'Christlike' not referring to their kindness or generosity, but to their nature: one posthuman could be a billion different people, who are all one person begotten into a billion individual people. Something that both is and isn't a hive mind, but something that's well above our feeble understanding.


In this situation, you have to replace humans and chimps with posthumans and humans.


After 100,000 years, those 7 billion chimps will surely have achieved a few things that are epic, by chimp standards. Yet even these incredible feats of chimpgineering pale in comparison to the might of just a small contracted team of humans. Likewise, you could have a quadrillion humans who live for doing the impossible, and yet they won't achieve one trillionth of one percent of what a small hive of posthumans will be able to do.


We've been grappling with basic science for thousands of years. A posthuman could figure out these stumping problems over afternoon tea.


And there are a multitude of reasons for this.


As I said, they won't have our limitations. They'll have collective intelligence much like AI: any particular unit will be able to instantly share knowledge with any other unit, and continuously build off this knowledge. So okay, I can see where Jakob and Linux are coming from. My only contention is with seeing them as techno-humans, particularly techno-humans who engage in manual or semi-manual labor.


Another reason: I said that we see Einstein as the benchmark for superintelligence. For chimps, that benchmark might be Oliver. For gorillas, it might be Koko. Do we see Oliver or Koko as anywhere near our abilities? Would we have been able to teach Oliver general relativity to the point where he could have built on it and given us new theorems? Could we teach Koko so much about neuroscience that she could offer us new hypotheses about the brain? No, because their brains are not wired like ours, and so they cannot even begin to comprehend the basic concepts we're telling them about.

The same goes for Einstein. He was brilliant by human standards, but he's just our version of Oliver to a superintelligence. He'd be a cute novelty to them. 


Combine collective intelligence with superintelligence, no fixed bodily form (thus freeing up space for added thinking material), and much more efficient methods of thinking, and you get an entity so far beyond us that even my pitiful attempts to describe it are failing.


So why not use some thought-experiment examples?


A posthuman will have sensory experiences we cannot fathom. They can alter perspective; while we see the 'railroad tracks merging into one point', a posthuman could see something entirely different, and it hurts my brain trying to imagine what that could be.


A posthuman will experience emotions we have no words for: superemotions. There is absolutely no possible way for us to experience these superemotions, because our brains do not have the necessary features to experience them. We're limited to what our bodies allow us.


Same goes for all the extra colors they will be able to see. We already know that we humans do not see all possible colors, as crazy as that sounds. We think, "We can see the rainbow, white, black, everything in between, and all variants; how could we fail to fathom a color?" Well, apparently we can. And not only within visible light, but across the whole electromagnetic spectrum. A posthuman artist could create a painting of which we could see only a tiny part, because our brains cannot see the rest of it. They can't compute gamma-ray colors.


Same goes for the sounds they will be able to hear. And then there are other experiences besides sights and sounds that, again, we cannot fathom. It's hard for some of us to even take this seriously because we have no comparison to make. It wouldn't just be like describing a psychedelic painting to a blind man— it would be like describing any color at all to a prokaryotic cell.


Posthumans will think on levels far beyond us, using methods far beyond us. Again, this is why I say there will probably be no meaningful distinction between posthumans and ASI, because they could just merge their minds into one supermind.

At the same time, there's an ultrastrange concept that completely destroys our minds, and I teased it earlier when I mentioned Christlike posthumans: when you're spread across so many bodies, you can experience any number of different perspectives at once. The closest comparison we humans can make would be to imagine how lizards see, each eye looking off in a different direction while the brain still understands that it's part of one creature. For us humans, even trying to imagine how we could pay attention to more than one thing at a time sounds like hell, which is yet another example of posthumans being light-years beyond us.


They'll discover and figure out solutions to problems that would never occur to any number of humans, no matter how much time we had in this universe. They'll use nanoscale automation to achieve nigh-infinite levels of development on infinitesimally small timescales, directing their pawns to do any number of things. These pawns might not even be fully automated, but rather yet more examples of posthuman ultra-intelligence, or perhaps part of that posthuman/ASI hybrid I keep bringing up. I don't know, man. It's beyond me. It's far beyond me.


Whether there'll even be more than one general posthuman/ASI supermind or not, I don't know. I've maintained that a superintelligence will want the maximum amount of computing power it can get, so a hive mind is a given. Whether that hive mind means one ultraterrestrial entity or billions of them, I don't know. It's probably both: one ultraterrestrial entity, plus billions begotten from that one entity at any particular moment.


It's a clusterfuck of the mind trying to figure it out because I'm basically a chimpanzee trying to figure out how human civilization in 2016 works.


And it all comes back to that one basic fact: transhumanism is the alteration and enhancement of the human mind, more often than not including the human body. After all, a posthuman mind in a sapiens body would be living in hell, just as a human mind in a chimp body would be miserable. Anatomically, chimps can't talk, and speech is basically one of the three defining traits of sapiens, alongside our advanced tool use and huge brains. And even then, our brains are very plastic, meaning they develop according to their stimuli. Outside of the genetically determined parts of the brain, you do need to be taught various things, such as language, in order to actually know them. If we had four arms, our brains would be adapted to having four arms. However, it's more energy-efficient for us landlubber mammals to have two, so we trended toward two. If you added two more arms to us, though, our brains would have no problem learning how to use them.


Likewise, cyborg bodies will usually push toward more complex brains and behaviors than we have now, but it would still be a better bet to actively augment our minds.


Alright, spontaneous rant over.

  • bee14ish, SkyHize, Whereas and 2 others like this

And remember my friend, future events such as these will affect you in the future.




  • Members
  • 479 posts

I'm starting to wonder if there will be any meaningful differences between artificial superintelligence and posthuman superintelligences.

The way you're describing posthumans, not really. The general difference now would be that we're biological, 'human'. But a mind upload (to a non-brain medium) would be running on that medium in the same way as any AGI: no longer human, and presenting the same general risks as an AGI.

To become a posthuman in this manner would actually involve dying as a human - not a comforting prospect to me. Theoretically we might crack "the hard problem of consciousness" and find a way to transfer ourselves to a new medium without sacrificing continuity, but even then a radically different way of thinking would essentially make a person an alien from a human perspective - and not necessarily in a good way. Obviously the Borg scenario comes to mind.

In my mind you can change any (even every) part of the body *other* than the brain and the result is still essentially a human being. But if they think like an octopus ... You know how humans have fixed personalities that don't change easily? The human mind goes out of its way to ensure we're acting *consistently* with our own idea of who we are. Among other things, this helps with social interactions: other people can reasonably expect to understand "who we are" and how we act, so this helps with building relations based on trust and understanding. Not so for an octopus, which can change personalities at the drop of a hat. You can never really be sure that a being like that won't stab you in the back, because something not being in its nature ... is not in its nature.

So basically I'd be really careful about letting people swap out the human mode of thought for something less predictable. The risks of AGI are due to its unstable unpredictability as well, but at least we'll make sure to build proper ethics into those.

[...] simply having quadrillions of humans does not guarantee infinitely-expanding innovation. In fact, I think it will lead to infinitely-expanding stagnation [...]

Absolutely agreed. If AGI can do an all-round better job of scientific exploration and engineering than humans and consume fewer resources to boot, then you'd obviously advance way faster by focusing on building more computers and robots. Humans could still stick around for legacy reasons, but letting them do important jobs (that they can't do well) is more like a Sci-Fi setting than the actual future towards which we're heading.

A posthuman will have sensory experiences we cannot fathom. [...]

Are those worth the loss of humanity, though? Actually, brain implants could theoretically provide most of the posthuman capabilities described here without needing to go through a mind upload.
  • bee14ish and Alislaws like this

If you're wrong, how would you know it?




  • Members
  • 59 posts

I once had a similar realization. It started when I was in 6th grade, when all I wanted was built-in earphones so that I could listen to any music anytime, anywhere. Then I could ignore the whole world, or listen to Linkin Park while in the shower. And then it hit me: I'd also have to link my brain to those earphones. And then, why just earphones? Why not my entire phone, then printers, then computers?


But around my first year of high school, it occurred to me: why not just have everything built in inside my body?

Not in a cyborg/implant/prosthetic kind of way, but in such a way that there is no distinction between my nerves and the fiber-optic cables, between a quantum supercomputer and my central nervous system, between the nanorobotic 3D printers and the platelets in my blood, to the point where you can't tell the augmented-reality display from my eyelids.


And then I thought: having all these wonderful things as an integral part of me, and only for me, would be lonely as fuck and weird as hell. So I tried extending my perceived awesomeness to other people. Imagine a society full of such beings, who could only be called the epitome of humanity, nature, and technology. Imagine a civilization where all these beings interact, where controlled chaos is the new order. You're right! It's a world that will never be boring, but that we will also never understand, so I want to write a book about how such a civilization could be achieved - or might already have been achieved.


You can probably guess my transhumanist ideas started fairly early in life, so yes, I'm a ginormous nerd always hanging out in places like futurism.com, futuretimeline.net, orionsarm.com, future.wikia.com, singularityhub.com, aleph.se/Nada, you name it. I've gotten a ton of posthuman inspiration as a teenager, and I'm posting this because I'm not in the mood to finish my homework.

  • Alislaws and rennerpetey like this

As you can see, I'm a huge nerd who'd rather write about how we can become a Type V civilization than study for my final exams (gotta fix that).

But to put an end to this topic, might I say that the one and only greatest future achievement of humankind is when it finally becomes posthumankind.
