When do you think we will start living forever and uploading our minds to android computers?
Posted 08 February 2020 - 10:01 PM
Posted 10 February 2020 - 09:57 AM
The date I always had in mind was 2045. That date is a bit arbitrary; I think it might be the date Ray Kurzweil gave for the singularity. I tie mind uploading to around the same date as the singularity, because the singularity is when AI starts becoming more intelligent than humans. So if AI is smarter than humans, then computers are likely capable of storing and running a human mind (which at that point is less complex than the AI they are already running). Also, computer doctors will be able to do the extremely delicate surgery that might be required for uploading a mind.
So broadly speaking, I am thinking the 2040s, maybe the 2050s if it turns out to be extremely difficult. I think in the near term we will develop better technology for communicating with computers using our minds. It is pretty primitive at this point, though I think it will improve to where we can control more advanced things with our brains. Along with that, we will work out ways to transmit feeling and touch back to the brain from the computer. When it reaches its prime, we will get things like full-dive virtual reality. I think once we get to the point where you can plug into a computer and go surfing the web through your mind, we are at the start of mind uploading. Because if you can project your mind into any sort of computer, there isn't any reason you can't give commands from that computer; and if you receive input from that computer, there is no reason you can't simulate the environment around you in your mind. You are just working out all the kinks and details at that point.
I think in the 2020s we will have really good VR, but it will rely on gloves or maybe special suits, perhaps some brain interfaces that use surface scans of your head to read brainwaves. Then in the 2030s we will have things plugging into our heads. Computer interfacing should become extremely powerful once we can plug things directly into our brains. Then in the 2040s, once we become more comfortable with things plugging into our heads, we will start cutting the cords and moving more permanently into machines.
As for living forever, I think there are more kinks to work out with that. I suspect not everyone who becomes an android will live forever. In fact, I highly doubt it, since there might still be things like wars, accidents, and murders far past this point. These technologies might also be expensive, and income inequality might still be an issue. Though that goes back to the singularity: hopefully the singularity will help us work out systems to better manage resources so we can enter a post-scarcity society. That should reduce violence. Anyway, once we get to the point of moving a mind into a computer, you certainly have the possibility of living forever, which is much better than the odds a fleshy body has.
Posted 10 February 2020 - 12:29 PM
Posted 13 February 2020 - 02:54 AM
I think the technology will exist this decade to make a partial copy of a human's mental essence, using advanced new forms of non-invasive, wearable BCIs (brain-computer interfaces). The data-transfer rates will be incredible -- on the order of megabytes to gigabytes per second. The "copy" would be made as follows: the BCI, together with other sensors recording video, audio, and location in space (accelerometers), would produce a record of a person's brain activity as they interact with the world. Perhaps "interventions" could be tried, where a computer flashes certain well-chosen images on a screen to elicit brain activity, probing the hidden aspects of the neural code.
Once enough data is acquired -- say, "in the wild" recordings for 1000+ hours, over the span of weeks to months, perhaps combined with many more thousands of hours from other brains (to train a model to predict brain patterns, in general, and then fine-tuned to a particular brain) -- it can be fed into a computer, which builds an abstract model of your mind using Deep Learning. The model should even acquire the ability to learn new things in real-time.
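The recipe above -- pretrain a model on recordings pooled from many brains, then fine-tune it on one person's data -- can be sketched with a toy example. Everything here is hypothetical: the "recordings" are synthetic feature vectors, the behavioural targets are made up, and a simple linear model stands in for the deep network; it only illustrates the pretrain/fine-tune structure, not any real BCI pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # dimensionality of a (synthetic) "brain recording" feature vector

def make_subject(weights, n):
    """Generate n synthetic (recording, response) pairs for one subject."""
    X = rng.normal(size=(n, d))
    y = X @ weights + rng.normal(scale=0.1, size=n)  # response + sensor noise
    return X, y

# Population data: 20 subjects who share structure but differ individually.
base = rng.normal(size=d)
pop_X, pop_y = [], []
for _ in range(20):
    w_subj = base + rng.normal(scale=0.2, size=d)  # per-subject variation
    X, y = make_subject(w_subj, 200)
    pop_X.append(X)
    pop_y.append(y)
pop_X, pop_y = np.vstack(pop_X), np.concatenate(pop_y)

# "Pretrain" a general model on the pooled data (closed-form least squares).
w_pre, *_ = np.linalg.lstsq(pop_X, pop_y, rcond=None)

# Fine-tune on a small amount of data from the one target individual.
w_target = base + rng.normal(scale=0.2, size=d)
Xi, yi = make_subject(w_target, 30)

w = w_pre.copy()
lr = 0.01
for _ in range(200):  # a few gradient steps adapt the general model
    grad = Xi.T @ (Xi @ w - yi) / len(yi)
    w -= lr * grad

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

err_pre = mse(w_pre, Xi, yi)  # general model on the individual's data
err_ft = mse(w, Xi, yi)       # fine-tuned model on the same data
```

The fine-tuned model fits the individual better than the pooled model does, which is the whole point of the "train in general, then specialize" scheme; the deep-learning version described in the post would follow the same shape at vastly larger scale.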
The model will probably have some of the person's long-term memories -- e.g. memories that are widely distributed across the brain, encoding language understanding and procedural knowledge -- and it will also pick up personality and many other enduring memory traces. It's possible that, using interventions, one could even record a lot of autobiographical memory.
The model will not be an "upload" in the usual sense, since it will be an "abstract model" of the brain, not a model of individual neurons, synapses, dendrites, neurotransmitters, and so on. But to outward appearances, it will behave exactly like the person being "copied".
Now the question is: will this upload be the actual person?
I am not sure. Nor am I sure if a more traditional kind of upload (where neurons, synapses, dendrites, etc. are explicitly modeled), or even a biological copy of the person + memories, would be "the same person".
On the one hand, believing that it is the same person requires believing in a conspiracy -- e.g. that a lot of what we think of as our continuous existence (and that of everyone we know) is an illusion, and that perhaps when we go to sleep and wake in the morning, a new copy of our consciousness is being run on our biological hardware. On the other hand, not believing in the conspiracy requires inventing rules for how the universe ought to behave -- e.g. rules that constrain identity to a single body; the copy isn't "you" because some law of physics prevents it, say.
I think I lean more towards believing in the conspiracy, and tend to think the rules people dream up are a lot of wishful thinking. They want to believe there is something inviolate about the human mind and spirit; surely something so central to who we are can't just be shut down and restarted without anyone being the wiser.
Alan Watts had a good way of putting this belief system, which is actually very similar to what is taught in Hinduism:
He says that each individual person is the universe -- more specifically, the Brahman / non-dual essence behind all reality -- playing a part, as though in a game or a play. It's like: the universe suddenly decides it wants to have a little fun and play you; so it assembles the matter of your body together and, voilà, it's ready to play you in the great game of life. And then when it's bored, it shuts you down, and you wind up in the grave -- and it moves on to play other parts in other plays. In playing you, it seals itself off from the whole of existence (the whole of itself) -- it creates an illusion for itself, blinding itself to its own divine essence. In other words, the universe is acting like a "method actor", really getting into the part of playing you.
The thing about this belief of Watts's is that it carries the implicit message that "there's nothing special about you", and at the same time, "there's something very, very special about you". On the one hand, the universe can play you again and again, and even multiple copies of you at the same time, in many different ways (e.g. implemented as a Deep Neural Net); you're just one of many possible manifestations of the characters it can play. On the other hand, all of these separate existences still have the divine spark of Brahman -- too bad they can't see how they are connected to the whole.
I'm not sure I believe Watts's / Hinduism's take, as it sounds like inventing more rules. But it is one of the more satisfying ideas I've heard on this subject. Maybe in the near-future when mind-uploads of the type I described become routine, people will revisit Watts's lectures.
Posted 13 February 2020 - 11:04 AM
The one thing that has always confounded me about individual identity is the whole spectrum of "you's" that has existed, will exist, or can in principle exist in the universe. That is, "who you are" is not tied to a constant, unmalleable Platonic essence; rather, it is a volume of possibilities that gets transformed ever so slightly at each passing moment. These small changes accumulate over the years, eventually arriving at a "you" that is completely different from the "you" of today, despite possessing many of the same qualities and idiosyncrasies.
But this raises the question: is there a definition of "you"? If your base identity can be forged and retooled by your experiences of interacting with the universe, what are the boundaries of "you"? How many distinct quantum states of you can "you" inhabit before it ceases to be "you" and turns into "someone else"? Can there be overlap between "your" and "someone else's" multidimensional vector in space and time?
I constantly imagine what kind of person I would be today, had I made different choices in life - and I'm not even talking about those big, important decisions that can come to define a person; just mundane, everyday choices. Because they all add up. And there is no doubt in my mind that those parallel universe versions of me where I made a different choice would still be "me", just with a divergent path after a fork in the road. And some of those infinite versions of me will inevitably go through some traumatic experiences that will render them unrecognizable to the "me" of their yesteryear. Yet, they would still continue to be "me". But what if a version of me got lobotomized and lost 80% of his brain matter as a result? Would that still be "me"? What about 90%? 99%? No doubt there are scenarios, other than death, that could disrupt the thread of a continuous "me" from existing in any given universe at any given time. But do those scenarios ever NOT have to involve serious, life-altering trauma? Could I, by some seemingly mundane choice, perhaps a moral one, render my entire subjective identity nullified? Who would "I" be in that scenario?
Ultimately, I am also on the side that says all of the above, though vexing from a philosophical standpoint, must be an illusion. There are, in fact, as many or more conceivable unique modes of being as there are possible arrangements of cells in the human body. And that is not even taking into account all the possible sentient biological organisms, nor forms of non-biological intelligence. What cannot be said, however, is that certain subsets of those arrangements can be grouped together to form what is 'the essence of a person'. There is simply no reason to assume that there exists some neatly defined analog of an ecological niche for your unique essence, just waiting forever for "you" to pop into existence. The line between individual personalities is blurred.
"But what about all these memories I have of my childhood? There must exist a metaphysical link to my former self", you might ask. There is a link all right -- only, that link is an illusion, and that person is dead. Or, I should rather say, each one of them (more on that in a bit). What you believe to be a memory of the days of yore is just your brain's faint representation of it. You look at the memory from the context of your current model of the world, not from the internal viewpoint of that person actually experiencing it. And if you were to scan both of your brains, you would find indicators of similar brain patterns between the two of you; but you only share some remaining neural correlates of your younger self.
Let's now look at the idea of creating a perfect biological copy of you (it might as well be digital, but for the sake of argument, let's say it is a precise biological copy), throwing away our former quasi-mystical notions of the self: There you are, lying down with electrodes on your scalp. Upon the completion of the process, you don't feel so much as a tickle. Your clone proceeds to walk out of the copying machine, having the distinct experience of being conscious. It appears no essence of you is being broadcast over the airwaves into your clone. You are both having a unique - and separate - subjective experience at that exact point in the space-time continuum, by virtue of consisting of biological cells that are arranged in a pattern suitable to yield consciousness.
"Why am I not that guy?" Because "you" are the particles that make up your structure at that very point in time! No one can be that guy, because every single instance of conscious experience is unique, limited only by all the possible configurations of matter in the universe that allow for it. Your current identity did not "come to be experienced by you"; it is you. It is an illusion to think that you are some ghost gliding through time, looking at your life from afar, who just so happened to latch onto this particular person - that you could have been someone else. If we were to augment your brain, the resulting person would be it, not you. Every instance of conscious experience is the exact sum of its parts. Consequently, every moment is a small death. You, as of this moment, can never be someone else; nor can you ever know what it is like to not be you.