Humans are animals. It is true that we are very different from other animals in our ability to understand complex problems that other animals can't even begin to grasp. But in the end, we're just animals. We react to various stimuli. We don't actually make our own choices; rather, every choice we make traces back to some stimulus that makes us desire to do that thing. People talk about "nature", but really, everything is nature. We are nature; everything we make, do, or think about is nature. Everything that exists is technically "nature". My desire to write this forum topic is due to some stimulus in my brain; I may be making the decision to write it, yes, but to some degree it is automatic, based on how my brain reacts to certain things around me.
Anyway, the point is that humans have limitations. Now let's look at the technology we've created up to today. Every bit of that technology runs on some human command. If we left the planet, along with all that technology, the technology would just stop doing things, since humans would no longer be telling it to do anything. If I left the room as I was typing this topic, and never touched this Chromebook again, you would never have seen this topic, because the Chromebook isn't intelligent enough to just scan my brain and finish the topic for me based on what I would say, essentially emulating my brain.
Another thing: every piece of technology we make has some link to our human culture and behaviors. I once wondered what it would be like if the internet were the only universe that existed, and I just literally lived inside a computer and browsed the internet. Thinking about it now, that wouldn't be possible, because without real life, the internet would barely have anything to talk about. There would be no videos of people IRL, because where would the people be? And how could we make video games without any animals, insects, people, coins, etc. to base them on? Donkey Kong wouldn't exist as he is if gorillas didn't exist IRL, and Sonic wouldn't exist as he is if hedgehogs didn't.
Due to human limitations, I don't believe the far future is going to be anything particularly special in technology. Yes, there will be major improvements, but eventually physical and biological limitations will bring the constant stream of technological improvements we see today to a stop, or at least a very large slowdown. A "singularity" is sort of ridiculous if you think about it from this standpoint, since I believe we in 2017 are already very close to the point I speak of, where we have solved most of the problems in science and technology that we can solve. Of course, there will still be ideas and theories, like "teleportation" or "quantum computing" (which may or may not come around, I don't know; I'm just using those as examples), that we as humans will not be able to improve past a certain point, or bring about at all.
In regards to a technological "intelligence explosion" or the "Singularity", I think it's possible (sort of) that something similar to that idea might happen. But I wouldn't count on anything like it happening anytime soon. I see robots becoming part of our society and being as intelligent as us in about 200 years. We won't be alive to see it, if it is even possible for us to do at all.
Note that I said AS intelligent as us. Meaning it'll take a huge amount of time and effort just to make machines as intelligent as US, or nearly so. Don't even mention machines billions of times smarter than us by 2045; that's WAY too soon by a long shot (and in fact impossible, period).
Sure, maybe one far-off day, we'll have machines that are a bit more intelligent than the smartest of us. But they're not gonna be GODLY or anything like some suggest. They'll just be automatons that help us with stuff. They'll still have issues and problems, just like everything does.
How can we expect to create a god when we're not gods ourselves? You can't just make something out of nothing. These robots aren't just going to increase their own intelligence; how would they even know how to? We can't even do that to OURSELVES! We can't just give ourselves godlike "intelligence", so how do we expect to make a machine do the same? Machines will not be able to exceed the limitations that we humans know. Beyond that is impossible.
See, there are some forms of knowledge, vision, or intelligence that humans will biologically NEVER be able to grasp. We DON'T know what it would be like to be a god, because we're not gods. A hypothetical "god" could understand things that no human ever born could think of for even a second. Neither will the machines we create be able to understand such things. Machines have limits too. Though they're amazing, they're not infinite.
Also, intelligence can't just be defined by a number. It's a term we humans made up; it has no concrete, measurable meaning. It's not something you can just grab and max out. It's not like a life count in a video game, which, with certain hacks, you can max out to the highest possible number.
In conclusion, this is my personal opinion. In regards to the 2045 theory, or the 2070 theory, or whatever, I'm sure most of us on this forum will be alive in 2045, so we'll see then whether the singularity happens as predicted. But, frankly, I think most of us here will be quite disappointed: 2045 will not be the year of any unique singularity, but will just be another normal, human year.