Mike The Average
Mining - We grow with technology. By the time we are capable of this ridiculously massive engineering and logistical task, better technologies will have evolved.
Space travel - Travel times are huge and uncomfortable. Views from space through your helmet or spaceship windows wouldn't be much more interesting than on your PC. More likely, weightlessness would be the greater experience. The masses have little interest in space.
Colonies - Great for an apocalyptic scenario, or when Trump decides aliens, meteorites or zombies are attacking. This forgets the entire history of mankind and civilisation. We are social beings, heading to a future less dependent on all things physical.
So for mining, are you saying we will achieve the ability to break atoms apart and rebuild them into different elements before we manage practical asteroid mining? Mining isn't really a technology; it's basically the act of extracting materials from the environment. I could agree that we won't have a bunch of rugged individualists in their independently owned mining vessels drifting through the asteroid belt with massive drills. But the idea that we will never extract resources from space seems odd.
Space and colonies:
Just because the masses have little interest in space doesn't mean we won't explore or pursue it. The masses had little interest in global exploration in 0 AD, but we still eventually figured out where everything was, and over the following thousand years huge numbers of people moved or migrated to other parts of the world on uncomfortable, dangerous and slow sailing ships. By your logic, that would never have happened.
The only reason for humanity to never get into space travel or colonisation would be for us to achieve an incredible utopia on Earth, upgrading the carrying capacity infinitely, or eliminating the human desire to procreate (and we'd need 100% agreement on doing that, because any group disliking the status quo in a highly advanced society would probably be up for terraforming Venus or whatever, unless we just murdered dissenters, in which case that's pretty dystopian and we'd get people trying to escape to other planets).
- You won't get to live forever unless you have a very loose definition of "you", "live", or "forever".
- No teleportation of physical objects. Though I can imagine workarounds that are just as good.
- Anything that implies human nature will change. Caveman principle.
- Little to no control over higher toposophic minds. No magical perfect super AIs that always do your bidding and never have their own interests that may conflict with yours. Unless an even higher sophont creates one and gives it to you.
- There are of course many techs we could develop, but probably won't because they're pointless, like battle mechs or glowing laser swords.
Longevity: I still can't wrap my head around why you think an indefinite lifespan is either physically impossible, or just somehow beyond the ability of human intellects to solve no matter how long we try. I could see the argument that we won't manage it for 1000 years or longer, so if, when you said "you won't get to...", you literally meant that people alive today won't, then I can see your point.
Teleportation: Does look impossible. If it were somehow possible, I doubt it would ever be practical for any kind of large-scale use; it would probably need physical laws to break down inside black holes, or weird wormhole physics, or something.
Human Nature: In our essential motivations we are the same, agreed, but many aspects of how people behave, societal norms, etc. have changed massively since caveman times, so it depends on how you're defining human nature.
AIs: Do you believe superintelligent AI is not creatable by humans? Or that it will inevitably go out of control? Assuming it is possible to create them, then:
If you build something's brain, you get to decide how it thinks. It doesn't matter how smart it gets: if you have built its brain to only want to do one thing, it will only want to do that thing, and all of its awesome intelligence will end up devoted to doing that thing, because it doesn't want to do anything else. The real danger from superintelligent AIs comes from the humans controlling them, and from people giving them stupid motivations that lead to them killing us all.
BattleMechs and laser swords: These may someday be created, but not for the envisioned sci-fi uses in actual combat. I would definitely be up for watching 100-ton battlemech wrestling as a sport, and I imagine people thousands of years from now (when it might be practical) will also think so. Similarly laser swords: more for the coolness than any practical use.
sex-bots: if you want a sex bot with feelings, wait a few centuries.
So first you say sex-bots won't happen, then you say "sex-bots with feelings" won't happen. One of those technologies will be much, much harder than the other.
Saying " we will never have sex bots" seems silly since we already do. and saying "we will never have sentient artificial robot people capable of having sex with people" seems silly because the sex part is trivial compared to the "sentient AI with feelings" part.