
2045 predictions?

Tags: singularity, 2045, kurzweil, ray kurzweil, transhuman, transhumans, transhumanism


#1 wjfox (Administrator, 11,589 posts, London)

For my next prediction(s), I'll be doing a major update of the 2045 page.

 

Give me your ideas, and maybe I'll include some of them. :)

 

The basic gist is that we won't quite have reached the Singularity by then (as predicted by Kurzweil), but there'll be very strong signs that we're approaching it. I'm particularly interested in the impact of bio-implants and mainstream adoption of transhumanist tech. So I'm thinking – based on the current trend in miniaturisation – we'll have nanobots in our bodies by then (as described in the current entry for that page), but they won't be so sophisticated that they make us all geniuses. The effects will be more subtle, and mostly related to bodily monitoring/routine health maintenance, as opposed to the massive intelligence boosts and cognitive upgrades that will likely come later in the century.

 

Maybe people's bodies will become like the Internet of Things – or rather an "Internet of People" – with vast volumes of biological data accessible in real time.

 

Anyway, please let me know your suggestions for 2045. Here's what I've written so far (not much) –

 

 

 

 

2045

The Singularity enters the public consciousness

In earlier decades, a number of doomsday predictions with at least some basis in reality had captured the public's attention. These global cultural phenomena were hyped to unwarranted levels and included Y2K (the so-called Millennium Bug), end of the world predictions for 21st December 2012 (based on a false interpretation of the Mayan calendar system), and the year 2038 problem.

In 2045, the latest such craze arrives in the form of the "Singularity", a prediction formulated some forty years previously by the noted futurist Ray Kurzweil.

 



#2 Sciencerocks (Member, banned, 13,326 posts)

Depends on the next few years politically...This will shape the next 30 years.

 

I know you don't want to hear it, but if the "rich" keep taking the wealth of the bottom 90% of this country, America in 2045 will likely be quite poor outside of its upper class. Very few people will be able to afford the more expensive stuff of the day...

 

I think robotics and AI will become huge within the workplace and most people won't be able to find work. Work will either be 1. high-paying skilled jobs like scientist, doctor, etc., or 2. piss-ant jobs that aren't done by robotics and pay very little. This will probably end up adding to the pain of the wealth transfer.

 

The rich will probably enjoy the best medical care and tech 2045 has to offer. They'll be able to tap into the Chinese or East Asian markets that drive tech, while no one else will.



#3 Yuli Ban (Born Again Singularitarian, Moderator, 21,687 posts, New Orleans, LA)

Solar power becomes the dominant method of energy production

 

At the dawn of the 21st century, renewable energy was considered a fringe industry and it seemed clear that fossil fuels and nuclear fission would remain the dominant sources of energy needed to power our ever more voracious civilization. Progressives, environmentalists, and global warming activists sounded the alarm on the dangerous amounts of pollution wrought by burning coal and oil, and championed alternative energy sources to stave off the worst effects of catastrophic climate change. However, due to the high price and low efficiency of solar panels and wind turbines, little action was taken except by the most forward-thinking of governments, corporations, and citizens' groups. At the start of the 2010s, even the most optimistic predictions suggested that solar energy would comprise no more than 10% of global power generation by 2050. Even worse, the promise of nuclear fusion seemed to remain just as fleeting as it was in the 1950s.

As the 2010s progressed, however, the price of solar power plunged, following an exponential trend that sent shockwaves throughout society. In places such as the United Kingdom, goals that weren't expected to be achieved until decades later had been reached by 2016. A large part of this explosive success was due to China, which turned to renewable energy following horrific levels of air and water pollution in its rapidly modernizing cities.

Solar power broke through the 10% mark by 2023, and by 2045, well over 50% of civilization's energy needs are met by the sun. More than 40% is met by wind, tidal, and resurgent nuclear power, with nuclear fusion only now being rolled out for commercial use. Coal as an energy source is effectively obsolete.

While the effects of climate change are being felt nevertheless, they are not being amplified by continued fossil fuel use, granting mankind the time and attention needed to develop methods of reversing the damage caused by previous generations.

 

http://imgur.com/a/2rWxy
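As a rough illustration of the kind of curve that entry implies, here is a minimal logistic-growth sketch in Python. The anchor points (roughly 1% of world energy in 2015, 10% by 2023) and the 80% saturation ceiling are assumptions chosen for the example, not figures from the entry or the linked album.

import math

# Minimal logistic-growth sketch of solar's share of world energy.
# The anchors (~1% in 2015, 10% in 2023) and the 0.80 ceiling are assumptions.
L_CEIL = 0.80
anchors = {2015: 0.01, 2023: 0.10}

# Fit k and t0 from the two anchors via the logit transform:
# ln(S / (L_CEIL - S)) = k * (t - t0)
(t1, s1), (t2, s2) = sorted(anchors.items())
z1 = math.log(s1 / (L_CEIL - s1))
z2 = math.log(s2 / (L_CEIL - s2))
k = (z2 - z1) / (t2 - t1)
t0 = t2 - z2 / k

def share(year):
    return L_CEIL / (1.0 + math.exp(-k * (year - t0)))

for year in (2030, 2040, 2045):
    print(year, f"{share(year):.0%}")
# Under these assumptions the curve crosses 50% in the early 2030s and sits
# near 80% by 2045 -- consistent with the "well over 50%" claim above.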

 

 

I just realized I wrote that in a kinda sorta Timeliney way. I'd recommend not using that.


And remember my friend, future events such as these will affect you in the future.


#4 _SputnicK_ (Member, 82 posts, USA)

2045

  • $1000 now buys a computer a billion times more intelligent than every human being combined
  • AI surpasses humans as the smartest and most capable life forms on the planet

This is, of course, based on the timeline presented by Kurzweil.
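For a sense of scale, here is the back-of-the-envelope arithmetic behind that first bullet. All of the constants are rough, order-of-magnitude assumptions made for the example (brain throughput, world population, what $1000 buys in 2018), not Kurzweil's own figures.

import math

# Rough arithmetic behind the "$1000 buys a billion times all human brains" claim.
# All constants below are order-of-magnitude assumptions.
brain_ops  = 1e16                               # assumed ops/sec of one human brain
population = 8e9                                # rough world population
target_ops = brain_ops * population * 1e9       # a billion times all brains combined
dollars    = 1000
ops_per_dollar_2018 = 1e13 / 1000               # assume ~10 TFLOPS for a $1000 GPU in 2018

factor    = target_ops / (dollars * ops_per_dollar_2018)
doublings = math.log2(factor)
print(f"required growth factor: {factor:.1e} (~{doublings:.0f} doublings)")
print(f"implied doubling time to get there by 2045: {(2045 - 2018) / doublings:.2f} years")
# ~73 doublings in 27 years, i.e. price-performance doubling every ~4-5 months --
# roughly the accelerating schedule Kurzweil assumes, far faster than classic Moore's Law.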


"We are attempting to survive our time so we may live into yours." -Jimmy Carter, 1977

"In my mind and in my car, we can't rewind we've gone too far. Pictures came and broke your heart, put down the blame on VCR." -The Buggles, 1980


#5 wjfox (Administrator, 11,589 posts, London)

Perhaps I should have been clearer. I think we have enough climate/energy predictions for the 2040s pages. It's really the transhumanist / AI stuff I need help with. Basically I'm trying to describe a world in which the Singularity is (very) near, and nanobots/implants are possibly becoming mainstream, and the effects that will have on everyday life, society, culture and politics. How widespread will this technology be, what sort of people would use it, what sort of abilities will it provide? How would it be regulated and/or exploited by corporate interests or nefarious individuals? Also, what about genetics? Could we be heading for a Gattaca-style society? Any references (especially graphs/trends) would be very helpful. :)

 

P.S. Kevin Warwick became the world's first cyborg in 2002, and I saw a recent BBC report saying there are now 200 such individuals in 2017. I'm guessing the number will grow exponentially at some point. Maybe we could think of 2045 as the "inflection point" for transhumanism and general AI? As mentioned though, I want to keep the really advanced stuff until the late 21st/early 22nd century, as our knowledge of the brain won't be sufficient until then.
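Purely as a back-of-the-envelope exercise, extending those two data points (1 implantee in 2002, ~200 in 2017) along the same exponential trend gives a feel for what that inflection point could look like by 2045. The extrapolation is illustrative only, not a figure from the BBC report.

import math

# Naive exponential extrapolation from the two figures quoted above.
n_2002, n_2017 = 1, 200
rate = math.log(n_2017 / n_2002) / (2017 - 2002)   # continuous growth rate per year
print(f"doubling time = {math.log(2) / rate:.1f} years")

n_2045 = n_2017 * math.exp(rate * (2045 - 2017))
print(f"implantees in 2045 if the trend simply continued = {n_2045:,.0f}")
# Roughly 4 million people -- enough for implants to feel mainstream, yet still
# a small fraction of the population, i.e. an "inflection point" rather than ubiquity.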



#6 Outlook (Arab Muslim, Member, 1,310 posts, Barbary Lands)
The Middle East is either really profiting from solar panels, desalination, and an essentially blank canvas of a terrain *or* it's Mad Max: brwnppl
Outlook's secret song of the ~week: https://youtu.be/Gnyr3sbdKkU

#7 Yuli Ban (Born Again Singularitarian, Moderator, 21,687 posts, New Orleans, LA)

The reason it's so hard to guess what will happen in 2045 is AI's growth.
 
Some people still persist in the notion that we need strong AGI to do anything. But as we've seen, even good-enough weak, narrow AI can lead to major upheavals in society. I've repeatedly mentioned "futureshock", including a variant of futureshock that's so acute that people may need psychological counseling just to function. Amazon Echo, Google Home, and the like are just the latest examples. In department and warehouse stores in America, you can commonly come across gadgets such as VR headsets, wearable technology, drones, domestic robots, and home AI systems in the electronics section— just two years ago, the most futuristic thing you could find was probably the latest model of iPhone or maybe a smartwatch. My local Target even has a dedicated 'Robots' section. One day soon, we could even enter dollar stores and find these items. So people will become aware of these rapid changes, and it will all be too much.

 

And yet it'll still get faster. And not just a little faster. There won't be a slowdown just because Moore's Law is at an end.
 
Starspawn0 claims that we will very soon utilize brain scans to supercharge AI to levels not even possible with deep learning. Frighteningly, he appears to have been right on the money. Almost no predictions of AI that I've ever seen even entertain the thought of using brain-scanning technology to assist with the development of AI.

starspawn0 said:

I should have probably given some more references. Here is one on how adding a little brain data to word vectors makes them work much better:
http://www.cs.cmu.ed...nse_acl2014.pdf
And here is a paper on using eye-tracking data to improve sentence compression:
https://arxiv.org/abs/1604.03357
A 10,000 or 100,000-dimensional vector representing averaged activity patterns of 10,000 or 100,000 different populations of neurons would be like 5 quantum leaps higher in terms of what you could do with it, I would think. If even just a dash of brain data can improve word vectors, imagine what you could do with a 100,000-dimensional vector read off from brain activity, as people watch videos, read a novel, or respond to text chats.
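To make the general idea in that quote concrete, here is a toy sketch of fusing a brain-derived feature vector with an ordinary word vector. The dimensions and the random "recordings" are placeholders; this is not the method of the linked papers, just the rough shape of the approach.

import numpy as np

rng = np.random.default_rng(0)
vocab = ["dog", "cat", "car"]

# Placeholders: in practice these would be pre-trained embeddings (word2vec/GloVe)
# and per-word averages of recorded neural activity (fMRI/EEG), not random noise.
word_vecs  = {w: rng.standard_normal(300)  for w in vocab}
brain_vecs = {w: rng.standard_normal(1000) for w in vocab}

def zscore(v):
    return (v - v.mean()) / (v.std() + 1e-8)

def combined(word, brain_weight=0.5):
    # Simple fusion: z-score each modality, weight it, then concatenate.
    return np.concatenate([(1 - brain_weight) * zscore(word_vecs[word]),
                           brain_weight * zscore(brain_vecs[word])])

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(combined("dog"), combined("cat")))
# With real recordings, the hope is that semantically related words end up closer
# in the combined space than they do with text-only vectors.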

 

Let's also consider accelerating returns. We cannot achieve better results without better technology. In fact, we develop better computer chips utilizing the previous state-of-the-art chips. Likewise, we will not see any major breakthroughs in AI without first utilizing AI itself to find novel parameters. I mentioned in another thread that the biggest hurdle towards the development of AGI (weak general AI, mind you) is transfer learning. We've known for decades that this was the biggest obstacle we had to overcome if we ever wanted true general-purpose AI. 

What did DeepMind do? They achieved transfer learning. 

Brain-scan technology by itself wouldn't lead to transfer learning; that's something we needed to develop independently. So all the pieces are in place for an atomic decade of AI growth.

 

And I feel like I keep saying it with every major rant/discussion/ramble session—

Computers are godlike number-crunching machines.

AI is our attempt at automating pattern-finding.

Stronger computers lead to stronger AI, which means that we can crunch ever greater numbers and find ever more obscure patterns that even humans will miss. Better AI will allow for higher resolutions from brain-scans. Without said AI, we'd follow Tim Urban's trajectory, where we wouldn't see meaningful BCIs until the late 21st century. It's similar to the concept of 'brute forcing' things. If we had brute-forced Go, we would've had to wait a whole century before a computer defeated the world champion. But we found a workaround by utilizing intelligence, allowing computers to do things humans do better than humans do them.

The same thing will prove true for other technologies once we get around to throwing AI at them. Prosthetics, autonomous vehicles, robotics, energy production, genetic engineering, deep physics, medicine— all of these things benefit from AI. It's the lack of sufficiently advanced AI that's keeping us from developing Deus Ex-styled cybernetics. It's AI that we need to achieve ultra-cheap genome sequencing for things like genetic medical tech, cloning, and species de-extinction. It's AI that we require to create ultra-high efficiency solar panels and nuclear fusion reactors.

 

Higher quality brain-scans will lead to superior AI, which will lead to even higher quality brain-scans, and so on and so forth. It's sort of like a runaway effect. 

 

That's why it becomes difficult to see out any further than the 2020s. AI predictions are sort of like the IEA's solar predictions at this point: some of them only make rational sense if all AI development suddenly stopped or declined. It sounds ridiculous right now, but so did the idea that we'd have 300 GW of solar capacity by 2016.


And remember my friend, future events such as these will affect you in the future.


#8 superfishy (New Member, 5 posts)

I agree with sciencerocks that AI and robotics will play a huge role in not only the workforce, but... well, practically everything. By that point, if 99% of jobs can be performed better and more cheaply by AI/robotics, I can see a universal basic income system being put into place in most, if not all, of the first world by 2045. I think this will be the number-one most argued topic in the politics of 2045, and there will be talk of shifting into an entirely new economic system (for example, something similar to the system in the Star Trek universe). I can see a completely new system being put in place by the late 2050s or 2060s if all goes well.



#9 sasuke2490 (veteran gamer, Member, 474 posts)

By 2045 we should have decent nanotech to ensure everyone has a reasonable standard of living, even in developing and underdeveloped nations. Finding jobs, however, will be difficult.


https://www.instagc.com/scorch9722

Use my link for free steam wallet codes if you want


#10 name (Member, 20 posts)

Asteroid mining should be a thing by then because asteroid mining companies already exist and are testing satellites.



#11 LWFlouisa (Member, 97 posts, NashChat)

I feel something very different from a utopian singularity on the horizon. The rich super-elites reach a singularity, while everyone else beneath them technologically regresses to something like the Middle Ages over time. As the poor become less able to help the rich stay in power, the whole structure slowly falls to ruin. Eventually, remnants of humanity's former advanced super-culture leave behind a mystery of their past, with sentient macro-computers spanning the globe in underground catacombs of our former way of life.

 

So by 2045, you begin to see subtle hints that this is the case – such as retro-technology becoming more in vogue, and the rich encouraging the poor to use lesser technology so that they can't compete, not realizing it plants the seed of their own destruction over the next 1,000 years.

 

Assuming things like what Dr. David Jacobs and Richard Dolan propose aren't true, and we won't have been absorbed and bred with aliens by that point.


Cerebrum Cerebellum -- Speculative Non-Fiction -- Writing

Luna Network -- Nodal Sneaker Network -- Programming

Published Works: https://www.wattpad.com/432077022-tevun-krus-44-sword-planet-the-intergalactic-heads


#12 caltrek (Member, 10,768 posts)

@LWFlouisa,

 

A variation on that theme would be health care: super-advanced treatments and procedures affordable only to a tiny elite, while the vast majority suffer from a progressive degradation of health care services. One day a plague super-virus arrives, and the rich suddenly wish there were a better-developed public health care system in place to contain the outbreak.


The principles of justice define an appropriate path between dogmatism and intolerance on the one side, and a reductionism which regards religion and morality as mere preferences on the other.   - John Rawls


#13 LWFlouisa (Member, 97 posts, NashChat)

caltrek said:

@LWFlouisa,

 

A variation on that theme would be in regards to health care.  Super-advanced treatments and procedures affordable to a tiny elite while the vast majority suffer from a progressive degradation of health care services. One day a plague super-virus then arrives, and the rich suddenly wish there was a better developed public health care system in place to contain the outbreak.

 

Yea, and I don't think it's imminent imminent. It's just something on my mind a lot lately.

 

I don't think any future is inevitable. And wanting to use flash drives (in a world of thunderbolt wire transfer) is different from, say, the world suddenly deciding to be Amish for some reason.

 

The scenario I entertained at one point, was after a kind of reverse singularity event, or a singularity split.


Cerebrum Cerebellum -- Speculative Non-Fiction -- Writing

Luna Network -- Nodal Sneaker Network -- Programming

Published Works: https://www.wattpad.com/432077022-tevun-krus-44-sword-planet-the-intergalactic-heads


#14 Jakob (Stable Genius, Member, 6,143 posts)

 

LWFlouisa said (quoting caltrek):

@LWFlouisa,

 

A variation on that theme would be in regards to health care.  Super-advanced treatments and procedures affordable to a tiny elite while the vast majority suffer from a progressive degradation of health care services. One day a plague super-virus then arrives, and the rich suddenly wish there was a better developed public health care system in place to contain the outbreak.

 

Yea, and I don't think it's imminent imminent. It's just something on my mind a lot lately.

 

I don't think any future is an inevitable. And wanting to use flash drives (in a world of thunderbolt wire transfer) is different from say the world suddenly deciding to be Amish for some reason.

 

The scenario I entertained at one point, was after a kind of reverse singularity event, or a singularity split.

 

Go read Orion's Arm. Instead of any society-wide technorapture, the only Singularity is personal ascensions done on an individual basis, while most people stay at a level that is, while advanced, still fully comprehensible to baseline humans. This model makes far more sense if you ask me.



#15 LWFlouisa (Member, 97 posts, NashChat)

 

 

Jakob said (quoting the exchange above):

@LWFlouisa,

 

A variation on that theme would be in regards to health care.  Super-advanced treatments and procedures affordable to a tiny elite while the vast majority suffer from a progressive degradation of health care services. One day a plague super-virus then arrives, and the rich suddenly wish there was a better developed public health care system in place to contain the outbreak.

 

Yea, and I don't think it's imminent imminent. It's just something on my mind a lot lately.

 

I don't think any future is an inevitable. And wanting to use flash drives (in a world of thunderbolt wire transfer) is different from say the world suddenly deciding to be Amish for some reason.

 

The scenario I entertained at one point, was after a kind of reverse singularity event, or a singularity split.

 

Go read Orion's Arm. Instead of any society-wide technorapture, the only Singularity is personal ascensions done on an individual basis, while most people stay at a level that is, while advanced, still fully comprehensible to baseline humans. This model makes far more sense if you ask me.

 

 

I like it, although a lot of the science fiction I do is ... well, weird for hard sci-fi, yet too hard for social science fiction.

 

I like things that are mildly fantastical, like a universal language that all species communicate in through body language. But also some ... perhaps? ... overly accurate depictions about next year's programming projects.

 

Social Scifi, Hard programming.


Cerebrum Cerebellum -- Speculative Non-Fiction -- Writing

Luna Network -- Nodal Sneaker Network -- Programming

Published Works: https://www.wattpad.com/432077022-tevun-krus-44-sword-planet-the-intergalactic-heads


#16 Alislaws (Democratic Socialist Materialist, Member, 2,106 posts, London)

LWFlouisa said:

I feel something highly different from a utopic singularity on the horizon. The rich super elites reach a singularity, while everyone else beneath them technologically regresses to something like the middle ages over time. And so as the poor is less able to help the rich stay in power, it slowly begins to fall to ruin. And eventually remnants of humanities former advanced super culture leave behind a mystery of their past, with sentient macro computers spanning the globe in underground catacombs of our former way of life.

 

So be 2045, you begin to see subtle hints this is the case. Such as retro-technology becoming more in vogue, the rich encouraging the poor to use lesser technology so that they can't compete, unwittingly not realizing it plants the seed of their own destruction over the next 1,000 years.

 

Assuming things like what Dr. David Jacobs and Richard Dolan proposes aren't true, and we wont be absorbed and breed with aliens by that point.

 

At the moment, technology starts expensive, and therefore restricted to the rich, and then over time becomes cheaper and more accessible.

 

There would need to be significant changes to society, and a conscious decision by the wealthy and powerful, to stop poor people getting access to new tech. 

 

Automation and AI are going to mean the rich need poor and uneducated people less and less, so if they made this decision, ultimately, they would just kill everyone who isn't useful to them (with their terrible robot legions).

 

Or they will maintain high levels of technology and education everywhere, allowing much more rapid scientific advancement from the billions of highly educated people working with advancing AI to solve problems.

 

By the time we get to the point where they wouldn't need the masses for anything, because AIs can do everything, we will also have hit the point where they can just get the masses hooked on TIVR, get some AIs to look after them, and then ignore them, while the world's elite shoot themselves into space and go off to find their own private planets or whatever.



#17 funkervogt (Member, 937 posts)

Ha ha! 2045 will come and go without the Singularity happening, and Ray Kurzweil will spend it either in a wheelchair or as a popsicle at the Alcor Life Extension Foundation. 

 

There will be a little bit of renewed interest in the Singularity, though the milestone will have been long forgotten (or never heard of before) by 99% of people alive then. I predict some other noted futurists will publish essays about whether the Singularity has any conceptual validity, and if so, what its new arrival date should be. 

 

Narrow AIs and robots will be frighteningly numerous and capable by 2045, so if anything, the prospect of machines taking over will be more plausible then than today, making people more open to talking about it. 

 

By 2045, a supercomputer that matches current upper-level estimates of one human brain's computational specs will cost single-digit millions of dollars (in 2018 U.S. dollars), meaning midsize companies and even small universities will be able to buy one, and big governments and leading tech companies will have hundreds of thousands of them. If AGI hasn't been invented by 2045, underpowered hardware will no longer be the limiting factor; the software side will be.
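As a quick sanity check on that price claim, here is the arithmetic with assumed numbers: an upper-end estimate of ~1e18 ops/sec for the brain, a roughly $1 billion price tag for a machine of that class in 2018, and a steady price-performance halving time. None of these figures come from the post itself.

# Project the 2045 price of a "one human brain" supercomputer.
# Both constants are assumptions for illustration: ~$1 billion in 2018 for a
# machine delivering ~1e18 ops/sec (an upper-end brain estimate).
COST_2018_USD = 1.0e9
HALVING_YEARS = 3.5      # assumed price-performance halving time

years_ahead = 2045 - 2018
cost_2045 = COST_2018_USD / 2 ** (years_ahead / HALVING_YEARS)
print(f"projected 2045 cost: ${cost_2045:,.0f}")
# About $5 million under these assumptions -- the "single-digit millions" range;
# a faster halving time pushes the figure well below that.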

 

Most importantly, 2045 will be the centennial of WWII's end, and there will be events all over the world. The number of surviving veterans--out of tens of millions of men and women who served under arms--might be in the hundreds. 



#18 starspawn0 (Member, 1,623 posts)

I also am skeptical of the Singularity.  However, I think we may be in one of those eras of rapid change at the moment, through the rise of Deep Learning -- and, soon now, BCIs.  Oh, robotics, AR and VR will also be transformational.  CRISPR and biotech might go slower than people hope.

 

2045 is 27 years away. What other 27-year era might we compare it to?

 

I think, maybe, it could be like the one from 1973 to the year 2000.  That was an era of profound change.  We had the whole personal computer revolution, the fall of the Soviet Union, the beginnings of the internet, the rise of large internet tech companies.  We even had cellphones and rudimentary versions of smartphones.  It was a wild ride.  Maybe The Matrix got it right that 1999 was the pinnacle of human civilization -- at least so far.



#19 funkervogt (Member, 937 posts)

starspawn0 said:

I think, maybe, it could be like the one from 1973 to the year 2000. That was an era of profound change. We had the whole personal computer revolution, the fall of the Soviet Union, the beginnings of the internet, the rise of large internet tech companies. We even had cellphones and rudimentary versions of smartphones. It was a wild ride. Maybe The Matrix got it right that 1999 was the pinnacle of human civilization -- at least so far.

Also note that the 1973-2000 period was, in retrospect, a period of profound, positive change, but while people were living through it, it could be downright sucky. Think of Nixon's resignation, stagflation, the end of Detente with the Soviets, the near-nuclear war of the 1980s, terrible fashion fads, any number of civil wars in the Third World, widespread belief in declinism, etc. 

 

When Agent Smith said that the Matrix world of 1999 was "the peak of [human] civilization," he might have been referring to that general time period as opposed to the specific year 1999. Recall that Morpheus tells Neo that humans invented AI sometime in the "early 21st century," which is a flexible label that could be attached to a year as late as 2030, I'd argue. 

 

Similarly, if you used a time machine to go back to Rome during the 200-year-long Pax Romana period, you would say to people that this represented "the peak of their civilization," even if the particular year you happened to be there wasn't itself the peak year. 

 

The Architect reveals that Zion is destroyed once every 200 years, meaning the Matrix computer simulation is on a playback loop that resets at that interval. The Machines probably viewed the 1830-2030 timeframe as "Pax Humana," bookended by the start of the Industrial Revolution and the creation of AI. 



#20 funkervogt (Member, 937 posts)

 

Yuli Ban said:

Solar power broke through the 10% mark by 2023, and by 2045, well more than 50% of civilization's energy needs were being met by the sun. More than 40% was also being met by wind, tidal, and the resurgent nuclear power, with nuclear fusion only just now being rolled out for commercial usage. Using coal for energy demand is effectively obsolete.

While the effects of climate change are being felt nevertheless, they are not being amplified by continued fossil fuel use, granting mankind the time and attention needed to develop methods of reversing the damage caused by previous generations.

I'm more conservative than this, but I still think at least 50% of civilization's energy (not just electricity) needs will be met by clean energy by 2045. "Clean" energy includes solar, wind, nuclear, geothermal, hydroelectric, and H2 (so long as it isn't produced by combusting fossil fuels). The developed world--including China--will have almost abandoned coal power by 2045. 

 

 

 

name said:

Asteroid mining should be a thing by then because asteroid mining companies already exist and are testing satellites.

Yeah, but given how long it will take to find valuable asteroids, send space ships to them, and either return the ships to Earth with mined cargo or steer the asteroids to near-intercept courses with Earth, 2045 is way too early to expect any company to be making real profit off of asteroid mining. In 2045, there might be someone like a new Elon Musk who is attracting a lot of startup capital for such a business, but that's it. 

 

 

 

LWFlouisa said:

So be 2045, you begin to see subtle hints this is the case. Such as retro-technology becoming more in vogue, the rich encouraging the poor to use lesser technology so that they can't compete, unwittingly not realizing it plants the seed of their own destruction over the next 1,000 years.

I wouldn't assume the poor will be so dumb that they'd fall for such an obvious strategy. 

 

 

 

caltrek said:

A variation on that theme would be in regards to health care. Super-advanced treatments and procedures affordable to a tiny elite while the vast majority suffer from a progressive degradation of health care services. One day a plague super-virus then arrives, and the rich suddenly wish there was a better developed public health care system in place to contain the outbreak.

I'm of two minds on this. 

 

On the one hand, I remember that drug patents expire after 20 years, meaning that all the blockbuster drugs made before 2026 will be dirt cheap by 2045. So poor people in 2045 will have access to cheaper drugs than we do today, and to more advanced drugs than even the richest people can get today (think of what might be invented between now and 2026). Also, I remind myself that there are government health care systems like Medicaid that are highly likely to still be around in 2045, and the masses of the poor will always be able to use the vote to allocate public funds to those programs to keep affordable/free healthcare service available to themselves. 

 

On the other hand, I also think it's possible there could be cutting-edge medical interventions in 2045, such as nanomachines and therapeutic cloning, that are only available to the rich, and that radically extend their lifespans and/or quality of life. If that sort of disparity existed, people might look at it with a great sense of injustice and grievance. 

 

 

Alislaws said:

At the moment, technology starts expensive, and therefore restricted to the rich, and then over time becomes cheaper and more accessible.

And in 2045, that trend will be as strong as it is today. 

 

 

Alislaws said:

There would need to be significant changes to society, and a conscious decision by the wealthy and powerful, to stop poor people getting access to new tech.

No it won't, and for several reasons. First, "the wealthy and powerful" aren't of one mind, don't function as one, and never have. Many of them are actually ethical people who would reject the course of action you mention. Many more of them are narrowly self-interested, and will break with the pack in a second and sell advanced technologies to the poor if they sense it would profit them personally, the Grand Evil Rich People Plan be damned. Second, as I wrote earlier, poor people in 2045 are unlikely to be so dumb that they'd fall for such an obvious strategy of disempowerment against them. Muckraking journalists who are just as smart as the evil rich people and who have access to their social circles would also expose the plan early on.

 

 

Alislaws said:

Automation and AI are going to mean the rich need poor and uneducated people less and less

This is true, and by 2045, mass technological unemployment will be a major problem. Using today's methodology, the U.S. unemployment rate might be ~20% and rising each month; this will be the case even during economic expansions and bull markets; robots will be everywhere you look (e.g. I go through a typical day in 2018 without seeing ONE robot, but in 2045 I'll probably run into a dozen on a typical day); and everyone will be able to see what's happening. Machines will be destroying old human jobs faster than new human jobs can be created. The pace of change and the constant need to re-train will also discourage some competent people from even trying to participate in the workforce.

 

 

 

Alislaws said:

...so if they made this decision, ultimately, they will just kill everyone who isn't useful to them (with their terrible robot legions).

Not by 2045. For one, even the most advanced armies will still rely on human soldiers. I doubt machines will get powerful enough to wipe out humanity--or just the poor part of humanity--until late this century. 

 

 

Alislaws said:

By the time we get to the point where they wouldn't need the masses for anything, because AIs can do everything we also have hit the point where they can just get the masses hooked on TIVR, get some AIs to look after them and then ignore them. While the world's elite shoot themselves into space and go off to each find their own private planet or whatever.

This is a much more likely and humane outcome. In fact, the idea is so good that you'd think someone would have already thought of it (panem et circenses).

 

By 2045, virtual reality is going to be vastly better than it is today. Designing virtual worlds, NPCs and quests could also be automated, meaning each player would have their own, little world where struggles were neither too easy nor too hard, and where they were always #1. 

 

Apropos: http://www.bbc.com/n...nology-44040007

https://www.nytimes....eally-good.html

 

Instead of the rich eating the poor in 2045, I think we'll have a sort of hostage situation where the poor use their voting clout and perhaps the threat of mob violence to force the rich to pay for stuff like free healthcare and free access to virtual reality gaming. In their heart of hearts, many rich people might wish for robot armies they could send in to slaughter the ingrates, but various factors will prevent them from doing so. 






