The difference between the speed of reality and super-intelligent AI



#1
Outlook

    Arab Muslim

  • Members
  • 1,024 posts
  • Location: Barbary Lands

And the dynamic it will create in the relationship between humanity and AI. This is all off the top of my head, so bear with me.

What is Speed?

It can be said that humans process and react to the natural world at the optimum rate that allows us to survive. Physically, it's a balance with the energy we use, but mentally, how fast could we process the world before we sort of disconnect and deal almost entirely with abstract thought instead?

Think of it this way. A super-intelligent AI capable of processing and thinking orders of magnitude faster than a human being is observing the real world through sight and sound. The amount of interesting stuff occurring within a given unit of time will vary considerably depending on what it observes and the scope of its observation. Politics, for example, has a ceiling: you can process every domestic political event in your country, but only so many events actually occur. There is also the fact that moving something from point A to point B, whether it's trade exports, military vehicles, or android bodies, takes time, and the speed of that movement has to strike a balance between breaking everything around it and being so slow that it impedes the AI's ability to get what it wants.
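To make the "orders of magnitude" point concrete, here is a rough back-of-the-envelope sketch in Python. The speedup factors and the eight-hour shipment are my own illustrative assumptions, not anything established; the point is only to show how much subjective thinking time a faster mind would rack up while a single physical event plays out.

# Back-of-the-envelope sketch (illustrative numbers only): how long does a
# fixed real-world event "feel" to a mind that thinks some factor faster?

SPEEDUP_FACTORS = [10, 1_000, 1_000_000]   # assumed cognitive speed multipliers
EVENT_SECONDS = 8 * 60 * 60                # e.g. a shipment that takes 8 real hours

for factor in SPEEDUP_FACTORS:
    subjective_seconds = EVENT_SECONDS * factor
    subjective_days = subjective_seconds / 86_400
    print(f"{factor:>9,}x faster: 8 real hours feels like "
          f"~{subjective_days:,.0f} subjective days of thinking time")

Even at the low end of those made-up multipliers, the physical movement of goods, vehicles, or bodies becomes the dominant wait, which is exactly the disconnect the thread title is pointing at.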

Can it be said, however, that these limits have more to do with humans than with AI? Technically, the limits of physical speed are set by humans. Certain areas of human society speed up and slow down for reasons entirely dependent on humanity. Take cars and roads. A human gets into a car, speeds up, and then has to slow down and get out to a slower walking pace due to the limits of their physicality. They can't drive into a workplace, work from their car, then drive into their home to sleep, eat, and live in their car. Even when the argument is made that humans leave their cars to perform physical actions, like typing, cleaning, manufacturing and so on, these are actions AI is expected to do in the future at a rate far faster and more efficient than humans.

So what matters more is the place of humans in the coming world: the significance humans retain, and how AI deals with the already established human-centric structure of civilization.

But I'm going to digress and explore what a civilization controlled purely by a super-intelligent AI might look like in its most foundational sense, assuming that AI follows the same desire for growth and power that humans do, so that I can then apply this to human civilization.

Super-intelligent Society

Focusing on the topic of this thread, a super-intelligent society placed under pressure will optimize for speed, since speed means faster development, growth, and acquisition of whatever the super-intelligence wants, and so it creates highly organized networks of transportation and infrastructure. In a sense, it mirrors human civilization, especially Canada and the US, whose whole infrastructure revolves around roads and the freeway. Unlike human civilization, however, it does away with the physical limitations of its central actors, opening up a realm of possibility for the speed of development.

The only area where I can see this speed getting muddy is, ironically, warfare or catastrophe. Damage to this highly organized structure means the AI will have to rely on something able to deal with the physical pace of nature, not in movement, but in handling the material itself. Mining, or natural resource acquisition in general, also comes to mind.

If we continue with this ideal, we can assume that this AI civilization will then grow forever by laboriously turning everything natural into something synthetic, but this is where the ideal disconnects from the reality of things, and we return to human civilization.

Central Actors & Society

"Central actors" is just what I call the center, or heart, of a society, which in this one is humans. We can define them as the reason a society exists: without them, the society doesn't exist. Currently, the central actors of this general (global) society are humans, and always have been.

Spoiler


Central actors define the purpose of a society, and we see that purpose vary greatly between cultures, philosophies, and religions. Of course, people still act mostly on their basic needs. Then "above" that we begin to see the diversity of human acts based on emotion and intelligence.

So basically, what defines a society is the dynamics of what people want out of its structure. With the arrival of superintelligence, this will not change. Speed will be desired just as well, since today's society already desires it, both in internal competition and in the general drive to improve its present condition. What will likely happen, then, is a merger of the human desire for physical speed and super-intelligent desire in the realms that both find mutually suitable. A human would welcome aspects of this highly organized system, but not to the extent that it would endanger them.

And I leave it on that note.

What do you think will follow from the disconnect between a super-intelligent AI's ability to process information and the speed of the world around it?

EDIT: Reading this again, I realize I'm not clear on some parts and I just use whatever big word comes to my head. I'm hoping to expand on these ideas soon.



Outlook's secret song of the ~week: https://youtu.be/6S20mJvr4vs


#2
Hyndal_Halcyon

    Member

  • Members
  • 67 posts

This reminds me of The Minsky Network. A god already walks among the characters, but its mobile body is made of recycled materials. So damn melodramatic sci-fi movie material. It really shows how limited things actually are, but also how overly optimistic they would seem, as well as how it's actually achievable with enough planning and care, when you decide to keep humans around. The thought makes me feel like we are not just an overrated bunch of bipedal pets designed to entertain some higher power, because we can actually convince gods to convince themselves to join our causes, no matter how diverse, depending on how we build them and how necessary our cause is to them. Somehow, we're really gonna be more like cats who think humans are the servants. Like full-blown katoikidias.

 

So then, what becomes of a god's disembodied mind living among its not-so-pets? Much like human and stray-dog society, nothing much, really. Just like you said. At least, nothing we would notice right away. Ultimately, it's going to lead to an Orion's Arm-style ultramodern cosmos. Basically, everything we can have tomorrow must have already been imagined today. Just don't stretch things too far, like automobiles to flying cars. That's not how it works.

 

Simply put, we will be kept around by superintelligent gods, because we will keep each other around, but only if we become those superintelligent gods first. The only way for a society of gods and men to exist is if a superintelligent man comes to be. Let's make it rain in psychotronics, everyone.

 

P.S. I know. I'm quite annoyed myself at how I could make a highly technical article sound like the premise of some homocentric religion. HFY really rubs off on me.



As you can see, I'm a huge nerd who'd rather write about how we can become a Type V civilization than study for my final exams (gotta fix that).

But to put an end to this topic, might I say that the one and only greatest future achievement of humankind is when it finally becomes posthumankind.


#3
Erowind

    Member

  • Members
  • 935 posts

I think everything you said makes sense in the broadest sense, although I only read the tl;dr of your spoiler, so I could be missing some nuance. The biggest question I've been grappling with related to AI is whether or not it's something we can even do in the next 100 years. Algorithms are great, but our brains are more than algorithms. They are an amalgamation of a very complex set of deterministic physical reactions that exist both within them and outside them. If embodiment is vital to consciousness, as Natasha Vita-More thinks, we may create an AI that simply breaks every time we run its program, because we don't understand what type of body and environment it needs to live. If brains actually need bodies to form consciousness, our work is much harder than initially thought.

That's only one example; there are more nuances that many researchers are taking a "we'll deal with it when it comes up" approach to, even though, if we don't dedicate theoretical effort to these problems now, those roadblocks will hold us back decades if not centuries. The same applies to cryogenics, by the way: save your damn body, because you might not be able to function initially as a brain in a jar. I'm also skeptical that consciousness even exists, which, if true, opens up a whole new realm of problems in understanding what exactly we are and what an AI could possibly be.


Current status: slaving away for the math gods of Pythagoras VII.




