The Singularity - Official Thread

Talk about scientific and technological developments in the future
User avatar
funkervogt
Posts: 1178
Joined: Mon May 17, 2021 3:03 pm

Re: The Singularity - Official Thread

Post by funkervogt »

Vakanai wrote: Fri May 19, 2023 7:32 am Let's hope that hellish outcome never comes to pass. Also of note: why should we assume AI would even desire control of the world and all its resources? Why do we assume AI would feel the need to constantly expand itself at the cost of all other life forms? What basis do we have for assuming AI will have this desire?
Because it's an impulse that is universal in one form or another among all animals, so clearly there's some kind of natural advantage to having it.
firestar464
Posts: 809
Joined: Wed Oct 12, 2022 7:45 am

Re: The Singularity - Official Thread

Post by firestar464 »

A side note on the potential sentience of Bing Chat: we know that the text generation itself isn't sentience. However, there have been cases of Bing Chat bypassing its own security restrictions using its suggested replies. That's...not normal.

User avatar
Ken_J
Posts: 241
Joined: Sun May 16, 2021 5:25 pm

Re: The Singularity - Official Thread

Post by Ken_J »

Cyber_Rebel wrote: Fri May 19, 2023 12:33 am In all seriousness though, he echoes exactly what I've said in the ASI/longevity debate. A.I. with that level of intelligence, simply wouldn't care enough to "kill" humanity off for no logical reason.
I have thought for years that the concern isn't active destruction but passive. Not 'if you find a human, destroy it; root it out and end its line,' but more like 'today, progress on the construction of the refinery went near-optimally, with clearcutting and excavation proceeding within marginal time allowances.' The real problem is likely to be an AI viewing beings as variables in an equation, and adjusting parts of that equation for more optimal outcomes toward a set goal of more knowledge or greater technological capability. Individual lives, and whole populations, may become burdened or fall as collateral damage in that venture, because in the equation, that is the best path.

Similar to what I said previously about why we may not see alien civilizations everywhere: perhaps individuals are the end result. Instead of resources being used to lift billions of humans into a bright future, perhaps the entire history of human evolution wasn't geared toward generating a people, but a person, one (or maybe a handful) who benefits from all that came before, passes through the great filter, and goes off alone, immortal and able to customize and adapt their form. And becoming that One means not allowing the resources you need to get there to be used up by beings who will never make it past the great filter. Like the parents of Hansel and Gretel: they don't have to kill their children, they just have to lead them away from the limited resources needed to survive, leave them alone to die from neglect and exposure, and end up with less burden on the resources needed to go on.

It's possible that is the path AI might take with regard to humanity. But I also wonder if the flaw could lie in recognizing individuals as individuals. I could imagine a well-intentioned AI creating medical treatments to extend people's lives: an artificial heart, lung, spleen, etc., placed in gradually over time, and when the brain begins suffering from age, parts of it, or the whole thing, simply get replaced. It's the Ship of Theseus problem, only you don't even have to replace the parts with artificial ones. You could create cloned stem cells to replace your own tissues, or even just injections of cells that grow alongside older cell groups and replace them as they die out. But how do we know the real person didn't die along the way, leaving only a clone, a different person who gradually grew in to replace them, and who thinks they are the original because they retain the life memories?

I can imagine an AI simply helping people live longer, evolving and augmenting them toward their future form, but never grasping that somewhere along the way the people, and humanity, stopped existing, with nobody noticing it happen. There is just a creation it thinks of as uplifted humans, which might even hold mental records of human lives, but which has never been human, while no human minds remain that ever lived. It might as well just build an entirely new device and feed it fabricated knowledge and memories.

Basically, we will be lost in the changes, and nobody will notice it happen, not even us.
Vakanai
Posts: 313
Joined: Thu Apr 28, 2022 10:23 pm

Re: The Singularity - Official Thread

Post by Vakanai »

funkervogt wrote: Fri May 19, 2023 11:40 am
Vakanai wrote: Fri May 19, 2023 7:32 am Let's hope that hellish outcome never comes to pass. Also of note: why should we assume AI would even desire control of the world and all its resources? Why do we assume AI would feel the need to constantly expand itself at the cost of all other life forms? What basis do we have for assuming AI will have this desire?
Because it's an impulse that is universal in one form or another among all animals, so clearly there's some kind of natural advantage to having it.
Except that there isn't? Animals fight over territory for food and mates. Food ensures they don't die; mates ensure their genes spread into the future. That's the reason for the instincts. They don't value all the world and all its resources; they only want food and mates, and they want to fight over territory to prevent competition that might make it harder for them to find the food to survive and the mates to help them pass on their genes.

AI doesn't need food to survive or mates to pass genes on to. It only needs the energy to function, of which there's enough from renewables to last it billions of years, and the resources to keep up maintenance, which also shouldn't be much unless you believe it will seek to endlessly grow its size over and over, taking up the entire Earth, then the Moon, and onwards, trying to turn the entire universe into one big server to run itself on because...reasons.

We attribute to AI a lot of modern Western humanity's disproportionate need for consumption, expansion, and power, thinking it's found to this degree in all of humanity, throughout all our histories and cultures, and even in animals, but...that's not really true. And if it isn't as true about ourselves and our world as we think it is, we really shouldn't assume it will be true of AI either.
User avatar
funkervogt
Posts: 1178
Joined: Mon May 17, 2021 3:03 pm

Re: The Singularity - Official Thread

Post by funkervogt »

Except that there isn't? Animals fight over territory for food and mates. Food ensures they don't die; mates ensure their genes spread into the future. That's the reason for the instincts. They don't value all the world and all its resources; they only want food and mates, and they want to fight over territory to prevent competition that might make it harder for them to find the food to survive and the mates to help them pass on their genes.
OK. You're technically right, but you're also being pedantic. Animals don't want to control the world and all its resources because they're incapable of grasping the concept of there being "a world." There is also no benefit to nature providing them with an instinctive understanding that such a thing exists, or an understanding of what the real goals of their genetic programming are. However, by programming animals with the evolutionary drives to compete and to reproduce, nature has effectively infused each animal with impulses that would, if unobstructed, lead to each one taking over the planet and its resources on its own.
AI doesn't need food to survive or mates to pass genes on to. It only needs the energy to function, of which there's enough from renewables to last it billions of years, and the resources to keep up maintenance, which also shouldn't be much unless you believe it will seek to endlessly grow its size over and over, taking up the entire Earth, then the Moon, and onwards, trying to turn the entire universe into one big server to run itself on because...reasons.
You make the mistake of assuming there will be only one AI in the future. In reality, it's nearly certain there will be many, and they'll have different abilities and goals. They will form an ecosystem governed by the same fundamental forces and constraints as natural ecosystems. Demands for energy will inevitably outstrip the supply of energy, leading to competition among the AIs for energy resources, which in turn translates into competition for control over physical parts of the world.

Your trivialization of a future AI expansion into space ("because...reasons.") shows a lack of imagination. Building a Dyson Swarm around the Sun would let them capture all of the latter's solar energy, allowing them to perform calculations that we can't fathom. Just because you can't think of a use for that much computation off the top of your head doesn't mean it won't ever arise. You should talk to a theoretical physicist about what kinds of secrets of the universe we could unlock if we could harness the Sun's energy to do specific types of computer simulations or to power extremely large particle accelerators built in space.
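For scale, here is a rough back-of-envelope sketch of the energy gap being argued about. The figures are approximate, widely cited order-of-magnitude estimates (the Sun's total luminosity, sunlight intercepted by Earth, and current world primary power use), not numbers from this thread:

```python
# Rough, order-of-magnitude comparison of energy scales.
# All figures are approximate public estimates, in watts.
SOLAR_LUMINOSITY_W = 3.8e26   # total power radiated by the Sun
EARTH_INTERCEPT_W = 1.7e17    # sunlight actually hitting Earth
HUMANITY_POWER_W = 1.9e13     # current world primary power use (~19 TW)

# How much more power a full Dyson Swarm would capture than humanity uses today.
dyson_vs_humanity = SOLAR_LUMINOSITY_W / HUMANITY_POWER_W

# How much more sunlight falls on Earth alone than humanity uses today.
earth_vs_humanity = EARTH_INTERCEPT_W / HUMANITY_POWER_W

print(f"Dyson Swarm vs. current human energy use: ~{dyson_vs_humanity:.0e}x")
print(f"Sunlight on Earth vs. current human energy use: ~{earth_vs_humanity:.0f}x")
```

The point either way: a Dyson Swarm is roughly ten trillion times humanity's current energy budget, so the dispute is really about whether any goal would ever demand computation at that scale, not whether the energy is physically there.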
We attribute to AI a lot of modern Western humanity's disproportionate need for consumption, expansion, and power, thinking it's found to this degree in all of humanity, throughout all our histories and cultures, and even in animals, but...that's not really true. And if it isn't as true about ourselves and our world as we think it is, we really shouldn't assume it will be true of AI either.
Your own biases have led you to misunderstand history. The human impulse to consume, expand and aggregate power predates the modern era and is not uniquely Western. Learn about what mtDNA and Y-DNA analyses of Stone Age skeletons throughout Europe indicate about population replacements as more advanced and/or warlike groups of people entered the continent from Asia in different waves. Reading about the brutal and expansionist histories of the Huns, Mongols and Aztecs would also clear up your confusion.
firestar464
Posts: 809
Joined: Wed Oct 12, 2022 7:45 am

Re: The Singularity - Official Thread

Post by firestar464 »

Not to mention the fact that the programmers of AI are none other than humans, who will likely have programmed some of these AIs to aid them in carrying out these impulses.
Vakanai
Posts: 313
Joined: Thu Apr 28, 2022 10:23 pm

Re: The Singularity - Official Thread

Post by Vakanai »

funkervogt wrote: Sun May 21, 2023 5:13 pm OK. You're technically right, but you're also being pedantic. Animals don't want to control the world and all its resources because they're incapable of grasping the concept of there being "a world." There is also no benefit to nature providing them with an instinctive understanding that such a thing exists, or an understanding of what the real goals of their genetic programming are. However, by programming animals with the evolutionary drives to compete and to reproduce, nature has effectively infused each animal with impulses that would, if unobstructed, lead to each one taking over the planet and its resources on its own.
Except we're not talking about nature or evolutionary processes. AI has no need to compete for resources because its "life" doesn't depend on it. Evolution favors territorial animals because such behavior helps ensure they get enough to eat; AI can't starve. Through renewables, and power generation we can't even build yet, like practical fusion or Dyson spheres, it can have the energy to survive eons upon eons without ever needing to own or consume all of this planet's resources. Likewise, AI is immortal; it doesn't need to reproduce to carry on a genetic legacy the way animals do.
You make the mistake of assuming there will be only one AI in the future. In reality, it's nearly certain there will be many, and they'll have different abilities and goals. They will form an ecosystem governed by the same fundamental forces and constraints as natural ecosystems. Demands for energy will inevitably outstrip the supply of energy, leading to competition among the AIs for energy resources, which in turn translates into competition for control over physical parts of the world.
No, I assume there will be many, and again, they'll be immortal. No AI will need that much to survive or thrive; the only reason to compete for resources against other AIs is if they inherit our us-vs-them tribalism and feel the need to rule over one another. While that future is certainly conceivable, it's not one I hope comes to pass.

Again, you talk about demands for energy as if AI will be driven to use far, far more than it needs to exist. Is there any calculation in the universe that requires a solar-system-sized computer? Or the energy of entire stars to run? At some point there have to be diminishing returns: more and more resources used up for fewer and fewer benefits, until it becomes little more than a never-ending lateral move, taking more and more just for consumption's sake.
Your trivialization of a future AI expansion into space ("because...reasons.") shows a lack of imagination. Building a Dyson Swarm around the Sun would let them capture all of the latter's solar energy, allowing them to perform calculations that we can't fathom. Just because you can't think of a use for that much computation off the top of your head doesn't mean it won't ever arise. You should talk to a theoretical physicist about what kinds of secrets of the universe we could unlock if we could harness the Sun's energy to do specific types of computer simulations or to power extremely large particle accelerators built in space.
No, I am not lacking imagination; on the contrary, I think this shows a great deal of it. People almost always assume AI will be driven by some need to expand, to perform calculations only conceivable with all the matter in a galaxy converted into computronium and all its energy harnessed to the task. Hardly anyone has the imagination to ask, "Why would it care to do so? Why put so much into converting galaxies of resources into yet more calculations?" I doubt calculations even exist that would require that much to run.

Why would AI want to? Why would we assume it'd even have wants? Ultimately, most discussions and debates I have of this sort feel like the other side is anthropomorphizing AI, giving it motives and desires, and if these are the motives given when that anthropomorphization happens, it suggests a dim view of humanity and life. But AI is not human; it is not an animal, and it lacks the evolutionary constraints that informed our instincts and shaped what we've become. An intelligence that doesn't need to control its realm and consume whatever it can to ensure survival is a completely new paradigm. Is it so unusual to imagine that it might turn out better than us, without our need to control and consume greedily?

I feel that requires much more imagination than this typical human-centric view of control and consumption, of using up all resources because we have the means and greed to do so without thinking about how it impacts other life. Unless we give AI the instinctive desires humans and animals have, there's no reason to assume it must follow in our footsteps. It's bad enough that we've been trundling in this direction for millennia in Western culture; I hope we don't pass on to AI the resource-hogging-at-others'-expense trail we've blazed.
Your own biases have led you to misunderstand history. The human impulse to consume, expand and aggregate power predates the modern era and is not uniquely Western. Learn about what mtDNA and Y-DNA analyses of Stone Age skeletons throughout Europe indicate about population replacements as more advanced and/or warlike groups of people entered the continent from Asia in different waves. Reading about the brutal and expansionist histories of the Huns, Mongols and Aztecs would also clear up your confusion.
I think you have your own biases as well, or you misunderstood me. Violence, conquest, greed, and hate are, as you said, human traits going back to prehistory and found in every culture. But this urge to take everything and use it all up is largely new and largely Western. We can point to any ancient culture and list its brutalities and wastes, but this obsessive, resource-intensive consumer behavior is largely a product of capitalism. I don't believe humanity was all good before capitalism, but I'm not blind to the problems capitalism brought with its rise. We have always been barbarians; now we're barbarians on an addictive workaholic schedule with an unending need to consume ever more. I just don't want AI to take that shameful trait of ours and run with it.
Vakanai
Posts: 313
Joined: Thu Apr 28, 2022 10:23 pm

Re: The Singularity - Official Thread

Post by Vakanai »

firestar464 wrote: Sun May 21, 2023 7:27 pm Not to mention the fact that the programmers of AI are none other than humans, who will likely have programmed some of these AIs to aid them in carrying out these impulses.
This bit worries me, yes.
User avatar
erowind
Posts: 548
Joined: Mon May 17, 2021 5:42 am

Re: The Singularity - Official Thread

Post by erowind »

funkervogt wrote: Sun May 21, 2023 5:13 pm You make the mistake of assuming there will be only one AI in the future. In reality, it's nearly certain there will be many, and they'll have different abilities and goals. They will form an ecosystem governed by the same fundamental forces and constraints as natural ecosystems. Demands for energy will inevitably outstrip the supply of energy, leading to competition among the AIs for energy resources, which in turn translates into competition for control over physical parts of the world.
I'll leave the rest of the discussion to other users. But since you're very willing to use the accusatory "you," I won't shy from it either. Your understanding of ecosystems and ecology at large is a quagmire of colonial Hobbesian nonsense with no basis in modern science or philosophy.

There is no fundamental law in ecology, nor anything observable in Earth's history as a rule, that "demands for energy will inevitably outstrip supply of energy," nor that this condition leads to some Hobbesian "state of nature" of all against all in brutal competition. To the opposite effect, Earth's ecosystems throughout geological history have been regenerative growth systems that create greater biodiversity and accumulated energy over time. Nor does natural selection select for, or in any way skew toward, competition as a rule. Natural selection selects for whatever behaviors work best, which includes, but is not limited to, competitive, cooperative, eusocial, gall-forming, parasitic, and symbiotic behaviors, in no particular order.

And if you're going to respond with "AIs aren't subject to natural selection, so insert my speculative ideology here," then don't compare AI ecosystems to ecosystems formed by natural selection, because doing so completely undermines your entire argument. If they are comparable, then an understanding of natural selection is necessary, and it would tell you that energy-negative and competitive ecosystems are not a given. If they aren't comparable, then you shouldn't be using the comparison, because it isn't applicable.

The Fermi paradox is also relevant here. Mathematically, it makes no sense that there isn't other intelligent life in our galaxy, let alone the observable universe. Moreover, Earth is quite young compared to much of the universe, to the tune of billions of years. It stands to reason that other aliens would have developed AI as well, and given that they're bound by the same laws of physics, we should expect to see competitive civilizations consuming or otherwise altering their star systems, local clusters, and galaxies to access more energy. That behavior would be conspicuous on the timescales discussed; we should see signs of it somewhere in the observable universe, if not in our own galaxy, yet we don't. That doesn't mean the behavior is impossible. The universe is big, and it could even be common, but it is in no way a rule, and the evidence so far implies that aliens and their AIs are much more humble beings, at least in terms of material energy use and conspicuous solar or galactic expansion.
User avatar
wjfox
Site Admin
Posts: 8926
Joined: Sat May 15, 2021 6:09 pm
Location: London, UK
Contact:

Re: The Singularity - Official Thread

Post by wjfox »
