My random thoughts

Anything that doesn't quite fit in elsewhere...
Post by funkervogt »

Land mines usually contain just enough explosive to blow off a man's leg, not enough to kill him. This is deliberate: it is more damaging to the enemy to have to evacuate and then medically care for a crippled soldier than it is to bury a corpse in the field and move on.

This logic won't apply as well to combat robots, since, even with a missing leg, they'll be able to travel under their own power back to a rear area for repairs. Moreover, a lost leg won't be a lifelong impediment as it is for humans, since the robot will only need to have a replacement leg screwed in.

This means anti-infantry land mines will need more explosive to ensure that robots that step on them are obliterated, making repair impossible.
Post by funkervogt »

Posthumans could have the ability to reattach severed limbs by jamming them back into their stumps. If they had nanomachines in their blood, those machines would flow to the juncture, grab the edges of the stump and the limb, and pull them together. A billion of them working together could exert enough force.

It would be a temporary fix, though, and the posthuman would need to head to a hospital at some point for a proper rebuild.
Post by funkervogt »

A big advantage of being posthuman will be having an ultra-powerful, tough body. You won't need to fear animals of any kind. Even an elephant stepping on your skull would do nothing, and the most venomous snake's fangs would shatter on your skin.

You could go anywhere without fear.
Post by funkervogt »

If machines will someday be able to do everything humans can, then they'll be able to bring back local indie newspapers that went extinct ten years ago.
Post by Powers »

funkervogt wrote: Tue Apr 11, 2023 7:41 pm If machines will someday be able to do everything humans can, then they'll be able to bring back local indie newspapers that went extinct ten years ago.
A lot of things could come back.
Post by funkervogt »

I'm not endorsing Eliezer Yudkowsky's doomsday views about AI by posting this clip; rather, I'm highlighting a worthy point he made about the lack of a clear divide between "general" and "narrow" AI, as exemplified by GPT-4.

More generally, there is also not a clear divide between "intelligent" and "unintelligent" life forms. For example, looking back at our hominid ancestors, at what point in history did they become capable of "intelligent" thought?

https://en.wikipedia.org/wiki/Human_evolution

I don't think there's an answer. Looking at the archaeological evidence, it's clear that they slowly became smarter over several million years, gaining the ability to make increasingly complex types of tools, going from grunts to simple language to modern language, and moving from a very basic social organization, reminiscent of any other pack-animal species, to one with increasingly complex roles and rituals.

It could be argued that there are even higher levels of intelligence than most humans are biologically capable of possessing. Just listen to world-class mathematicians debate a theory in their field of expertise. The innate understanding of math needed simply to follow along is beyond what most humans are born with, and no amount of study could ever make up the deficiency. The mathematicians just have better brains that give them access to a type of cognition the rest of us lack.

"Intelligence" is a trait that exists on a continuum, and machines are likely to achieve human levels of it gradually, with there being no single moment we can look back on when they made the jump from unintelligent to intelligent. We can't pinpoint that moment in the evolutionary lineage of our own species, so why should it surprise us if the same is true for the artificial species of thinking machines we are creating?

And we should not consider human intelligence to be the standard for "general intelligence." We're just at one point on a continuum. Human geniuses who are not hobbled by any defects of mind (unlike, say, autistic savants) show us that there are points above us on the continuum, and that there is such a thing as "human general intelligence PLUS."
Post by funkervogt »

M3GAN shows that, thanks to their superior strength, reflexes and speed, even child-sized androids will be able to beat up and kill human adults.

And equipped with something like retractable, toxin-tipped stingers, a robot the size of a mouse could defeat a human bodybuilder.
Post by funkervogt »

When we create the first AGI and talk to it about the critical need for it to be aligned with human values, I hope it points out that we humans aren't even aligned with our own values. There's a trove of data showing how often our stated preferences and our revealed preferences differ, and of course we are self-destructive in countless ways.

Alignment with human values is also practically impossible since we humans can't agree on what our values are. Sure, there are basic, high-level things 99% of us agree on (our species should not be exterminated, diseases should be eliminated), but mostly there is disagreement (to what extent free speech should be protected, how much democracy a government should have).
Post by Powers »

funkervogt wrote: Fri Apr 14, 2023 2:52 pm Sure, there are basic, high-level things 99% of us agree on (our species should not be exterminated, diseases should be eliminated)
I was going to say "doubtful," but then I remembered that 1% of 8 billion is around 80,000,000 people.
Post by funkervogt »

If humans can upload themselves into computers, can AIs upload themselves into humans and animals?