"Death with Dignity" Elizer Yudkowsky and goal setting

Ozzie guy
Posts: 487
Joined: Sun May 16, 2021 4:40 pm

"Death with Dignity" Elizer Yudkowsky and goal setting

Post by Ozzie guy »

Eliezer Yudkowsky is convinced everyone is going to die due to AI. This led him to the eye-catching strategy of "Death with Dignity", which can be applied not just to AI alignment but to your own life.


If something you care greatly about is seemingly impossible, or at least extremely unlikely (I would add: or if the outcome is mostly determined regardless of your actions), you're likely to give up, or at least not make a serious effort to improve the odds of getting what you want.

What are the solutions?

Change your psychology?
No! This in itself is very hard or impossible.

Change your beliefs?
No! Humans are very good at hope and wishful thinking, but you would just be lying to yourself, corrupting your decision-making on that goal and letting inaccuracies leak into other goals.

Change your goal? Yes!
Changing a goal can be nominally corrupting, but it is the easiest option.

You change your goal from "I will achieve X" to "I will maximize the odds of X being achieved."
Instead of focusing on binary victory or defeat, you can focus on how each action increases or decreases the odds of getting what you want.

Let's say you're a 20-year-old living in 1960 and you really want to live forever, something seemingly ridiculous and impossible.
You can give up and die by 2020, or you can go down with slightly more dignity (dignity being how much you shifted your odds) and maybe even squeeze out a win after successfully pushing your life expectancy to 2030.
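
As a toy illustration of the reframing (my own sketch in Python, not anything from the linked post, though Yudkowsky does talk about dignity in terms of log-odds): score "dignity" as the shift in log-odds of the outcome you care about, so an action that nudges a 0.1% chance up to 0.5% registers as real progress even though failure stays near-certain. The dignity_gained helper is made up, just for this sketch.

import math

def log_odds(p: float) -> float:
    # Convert a probability to log-odds; defined for 0 < p < 1.
    return math.log(p / (1 - p))

def dignity_gained(p_before: float, p_after: float) -> float:
    # "Dignity" scored as the shift in log-odds of the desired outcome
    # (a hypothetical metric for this illustration). Positive means
    # your actions improved the odds, even if success stays unlikely.
    return log_odds(p_after) - log_odds(p_before)

# Nudging a 0.1% chance of success up to 0.5% never makes victory
# likely, but it is a measurable shift of about +1.61 in log-odds.
print(dignity_gained(0.001, 0.005))

The nice property of scoring in log-odds is that progress is additive and never rounds to zero, which is exactly what the "maximize the odds" goal rewards and the binary "achieve X" goal ignores.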

When it comes to AI safety, this is what Eliezer advocates (not a quote, but how I interpret his message): "Hey, we are pretty much certainly all going to die, and I won't lie about this, but let's go down with slightly more dignity by at least putting in a really good effort and increasing the odds that we squeeze out humanity's survival."

If there is something you really care about that you think is impossible, don't give up: fail with dignity.

https://www.lesswrong.com/posts/j9Q8bRm ... y-strategy
Powers
Posts: 732
Joined: Fri Apr 07, 2023 7:32 pm

Re: "Death with Dignity" Elizer Yudkowsky and goal setting

Post by Powers »


Religious fanaticism...
erowind
Posts: 548
Joined: Mon May 17, 2021 5:42 am

Re: "Death with Dignity" Elizer Yudkowsky and goal setting

Post by erowind »

[image]

Every year Eliezer Yudkowsky just reminds me more and more of this.

I agree on not giving up and pushing forward on things that matter, btw. I just think that Eliezer is crazy and is slowly building a death cult.
Cyber_Rebel
Posts: 331
Joined: Sat Aug 14, 2021 10:59 pm
Location: New Dystopios

Re: "Death with Dignity" Elizer Yudkowsky and goal setting

Post by Cyber_Rebel »

Truly does sound like a death cult, and his rhetoric is going to end up inspiring someone to do something really stupid. I never have and never will understand the logic of "the AI is absolutely going to kill us all."

As if that's the only outcome out of the many possibilities a superintelligence could come up with. It speaks to our own ignorance more than anything else. Imagine a chimpanzee thinking (if it could think to such a degree) that we wish to kill its troop and mate with its females simply because we possess the capability to, rather than considering all the other higher-order problems we tend to be dealing with. And we're talking about a potential difference in intelligence far greater than in that example.

Terminator 2: Judgment Day was a good film, a classic even, but I find both Ex Machina and Her more nuanced in how they imagine a potential AGI. Even Westworld, which did depict an AGI uprising, was much more nuanced and well balanced.
funkervogt
Posts: 1178
Joined: Mon May 17, 2021 3:03 pm

Re: "Death with Dignity" Elizer Yudkowsky and goal setting

Post by funkervogt »

He really needs to chill out.
Jakob
Posts: 110
Joined: Sun May 16, 2021 6:12 pm

Re: "Death with Dignity" Elizer Yudkowsky and goal setting

Post by Jakob »

Cyber_Rebel wrote: Tue Sep 05, 2023 5:34 pm Truly does sound like a death cult, and his rhetoric is going to end up inspiring someone to do something really stupid. I never have and never will understand the logic of "the AI is absolutely going to kill us all."

As if that's the only outcome out of the many possibilities a superintelligence could come up with. It speaks to our own ignorance more than anything else. Imagine a chimpanzee thinking (if it could think to such a degree) that we wish to kill its troop and mate with its females simply because we possess the capability to, rather than considering all the other higher-order problems we tend to be dealing with. And we're talking about a potential difference in intelligence far greater than in that example.

Terminator 2: Judgment Day was a good film, a classic even, but I find both Ex Machina and Her more nuanced in how they imagine a potential AGI. Even Westworld, which did depict an AGI uprising, was much more nuanced and well balanced.
Well, human activity does pose a threat to chimpanzees, from loggers and miners destroying their natural habitats, to scientists kidnapping them and sticking them in cages, to poachers hunting them for meat, even if there is no organized effort to exterminate them. It's very plausible that superintelligent AIs will kill humans or damage our natural habitat as a by-product of their industrial activities.