[personal profile] ciphergoth
The technical term for a risk that could destroy the whole future of humanity is an "existential risk".

Wikipedia on existential risk

Nick Bostrom: "Our approach to existential risks cannot be one of trial-and-error. There is no opportunity to learn from errors."

Eliezer Yudkowsky, Cognitive biases potentially affecting judgment of global risks [PDF]

Google search

Do you worry about the end of the world?

Date: 2010-03-09 05:00 pm (UTC)
From: [identity profile] purplerabbits.livejournal.com
I assume that people will become extinct at some point, and it doesn't unduly bother me, though part of me thinks it would be cool if we made it off the planet by then. I try only to concern myself with risks that are caused by humans or that we stand a reasonable chance of preventing, so I think it's worth putting some research into how to redirect an asteroid, and I don't think it's a good idea to develop technologies that could wipe out a significant proportion of humans in a war, for instance.

On the other hand I am not even a teeny bit worried about the large hadron collider.

Date: 2010-03-09 05:06 pm (UTC)
From: [identity profile] valkyriekaren.livejournal.com
I don't worry about the end of the world (or at least human life as we know it) much. It's very unlikely that it will happen in my lifetime, and if it does it will probably be because of a factor I have no way to affect - a meteor strike, a pandemic, nuclear holocaust, something like that.

Date: 2010-03-09 09:36 pm (UTC)
From: [identity profile] strangerover.livejournal.com
Ragnarok!

and the irony that my LJ icon is the 'doomsday clock' from the Cold War era...

In the ending of worlds, should one wonder whether it will be instantaneous or slow and lingering? (A pain if one is engrossed in something at the time, also.)

Date: 2010-03-09 09:54 pm (UTC)
From: [identity profile] drdoug.livejournal.com
No. And certainly not at the moment. I have other, more available challenges that currently occupy my time, although they are down at the personal level of impact rather than the species level.

I do think it's worth spending more than we do on tracking near-Earth objects. (There are other likely spin-off benefits on top of the enhanced ability to spot potential collisions.) I don't think it's worth spending much more than we do on imagining ways in which we might all suddenly be wiped out. Schneier puts this well when he talks about movie-plot "terrorist" threats increasing their cognitive availability, and hence perceived risk, out of all proportion to their actual risk.

Great writing in that paper, though - "All else being equal, not many people would prefer to destroy the world." is a cracking opener, and it's hard to argue with points like "Risks of human extinction may tend to be underestimated since, obviously, humanity has never yet encountered an extinction event".

It's International Year of Biodiversity, so I can't help but note that we have not encountered a human extinction event, but we have, however, seen the evidence for the extinction of most of the species that have ever existed, and are seeing the evidence of (possibly) the most dramatic extinction event in Earth's history, from the ringside seat afforded to the perpetrator. I do sometimes worry a bit about that.

Date: 2010-03-09 10:43 pm (UTC)
From: [personal profile] zz
this.

Date: 2010-03-10 01:18 pm (UTC)
From: [identity profile] emanix.livejournal.com
http://www.hasthelargehadroncolliderdestroyedtheworldyet.com/

Date: 2010-03-10 01:19 pm (UTC)
From: [identity profile] emanix.livejournal.com
(Note: if you're geeky, it's worth checking out the source code too.)

Date: 2010-03-10 01:22 pm (UTC)
From: [identity profile] emanix.livejournal.com
I quite like being alive, so I'm not planning on anything that precipitates the end of it anytime soon. On the other hand if it happens, be it end of world or just end of me, I'm either not going to be around to regret it, or I'm going to be vaguely pleased that there's some sort of afterlife. Either way I don't see much point worrying about it.

Date: 2010-03-10 01:29 pm (UTC)
From: [identity profile] ciphergoth.livejournal.com
From "Cognitive biases potentially affecting judgment of global risks", linked above:
There is a saying in heuristics and biases that people do not evaluate events, but descriptions of events - what is called non-extensional reasoning. The extension of humanity's extinction includes the death of yourself, of your friends, of your family, of your loved ones, of your city, of your country, of your political fellows. Yet people who would take great offense at a proposal to wipe the country of Britain from the map, to kill every member of the Democratic Party in the U.S., to turn the city of Paris to glass - who would feel still greater horror on hearing the doctor say that their child had cancer - these people will discuss the extinction of humanity with perfect calm. "Extinction of humanity", as words on paper, appears in fictional novels, or is discussed in philosophy books - it belongs to a different context than the Spanish flu. We evaluate descriptions of events, not extensions of events. The cliché phrase end of the world invokes the magisterium of myth and dream, of prophecy and apocalypse, of novels and movies. The challenge of existential risks to rationality is that, the catastrophes being so huge, people snap into a different mode of thinking. Human deaths are suddenly no longer bad, and detailed predictions suddenly no longer require any expertise, and whether the story is told with a happy ending or a sad ending is a matter of personal taste in stories.

Date: 2010-03-10 03:40 pm (UTC)
From: [identity profile] emanix.livejournal.com
The problem, I think, isn't one of comprehension. I'm well aware that the extinction of six billion-plus people is the extinction of six billion individuals, including myself and all my loved ones. However, speaking in terms of totality, it is, in fact, a different mode of thinking.
What concerns me is suffering, not the binary state of existence or non-existence. So where you are talking about the total annihilation of the human race, you're talking about a state in which nobody is suffering, because they don't exist.

Death in itself doesn't concern me in the slightest. I'd rather it didn't happen, so I take precautions, but I don't *worry* about it. But I worry about losing my faculties and/or being in pain over extended periods - in other words, being alive and suffering. In the same way, talk about an event that could cause a partial extinction where there are survivors left to suffer, and I'll tell you it's worth worrying about. Tell me about an event which may or may not happen, that will simply make us all *poof* gone, and I'm not going to see the point in being worried.

It's worth taking precautions - it's kind of nice existing - but at the end of the day, if the human race just disappears there's going to be nobody left to get upset about it, is there?

Date: 2010-03-10 03:43 pm (UTC)
From: [identity profile] emanix.livejournal.com
Hm, my markup sucks, but I don't currently have the option to edit comments, so it will have to stand. Hopefully it's still readable!

Date: 2010-03-10 03:47 pm (UTC)
From: [identity profile] ciphergoth.livejournal.com
You're testing your new particle beam system on an island thought to be uninhabited. However, just as you set it off, you discover that there are hundreds of people living there who have no contact with the outside world, and the beam kills half of them.

You realise that this will cause great suffering in the other half, but fortunately you can avoid that by setting it off again and killing the other half, saving the day!

If this reasoning doesn't appeal to you, there are problems with your axiology.

Date: 2010-03-10 03:48 pm (UTC)
From: [identity profile] ciphergoth.livejournal.com
(Note that the deaths are instantaneous and painless. Furthermore you act so swiftly that one half has no time to notice the deaths of the others before their own deaths occur)

Date: 2010-03-10 04:48 pm (UTC)
From: [identity profile] emanix.livejournal.com
LOL can I have one of those systems, please?

I get what you're saying here, but I don't think we're actually disagreeing. The question you asked, after all, was 'do you worry about the end of the world?' - which I translate to mean 'do I fear it?' and no, really, I don't. Do I treasure life, and think it's worth holding on to? Yes I do.

So no, I wouldn't destroy the other half of the island, unless the people themselves asked me to. I do think that existence has some value, and that conscious beings have a right to personal sovereignty.
The points I made above were speaking from the point of view of one of the islanders, not from the point of view of the person wielding a particle beam.

My point was that when you're talking about a state in which nobody exists to suffer, it is harder to have an emotional response to it. The only way to really connect to that is through imagining that there are a few people left to struggle with the loss - I note that even your example requires at least one outside observer to suffer with the knowledge of having destroyed all these people. There's an entire genre of 'post-apocalyptic' films and books which would be distinctly hard to get into if there weren't a few survivors (though having thought of it, I'm kind of tempted to try making one).

The end of human life would be a waste, and it would be great to stop it from happening, it's just not something I'm ever going to have nightmares about - but I might have nightmares about surviving it.

Date: 2010-03-10 07:39 pm (UTC)
From: [identity profile] ciphergoth.livejournal.com
"I note that even your example requires at least one outside observer to suffer with the knowledge of having destroyed all these people"

So if you knew that your own death would very shortly follow no matter which you chose, then you'd kill all the islanders? Your values are rather solipsistic, it seems!

Date: 2010-03-11 12:04 am (UTC)
From: [identity profile] emanix.livejournal.com
My, what a lovely straw man you've built there!

No, not solipsistic. I already stated quite clearly that I value life and individual choice, and wouldn't point the 'death ray' unless I was specifically asked to by the individuals affected.

If I was offered the choice between saving myself and saving said islanders, who presumably outnumber me and have no say in the matter, I'd pick the latter, since as I said, I don't fear not existing - but that wasn't the choice presented. The choice presented was to kill them, or not kill them. I chose not to.

The comment which you've gleefully misinterpreted was about the difficulty of coming up with a situation that people can relate to where there is no survivor or outside observer. Talk about a world where, say, half of the population is killed off - as you did - and it's easier to see it as a bad thing.

Date: 2010-03-11 10:49 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
I haven't proposed tradeoffs between your life and theirs anywhere; that's irrelevant to what we're discussing.

You speak of the need for an outside observer, but if you and all the islanders die, there is no outside observer, and you still see that outcome as bad - in fact worse than the death of yourself and half the islanders. Your position still seems rather solipsistic to me, but it's not consistent enough to draw out what the consequences should be.

You shouldn't need me to "close the doors" on the ways you try to escape the consequences of this thought experiment - you should be able to think of variants that counter your evasions, like the one in which you inevitably die, for yourself.

Date: 2010-03-11 11:45 am (UTC)
From: [identity profile] emanix.livejournal.com
"I haven't proposed tradeoffs between your life and theirs anywhere; that's irrelevant to what we're discussing."

Irrelevant to the original topic, sure. I used it as an example of how my world view isn't solipsistic. And I noted that you hadn't proposed it at the time - do keep up! You've ignored the fact that I did answer the question several times, as well.

I have tried to make two points here: firstly, that seeing an outcome as bad is not the same as being scared of it; and, following on from that, that suffering is more 'scary' than non-existence. I haven't made any value judgements about whether it's more or less 'bad' (of course, rationally, death is 'worse', as it's an irreversible* state as opposed to a potentially reversible one).
Your thought experiment, however, still missed my point. Both are bad, but suffering is simpler to imagine.
I *can't* have nightmares about not existing - after all, I'd remember precisely nothing. The only way to visualise it would be in terms of the effect on people other than myself, or in terms of - oh hey - suffering in the process of moving into that state.

Would I choose death over suffering myself? Maybe, if I thought that the suffering itself was also an irretrievable state. Do I think I have the right to make that choice for anyone else? No.

Let's really simplify this then:
death = bad
bad != scary

Since this is your journal and not mine, I'm not going to illustrate my entire philosophy here. Maybe I'll get around to it in mine.

*Yeah, I know, cryonics etc. etc. Call it 'less reversible' if you prefer.

Date: 2010-03-11 12:04 pm (UTC)
From: [identity profile] ciphergoth.livejournal.com
I thought about adding it at the time, but that game quickly gets boring. If you're posed with, for example, a trolley problem, it's just dull to try to evade the parameters of the thought experiment by saying "I leap aboard the trolley and bring it to a halt with a stick" - it's dull to make your counterpart have to specify all that stuff, like that you're too far from the trolley to do such a thing in time. You should make an effort to understand the import of the thought experiment, and not posit objections when you can see for yourself how you might close the doors on those objections.

It's good to see that you recognise that the end of humanity is bad; I hope it leads you to attempt to deliberately correct for the failure of imagination you describe in trying to weight it in your personal moral calculus.

(BTW cryonics people agree that death is irreversible, they just don't agree that legal death is death by that definition)

Date: 2010-03-11 12:10 pm (UTC)
From: [identity profile] damerell.livejournal.com
Nah, I think we're pretty definitely fucked, so why worry?

Date: 2010-03-11 12:20 pm (UTC)
From: [identity profile] ciphergoth.livejournal.com
The chances that we're fucked are pretty high, but I don't think you can be that confident we won't survive it at all; there are just too many unknowns and variables for that level of confidence.

Date: 2010-03-11 12:40 pm (UTC)
From: [identity profile] emanix.livejournal.com
*Sigh* I just re-read the whole comment thread; it would probably have been far quicker if I'd just pointed at purplerabbits's comment (http://ciphergoth.livejournal.com/355932.html?thread=3719260#t3719260) and said 'what she said'.

Import of thought experiment fully understood, but it was still irrelevant to what I was trying to say. I'd love to hear how you manage to imagine not existing, though.

Also, I don't 'recognise' that the end of humanity is bad, since 'bad' is a value judgement and not a statement of fact. I don't believe there is any universal truth that says death = bad, but in my personal opinion, it is.

"(BTW cryonics people agree that death is irreversible, they just don't agree that legal death is death by that definition)"
Useful to know. Thank you.

Date: 2010-03-11 06:14 pm (UTC)
From: [identity profile] damerell.livejournal.com
I think there are more fundamental systematic problems - endless exponential growth is going to keep getting us into these scrapes and a technological society is going to keep on doing it; even if we stop doing it for now, such a world will always be vulnerable to a group that plays defect and starts doing it.

Secondly, I don't see how (if there isn't a collapse) we can avoid 'blowing up the world' eventually becoming a hobbyist project.

I'm torn between this and "high intelligence as sexual display characteristic" as answers to the Fermi paradox.
