[personal profile] ciphergoth
The technical term for a risk that could destroy the whole future of humanity is an "existential risk".

Wikipedia on existential risk

Nick Bostrom: "Our approach to existential risks cannot be one of trial-and-error. There is no opportunity to learn from errors."

Eliezer Yudkowsky, Cognitive biases potentially affecting judgment of global risks [PDF]

Google search

Do you worry about the end of the world?

Date: 2010-03-09 05:00 pm (UTC)
From: [identity profile] purplerabbits.livejournal.com
I assume that people will become extinct at some point, and it doesn't unduly bother me, though part of me thinks it would be cool if we made it off the planet by then. I try only to concern myself with risks that are caused by humans or that we stand a reasonable chance of preventing. So I think it's worth putting some research into how to redirect an asteroid, and I don't think it's a good idea to develop technologies that could wipe out a significant proportion of humans in a war, for instance.

On the other hand I am not even a teeny bit worried about the large hadron collider.

Date: 2010-03-10 01:18 pm (UTC)
From: [identity profile] emanix.livejournal.com
http://www.hasthelargehadroncolliderdestroyedtheworldyet.com/

Date: 2010-03-10 01:19 pm (UTC)
From: [identity profile] emanix.livejournal.com
(Note: if you're geeky, the source code is worth checking out too.)
