[personal profile] ciphergoth
The technical term for a risk that could destroy the whole future of humanity is an "existential risk".

Wikipedia on existential risk

Nick Bostrom: "Our approach to existential risks cannot be one of trial-and-error. There is no opportunity to learn from errors."

Eliezer Yudkowsky, Cognitive biases potentially affecting judgment of global risks [PDF]

Google search

Do you worry about the end of the world?

Date: 2010-03-11 12:20 pm (UTC)
From: [identity profile] ciphergoth.livejournal.com
The chances that we're fucked are pretty high, but I don't think you can be confident that we won't survive it at all; there are just too many unknowns and variables to justify that level of certainty.

Date: 2010-03-11 06:14 pm (UTC)
From: [identity profile] damerell.livejournal.com
I think there are more fundamental systemic problems - endless exponential growth is going to keep getting us into these scrapes, and a technological society is going to keep on doing it; even if we stop doing it for now, such a world will always be vulnerable to a group that plays Defect and starts it up again.

Secondly, I don't see how (if there isn't a collapse) we can avoid the ability to blow up the world eventually becoming a hobbyist project.

I'm torn between this and "high intelligence as a sexual display characteristic" as answers to the Fermi paradox.
