Old 03-30-2021, 09:31 PM   #346
#-3
#1 Goaltender
 
Join Date: Mar 2008

Quote:
Originally Posted by FlamesAddiction View Post
Going off on a tangent
Cool, it's an enjoyable tangent. The AI apocalypse really doesn't directly weigh on the argument that simulation theory is unlikely because there is an energy/resolution problem.

I'm also a little skeptical about the AI apocalypse, because it's a little too simple, and reality is probably too path-dependent for it to hold up as a credible prediction. Even if it does happen to us, I don't think that means it had to happen to us. Given the downside risk of the AI apocalypse, though, I do think it is one "religion" where Pascal's wager currently applies. We should be thoughtful and prepare for it as best we can, understanding that no single action can prevent humanity as a whole from reaching the point where this risk is very much real. So we have to plan to survive the singularity if it is an existential risk, because there is no hope of avoiding it.