The tl;dr is that critical technology failures of the magnitude required to wipe humanity out for good, rather than merely temporarily inconvenience civilisation, are profoundly unlikely. Humanity wouldn't be made extinct by a nuclear war, for instance. Remember, to be a real doomsday scenario it has to be an extinction-level event. Even if we were knocked back to the Stone Age with only a few thousand survivors of whatever technological apocalypse we inflicted on ourselves, we'd be back eventually.
Juan Bell
>We understand the problem domain for general intelligence too

For the 11th time, no we don't. Stop making things up. Basically what you're saying is that if you make your program JUST right, it will behave intelligently. It's a dumb argument.
Wyatt Cooper
That's only in the long run. Yes, 100 years from now the world will be fucking awesome: everybody will have basic income and access to free healthcare while robots and AI systems do all the work. But in the short term, the government will be in denial, refusing to admit that the traditional "labor for money" system no longer works.
Julian Moore
Nature created general intelligence with nothing but time and brute force. I see no reason why we couldn't achieve the same given time and the scientific method.