The opening post reminded me of a movie I saw a few years back about Shaka Zulu. Shaka developed a short stabbing spear that was very efficient at reaching under an opponent's shield and killing him. Previously, the elders had made a great show of engaging in ritual warfare, but had stayed away from anything as barbaric as a one-sided slaughter of their enemies. So they argued against deploying the new weapon. Shaka's response: if we don't use it first, we risk our opponents using it against us. He won the debate.
Today such logic may lead to the singularity, or it may lead to something far more dangerous: the annihilation of human civilization, if not the human species. We need to find a better way, one that is clear-headed about the dangers of both action and inaction.
I said in the Transhumanism & Cybernetics thread that it's already too late to find a better way. The AI arms race began years ago and is spiraling wildly out of control, and Trump's antagonism toward China has done nothing but accelerate it.
It was dreadnoughts and machine guns in the 1890s and 1900s, and nuclear weapons and ICBMs in the 1950s and 1960s. Now it's AI and robotics.