Leading artificial intelligence executives, including OpenAI CEO Sam Altman, have signed a one-sentence statement warning that "mitigating the risk of extinction from AI should be a global priority," placing it alongside risks such as nuclear war and pandemics.
One of the so-called godfathers of artificial intelligence says governments need to move faster to regulate the rapidly advancing technology before it poses a greater threat to humanity.