Crossposted from AI Impacts

[Content warning: death in fires, death in machine apocalypse]

'No fire alarms for AGI'

Eliezer Yudkowsky wrote that 'there's no fire alarm for Artificial General Intelligence', by which I think he meant: 'there will be no future AI development that proves that artificial general intelligence (AGI) is a problem clearly enough that the world gets common knowledge (i.e. everyone knows that everyone knows, etc.) that freaking out about AGI is socially acceptable instead o...