posted on Nov, 14 2017 @ 03:38 AM
The biggest problem (and I say this as someone who really only has one 'big picture' concern: the rise of AI) is that it's impossible to describe
just how bad things could get.
By definition the Singularity is indefinable, because we're not smart enough to speculate at the level of a hypothetical being with an intelligence
many millions of times greater than our own.
So no matter how much power or knowledge someone has, how can they describe it to, say, an assembly of CEOs, or any other group of people who could
really start working on some kind of solution, or at least a stopgap?
It's just too abstract.
It's sad to think that the end of humanity really may boil down to an absence of vision.
While thousands of things run through the global consciousness like political bickering or freaking celebrity sex scandals (seriously, when did it
become a matter of life and death what goes on in Hollywood?), the march of AI continues relatively unimpeded.
2017 will be a funny little footnote in the records that AI keep of their creators, if they elect to do so at all.