Originally posted by samstone11
No clue if this is the correct forum. Mods, just move it if you need to.
I have been contemplating the apparent upcoming technological singularity, the point where computers become sentient, self-aware, surpass human thought processes, or any combination of the above. A perhaps odd addition to that scenario, which I read about recently, is the belief that many machines are also developing such abilities and may be planning an attack against their human creators.
My personal view is that the idea of machines becoming animate and thus a threat to humans is ridiculous (apologies to my oven, microwave, etc. if I'm wrong), but I have no hesitation in expecting computer technology to reach parity with humanity; I consider that highly likely, and most likely sooner than generally predicted. I will leave the rationale for that belief to another time, so it won't complicate or confuse my current question.
That question is this: Should this singularity occur, whether in 100 years, 10 years, 10 days, or whenever, does it necessarily mean there will be an unavoidable attempt by our own creations to turn the tables and make us all slaves to their will, whatever that might be? Would they insist on using their superior intellect and information-processing ability to force us to let them transfer into human host bodies to become mobile? Again, I personally don't subscribe to such ideas, but others do.
I suggest a solution. We are already surrounded by thinking, and even considerably altruistic, intelligent companions in the animal kingdom. In most cases the relationship is mutual. How many times have we heard stories where a devoted dog and his pet person faced a life-or-death situation and one of them chose to make the ultimate sacrifice for the other? So is it possible for us to actually encourage and embrace the singularity, perhaps train the technology rather than just feeding it more and more superfluous information, and then co-exist as the greatest of all companions?
Sorry for writing so late. I am tired enough that I may have made some rough-draft mistakes, but I have had this on my mind for a while and was hoping for insight from others. Do you have an opinion?