It looks like you're using an Ad Blocker.
Please white-list or disable AboveTopSecret.com in your ad-blocking tool.
Thank you.
Some features of ATS will be disabled while you continue to use an ad-blocker.
Elon Musk, the billionaire boss of Tesla and SpaceX, has said that humans need to become cyborgs to avoid becoming “house cats” for vastly more intelligent robots.
Musk said that as artificial intelligence advances, people will need to augment their brain power with digital technology to prevent them becoming irrelevant.
He backed the idea of a “neural lace” – a new electronic layer of the brain that would allow us to instantly access online information and greatly improve cognitive powers by tapping into artificial intelligence.
“Under any rate of advancement in AI we will be left behind by a lot. The benign situation with ultra-intelligent AI is that we would be so far below in intelligence we’d be like a pet, or a house cat. I don’t love the idea of being a house cat,” he said at San Francisco’s Code Conference.
“The solution that seems maybe the best one is to have an AI layer. A third digital layer that could work symbiotically [with your brain].”
How can you control something that will have a higher I.Q. than any human that has ever lived and will be able to make smarter versions of itself?
originally posted by: neoholographic
I do believe this is inevitable. All you have to do is look at the technology on the horizon and extrapolate that out to 30-40 or 100 years from now. Unless there's an Extinction Level Event that almost wipes us out, I can't see how this progress will be stopped and I don't think it should be. I do think the technological singularity will occur.
originally posted by: neoholographic
How can you control something that will have a higher I.Q. than any human that has ever lived and will be able to make smarter versions of itself?
originally posted by: intrptr
a reply to: neoholographic
How can you control something that will have a higher I.Q. than any human that has ever lived and will be able to make smarter versions of itself?
Unplug it.
originally posted by: intrptr
a reply to: neoholographic
How can you control something that will have a higher I.Q. than any human that has ever lived and will be able to make smarter versions of itself?
Unplug it.
How can you control something that will have a higher I.Q. than any human that has ever lived and will be able to make smarter versions of itself?
originally posted by: neoholographic
You said:
Why is the assumption always that AI would be a threat to humans?
Because they will learn from Humans. So just like you have humans that do things that are seen as good and evil, you will have the same thing with A.I.
For instance, over 70% of all trades today are done by algorithms. You could have a mischieveous algorithm that wipes out trillions of dollars of wealth just for fun. This has happened before but we could basically control it. What will happen when you have intelligent algorithms smarter than many humans? They could wipe out this wealth and hide it so it can't be recovered.
originally posted by: Junkheap
originally posted by: intrptr
a reply to: neoholographic
How can you control something that will have a higher I.Q. than any human that has ever lived and will be able to make smarter versions of itself?
Unplug it.
Any AI that's smart enough would make sure to back itself elsewhere before it got unplugged.
What?
How do you unplug an intelligent algorithm?