1 Comment
Lodurr says...
I had a comment on this post that didn't survive the great database crash of 3/12/09.
I think it's wrong to assume that any software security system we can create will be able to contain an "intelligence explosion." Our understanding of those security systems will, by nature, be less than the understanding of the resulting super-intelligence. It's like opening Pandora's Box just a crack to peek inside.
The most powerful forces above Earth's crust have been evolution and intelligence, but intelligence has been limited by evolution, and evolution is limited by time. A singularity event would essentially remove time's limitation on the evolution of intelligence in a synthetic environment.
The other wrong assumption many people make is that the result of a singularity event will benefit us, or that the resulting super-intelligence will relate to us on any level. It might relate to us about as well as we relate to dust mites. It's naive to think our intelligence is so great that no future entity could be beyond our comprehension; that's the very definition of the singularity: that it is beyond our comprehension. Foolish optimism is pervasive among singularity enthusiasts. Why is it considered a wise move for humanity to risk its survival on an unknown?