search results matching tag: Nick Bostrom
Will AI make us immortal? Or will it wipe us out?
I'm currently reading "Superintelligence" by Nick Bostrom and it's pretty fear-inducing.
If we ever do hit the singularity... it doesn't really matter what we do.
First up, if an AGI is to be of any use, it needs to be at least as smart as us. Which means, by definition, it will be capable of writing its own AGI.
There are any number of nightmare scenarios where even a seemingly benevolent goal might be disastrous for humanity.
"Multivac, fix the environment!"
"Ok, what's the major issue with the environment?"
/wipes out humanity
"Multivac, solve the Riemann hypothesis!"
"Ok, but I'll need a lot of processing power"
/converts entire planet to a giant cpu
Even if, by some miracle, we end up developing a benevolent AI that understands our needs and is aligned with them, (e.g. the Culture Minds), we are no longer in control.
We might be along for the ride, but we're certainly not driving.
The other interesting thing is the question of scale. We tend to think of intelligence in human terms, like this:
infants..... children.....mentally impaired..Sarah Palin.....normal people.....Neil deGrasse Tyson.....Feynman, Einstein, Newton, etc.
But that ignores all the lower intelligences:
bacteria...Trump......insects.....sheep.....dogs...chimps, dolphins, etc............................... dumb human..smart human.
But from a superintelligence's point of view, the scale looks more like this:
bacteria..humanity...Tesla..................................................................................................................................................................................AI
Eliezer Yudkowsky - The Intelligence Explosion and Humanity
This is taken from The Singularity Summit symposium hosted by Stanford University, where a number of speakers gave keynote addresses on this topic. My goal was to post them all in the same place so people could easily find them, and I was in the process of doing just that, but due to the queue, Sunkid got to this one first. Here's the rest of the info I had already prepared to go along with it.
--------------------------------------------------
Eliezer Yudkowsky - The Human Importance of the Intelligence Explosion (Full Title)
The Singularity Summit symposium hosted by Stanford University was a series of keynote addresses given in an academic setting to address the very real implications the Singularity may hold in the near future, and (without being too melodramatic on my part) to question what the fate of the human species may be in the 21st century.
--------------------------------------------------
Here are the rest of the keynote videos that go along with this, in the order that they were given at the event.
Ray Kurzweil - The Singularity: A Hard or Soft Takeoff?
http://www.videosift.com/video/Ray-Kurzweil-The-Singularity-A-Hard-or-Soft-Takeoff
Douglas R. Hofstadter - Trying to Muse Rationally about the Singularity Scenario
http://www.videosift.com/video/Douglas-Hofstadter-Musing-Rationally-about-the-Singularity
Nick Bostrom - Artificial Intelligence and Existential Risks
http://www.videosift.com/video/Nick-Bostrom-Artificial-Intelligence-and-Existential-Risks
Sebastian Thrun - Toward Human-Level Intelligence in Autonomous Cars
http://www.videosift.com/video/Sebastian-Thrun-Human-Level-Intelligent-in-Autonomous-Cars
Cory Doctorow - Singularity or Dark Age?
http://www.videosift.com/video/Cory-Doctorow-Singularity-or-Dark-Age
K. Eric Drexler - Productive Nanosystems: Toward a Super-Exponential Threshold in Physical Technology
http://www.videosift.com/video/Eric-Drexler-Productive-Nanosystems
Max More - Cognitive and Emotional Singularities: Will Superintelligence come with Superwisdom?
http://www.videosift.com/video/Max-More-Will-Superintelligence-come-with-Superwisdom
Christine L. Peterson - Bringing Humanity and the Biosphere through the Singularity
http://www.videosift.com/video/Christine-Peterson-Humanity-Biosphere-the-Singularity
John Smart - Searching for the Big Picture: Systems Theories of Accelerating Change
http://www.videosift.com/video/John-Smart-Systems-Theories-of-Accelerating-Change
Eliezer Yudkowsky - The Human Importance of the Intelligence Explosion
http://www.videosift.com/video/Eliezer-Yudkowsky-The-Intelligence-Explosion-and-Humanity
Bill McKibben - Being Good Enough
http://www.videosift.com/video/Bill-McKibben-Being-Good-Enough
Ray Kurzweil - Stanford Singularity Summit: Closing Thoughts
http://www.videosift.com/video/Ray-Kurzweil-Stanford-Singularity-Summit-Closing-Thoughts