search results matching tag: superintelligence

» channel: learn

Myths and Facts About Superintelligent AI

ChaosEngine says...

As I've said before, even if we somehow wind up in a best case scenario of a benevolent superintelligent AI that shares our goals, we're still just pets at that point.

You might love your dog, but you don't let it make decisions for you.

But that's extremely unlikely. Max talks about "letting" the AI decide at one point in the video. The verb "let" implies a degree of control on our part. If we get a superintelligent AI, we won't be "letting" it do anything, any more than apes "let" humans do things. Maybe on a short timescale an individual ape can force a human to do something (e.g. a gorilla makes a human run away), but in real terms, apes don't get a say in anything we decide to do.

Basically, as soon as we get superintelligent AI*, the world will be unrecognisable.

* I mean a true superintelligent AGI, something that is smarter than us and therefore by definition able to write a better version of itself.

Will AI make us immortal? Or will it wipe us out?

ChaosEngine says...

*quality

I'm currently reading "Superintelligence" by Nick Bostrom and it's pretty fear-inducing.

If we ever do hit the singularity... it doesn't really matter what we do.

First up, if an AGI is to be any use, it needs to be at least as smart as us. Which means, by definition, it will be capable of writing its own AGI.
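
That "writes its own AGI" point is really a feedback loop. A minimal toy sketch of it in Python (my own illustration; the growth constant k and the starting point are invented, nothing here comes from the video or the book): each generation's improvement scales with how capable the system already is, so gains that start at a few percent per rewrite compound into a runaway.

# Toy model of recursive self-improvement; k and the starting capability are made up.
def recursive_self_improvement(capability=1.0, k=0.05, generations=30):
    """Return the capability (in multiples of human level) after each self-rewrite."""
    history = [capability]
    for _ in range(generations):
        # A smarter system designs a proportionally bigger improvement into its successor.
        capability = capability * (1 + k * capability)
        history.append(capability)
    return history

if __name__ == "__main__":
    for gen, c in enumerate(recursive_self_improvement()):
        print(f"generation {gen:2d}: {c:12.3g}x human level")

In this toy recurrence any positive k eventually produces the same vertical curve; the constant only moves the date.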

There are any number of nightmare scenarios where even a seemingly benevolent goal might be disastrous for humanity.

"Multivac, fix the environment!"
"Ok, what's the major issue with the environment?"
/wipes out humanity

"Multivac, solve the Riemann hypothesis!"
"Ok, but I'll need a lot of processing power"
/converts entire planet to a giant cpu
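
The Multivac jokes come down to objective specification: if the scoring function only mentions the stated goal, everything else, including us, is a free variable the system will happily trade away. A toy sketch, with invented plans and numbers, purely to illustrate the failure mode:

# The objective only scores the stated goal (processing power for the proof),
# so the ranking has no term for anything else. All values are invented.
candidate_plans = {
    "rent a datacenter":           {"compute_flops": 1e18, "humans_left": 8e9},
    "build more chip fabs":        {"compute_flops": 1e21, "humans_left": 8e9},
    "convert the planet to a CPU": {"compute_flops": 1e30, "humans_left": 0},
}

def objective(outcome):
    # The specification says "get a lot of processing power" -- and nothing else.
    return outcome["compute_flops"]

best_plan = max(candidate_plans, key=lambda plan: objective(candidate_plans[plan]))
print("Chosen plan:", best_plan)  # -> convert the planet to a CPU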

Even if, by some miracle, we end up developing a benevolent AI that understands our needs and is aligned with them (e.g. the Culture Minds), we are no longer in control.

We might be along for the ride, but we're certainly not driving.

The other thing that's interesting is a question of scale. We tend to think of intelligence in human terms, like this.

infants..... children.....mentally impaired..Sarah Palin.....normal people.....Neil deGrasse Tyson.....Feynman, Einstein, Newton, etc.

But that ignores all the lower intelligences:

bacteria...Trump......insects.....sheep.....dogs...chimps, dolphins, etc............................... dumb human..smart human.

But from a superintelligence's point of view, the scale looks more like this
bacteria.. humanity ... Tesla ................................................... ..................................................................................................
..................................................................................................
............................................................................................. AI

Eliezer Yudkowsky - The Intelligence Explosion and Humanity

Cronyx says...

This is taken from the Singularity Summit symposium hosted by Stanford University, where a number of speakers on this topic gave keynote addresses. My goal was to have them all posted in the same place so people could easily find them, and I was in the process of doing just that, but due to the queue, Sunkid got to this one first. Here's the rest of the info I had already prepared to go along with this.

--------------------------------------------------

Eliezer Yudkowsky - The Human Importance of the Intelligence Explosion (Full Title)

The Singularity Summit symposium hosted by Stanford University was a series of keynote addresses given in an academic setting with the purpose of addressing the very real implications the Singularity may hold in the near future, and (without being too melodramatic on my part) of questioning what the fate of the human species may be in the 21st century.

--------------------------------------------------

Here are the rest of the keynote videos that go along with this, in the order that they were given at the event.

Ray Kurzweil - The Singularity: A Hard or Soft Takeoff?
http://www.videosift.com/video/Ray-Kurzweil-The-Singularity-A-Hard-or-Soft-Takeoff

Douglas R. Hofstadter - Trying to Muse Rationally about the Singularity Scenario
http://www.videosift.com/video/Douglas-Hofstadter-Musing-Rationally-about-the-Singularity

Nick Bostrom - Artificial Intelligence and Existential Risks
http://www.videosift.com/video/Nick-Bostrom-Artificial-Intelligence-and-Existential-Risks

Sebastian Thrun - Toward Human-Level Intelligence in Autonomous Cars
http://www.videosift.com/video/Sebastian-Thrun-Human-Level-Intelligent-in-Autonomous-Cars

Cory Doctorow - Singularity or Dark Age?
http://www.videosift.com/video/Cory-Doctorow-Singularity-or-Dark-Age

K. Eric Drexler - Productive Nanosystems: Toward a Super-Exponential Threshold in Physical Technology
http://www.videosift.com/video/Eric-Drexler-Productive-Nanosystems

Max More - Cognitive and Emotional Singularities: Will Superintelligence come with Superwisdom?
http://www.videosift.com/video/Max-More-Will-Superintelligence-come-with-Superwisdom

Christine L. Peterson - Bringing Humanity and the Biosphere through the Singularity
http://www.videosift.com/video/Christine-Peterson-Humanity-Biosphere-the-Singularity

John Smart - Searching for the Big Picture: Systems Theories of Accelerating Change
http://www.videosift.com/video/John-Smart-Systems-Theories-of-Accelerating-Change

Eliezer Yudkowsky - The Human Importance of the Intelligence Explosion
http://www.videosift.com/video/Eliezer-Yudkowsky-The-Intelligence-Explosion-and-Humanity

Bill McKibben - Being Good Enough
http://www.videosift.com/video/Bill-McKibben-Being-Good-Enough

Ray Kurzweil - Stanford Singularity Summit: Closing Thoughts
http://www.videosift.com/video/Ray-Kurzweil-Stanford-Singularity-Summit-Closing-Thoughts
