search results matching tag: supercomputer

» channel: learn

Search took 0.000 seconds

    Videos (42)     Sift Talk (1)     Blogs (6)     Comments (71)   

Robot Violinist

cybrbeast says...

I'm no expert, but to me that didn't sound like a very well-played violin. But after another few generations we might have a perfectly playing robot violinist. We could make a robot orchestra. Would people want to see this after the novelty has worn off? Or will they have a supercomputer conductor robot who plays with different and innovative conducting styles?

Capitalism Hits The Fan

Psychologic says...

>> ^jwray:
About supercomputers already exceeding the power of the human brain -- I call bullshit and do the math to back it up.
10^11 neurons * 10^4 connections per neuron (average) * 1 floating point operation per switching event per connection * 1000 switching events per second = 10^18 flops. The most powerful supercomputer in the world does 10^15 flops, and is probably too large to have millisecond message passing from point to point.
http://www.top500.org/system/performance/9707


Your neuron count seems accurate, but the estimates I've seen put connections per neuron at 2,000-5,000, half or less of the figure you're using. That's aside from the fact that not every neuron in the human brain is dedicated to intelligence (less than 85% by weight; not sure about neuron count).

That still puts it over the output of modern supercomputers, but that is also assuming that the human brain runs at full capacity 24/7. Brain power is affected by mood, nutrition, and the need for sleep... computers are not. Brain calculation estimates are also very rough... we're still figuring out what is going on in there. How many of those connections per neuron are actually used? No one really knows how many computations are accomplished by the human brain in a given day.
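The adjusted estimate can be sketched in a few lines of Python. All inputs are the rough order-of-magnitude figures from this thread (neuron count, switching rate, one flop per switching event per connection), not measurements; the point is just that even the low end of the connection range stays well above the cited 10^15-flops supercomputer figure.

```python
# Back-of-the-envelope brain "flops" estimate using the lower
# connections-per-neuron figures (2,000-5,000) alongside the
# original 10^4. All numbers are rough guesses from the thread.
NEURONS = 1e11            # ~10^11 neurons
SWITCHES_PER_SEC = 1e3    # ~1,000 switching events per second
FLOP_PER_SWITCH = 1.0     # assume 1 flop per switching event per connection

for connections in (2_000, 5_000, 10_000):
    flops = NEURONS * connections * FLOP_PER_SWITCH * SWITCHES_PER_SEC
    print(f"{connections:>6} connections/neuron -> {flops:.0e} flops")
```

Even at 2,000 connections per neuron the estimate is 2x10^17 flops, roughly 200 times the 10^15 flops cited for the fastest supercomputer of the day.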

But I will concede that we cannot currently claim that the fastest supercomputer on Earth is more powerful than the human brain. Kurzweil predicted that it would happen sometime in 2010 (not that it matters). Whether we actually build one at any given time will be mostly a function of whether we feel the need to spend the money on it, but we are still improving the processes needed to build such a machine even if we don't do so immediately.

Like you said, the software is the important part, and that depends on a lot more than what hardware is available at the time.

Capitalism Hits The Fan

jwray says...

About supercomputers already exceeding the power of the human brain -- I call bullshit and do the math to back it up.

10^11 neurons * 10^4 connections per neuron (average) * 1 floating point operation per switching event per connection * 1000 switching events per second = 10^18 flops. The most powerful supercomputer in the world does 10^15 flops, and is probably too large to have millisecond message passing from point to point.

http://www.top500.org/system/performance/9707
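The estimate in the comment above can be checked with a short Python sketch. The inputs are the comment's own rough figures (10^11 neurons, 10^4 connections per neuron, 1 flop per switching event, 1,000 switching events per second), not established measurements:

```python
# jwray's brain-simulation estimate, spelled out term by term.
neurons = 1e11              # 10^11 neurons
connections = 1e4           # 10^4 connections per neuron (average)
flop_per_switch = 1.0       # 1 flop per switching event per connection
switches_per_sec = 1e3      # 1,000 switching events per second

brain_flops = neurons * connections * flop_per_switch * switches_per_sec
supercomputer_flops = 1e15  # the top500 figure cited in the comment

print(f"brain estimate: {brain_flops:.0e} flops")              # 1e+18
print(f"gap: {brain_flops / supercomputer_flops:.0f}x")        # 1000x
```

By this arithmetic the brain estimate exceeds the cited supercomputer by a factor of 1,000, which is the basis of the "call bullshit" above.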

Capitalism Hits The Fan

Psychologic says...

>> ^chilaxe:
Creating human-level AI isn't an issue of processing power.
We've had for some time supercomputers that already possess greater processing power than the human brain, and that doesn't mean that they do anything more than crunch lots of numbers.
Human-level AI will be about organizing that processing power in a way that's similar to the human brain's functionality.
Regenerative medicine and reprogenetics are much closer on the horizon, and have just as much potential to transform society.


I agree with everything you said. =)

As far as regenerative medicine, that will be an interesting path. We are on the verge of "curing" the aging process. That doesn't mean by 2020, but certainly within our lifetimes.

That brings entirely new situations involving ideas like Social Security and the entire idea of "retirement", not to mention population growth and resource availability.

I totally agree that software will be the largest hurdle on the road to true "strong AIs", but I also think that when most people ponder the possibility of achieving that goal they are doing so in the context of our current technological tools. The leap we've had specifically in data mining has been incredible... Kurzweil spoke of Google Images as one of the main reasons why his computer program was able to teach itself how to distinguish between dogs and cats. We have so much information available to us today (and the amount is still growing) that it is becoming increasingly possible to create learning AIs based on that information.

I don't think it will be terribly long before the same thing is done with medical knowledge, which will be very interesting when combined with voice recognition. The biggest hurdle will be teaching computers to understand the meaning behind what people are telling them, but that is an area of AI that is advancing rapidly as well. There are already programs for cell phones that can translate your speech into other languages. It won't happen overnight, but when you're talking about the current speed of technological progress, 2030 is not an unreasonable prediction for something like that.

Capitalism Hits The Fan

flavioribeiro says...

>> ^chilaxe:
Creating human-level AI isn't an issue of processing power.
We've had for some time supercomputers that already possess greater processing power than the human brain, and that doesn't mean that they do anything more than crunch lots of numbers.
Human-level AI will be about organizing that processing power in a way that's similar to the human brain's functionality.


Exactly. The human-level AI problem is about knowledge representation and automated reasoning, and not about raw processing power. And as much as "futurists" like to describe a future filled with intelligent machines, no one has a clue about how to efficiently encode and process knowledge.

Capitalism Hits The Fan

chilaxe says...

Creating human-level AI isn't an issue of processing power.

We've had for some time supercomputers that already possess greater processing power than the human brain, and that doesn't mean that they do anything more than crunch lots of numbers.

Human-level AI will be about organizing that processing power in a way that's similar to the human brain's functionality.

Regenerative medicine and reprogenetics are much closer on the horizon, and have just as much potential to transform society.

Capitalism Hits The Fan

Psychologic says...

Supercomputers have achieved increases in computing power through increasing parallelism for years. Personal computers currently have no need for increased speed in most cases, so there hasn't been a big push for faster PCs. For most people, computers are already fast enough.

Most of the recent trends, other than parallelism, have been reducing the size of computing while maintaining processing power. The latest cell phones are much more powerful than three years ago, for instance. We already have working prototypes of computerized clothing, and the progress in driverless vehicles over the past few years has been amazing.

Yes, we are nearing the end of advances in transistor-based computing, but quantum computing is also nearing functionality. There are still a few issues to work out, but that is only a matter of time as (limited) working QCs have already been demonstrated.

Of course, a lot of this is demand-based. We've had very functional voice-interface computing for years, but no one has really found a viable market for it. Right now I'm much happier typing than speaking when I'm writing a paper or programming, but that may change as computers gain the ability to do more on their own. Who knows how the current economic problems will affect demand-driven innovation though.

Like I said, most of this is a question about timing, not about ability. Biology is effective, but very inefficient. It is only a matter of time before we create computing that is both more effective and more efficient than what has been created through evolutionary processes. It may happen by 2030, or it may be 2060, but with the constant advances in medicine it is very likely that we will all be around to see it happen (again, assuming nothing catastrophic happens along the way).

Capitalism Hits The Fan

jwray says...

Processor core frequencies have been nearly stagnant for 3 years. They're building more and more parallelism instead. The capacity to directly simulate a human brain is still far beyond the most powerful supercomputer. Storing only an abstract representation of all of the connections between neurons in a human brain would require 5 petabytes (10^11 neurons * 10^4 connections per neuron * 5 bytes per connection).
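The storage figure checks out with integer arithmetic on the comment's stated inputs (the neuron and connection counts are the thread's rough estimates, and 5 bytes per connection is the comment's own assumption):

```python
# Rough storage estimate for an abstract map of brain connectivity,
# using the figures given in the comment.
NEURONS = 10**11              # 10^11 neurons
CONNECTIONS_PER_NEURON = 10**4
BYTES_PER_CONNECTION = 5      # the comment's assumed encoding size

total_bytes = NEURONS * CONNECTIONS_PER_NEURON * BYTES_PER_CONNECTION
petabytes = total_bytes / 10**15
print(f"{petabytes:.0f} PB")  # 5 PB
```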

Clarkson Kills a Corvette... With a Helicopter

rychan says...

The Corvette is a colossal piece of crap and a near perfect example of why I wouldn't buy an American car. I'm an American and I know well how stupid most of us are.

Luckily engineering in America isn't done democratically, so I think your logic is faulty.

The US has the best higher education in the world, and the best engineers in the world. But frankly, the most talented mechanical and electrical engineers are probably more interested in building space equipment, nuclear submarines, fighter jets, supercomputers, or even helicopter gunships.

Compare this to 1980s Japan, where their brightest mechanical engineers would build either robots (a smallish industry) or cars, because there wasn't much of an aerospace, defense, semiconductor, or space industry.

These days I think Europeans (EADS) and Japanese have more space, semiconductor, and defense industries attracting talent, while car engineering is becoming sexy with the interest in hybrid, fuel cell, and intelligent cars, so maybe things will even out more.

I did know an MIT Mech E who went to Ford out of school, but he left relatively quickly. I know a Gatech Mech E who's happily working at Lockheed Martin.

Fastest Computer in the World - Project Roadrunner

A Nerd's Dream - supercomputer built from 8 GPUs

siftbot says...

Tags for this video have been changed from 'fastra, 9800, GX2, Lian Li, Happy Geeks' to 'fastra, 9800, GX2, Lian Li, Happy Geeks, tomography, antwerp, supercomputer, gpu' - edited by lucky760

A Nerd's Dream - supercomputer built from 8 GPUs

deathcow says...

I take it these cards are really good at tomographical reconstruction because it's a job involving images and these cards are designed to process images. The video goes on about the supercomputing equivalency of the box though, but I am thinking the whole time -- yeah -- for graphics work. Can a video card be harnessed to speed up other tasks? How about faster rendering in programs like 3D Studio Max? That's image-y. Obviously not, I guess, or they'd be doing it already.

Alienware's Curved Monitor

Seal - Crazy


