6 Comments
AeroMechanical says...There will be autonomous or at least semi-autonomous combat robots and there is no stopping it. For one thing, they've already been in use since the early '80s. Cruise missiles are autonomous killer robots. The only difference is that they don't come back when the job is done, but that's a trivial point. More importantly, keeping a human in the loop at all times makes current drones massively less effective weapons, and handing a potential enemy such a huge opening for that advantage is unwise. You also really can't ban them, because they're simply an assembly of extremely useful civilian tools.
That said, she does hit on a very important point regarding the application of modern drones, which is that they're being used as terror weapons, and we should not be comfortable with that. Having drones loiter day in and day out over civilian-populated areas is an inexcusable act of cruelty.
ChaosEngine says...I don't really have a problem with a weaponized robot. It's no different to a tank or a fighter plane, really. The question of "robots" is largely irrelevant. Robots are just another tool that has been weaponized, something humans have done with everything.
We took knives and made swords.
We took cars and made tanks.
We took planes and made bombers.
And we've almost certainly got some killer satellites.
So putting a gun on a robot isn't really any different from giving one to a soldier, except the robot can be more easily repaired.
The question is about AI. Once you get an AI and put it in control of a machine, it can probably use that machine to kill you. OTOH, once you put a human in control of a machine, the human can probably use that machine to kill you.
But we're going to develop robots and we're going to develop AI and sooner or later, someone is going to put the two together. You can't ban innovation (well, you can, but it won't stop it). So I'd much rather we build and understand the technology. That's how we learn.
Fundamentally, it's how we use these weapons that will determine our fate, and right now, we're not using them very well.
AeroMechanical says...Absolutely. It's a mistake to make assumptions about what AI will be like. The doomsayers too often attribute human qualities to it. It's like speculating about alien intelligence. It will come in bits and pieces as we understand it more. My own guess is that, not weighed down by long-obsolete genetic imperatives and human psychological pathologies, it will most likely be (in its higher form) an extraordinarily capable problem solver and prognosticator. It will lack the human flaws that typically motivate the killer AIs of science fiction. Of course, it will probably have its own unique flaws. I do think it's wise to be wary of software that has developed beyond our capability to understand it (much as we don't understand the workings of our own consciousness).
Probably my primary concern about robotic weapons comes from a DARPA proposal I read about some time ago. What they wanted was an autonomous, bird-sized UAV. It would carry surveillance equipment and sensors, and be able to share the data it collects with its fellows and its commanders through a mesh network, as well as receive orders. It would be intelligent enough to find a suitable strategic vantage point and hide itself. From there it would simply observe. With a large enough swarm of these, perhaps many thousands, you could send them into a city at night. Each could also potentially carry a small warhead, allowing it to launch itself at threats and destroy them. Once these robots were entrenched, which might take only an hour or two, whoever controls them would effectively rule the city. Even if they were cut off from their command structure, they might still retain enough intelligence to recognize a particular individual, someone in a forbidden area, someone holding a weapon, someone not broadcasting the right IFF signal, or any number of things. There might be no defense against such a thing (though there probably will be).
To me, that concept is terrifying. The greatest threat may not be huge, hulking, Terminator-like war machines, but flying, self-guiding, intelligent hand grenades. All someone would need is the capability to manufacture them. No raising an army, no speeches or threats, just a factory and a design. It's also not too far-fetched to believe this capability might be available within a few decades. They'll be easier to build than nuclear weapons, and oh so convenient and easy to deploy.
Um... anyways, I dunno where I was going with that. Just lots of random pontificating. But because it's technology, it's silly to try to stop it with legislation. It will happen; as ChaosEngine rightly points out, the best course of action is to stay on top of it and understand it.
siftbot says...Moving this video to kulpims's personal queue. It failed to receive enough votes to get sifted up to the front page within 2 days.
ant jokingly says...*engineering *documentary
I didn't see Siftbot.
siftbot says...Adding video to channels (Engineering) - requested by ant.