Saturday, November 01, 2008

Science Fiction or Future Reality?

*******
Patent for killer chip denied in Germany
Fri, 08 May 2009
http://www.presstv.ir/detail.aspx?id=94070&sectionid=351020604
Tiny ID chips can be inserted under the skin
A Saudi inventor's proposal to insert semiconductors subcutaneously in visitors and remotely kill them if they misbehave will not be patented in Germany.
On Wednesday, a German Patent Office spokeswoman said the application was received on October 30, 2007 and published 18 months later, as required by law, in a patents database. But inventions that are unethical or a danger to the public are not recognized.
Reporters said the document proposed that tiny semiconductors be implanted or placed by injection under the skin of people so their whereabouts could be tracked by global-positioning satellites. This could be used, it said, to prevent immigrants from overstaying their visas.
A model B of the system would contain a poison such as cyanide, which could be released by remote control to "eliminate" people if they became a security risk. The document said this could be used against terrorists or criminals.
Microchip implantation in humans has prompted new ethical discussions among scientific professional forums, academic groups, human rights organizations, government departments and religious groups.
The Council on Ethical and Judicial Affairs (CEJA) of the American Medical Association published a report in 2007 warning that implanted RFID chips may compromise privacy because there is no assurance that the information they contain can be properly protected, in addition to posing health risks (the chips can migrate under the skin).
*******
"Wired for War: The Robotics Revolution and Conflict in the 21st Century"
06 February 2009
http://www.democracynow.org/2009/2/6/wired_for_war_the_robotics_revolution
The US has carried out thirty drone attacks on alleged al-Qaeda targets inside Pakistani territory since last summer, killing an estimated 250 people. The Predator attacks highlight the US military’s increased use of unmanned aerial vehicles and other robotic devices on the battlefield. We speak to P.W. Singer, a former defense policy adviser to President Obama’s election campaign and author of Wired For War: The Robotics Revolution and Conflict in the 21st Century.
Amy Goodman: We turn now to war. Three days after President Obama took office, an unmanned US Predator drone fired missiles at houses in Pakistan’s Federally Administered Tribal Areas. Twenty-two people were reported killed, including three children. According to a tally by Reuters, the US has carried out thirty such drone attacks on alleged al-Qaeda targets inside Pakistan since last summer, killing some 250 people.
The Predator attacks highlight the US military’s increased use of unmanned aerial vehicles and other robotic devices on the battlefield. At the start of the Iraq war, the US had only a handful of drones in the air. Today the US has over 5,300 drones. They have been used in Iraq, in Afghanistan, Pakistan, Somalia and Yemen, as well as here at home. The Department of Homeland Security uses drones to patrol the US-Mexico border.
There has been a similar boom in the use of ground robotics. When US forces went into Iraq in 2003, they had zero robotic units on the ground. Now they have as many as 12,000. Some of the robots are used to dismantle landmines and roadside bombs, but a new generation of robots are designed to be fighting machines. One robot, known as SWORDS, can operate an M-16 rifle and a rocket launcher.
A new book has just come out examining how robots will change the ways wars are fought. It’s called Wired for War: The Robotics Revolution and Conflict in the 21st Century. The author, P.W. Singer, joins me here in the firehouse. He’s a senior fellow at the Brookings Institution and served as coordinator of the Obama campaign’s defense policy task force. He is also the author of Corporate Warriors and Children at War.
Welcome to Democracy Now!
P.W. Singer: Thanks for having me.
Amy Goodman: Let’s start with Pakistan. Explain what these unmanned drones are.
P.W. Singer: Well, you’re talking about systems that can be flown remotely. So, the planes, these Predator drones, are taking off from places in, for example, Afghanistan, but the pilots are physically sitting in bases in Nevada. And there, you have incredible capabilities. They can see from a great distance. They can stay in the air for twenty-four hours. And so, they’re very valuable in going after these insurgent and terrorist hide sites, which is in, you know, mountainous terrain, and it would be difficult to get US troops in.
But the flipside is that there’s a question of what’s the message that we think we are sending with these systems versus the message that’s being received on the ground, in terms of the broader war of ideas.
Amy Goodman: What do you mean?
P.W. Singer: Well, so, I spent the last several years going around trying to meet with everyone engaged in this robotics revolution, everything from the scientists behind it to the science fiction authors influencing them, to the drone pilots, to the four-star generals, but also went out and interviewed people in the region.
And this question of messaging, one of the people that I met with was a senior Bush administration official, and he said, “The unmanning of war plays to our strength. The thing that scares people is our technology.” But that’s very different when you go meet with someone, for example, in Lebanon. One of the people that I met with for the book was an editor of a leading newspaper there. And what he had to say was that, basically, this shows that you are cowardly, that you are not man enough to come fight us. So there’s a disconnect between the message we think we’re sending and the message that’s being received.
Or another illustration of this would be, there was a popular music—one of the hit songs in Pakistan last year talked about how the Americans look at us like insects. Shows you how it’s permeating pop culture. So you have this balancing act that we’re facing between short-term goals and long-term goals.
Amy Goodman: P.W. Singer, the SWORDS, the CRAM, the PackBot—you talk about the robots taking on the “Three D’s.”
P.W. Singer: The “Three D’s” are roles that are dull, dirty or dangerous, and they’re basically areas where they’ve found robotics have been useful. Dull would be, you know, you can’t keep your eyes open thirty hours straight; a robot can. So it can monitor empty desert for just in case something happens. Dirty is the environment. It can operate not only in, you know, chemical or biological but also in dust storms or at night. We can’t see at night. Things like that. And then, of course, dangerous is you can send out a robot to do things that you wouldn’t send a soldier to do. And the sort of joke of it is that, you know, when it comes to war, you are the weakest link.
Now, the problem is, what are the implications of that for our democracy? So, for example, if you are sending less and less Americans into harm’s way, does it make you more cavalier about the use of force? And one of the people that was fascinating that I interviewed was a former assistant secretary of Defense for Ronald Reagan, who actually had this to say about these systems. He worried that they would create more marketization of war, as he put it. We might have more shock and awe talk to defray discussion of the true costs of war.
Amy Goodman: But that is a very serious issue, when—I mean, the time when wars are ended is when one side cannot take the number of casualties, that, for example, if your soldiers that are fighting are being killed. But if they’re robots…
P.W. Singer: I mean, the concern I have is that it takes certain trends that are already in play in our body politic. We don’t have declarations of war anymore. We don’t have a draft. We don’t buy war bonds anymore. We don’t pay higher taxes for war. And now you have the fact that you may be sending more and more machines instead of people. And so, you may be taking the already lowering bars to war and dropping them to the ground.
And then there’s another part of this, of course, is it changes the public’s relationship with war. It’s not just that the public is becoming de-linked, but remember, these machines record everything that they see. And so, we have the rise of what I call YouTube war. That is, you can go on YouTube right now and download video clips of combat footage, much of it from these drones. And so, in some ways, you could say that’s a good thing. The home front and war front are finally connected. You can see what’s going on. But we all know this is taking place in sort of our weird, strange world, and these video clips have become a form of entertainment for people. The soldiers call it war porn.
Amy Goodman: P.W. Singer, you write about robots making mistakes, like in South Africa in 2007, a software glitch in a robotic gun; in ’88, a semi-automatic defense system of the USS Vincennes accidentally shoots down an Iranian passenger plane, killing all 290 people on board, including sixty-six children.
P.W. Singer: The challenge here is that while we are gaining incredible capabilities with these systems, it doesn’t end the human mistakes and dilemmas behind them. Another way of putting it is, a lot of people are familiar with Moore’s Law, the idea that you can pack in more and more computing power, such that they double in their power every two years. It’s the reason why the Pentagon in 1960 had the amount of computing power that you and I can get from a Hallmark card right now. Now, Moore’s Law is certainly operative. These systems are getting better and better. But Murphy’s Law is still in place. And so, you get what robot executives call these “oops” moments with robots, when things don’t work out the way you want. And it’s just like, you know, our laptop computers crash. Well, imagine your laptop computer crashing with an M-16 rifle.
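Singer's Pentagon-versus-Hallmark-card comparison is back-of-the-envelope Moore's Law arithmetic, and can be sketched in a few lines. The 1960 starting point is from his remarks; the strict two-year doubling period is the textbook rule of thumb, an idealization rather than a measured figure:

```python
# Sketch of the Moore's Law arithmetic behind Singer's comparison.
# Assumes computing power doubles exactly once every two years, which
# is the usual rule-of-thumb idealization, not measured data.

def doublings(start_year, end_year, period_years=2.0):
    """How many doubling periods fit between the two years."""
    return (end_year - start_year) / period_years

def growth_factor(start_year, end_year, period_years=2.0):
    """Total multiplicative growth in computing power over the interval."""
    return 2.0 ** doublings(start_year, end_year, period_years)

if __name__ == "__main__":
    # 1960 Pentagon vs. the 2009 interview date
    print(f"doublings since 1960: {doublings(1960, 2009):.1f}")
    print(f"growth factor: {growth_factor(1960, 2009):.2e}")
```

Under that assumption, computing power grows by a factor of tens of millions between 1960 and 2009, which is why a musical greeting card's chip can plausibly match a 1960-era Pentagon.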
Amy Goodman: How does international law address robots in war?
P.W. Singer: The problem right now is we don’t have a good answer to that question. Some of the people that I met with for the book were at both the International Red Cross and then also at Human Rights Watch. And there’s two sort of interesting things that came out of that.
At the Red Cross, they basically said, “There’s so much going on in the world that’s bad, we don’t have time to focus on something like this, something that’s like this kind of science fiction.” And that’s a completely justifiable thing to say. I mean, there’s a lot of bad things going on in the world, you know, Darfur, for example. But you could have said the exact same thing back in, say, 1942, where there was lots of bad things going on then, but it doesn’t mean that we shouldn’t have, if we had had the chance, launched a discussion about atomic weapons back then. And unlike with atomic weapons, robotics isn’t being worked on in secret desert labs that no one knows about; it’s in our face. It’s being worked on in, you know, suburban factories we all know about. We can see the video clips of it.
The Human Rights Watch visit was sort of interesting, but also funny, because while I was there, two of their lead people got in an argument over whether the Geneva Conventions or the Star Trek prime directive was the best guideline. And it kind of points to the fact that we’re grasping at straws right now when it comes to regulating these machines. And this is—again, as you pointed out, this isn’t science fiction. We have 12,000 of them on the ground right now.
Amy Goodman: What happens if a robot commits a massacre?
P.W. Singer: It’s a great question. You know, who do you hold responsible? Do you hold responsible the human operator? Do you hold responsible the commander who authorized them there? Do you hold responsible the software engineer who wrote it wrong? And how do you hold them accountable? We don’t have good answers.
And what was funny is, one person that I interviewed was a Pentagon robotic scientist. And he said, “You know what? You’re wrong. There’s no social or ethical or legal dimensions with robotics in war that we have to figure out.” He said, “That is, unless the machine kills the wrong person repeatedly.” Quote, “Then it’s just a product recall issue.” That isn’t the way I think we should be looking at the social, ethical and legal dimensions of all of this. And that’s why we need to launch a discussion about it. Otherwise, we’re going to make the very same mistake that a past generation did with atomic bombs, you know, not talking about them until Pandora’s box is already opened. [Read entire interview at: http://www.democracynow.org/2009/2/6/wired_for_war_the_robotics_revolution]
*******

*******
Army Orders Pain Ray Trucks; New Report Shows 'Potential for Death'
For $25 Million, Army Buys System That Drives Off Rioters With Microwave-Like Beam
By David Hambling
Oct. 11, 2008— http://abcnews.go.com/Technology/story?id=6007823&page=1
After years of testing, the Active Denial System -- the pain ray which drives off rioters with a microwave-like beam -- could finally have its day. The Army is buying five of the truck-mounted systems for $25 million. But the energy weapon may face new hurdles, before it's shipped off to the battlefield; a new report details how the supposedly non-lethal blaster could be turned into a flesh-frying killer.
The contract for the pain ray trucks is "expected to be awarded by year's end," Aviation Week notes. "A year after the contract is signed, the combination vehicle/weapons will start to be fielded at the rate of one per month."
It's been a very long time coming. As we've previously reported, there have been calls to deploy the Active Denial System in Iraq going back to 2004. But it's always been delayed for legal, political, and public relations reasons. Anything that might be condemned as torture is political dynamite. Interestingly, the version being bought is not the full-size "Version 2," but a containerized system known as Silent Guardian, which Raytheon has been trying to sell for some time. They describe Silent Guardian as "roughly 1/3 the size and power of the other Active Denial Systems," and quote its range as "greater than 250 meters." The larger system has a range somewhere in excess of 700 meters.
Silent Guardian weighs a shade over 10,000 pounds all up, and will be mounted on an "armored ruggedized HEMTT [Heavy Expanded Mobility Tactical Truck]."
The announcement arrives on the same day as a new report from less-lethal weapons expert Dr. Jürgen Altmann that analyzes the physics of several directed energy weapons, including Active Denial, the Advanced Tactical Laser (used as a non-lethal weapon), the Pulsed Energy Projectile (a.k.a. "Maximum Pain" laser) and the Long Range Acoustic Device (a.k.a. "Acoustic Blaster").
Dr. Altmann describes the Active Denial beam in some detail, noting that it will not be completely uniform; anyone unlucky enough to be caught in the center will experience more heating than someone at the edge. And perhaps more significant is his thorough analysis of the heating it produces -- and the cumulative effect if the target does not have the chance to cool down between exposures. In U.S. military tests, a fifteen-second delay between exposures was strictly observed; this may not happen when the ADS is used for real.
"As a consequence, the ADS provides the technical possibility to produce burns of second and third degree. Because the beam of diameter 2 m and above is wider than human size, such burns would occur over considerable parts of the body, up to 50% of its surface. Second- and third-degree burns covering more than 20% of the body surface are potentially life-threatening due to toxic tissue-decay products and increased sensitivity to infection and require intensive care in a specialized unit. Without a technical device that reliably prevents re-triggering on the same target subject, the ADS has a potential to produce permanent injury or death. "
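Altmann's cumulative-heating argument can be illustrated with a toy thermal model: each exposure adds some heat at the skin surface, and the excess decays between exposures. Every number below (temperature rise per exposure, cooling time constant, baseline skin temperature) is an illustrative assumption for the sketch, not a figure from his report or from ADS test data:

```python
import math

def temp_after_pulses(n_pulses, dt_per_pulse, gap_s, tau_s, baseline_c=33.0):
    """Toy model of repeated exposures: each pulse instantly adds
    dt_per_pulse degrees C at the skin surface, and between pulses the
    excess heat decays exponentially with time constant tau_s seconds.
    All parameter values are illustrative assumptions, not ADS data."""
    excess = 0.0
    for _ in range(n_pulses):
        excess = excess * math.exp(-gap_s / tau_s) + dt_per_pulse
    return baseline_c + excess

if __name__ == "__main__":
    # 20 exposures of +10 C each, assuming a 10 s cooling time constant:
    spaced = temp_after_pulses(20, 10.0, gap_s=15.0, tau_s=10.0)  # test protocol
    rapid = temp_after_pulses(20, 10.0, gap_s=1.0, tau_s=10.0)    # re-triggering
    print(f"15 s between shots: {spaced:.0f} C at the surface")
    print(f" 1 s between shots: {rapid:.0f} C at the surface")
```

The model ignores physical limits (tissue damage, evaporation), so the absolute numbers mean little; the point it illustrates is Altmann's: with enforced spacing the temperature plateaus, while rapid re-triggering lets heat accumulate far past where a single exposure would leave it.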
This potential hazard need not be a show-stopper -- existing less-lethals, such as plastic bullets and tear gas, can also be fatal under some circumstances (and I'm not even going to get into the argument about Tasers).
Dr. Altmann notes that "the present analysis has not found convincing arguments that the ADS would be immoral or illegal in each foreseeable circumstance," and that acceptance will depend very much on how it is used. If the ADS prevents small boats from approaching a U.S. vessel without harming anyone, then it will be seen as a humane option. If it is used to clear protesters out of the way it may be seen differently.
Meanwhile, the National Institute of Justice reportedly still has an interest in a "hand-held, probably rifle-sized, short range weapon that could be effective at tens of feet for law enforcement officials." That's just one of the domestic applications of Active Denial technology likely to follow if the Army's experiment with ADS is successful. A lot of people will be watching this one very closely.
Copyright © 2008 ABC News Internet Ventures
*******
The Pentagon's Ray Gun
June 1, 2008 http://www.cbsnews.com/stories/2008/02/29/60minutes/main3891865.shtml
(CBS) This story was originally broadcast on March 2, 2008. It was updated on May 30, 2008.
What if we told you the Pentagon has a ray gun? And what if we told you it can stop a person in his tracks without killing or even injuring him? Well, it’s true. You can’t see it, you can't hear it, but as CBS News correspondent David Martin experienced first hand, you can feel it. Pentagon officials call it a major breakthrough which could change the rules of war and save huge numbers of lives in Iraq. But it's not there yet. That's because in the middle of a war, the military just can't bring itself to trust a weapon that doesn't kill.
It's a gun that doesn't look anything like a gun: it's a flat dish antenna which shoots out a 100,000-watt beam at the speed of light, hitting anything in its path with an intense blast of heat. An operator uses a joystick to zero in on a target. Visible only with an infrared camera, the gun, when fired, emits a flash of white-hot energy - an electromagnetic beam made up of very high frequency radio waves.
Col. Kirk Hymes, head of the Joint Non-Lethal Weapons Directorate, is in charge of the ray gun, which is being tested at Moody Air Force Base in southern Georgia. The targets at the base are people, military volunteers creating a scenario soldiers might encounter in Iraq, like angry protesters advancing on American troops, who have to choose between backing down or opening fire.
Off in the distance, half a mile away, the operator of the ray gun has the crowd in his sights. Unlike the soldiers on the ground, he has no qualms about firing away, because his weapon won't injure anyone. He squeezes off a blast and the first shot hits like an invisible punch. The protesters regroup and he fires again, and again. Finally they’ve had enough. The ray gun drives them away with no harm done.
Officially called the "Active Denial System," it does penetrate the body, but just barely. What happens when the beam hits a person?
"It's absorbed in the top layer, 1/64th of an inch, which is about three sheets of paper that you’d find in your printer," Col. Hymes explains. "And it’s hitting what inside that 1/64th of an inch?" Martin asks. "Well, right within that 1/64th of an inch is where the nerve endings are," Hymes says. You have to feel the ray gun to believe it, and there's only one way to do that. Martin, who voluntarily became a target, described the sensation of being hit by the ray gun like scalding water. What makes this a weapon like no other is it inflicts enough pain to make you instantly stop whatever it is you’re doing. But the second you get out of the beam the pain vanishes. And as long as it's been used properly, there's no harm to your body.
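The colonel's figures are easy to sanity-check. The 0.1 mm per sheet used below is a typical thickness for ordinary printer paper, assumed here rather than taken from the broadcast:

```python
# Quick unit check on the penetration depth quoted in the story.
# The 0.1 mm per sheet of printer paper is a typical value assumed
# for this sketch, not a figure from the broadcast.
MM_PER_INCH = 25.4
depth_mm = MM_PER_INCH / 64              # "1/64th of an inch"
sheet_mm = 0.1                           # typical 80 g/m^2 printer paper
print(f"{depth_mm:.2f} mm, about {depth_mm / sheet_mm:.0f} sheets of paper")
```

At that paper thickness, 1/64 inch works out to roughly 0.4 mm, or about three to four sheets, which is consistent with Col. Hymes' description.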
*******
Nonfatal weapons touted for U.S. use
Air Force secretary says they should be used for domestic crowd control
The Associated Press
updated 8:02 p.m. ET, Tues., Sept. 12, 2006, http://www.msnbc.msn.com/id/14806772/
WASHINGTON - Nonlethal weapons such as high-power microwave devices should be used on American citizens in crowd-control situations before they are used on the battlefield, the Air Force secretary said Tuesday.
Domestic use would make it easier to avoid questions in the international community over any possible safety concerns, said Secretary Michael Wynne.
“If we’re not willing to use it here against our fellow citizens, then we should not be willing to use it in a wartime situation,” said Wynne. “(Because) if I hit somebody with a nonlethal weapon and they claim that it injured them in a way that was not intended, I think that I would be vilified in the world press.”
The Air Force has funded research into nonlethal weapons, but he said the service isn’t likely to spend more money on development until injury issues are reviewed by medical experts and resolved.
Nonlethal weapons generally can weaken people if they are hit with the beam. Some of the weapons can emit short, intense energy pulses that also can be effective in disabling some electronic devices.
On another subject, Wynne said he expects to pick a new contractor for the next generation of aerial refueling tankers by next summer. He said a draft request for bids will be put out next month, and there are two qualified bidders: The Boeing Co. and a team of Northrop Grumman Corp. and European Aeronautic Defence and Space Co., the majority owner of European jet maker Airbus SAS.
The contract is expected to be worth at least $20 billion.
Chicago-based Boeing lost the tanker deal in 2004 amid revelations that it had hired a top Air Force acquisitions official who had given the company preferential treatment.
Air Force tightens its belt
Wynne also said the Air Force, which is already chopping 40,000 active duty, civilian and reserve jobs, is now struggling to find new ways to slash about $1.8 billion from its budget to cover costs from the latest round of base closings.
He said he can’t cut more people, and it would not be wise to take funding from military programs that are needed to protect the country. But he said he also encounters resistance when he tries to save money on operations and maintenance by retiring aging aircraft.
“We’re finding out that those are, unfortunately, prized possessions of some congressional districts,” said Wynne, adding that the Air Force will have to “take some appetite suppressant pills.” He said he has asked employees to look for efficiencies in their offices.
The base closings initially were expected to create savings by reducing Air Force infrastructure by 24 percent.
*******
RNC to Feature Unusual Forms of Sound
By Amanda Onion, http://abcnews.go.com/Technology/story?id=99472
Aug. 25, 2004
Coming soon to a convention near you: Sound like it has never (or at least, rarely) been heard before.
As politicians at the Republican National Convention use microphones to make themselves heard from the podium, other sounds in and around the event will be emitted in cutting-edge audio technology.
Outside the convention hall, New York City police plan to control protesters using a device that directs sound for up to 1,500 feet in a spotlight-like beam. Meanwhile, a display of former Republican presidents inside the hall will feature campaign speeches that are funneled to listeners through highly focused audio beams.
"These are totally different from the way an ordinary speaker emits sound," said Elwood (Woody) Norris, founder and head of American Technology Corp. of San Diego. "It's like it's inside your head."
Norris, an intrepid entrepreneur who has no college degree but more than 43 patents to his name, invented both the crowd control tool, called the Long Range Acoustical Device (LRAD), and the display audio technology, called HyperSonic Sound (HSS).
Both technologies feature unprecedented manipulation of sound, but for very different purposes. And while both technologies have unique, "gee-whiz" factors, some remain uneasy with the idea of using sound to control crowds.
"It produces sound in a way that for most people will be a novel experience, so I think it has potential to create confusion and panic," said Richard Glen Boire, founder of the Center for Cognitive Liberty and Ethics in Davis, Calif. "It can't be identified, it's an invisible force."
Sound as a Weapon
In fact, LRAD, which is 33 inches in diameter and looks like a giant spotlight, has been used by the U.S. military in Iraq and at sea as a non-lethal force. In these settings, operators can use the device not only to convey orders, but also as a weapon.
When in weapon mode, LRAD blasts a tightly controlled stream of caustic sound that can be turned up to high enough levels to trigger nausea or possibly fainting. The operators themselves remain unaffected since the noise is contained in its focused beam.
"We've devised a system with a multiplicity of individual speakers that are phased so sound that would normally go off to the side or up or down, cancels out, while sound directly in front is reinforced," Norris explained. "It's kind of like the way a lens magnifies a beam of light."
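Norris is describing the standard phased-array effect, which can be illustrated with a textbook model of equally spaced in-phase point sources in a line. The element count, spacing, and frequency below are arbitrary illustrative choices, not actual LRAD parameters:

```python
import cmath
import math

def array_gain(n_src, spacing_m, freq_hz, angle_deg, c_sound=343.0):
    """Relative far-field pressure from n_src in-phase point sources in a
    line, heard at angle_deg off the array's forward axis. A textbook
    uniform-linear-array sketch; illustrative only, not LRAD's geometry."""
    k = 2 * math.pi * freq_hz / c_sound           # acoustic wavenumber
    phase_step = k * spacing_m * math.sin(math.radians(angle_deg))
    total = sum(cmath.exp(1j * phase_step * i) for i in range(n_src))
    return abs(total) / n_src                     # 1.0 = full reinforcement

if __name__ == "__main__":
    # Directly ahead, the sources add coherently; off-axis they cancel.
    for angle in (0, 15, 45):
        print(f"{angle:>2} deg off-axis: gain {array_gain(16, 0.04, 2000, angle):.2f}")
```

On the forward axis every source arrives in phase and the gain is 1.0; a few degrees off-axis the phase differences make the contributions largely cancel, which is the "lens" behavior Norris describes.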
The Department of Defense gave Norris and his team funding to develop LRAD following the 9/11 attacks. The concept is to offer an intermediate tool to warn and ward off attacking combatants before resorting to force.
"Regular bullets don't have volume control on them," said Norris. "With this, you just cause a person's ears to ring."
The NYPD, however, has said they won't be using the $35,000 tool to make people's ears ring, but only as a communication device.
"We're only going to use them for safety announcements and directions," said Paul Browne, a police spokesman.
*******
Packs of Robots will Hunt Down Uncooperative Humans
October 22, 2008 6:00 PM
The latest request from the Pentagon jars the senses. At least, it did mine. They are looking for contractors to provide a "Multi-Robot Pursuit System" that will let packs of robots "search for and detect a non-cooperative human".
One thing that really bugs defence chiefs is having their troops diverted from other duties to control robots. So having a pack of them controlled by one person makes logistical sense. But I'm concerned about where this technology will end up.
Given that iRobot last year struck a deal with Taser International to mount stun weapons on its military robots, how long before we see packs of droids hunting down pesky demonstrators with paralysing weapons? Or could the packs even be lethally armed? I asked two experts on automated weapons what they thought - click the continue reading link to read what they said.
Both were concerned that packs of robots would be entrusted with tasks - and weapons - they were not up to handling without making wrong decisions.
Steve Wright of Leeds Metropolitan University is an expert on police and military technologies, and last year correctly predicted this pack-hunting mode of operation would happen. "The giveaway here is the phrase 'a non-cooperative human subject'," he told me:
"What we have here are the beginnings of something designed to enable robots to hunt down humans like a pack of dogs. Once the software is perfected we can reasonably anticipate that they will become autonomous and become armed.
We can also expect such systems to be equipped with human detection and tracking devices including sensors which detect human breath and the radio waves associated with a human heart beat. These are technologies already developed."
Another commentator often in the news for his views on military robot autonomy is Noel Sharkey, an AI and robotics engineer at the University of Sheffield. He says he can understand why the military want such technology, but also worries it will be used irresponsibly.
"This is a clear step towards one of the main goals of the US Army's Future Combat Systems project, which aims to make a single soldier the nexus for a large scale robot attack. Independently, ground and aerial robots have been tested together and once the bits are joined, there will be a robot force under command of a single soldier with potentially dire consequences for innocents around the corner."
What do you make of this? Are we letting our militaries run technologically amok with our tax dollars? Or can robot soldiers be programmed to be even more ethical than human ones, as some researchers claim?
Paul Marks, technology correspondent
http://www.newscientist.com/blogs/shortsharpscience/2008/10/packs-of-robots-will-hunt-down.html?DCMP=ILC-hmts&nsref=specrt
*******

'Robot arms race' underway, expert warns
12:10 27 February 2008
NewScientist.com news service
Tom Simonite
http://technology.newscientist.com/channel/tech/dn13382-robot-arms-race-underway-expert-warns.html
Governments around the world are rushing to develop military robots capable of killing autonomously without considering the legal and moral implications, warns a leading roboticist. But another robotics expert argues that robotic soldiers could perhaps be made more ethical than human ones.
Noel Sharkey of Sheffield University, UK, says he became "really scared" after researching plans outlined by the US and other nations to roboticise their military forces. He will outline his concerns at a one-day conference in London, UK, on Wednesday.
Over 4000 semi-autonomous robots are already deployed by the US in Iraq, says Sharkey, and other countries – including several European nations, Canada, South Korea, South Africa, Singapore and Israel – are developing similar technologies.
Crucial decisions
In December 2007, the US Department of Defense (DoD) published an "Unmanned systems roadmap" proposing to spend about $4 billion by 2010 on robotic weapons, a figure that will later rise to about $24 billion.
Sharkey is most concerned about the prospect of having robots decide for themselves when to "pull the trigger". Currently, a human is always involved in decisions of this nature. But the Pentagon is nearly 2 years into a research programme aimed at having robots identify potential threats without human help.
"The main problem is that these systems do not have the discriminative power to do that," he says, "and I don't know if they ever will."
The US and other governments have also set a very short timeframe to achieve such sophistication, says Sharkey. "It is based I think on a mythical view of AI."
Temporary ban
Governments and robotics engineers should re-examine current plans, and perhaps consider an international ban on autonomous weapons for the time being, he suggests. "We have to say where we want to draw the line and what we want to do – and then get an international agreement."
After writing publicly of his concerns, he says engineers working for the US military have contacted him with similar worries. "Some wrote to thank me for speaking out," he says.
Ronald Arkin, a robotics researcher at Georgia Tech, US, says that Sharkey is right to be concerned. "We definitely need to be discussing this more," he says. However, he believes that robots could ultimately become a more ethical fighting force.
'Moral responsibility'
As governments seem determined to invest in robotic weapons, Arkin suggests trying to design ethical control systems that make military robots respect the Geneva Convention and other rules of engagement on the battlefield.
"I have a moral responsibility to make sure that these weapons are introduced responsibly and ethically, and reduce risks to non-combatants," he says.
Arkin also notes that human combatants are far from perfect on the battlefield. "With a robot I can be sure that a robot will never harbour the intention to hurt a non-combatant," he says. "Ultimately they will be able to perform better than humans."
Arkin is using computer simulations to test whether ethical control systems can be used in battlefield scenarios, some of which are modelled on real-life events.
Refusing orders
One involved an Apache helicopter attacking three men laying roadside bombs in Iraq. Two were killed, and the third clearly wounded. The pilot is ordered by a superior to kill the incapacitated man, and reluctantly does so.
"I still find the video of that event disturbing," says Arkin, "I hope an autonomous system could realise that man was clearly incapacitated, effectively a prisoner of war and should not have been killed."
"One of the fundamental abilities I want to give [these systems] is to refuse an order and explain why."
Yet Arkin does not think battlefield robots can be made as smart as human soldiers. "We cannot make them that generally intelligent; they will be more like dogs, used for specialised situations," he says.
But so far he is concentrating his research on scenarios involving armies. "For those situations we have very clear-cut guidance from the Geneva Convention, the Hague and elsewhere about what is ethical," he explains.
*******
When Seeing and Hearing Isn't Believing
By William M. Arkin, Special to washingtonpost.com, Monday, Feb. 1, 1999 http://www.washingtonpost.com/wp-srv/national/dotmil/arkin020199.htm
"Gentlemen! We have called you together to inform you that we are going to overthrow the United States government." So begins a statement being delivered by Gen. Carl W. Steiner, former Commander-in-chief, U.S. Special Operations Command.
At least the voice sounds amazingly like him.
But it is not Steiner. It is the result of voice "morphing" technology developed at the Los Alamos National Laboratory in New Mexico.
By taking just a 10-minute digital recording of Steiner's voice, scientist George Papcun is able, in near real time, to clone speech patterns and develop an accurate facsimile. Steiner was so impressed, he asked for a copy of the tape.
Steiner was hardly the first or last victim to be spoofed by Papcun's team members. To refine their method, they took various high quality recordings of generals and experimented with creating fake statements. One of the most memorable is Colin Powell stating "I am being treated well by my captors."
"They chose to have him say something he would never otherwise have said," chuckled one of Papcun's colleagues.
A Box of Chocolates is Like War
Most Americans were introduced to the tricks of the digital age in the movie Forrest Gump, when the character played by Tom Hanks appeared to shake hands with President Kennedy.
For Hollywood, it is special effects. For covert operators in the U.S. military and intelligence agencies, it is a weapon of the future.
"Once you can take any kind of information and reduce it into ones and zeros, you can do some pretty interesting things," says Daniel T. Kuehl, chairman of the Information Operations department of the National Defense University in Washington, the military's school for information warfare.
Digital morphing — voice, video, and photo — has come of age, available for use in psychological operations. PSYOPS, as the military calls them, seek to exploit human vulnerabilities in enemy governments, militaries and populations to pursue national and battlefield objectives.
To some, PSYOPS is a backwater military discipline of leaflet dropping and radio propaganda. To a growing group of information war technologists, it is the nexus of fantasy and reality. Being able to manufacture convincing audio or video, they say, might be the difference in a successful military operation or coup.
Allah on the Holodeck
Pentagon planners started to discuss digital morphing after Iraq's invasion of Kuwait in 1990. Covert operators kicked around the idea of creating a computer-faked videotape of Saddam Hussein crying or showing other such manly weaknesses, or in some sexually compromising situation. The nascent plan was for the tapes to be flooded into Iraq and the Arab world.
The tape war never proceeded, killed, participants say, by bureaucratic fights over jurisdiction, skepticism over the technology, and concerns raised by Arab coalition partners.
But the "strategic" PSYOPS scheming didn't die. What if the U.S. projected a holographic image of Allah floating over Baghdad urging the Iraqi people and Army to rise up against Saddam, a senior Air Force officer asked in 1990?
According to a military physicist given the task of looking into the hologram idea, it had already been established that large, three-dimensional objects could be projected so that they appeared to float in the air.
But doing so over the skies of Iraq? To project such a hologram over Baghdad on the order of several hundred feet, they calculated, would take a mirror more than a mile square in space, as well as huge projectors and power sources.
And besides, investigators came back, what does Allah look like?
The Gulf War hologram story might be dismissed were it not the case that washingtonpost.com has learned that a super secret program was established in 1994 to pursue the very technology for PSYOPS application. The "Holographic Projector" is described in a classified Air Force document as a system to "project information power from space ... for special operations deception missions."
War is Like a Box of Chocolates
Voice-morphing? Fake video? Holographic projection? They sound more like Mission Impossible and Star Trek gimmicks than weapons. Yet for each, there are corresponding and growing research efforts as the technologies improve and offensive information warfare expands.
Whereas early voice morphing required cutting and pasting speech to put letters or words together to make a composite, Papcun's software developed at Los Alamos can far more accurately replicate the way one actually speaks. Eliminated are the robotic intonations.
The irony is that after Papcun finished his speech cloning research, there were no takers in the military. Luckily for him, Hollywood is interested: The promise of creating a virtual Clark Gable is mightier than the sword.
Video and photo manipulation has already raised profound questions of authenticity for the journalistic world. With audio joining the mix, it is not only journalists but also privacy advocates and the conspiracy-minded who will no doubt ponder the worrisome mischief that lurks in the not too distant future.
"We already know that seeing isn't necessarily believing," says Dan Kuehl, "now I guess hearing isn't either."
*******
'Matador' With a Radio Stops Wired Bull
1965-05-17, New York Times http://select.nytimes.com/gst/abstract.html?res=F20817F9395812738DDDAE0994DD4...
The brave bull bore down on the unarmed "matador" — a scientist who had never faced a fighting bull. But the charging animal's horns never reached the man behind the heavy red cape. Moments before that could happen, Dr. Jose M. R. Delgado, the scientist, pressed a button on a small radio transmitter in his hand, and the bull braked to a halt. Then he pressed another button on the transmitter and the bull obediently turned to the right and trotted away. The bull was obeying commands from his brain that had been called forth by electrical stimulation — by the radio signals — of certain regions in which fine wire electrodes had been painlessly implanted the day before.
[Experiments] have shown, he explained, that "functions traditionally related to the psyche, such as friendliness, pleasure or verbal expression, can be induced, modified and inhibited by direct electrical stimulation of the brain." For example, he has been able to "play" monkeys and cats "like little electronic toys" that yawn, hide, fight, play, mate and go to sleep on command. With humans under treatment for epilepsy, he has increased word output sixfold in one person, has produced severe anxiety in another, and in several others has induced feelings of profound friendliness — all by electrical stimulation of various specific regions of their brains.
"I do not know why more work of this sort isn't done," he remarked recently, "because it is so economical and easy." Monkeys will learn to press a button that sends a stimulus to the brain of an enraged member of the colony and calms it down, indicating that animals can be taught to control one another's behavior.
Note: This article shows mind control was being developed over 40 years ago. Though this technology can be used for good purposes, it also can and secretly has been used to manipulate and control for many years.
*******
Live rats driven by remote control
James Meek, science correspondent
The Guardian,
Thursday May 2 2002
http://www.guardian.co.uk/world/2002/may/02/animalwelfare.highereducation
Scientists have turned living rats into remote-controlled, pleasure-driven robots which can be guided up ladders, through ruins and into minefields at the click of a laptop key.
The project, which is funded by the US military's research arm, Darpa, was partly inspired by the September 11 terrorist attacks on the US, and partly by the earthquake in India last January.
Animals have often been used by humans in combat and in search and rescue, but not under direct computer-to-brain electronic control. The advent of surgically altered roborats marks the crossing of a new boundary in the mechanisation, and potential militarisation, of nature.
Scientists at the State University of New York (Suny) created the roborats by planting electrodes into their brains, a paper in today's edition of the journal Nature reports.
Two electrodes lead to the parts of the rats' brains which normally detect an obstacle against their whiskers. A third plunges into an area of the brain identified as far back as the 1950s as providing the rat with a feeling of pleasure when stimulated.
In 10 sessions the rats learned that if they ran forward and turned left or right on cue, they would be "rewarded" with a buzz of electrically delivered pleasure.
Once trained they would move instantaneously and accurately as directed, for up to an hour at a time. The rats could be steered up ladders, along narrow ledges and down ramps, up trees, and into collapsed piles of concrete rubble.
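Note: The training loop the article describes (cue the whiskers, reward compliance with a burst to the pleasure centre, repeat over sessions) can be sketched in a few lines of code. This is purely an illustration of that cue-and-reward conditioning, not the SUNY team's software; the class name, starting probability and learning rule are all invented for the example.

```python
# Toy model of cue-and-reward steering conditioning (illustrative only).
import random

random.seed(0)  # make the toy run reproducible

class RoboRat:
    """Toy agent: a whisker-cue stimulation suggests a turn; a 'pleasure'
    reward reinforces compliance until cues are followed reliably."""
    def __init__(self):
        self.compliance = 0.1  # probability of following a cue; grows with rewards

    def cue(self, direction):
        """Stimulate the left or right whisker electrode; return the move made."""
        if random.random() < self.compliance:
            move = direction                      # cue followed
        else:
            move = random.choice(["left", "right", "forward"])  # wanders
        if move == direction:
            self.reward()                         # buzz the pleasure centre
        return move

    def reward(self):
        """Reward strengthens the cue-following association."""
        self.compliance = min(1.0, self.compliance + 0.09)

rat = RoboRat()
for session in range(10):          # "In 10 sessions the rats learned..."
    for _ in range(50):
        rat.cue(random.choice(["left", "right"]))
print(f"compliance after training: {rat.compliance:.2f}")
```

After enough rewarded trials the toy agent follows every cue, which is the "move instantaneously and accurately as directed" behaviour the article reports.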
The Suny team suggests roborats fitted with cameras or other sensors could be used as search and rescue aids in natural disasters such as earthquakes, or in mine clearance.
Sanjiv Talwar, lead author of the Nature paper, said not only did the rats wearing electrodes feel no pain, but they were having a good time.
"If the rat moves left or right as commanded, it feels this burst of happiness," he said. "It follows this sort of cue very accurately. They work only for rewards. They love doing it."
The work on guided rats was an offshoot of earlier research which showed that animals wired up to a processor could command a robotic arm by thought alone, a development which could potentially empower paralysed humans.
Asked to speculate on potential military uses for robotic animals, Dr Talwar agreed they could, in theory, be put to some unpleasant uses, such as assassination.
"Is it possible, objectively? I would imagine, if anybody wanted to do something as absurd as that. But yes, surveillance is pretty straightforward, although for these sort of operations you could use robots. You could apply this to birds ... if you could fit birds with sensors and cameras and the like."
Michael Reiss, professor of science education at London's Institute of Education and a leading bioethics thinker, said: "It could be argued that we have, for 10,000 years or more, pushed farm animals around and directed their behaviour, but this clearly involves a degree of control and degree of invasiveness that in most people's eyes is a step change."
Prof Reiss said he was uneasy about humankind "subverting the autonomy" of animals. "There is a part of me that is not entirely happy with the idea of our subverting a sentient animal's own aspirations and wish to lead a life of its own."
Dr Talwar said that perhaps there needed to be a wider ethical debate.
But he argued that the roborat programme was not so far from training dogs. "The only thing different, and perhaps creepy, is that instead of whistling or giving food, you're directly tapping into the brain," he said.
*******
Friday, Apr. 18, 2008
Unleashing the Bugs of War
By Mark Thompson/Washington http://www.time.com/time/nation/article/0,8599,1732226,00.html
The Defense Advanced Research Projects Agency, that secretive band of Pentagon geeks that searches obsessively for the next big thing in the technology of warfare, is 50 years old. To celebrate, DARPA invited Vice President Dick Cheney, a former Defense Secretary well aware of the Agency's capabilities, to help blow out the candles. "This agency brought forth the Saturn 5 rocket, surveillance satellites, the Internet, stealth technology, guided munitions, unmanned aerial vehicles, night vision and the body armor that's in use today," Cheney told 1,700 DARPA workers and friends who gathered at a Washington hotel to mark the occasion. "Thank heaven for DARPA."
Created in the panicky wake of the Soviets' launching of Sputnik, the world's first satellite, DARPA's mission, Cheney said, is "to make sure that America is never again caught off guard." So, the Agency does the basic research that may be decades away from battlefield applications. It doesn't develop new weapons, as much as it pioneers the technologies that will make tomorrow's weapons better.
So what's hot at DARPA right now? Bugs. The creepy, crawly flying kind. The Agency's Microsystems Technology Office is hard at work on HI-MEMS (Hybrid Insect Micro-Electro-Mechanical System), raising real insects filled with electronic circuitry, which could be guided using GPS technology to specific targets via electrical impulses sent to their muscles. These half-bug, half-chip creations — DARPA calls them "insect cyborgs" — would be ideal for surveillance missions, the agency says in a brief description on its website.
Scientist Amit Lal and his team insert mechanical components into baby bugs during "the caterpillar and the pupae stages," which would then allow the adult bugs to be deployed to do the Pentagon's bidding. "The HI-MEMS program is aimed at developing tightly coupled machine-insect interfaces by placing micro-mechanical systems inside the insects during the early stages of metamorphosis," DARPA says. "Since a majority of the tissue development in insects occurs in the later stages of metamorphosis, the renewed tissue growth around the MEMS will tend to heal, and form a reliable and stable tissue-machine interface." Such bugs "could carry one or more sensors, such as a microphone or a gas sensor, to relay back information gathered from the target destination."
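Note: The guidance scheme DARPA describes (GPS position in, electrical impulse to a flight muscle out) amounts to a simple closed-loop steering rule. The sketch below is a hypothetical illustration of such a loop; the function name, tolerance and left/right convention are invented for the example, not taken from the HI-MEMS program.

```python
# Toy GPS waypoint-following rule for an "insect cyborg" (illustrative only).
from math import atan2, degrees, hypot

def steer_toward(insect_pos, insect_heading, target, tolerance=10.0):
    """Decide which muscle to pulse ('left', 'right', or None) to bring the
    insect's heading onto the bearing to the GPS target.

    Positions are (x, y); heading and bearing are in degrees."""
    dx, dy = target[0] - insect_pos[0], target[1] - insect_pos[1]
    if hypot(dx, dy) < 1.0:
        return None                      # arrived at the target; stop stimulating
    bearing = degrees(atan2(dy, dx))
    # Wrap the heading error into the range -180..180 degrees
    error = (bearing - insect_heading + 180) % 360 - 180
    if abs(error) <= tolerance:
        return None                      # close enough; fly straight
    return "left" if error > 0 else "right"
```

Each control tick reads the insect's position, computes the bearing to the next waypoint, and fires the muscle on whichever side closes the heading error.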
DARPA declined TIME's request to interview Dr. Lal about his program and the progress he is making in producing the bugs. The agency added that there is no timetable for turning backyard pests into battlefield assets. But in a written statement, spokeswoman Jan Walker said that "living, adult-stage insects have emerged with the embedded systems intact." Presumably, enemy arsenals will soon be well-stocked with Raid.
*******
Flying Spy Drones Now the Size of Insects; Unmanned Vehicles Mimic Insects
By David A. Fulghum/Nashua
Posted On: January 16th, 2009
http://republicbroadcasting.org/index.php?cmd=news.article&articleID=3069
Gaggles of mechanical grasshoppers, flies, bees and spiders — each a relatively dumb creature — can be networked into very smart systems to conduct intelligence, surveillance and reconnaissance.
In the last decade, remote sensor arrays have been changing from somewhat obvious, hard-to-mask, mechanical objects to autonomous, self-propelled, insect-like devices that can climb walls or jump up stairs and then lie dormant until motion, noise or vibrations trigger their activation.
The grandchildren and great-grandchildren of "WolfPack"--a coffee-can size, air-dropped network of ground sensors--include fast-moving spiders, high-jumping grasshoppers, bees with detachable surveillance payloads and sensor-equipped dragonflies.
Development of BAE Systems' WolfPack worked out the dynamics of connecting a series of low-cost, not-so-smart sensors to create a very smart network. That network could, for example, monitor and analyze nearby communications and map the information flow. It then could trigger electronic jamming or even the injection of a data stream of algorithms that captures low-power traffic, attacks communications protocol stacks and otherwise manipulates a foe's flow of information. A second-generation WolfPack added a propulsion system to manipulate the modules and recharge the batteries.
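Note: The core WolfPack idea, fusing many low-cost "not-so-smart" sensors into one smart network decision, can be illustrated with a toy quorum scheme. Everything below (the class, field names and quorum rule) is invented for illustration and is not BAE Systems' design.

```python
# Toy sensor-fusion network: dumb one-bit nodes, smart collective decision.
from dataclasses import dataclass, field

@dataclass
class Sensor:
    node_id: str
    position: tuple                  # (x, y) drop coordinates
    heard_signal: bool = False       # each node alone reports just one bit

def network_decision(sensors, quorum=3):
    """Fuse single-bit reports: if enough nodes hear the same emitter,
    the network flags it for direction-finding or jamming."""
    reports = [s for s in sensors if s.heard_signal]
    return {
        "detections": len(reports),
        "trigger_jamming": len(reports) >= quorum,
        "nodes": [s.node_id for s in reports],
    }

pack = [Sensor(f"wp{i}", (i * 10, 0)) for i in range(6)]
for s in pack[:4]:
    s.heard_signal = True            # four nodes pick up the same transmission
decision = network_decision(pack)
```

No single node knows anything useful on its own; the intelligence lives in the aggregation step, which is the design point the WolfPack work demonstrated.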
"Advanced communications can move intelligence, surveillance and reconnaissance (ISR) around the battlefield in real time," says Lt. Gen. Dave Deptula, the U.S. Air Force's Deputy Chief of Staff for ISR. "These ISR sensors now are transformed into the nodes of a truly global, net-centric weapons system." [Read entire article at: http://republicbroadcasting.org/index.php?cmd=news.article&articleID=3069]
*******
A remote control that controls humans
Headset sends electricity through head, forcing wearer to move
By Yuri Kageyama
The Associated Press
updated 5:47 p.m. ET, Tues., Oct. 25, 2005 http://www.msnbc.msn.com/id/9816703/wid/11915829
ATSUGI, Japan - We wield remote controls to turn things on and off, make them advance, make them halt. Ground-bound pilots use remotes to fly drone airplanes, soldiers to maneuver battlefield robots.
But manipulating humans?
Prepare to be remotely controlled. I was.
Just imagine being rendered the rough equivalent of a radio-controlled toy car. Nippon Telegraph & Telephone Corp., Japan's top telephone company, says it is developing the technology to perhaps make video games more realistic. But more sinister applications also come to mind.
I can envision it being added to militaries' arsenals of so-called "non-lethal" weapons.
A special headset was placed on my cranium by my hosts during a recent demonstration at an NTT research center. It sent a very low voltage electric current from the back of my ears through my head — either from left to right or right to left, depending on which way the joystick on a remote-control was moved.
I found the experience unnerving and exhausting: I sought to step straight ahead but kept careening from side to side. Those alternating currents literally threw me off.
The technology is called galvanic vestibular stimulation — essentially, electricity messes with the delicate nerves inside the ear that help maintain balance.
I felt a mysterious, irresistible urge to start walking to the right whenever the researcher turned the switch to the right. I was convinced — mistakenly — that this was the only way to maintain my balance.
The phenomenon is painless but dramatic. Your feet start to move before you know it. I could even remote-control myself by taking the switch into my own hands.
There's no proven-beyond-a-doubt explanation yet as to why people start veering when electricity hits their ear. But NTT researchers say they were able to make a person walk along a route in the shape of a giant pretzel using this technique.
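Note: The steering effect the reporter describes (current toward one ear pulls the walker that way; switch off, walk straight) can be modelled as a toy path integration. The gain value and update rule below are invented for illustration and are not NTT's.

```python
# Toy model of galvanic vestibular steering: signed current bends the walk path.
from math import cos, sin, radians

def walk(route_currents, step=1.0, gain=30.0):
    """Integrate a walker's path. Each element of route_currents is the signed
    current for one step (+ right, - left, 0 off); the heading drifts toward
    the stimulated side by gain degrees per unit current."""
    x = y = 0.0
    heading = 90.0                   # degrees; walker starts facing "ahead" (+y)
    path = [(x, y)]
    for current in route_currents:
        heading -= gain * current    # positive current pulls the walker right
        x += step * cos(radians(heading))
        y += step * sin(radians(heading))
        path.append((round(x, 2), round(y, 2)))
    return path

# Switch off: the subject walks straight. Steady right-hand current: the path
# bends into a curve -- stringing such curves together traces a pretzel route.
straight = walk([0, 0, 0])
curved = walk([1, 1, 1])
```

Varying the current sign and duration over time is all a controller would need to walk a subject along an arbitrary planned route, which matches the pretzel-path demonstration described above.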
It's a mesmerizing sensation similar to being drunk or melting into sleep under the influence of anesthesia. But it's more definitive, as though an invisible hand were reaching inside your brain.
NTT says the feature may be used in video games and amusement park rides, although there are no plans so far for a commercial product.
Some people really enjoy the experience, researchers said while acknowledging that others feel uncomfortable.
I watched a simple racing-car game demonstration on a large screen while wearing a device programmed to synchronize the curves with galvanic vestibular stimulation. It accentuated the swaying as an imaginary racing car zipped through a virtual course, making me wobbly.
Another program had the electric current timed to music. My head was pulsating against my will, getting jerked around on my neck. I became so dizzy I could barely stand. I had to turn it off.
NTT researchers suggested this may be a reflection of my lack of musical abilities. People in tune with freely expressing themselves love the sensation, they said.
"We call this a virtual dance experience although some people have mentioned it's more like a virtual drug experience," said Taro Maeda, senior research scientist at NTT. "I'm really hopeful Apple Computer will be interested in this technology to offer it in their iPod."
Research on using electricity to affect human balance has been going on around the world for some time.
James Collins, professor of biomedical engineering at Boston University, has studied using the technology to prevent the elderly from falling and to help people with an impaired sense of balance. But he also believes the effect is suited for games and other entertainment.
"I suspect they'll probably get a kick out of the illusions that can be created to give them a more total immersion experience as part of virtual reality," Collins said.
The very low level of electricity required for the effect is unlikely to cause any health damage, Collins said. Still, NTT required me to sign a consent form, saying I was trying the device at my own risk.
And risk definitely comes to mind when playing around with this technology.
Timothy Hullar, assistant professor at the Washington University School of Medicine in St. Louis, Mo., believes finding the right way to deliver an electromagnetic field to the ear at a distance could turn the technology into a weapon for situations where "killing isn't the best solution."
"This would be the most logical situation for a nonlethal weapon that presumably would make your opponent dizzy," he said via e-mail. "If you find just the right frequency, energy, duration of application, you would hope to find something that doesn't permanently injure someone but would allow you to make someone temporarily off-balance."
Indeed, a small defense contractor in Texas, Invocon Inc., is exploring whether precisely tuned electromagnetic pulses could be safely fired into people's ears to temporarily subdue them.
NTT has friendlier uses in mind.
If the sensation of movement can be captured for playback, then people can better understand what a ballet dancer or an Olympic gymnast is doing, and that could come in handy in teaching such skills.
And it may also help people dodge oncoming cars or direct a rescue worker in a dark tunnel, NTT researchers say. They maintain that the point is not to control people against their will.
If you're determined to fight the suggestive orders from the electric currents by clinging to a fence or just lying on your back, you simply won't move.
But from my experience, if the currents persist, you'd probably be persuaded to follow their orders. And I didn't like that sensation. At all.
*******