Monday, July 09, 2012

Is This the Future? (Part 3)

*******
Is this a mosquito? No.
It's an insect-sized spy drone for urban areas, already in production and funded by the US government. It can be remotely controlled and is equipped with a camera and a microphone.
It can land on you, and it may have the potential to take a DNA sample or leave RFID tracking nanotechnology on your skin.
It can be flown through an open window or door and attach itself to your clothing.
*******
Robot avatar body controlled by thought alone
by Helen Thomson
06 July 2012
Magazine issue 2872
For the first time, a person lying in an fMRI machine has controlled a robot just by thinking about moving.
*******
IMAGINE trekking across the Sahara, popping in to see a friend in France, and admiring Niagara Falls, all without moving from your home. And what if you could do this despite being unable to move your limbs? For the first time, researchers have used fMRI - which detects your brain activity in real time - to allow someone to embody a robot hundreds of kilometres away using thought alone.
"The ultimate goal is to create a surrogate, like in Avatar, although that's a long way off yet," says Abderrahmane Kheddar, director of the CNRS-AIST joint robotics laboratory at the National Institute of Advanced Industrial Science and Technology in Tsukuba, Japan. He is part of an international team that hopes to use this kind of technology to give healthy people and those who are "locked in" - unable to move but fully conscious - the ability to interact with the world using a surrogate body.
Teleoperated robots, those that can be remotely controlled by a human, have been around for decades. Kheddar and his colleagues are going a step further. "True embodiment goes far beyond classical telepresence, by making you feel that the thing you are embodying is part of you," says Kheddar. "This is the feeling we want to reach."
To attempt this feat, researchers with the international Virtual Embodiment and Robotic Re-embodiment project used fMRI to scan the brain of university student Tirosh Shapira as he imagined moving different parts of his body.
Ori Cohen and Doron Friedman from the Advanced Virtuality Lab at the Interdisciplinary Center in Herzliya, Israel, and colleagues first took Shapira through several training stages in which he attempted to direct a virtual avatar by thinking of moving his left or right hand or his legs. The scanner works by measuring changes in blood flow to the brain's primary motor cortex, and from these readings the team was able to create an algorithm that could distinguish between each imagined movement. The commands were then sent via an internet connection to a small robot at the Béziers Technology Institute in France.
The set-up allowed Shapira to control the robot in near real time with his thoughts, while a camera on the robot's head let him see from the robot's perspective. When he thought of moving his left or right hand, the robot turned 30 degrees to the left or right. Imagining moving his legs made the robot walk forward.
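The article describes this pipeline only in outline, but the basic loop is easy to picture: classify which movement is being imagined from motor-cortex activity, then forward the matching command to the robot over the network. Here is a minimal Python sketch of that idea; the function names, threshold, command strings and network address are all invented for illustration and are not the team's actual code.

```python
import socket

# Hypothetical decoding loop; none of these names come from the study.
# Assume get_roi_activation() returns the current fMRI signal level for
# three motor-cortex regions of interest, keyed by the imagined movement.

COMMANDS = {
    "left_hand": b"TURN_LEFT_30",    # imagined left-hand movement
    "right_hand": b"TURN_RIGHT_30",  # imagined right-hand movement
    "legs": b"WALK_FORWARD",         # imagined leg movement
}

def classify(activation):
    """Pick the most active region, or None if nothing stands out."""
    region, level = max(activation.items(), key=lambda kv: kv[1])
    return region if level > 1.5 else None  # arbitrary z-score cut-off

def decode_and_send(get_roi_activation, host="robot.example.org", port=9000):
    """Stream one command to the remote robot per fMRI volume."""
    with socket.create_connection((host, port)) as link:
        while True:
            region = classify(get_roi_activation())  # one volume every ~2 s
            if region is not None:
                link.sendall(COMMANDS[region] + b"\n")
```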
It takes a little while for the robot to register the thought. "There is a small delay between the start of the neural activity and when we can optimally classify a volunteer's intentions," says Cohen. But he says that subjects can adjust for this by thinking of the intended movement ahead of time.
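That delay is partly a property of fMRI itself: the blood-flow response it measures peaks several seconds after the neural activity that drives it. One common way to handle this when training a decoder, offered here purely as an assumption rather than a description of the team's method, is to shift the training labels by a few scan volumes so that each brain image is paired with the cue that actually produced it:

```python
import numpy as np

def align_labels(volumes, cue_labels, lag_volumes=3):
    """Pair each fMRI volume with the cue shown `lag_volumes` scans earlier.

    With a repetition time of ~2 s, a lag of 3 volumes roughly matches the
    ~6 s peak of the haemodynamic response (an illustrative value only;
    lag_volumes must be at least 1).
    """
    aligned_volumes = volumes[lag_volumes:]
    aligned_labels = cue_labels[:-lag_volumes]
    return np.asarray(aligned_volumes), np.asarray(aligned_labels)
```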
Shapira took part in three trials, including one in which he was able to move the robot around freely and another in which he was instructed to follow a person around a room at the French lab. In the third trial he successfully piloted his avatar to locate a teapot placed somewhere in the room. To test the extent of his feelings of embodiment, the researchers also surprised him with a mirror (see "On the inside, looking out"). "I really felt like I was there," Shapira says. "At one point the connection failed. One of the researchers picked the robot up to see what the problem was and I was like, 'Oi, put me down!'"
The brain is very easily fooled into incorporating an external entity as its own. Over a decade ago, psychologists discovered that they could convince people that a rubber hand was their own just by putting it on a table in front of them and stroking it in the same way as their real hand. "We're looking at what kinds of sensory illusions we can incorporate at the next stage to increase this sense of embodiment," says Kheddar. One such illusion might involve stimulating muscles to create the sensation of movement (see "Feeling is believing").
The next step is to improve the surrogate. Replacing the current robot with the HRP-4, made by Kawada Industries in Japan, will increase the feeling of embodiment as it is roughly the height of an adult human and has a more stable and dynamic walk, says Kheddar.
The researchers are also fine-tuning their algorithm to look for patterns of brain activity, rather than simply areas that are active. This will allow each thought process to control a greater range of movements. "For example, you could think of moving your fingers at different speeds and we could correspond that with different speeds of walking or turning," says Cohen, who presented the results of the embodiment trials at BioRob 2012 in Rome, Italy, last week.
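Decoding patterns of activity rather than single active areas is what neuroimaging researchers usually call multi-voxel pattern analysis. As a rough sketch of the idea, with the classifier choice, features and speed mapping all invented for illustration, one could train a standard linear classifier on whole-region voxel patterns and map each decoded class to a walking speed:

```python
import numpy as np
from sklearn.svm import LinearSVC

# Hypothetical illustration of pattern-based decoding (MVPA).
# X: one row per fMRI volume, one column per voxel in the motor region.
# y: labels such as "fingers_slow" or "fingers_fast" from training cues.

SPEEDS = {"fingers_slow": 0.2, "fingers_fast": 0.8}  # metres per second

def train_decoder(X, y):
    """Fit a linear classifier on voxel patterns, not mean activation."""
    return LinearSVC().fit(X, y)

def decoded_speed(decoder, volume):
    """Map the decoded thought to a walking speed for the robot."""
    label = decoder.predict(volume.reshape(1, -1))[0]
    return SPEEDS.get(label, 0.0)

# Example with random stand-in data:
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 500))                # 40 volumes, 500 voxels
y = np.array(["fingers_slow", "fingers_fast"] * 20)
decoder = train_decoder(X, y)
print(decoded_speed(decoder, X[0]))
```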
So far, only healthy people have embodied the surrogate. Next, the researchers, along with Rafael Malach's group at the Weizmann Institute of Science, in Rehovot, Israel, hope to collaborate with groups such as Adrian Owen's at the University of Western Ontario in Canada to test their surrogate on people who are paralysed or locked in.
"I think it is very impressive and in the broadest sense reflects where it is that we are trying to get to in enabling communication in patients who are deemed to be locked in or even vegetative," says Owen. He cautions, though, that there is a long way to go before the technology is able to provide long-term help for patients.
Electroencephalogram (EEG) technology, which uses electrodes attached to the scalp to record electrical activity in the brain, is likely to prove more practical than fMRI, he says, since it is cheaper and more comfortable to use for extended periods of time. Although EEG has been used to control robots, the readings are not yet as clear as those from fMRI. Nevertheless, this demonstration is "an interesting glimpse of what might be possible in the future", Owen says.
If such surrogates were to enter the world, there are some challenges to consider. If it is not immediately clear who is controlling the robot, it could be extremely frustrating for someone interacting with it, says Katherine Tsui of the University of Massachusetts Lowell, who has studied the impact that telepresence robots have in the workplace. "It's a bit like when you share your Skype video feed with someone, but they don't share theirs with you," she says.
Legal issues could also arise. If a surrogate were kicked in the street and broken, it's the same as damaging someone's property, says Noel Sharkey, a roboticist at the University of Sheffield in the UK. "However, if there was a way of the surrogate passing on the sense of touch to the person controlling it, that's a different story."
For now, the team are hoping to expand the surrogate's range of abilities. Shapira is keen to get back into the scanner to help: "Once they add in the ability to hear people talk and speak back to them, then it gets to a whole new level of interesting."
Feeling is believing
To create a real sense of embodiment in your robot avatar, it would help if it felt like your limbs were moving when the robot is walking. Massimo Bergamasco, Antonio Frisoli and their team at the Perceptual Robotics Laboratory in Pisa, Italy, are investigating how to do this using a well-known sensory illusion in which vibrations applied to tendons evoke the sensation of movement.
The trick works by stimulating sensory receptors called muscle spindles in the muscle associated with the tendon, because the brain interprets signals from the receptors as movement in the associated joint. "If you keep your eyes closed, you get the impression that your hand is moving up or down depending on which muscles are being vibrated," says Daniele Leonardis, who presented the work at BioRob 2012 in Rome, Italy, last week.
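Tying that illusion to a moving surrogate would mean deciding, for every joint motion the robot makes, which tendon to vibrate and how strongly. The sketch below is a guess at what such a mapping could look like; the frequency constant and the command format are assumptions, not details from the Pisa group's work:

```python
# Illustrative mapping from a surrogate's joint motion to tendon vibration.
# Vibrating a tendon signals stretch in its muscle, which the brain reads
# as the joint rotating away from that muscle; a frequency somewhere around
# 80 Hz is assumed here purely for illustration.

VIBRATION_HZ = 80.0

def vibration_command(joint, velocity):
    """Return (tendon, frequency_hz, amplitude) for one joint update.

    For a felt flexion (positive velocity) we vibrate the extensor tendon,
    so the illusory movement matches the surrogate's actual movement.
    """
    if abs(velocity) < 1e-3:
        return (None, 0.0, 0.0)              # joint at rest: no illusion
    tendon = f"{joint}_extensor" if velocity > 0 else f"{joint}_flexor"
    amplitude = min(1.0, abs(velocity))      # scale with joint speed
    return (tendon, VIBRATION_HZ, amplitude)

print(vibration_command("wrist", 0.4))   # ('wrist_extensor', 80.0, 0.4)
```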
Next year, the researchers hope to combine their work with Ori Cohen's robotic embodiment technology (see main story) so that when a surrogate moves a limb, the pilot feels like their own limb is moving too.
On the inside, looking out
Tirosh Shapira stepped into an fMRI scanner in Israel and took on the guise of a little robot in France. He was one of the first people to embody a surrogate robot using this particular method of mind-reading (see main story). So what did it feel like?
"It's amazingly engaging," he says. "Even in the training phase where you get a kind of virtual avatar and you learn to move it around using your thoughts, you get loads of enthusiasm for the whole process."
Once you start controlling the robot, it gets much better. "It was mind-blowing. I really felt like I was there, moving around," Shapira says. It's not an easy job, though: "You need to concentrate, and you have to calculate a few steps in advance because there's a small delay between thinking of a movement and it actually happening. But once you get used to it you feel like a puppet master."
To produce a left turn, a right turn or forward walking, Shapira found it helpful to think about very specific actions, which made it easier for the computer to recognise the activated areas of his brain. "I imagined turning the knob of a faucet with my right hand and a gear with my left hand. It worked best when I thought about everything in really vivid detail, like what the faucet felt like to touch."
While Shapira was controlling the robot, the French team surprised him with one last trick. "I turned around and they'd put a mirror in front of me," he says. He caught his first glimpse of his reflection. "I thought, 'oh I'm so cute, I have blue eyes', not 'that robot is cute'. It was amazing."
*******
Pollen-coated sticky bullets track a gunman's DNA
by Paul Marks
20 June 2012
Magazine issue 2869
Flowers and guns might conjure up images of flower power, but coating bullet casings in lily pollen could help forensic teams identify a gunman.
It is difficult to get useful DNA evidence from a spent bullet casing: copper and zinc ions from the brass alloy react with sweat to break down DNA, destroying evidence about who may have loaded a gun.
To combat this, Paul Sermon, a nanomaterials engineer at Brunel University in London, is leading a government-funded team to develop forensic coatings for brass bullet cartridges. The initial idea was to coat a bullet with a biochemical that stuck to the hands of those who touched it, allowing police to test the hands of suspects. Then they hit on a technique that could also stash away skin cells from that person. "We've combined these to increase the probability of obtaining useful associative evidence," says Sermon.
The challenge they faced was twofold: the coatings had to be compatible with the way bullets are made and cope with the heat generated when a bullet is fired. After years of experiments they say they have hit on a promising coating - and a way to apply it. "It's as simple as dunking a biscuit in a cup of tea," says Sermon.
Their first trick is to roughen the surface of the cartridge by dipping it in a solution of aluminium oxide and urea. When dry, this leaves an abrasive, ridged surface on which far more skin cells can be captured, increasing the chances that usable DNA can be recovered. In tests with a 9 millimetre Browning pistol, the team found that 53 per cent more viable DNA could be harvested from the treated cartridges than from normal ones (Forensic Science International, DOI: 10.1016/j.forsciint.2012.04.021).
To label the hands of anyone who touches the bullet, they took the sticky pollen grains from the Easter lily, Lilium longiflorum, and coated them in titanium dioxide before dropping them in liquid plastic. This solution was used to coat the bottom of the bullet casing. While the pollen is not uncommon, and TiO2 is found in paints and sun lotions, together they form a unique tag, says Sermon.
"It's a fascinating development we'll watch with interest," says Mike Sweeney of BAE Systems, which makes ammunition for the British army.
*******
Mind control could be future of warfare
by Andy Coghlan
07 February 2012
Magazine issue 2851
Wars of the future might be decided through manipulation of people's minds, concludes a report this week from the UK's Royal Society. It warns that the potential military applications of neuroscience breakthroughs need to be regulated more closely.
"New imaging technology will allow new targets in the brain to be identified, and while some will be vital for medicine, others might be used to incapacitate people," says Rod Flower of Queen Mary, University of London, who chairs the panel that wrote the report.
The report describes how such technology is allowing organisations like the US Defense Advanced Research Projects Agency to test ways of improving soldiers' mental alertness and capabilities. It may also allow soldiers to operate weaponry remotely through mind-machine interfaces, the report says.
Other research could be used to design gases and electronics that temporarily disable enemy forces. This potentially violates human rights by interfering with thought processes, and opens up the threat of indiscriminate killing. The panel points to the 2002 siege in a Moscow theatre, which Russian security forces ended by filling the venue with fentanyl, an anaesthetic gas. Along with the perpetrators, 125 hostages died.
The Chemical Weapons Convention is vague about whether such incapacitants are legal. Ambiguities like this must be ironed out, say the panellists.
*******
Also See:
Science Fiction or Future Reality?
01 November 2008
and
A New Paradigm - Integration of Human-Animal-Machine!
11 August 2010
and
What do You Know about the Post American World?
11 September 2010
and
U.S. Lust For Technology Killing Globally
14 September 2010
and
What do you Know about New Technologies?
23 February 2011
and
Is This the Future? (Part 1)
16 April 2011
and
Is This the Future? (Part 2)
21 September 2011
*******