robotics
Living With Robots
On 22, Jan 2010 | No Comments | In robotics | By Dave
I’m very excited for this film. Asimo looks fantastic in it!
I think we’re on the verge of seeing a lot more examination of the relationship between humans and robots.
Fingerprint ridge width is coupled to Pacinian resonance
On 05, Jan 2010 | No Comments | In perception, physiology, robotics | By Dave
French scientist Georges Debregeas has published a finding that the width of the ridges of our fingerprints just happens to be optimized for maximally vibrating our nerve endings:
The latest evidence suggests that fingerprints process vibrations in the skin to make them easier for nerves to pick up. They may seem little more than digital decoration, but researchers in biomechanics have long known that fingerprints have at least one use: they increase friction, thereby improving grip…
In fact the role that fingerprints play in touch is far more important and subtle than anyone imagined.
…Biologists have known for some time that Pacinian corpuscles are most sensitive to vibrations at 250Hz. So how do fingers generate this kind of vibration? Biologists have always assumed that humans can control the frequency of vibrations in the skin by changing the speed at which a finger moves across a surface. But there’s little evidence that people actually do this and the Paris team’s discovery should make this view obsolete.
…They say that fingerprints resonate at certain frequencies and so tend to filter mechanical vibrations. It turns out that their resonant frequency is around 250 Hz. What an astonishing coincidence!
That means that fingerprints act like signal processors, conditioning the mechanical vibrations so that the Pacinian corpuscles can best interpret them…
The article also notes that in robotics this is called morphological computation; that is, computation through interactions of physical form.
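The filtering claim is easy to picture as a second-order resonator tuned near 250 Hz. Here is a minimal sketch: the 250 Hz resonance comes from the article, but the Q factor is an invented placeholder, not a measured property of skin.

```python
import math

def resonator_gain(f, f0=250.0, q=5.0):
    """Magnitude response of a second-order resonator with resonance f0.

    f0 = 250 Hz matches the Pacinian sensitivity peak cited above;
    q is a made-up quality factor, not a measured skin parameter.
    """
    r = f / f0
    return 1.0 / math.sqrt((1.0 - r * r) ** 2 + (r / q) ** 2)

# Vibrations near 250 Hz pass through amplified; others are attenuated.
for f in (50, 250, 1000):
    print(f"{f} Hz: gain {resonator_gain(f):.2f}")
```

The "computation" happens before any neuron fires: the ridge geometry itself does the band-pass filtering, which is exactly what morphological computation means.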
Hand prosthesis with sensitive fingertips
On 26, Oct 2009 | No Comments | In robotics, transhumanism | By Dave
The test patient underwent a complicated, experimental surgical procedure to wire the nerve endings in his stump to an electronic interface. His personal risk will advance science and potentially help millions of people. Thank you, Robin af Ekenstam.
In the next version I hope they make the Smart Hand’s fingertips get a little bit more sensitive after you clip its fingernails.
(via Engadget)
Telepresence: a good excuse to stay on Earth?
On 05, Sep 2009 | 3 Comments | In robotics, tactility | By David Birnbaum
In the New York Times, an article that cites the beautiful dream of telepresence to squash the equally beautiful dream of space colonization. Lame.
Printed strain sensors = "sense of touch"
On 14, Aug 2009 | No Comments | In robotics, tactility | By David Birnbaum
“One component in this tactile capability is a strain gauge,” says Marcus Maiwald…“If the robot encounters an obstacle,” he explains, “the strain gauge is distorted and the electrical resistance changes. The special feature of our strain gauge is that it is not glued but printed on – which means we can apply the sensor to curved surfaces of the robot.”
The sensor system on this robot is not all that complex; strain gauges are literally a dime a dozen (or less). But the configuration of the sensors reminds us of an animal body, and that’s what intrigues us. Since the strain sensors are printed along the surface of the robot in a continuous way (rather than being attached at some specific point), we’re reminded of how touch receptors are embedded throughout the skin, bringing to mind the phrase “sense of touch.” The Roomba has a mechanical sensor that is technically similar to the ones in this new robot, but we don’t talk about the Roomba having a sense of touch because the sensor is in a discrete place. To have a sense of touch you need to be able to sense contact (almost) anywhere on the surface of the body.
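A toy sketch of why the distributed configuration matters, with every number invented: treat each printed gauge as a resistance reading, and contact anywhere along the shell shows up as a deviation from baseline at some gauge.

```python
# Hypothetical readings from strain gauges printed along the robot's shell.
BASELINE = 120.0   # nominal gauge resistance in ohms (a common value)
THRESHOLD = 0.5    # deviation in ohms treated as contact

def contact_points(readings):
    """Return indices of gauges whose resistance shifted enough to imply contact."""
    return [i for i, r in enumerate(readings)
            if abs(r - BASELINE) > THRESHOLD]

readings = [120.0, 120.1, 123.4, 120.0, 119.2]
print(contact_points(readings))  # gauges 2 and 4 have been strained
```

With gauges only at discrete bumper points, as on the Roomba, the same code would still run, but most of the body would simply be blind.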
Hand-observing robot understands human goals
On 10, Jun 2009 | No Comments | In cognition, robotics | By David Birnbaum
By observing how its human partner grasped a tool or model part, for example, the robot was able to predict how its partner intended to use it. Clues like these helped the robot to anticipate what its partner might need next. “Anticipation permits fluid interaction,” says Erlhagen. “The robot does not have to see the outcome of the action before it is able to select the next item.”
Maybe she was born with it; maybe it’s vibe power
On 17, Jun 2008 | No Comments | In robotics | By David Birnbaum
Estée Lauder and Lancôme will soon release vibrating mascaras that promise to coat eyelashes with black stuff in a way human hands could never do alone.
Robotic pre-touch
On 14, Jun 2008 | No Comments | In robotics | By David Birnbaum
Check out Intel’s prehensile manipulator with pre-touch sensing:
Whiskers as haptic sensor arrays
On 26, Feb 2008 | No Comments | In cognition, neuroscience, physiology, robotics | By David Birnbaum
Whiskers provide animals with complex perceptual content. In fact, all the things that whiskers actually do are fascinating.
The dimensionality of the data can be modeled according to how an animal moves them through space:
Rat whiskers move actively in one dimension, rotating at their base in a plane roughly parallel to the ground. When the whiskers hit an object, they can be deflected backwards, upwards or downwards by contact with the object. The mechanical bending of the whisker activates many thousands of sensory receptors located in the follicle at the whisker base. The receptors, in turn, send neural signals to the brain, where a three-dimensional image is presumably generated.

Hartmann and Solomon showed that their robotic whiskers could extract information about object shape by “whisking” (sweeping) the whiskers across a small sculpted head, which was chosen specifically for its complex shape. As the whiskers move across the object, strain gauges sense the bending of the whiskers and thus determine the location of different points on the head. A computer program then “connects the dots” to create a three-dimensional representation of the object.
More on that “three-dimensional image” from the end of the first paragraph — whiskers indeed construct a high resolution spatial map:
Based on discoveries in primates and cats, scientists previously thought that highly refined maps representing the complexities of the external world were the exclusive domain of the visual cortex in mammals. This new map is a miniature schematic, representing the direction a whisker is moved when it brushes against an object.

“This study is a great counter example to the prevailing view that only the visual cortex has beautiful, overlapping, multiplexed maps,” said Christopher Moore, a principal investigator at the McGovern Institute and an assistant professor in the Department of Brain and Cognitive Sciences, where he holds the Mitsui Career Development Chair.
Researchers are now working towards developing code for a whisker-like sensor array to be used for robotics. Could this software have human interface applications as well?
This reminds me of the impressive and thought-provoking Haptic Radar/Extended Skin Project. Although the sensing medium in that case was ultrasound rather than a deformable, physical substrate, and the resolution of the stimulators much lower, the researchers state that they intend to make the system more whisker-like as they develop it.
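For a flavor of how a whisker array might turn strain readings into geometry, here is a toy model. It treats the whisker as a rigid beam with a single point contact, so the base moment M = F·d gives the radial contact distance, and the sweep angle gives direction. The beam simplification and all numbers are mine, not the researchers' actual algorithm.

```python
import math

def contact_point(sweep_deg, moment, force):
    """Contact location in the whisking plane from base strain readings.

    Point-load cantilever approximation: moment = force * distance,
    so distance d = moment / force; the sweep angle gives direction.
    """
    d = moment / force
    a = math.radians(sweep_deg)
    return (d * math.cos(a), d * math.sin(a))

# One simulated whisk past an object: (sweep angle deg, moment N*m, force N).
sweep = [(0, 0.020, 0.5), (15, 0.018, 0.5), (30, 0.015, 0.5)]
outline = [contact_point(*s) for s in sweep]
```

Connecting the resulting points across many whisks is the "connect the dots" step the article describes.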
[via Science Daily]
Compact tactile sensors
On 18, Feb 2008 | No Comments | In physiology, robotics | By David Birnbaum
Man, would I ever love to get ahold of one of these tactile sensors developed for the Shadow Robot Company. Responding to pressure ranging between 0.1 N and 25 N using a quantum tunneling composite material, each sensor has up to 34 individual sensing units, on-board digital signal conditioning, and is the size of a human fingertip. I would bet that this sensor could be used to crudely model the non-Pacinian III (NP III) psychophysical sensory channel, which constructs an acute neural image of skin deformation.
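For anyone wiring up something similar, the signal chain is simple in principle: QTC resistance drops as pressure rises, so each sensing unit reduces to a calibration lookup. A sketch with an entirely made-up calibration table (real values would come from the datasheet), clamped to the sensor's 0.1 N to 25 N range:

```python
# Hypothetical calibration table for one sensing unit: (ADC count, force in N).
CAL = [(50, 0.1), (400, 1.0), (800, 5.0), (1023, 25.0)]

def adc_to_force(adc):
    """Piecewise-linear interpolation over the calibration table,
    clamped to the table's endpoints (0.1 N and 25 N)."""
    if adc <= CAL[0][0]:
        return CAL[0][1]
    for (a0, f0), (a1, f1) in zip(CAL, CAL[1:]):
        if adc <= a1:
            return f0 + (f1 - f0) * (adc - a0) / (a1 - a0)
    return CAL[-1][1]
```

With 34 of these per fingertip, the on-board signal conditioning presumably does something along these lines before handing clean force values to the host.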
A development kit is available for £1450, which includes one sensor and some interfacing materials. That’s a bit out of my range, but if someone from Shadow is reading this, I will give you free publicity if you send me a sample!
(via aiGuru)
From the Painfully Obvious Department of Future Studies
On 16, Feb 2008 | No Comments | In robotics | By David Birnbaum
It seems like artificial intelligence researcher David Levy has been reading some Kurzweil: Are robots the sex partners of the future?
The short answer is yes. The long answer is available in his book, Love and Sex with Robots: The Evolution of Human-Robot Relations.
On Fembots and Gynoids
On 22, Dec 2007 | No Comments | In robotics | By David Birnbaum
With the launch of Sputnik in 1957, the idea that in fifty years the Russians would be using gynoids to seduce men and compromise security might have seemed plausible. Fact:
A program that can mimic online flirtation and then extract personal information from its unsuspecting conversation partners is making the rounds in Russian chat forums, according to security software firm PC Tools.
The artificial intelligence of CyberLover’s automated chats is good enough that victims have a tough time distinguishing the “bot” from a real potential suitor, PC Tools said. The software can work quickly too, establishing up to 10 relationships in 30 minutes, PC Tools said. It compiles a report on every person it meets complete with name, contact information, and photos.
“As a tool that can be used by hackers to conduct identity fraud, CyberLover demonstrates an unprecedented level of social engineering,” PC Tools senior malware analyst Sergei Shevchenko said in a statement.
Among CyberLover’s creepy features is its ability to offer a range of different profiles from “romantic lover” to “sexual predator.” It can also lead victims to a “personal” Web site, which could be used to deliver malware, PC Tools said.
Combine software like this and a (hot) robot hardware platform and you would have the most powerful human intelligence tool in history. Sounds like a job for haptics!
The Pleo snuff tape
On 14, Dec 2007 | No Comments | In robotics | By David Birnbaum
At least one person feels that watching Pleo being tortured can be traumatizing, if not on the same level as watching real animal torture, at least on the same spectrum. I can see where she’s coming from. My own emotional reaction was to first feel disturbed, and then amused at the fact that I was feeling honest disgust, then disturbed again, then more amusement, etc., until the arrival of the final frame, full of bitterness and death; I held my breath for a moment or two, and then burst out laughing.
Problems and prospects for Gibson’s self-tuning guitar
On 11, Dec 2007 | One Comment | In music, robotics | By David Birnbaum
Gibson has announced a guitar with a built-in self-tuning mechanism. Some have suggested that there is a problem with allowing people to skip learning how to tune a guitar before they play it, because tuning helps develop the ear. I think this is a valid concern, and readers of my papers would know I don’t think lowering the entry fee for musical instrumental interaction is, in itself, a “good thing.” At the same time, there are plenty of advantages offered by a self-tuning guitar that have nothing at all to do with ear training, such as avoiding the need to bring a capo to gigs, or to bring more than one guitar to a show for quickly playing two consecutive songs that require drastically different guitar tunings. (Besides, there are plenty of other excellent ways to train your ear.) Quick but accurate tuning changes will also surely be exploited in composition; tuning changes can be done in the middle of a piece, and the musical capabilities and quirkiness of the auto-tuner could even be used for some as-yet-unknown artistic end.
What I find especially interesting is how the words “world’s first robot guitar” are tossed around in this press release. First of all, it seems as if the word “robot” is being used vaguely to refer to the presence of a servo system. If this guitar is robotic, then so is my laptop for its ability to read and eject optical media. I think we’re going to see more of this, similar to the way “net” was overused in the nineties. We are entering a robo-chic era where any product that can possibly justify doing so will be incorporating the word “robot” into its name.
As for the “world’s first” claim, someone should tell Gibson about TransPerformance, a company that has been selling automatic tuner retrofits since 2005, as well as the dozens of other music technology projects that are based on guitar interaction and involve motors. It’s old, but anyone who hasn’t yet seen the League of Musical Urban Robots (LEMUR) video of the LEMUR Guitar Bot should check it out:
It’s easier for me to accept calling the LEMUR Guitar Bot a “robot” than the Gibson self-tuner. What do you think?
Robot sensitive to social touching
On 08, Nov 2007 | No Comments | In robotics | By David Birnbaum
A QRIO, modified to respond to social touching with giggles, attended a kindergarten class and became accepted by the human children:
In the study, QRIO was introduced into a classroom of toddlers aged 18 months to 24 months. Children of this age group were chosen because they have no preconceived notions of robots and they communicate using touch as much as speech.
“The children accepted the presence of QRIO very well,” Movellan told LiveScience. “There were a few children who were very interested but maintained distance. Over time, the relationship between children and QRIO evolved positively.”
In phase I of the experiment, which lasted 27 sessions, QRIO was instructed to interact with the children using its full behavioral repertoire, which included head-turning, dancing and giggling. At first, the children would touch the robot on its face, but as they warmed to him, the majority of their touches were to its hands and arms — a pattern the children also displayed toward each other.
During phase II, which lasted 15 sessions, QRIO ignored the children’s touches and danced throughout the session. “At that point, the [children] quickly lost interest,” Movellan said.
When QRIO’s ability to respond to touch and giggle were returned for three sessions in phase III, the children became friendly with the robot again.
When the robot’s batteries died and it lay on the floor, some of the children cried.
Others put a blanket over him and said, “nigh-nigh.”
And further down in the same article:
The ability to respond to touch is relatively easy to program into robots, Movellan said. “We had things like computer vision in the robot, and touch was the easiest thing,” he said. “And it turned out to be the most important to get things going.”
A lot of energy is focused on developing computer vision, neural networks, and other AI-type technology, which is great. But touch interfaces are inexpensive, relatively easy to engineer, and very effective at engaging human emotion. I think this study implies that, for the purposes of social acceptance, touch may be the most important human-robot interaction mode.
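To Movellan’s point that touch was “the easiest thing”: at its simplest, the whole interaction model can be a lookup from touch location to social behavior. A toy sketch, with sensor names and responses invented by me:

```python
# Map touch locations to social responses, QRIO-style giggle included.
RESPONSES = {
    "head": "giggle",
    "hand": "giggle and squeeze back",
    "arm": "turn toward the touch",
}

def respond(location):
    """Pick a social behavior for a touch event; unknown spots are ignored."""
    return RESPONSES.get(location, "ignore")

for event in ("hand", "head", "foot"):
    print(event, "->", respond(event))
```

Compare that to the training pipelines behind computer vision, and the cost-effectiveness argument for touch makes itself.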
Green groper
On 12, Oct 2007 | No Comments | In robotics | By David Birnbaum
U-Tsu-Shi-O-Mi combines a head mounted display and a robot to create a tangible avatar:
I like this idea. Since light, inexpensive, portable force-reflecting handheld interfaces are a long way off, why not couple physical objects with virtual vision technology to touch virtual objects? Moreover, using ocular displays, I could imagine a similar system for changing the appearance of our robotic companions on the fly, forgoing the need, for instance, to mechanize their facial expressions.
(via Gizmodo, Pink Tentacle, Robot Watch)
Beatbots: Socially rhythmic robots
On 23, Mar 2007 | No Comments | In robotics | By David Birnbaum
OMGHI2Beatbots!
(via Joe Malloch.)