Brilliant Italian scientists successfully recombine work and pleasure
On 25, Aug 2011 | No Comments | In music, neuroscience, perception | By Dave
A study provides evidence that talking into a person’s right ear can affect behavior more effectively than talking into the left.
One of the best known asymmetries in humans is the right ear dominance for listening to verbal stimuli, which is believed to reflect the brain’s left hemisphere superiority for processing verbal information.
I strongly prefer my left ear for phone calls. So much so that I have trouble understanding people on the phone when I use my right ear. Should I be concerned that my brain seems to be inverted?
Read on and it becomes clear that going beyond perceptual psychology, the scientists are terrifically shrewd:
Tommasi and Marzoli’s three studies specifically observed ear preference during social interactions in noisy night club environments. In the first study, 286 clubbers were observed while they were talking, with loud music in the background. In total, 72 percent of interactions occurred on the right side of the listener. These results are consistent with the right ear preference found in both laboratory studies and questionnaires and they demonstrate that the side bias is spontaneously displayed outside the laboratory.
In the second study, the researchers approached 160 clubbers, mumbled an inaudible, meaningless utterance, and waited for the subjects to turn their head and offer either their left or their right ear. They then asked them for a cigarette. Overall, 58 percent offered their right ear for listening and 42 percent their left. Only women showed a consistent right-ear preference. In this study, there was no link between the number of cigarettes obtained and the ear receiving the request.
In the third study, the researchers intentionally addressed 176 clubbers in either their right or their left ear when asking for a cigarette. They obtained significantly more cigarettes when they spoke to the clubbers’ right ear compared with their left.
I’m picturing the scientists using their grant money to pay cover at dance clubs and try to obtain as many cigarettes as possible – carefully collecting, then smoking, their data – with the added bonus that their experiment happens to require striking up conversation with clubbers of the opposite sex who are dancing alone. One assumes that, if the test subject happened to be attractive, once the cigarette was obtained (or not) the subject was invited out onto the terrace so the scientist could explain the experiment and his interesting line of work. Well played!
How touchscreen buttons “should” feel
On 09, Aug 2011 | No Comments | In perception | By Dave
Researchers at the University of Tampere in Finland found that:
Interfaces that vibrate soon after we click a virtual button (on the order of tens of milliseconds) and whose vibrations have short durations are preferred. This combination simulates a button with a “light touch” – one that depresses right after we touch it and offers little resistance.
Users also liked virtual buttons that vibrated after a longer delay and then for a longer subsequent duration. These buttons behaved like ones that require more force to depress.
This is very interesting. When we think of multimodal feedback needing to make cognitive sense, synchronization first comes to mind. But there are many more synesthesias in our experience that can only be uncovered through careful reflection. To make an interface feel real, we must first examine reality.
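The finding boils down to two parameters per virtual button: how long the actuator waits after touch-down, and how long the vibration burst lasts. Here is a minimal sketch, in Python, of how a UI layer might encode the two "feels" described above. The class, the callback, and the millisecond values are my own assumptions for illustration, not the researchers' code.

```python
# A minimal sketch (not the Tampere group's code) of encoding button "feel"
# as an (onset delay, burst duration) pair for a vibration actuator.
# The numbers are assumptions matching the study's rough magnitudes
# ("on the order of tens of milliseconds").

from dataclasses import dataclass

@dataclass
class TactileClick:
    delay_ms: float     # time between touch-down and vibration onset
    duration_ms: float  # length of the vibration burst

# A "light touch" button: depresses right after contact, little resistance.
LIGHT_BUTTON = TactileClick(delay_ms=20, duration_ms=15)

# A "heavy" button: longer travel before actuation, longer tactile event.
HEAVY_BUTTON = TactileClick(delay_ms=60, duration_ms=50)

def on_touch_down(button: TactileClick, vibrate) -> None:
    """Schedule the haptic pulse for a virtual button press.

    `vibrate(delay_ms, duration_ms)` is a stand-in for whatever the
    platform's actuator API actually provides.
    """
    vibrate(button.delay_ms, button.duration_ms)
```

What appeals to me about this framing is that a designer tunes the perceived mechanics of a button with just two numbers.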
“We’re suggesting that the ear evolved out of the skin in order to do more finely tuned frequency analysis.”
On 27, May 2011 | No Comments | In perception, physiology | By Dave
Recent research reveals some fun facts about aural-tactile synesthesia:
Both hearing and touch, the scientists pointed out, rely on nerves set atwitter by vibration. A cell phone set to vibrate can be sensed by the skin of the hand, and the phone’s ring tone generates sound waves — vibrations of air — that move the eardrum…
A vibration that has a higher or lower frequency than a sound… tends to skew pitch perception up or down. Sounds can also bias whether a vibration is perceived.
The ability of skin and ears to confuse each other also extends to volume… A car radio may sound louder to a driver than his passengers because of the shaking of the steering wheel. “As you make a vibration more intense, what people hear seems louder,” says Yau. Sound, on the other hand, doesn’t seem to change how intense vibrations feel.
Touch affects cognition
On 25, Jun 2010 | No Comments | In cognition, perception, sociology | By Dave
Hand amputees have distorted vision
On 26, Jan 2010 | No Comments | In perception | By Dave
The space immediately surrounding the hands, where objects are grasped, touched, and manipulated, is called “action space” by psychologists. It is distinct from the wider spatial field because there is evidence that visual perception of an object is affected by the object’s proximity to the hands—i.e., its ability to be touched.
A new study has shown that hand amputees have distorted visual perception in the action space:
The space within reach of our hands—where actions such as grasping and touching occur—is known as the “action space.” Research has shown that visual information in this area is organized in hand-centered coordinates—in other words, the representation of objects in the human brain depends on their physical location with respect to the hand. According to new research in Psychological Science, amputation of the hand results in distorted visuospatial perception (i.e., figuring out where in space objects are located) of the action space….
Volunteers were instructed to look at a central cross on a screen as two white squares were briefly shown to the left and right side of the cross. The volunteers had to indicate which of the squares was further away from the cross. The results reveal that hand amputations affect visuospatial perception. When the right square was slightly further away from the center, participants with right-hand amputations tended to perceive it as being at the same distance from the center as the left square; this suggests that these volunteers underestimated the distance of the right square relative to the left. Conversely, when the left square was further away, participants with left-hand amputations perceived both squares as being equally far away from the center—these participants underestimated the left side of near space. Interestingly, when the volunteers were seated farther away from the screen, they were more accurate in judging the distances, indicating that hand amputations may only affect perception of the space close to the body.
The findings suggest that losing a hand may shrink the action space on the amputated side, leading to permanent distortions in spatial perception. According to the researchers, “This shows that the possibility for action in near space shapes our perception—the space near our hands is really special, and our ability to move in that space affects how we perceive it.”
Perceptual chauvinism
On 11, Jan 2010 | No Comments | In medical, music, perception | By Dave
I read two articles in a row today that use unnecessary quotation marks, exposing that strange discomfort with writing about touch that I have written about before. As humans we hold our feelings dear, so we don’t like to say that any other beings can feel. Especially plants, for chrissake:
Plants are incredibly temperature sensitive and can perceive changes of as little as one degree Celsius. Now, a report shows how they not only “feel” the temperature rise, but also coordinate an appropriate response—activating hundreds of genes and deactivating others; it turns out it’s all about the way that their DNA is packaged.
The author can’t simply say that plants can feel, so instead he writes “feel,” indicating a figurative sense of the word. Why? Because the word ‘feel’ implies some amount of consciousness. (In fact I have argued that ‘feeling’ signifies a baseline for the existence of a subject.) Only the animal kingdom gets feeling privileges.
And then, in another article posted on Science Daily, we have a similar example, but this one is even more baffling. The context is that research has shown that playing Mozart to premature infants can have measurable positive effects on development:
A new study… has found that pre-term infants exposed to thirty minutes of Mozart’s music in one session, once per day, expend less energy—and therefore need fewer calories to grow rapidly—than when they are not “listening” to the music…
In the study, Dr. Mandel and Dr. Lubetzky and their team measured the physiological effects of music by Mozart played to pre-term newborns for 30 minutes. After the music was played, the researchers measured infants’ energy expenditure again, and compared it to the amount of energy expended when the baby was at rest. After “hearing” the music, the infant expended less energy, a process that can lead to faster weight gain.
Not allowing plants to feel is one thing. And I can even understand the discomfort with writing that newborns are listening to music, because that may imply they are attending to it, which is questionable. But why can’t human babies be said to hear music? This is the strangest case of perceptual chauvinism I have yet come across.
Fingerprint ridge width is coupled to Pacinian resonance
On 05, Jan 2010 | No Comments | In perception, physiology, robotics | By Dave
French scientist Georges Debregeas has published a finding that the width of the ridges of our fingerprints just happens to be optimized for maximally vibrating our nerve endings:
The latest evidence suggests that fingerprints process vibrations in the skin to make them easier for nerves to pick up. They may seem little more than digital decoration, but biomechanists have long known that fingerprints have at least one use: they increase friction, thereby improving grip…
In fact the role that fingerprints play in touch is far more important and subtle than anyone imagined.
…Biologists have known for some time that Pacinian corpuscles are most sensitive to vibrations at 250 Hz. So how do fingers generate this kind of vibration? Biologists have always assumed that humans can control the frequency of vibrations in the skin by changing the speed at which a finger moves across a surface. But there’s little evidence that people actually do this, and the Paris team’s discovery should make this view obsolete.
…They say that fingerprints resonate at certain frequencies and so tend to filter mechanical vibrations. It turns out that their resonant frequency is around 250 Hz. What an astonishing coincidence!
That means that fingerprints act like signal processors, conditioning the mechanical vibrations so that the Pacinian corpuscles can best interpret them…
The article also notes that in robotics this is called morphological computation; that is, computation through interactions of physical form.
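As a back-of-the-envelope check of the resonance claim (and a tiny instance of that morphological computation), the ridges convert the spatial period of the skin surface into a temporal frequency: f = v / λ. Plugging in typical values, which are my assumptions rather than figures from the paper:

```python
# Rough check that sliding a ridged fingertip produces ~250 Hz vibrations.
# Both numbers below are typical values I'm assuming, not data from the paper.

ridge_spacing_m = 0.5e-3    # ~0.5 mm between fingerprint ridges
sliding_speed_m_s = 0.125   # ~12.5 cm/s, a comfortable exploratory speed

# Temporal frequency = sliding speed / spatial period of the ridges
frequency_hz = sliding_speed_m_s / ridge_spacing_m
print(f"{frequency_hz:.0f} Hz")  # -> 250 Hz, right at the Pacinian sweet spot
```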
Skin receptors may contribute to emotion
On 02, Jan 2010 | No Comments | In language, neuroscience, perception, physiology | By Dave
Interoception, the perception of internal feelings, is a funny thing. From our point of view as feeling beings, it seems entirely distinct from exteroceptive channels (sight, hearing, and so on). Interoception is also thought to be how we feel emotions, in addition to bodily functions. When you feel either hungry or lovesick, you are perceiving the state of your internal body, organs, and metabolism. A few years ago it was discovered that there are neural pathways for interoception distinct from ones used to perceive the outside world.
Interesting new research suggests that mechanical skin disturbances caused by pulsating blood vessels may significantly contribute to your perception of your own heartbeat. This is important because it means that skin may play a larger role in emotion than has been previously thought.
The researchers found that, in addition to a pathway involving the insular cortex of the brain — the target of most recent research on interoception — an additional pathway contributing to feeling your own heartbeat exists. The second pathway goes from fibers in the skin to most likely the somatosensory cortex, a part of the brain involved in mapping the outside of the body and the sense of posture.
This sounds surprising at first, but it makes perfect sense. There have been other instances where the functionality of perceptual systems overlaps. For example, it’s been found that skin receptors contribute to kinesthesia: as the joints bend, sensations of skin stretch are used to perceive joint angles. This was also somewhat surprising at the time, because it was thought that perception of one’s joint angles arose exclusively from receptors in the joints themselves. The same phenomenon, of skin movement being incidentally involved in some other primary action, is at work here. We might be able to say that any time the skin is moved perceptibly, cutaneous signals are bound up with the percept itself.
In fact, I think this may be a good object lesson in how words about feelings can be very confusing. A few years ago, before the recent considerable progress in mapping the neural signature of interoception, the word ‘interoception’ was used to describe a class of perceptions—ones whose object was the perceiver. Interoception meant the perception of bodily processes: heartbeat, metabolic functioning, and so on. When scientists discovered a neural pathway that serves only this purpose, the word suddenly began to refer not to the perceptual modality, but exclusively to that neural pathway. Now that multiple pathways have been identified, the word will go back to its original meaning: a class of percepts, rather than a particular neural conduit.
Right-handers: right arm is longer than left. Left-handers: arms are equal length.
On 05, Nov 2009 | No Comments | In perception | By Dave
Incredible speaking piano
On 09, Oct 2009 | No Comments | In art, music, perception | By David Birnbaum
A composer named Peter Ablinger has created a jaw-dropping sound art piece. He recorded a speech read by a child, analyzed the recording to extract its frequency content, and then mapped it to pitches on an acoustic player piano. My reaction was identical to the one described in the interview: what at first sounds like nonsense comes into perfect focus when you begin reading the text along with the sound. The flip from unintelligibility to clarity is a thrilling experience. Beautiful, beautiful work!
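The process he describes (extract the strongest frequency components in each short slice of the recording, then quantize them to the nearest piano keys) can be sketched in a few lines. Here is a toy Python/NumPy illustration of the general idea, certainly not Ablinger’s actual software; the frame size and number of voices are arbitrary assumptions:

```python
# Toy sketch of speech-to-piano transcription: per analysis frame, keep the
# strongest spectral peaks and snap them to the nearest piano keys.

import numpy as np

def speech_to_piano(samples, sr=44100, frame_len=2048, hop=1024, voices=8):
    """Return, per frame, the MIDI note numbers of the loudest partials."""
    window = np.hanning(frame_len)
    freqs = np.fft.rfftfreq(frame_len, d=1.0 / sr)
    score = []
    for start in range(0, len(samples) - frame_len, hop):
        spectrum = np.abs(np.fft.rfft(samples[start:start + frame_len] * window))
        loudest = np.argsort(spectrum)[-voices:]                 # strongest bins
        audible = freqs[loudest]
        audible = audible[(audible > 27.5) & (audible < 4186)]   # piano range A0-C8
        midi = np.round(69 + 12 * np.log2(audible / 440.0)).astype(int)
        score.append(sorted(set(midi.tolist())))
    return score
```

Feeding the per-frame note lists to a player piano at the hop rate is what turns the spectral envelope of speech into something a room full of hammers and strings can approximate.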
Facial movement affects hearing
On 05, Feb 2009 | No Comments | In language, perception | By David Birnbaum
The movement of facial skin and muscles around the mouth plays an important role not only in the way the sounds of speech are made, but also in the way they are heard… “How your own face is moving makes a difference in how you ‘hear’ what you hear,” said first author Takayuki Ito, a senior scientist at Haskins.
Note that this sentence says facial movement doesn’t affect what you hear; it only affects how you “hear” what you hear. More on this below.
When Ito and his colleagues used a robotic device to stretch the facial skin of “listeners” in a way that would normally accompany speech production, they found it affected the way the subjects heard the speech sounds. The subjects listened to words one at a time that were taken from a computer-produced continuum between the words “head” and “had.” When the robot stretched the listener’s facial skin upward, words sounded more like “head.” With downward stretch, words sounded more like “had.” A backward stretch had no perceptual effect.
And the timing of the skin stretch was critical—perceptual changes were only observed when the stretch was similar to what occurs during speech production.
These effects of facial skin stretch indicate the involvement of the somatosensory system in the neural processing of speech sounds. This finding contributes in an important way to our understanding of the relationship between speech perception and production. It shows that there is a broad, non-auditory basis for “hearing” and that speech perception has important neural links to the mechanisms of speech production.
“Listeners,” “hearing”… Why do I worry so much about these damn quotation marks? Because they point out an assumption we tend to make about perception: that there are objective sense data out there in the world, ready to be accessed through our senses. Within this model, secondary effects (caused by face-pulling robots) are seen as tricks played on our minds. But this is backwards. The astounding implication of this research is that our minds are composed of these tricks; the tricks are what produce a stable reality that meets our expectations.
For example, when the researchers were listening to recordings of the words “had” and “head” in order to design their experiment, the shape of their faces must have affected their hearing. (At least, that’s what their research seems to imply.) So who can listen without “listening”? Who determines whether the word is really “had” or “head”—someone without any facial expression at all?
The paper itself, which I haven’t read, can be purchased here.
Mammalian skin may sense oxygen
On 28, May 2008 | No Comments | In perception, physiology | By David Birnbaum
Yet another sensory capability of skin has been discovered:
Biologists at the University of California, San Diego have discovered that the skin of mice can sense low levels of oxygen and regulate the production of erythropoietin, or EPO, the hormone that stimulates our bodies to produce red blood cells and allows us to adapt to high-altitude, low-oxygen environments.
“What we found in this study is really something quite unusual,” said Randall Johnson, a professor of biology at UC San Diego who headed the research study. “We discovered that mammalian skin, at least in mice, responds to how much oxygen is above it and, by virtue of that response, changes blood flow through the skin. This, in turn, changes one of the most basic responses to low oxygen that we have, which is the production of erythropoietin.”
Rolling ball tactile illusion
On 14, Feb 2008 | No Comments | In perception | By David Birnbaum
Friend and colleague Joe Malloch blogs about a very interesting project in which he uses a pipe with an accelerometer at each end and a vibration actuator in the middle to create a haptic illusion of a ball rolling inside the pipe. From his technical report about the device:
Since the controller already contains five channels of acceleration sensing, it is simple to link this to virtual physical dynamics consistent with real-life gravity and user interaction. The acceleration signal is integrated to approximate velocity, and the resulting signal is used to control the frequency of a periodic signal mimicking the rolling of a ball of a set circumference. By varying the scaling of acceleration data and the scaling of velocity data, the mass and circumference of the virtual ball may be altered. Performing waveshaping of the final signal (or altering the stored waveform if look-up tables are used) alters the perception of the interaction between the virtual ball and the inside of the tube, creating the impression that the motion is smooth or bumpy, or that the inside of the pipe is ribbed. Integrating a second time approximates the position of the ball; this data is used to stop the virtual motion and set the velocity back to zero when the ball reaches the end of the modeled pipe…
Even lacking appropriate amplification and using somewhat un-physical coefficients, people trying the demonstration were convinced by the model – some would not believe that there was actually nothing rolling inside. Observation of users showed that their gaze would often follow the apparent position of the virtual ball, and perception of mass distribution would change depending on which end of the controller “contained” the virtual ball.
The altered perception of mass distribution shows that vibrotaction can give rise to illusions of force. I think there’s a lot of potential for this concept to be exploited for PxD.
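The signal chain in the quote is compact enough to reconstruct: integrate acceleration to get velocity, let velocity set the rolling frequency, and integrate again to know when the ball reaches an end of the pipe. Here is a rough Python sketch of the model as I read it from the report; this is my reconstruction, not Malloch’s code, and all parameter values are assumptions:

```python
# Sketch of a virtual ball rolling inside a pipe, driven by sensed
# acceleration and rendered as a periodic vibration signal.

import math

class VirtualRollingBall:
    def __init__(self, pipe_length=0.5, circumference=0.1, mass_scale=1.0):
        self.pipe_length = pipe_length      # meters
        self.circumference = circumference  # of the virtual ball, meters
        self.mass_scale = mass_scale        # scales how acceleration couples in
        self.velocity = 0.0
        self.position = pipe_length / 2.0   # start in the middle
        self.phase = 0.0

    def step(self, accel_along_pipe, dt):
        """Advance one sensor frame; return the actuator drive sample."""
        self.velocity += self.mass_scale * accel_along_pipe * dt
        self.position += self.velocity * dt

        # Ball reaches an end of the pipe: stop it, as in the report.
        if self.position <= 0.0 or self.position >= self.pipe_length:
            self.position = min(max(self.position, 0.0), self.pipe_length)
            self.velocity = 0.0

        # One vibration cycle per revolution: rotation rate = speed / circumference.
        freq = abs(self.velocity) / self.circumference
        self.phase += 2 * math.pi * freq * dt
        # Waveshaping this output (or swapping in a stored waveform) is what
        # would make the pipe feel smooth, bumpy, or ribbed.
        return math.sin(self.phase)
```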
Head-related haircut
On 16, Jul 2007 | One Comment | In perception | By David Birnbaum
Put on a pair of (good) headphones and enjoy this beautiful binaural recording. (Note: headphones are essential for the experience. Listening on speakers will not work!) It goes to show that well-recorded sound can be very effective even after it’s terribly distorted by YouTube compression. Thanks to MissChievous for the heads up!