
Presented at CHI 2012, Touché is a capacitive system for pervasive, continuous sensing. Among other amazing capabilities, it can accurately sense gestures a user makes on his own body. “It is conceivable that one day mobile devices could have no screens or buttons, and rely exclusively on the body as the input surface.” Touché.

Noticing that many of the same sensors, silicon, and batteries used in smartphones are being used to create smarter artificial limbs, Fast Company draws the conclusion that the smartphone market is driving technology development useful for bionics. Interesting enough, but the article stops short of the next logical and far more interesting possibility: that phones themselves are becoming parts of our bodies. To what extent are smartphones already bionic organs, and how could we tell if they were? I’m actively researching design in this area – stay tuned for more about the body-incorporated phone.

A study provides evidence that talking into a person’s right ear can affect behavior more effectively than talking into the left.

One of the best known asymmetries in humans is the right ear dominance for listening to verbal stimuli, which is believed to reflect the brain’s left hemisphere superiority for processing verbal information.

I heavily prefer my left ear for phone calls. So much so that I have trouble understanding people on the phone when I use my right ear. Should I be concerned that my brain seems to be inverted?

Read on and it becomes clear that going beyond perceptual psychology, the scientists are terrifically shrewd:

Tommasi and Marzoli’s three studies specifically observed ear preference during social interactions in noisy night club environments. In the first study, 286 clubbers were observed while they were talking, with loud music in the background. In total, 72 percent of interactions occurred on the right side of the listener. These results are consistent with the right ear preference found in both laboratory studies and questionnaires and they demonstrate that the side bias is spontaneously displayed outside the laboratory.

In the second study, the researchers approached 160 clubbers and mumbled an inaudible, meaningless utterance and waited for the subjects to turn their head and offer either their left or their right ear. They then asked them for a cigarette. Overall, 58 percent offered their right ear for listening and 42 percent their left. Only women showed a consistent right-ear preference. In this study, there was no link between the number of cigarettes obtained and the ear receiving the request.

In the third study, the researchers intentionally addressed 176 clubbers in either their right or their left ear when asking for a cigarette. They obtained significantly more cigarettes when they spoke to the clubbers’ right ear compared with their left.

I’m picturing the scientists using their grant money to pay cover at dance clubs and try to obtain as many cigarettes as possible – carefully collecting, then smoking, their data – with the added bonus that their experiment happens to require striking up conversation with clubbers of the opposite sex who are dancing alone. One assumes that, if the test subject happened to be attractive, once the cigarette was obtained (or not) the subject was invited out onto the terrace so the scientist could explain the experiment and his interesting line of work. Well played!

Another MRI study, this time investigating how we learn parts of speech:

The test consisted of working out the meaning of a new term based on the context provided in two sentences. For example, in the phrase “The girl got a jat for Christmas” and “The best man was so nervous he forgot the jat,” the noun jat means “ring.” Similarly, with “The student is nising noodles for breakfast” and “The man nised a delicious meal for her” the hidden verb is “cook.”

“This task simulates, at an experimental level, how we acquire part of our vocabulary over the course of our lives, by discovering the meaning of new words in written contexts,” explains Rodríguez-Fornells. “This kind of vocabulary acquisition based on verbal contexts is one of the most important mechanisms for learning new words during childhood and later as adults, because we are constantly learning new terms.”

The participants had to learn 80 new nouns and 80 new verbs. As they did, brain imaging showed that new nouns primarily activate the left fusiform gyrus (the underside of the temporal lobe associated with visual and object processing), while the new verbs activated part of the left posterior medial temporal gyrus (associated with semantic and conceptual aspects) and the left inferior frontal gyrus (involved in processing grammar).

This last bit was unexpected, at first. I would have guessed that verbs would be learned in regions of the brain associated with motor action. But according to this study, verbs seem to be learned only as grammatical concepts. In other words, knowledge of what it means to run is quite different from knowing how to run. Which makes sense if verb meaning is accessed by representational memory rather than procedural memory.

Researchers at the University of Tampere in Finland found that,

Interfaces that vibrate soon after we click a virtual button (on the order of tens of milliseconds) and whose vibrations have short durations are preferred. This combination simulates a button with a “light touch” – one that depresses right after we touch it and offers little resistance.

Users also liked virtual buttons that vibrated after a longer delay and then for a longer subsequent duration. These buttons behaved like ones that require more force to depress.

This is very interesting. When we think of multimodal feedback needing to make cognitive sense, synchronization first comes to mind. But there are many more synesthesias in our experience that can only be uncovered through careful reflection. To make an interface feel real, we must first examine reality.
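As a concrete (if simplistic) illustration, those two preferred button “feels” could be captured as a pair of vibrotactile parameters. This is a toy sketch of my own — the millisecond values are placeholders rather than numbers from the Tampere study, and the actuator interface is hypothetical:

```python
# Toy sketch: two virtual-button "feels" expressed as vibrotactile parameters.
# The millisecond values are illustrative placeholders, not the study's numbers.
from dataclasses import dataclass

@dataclass
class ButtonFeel:
    onset_delay_ms: int  # time between touch-down and the start of the vibration
    duration_ms: int     # how long the vibration lasts

LIGHT_TOUCH = ButtonFeel(onset_delay_ms=20, duration_ms=15)  # quick, short buzz
HEAVY_PRESS = ButtonFeel(onset_delay_ms=70, duration_ms=60)  # later, longer buzz

def play_feedback(feel: ButtonFeel, actuator) -> None:
    """Schedule a single vibration pulse on a (hypothetical) actuator object."""
    actuator.vibrate(after_ms=feel.onset_delay_ms, for_ms=feel.duration_ms)
```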

Researchers at the Army Research Office developed a vibrating belt with eight mini actuators — “tactors” — that signify all the cardinal directions. The belt is hooked up to a GPS navigation system, a digital compass and an accelerometer, so the system knows which way a soldier is headed even if he’s lying on his side or on his back.

The tactors vibrate at 250 hertz, which equates to a gentle nudge around the middle. Researchers developed a sort of tactile Morse code to signify each direction, helping a soldier determine which way to go, New Scientist explains. A soldier moving in the right direction will feel the proper pattern across the front of his torso. A buzz from the front, side and back tactors means “halt,” a pulsating movement from back to front means “move out,” and so on.
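To give a rough sense of how such a belt might pick which tactor to drive — my own sketch, not the Army Research Office design — selecting the tactor nearest the desired heading is just a matter of quantizing an angle:

```python
NUM_TACTORS = 8  # assumed layout: evenly spaced around the waist, tactor 0 at the navel

def tactor_for_bearing(target_bearing_deg: float, body_heading_deg: float) -> int:
    """Return the index of the tactor closest to the direction the wearer should go.

    Bearings are compass degrees; the difference is taken relative to the torso,
    so the cue stays correct as the wearer turns (given a compass and accelerometer
    to supply body_heading_deg).
    """
    relative = (target_bearing_deg - body_heading_deg) % 360.0
    return round(relative / (360.0 / NUM_TACTORS)) % NUM_TACTORS

# Example: the waypoint is due east (90 degrees) and the soldier faces north (0 degrees),
# so the tactor a quarter-turn clockwise from the navel (index 2) should fire.
assert tactor_for_bearing(90.0, 0.0) == 2
```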

A t-shirt design by Derek Eads.

Recent research reveals some fun facts about aural-tactile synesthesia:

Both hearing and touch, the scientists pointed out, rely on nerves set atwitter by vibration. A cell phone set to vibrate can be sensed by the skin of the hand, and the phone’s ring tone generates sound waves — vibrations of air — that move the eardrum…

A vibration that has a higher or lower frequency than a sound… tends to skew pitch perception up or down. Sounds can also bias whether a vibration is perceived.

The ability of skin and ears to confuse each other also extends to volume… A car radio may sound louder to a driver than his passengers because of the shaking of the steering wheel. “As you make a vibration more intense, what people hear seems louder,” says Yau. Sound, on the other hand, doesn’t seem to change how intense vibrations feel.

Max Mathews, electronic music pioneer, has died.

Though computer music is at the edge of the avant-garde today, its roots go back to 1957, when Mathews wrote the first version of “Music,” a program that allowed an IBM 704 mainframe computer to play a 17-second composition. He quickly realized, as he put it in a 1963 article in Science, “There are no theoretical limits to the performance of the computer as a source of musical sounds.”

Rest in peace, Max.

UPDATE: I haven’t updated this blog in a while, and I realized after posting this that my previous post was about the 2010 Modulations concert. Max Mathews played at Modulations too, and that was the last time I saw him.

I finally got around to recording and mastering the set I played at the CCRMA Modulations show a few months back. Though I’ve been a drum and bass fan for many years, this year’s Modulations was the first time I’d mixed it for others. Hope you like it!

Modulations 2010
Drum & Bass | 40:00 | May 2010


Download (mp3, 82.7 MB)


1. Excision — System Check
2. Randomer — Synth Geek
3. Noisia — Deception
4. Bassnectar — Teleport Massive (Bassnectar Remix)
5. Moving Fusion, Shimon, Ant Miles — Underbelly
6. Brookes Brothers — Crackdown
7. The Ian Carey Project — Get Shaky (Matrix & Futurebound’s Nip & Tuck Mix)
8. Netsky — Eyes Closed
9. Camo & Krooked — Time Is Ticking Away feat. Shaz Sparks

Over the last few days this video has been something of a bombshell among my music-prone friends.

It’s called the Multi-Touch Light Table and it was created by East Bay-based artist/fidget-house DJ Gregory Kaufman. The video is beautifully put together, highlighting the importance of presentation when documenting new ideas.

I really like some of the interaction ideas presented in the video. Others, I’m not so sure about. But that’s all right: the significance of the MTLT is that it’s the first surface-based DJ tool that systematically accounts for the needs of an expert user.

Interestingly, even though it looks futuristic and expensive to us, interfaces like this will eventually be the most accessible artistic tools. Once multi-touch surfaces are ubiquitous, the easiest way to gain some capability will be to use inexpensive or open-source software. The physical interfaces created for DJing, such as Technics 1200s, are prosthetic objects (as are musical instruments), and will remain more expensive because mechanical contraptions always will be. Now, that isn’t to say that in the future our interfaces won’t evolve to become digital, networked, and multi-touch sensitive, or even that their physicality will be replaced with a digital haptic display. But one of the initial draws of the MTLT—the fact of its perfectly flat, clean interactive surface—seems exotic to us right now, and in the near future it will be the default.

Check out this flexible interface called impress. Flexible displays just look so organic and, well, impressive. One day these kinds of surface materials will become viable displays and they’ll mark a milestone in touch computing.

It’s natural to stop dancing between songs. The beat changes, the sub-rhythms reorient themselves, a new hook is presented and a new statement is made. But stopping dancing between songs is undesirable. We wish to lose ourselves in as many consecutive moments as possible. The art of mixing music is to fulfill our desire to dance along to continuous excellent music, uninterrupted for many minutes (or, in the best case, many hours) at a time. (Even if we don’t explicitly move our bodies to the music, when we listen our minds are dancing; the same rules apply.)

I don’t remember what prompted me to take that note, but it was probably not that the mixing was especially smooth.



A tomato hailing from Capay, California.

LHCSound is a site where you can listen to sonified data from the Large Hadron Collider. Some thoughts:

  • That’s one untidy heap of a website. Is this how it feels to be inside the mind of a brilliant physicist?
  • The name “LHCSound” refers to “Csound”, a programming language for audio synthesis and music composition. But how many of their readers will make the connection?
  • If they are expecting their readers to know what Csound is, then their explanation of the process they used for sonification falls way short. I want to know the details of how they mapped their data to synthesis parameters (a toy sketch of the kind of mapping I mean follows this list).
  • What great sampling material this will make. I wonder how long before we hear electronic music incorporating these sounds.
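Here is the sort of mapping I have in mind — a toy parameter-mapping sonification with made-up scaling choices, purely illustrative and not LHCSound’s actual pipeline:

```python
# Toy parameter-mapping sonification: event energies -> (pitch, amplitude) pairs.
# The scaling choices are my own guesses, not LHCSound's actual mapping.

def sonify_events(energies_gev, min_hz=110.0, max_hz=1760.0):
    """Map each event's energy to a frequency (Hz) and amplitude (0..1) for a synth."""
    lo, hi = min(energies_gev), max(energies_gev)
    notes = []
    for e in energies_gev:
        t = (e - lo) / (hi - lo) if hi > lo else 0.0
        freq = min_hz * (max_hz / min_hz) ** t  # log-spaced pitch: equal ratios sound like equal steps
        amp = 0.2 + 0.8 * t                     # louder for higher-energy events
        notes.append((freq, amp))
    return notes

print(sonify_events([12.5, 48.0, 97.3, 250.0]))
```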

The Immersive Pinball demo I created for Fortune’s Brainstorm:Tech conference was featured in a BBC special on haptics.

I keep watching the HTC Sense unveiling video from Mobile World Congress 2010. The content is pretty cool, but I’m more fascinated by the presentation itself. Chief marketing officer John Wang gives a simply electrifying performance. It almost feels like an Apple keynote.

The iFeel_IM haptic interface has been making rounds on the internet lately. I tried it at CHI 2010 a few weeks ago and liked it a lot. Affective (emotional haptic) interfaces are full of potential. iFeel_IM mashes together three separate innovations:

  • Touch feedback in several different places on the body: spine, tummy, waist.
  • Touch effects that are generated from emotional language.
  • Synchronization to visuals from Second Life

All are very interesting. The spine haptics seemed a stretch to me, but the butterfly-in-the-tummy was surprisingly effective. The hug was good, but a bit sterile. Hug interfaces need nuance to bring them to the next level of realism.

The fact that the feedback is generated from the emotional language of another person seemed to be one of the major challenges—the software is built to extract emotionally-charged sentences using linguistic models. For example, if someone writes “I love you” to you, the haptic device on your tummy will react by creating a butterfly-like sensation. As an enaction devotee I would rather actuate a hug with a hug sensor. Something about the translation of words to haptics is difficult for me to accept. But it could certainly be a lot of fun in some scenarios!

I’ve re-recorded my techno mix Awake with significantly higher sound quality. So if you downloaded a copy be sure to replace it with the new file!


Awake
Techno | 46:01 | October 2009


Download (mp3, 92 MB)


1. District One (a.k.a. Bart Skils & Anton Pieete) — Dubcrystal
2. Saeed Younan — Kumbalha (Sergio Fernandez Remix)
3. Pete Grove — I Don’t Buy It
4. DBN — Asteroidz featuring Madita (D-Unity Remix)
5. Wehbba & Ryo Peres — El Masnou
6. Broombeck — The Clapper
7. Luca & Paul — Dinamicro (Karotte by Gregor Tresher Remix)
8. Martin Worner — Full Tilt
9. Joris Voorn — The Deep

I recently started using Eclipse on OS X and it was so unresponsive, it was almost unusable. Switching tabs was slow, switching perspectives was hella slow. I searched around the web for a solid hour for how to make it faster and finally found the solution. Maybe someone can use it.

My machine is running OS X 10.5, and I have 2 GB of RAM. (This is important because the solution requires messing with how Eclipse handles memory. If you have a different amount of RAM, these numbers may not work and you’ll need to fiddle with them.)

  • Save your work and quit Eclipse.
  • Open the Eclipse application package by right-clicking (or Control-clicking) on Eclipse.app and select “Show Package Contents.”
  • Navigate to Contents → MacOS and open “eclipse.ini” in your favorite text editor.
  • Edit the line that starts with “-XX:MaxPermSize” to say “-XX:MaxPermSize=128m”.
  • Before that line, add a line that says “-XX:PermSize=64m”.
  • Edit the line that starts with “-Xms” to say “-Xms40m”.
  • Edit the line that starts with “-Xmx” to say “-Xmx768m”.
  • Save & relaunch Eclipse.

Worked like a charm for me.
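For reference, after those edits the memory-related lines of my eclipse.ini looked like this (the rest of the file is untouched; the values assume 2 GB of RAM, so adjust to taste):

```
-XX:PermSize=64m
-XX:MaxPermSize=128m
-Xms40m
-Xmx768m
```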


Among journalists, technology breeds fear of obsolescence, corporations

On 24, Jan 2010 | One Comment | In sociology, tactility | By Dave

Another NYT article about technology anxiety, this one by Brad Stone. Some excerpts:

I’ve begun to think that my daughter’s generation will also be utterly unlike those that preceded it.

Well, it’s better to begin to think than to never start. There’s plenty of room for more people to contemplate and write about the future of technology. We are a friendly bunch! Let me be the first to welcome you, Mr. Stone.

But the newest batch of Internet users and cellphone owners will find these geo-intelligent tools to be entirely second nature, and may even come to expect all software and hardware to operate in this way. Here is where corporations can start licking their chops. My daughter and her peers will never be “off the grid.” And they may come to expect that stores will emanate discounts as they walk by them, and that friends can be tracked down anywhere.

I see, so even though technology will lift people out of poverty and make life longer and more enriching, technology is really just a vehicle for capitalist oppression. And like mad, salivating dogs, corporations will lick their chops. Right.

But the children, teenagers and young adults who are passing through this cauldron of technological change will also have a lot in common. They’ll think nothing of sharing the minutiae of their lives online, staying connected to their friends at all times, buying virtual goods, and owning one über-device that does it all. They will believe the Kindle is the same as a book. And they will all think their parents are hopelessly out of touch.

Of all the mind blowing changes that technology will bring to our society, the real thought-provoker is that those crazy young’uns will think a Kindle is the same as a book!

Mr. Stone: elevate your perspective. If you need help, read my blog, and read what I link to. Anticipate the future. Integrate it. Do develop a grounded, holistic understanding of where we’re going as a technological society. Don’t develop sociological theories based on your marvel at incremental steps like the Kindle. It won’t help you see the big picture.



“And reaching up my hand to try, I screamed to feel it touch the sky.”

On 22, Dec 2009 | No Comments | In art, language, tactility | By Dave

Check out this beautiful kinetic typography piece by Heebok Lee:

It’s based on an excerpt of the poem “Renascence” by Edna St. Vincent Millay.

renascence
noun
1. the revival of something that has been dormant.
2. another term for ‘renaissance.’
(Oxford English Dictionary)

Millay, who wrote the poem when she was only 20 years old, originally called it “Renaissance.” It’s interesting that the two words are so close in meaning and are pronounced almost the same way, but they’re not considered alternate spellings of the same word.


Edna St. Vincent Millay on a terrace.

Click below to read the poem in its entirety. I highly recommend reading the whole thing.
Read more…



What can gardens teach us about digitality?

On 20, Dec 2009 | One Comment | In sociology, tactility | By Dave

The Washington Post has an intriguing piece about a book dealing with gardens (of all things) and digitality. The author, Robert Harrison, argues that gardens immerse us in place and time, and that digital devices do not. The article jumps all over the place, talking about mobile communication, cultural anthropology, and evolution, but it makes several important points.

To start, attending to digital devices is said to preclude being present:

“You know you have crossed the river into Cyberland when the guy coming your way has his head buried in the hand-held screen. He will knock into you unless you get out of his way, and don’t expect an apology. It’s as if you aren’t there. Maybe you’re not.”

I’m very interested in language like this, because it’s a metaphor in the process of becoming a literalism. Today, saying that you’re not there because you’re looking at a device is metaphorical, but I think the meaning of ‘being there’ is going to shift to wherever you are engaged, no matter where that place is geographically in relation to your body. “I’ll be right there!” he said as he plugged his brain into the internet. Moments later he was standing in the garden…

The article quotes a study that claims that the average adult spends 8.5 hours a day visually engaged with a screen. 24-hour days, split up by 8.5 hours of screen and 8 hours of sleep—the Screen Age really does deserve its own delineation. It’s a significant and unique period in human history.

And just like sleep, perhaps disturbingly so, people looking at screens can resemble dead people (or, more accurately, un-dead people):

…We have become digital zombies.

But I think the resemblance is entirely superficial. Sure, if you only go by appearances, an army of screen-starers is a frightening sight to imagine. But scratch the surface and you realize that screen-staring is a far cry from zombism. The social spaces we are constructing while we stare, the vast data stores we are integrating—these activities remind me of life. Teeming life. Our bodies may be sedentary, our eyes fixed on a single glowing rectangle, but what is going on is indisputably amazing. On the microscopic level there are billions of electrical fluctuations per moment, both in our brains and our machines, and they are actively correlating and adapting to each other. Patterns of thought are encoded in a vast network of micro-actions and reactions that span the planet. And what is it like for you when you stare at a computer or phone screen? You juggle complex, abstract symbolic information at speeds never before achieved by human brains, and you’re also inputting—emitting—hundreds of symbols with the precise motor skills of your fingers. You are recognizing pictures and signs, searching for things, finding them, figuring stuff out, adjusting your self image, and nurturing your dreams. There is no loss of dignity or life in this. But I admit that we all look like zombies while we do it, and I suppose that is pretty weird.

The article goes on to quote author Katherine Hayles, who says she thinks humans are in a state of symbiosis with their computers:

“If every computer were to crash tomorrow, it would be catastrophic,” she says. “Millions or billions of people would die. That’s the condition of being a symbiont.”

Let that sink in. At any moment a catastrophic event could fry our entire digital infrastructure in one fell swoop. Our civilization teeters on a house of cards as high as Mount Everest! To me this is the only reason the Screen Age should be frightening, but it’s very frightening indeed.

Turning now to sensation, Hayles mentions that touch and smell are suppressed by bipedalism:

“You could say when humans started to walk upright, we lost touch with the natural world. We lost an olfactory sense of the world, but obviously bipedalism paid big dividends.”

Note that bipedalism is associated with a loss of tactility, but it has also been credited with enabling more complex manual dexterity. Maybe there is a general principle here: ambient tactile awareness is inversely correlated with prehension.

After a brief ensuing discussion of dualism and the advent of location-based services, we’re back to the gardens:

The difficulty, Harrison argues, is that we are losing something profoundly human, the capacity to connect deeply to our environments… “For the gardens to become fully visible in space, they require a temporal horizon that the age makes less and less room for.”

I like the point about a garden’s time horizon. But it’s used to complain about the discomfort of our rushed lifestyle, which I would argue is separable from communication technology. The heads-buried-in-screens thing doesn’t really affect whether we have time for gardens.

An interesting footnote offered by Harrison is that the Czech playwright Karel Capek, who invented the word ‘robot,’ was a gardener.

Finally, this is the photo that accompanies the article:


…captioned, “Fingers on the political pulse.” The article is about looking and being present, but the picture is about hands, heartbeat, and hapticity.

(via Althouse)



A touch saves a life

On 21, Nov 2009 | No Comments | In tactility | By Dave

The Moth is a stand-up storytelling podcast I’ve been enjoying lately. An especially moving story was told by Mike Destefano. When he had reached the rock bottom of his life, having lost his wife, his father, and his will to live, a mentor literally reaches out to him, touches him, and changes everything:

The flame that I had as a kid, all of it, gone. Because now everyone died. All at that one moment, you know? And I made arrangements to fly home the next day, and I got on the plane, and when I got on the plane I decided that I was going to end my life. I’m pretty much done. And I wasn’t telling anyone, it wasn’t a threat, it was a total fucking decision that I’ve pretty much had enough of this. There is no more, nothing else to live for, and I’m done. So I got on the plane, and I was so excited because I’m like, I’m really going to fucking die, this is so great! I was thrilled and at peace. And I couldn’t wait until the funeral was over, because that’s when I’m going to do it. I’m not going to jump off a building or in front of a car. You people ever heard of overdosing on drugs? …I get up and go to the back of the plane to go to the bathroom and… the monk that I had met was sitting in the back row… And he put his hands out like he did before, again… and it worked for me… it just, it worked. And I got home, and I quit my job and I said, you know what? I want to be a fucking comedian.

Listen to the whole thing.


Download (mp3, 17 MB)



Touch the paintings in the Met

On 07, Oct 2009 | No Comments | In art, tactility | By David Birnbaum

Hoping to boost attendance and broaden its base of supporters, the Metropolitan Museum of Art launched a new initiative this week that allows patrons, for the first time ever, to prod and scratch at the classic paintings in its revered collection.

“You can’t grasp the brilliance of a great painting just by looking at it… To truly appreciate fine art, you need to be able to run your fingers over its surface and explore its range of textures.”

The new policy has been so popular that on Monday the Met began extending tactile privileges beyond its paintings. Patrons are now invited to climb inside ancient Egyptian sarcophagi, whether to take a souvenir photo or just carve a message into a 2,500-year-old sacred coffin.

Some, however, remained unimpressed.

“I touched a crapload of Jasper Johns’ paintings,” said Mark Bennet, 67. “I just don’t get why they’re supposed to be so special. They feel like any regular old painting.”

Stop crying and cursing! It’s an Onion article.



Printed strain sensors = "sense of touch"

On 14, Aug 2009 | No Comments | In robotics, tactility | By David Birnbaum


In future, the robot could find its own way. A sensor will endow it with a sense of touch and help it to detect its undersea environment autonomously.

“One component in this tactile capability is a strain gauge,” says Marcus Maiwald…“If the robot encounters an obstacle,” he explains, “the strain gauge is distorted and the electrical resistance changes. The special feature of our strain gauge is that it is not glued but printed on – which means we can apply the sensor to curved surfaces of the robot.”

The sensor system on this robot is not all that complex; strain gauges are literally a dime a dozen (or less). But the configuration of the sensors reminds us of an animal body, and that’s what intrigues us. Since the strain sensors are printed along the surface of the robot in a continuous way (rather than being attached at some specific point), we’re reminded of how touch receptors are embedded throughout the skin, bringing to mind the phrase “sense of touch.” The Roomba has a mechanical sensor that is technically similar to the ones in this new robot, but we don’t talk about the Roomba having a sense of touch because the sensor is in a discrete place. To have a sense of touch you need to be able to sense contact (almost) anywhere on the surface of the body.
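To make the sensing principle concrete, here is a minimal sketch — my own illustration with assumed values, not Fraunhofer’s implementation — of how a change in a printed gauge’s resistance translates into strain and a contact decision, using the standard gauge-factor relation ΔR/R ≈ GF · ε:

```python
# Minimal sketch of strain-gauge contact detection. The constants are assumptions
# for illustration, not values from the Fraunhofer robot.

GAUGE_FACTOR = 2.0       # typical for metal-foil gauges; a printed gauge will differ
R_NOMINAL_OHMS = 350.0   # assumed unloaded resistance of the gauge
CONTACT_STRAIN = 50e-6   # assumed threshold: 50 microstrain counts as contact

def strain_from_resistance(r_measured_ohms: float) -> float:
    """Infer mechanical strain from measured resistance (delta_R / R = GF * strain)."""
    delta_r = r_measured_ohms - R_NOMINAL_OHMS
    return (delta_r / R_NOMINAL_OHMS) / GAUGE_FACTOR

def touched(r_measured_ohms: float) -> bool:
    """Decide whether the gauge's distortion is large enough to call it a touch."""
    return abs(strain_from_resistance(r_measured_ohms)) > CONTACT_STRAIN

print(touched(350.1))  # about 143 microstrain of inferred strain -> True
```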



Hawking on evolution and technology

On 13, Jul 2009 | No Comments | In tactility | By David Birnbaum


“At first, evolution proceeded by natural selection, from random mutations. This Darwinian phase lasted about three and a half billion years, and produced us, beings who developed language, to exchange information.”

But what distinguishes us from our cave man ancestors is the knowledge that we have accumulated over the last ten thousand years, and particularly, Hawking points out, over the last three hundred.

“I think it is legitimate to take a broader view, and include externally transmitted information, as well as DNA, in the evolution of the human race,” Hawking said.

In the last ten thousand years the human species has been in what Hawking calls, “an external transmission phase,” where the internal record of information, handed down to succeeding generations in DNA, has not changed significantly. “But the external record, in books, and other long lasting forms of storage,” Hawking says, “has grown enormously. Some people would use the term, evolution, only for the internally transmitted genetic material, and would object to it being applied to information handed down externally. But I think that is too narrow a view. We are more than just our genes.”

I found it very interesting that the accompanying image depicts digital touch, with no caption or explanation of how it relates to the article. It’s just assumed that readers will get it: a hand reaching out and creating ripples in a fluid, digital medium demonstrates that we are more than just our genes. Doesn’t it?



The Hand

On 04, Jun 2009 | One Comment | In art, books, cognition, neuroscience, physiology, tactility | By David Birnbaum

The Hand by Frank Wilson is a rare treat. It runs the gamut from anthropology (both the cultural and evolutionary varieties), to psychology, to biography. Wilson interviews an auto mechanic, a puppeteer, a surgeon, a physical therapist, a rock climber, a magician, and others—all with the goal of understanding the extent to which the human hand defines humanness.

Wilson is a neurologist who works with musicians who have been afflicted with debilitating chronic hand pain. As he writes about his many interviews, a few themes emerge that are especially relevant to our interests here.

Incorporation
Incorporation is the phenomenon of internalizing external objects; it’s the feeling that we all get that a tool has become one with our body.

The idea of “becoming one” with a backhoe is no more exotic than the idea of a rider becoming one with a horse or a carpenter becoming one with a hammer, and this phenomenon itself may take its origin from countless monkeys who spent countless eons becoming one with tree branches. The mystical feel comes from the combination of a good mechanical marriage and something in the nervous system that can make an object external to the body feel as if it had sprouted from the hand, foot, or (rarely) some other place on the body where your skin makes contact with it…

The contexts in which this bonding occurs are so varied that there is no single word that adequately conveys either the process or the many variants of its final form. One term that might qualify is “incorporation”—bringing something into, or making it part of, the body. It is a commonplace experience, familiar to anyone who has ever played a musical instrument, eaten with a fork or chopsticks, ridden a bicycle, or driven a car. (p. 63)

Projection
Projection is the ability to use the hand as a bridge for projecting consciousness from one location to another. (Wilson did not use the word “projection” in the book.) In some ways, projection can be seen as the opposite of incorporation. Master puppeteer Anton Bachleitner:

It takes at least three years of work to say you are a puppeteer. The most difficult job technically is to be able to feel the foot contact the floor as it actually happens. The only way to make the puppet look as though it is actually walking is by feeling what is happening through your hands. The other thing which I think you cannot really train for, but only can discover with very long practice and experience, is a change in your own vision.

The best puppeteer after some years will actually see what is happening on the stage as if he himself was located in the head of the puppet, looking out through the puppet’s eyes—he must learn to be in the puppet. This is true not only in the traditional actor’s sense, but in an unusual perceptual sense. The puppeteer stands two meters above the puppet and must be able to see what is on the stage and to move from the puppet’s perspective. Moving is a special problem because of this distance, because the puppet does not move at the same time your hand does. Also, there can be several puppets on the stage at the same time, and to appear realistic they must react to each other as they would in real life. So again the puppeteer must himself be mentally on the stage and able to react as a stage actor would react. This is something I cannot explain, but it is very important for a puppeteer to be able to do this. (pp. 92–93)

Serge Percelly, professional juggler:

[An act is successful] not because you put something in the act that’s really difficult, but because you put something in the act in exactly the right way—in a way that makes it more interesting, not only for me but for the audience as well. I’m just trying somehow to do the act that I would have loved to see. (p. 111)

Skill
Wilson is a musician and a doctor to musicians, so he has special insight into the neurology of musical skill—which he recognizes as special case of manual skill that involves gesture, communication, and emotion.

Musical skill provides the clearest example and the cleanest proof of the existence of a whole class of self-defined, personally distinctive motor skills with an extended training and experience base, strong ties to the individual’s emotional and cognitive development, strong communicative intent, and very high performance standards. Musical skill, in other words, is more than simply praxis, ordinary manual dexterity, or expertness in pantomime. (p. 207)

The upper-limb (or “output”) requirements for an instrumentalist are not unique either; they depend upon the possession of arms, fingers, and thumbs, specific but idiosyncratic limits on the range of motion at the shoulder, elbow, wrist, hand, and finger joints, variable abilities to achieve repetition rates and forces with specific digital configurations in sequence at multiple contact points on a sound-making device, and so on. Peculiarities in the physical configuration and movement capabilities of the musician’s limbs can be an advantage or disadvantage but are reflected in (and in adverse cases can be overcome through) instrument design: How wide can you make the neck of a guitar? How far apart should the keys be on a piano? Where should the keys be placed on a flute—in general? and for Susan and Peter? (p. 225)

Awareness
Touch experience can be a gateway to awareness, which can in turn heal both the mind and the body. Moshe Feldenkrais invented a form of physical therapy that focuses on stimulating an awareness of touch and movement sensations in order to relieve pain.

Most people slouch, tilt, shuffle, twist, stumble, and hobble along. Why should that be? Was there something wrong with their brains? After considering what dancers and musicians go through to improve control of their movements, [Feldenkrais] guessed that people must either be ignorant of the possibilities or refuse to act on them. So they just heave themselves around, lurching from parking place to office to parking place, utterly oblivious to what they are doing, to their appearance, and even to the sensations that arise from bodily movement. He suspected that people just lose contact with their own bodies. If and when they do notice, it is because they are so stiff that they can’t get out of bed or are in so much pain that they can barely get out of a chair. Then they start noticing…

What [Feldenkrais] was doing did not seem complicated. The goal of the guided movements was not to learn how to move, in the sense of learning to do a new dance step. The goal was not to stretch ligaments or muscles. It was not to increase strength. The goal, as he saw it, was to get the messages moving again and to encourage the brain to pay attention to them. (p. 244)

And his student, Anat Baniel, on the deep psychological roots of movement disorders:

I think working with children has given me this idea, which isn’t often discussed in medicine: a lot of disease—medical disease and emotional “dis-ease”—is an outcome of a lack of full development. It’s not something we can get to just by removing a psychological block…

Of course there are problems due to traumatic events in childhood, or disease—you name it. Feldenkrais said that ideal development would happen if the child was not opposed by a force too big for its strength. When you say to a small child, “Don’t touch that, it’s dangerous!” you create such a forceful inhibition that you actually distort the child’s movement, and growth, in a certain way.

Feldenkrais taught us to look for what isn’t there. Why doesn’t movement happen in the way that it should, given gravity, given the structure of the body, given the brain? For all of us there is a sort of sphere, or range, of movement that should be possible. Some people get only five or ten percent of that sphere, and you have to ask, “What explains the difference between those who get very little and those who get a lot?” Feldenkrais said that the difference is that in the process of development, the body encountered forces that were disproportionate to what the nervous system could absorb without becoming overinhibited—or overly excited, which is a manifestation of the same thing. (p. 252)

Feldenkrais’s approach is fascinating, but there is scant discussion in Wilson’s book about the role of the therapist’s hand in this process. After all, this kind of therapy is wholly reliant on an accidental discovery: that the patient can be made aware of her own body through an external, expert hand radiating pressure and heat. How is this possible? The topic isn’t explored.

There are many, many wonderful things to learn from this book for anyone with an interest in biology, art, music, history, or sports. You can find Frank Wilson on the web at Handoc.com



3-D images play tricks on your hands

On 15, Dec 2008 | No Comments | In tactility | By David Birnbaum

New, high resolution 3-D images of Mars seem to afford haptic interaction, but don’t:

“You’d swear you could touch the terrain,” HiRISE operations manager Eric Eliason said.

Alas.



The Times on tactility

On 10, Dec 2008 | One Comment | In neuroscience, physiology, tactility | By David Birnbaum

The New York Times has published a piece on tactility and haptics. It’s pretty good, and it may inspire some readers to think more about the role touch plays in their lives. Many of the points made in the article are central to my research and my life, so a proper fisking is in order. Let us begin.

Imagine you’re in a dark room, running your fingers over a smooth surface in search of a single dot the size of this period. How high do you think the dot must be for your finger pads to feel it? A hundredth of an inch above background? A thousandth?

I’m hooked. I have no idea, but I’m prepared to be surprised.

Well, take a tip from the economy and keep downsizing.

We dodge the silly reference to arrive at our answer:

Scientists have determined that the human finger is so sensitive it can detect a surface bump just one micron high. All our punctuation point need do, then, is poke above its glassy backdrop by 1/25,000th of an inch—the diameter of a bacterial cell—and our fastidious fingers can find it.

Wow! Seriously, that’s incredible. I wonder whether we can feel bacteria with our fingers in certain situations.

The human eye, by contrast, can’t resolve anything much smaller than 100 microns. No wonder we rely on touch rather than vision when confronted by a new roll of toilet paper and its Abominable Invisible Seam.

This is great—not the whimsical use of capital letters to catalog the phenomenology of toileting, but the comparison of tactile resolution to visual resolution. It’s a common misconception that touch is less precise than vision. After all, the visual processing we perform to communicate (i.e., reading and writing) can seem much more complex than touch. However, vision isn’t necessarily more precise; it’s just more symbol-based, which makes it seem higher resolution because it is associated with complex concepts.

Biologically, chronologically, allegorically and delusionally, touch is the mother of all sensory systems. It is an ancient sense in evolution: even the simplest single-celled organisms can feel when something brushes up against them and will respond by nudging closer or pulling away.

I don’t understand “delusionally,” and really wonder what she means. However, it is interesting that simple animals can be said to feel by virtue of their ability to react to their environment, because it says a lot about the concept of feeling: that to feel is to be alive. Nicholas Humphrey wrote in Seeing Red, “The external world is the external world, and it is certainly useful to know what is going on out there. But let the animal never forget that the bottom line is its own bodily well-being, that I am nobody if I’m not me“.

It is the first sense aroused during a baby’s gestation and the last sense to fade at life’s culmination. Patients in a deep vegetative coma who seem otherwise lost to the world will show skin responsiveness when touched by a nurse.

The chronology of touch is crucial. As I’ve written before, touch frames experience. It sets the boundaries of your self, not only in space but also in time.

Like a mother, touch is always hovering somewhere in the perceptual background, often ignored, but indispensable to our sense of safety and sanity. “Touch is so central to what we are, to the feeling of being ourselves, that we almost cannot imagine ourselves without it,” said Chris Dijkerman, a neuropsychologist at the Helmholtz Institute of Utrecht University in the Netherlands. It’s not like vision, where you close your eyes and you don’t see anything. You can’t do that with touch. It’s always there.

Which, again, may say something about touch as a concept: that to live is to touch. Try to imagine that you would like to have the experience of “not touching anything.” You jump out of a plane, but realize that your body is being touched all over by air. You float in space (with no clothes on), and splay your limbs out so that you are sure to not even touch yourself. But sensory receptors throughout your body are still reflecting your body state—the angles of your joints, the stretch of your skin. Is it possible to conceive of an unfeeling animal?

Long neglected in favor of the sensory heavyweights of vision and hearing, the study of touch lately has been gaining new cachet among neuroscientists, who sometimes refer to it by the amiably jargony term of haptics, Greek for touch.

Haptics doesn’t mean touch. It relates to the perception and manipulation of one’s environment. And why is it amiable, because you said so?

They’re exploring the implications of recently reported tactile illusions, of people being made to feel as though they had three arms, for example, or were levitating out of their bodies, with the hope of gaining insight into how the mind works. Others are turning to haptics for more practical purposes, to build better touch screen devices and robot hands, a more well-rounded virtual life.

It’s interesting that the author assumes studying tactile illusions is impractical. In fact, we haptic interface designers use tactile illusions all the time to achieve design goals.

“There’s a fair amount of research into new ways of offloading information onto our tactile sense,” said Lynette Jones of the Massachusetts Institute of Technology. “To have your cellphone buzzing as opposed to ringing turned out to have a lot of advantages in some situations, and the question is, where else can vibrotactile cues be applied?”

For all its antiquity and constancy, touch is not passive or primitive or stuck in its ways. It is our most active sense, our means of seizing the world and experiencing it, quite literally, first hand. Susan J. Lederman, a professor of psychology at Queen’s University in Canada, pointed out that while we can perceive something visually or acoustically from a distance and without really trying, if we want to learn about something tactilely, we must make a move. We must rub the fabric, pet the cat, squeeze the Charmin.

I doubt that Lederman actually said those words, because they’re inaccurate and Lederman is one of the leading scientists in the field. Tactile sensations do not require movement; haptic perception does. Touch is incredibly hard to talk about. To do so successfully we must be particular about terminology.

And with every touchy foray, Heisenberg’s Uncertainty Principle looms large. Contact is a two-way street, and that’s not true for vision or audition, Dr. Lederman said. If you have a soft object and you squeeze it, you change its shape. The physical world reacts back.

This is also important—the act of touching necessarily produces change in the world (and thus, the context of experience), making it, at least figuratively, a singularity, or an infinitesimal feedback loop. However, I’m quite sure the Heisenberg uncertainty principle is not involved. Maybe the author knows that. Did she mean to use a difficult concept in particle physics as a metaphor for another difficult concept in perceptual science? If so, does this make the point clearer or more complicated?

Another trait that distinguishes touch is its widespread distribution. Whereas the sensory receptors for sight, hearing, smell and taste are clustered together in the head, conveniently close to the brain that interprets the fruits of their vigils, touch receptors are scattered throughout the skin and muscle tissue and must convey their signals by way of the spinal cord. There are also many distinct classes of touch-related receptors: mechanoreceptors that respond to pressure and vibrations, thermal receptors primed to sense warmth or cold, kinesthetic receptors that keep track of where our limbs are, and the dread nociceptors, or pain receptors—nerve bundles with bare endings that fire when surrounding tissue is damaged.

The signals from the various touch receptors converge on the brain and sketch out a so-called somatosensory homunculus, a highly plastic internal representation of the body. Like any map, the homunculus exaggerates some features and downplays others. Looming largest are cortical sketches of those body parts that are especially blessed with touch receptors, which means our hidden homunculus has a clownishly large face and mouth and a pair of Paul Bunyan hands.

There’s another part of the sensory homunculus that’s clownishly large. Go there!

“Our hands and fingers are the tactile equivalent of the fovea in vision,” said Dr. Dijkerman, referring to the part of the retina where cone cell density is greatest and visual acuity highest. “If you want to explore the tactile world, your hands are the tool to use.”

Our hands are brilliant and can do many tasks automatically—button a shirt, fit a key in a lock, touch type for some of us, play piano for others. Dr. Lederman and her colleagues have shown that blindfolded subjects can easily recognize a wide range of common objects placed in their hands. But on some tactile tasks, touch is all thumbs. When people are given a raised line drawing of a common object, a bas-relief outline of, say, a screwdriver, they’re stumped. ‘If all we’ve got is contour information,’ Dr. Lederman said, ‘no weight, no texture, no thermal information, well, we’re very, very bad with that.’

This makes sense. If you think about it, the visual qualities of an object have little in common with sensory qualities of the same stimulus perceived through a different sensory channel. Imagine that you took a two dimensional contour of a screwdriver and mapped the X and Y values to pitch and loudness, and then listened to the contour of the screwdriver. It wouldn’t sound much like a screwdriver. In fact, I would bet that if you had someone listen to that, and then listen to silence, she would say that the silence sounded more like a screwdriver than the contour.
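For fun, here is what that thought experiment might look like as code — a toy mapping of a 2-D contour to pitch and loudness, with arbitrary scaling of my own choosing:

```python
# Toy version of the thought experiment: map a contour's (x, y) points to
# (frequency in Hz, loudness 0..1) pairs. The scaling is arbitrary.

def contour_to_tones(contour_xy, f_lo=200.0, f_hi=2000.0):
    """Map x -> pitch and y -> loudness for each point along the contour."""
    xs = [x for x, _ in contour_xy]
    ys = [y for _, y in contour_xy]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    tones = []
    for x, y in contour_xy:
        fx = (x - x_min) / (x_max - x_min) if x_max > x_min else 0.0
        fy = (y - y_min) / (y_max - y_min) if y_max > y_min else 0.0
        tones.append((f_lo + fx * (f_hi - f_lo), fy))
    return tones

# A crude "screwdriver" outline; listening to this sweep would tell you next to
# nothing about screwdrivers, which is exactly the point.
print(contour_to_tones([(0, 1), (8, 1), (8, 3), (10, 3), (10, 0), (8, 0), (0, 0)]))
```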

Touch also turns out to be easy to fool. Among the sensory tricks now being investigated is something called the Pinocchio illusion. Researchers have found that if they vibrate the tendon of the biceps, many people report feeling that their forearm is getting longer, their hand drifting ever further from their elbow. And if they are told to touch the forefinger of the vibrated arm to the tip of their nose, they feel as though their nose was lengthening, too.

I want to try this!

Some tactile illusions require the collusion of other senses. People who watch a rubber hand being stroked while the same treatment is applied to one of their own hands kept out of view quickly come to believe that the rubber prosthesis is the real thing, and will wince with pain at the sight of a hammer slamming into it. Other researchers have reported what they call the parchment-skin illusion. Subjects who rubbed their hands together while listening to high-frequency sounds described their palms as feeling exceptionally dry and papery, as though their hands must be responsible for the rasping noise they heard. Look up, little Pinocchio! Somebody’s pulling your strings.

It’s great to see that ordinary journalists are curious about tactility. The article includes many of my own talking points about touch, which is cool. Of course, the way to make someone really appreciate touch is by hugging or hitting them. Alas, writing about touch will have to do for now, until haptic technology makes its glorious arrival.



My hometown's newspaper puts those annoying quotes around the word "feel"

On 12, Nov 2008 | One Comment | In tactility | By David Birnbaum

The San Diego Union-Tribune recently published an article about haptics:

On one computer, users could “feel” the contours of a virtual rabbit.

Do users “feel” the contours of a virtual rabbit, or do they just feel them? Do we “read” text on the internet, or just read it? When we watch a movie do we “see” the actors? Harumph.

The article is about Butterfly Haptics, whose device is a haptic interface based on magnetic levitation. I “felt” it at SIGGRAPH ’08, and it was extraordinarily crisp and strong. The only problem is that the workspace (range of motion) is tiny compared to other haptic interfaces, and there doesn’t seem to be a clear development path for expanding the workspace using magnetic technology. Nevertheless, it’s great to be able to add magnetic field actuation to the relatively limited number of technologies that can be used for haptic display.



Philosopher deathmatch, and how words are like tools

On 07, Sep 2008 | No Comments | In books, language, tactility | By David Birnbaum

I just finished reading Wittgenstein’s Poker. From the jacket:

In October 1946, philosopher Karl Popper arrived at Cambridge to lecture at a seminar hosted by his legendary colleague Ludwig Wittgenstein. It did not go well: the men began arguing, and eventually, Wittgenstein began waving a fire poker toward Popper. It lasted scarcely 10 minutes, yet the debate has turned into perhaps modern philosophy’s most contentious encounter, largely because none of the eyewitnesses could agree on what happened. Did Wittgenstein physically threaten Popper with the poker? Did Popper lie about it afterward?

The authors provide a comprehensive biographical and historical context for the incident, and use it as a springboard into the two men’s respective philosophies. It’s an enjoyable look at two self-important, short-tempered intellectuals and their rivalry.

As I mentioned in this post, I find Wittgenstein’s philosophy of language often invokes touch themes. In the following excerpt from Poker (originating from one of his lectures), Wittgenstein makes a point about a colleague’s statement, “Good is what is right to admire,” utilizing a haptic metaphor:

The definition throws no light. There are three concepts, all of them vague. Imagine three solid pieces of stone. You pick them up, fit them together and you now get a ball. What you’ve now got tells you something about the three shapes. Now consider you have three balls of soft mud or putty. Now you put the three together and mold out of them a ball. Ewing makes a soft ball out of three pieces of mud. (68)

Another example stems from Wittgenstein’s midlife change in philosophical outlook. In his first publication, the Tractatus Logico-Philosophicus, he was preoccupied with the “picture theory of language”—the idea that sentences describe “states of affairs” that can be likened to the contents of a picture. Later, he developed a theory of language based on words as tools for conveying meaning. In my reading, he shifted from a vision-based to a haptic-based (in fact, a distinctly physical-interaction-based) understanding of how language works.

The metaphor of language as a picture is replaced by the metaphor of language as a tool. If we want to know the meaning of a term, we should not ask what it stands for: we should instead examine how it is actually used. If we do so, we will soon recognize that there is no underlying single structure. Some words, which at first glance look as if they perform similar functions, actually operate to distinct sets of rules. (229)

Here’s the relevant passage directly from Philosophical Investigations:

It is like looking into the cabin of a locomotive. We see handles all looking more or less alike. (Naturally, since they are all supposed to be handled.) But one is the handle of a crank which can be moved continuously (it regulates the opening of a valve); another is the handle of a switch, which has only two effective positions, it is either off or on; a third is the handle of a brake-lever, the harder one pulls on it, the harder it brakes; a fourth, the handle of the pump: it has an effect only so long as it is moved to and fro. (PI, I, par. 12)

Words as the physical interface to meaning. Love it!



Wittgenstein

On 28, May 2008 | One Comment | In books, tactility | By David Birnbaum

The philosophy of Ludwig Wittgenstein seems to come up often, so I decided to read up on what all the hoopla is about. This semi-biographical introduction to his major works is fascinating and easy to read. It seemed pretty comprehensive as well, though I’m no expert.

Wittgenstein was primarily concerned with logic and language, but I found his emphasis on know-how as opposed to know-that, and his view that skill supersedes knowledge of rules, to have a certain ‘embodiment’ quality to it. Excerpts:

So, language can indeed be said to be governed by rules; but those rules are for the most part only implicit in native speakers’ common usage. They can be derived from common usage by anyone who pays attention to it, but they are rarely operative in it: we do not normally use rules to work out what is correct. Rather, we have a fairly reliable ‘feeling’ for what sounds right in a given case. Rules can be formulated to codify our usage, but our usage is not ultimately based on such rules. (191)

Our traditional concept of pain, however, is not in competition with concepts that classify the same phenomena in terms of their underlying conditions. No physiologist could convince me that what in my own case I call ‘pain’ may not in fact be pain, or that my pain was in truth not where I felt it but in the brain. Why is this so?

Concepts are an expression of our interests. We group things together and call them by a common name according to those resemblances we find striking or important. And in different contexts we may be interested in different aspects of things. To classify phenomena scientifically, by their underlying structures or causes, is not always what we want. For instance, when taking an aesthetic attitude towards things, we are concerned entirely with their appearances. Invisible micro-structures become wholly irrelevant. And another area in which the scientific urge to leave behind the surface for underlying causes is often out of place is the realm of feeling: where our primary interest is in people’s conscious experience. Their suffering and well-being is important to us in its own right, and not merely as an indication of some underlying physiological conditions. Therefore physiological concepts like ‘lesion of tissue’ — whatever their importance for diagnosis and therapy — can never be in competition with, or act as substitutes for, our traditional concepts of feelings and emotions that are taught and understood through their links with natural expressive behaviour and characterized by the special authority we have in their first-person use. (254)



How many neurons make a feeling?

On 28, Mar 2008 | No Comments | In neuroscience, tactility | By David Birnbaum

One:

The Dutch and German study, published in Nature, found that stimulating just one rat neuron could deliver the sensation of touch.

“The generally accepted model was that networks or arrays make decisions and that the influence of a single neuron is smaller, but this work and other recent studies support a more important role for the individual neuron.

“These studies drive down the level at which relevant computation is happening in the brain.”

I think it also supports the idea (discussed in detail in my thesis) that the word “touch” serves as a baseline indicator for subjective experience.