Hearing with the skin

  • October 21, 2015

We both hear and feel bass vibrations.  This is familiar to anyone who has felt bass thumping in their chest, or has simply rested their hand on a piano.  Hearing involves the skin, not just the ears.

This “non-classical auditory pathway” has some interesting features:

  1. It works well — All by itself it’s enough for a professional musician.
  2. We’re born with it — The system is strongest when we’re kids, but some may gradually lose it.
  3. We use it for clarity — Tactile cues affect the words we hear.

 

It works well – Although for most people the skin merely participates in hearing, in some cases it can do the whole job by itself.  The Scottish percussionist Evelyn Glennie, for example, became profoundly deaf by age 12, but remains a world-class performer.  She once explained what it’s like to hear primarily with her skin, and how she trained her non-classical pathway for perfect pitch.  Sometimes she performs barefoot to hear better.  A recent performance of hers appears in a TED talk.


Figure 1: “Deaf” percussionist Evelyn Glennie at Moers Festival 2004. Licensed under CC BY-SA 2.5 via Wikimedia Commons: https://commons.wikimedia.org/wiki/File:Evelyn-glennie.jpg

 

Excerpts from Evelyn Glennie’s Hearing Essay

“Hearing is basically a specialized form of touch. Sound is simply vibrating air which the ear picks up and converts to electrical signals, which are then interpreted by the brain. The sense of hearing is not the only sense that can do this, touch can do this too. If you are standing by the road and a large truck goes by, do you hear or feel the vibration? The answer is both….

With very low frequency vibration the ear starts becoming inefficient and the rest of the body’s sense of touch starts to take over. For some reason we tend to make a distinction between hearing a sound and feeling a vibration, in reality they are the same thing. It is interesting to note that in the Italian language this distinction does not exist. The verb ‘sentire’ means to hear and the same verb in the reflexive form ‘sentirsi’ means to feel…

Deafness does not mean that you can’t hear, only that there is something wrong with the ears. Even someone who is totally deaf can still hear/feel sounds….

…The low sounds I feel mainly in my legs and feet and high sounds might be particular places on my face, neck and chest.”

 

Evelyn Glennie’s career, and her 2015 Polar Music Prize, show how rich and precise the sense of tactile sound can be.

 

We’re born with it – Although tactile sound can be an especially rich sense for an adult musician like Evelyn Glennie, we’re all born with it.  Recently, Prof. Aage Moller showed that younger people are particularly sensitive to tactile sound.  In Neuroscience Letters, he reported the change in perceived loudness of “click” sounds when people of different ages were given a tingling sensation in the hand.  Subjects matched the loudness of the clicks with and without electrical stimulation of the median nerve (Figure 2, below).  The younger the individual, the greater the average loudness increase.  The increases could be substantial, since a 3 decibel increase roughly doubles the sound power.


Figure 2: Average change in perceived loudness during median nerve stimulation as a function of age.  Reprinted from Moller et al. 2002.

 

Although the loudness effect diminished for adults in Moller’s study, other studies show that tactile sound increases loudness in adults as well.  The difference may be the use of electrical stimulation at the hand rather than vibration, which does not seem to show the same decline.  For example, studies by Sean Olive, Director of Acoustic Research at Harman, show that tactile bass from seat shakers in cars can offset about 6 dB of acoustic volume [1].

 

[1] Martens, William L., et al. “Whole-Body Vibration Associated with Low-Frequency Audio Reproduction Influences Preferred Equalization.” Audio Engineering Society Conference: 36th International Conference: Automotive Audio. Audio Engineering Society, 2009.
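
For readers who want the arithmetic behind those decibel figures, the short sketch below converts a level change in dB into the equivalent power and amplitude ratios: the 3 dB increase in Moller’s data is roughly a doubling of sound power, and the 6 dB offset above is roughly a quadrupling.  The helper names are ours, purely for illustration.

```python
# Decibels express a power ratio on a log scale: dB = 10 * log10(P2 / P1).
# These toy helpers invert that relationship for the figures quoted above.

def db_to_power_ratio(db: float) -> float:
    """Power ratio corresponding to a level change in dB."""
    return 10 ** (db / 10)

def db_to_amplitude_ratio(db: float) -> float:
    """Amplitude (sound pressure) ratio corresponding to a level change in dB."""
    return 10 ** (db / 20)

for db in (3, 6):
    print(f"+{db} dB  ->  {db_to_power_ratio(db):.2f}x power, "
          f"{db_to_amplitude_ratio(db):.2f}x amplitude")

# Output:
# +3 dB  ->  2.00x power, 1.41x amplitude
# +6 dB  ->  3.98x power, 2.00x amplitude
```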

 

We use it for clarity – A recent paper in Nature shows that people regularly use tactile cues to clarify their perception of speech.  Some speech sounds produce a puff of air.  We perceive these puffs and use them to help differentiate similar sounds, like “p” and “b.”  In an experiment, artificial puffs of air swayed listeners’ perception toward one sound or the other (Figure 3, below).


Figure 3: Air puffs steering sound perception toward “p” or “b.”  Reprinted from Gick, Bryan, and Donald Derrick. “Aero-tactile integration in speech perception.” Nature 462, 502–504 (26 November 2009).

 

This is important because languages have many confusing word pairs differentiated by puffs of air.  These differences can change the meaning of a phrase.  For example, at the start of a word, a tactile “puff” cue can help distinguish:

“I hate peas”

from

“I hate bees.”

 

Or, at the end of a word a listener might use them to distinguish:

“The woman in his lab”  

from

“The woman in his lap.”

 

The point is that everyone naturally brings these tactile cues to their perception of sound.  Tactile sound is not just about bass.  It is also about clarity.

In conclusion – The studies above were all published within the last 15 years, so it’s clear that we’re just starting to appreciate the importance of tactile sound.  Taction Technology is hard at work creating hardware, software, and products that engage this vital sense.  It’s a rewarding and worthwhile project because…

the non-classical auditory pathway:

  • is musical
  • is something we’re all born with
  • is something we use for clarity every day.

 

Taction FAQ For Gaming

  • October 14, 2015

How much will Taction headphones speed up my reaction time?

About 60 milliseconds.  A person will have a range of reaction times for a given task.  For example, to click a left or right mouse button when hearing a sound on the left or right side, it commonly takes 150-400 milliseconds for the person to hear it, pick the side, and flex the finger.  Adding Taction shifts that normal range toward shorter times, typically by about 60 milliseconds.  Typical results for three players doing this task are shown below.
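
Separately from those measured results, here is a rough illustration only (this is not our measurement code, and the distribution parameters are made up rather than measured): a toy simulation of the kind of shift described above, with reaction times spread between roughly 150 and 400 milliseconds and the same distribution moved about 60 milliseconds earlier.

```python
import random
from statistics import mean

# Toy model only: reaction times drawn from a skewed distribution spanning
# roughly 150-400 ms, then shifted ~60 ms earlier to mimic the reported effect.
# The distribution shape and its parameters are illustrative, not measured data.

def simulate_reaction_times(n: int, shift_ms: float = 0.0, seed: int = 1) -> list:
    rng = random.Random(seed)
    times = []
    for _ in range(n):
        base = 150 + rng.gammavariate(2.0, 40.0)   # long right tail, like real reaction times
        times.append(max(100.0, base - shift_ms))  # clamp so nothing gets unrealistically fast
    return times

baseline = simulate_reaction_times(10_000)
with_tactile = simulate_reaction_times(10_000, shift_ms=60.0)

print(f"baseline mean:     {mean(baseline):.0f} ms")
print(f"with tactile cue:  {mean(with_tactile):.0f} ms")
```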

Does 60 milliseconds matter?

Yes, gamers care about 60 milliseconds.  It’s the difference between a playable latency (e.g., 50 milliseconds, perfectly fine) and a latency that makes the game unplayable (e.g., 110 milliseconds, a disaster).  On CNET, experienced gamers report they can tell the difference between 20 milliseconds and 50 milliseconds of lag.  Taction won’t make a bad network connection good, but it does remove a source of latency in your body: it reduces the time you need to locate sound sources and take action.

Why does Taction make it easier to tell sounds on the left from sounds on the right?

Because a touch on the right side of your body is always “from the right,” while sound from the right reaches both ears almost equally.  It’s true that we have built-in auditory processing to help locate sounds, but adding the sense of touch certainly makes it easier.  It adds, at minimum, a “Tactile Interaural Level Difference” to the acoustic cues we use to locate sound.  With audio-tactile processing and our uniquely expressive hardware, we can provide much more than a level difference, but even a tactile level difference is a big step.
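
As a conceptual sketch only, and not Taction’s actual signal processing, the snippet below shows the simplest form of that idea: measure the level of the left and right audio channels over a short frame, then drive the left and right tactile transducers in proportion, so a sound that is louder on the right also feels stronger on the right.

```python
import math

# Conceptual sketch of a "tactile interaural level difference":
# measure the RMS level of each audio channel over a short frame and use it
# to set the drive level of the corresponding tactile transducer.
# This illustrates the idea only; it is not Taction's actual processing.

def rms(samples):
    """Root-mean-square level of one channel's samples (floats in -1..1)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def tactile_drive_levels(left_frame, right_frame):
    """Return (left, right) tactile intensities in 0..1, proportional to channel level."""
    left_level, right_level = rms(left_frame), rms(right_frame)
    peak = max(left_level, right_level, 1e-9)   # avoid divide-by-zero on silence
    return left_level / peak, right_level / peak

# Example: a 60 Hz tone that is twice as strong in the right channel.
left = [0.25 * math.sin(2 * math.pi * 60 * t / 8000) for t in range(256)]
right = [0.50 * math.sin(2 * math.pi * 60 * t / 8000) for t in range(256)]
l_drive, r_drive = tactile_drive_levels(left, right)
print(f"left drive: {l_drive:.2f}, right drive: {r_drive:.2f}")   # the right side feels stronger
```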

What makes the tactile route fast?

We don’t know.  We do know that user studies often find that tactile signals let people react faster than visual or auditory signals (see, for example, [1]).  We also know that the part of the brain that processes touch (the somatosensory cortex) is right next to the part that moves your body (the motor cortex).  The part that processes sound is farther away, and the part that processes vision is farther still (see the figure below).

We also know that the spatial clarity and fast processing time of the tactile channel work well for military pilots [2], and that it remains an ongoing area of military research and development [3].

[1] D. Hecht, M. Reiner, G. Halevy, “Multi-Modal Stimulation, Response Time, and Presence.” Presence 2005.

[2] McGrath, B. J., et al. Tactile Situation Awareness System Flight Demonstration. No. USAARL-2004-10. Army Aeromedical Research Lab, Fort Rucker, AL, 2004. Available online 10/13/2014 at http://www.dtic.mil/dtic/tr/fulltext/u2/a422198.pdf

[3] John Chiasson, B.Sc., Braden J. McGrath, Ph.D., Angus H. Rupert, M.D., Ph.D. Enhanced Situation Awareness in Sea, Air and Land Environments. DTIC Report; Naval Medical Research Laboratory, FL. Available online 10/13/2015 at http://www.faa.gov/about/office_org/headquarters_offices/avs/offices/aam/cami/library/online_libraries/aerospace_medicine/sd/media/mp-086-32.pdf

Does it feel natural?

Yes, absolutely it feels natural.  In fact, it feels far more natural than headphones without it.  We are used to feeling music and other sounds with the skin; think of the music thumping in your chest when you stand in front of a big speaker.  Once people try it, they feel like something is missing when they use headphones without Taction.