Our ability to perceive and discriminate textures is based on the processing of high-frequency vibrations generated on the fingertip as it is scanned across a surface. But how do we process tactile information when we simultaneously experience different cues at separate locations on the body? In this podcast, Editor-in-Chief Bill Yates talks with Jeff Yau about his recent study, which found that vibrations experienced on one hand systematically modulate the perception of vibrations on the other hand. Listen to learn about somatosensory interactions, the role of hand position in tactile perception, and more!
Somatosensory interactions reveal feature-dependent computations
Md. Shoaibur Rahman and Jeffrey M. Yau
Journal of Neurophysiology, Published online April 10, 2019.