Stop, Children, What's That Sound?
Introduction:
Several years ago, we (Kostas Stylidis, Casper Wickman, and Rikard Söderberg), a group of researchers at Chalmers University of Technology, had an idea: by showing annotated images of car parts to people, we could evoke the memories and experiences linked to those parts. In doing so, we wanted to capture what matters most to people when they judge the quality of a car. We framed Perceived Quality attributes around Sensory Modalities. We were right, and a growing body of research today supports that idea.
For example, we knew that when we showed an image of a half-open Mercedes-Benz door, people could recall the very distinct "vault-like" closing sound of that door.
Of course, to be able to recall such an experience, you must be familiar with the subject - by having closed the door of a car, or of a vault. Preferably both.
Today, Intended Future's technology is based on that finding.
We are capturing not people's opinions about present designs but their dreams about their perfect car, home, architectural marvel, or perfume bottle.
And guess what? We are crafting Intelligent Agents that help designers create their perfect designs.
Dive now into this research and wonder once again how complex and beautiful our human brains are.
Executive Summary:
This fascinating study sheds light on the brain's extraordinary ability to process sounds related to hand-object interactions, such as the act of bouncing a ball, within the primary somatosensory cortex (SI)—a brain area traditionally thought to be dedicated to processing touch. The researchers utilized advanced functional MRI techniques and analytical methods to reveal that SI can discern these interaction sounds from unrelated control sounds, such as pure tones or animal vocalizations. This discovery suggests that our sensory processing system is far more interconnected than previously understood, capable of integrating information across different sensory modalities. It underscores the importance of prior experiences in shaping our sensory perceptions, demonstrating the brain's adaptability in making sense of the world around us through a complex web of sensory inputs. This research opens new pathways for understanding how the brain interprets and integrates multisensory information, with potential implications for enhancing sensory processing in clinical settings and developing technologies that mimic human sensory integration.
Key Takeaways:
Multisensory Integration: The primary somatosensory cortex (SI) is involved in processing sounds of hand-object interactions, indicating a broader role in sensory integration than previously understood.
SI's Decoding Ability: SI can distinguish between sounds of hand-object interactions and unrelated sounds, showcasing its capacity to integrate and interpret complex sensory information.
Importance of Prior Experience: The findings suggest that prior experiences play a crucial role in how the brain processes and perceives sensory information, highlighting the adaptability of sensory perception.
What we thought before is not exactly right
Historically, neuroscientific research has primarily focused on how the primary sensory areas of the brain—such as the visual, somatosensory, and auditory cortices—process sensory inputs within their specific modalities. Despite this, it's widely recognized that these primary sensory areas are significantly influenced by inputs from other sensory modalities. Recent advancements have employed multivoxel pattern analysis (MVPA) to reveal that primary sensory cortices can encode detailed information about stimuli that are not inherently related to their primary sensory modality. For instance, research has demonstrated that observing a silent video that implies sound can activate the primary auditory cortex, and similarly, viewing images of objects that can be grasped activates the primary somatosensory cortex, despite the absence of tactile stimulation.
These findings suggest that the brain's early sensory areas can process fine-grained information about stimuli presented through other modalities, likely due to prior multisensory experiences. Anatomical studies support this cross-modal processing, showing potential pathways through which sensory information from one modality might reach primary sensory areas dedicated to another. However, research has not yet fully explored how the primary somatosensory cortex (SI) processes detailed information about familiar sound categories - a gap this study addresses, given the anatomical evidence for direct connections between auditory and somatosensory areas in both animals and humans.
Materials and Methods
The study included ten self-reported right-handed, healthy participants aged 18–25 in a functional MRI experiment. The sample size was chosen in line with previous studies that investigated cross-sensory effects in early sensory areas using multivoxel pattern analysis (MVPA), which typically used similar numbers of participants. All participants reported normal or corrected-to-normal vision and hearing and were compensated for their participation.
Stimuli and Experimental Design
The experiment employed three categories of auditory stimuli: sounds depicting hand–object interactions (e.g., bouncing a basketball, typing on a keyboard), animal vocalizations (e.g., a dog barking, a frog croaking), and pure tones of varying frequencies. Each category included five subcategories with two exemplars each, totaling 30 stimuli. The stimuli were chosen to ensure familiarity and direct or observed experience with the sounds, with control categories selected to compare against the hand–object interaction sounds.
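To make the arrangement of the stimulus set concrete, here is a minimal Python sketch. Only the examples named above come from the paper; the other labels are illustrative placeholders. The point is the 3 × 5 × 2 layout that yields 30 stimuli.

```python
# Sketch of the stimulus set layout: 3 categories x 5 subcategories x 2 exemplars.
# Subcategory names marked with (*) are placeholders, not the study's actual items.
stimulus_set = {
    "hand_object_interaction": [
        "bouncing_basketball", "typing_on_keyboard",
        "crumpling_paper*", "cutting_with_scissors*", "knocking_on_door*",
    ],
    "animal_vocalization": [
        "dog_barking", "frog_croaking",
        "bird_chirping*", "cat_meowing*", "cow_mooing*",
    ],
    "pure_tone": [
        "tone_a*", "tone_b*", "tone_c*", "tone_d*", "tone_e*",
    ],
}

EXEMPLARS_PER_SUBCATEGORY = 2
n_stimuli = sum(len(v) for v in stimulus_set.values()) * EXEMPLARS_PER_SUBCATEGORY
assert n_stimuli == 30  # 3 x 5 x 2
```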
MRI Data Acquisition and Analysis
MRI data were collected using a 3-Tesla MR scanner, with structural and functional images acquired through specific protocols to optimize the quality and reliability of the blood oxygen level-dependent signals. Multivoxel pattern analysis (MVPA) involved training a linear support vector machine (SVM) to discriminate between the spatial patterns of brain activation elicited by each sound category, using a leave-one-run-out cross-validation approach. Additional analyses included erosion analysis to address potential spatial artifacts and univariate deconvolution to explore signal differences between sound categories across various brain regions of interest (ROIs).
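As a rough illustration of the decoding step, the sketch below sets up a linear SVM with leave-one-run-out cross-validation in Python using scikit-learn. It assumes trial-wise activation patterns have already been extracted from an SI region of interest; the array shapes and synthetic data are made up for the example, and the authors' actual pipeline may differ in its details.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

# Hypothetical dimensions: runs, trials per run, voxels in the SI ROI.
n_runs, trials_per_run, n_voxels = 8, 30, 500
rng = np.random.default_rng(0)

X = rng.normal(size=(n_runs * trials_per_run, n_voxels))  # one activation pattern per trial
y = rng.integers(0, 3, size=n_runs * trials_per_run)      # 0/1/2 = three sound categories
runs = np.repeat(np.arange(n_runs), trials_per_run)       # which run each trial came from

# Linear SVM on spatial activation patterns; leave-one-run-out cross-validation
# keeps training and test trials in separate runs, as described above.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
scores = cross_val_score(clf, X, y, cv=LeaveOneGroupOut(), groups=runs)
print(f"Mean decoding accuracy: {scores.mean():.3f} (chance is about 1/3)")
```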
Procedure
Participants underwent a training session to familiarize themselves with the experimental procedure before the scanning sessions. During the fMRI experiment, participants listened to the sounds while performing a one-back repetition counting task, aimed at ensuring attentive listening without requiring explicit motor actions that could confound somatosensory cortex activity. The total experiment included multiple runs, each beginning and ending with silent fixation blocks, and incorporated a mix of stimulus presentations and null trials.
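The logic of the one-back task itself is simple: participants silently count how often a sound is an immediate repeat of the previous one. A minimal sketch of that counting rule, with a made-up example sequence:

```python
def count_one_back_repetitions(sequence):
    """Count items that immediately repeat the previous item."""
    return sum(prev == cur for prev, cur in zip(sequence, sequence[1:]))

# Made-up example block: two immediate repetitions ("basketball" and "dog_bark").
block = ["basketball", "basketball", "typing", "dog_bark", "dog_bark", "tone_low"]
print(count_one_back_repetitions(block))  # -> 2
```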
This comprehensive approach, combining detailed stimulus design, sophisticated MRI data collection and analysis techniques, and carefully structured experimental procedures, aimed to investigate the somatosensory cortex's response to auditory stimuli depicting hand–object interactions in the absence of tactile stimulation.
Things to consider
The study presents compelling evidence that the primary somatosensory cortex (SI) can decode sounds related to familiar hand-object interactions without any tactile stimulation. This finding contrasts with control sounds of animal vocalizations and pure tones, which did not show similar decoding in SI. The research underlines the specificity of SI in processing hand-object interaction sounds, highlighting a sophisticated level of cross-modal sensory integration.
Key insights include:
Cross-Modal Information Processing: The study supports the idea that SI is capable of processing detailed information across different sensory modalities, specifically between auditory and tactile senses. This suggests a broader functional role for SI, extending beyond its traditional sensory domain.
Functional Specialization in SI: Decoding performances were significantly higher for hand-object interaction sounds compared to pure tones, particularly when analyses were focused on hand-sensitive voxels of left SI or when considering the entire SI region. This indicates a degree of functional specialization within SI for processing sounds associated with hand-object interactions.
Anatomical Pathways for Cross-Modal Integration: The study hypothesizes about possible anatomical pathways that could facilitate the observed cross-modal sensory processing in SI, including high-level multisensory convergence zones and direct projections between sensory areas.
Representation of Sound Categories in Somatosensory Cortices: Both primary (SI) and secondary (SII) somatosensory cortices were found to contain fine-grained information about specific sound categories, suggesting a widespread capacity for sound category representation across somatosensory areas.
The Role of Prior Experience: The study highlights the influence of prior audio-haptic experience in shaping the ability of SI to decode hand-object interaction sounds, suggesting that familiar sound categories are processed distinctively based on past interactions.
Hemispheric Differences in Decoding: The two hemispheres did not contribute equally. The left hemisphere, especially in right-handed people, was more strongly involved in decoding sounds of hand-object interactions, such as typing or bouncing a ball. This fits the broader view that the system for controlling actions with tools and objects is lateralized to the left hemisphere in right-handed individuals.
Conclusions
The findings highlight the brain's remarkable capacity for cross-modal sensory integration, showing that auditory information can be processed in regions traditionally associated with touch. SI doesn't just help us feel things; it also helps us make sense of the sounds produced when we use our hands to interact with objects, like bouncing a ball or typing. In other words, our brain mixes information from different senses to help us understand the world better.
We also learned that the left side of the brain, especially in people who use their right hand more, is really good at figuring out these kinds of sounds. This is interesting because it shows that our brain has special areas that are really good at specific tasks, like understanding sounds related to using objects.
The study tells us that what we've experienced before plays a big role in how our brain processes these sounds. This means our brain is really good at learning from our past interactions with the world around us.
In simple terms, this research teaches us more about how smart our brain is at putting together information from what we see, hear, and touch.
Disclaimer: This Future Insight is an adaptation of the original research article entitled “Decoding sounds depicting hand–object interactions in primary somatosensory cortex,” written by Kerri M Bailey, Bruno L Giordano, Amanda L Kaas, and Fraser W Smith, and originally published by Oxford University Press in Cerebral Cortex.
About this paper:
Kerri M Bailey, Bruno L Giordano, Amanda L Kaas, Fraser W Smith, Decoding sounds depicting hand–object interactions in primary somatosensory cortex, Cerebral Cortex, Volume 33, Issue 7, 1 April 2023, Pages 3621–3635, https://doi.org/10.1093/cercor/bhac296