
Integrative Systems

Perception of surface stickiness in different sensory modalities

a functional MRI study

So, Yosup; Kim, Sung-Phil; Kim, Junsuk

doi: 10.1097/WNR.0000000000001419


Introduction

In our daily life, we perceive the surface texture of surrounding objects through various sensory modalities [1]. For example, we can perceive the surface properties of an uneven rock by touching it with our hand. The coarse surface of the rock can also be perceived through other sensory channels, for example, auditory (the harsh sound generated by interacting with it) and visual (its bumpy appearance) channels. To date, a number of human psychophysical studies have explored surface texture perception across sensory modalities [2,3]. Several studies have shown that participants are more accurate and faster when exploring surfaces with their hands than with their eyes and ears [4,5]. It has also been reported that humans are surprisingly good at identifying surface texture from visual stimuli [3] or touch-generated sounds [6] alone. However, most of these studies have investigated texture perception along the roughness and stiffness dimensions, which are regarded as fundamental dimensions of tactile perception [7], and only a few studies have explored the stickiness dimension [8].

In our previous psychophysical study [9], we investigated the sensitivity of human participants to different extents of surface stickiness using tactile, auditory, and visual cues. Interestingly, our results demonstrated that the perceptual mappings for visual and tactile stickiness were statistically similar, whereas auditory stickiness perception was different. In the present study, using functional MRI (fMRI), we explored brain activation during surface stickiness perception evoked by different sensory modalities, that is, tactile, auditory, and visual cues. Several previous neuroimaging studies have identified brain regions involved in the perception of tactile stickiness [10,11], but none has examined brain activation elicited by other sensory modalities. Against this backdrop, we aimed to compare brain activation in response to sticky stimuli conveyed via tactile, auditory, and visual channels. In particular, we attempted to identify common activation profiles across sensory modalities, as well as brain regions involved in stickiness information processing for a particular modality.

Methods

Participants and ethics approval

Twenty-one right-handed volunteers (13 female; average age: 22.9 ± 1.6 years; age range: 20–25 years) participated in the experiment. Participants had no history of neurological disorders or deficits. However, one participant exhibited palmar hyperhidrosis and was excluded from the analysis. Experimental procedures were approved by the ethical committee of Sungkyunkwan University (IRB# 2018-05-001), and the study was conducted in accordance with the Declaration of Helsinki. All participants were informed about the experimental procedure and provided written informed consent before participation. Because the experimental task included active texture exploration with the right index finger, participants’ handedness might affect texture perception. We therefore asked participants to report their handedness in writing when they filled out the informed consent form. Participants were randomly recruited students; hence, they did not know about this study beforehand and were unlikely to have special expertise in tactile or visual perception or in auditory imagery.

Stimuli

As described in our previous behavioral experiments [9], a repositionable tape (3M, St. Paul, Minnesota, USA) was used for the sticky stimuli. In particular, we selected the ‘9425’ tape with a physical stickiness intensity of 131.2 gf (gram-force). This physical adhesiveness was estimated by the ‘probe tack test’, which measures the peak value of adhesive force and is indicative of the instantaneous adhesion property. Using this tape, we created (1) tactile, (2) auditory, and (3) visual stimuli. (1) A piece of tape measuring 5 × 1.9 cm was attached to an acrylic plate sized 5 × 9 cm. The plate enabled the experimenter to present the stimuli easily without direct contact with the participants. (2) The sound generated while the right index fingertip touched and detached from the tape was recorded using a condenser microphone. Each audio clip was 3.5 s long and consisted of two parts, that is, a touching period for the first 2 s and a detaching period for the last 1.5 s. (3) A video clip was recorded at a resolution of 1920 × 1080 and 30 frames per second. Each video clip showed the right index fingertip touching and detaching from the sticky surface. The video camera was positioned 10 cm from the stimulus surface and 5 cm above the tabletop. Each video clip was 3.5 s long and consisted of two parts, similar to the audio clip. Note that audio and video clips were prepared for each participant separately to minimize the potential effect of differences in the appearance of the participants’ hands. Thus, participants watched or heard their own tactile explorations during the fMRI experiment (see Experimental design for more details).
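For illustration only, the following is a toy sketch, not the authors’ measurement software, of how a probe tack value can be read off a force trace: the tack value is simply the peak tensile (adhesive) force recorded while the probe detaches from the adhesive. The sample numbers are made up.

```python
# Toy probe-tack illustration: the tack value is the peak adhesive force
# in the detachment trace. The data below are invented for demonstration.
import numpy as np

def probe_tack_gf(force_gf: np.ndarray) -> float:
    """Return the peak adhesive (tensile) force in gram-force."""
    return float(np.max(force_gf))

# Illustrative detachment trace in gram-force.
trace = np.array([0.0, 45.3, 98.7, 131.2, 120.4, 60.1, 5.0])
print(probe_tack_gf(trace))   # -> 131.2
```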

Experimental design

Before the main experiment, participants completed a training session to familiarize themselves with the stimuli and with the process of recording their own audio and video clips. In this training session, participants repeatedly touched the sticky surface with their right index fingertip. At the same time, they watched and listened to their tactile exploration of the sticky surface on the screen and through the headset.

During the acquisition of fMRI data, participants lay comfortably with their right arm placed along the magnet bore and a response box held in their left hand. They wore MRI-compatible headphones to listen to the auditory stimuli and watched the computer screen via an angled surface mirror. Participants carried out six fMRI runs, that is, two for each sensory modality. Each run consisted of 12 trials, and each trial started with a fixation cross displayed for 6–8 s (jittered), followed by touching (2 s) and detaching (1.5 s) periods. In the tactile runs, participants slowly placed their right index fingertip on the given stimulus when ‘Touch’ was displayed on the screen and maintained contact for 2 s. As soon as ‘Detach’ was displayed, they lifted their finger. Tactile stimuli were exchanged manually by the experimenter standing next to the magnet bore. In the auditory runs, participants heard their own audio clips of touching (2 s) and detaching (1.5 s) recorded during the training session. As in the tactile runs, stimulus onsets were signaled by ‘Touch’ and ‘Detach’ instructions on the screen. In the visual runs, participants watched video clips showing their right index finger exploring the sticky surface for 3.5 s, that is, touching (2 s) and detaching (1.5 s). To confirm that participants consistently attended to the task, they were asked to press a button after the presentation of each stimulus.
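As an illustration of the trial timing described above, the following minimal Python sketch (not the authors’ stimulus-presentation code; all names are placeholders) builds one run of 12 trials with a jittered 6–8 s fixation, a 2 s touch period, and a 1.5 s detach period.

```python
# Minimal sketch of one run's trial schedule: 12 trials, each with a
# jittered fixation (6-8 s), a 2 s "Touch" period, and a 1.5 s "Detach" period.
import numpy as np
import pandas as pd

def build_run_schedule(n_trials=12, seed=0):
    rng = np.random.default_rng(seed)
    rows, t = [], 0.0
    for trial in range(n_trials):
        fix = rng.uniform(6.0, 8.0)           # jittered fixation cross
        touch_on = t + fix                     # "Touch" cue appears
        detach_on = touch_on + 2.0             # "Detach" cue appears
        trial_end = detach_on + 1.5            # finger lifted, trial ends
        rows.append(dict(trial=trial, fixation_onset=t,
                         touch_onset=touch_on, detach_onset=detach_on,
                         trial_end=trial_end))
        t = trial_end
    return pd.DataFrame(rows)

print(build_run_schedule().head())
```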

Data acquisition and preprocessing

fMRI experiments were performed using a 3T MRI scanner (Magnetom TrioTim; Siemens Medical Systems, Erlangen, Germany) with a standard 24-channel head coil. Functional images were acquired using a blood-oxygen-level-dependent (BOLD)-sensitive gradient-echo echo-planar imaging sequence (repetition time (TR) = 2000 ms, echo time (TE) = 35 ms, flip angle = 90°, field of view (FOV) = 200 mm, slice thickness = 2 mm, in-plane resolution = 2 × 2 mm) with 72 slices covering the whole cerebrum. To obtain T1-weighted anatomical images from each participant, a 3D magnetization-prepared gradient-echo sequence was used (TR = 2300 ms, TE = 2.28 ms, flip angle = 8°, FOV = 256 mm, slice thickness = 1 mm, in-plane resolution = 1 × 1 mm). Preprocessing and statistical analysis of the fMRI data were performed using SPM12 (Wellcome Department of Imaging Neuroscience, UCL, London, UK), and a high-pass filter of 128 s was used to eliminate low-frequency noise. The echo-planar imaging data were realigned for motion correction, coregistered to the individual T1-weighted images, normalized into Montreal Neurological Institute space, and spatially smoothed with a 4 mm full-width-at-half-maximum Gaussian kernel.
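The preprocessing was run in SPM12; as a hedged illustration of the same steps, the sketch below expresses them with Nipype’s SPM interfaces. File names are placeholders, a working MATLAB/SPM installation is assumed, and this is not the authors’ batch script.

```python
# Hedged sketch of the SPM12 preprocessing pipeline via Nipype's SPM
# interfaces (requires MATLAB + SPM12; file names are placeholders).
from nipype.interfaces import spm

realign = spm.Realign(in_files='run1_bold.nii')            # motion correction
coreg = spm.Coregister(target='t1.nii',                     # EPI -> individual T1
                       source='mean_bold.nii',
                       apply_to_files=['run1_bold.nii'])
normalize = spm.Normalize12(image_to_align='t1.nii',        # warp to MNI space
                            apply_to_files=['run1_bold.nii'])
smooth = spm.Smooth(in_files='wrun1_bold.nii',              # 4 mm FWHM Gaussian kernel
                    fwhm=[4, 4, 4])

# In practice each step's outputs feed the next (e.g. via a nipype Workflow);
# the 128 s high-pass filter is applied later, at the first-level GLM stage.
```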

Data analysis

Data analyses were performed using a general linear model in SPM12 with a canonical hemodynamic response function to estimate BOLD responses to each stimulus. We defined the moment of detaching the fingertip from the sticky surface as the stimulus onset, because stickiness perception occurs when the skin is physically stretched by the adhesive surface. For the contrasting analysis, we defined the middle of the fixation-cross period as the resting onset. Regressors for stimulation and rest were defined accordingly. Six nuisance regressors for motion (translation and rotation along three orthogonal axes) were also included to correct for head movement. We applied whole-brain analysis rather than a region-of-interest analysis, because the brain regions engaged in stickiness perception were unknown, especially for the auditory and visual domains. We performed two different analyses. First, using an event-related design, we evaluated differences in BOLD activation by comparing stimulus and resting periods for each modality separately. Contrast images from the individual analyses were entered into a second-level analysis with subjects as a random factor. Second, a conjunction analysis was performed. We employed a one-way analysis of variance design and evaluated the conjunction of ‘tactile vs. resting state’, ‘auditory vs. resting state’, and ‘visual vs. resting state’. This conjunction analysis identifies common brain regions in which neural activation was greater during stimulation than during rest across all modalities. The statistical threshold for all analyses was P < 0.05, family-wise error-corrected, with a cluster extent of >40 voxels.
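The analysis itself was carried out in SPM12. Purely as an illustration of the modeling steps (canonical HRF, 128 s high-pass filter, motion confounds, second-level random-effects analysis, and FWE thresholding with a 40-voxel cluster extent), the sketch below re-expresses them with nilearn; file names and onsets are placeholders, and the Bonferroni option is only an approximation of SPM’s voxel-wise FWE correction, not the authors’ code.

```python
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel
from nilearn.glm.second_level import SecondLevelModel
from nilearn.glm import threshold_stats_img

# Illustrative event table for one run: 'stimulation' onsets at the 'Detach'
# moment, 'rest' onsets at the middle of the fixation-cross period.
events = pd.DataFrame({
    "onset":      [9.0, 3.5, 21.0, 15.5],          # seconds (made-up values)
    "duration":   [1.5, 1.5, 1.5, 1.5],
    "trial_type": ["stimulation", "rest", "stimulation", "rest"],
})

# Six SPM realignment parameters used as nuisance regressors.
motion = pd.read_csv("rp_run1.txt", sep=r"\s+", header=None)
motion.columns = [f"motion_{i}" for i in range(6)]

# First level: canonical (SPM) HRF, TR = 2 s, 128 s high-pass filter.
flm = FirstLevelModel(t_r=2.0, hrf_model="spm", high_pass=1 / 128)
flm = flm.fit("wrun1_bold.nii", events=events, confounds=motion)
con_img = flm.compute_contrast("stimulation - rest", output_type="stat")

# Second level: one contrast image per subject, subjects as a random factor.
subject_cons = [con_img]                            # in practice, 20 images
design = pd.DataFrame({"intercept": [1] * len(subject_cons)})
group_map = SecondLevelModel().fit(subject_cons, design_matrix=design) \
                              .compute_contrast("intercept", output_type="z_score")

# P < 0.05 voxel-wise FWE (approximated here by Bonferroni), >40-voxel clusters.
thr_map, thr = threshold_stats_img(group_map, alpha=0.05,
                                   height_control="bonferroni",
                                   cluster_threshold=40)
```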

Results

The contrasting analysis for the tactile condition revealed significant activation in the somatosensory cortices, including the postcentral gyrus (poCG), anterior intraparietal sulcus (aIPS), supramarginal gyrus, and rolandic operculum (Table 1 and Fig. 1). Moreover, the anterior insula (aINS), medial occipital gyrus (MOG), and precentral gyrus (preCG) were activated in the contralateral hemisphere. In the ipsilateral hemisphere, we observed activation in the preCG and aINS. Furthermore, the supplementary motor area (SMA) was activated bilaterally. Auditory stickiness information processing also activated the contralateral somatosensory cortices, including the poCG and aIPS. We observed ipsilateral activations in the medial temporal gyrus and inferior occipital gyrus (IOG). Additionally, there was significant bilateral activation in the preCG, aINS, and SMA. For the visual condition, the contrasting analysis identified significant activation in the contralateral somatosensory cortices (poCG and aIPS), aINS, and IOG. Moreover, the preCG, MOG, and SMA were activated bilaterally.

Table 1: Brain activation from general linear model analyses

Fig. 1: Brain activations of contrasting analyses. Statistical threshold was set at P < 0.05, family-wise error (FWE)-corrected for multiple comparisons, cluster size 40 contiguous voxels.

In addition to the contrasting analyses for each modality, we conducted a conjunction analysis across tactile, auditory, and visual conditions. Results showed a substantial overlap in the conjunction map including clusters in the contralateral poCG, aINS and preCG and in the bilateral SMA (Table 1 and Fig. 2).
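For readers who want a concrete picture of the conjunction step, the sketch below illustrates a minimum-statistic conjunction over the three group-level ‘modality vs. resting state’ maps using nilearn; the file names are placeholders, the Bonferroni option only approximates voxel-wise FWE correction, and this is not the authors’ SPM batch.

```python
# Hedged sketch of a minimum-statistic conjunction across the three
# modality-vs-rest group maps (placeholder file names).
from nilearn.image import math_img
from nilearn.glm import threshold_stats_img

tactile = "tactile_vs_rest.nii"
auditory = "auditory_vs_rest.nii"
visual = "visual_vs_rest.nii"

# A voxel survives the conjunction only if it is active in every contrast,
# so the conjunction map is the voxel-wise minimum across the three maps.
conj_map = math_img("np.minimum(np.minimum(t, a), v)",
                    t=tactile, a=auditory, v=visual)

# Threshold as in the paper: P < 0.05 FWE-corrected (approximated here by
# Bonferroni) with a 40-voxel cluster extent.
conj_thr, _ = threshold_stats_img(conj_map, alpha=0.05,
                                  height_control="bonferroni",
                                  cluster_threshold=40)
```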

Fig. 2: Brain activations of conjunction analyses. Statistical threshold was set at P < 0.05, family-wise error (FWE)-corrected for multiple comparisons, cluster size 40 contiguous voxels.

Discussion

The current study aimed to identify brain regions related to surface stickiness perception for each modality and to search for common brain activations across modalities. In particular, we compared brain responses to surface stickiness information conveyed via tactile, auditory, and visual cues. The contrasting analyses for each modality identified significant activation in the somatosensory cortices together with the primary sensory regions of the corresponding sensory system. Moreover, the results of the conjunction analysis suggest that the poCG, aINS, preCG, and SMA are closely associated with the perception of surface stickiness across sensory modalities.

One of the main findings of this study is that the poCG and aINS regions were consistently activated in both contrasting and conjunction analyses. These commonly activated regions are traditionally known as main areas of the human somatosensory system, and these regions have been reported to be involved in tactile information processing in a number of fMRI studies [12–14]. Moreover, activation in these regions exhibits considerable overlap with regions identified in previous studies on tactile stickiness perception [10,11]. The most interesting aspect of our results is that these somatosensory regions were consistently activated by non-corresponding sensory cues, that is, auditory or visual cues. Several previous neuroimaging studies have observed that the primary sensory cortices of each modality could be activated by non-corresponding sensory input [15,16]. For example, Stilla and Sathian [17] reported significant neural activations in the human somatosensory cortices when the texture of the surface was presented visually. Similarly, Merabet et al. [18] showed that tactile stimulus alone could recruit visual cortical activation. Moreover, a recent fMRI study investigated the convergence of information from different sensory streams and successfully demonstrated that common objects presented with auditory, visual, and tactile modalities are not only reflected within corresponding sensory cortices, but also represented in the sensory area of different modalities [19]. In line with these previous findings, our results suggest that auditory and visual stimuli conveying information on surface stickiness can elicit neural activation in the somatosensory regions.

The activation of the IPS in the contrasting analyses of each modality is a noteworthy observation. This region is known to be implicated in multisensory integration in both primate and human brains. A number of non-human primate [20] and human neuroimaging [17,21] studies have consistently demonstrated that the IPS integrates multisensory information. For example, Jäncke et al. [22] reported that activation of the IPS plays a role in supra-modal integration between visual representation and complex manipulation of objects in the human brain. In our study, we clearly observed simultaneous activation of the corresponding primary sensory cortices and the IPS, for example, simultaneous activation of the primary auditory cortex and the aIPS under the auditory condition. Hence, we speculate that the identification of the aIPS mainly reflects multisensory convergence and cross-modal generalization.

Although there was no finger movement in the auditory and visual conditions, we observed activation of the SMA and preCG. These regions are well known to be associated with the preparation and execution of voluntary movements [23]. However, significant activation in these regions has also been reported during passive touch or motor imagery [24,25], as observed in our present results. Therefore, a possible explanation for the activation of the SMA and preCG is motor imagery. In our experimental procedure, participants underwent a training session prior to MR scanning. While repeatedly touching the sticky surface with their fingertip, they watched and listened to their surface explorations. We therefore believe that a strong association exists between the three different sensory stimuli, and that participants imagined finger movements without actually performing them during the scanning sessions. Consequently, this motor imagery is likely to have influenced the activation in the SMA and preCG.

There are several potential limitations of this study. First, generalization of the results to the broader population should be made with caution, because the sample size was relatively small (N = 21) and our sample consisted solely of college students. Second, we could not completely eliminate the effects of scanner noise during the auditory sessions. In future work, we will use equipment such as noise-cancelling headsets to enable better perception of the auditory cues.

In this study, we successfully identified brain regions carrying information on surface stickiness perceived via tactile, auditory, and visual cues. Intriguingly, our contrasting and conjunction analyses suggest that neural activity in the somatosensory cortices, aINS, and aIPS plays an important role in cross-modal generalization. To the best of our knowledge, this is the first attempt to explore brain activity related to auditory and visual stickiness perception. We envision that future work will uncover the detailed neural mechanisms underlying surface texture perception, such as the perception of stickiness intensity.

Acknowledgements

This work was supported by the Institute for Basic Science (IBS-R015-Y2).

Y.S. and J.K. conducted experiments, analyzed results, and wrote the original draft of the manuscript. S-P.K. revised the manuscript.

Conflicts of interest

There are no conflicts of interest.

References

1. Lederman SJ, Klatzky RL. Multisensory texture perception. In: Calvert E, Spence C, Stein B, editors. Handbook of Multisensory Processes. Cambridge, MA: MIT Press; 2004. pp. 107–122
2. Bergmann Tiest WM, Kappers AM. Haptic and visual perception of roughness. Acta Psychol (Amst). 2007; 124:177–189
3. Lederman SJ, Abbott SG. Texture perception: studies of intersensory organization using a discrepancy paradigm, and visual versus tactual psychophysics. J Exp Psychol Hum Percept Perform. 1981; 7:902–915
4. Heller MA. Texture perception in sighted and blind observers. Percept Psychophys. 1989; 45:49–54
5. Lederman SJ, Klatzky RL. Hand movements: a window into haptic object recognition. Cogn Psychol. 1987; 19:342–368
6. Katz D. The world of touch. 1989, Hillsdale, NJ: Lawrence Erlbaum Associates
7. Hollins M, Faldowski R, Rao S, Young F. Perceptual dimensions of tactile surface texture: a multidimensional scaling analysis. Percept Psychophys. 1993; 54:697–705
8. Bensmaia S. Texture from touch. In: Prescott T, Ahissar E, Izhikevich E, editors. Scholarpedia of Touch. Paris: Atlantis Press; 2016. pp. 207–215
9. Lee H, Lee E, Jung J, Kim J. Surface stickiness perception by auditory, tactile, and visual cues. Front Psychol. 2019; 10:2135
10. Kim J, Yeon J, Ryu J, Park JY, Chung SC, Kim SP. Neural activity patterns in the human brain reflect tactile stickiness perception. Front Hum Neurosci. 2017; 11:445
11. Yeon J, Kim J, Ryu J, Park JY, Chung SC, Kim SP. Human brain activity related to the tactile perception of stickiness. Front Hum Neurosci. 2017; 11:8
12. Kitada R, Hashimoto T, Kochiyama T, Kito T, Okada T, Matsumura M, et al. Tactile estimation of the roughness of gratings yields a graded response in the human brain: an fMRI study. Neuroimage. 2005; 25:90–100
13. Reed CL, Shoham S, Halgren E. Neural substrates of tactile object recognition: an fMRI study. Hum Brain Mapp. 2004; 21:236–246
14. Lederman S, Gati J, Servos P, Wilson D. fMRI-derived cortical maps for haptic shape, texture, and hardness. Cogn Brain Res. 2001; 12:307–313
15. Liang M, Mouraux A, Hu L, Iannetti GD. Primary sensory cortices contain distinguishable spatial patterns of activity for each sense. Nat Commun. 2013; 4:1979
16. Wallace MT, Ramachandran R, Stein BE. A revised view of sensory cortical parcellation. Proc Natl Acad Sci U S A. 2004; 101:2167–2172
17. Stilla R, Sathian K. Selective visuo-haptic processing of shape and texture. Hum Brain Mapp. 2008; 29:1123–1138
18. Merabet LB, Swisher JD, McMains SA, Halko MA, Amedi A, Pascual-Leone A, Somers DC. Combined activation and deactivation of visual cortex during tactile sensory processing. J Neurophysiol. 2007; 97:1633–1641
19. Man K, Damasio A, Meyer K, Kaplan JT. Convergent and invariant object representations for sight, sound, and touch. Hum Brain Mapp. 2015; 36:3629–3640
20. Avillac M, Ben Hamed S, Duhamel JR. Multisensory integration in the ventral intraparietal area of the macaque monkey. J Neurosci. 2007; 27:1922–1932
21. Makin TR, Holmes NP, Zohary E. Is that near my hand? Multisensory representation of peripersonal space in human intraparietal sulcus. J Neurosci. 2007; 27:731–740
22. Jäncke L, Kleinschmidt A, Mirzazade S, Shah NJ, Freund HJ. The role of the inferior parietal cortex in linking the tactile perception and manual construction of object shapes. Cereb Cortex. 2001; 11:114–121
23. Gerloff C, Corwell B, Chen R, Hallett M, Cohen LG. Stimulation over the human supplementary motor area interferes with the organization of future elements in complex motor sequences. Brain. 1997; 120(Pt 9):1587–1602
24. Blakemore SJ, Bristow D, Bird G, Frith C, Ward J. Somatosensory activations during the observation of touch and a case of vision-touch synaesthesia. Brain. 2005; 128:1571–1583
25. Meyer K, Kaplan JT, Essex R, Damasio H, Damasio A. Seeing touch is correlated with content-specific activity in primary somatosensory cortex. Cereb Cortex. 2011; 21:2113–2121
Keywords:

functional MRI; primary sensory cortex; somatosensory cortex; tactile stickiness; texture perception

© 2020 Wolters Kluwer Health | Lippincott Williams & Wilkins