Adapting the Crossmodal Congruency Task for Measuring the Limits of Visual–Tactile Interactions Within and Between Groups
The crossmodal congruency task (CCT) is a commonly used paradigm for measuring visual–tactile interactions and how these may be influenced by discrepancies in space and time between the tactile target and visual distractors. Most studies that have used this paradigm have neither measured nor attempted to control individual variability in unisensory (tactile) performance. We have developed a version of the CCT in which unisensory baseline performance is constrained to enable comparisons within and between participant groups. Participants were instructed to discriminate between single and double tactile pulses presented to their dominant hand, at their own approximate threshold level. In Experiment 1, visual distractors were presented at −30 ms, 100 ms, 200 ms, and 400 ms stimulus onset asynchronies (SOAs). In Experiment 2, ipsilateral visual distractors were presented 0 cm, 21 cm, and 42 cm vertically from the target hand, and 42 cm in a symmetrical, contralateral position. Distractors presented at the −30 ms SOA (Experiment 1) and at 0 cm from the target hand (Experiment 2) produced significantly larger congruency effects than those at the other time points and spatial locations. Thus, the typical limits of visual–tactile interactions were replicated using a version of the task in which baseline performance can be constrained. The usefulness of this approach is supported by the observation that tactile thresholds correlated with self-reported autistic traits in this non-clinical sample. We discuss the suitability of this adapted version of the CCT for measuring visual–tactile interactions in populations where unisensory tactile ability may differ within and between groups.
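As a minimal illustration of the analysis logic described above, the Python sketch below shows how a congruency effect could be computed from trial-level accuracy, together with a simple 1-up/2-down adaptive staircase of the kind commonly used to hold tactile discrimination near threshold. All names, step sizes, and the accuracy-based effect measure are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: names, step sizes, and the accuracy-based
# congruency measure are assumptions for demonstration, not details
# taken from the paper.

from dataclasses import dataclass


@dataclass
class Staircase:
    """Simple 1-up/2-down adaptive staircase, which converges near
    ~70.7% correct and is one common way to hold tactile discrimination
    at an approximate threshold level."""
    intensity: float = 1.0   # pulse intensity in arbitrary units
    step: float = 0.05
    correct_streak: int = 0

    def update(self, correct: bool) -> float:
        """Adjust intensity after each threshold-setting trial."""
        if correct:
            self.correct_streak += 1
            if self.correct_streak == 2:   # two correct in a row -> harder
                self.intensity = max(0.0, self.intensity - self.step)
                self.correct_streak = 0
        else:                              # any error -> easier
            self.intensity += self.step
            self.correct_streak = 0
        return self.intensity


def congruency_effect(trials: list[dict]) -> float:
    """Congruency effect as the difference in error rate between
    incongruent and congruent trials (larger values indicate stronger
    interference from the visual distractors). Each trial dict is
    assumed to contain 'congruent' (bool) and 'correct' (bool)."""
    def error_rate(subset: list[dict]) -> float:
        return 1.0 - sum(t["correct"] for t in subset) / len(subset)

    incongruent = [t for t in trials if not t["congruent"]]
    congruent = [t for t in trials if t["congruent"]]
    return error_rate(incongruent) - error_rate(congruent)
```

In a design like the one described, such a staircase would run during a unisensory pre-test to fix each participant's pulse intensity before the congruency trials; the paper itself may use a different threshold procedure and performance measure.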
Related Results
Neuromodulation of crossmodal influences on visual cortex excitability
Crossmodal interactions occur not only within brain regions deemed to be heteromodal, but also within primary sensory areas, traditionally considered as modality-specific. So far, ...
Crossmodal Hierarchical Predictive Coding for Audiovisual Sequences in Human Brain
Predictive-coding theory proposes that the brain actively predicts sensory inputs based on prior knowledge. While this theory has been extensively researched within individ...
Crossmodal correspondences
For more than a century now, researchers have acknowledged the existence of crossmodal congruency effects between dimensions of sensory stimuli in the general (i.e., non-synestheti...
Multiple expectancies underlie the congruency sequence effect in confound-minimized tasks
The congruency sequence effect (CSE) occurs when the congruency effect observed in tasks such as the Eriksen flanker task is smaller on trials preceded by an incongruent trial rela...
Dissociable crossmodal recruitment of visual and auditory cortex for tactile perception
Primary sensory areas previously thought to be devoted to a single modality can exhibit multisensory responses. Some have interpreted these responses as evidence for crossmodal rec...
Radiographic Evaluation of Congruency of the First Metatarsophalangeal Joint in Hallux Valgus
Background: Congruency of the first metatarsophalangeal (MTP) joint is extremely important for the selection of surgical methods and prognosis, while radiographic e...
Investigating visuo-tactile mirror properties in Borderline Personality Disorder: a TMS-EEG study
Patients with Borderline Personality Disorder (pw-BPD) are characterized by lower levels of cognitive empathy compared to healthy controls (HCs), indicating difficulties in...
Artificial SA-I, RA-I and RA-II/vibrotactile afferents for tactile sensing of texture
Robot touch can benefit from how humans perceive tactile textural information, from the stimulation mode to which tactile channels respond, then the tactile cues and encoding. Usin...

