
Spatial interactions across vision, audition and touch: searching for a taxonomy

May 30, 2014 @ 9:00 am - 5:00 pm

Cross-modal interactions between vision and hearing involving spatial properties occur at a number of different levels and in a number of different ways. Some interactions are best explained in terms of enhancement effects involving neurones with cross-modal receptive fields; some involve mechanisms of selective attention, with a stimulus in one sense cuing attention to a location in another; and in some auditory cases, spatial interactions are modulated by binding: when acoustic elements are bound together, each contributes spatial information, and in some cases vision determines which elements get bound together. The ventriloquism effect involves the integration of spatial information across vision and hearing (and other senses) and can be modulated by bottom-up and perhaps also by top-down mechanisms.

– Can the ways in which spatial information is integrated in these different cases be explained in terms of the same mechanisms? What other kinds of spatial interaction are there?

– There’s an existence question here: why do these different levels of interaction, and correspondingly different mechanisms, exist? What is their function, and why couldn’t that function be subserved by a single kind of interaction or mechanism?

– Is there an explanation that should lead us to expect functionally similar mechanisms linking different senses: can we find the same kinds of mechanisms linking vision and touch, or touch and audition; if not, why not?

A further question concerns the extent to which we should consider the mechanisms involved in these interactions as spatial. For example, we might expect that the way spatial properties are encoded in vision is different from the way in which they are encoded in audition and touch.

– What is the significance of these differences? Are auditory processes intrinsically spatial in the way visual processes are; if not, why think of them as spatial at all? What about tactile processes – are they intrinsically spatial in the same way and for the same reasons as visual processes; if not, how and why are they different?

– Should we think of some auditory processes as only derivatively spatial – as inheriting or depending for their spatial interpretation on their relation to another sense modality? Are there any aspects of the spatial properties in touch that depend on a connection between touch and vision? Are there similar differences in the way space is encoded across all the senses?

Organizer

Rethinking the Senses

Venue

University of London
Senate House, Malet St
London, WC1E 7HU United Kingdom