Considerations in Audio-Visual Interaction Models: An ERP Study of Music Perception by Musicians and Non-musicians

Frontiers in Psychology 11 (2021)

Abstract

Previous research with speech and non-speech stimuli suggested that in audiovisual (AV) perception, visual information that starts before the onset of the corresponding sound can provide visual cues and form a prediction about the upcoming auditory event. This prediction leads to AV interaction: auditory and visual perception interact and induce suppression and speeding up of early auditory event-related potentials (ERPs) such as N1 and P2. To investigate AV interaction, previous research examined N1 and P2 amplitudes and latencies in response to audio-only, video-only, audiovisual, and control (CO) stimuli, and compared AV with auditory perception under four AV interaction models. The current study addresses how different models of AV interaction express N1 and P2 suppression in music perception. Furthermore, the current study took one step further and examined whether previous musical experience, which can potentially lead to higher N1 and P2 amplitudes in auditory perception, influenced AV interaction under the different models. Musicians and non-musicians were presented with recordings of a keyboard /C4/ key being played, as well as CO stimuli. Results showed that the AV interaction models differ in how they express N1 and P2 amplitude and latency suppression: how each model is calculated has consequences for the resulting N1 and P2 difference waves. Furthermore, while musicians, compared to non-musicians, showed higher N1 amplitude in auditory perception, suppression of amplitudes and latencies for N1 and P2 was similar for the two groups across the AV models. Collectively, these results suggest that when visual cues from finger and hand movements predict the upcoming sound in AV music perception, suppression of early ERPs is similar for musicians and non-musicians. Notably, the calculation differences across models do not lead to the same pattern of results for N1 and P2, demonstrating that the four models are neither interchangeable nor directly comparable.

Links

PhilArchive





Similar books and articles

Tunes and Tones: Music, Language, and Inhibitory Control. Robert E. Graham & Usha Lakshmanan - 2018 - Journal of Cognition and Culture 18 (1-2):104-123.
What is Music? Zina Schiff, Wendy Carlos, Bruce Buntschuh, Christopher Hogwood & Richard Boulanger - 1989 - WGBH Education Foundation, Coronet Film & Video [distributor].
Hearing and Painting: Neuroaesthetic Theoretical Insights. Alain Londero, Capucine Payre, Zoï Kapoula & Jacqueline Lichtenstein - 2018 - In Zoï Kapoula, Emmanuelle Volle, Julien Renoult & Moreno Andreatta (eds.), Exploring Transdisciplinarity in Art and Sciences. Springer Verlag. pp. 149-163.
Musical Perceptions. Rita Aiello & John A. Sloboda (eds.) - 1994 - New York: Oxford University Press.
Object Perception: Vision and Audition. Casey O’Callaghan - 2008 - Philosophy Compass 3 (4):803-829.

Analytics

Added to PP
2021-01-20

Downloads
33 (#487,172)

6 months
22 (#124,404)
