Musical Experience Modulates Audiovisual Perception – Evidence from event-related potentials and inter-trial phase coherence

Sorati, Marzie
Doctoral thesis
Full text is not available (Locked)
Permanent link
https://hdl.handle.net/11250/2739961
Date of issue
2021
Collections
  • Institutt for psykologi [1965]
Abstract
This dissertation investigated audiovisual (AV) perception of speech and music in situations where visual information starts before the auditory onset and thereby provides a prediction about the upcoming corresponding sound. Previous electroencephalography (EEG) research has shown that this visually driven prediction can modulate early processing of auditory signals, leading to suppressed and earlier event-related potentials (ERPs), such as the N1 and P2, in AV compared with auditory-only perception. However, the influence of previous experience on this prediction in AV perception has received little attention. To explore this influence, the current project examined musical experience by comparing N1 and P2 amplitudes and latencies between musicians and non-musicians. In addition, the project extends previous research by investigating the predictive effect of visual cues in AV perception with a time-frequency approach, inter-trial phase coherence (ITPC), in delta, theta, alpha, and beta oscillations. ERP suppression and latency reduction resulting from predictive visual cues in AV perception were evaluated for four previously developed AV models.
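ITPC quantifies how consistently the phase of an oscillation is aligned across trials at each time point: it is the magnitude of the trial-average of unit-length phase vectors, ranging from 0 (random phase) to 1 (perfect phase locking). The sketch below is a minimal illustration of that computation, not the dissertation's actual analysis pipeline; the band-pass/Hilbert approach, filter order, and the 6 Hz synthetic signal are assumptions made for the example.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def itpc(trials, fs, band):
    """Inter-trial phase coherence per time point.

    trials: array of shape (n_trials, n_samples)
    fs:     sampling rate in Hz
    band:   (low, high) pass-band in Hz, e.g. (4, 8) for theta
    """
    # Band-pass filter each trial (4th-order Butterworth, zero-phase).
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, trials, axis=1)
    # Instantaneous phase from the analytic (Hilbert) signal.
    phases = np.angle(hilbert(filtered, axis=1))
    # ITPC = |mean over trials of unit phase vectors|, per time point.
    return np.abs(np.mean(np.exp(1j * phases), axis=0))

# Example: trials phase-locked to a 6 Hz (theta-band) rhythm plus noise
# should yield ITPC values close to 1 away from the filter edges.
fs = 250
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(0)
trials = np.array(
    [np.sin(2 * np.pi * 6 * t) + 0.3 * rng.standard_normal(t.size)
     for _ in range(40)]
)
theta_itpc = itpc(trials, fs, (4, 8))
```

Because the ITPC is a magnitude of averaged unit vectors, it is bounded by 1 regardless of the data, which makes it easy to compare phase consistency between conditions (e.g. AV vs. auditory-only) on a common scale.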

Musical experience influenced AV speech and music perception. In AV speech perception, seeing facial articulation that precedes and predicts the spoken audio led to reduced ERPs and ITPCs for both musicians and non-musicians compared with auditory-only speech; however, only musicians showed a reduced N1 and suppression of alpha oscillations in AV speech. In AV music perception, seeing finger and hand movements that precede and predict the produced audio likewise led to reduced ERPs and ITPCs for both groups compared with auditory-only music; however, only musicians showed reduced beta oscillations. These results indicate that early sensory processing in AV perception can be modified by musical experience. Furthermore, the differences calculated for the four AV models produced different patterns of results for N1 and P2, indicating that these models are not comparable.

Collectively, these results indicate that previous AV experience, such as that attained through musical training, influences the predictive mechanisms in AV speech and music perception. Moreover, regardless of previous musical experience, the AV interaction models applied in previous research are not interchangeable.
Publisher
NTNU
Series
Doctoral theses at NTNU, 2021:69

DSpace software copyright © 2002-2019 DuraSpace
Provided by Unit
 

 
