Musical Experience Modulates Audiovisual Perception – Evidence from event-related potentials and inter-trial phase coherence

Sorati, Marzie
Doctoral thesis
Fulltext is not available (Locked)
URI
https://hdl.handle.net/11250/2739961
Date
2021
Collections
  • Institutt for psykologi [2002]
Abstract
This dissertation investigated audiovisual (AV) perception of speech and music in situations where visual information starts before the auditory onset and thus provides a prediction about the upcoming corresponding sound. Previous electroencephalography (EEG) research has shown that this visually driven prediction can modulate early processing of auditory signals, leading to suppressed and speeded-up early event-related potentials (ERPs), such as N1 and P2, in AV compared to auditory-only perception. However, the influence of previous experience on this prediction in AV perception has received little attention. To explore this influence, the current project examined musical experience by comparing N1 and P2 amplitudes and latencies between musicians and non-musicians. In addition, the project extends previous research by investigating the predictive effect of visual cues in AV perception with a time-frequency measure, inter-trial phase coherence (ITPC), in the delta, theta, alpha, and beta bands. ERP suppression and latency reduction resulting from predictive visual cues in AV perception were evaluated for four previously developed AV models.
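For readers unfamiliar with the measure, ITPC quantifies how consistently oscillatory phase aligns across trials at each time point: it is the magnitude of the mean unit phase vector over trials, ranging from 0 (random phase) to 1 (perfect phase locking). A minimal NumPy/SciPy sketch on simulated data — an illustration of the standard formula, not the dissertation's actual analysis pipeline:

```python
import numpy as np
from scipy.signal import hilbert

def itpc(trials):
    """Inter-trial phase coherence for an array of shape (n_trials, n_samples).

    Instantaneous phase is taken from the analytic signal (Hilbert transform);
    ITPC at each sample is |mean over trials of exp(i * phase)|.
    """
    phases = np.angle(hilbert(trials, axis=1))
    return np.abs(np.mean(np.exp(1j * phases), axis=0))

# Simulated example: 50 trials of a 10 Hz (alpha-band) signal, 1 s at 250 Hz.
rng = np.random.default_rng(0)
t = np.arange(0, 1, 1 / 250)
# Trials with a common phase plus small additive noise -> high ITPC.
aligned = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal((50, t.size))
# Trials with a random phase offset per trial -> low ITPC.
random_phase = np.sin(2 * np.pi * 10 * t + rng.uniform(0, 2 * np.pi, (50, 1)))

print(itpc(aligned).mean())       # near 1: phase-locked across trials
print(itpc(random_phase).mean())  # near 0: no phase locking
```

In an ERP context the same computation is applied per frequency band (e.g. after band-pass filtering into delta, theta, alpha, or beta) and per channel, so that reduced ITPC in an AV condition reflects less phase-locked early sensory activity.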

Musical experience influences AV speech and music perception. In AV speech perception, seeing facial articulation precedes and predicts the auditory speech being produced; compared with auditory-only speech, this led to reduced ERPs and ITPCs for both musicians and non-musicians. However, only musicians showed reduced N1 amplitude and suppressed alpha oscillations in AV speech. In AV music perception, seeing finger and hand movements precedes and predicts the auditory music being produced; compared with auditory-only music, this led to reduced ERPs and ITPCs for both groups. However, only musicians showed reduced beta oscillations in AV music perception. These results indicate that early sensory processing in AV perception can be modified by musical experience. Furthermore, the differences calculated for the four AV models produced different patterns of results for N1 and P2, indicating that these models are not comparable.

Collectively, these results indicate that previous AV experience, such as that attained through musical training, influences the predictive mechanisms in AV speech and music perception. Moreover, regardless of previous musical experience, the AV interaction models applied in previous research are not interchangeable.
Publisher
NTNU
Series
Doctoral theses at NTNU, 2021:69

Contact Us | Send Feedback

Privacy policy
DSpace software copyright © 2002-2019 DuraSpace

Service from Unit