dc.contributor.advisor: Behne, Dawn
dc.contributor.advisor: Waadeland, Carl Haakon
dc.contributor.author: Sorati, Marzie
dc.date.accessioned: 2021-04-27T13:18:19Z
dc.date.available: 2021-04-27T13:18:19Z
dc.date.issued: 2021
dc.identifier.isbn: 978-82-326-6644-7
dc.identifier.issn: 2703-8084
dc.identifier.uri: https://hdl.handle.net/11250/2739961
dc.description.abstract: This dissertation investigated audiovisual (AV) perception of speech and music when visual information starts before the auditory onset and provides a prediction about an upcoming corresponding sound. Previous electroencephalography (EEG) research has shown that this visually driven prediction can modulate early processing of the auditory signal in the brain, leading to suppressed and earlier event-related potentials (ERPs), such as the N1 and P2, in AV compared to auditory-only perception. However, the influence of previous experience on this prediction in AV perception has received little attention. To explore this influence, the current project examined musical experience by comparing N1 and P2 amplitudes and latencies for musicians and non-musicians. In addition, the project extends previous research by investigating the predictive effect of visual cues in AV perception using a time-frequency approach, inter-trial phase coherence (ITPC), in the delta, theta, alpha, and beta oscillations. ERP suppression and reduced latency resulting from predictive visual cues in AV perception were evaluated for four previously developed AV models. Musical experience influences AV speech and music perception. In AV speech perception, seeing facial articulation, which precedes and predicts the auditory speech being produced, led to reduced ERPs and ITPCs for both musicians and non-musicians compared to auditory-only speech. However, only musicians showed a reduced N1 and suppression of alpha oscillation in AV speech. In AV music perception, seeing finger and hand movements, which precede and predict the auditory music being produced, led to reduced ERPs and ITPCs for both groups compared to auditory-only music. However, only musicians showed reduced beta oscillation in AV music perception. These results indicate that early sensory processing in AV perception can be modified by musical experience. Furthermore, differences calculated under the four AV models led to different patterns of results for N1 and P2, indicating that these models are not comparable. Collectively, these results indicate that previous AV experience, such as that attained through musical training, influences the predictive mechanisms in AV speech and music perception. Moreover, regardless of previous musical experience, the AV interaction models applied in previous research are not interchangeable. (en_US)
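For readers unfamiliar with the measure, inter-trial phase coherence is conventionally defined in the time-frequency EEG literature as the magnitude of the mean of unit-length phase vectors across trials; the standard definition is given below for reference and is not quoted from the thesis itself:

\[
\mathrm{ITPC}(f, t) = \left| \frac{1}{N} \sum_{k=1}^{N} e^{\,i \varphi_k(f, t)} \right|
\]

where \(N\) is the number of trials and \(\varphi_k(f, t)\) is the instantaneous phase of trial \(k\) at frequency \(f\) and time \(t\). ITPC ranges from 0 (random phase across trials) to 1 (perfect phase alignment), so reduced ITPC in the delta, theta, alpha, or beta band reflects less consistent phase locking to the stimulus.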
dc.language.iso: eng (en_US)
dc.publisher: NTNU (en_US)
dc.relation.ispartofseries: Doctoral theses at NTNU, 2021:69
dc.title: Musical Experience Modulates Audiovisual Perception – Evidence from event-related potentials and inter-trial phase coherence (en_US)
dc.type: Doctoral thesis (en_US)
dc.subject.nsi: VDP::Social science: 200::Psychology: 260 (en_US)
dc.description.localcode: Fulltext is not available (en_US)

