Show simple item record

dc.contributor.author       Brandtsegg, Øyvind
dc.contributor.author       Tidemann, Axel
dc.date.accessioned         2021-02-09T08:44:46Z
dc.date.available           2021-02-09T08:44:46Z
dc.date.created             2020-12-08T14:46:38Z
dc.date.issued              2020
dc.identifier.issn          2663-9041
dc.identifier.uri           https://hdl.handle.net/11250/2726763
dc.description.abstract     The development of musical interfaces has moved from static to malleable, where the interaction mode can be designed by the user. However, the user still has to specify which input parameters to adjust and, inherently, how they affect the generated sound. We propose a novel way to learn mappings from movements to sound generation parameters, based on inherent features in the control inputs. The underlying assumption is that any correlation between input features and output characteristics indicates a meaningful mapping. The goal is to make the user interface evolve with the user, creating a unique, tailor-made interaction mode with the instrument.  en_US
dc.language.iso             eng  en_US
dc.publisher                Zenodo  en_US
dc.relation.uri             10.5281/zenodo.3932892
dc.rights                   Navngivelse 4.0 Internasjonal (Attribution 4.0 International)
dc.rights.uri               http://creativecommons.org/licenses/by/4.0/deed.no
dc.title                    Shape: an adaptive musical interface that optimizes the correlation between gesture and sound  en_US
dc.type                     Peer reviewed  en_US
dc.type                     Journal article  en_US
dc.description.version      publishedVersion  en_US
dc.source.journal           Proceedings of the International Conference on Live Interfaces (Proceedings of ICLI)  en_US
dc.identifier.doi           10.5281/zenodo.3928017
dc.identifier.cristin       1857541
dc.description.localcode    Licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0). © 2019 Copyright held by the owner/author(s).  en_US
cristin.ispublished         true
cristin.fulltext            original
cristin.qualitycode         1
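
The abstract above describes selecting gesture-to-sound mappings by correlating input features with output characteristics. The following is a minimal, hypothetical sketch of that idea, not the implementation from the paper: it pairs each sound parameter with the gesture feature whose absolute Pearson correlation is highest. The function name, array layout, and feature/parameter shapes are all assumptions for illustration.

    import numpy as np

    def select_mappings(gesture_feats, sound_params):
        # gesture_feats: (T, F) array, F gesture features sampled over T frames
        # sound_params:  (T, P) array, P sound-generation parameters over the same frames
        # Returns, for each sound parameter, the index of the gesture feature
        # with the strongest absolute Pearson correlation, plus that correlation.
        mappings = []
        for p in range(sound_params.shape[1]):
            corrs = np.array([
                np.corrcoef(gesture_feats[:, f], sound_params[:, p])[0, 1]
                for f in range(gesture_feats.shape[1])
            ])
            best = int(np.nanargmax(np.abs(corrs)))  # skip NaNs from constant signals
            mappings.append((best, float(corrs[best])))
        return mappings

Under these assumptions, features extracted from recorded gestures and the synthesis parameters logged while playing could be passed to such a routine, and the returned pairs would seed the adaptive mapping that then evolves with the user.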

