TY - GEN
T1 - Full-Body Interaction in a Remote Context
T2 - 10th International Conference on Digital and Interactive Arts
AU - Masu, Raul
AU - Pajala-Assefa, Hanna
AU - Correia, Nuno N.
AU - Romão, Teresa
N1 - Funding Information:
info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F50009%2F2020/PT#
info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UID%2FCEC%2F04516%2F2013/PT#
We would like to acknowledge Stephan Jürgens for recording the video material. The first author acknowledges ARDITI - Agência Regional para o Desenvolvimento e Tecnologia under the scope of the Project M1420-09-5369-FSE-000002 - PhD Studentship. This work is co-funded by 597398-CREA-1-2018-1-PT-CULT-COOP1 - Moving Digits: Augmented Dance for Engaged Audience.
Publisher Copyright:
© 2021 Copyright held by the owner/author(s). Publication rights licensed to ACM.
PY - 2021/10/13
Y1 - 2021/10/13
N2 - This paper describes the process of adapting a dance piece to an interactive installation. To cope with COVID-19 restrictions, the installation runs in a browser, allowing for remote access, and is based on camera tracking of movement. We describe the adaptation of the dramaturgy, with a primary focus on translating the interaction design aspects from the performance to the browser-based installation. We invited five users to test our working prototype, and we report on their feedback. This paper offers insights on how to adapt a dance piece to an interactive browser-based installation. We also highlight the benefits of using machine learning for motion capture in this context. Finally, we identify the potential of using interaction design with sound for embodied perception.
AB - This paper describes the process of adapting a dance piece to an interactive installation. To cope with COVID-19 restrictions, the installation runs in a browser, allowing for remote access, and is based on camera tracking of movement. We describe the adaptation of the dramaturgy, with a primary focus on translating the interaction design aspects from the performance to the browser-based installation. We invited five users to test our working prototype, and we report on their feedback. This paper offers insights on how to adapt a dance piece to an interactive browser-based installation. We also highlight the benefits of using machine learning for motion capture in this context. Finally, we identify the potential of using interaction design with sound for embodied perception.
KW - AudioVisuals
KW - Full-body Interaction Design
KW - Web Installation
UR - http://www.scopus.com/inward/record.url?scp=85125645597&partnerID=8YFLogxK
U2 - 10.1145/3483529.3483747
DO - 10.1145/3483529.3483747
M3 - Conference contribution
AN - SCOPUS:85125645597
T3 - ACM International Conference Proceeding Series
BT - ARTECH 2021 - Proceedings of the 10th International Conference on Digital and Interactive Arts: Hybrid Praxis - Art, Sustainability and Technology
A2 - Lopes, Maria Manuela
A2 - Bastos, Paulo Bernardino
A2 - Araújo, António
A2 - Olivero, Lucas Fabian
A2 - Fernandes-Marcos, Adérito
PB - ACM - Association for Computing Machinery
CY - New York
Y2 - 13 October 2021 through 15 October 2021
ER -