Evaluation of a multimodal video annotator for contemporary dance

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

24 Citations (Scopus)

Abstract

This paper discusses the evaluation of a video annotator that supports multimodal annotation and is applied to contemporary dance as a creation tool. The Creation-Tool was conceived and designed to assist the creative processes of choreographers and dance performers, functioning as a digital notebook for personal annotations. The prototype, developed for Tablet PCs, allows video annotation in real time, on a live video stream, or post-event, on a pre-recorded video stream. The tool supports several annotation modalities, such as annotation marks, text, audio, ink strokes and hyperlinks. In addition, the system enables different modes of annotation and video visualization. Development followed an iterative design process involving two choreographers, and a usability study was carried out with international dance performers taking part in a contemporary dance "residence-workshop".
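As a rough illustration of the annotation model the abstract describes (a sketch only, not the Creation-Tool's actual implementation or API), each note can be thought of as a timestamped record tagged with one of the listed modalities and with whether it was captured live or post-event; all class and field names below are hypothetical.

    # Hypothetical sketch of a multimodal video-annotation record, based only on
    # the modalities and capture modes mentioned in the abstract.
    from dataclasses import dataclass, field
    from enum import Enum
    from typing import List


    class Modality(Enum):
        MARK = "annotation mark"
        TEXT = "text"
        AUDIO = "audio"
        INK = "ink stroke"
        HYPERLINK = "hyperlink"


    @dataclass
    class Annotation:
        video_time_s: float        # position in the video the note refers to
        modality: Modality         # which of the supported annotation types this is
        payload: str               # e.g. note text, audio file path, stroke data, URL
        live_capture: bool = True  # True if made on a live stream, False if post-event


    @dataclass
    class AnnotationSession:
        video_source: str          # live camera id or pre-recorded video file
        annotations: List[Annotation] = field(default_factory=list)

        def add(self, annotation: Annotation) -> None:
            self.annotations.append(annotation)


    # Example: mark a moment during a live rehearsal, then add a text note post-event.
    session = AnnotationSession(video_source="rehearsal.mp4")
    session.add(Annotation(12.5, Modality.MARK, "lift entrance", live_capture=True))
    session.add(Annotation(12.5, Modality.TEXT, "delay entrance by one count", live_capture=False))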

Original language: English
Title of host publication: Proceedings of the Working Conference on Advanced Visual Interfaces, AVI 2012
Editors: G. Tortora, S. Levialdi, M. Tucci
Place of Publication: New York
Publisher: ACM - Association for Computing Machinery
Pages: 572-579
Number of pages: 8
ISBN (Print): 9781450312875
DOIs
Publication status: Published - 12 Jul 2012
Event: 2012 International Working Conference on Advanced Visual Interfaces, AVI 2012 - Capri Island, Italy
Duration: 21 May 2012 – 25 May 2012

Conference

Conference: 2012 International Working Conference on Advanced Visual Interfaces, AVI 2012
Country/Territory: Italy
City: Capri Island
Period: 21/05/12 – 25/05/12

Keywords

  • contemporary dance
  • multimodal video annotations
  • performing arts
  • real-time video annotations
