S.A.R.A.: synesthetic augmented reality application
dc.contributor.author | Benitez, Margarita | en_NZ |
dc.contributor.author | Vogl, Markus | en_NZ |
dc.date.accessioned | 2015-04-13T04:13:18Z | |
dc.date.available | 2015-04-13T04:13:18Z | |
dc.date.copyright | 2014 | en_NZ |
dc.date.issued | 2014 | en_NZ |
dc.description.abstract | S.A.R.A. (synesthetic augmented reality application) is an App exploring the potential of using a mobile device as a unique and wearable musical interface. S.A.R.A. was originally developed as a standalone App that translates the surrounding environment into sounds on mobile devices (iPhone and Android), creating a digitally augmented synesthetic experience. Imagery captured via the mobile device’s onboard camera is translated into synesthetic-inspired sounds. Our interest in developing this project stemmed from the desire to explore the following research questions. Can technology be used to create a synesthetic augmented reality? What sonochromatic sound mapping should be used? Should we allow for a variety of mapping choices? Should a visual element be used as well? Investigating these questions led us to realize that the S.A.R.A. App and interface would be best explored in a performance setting, so we arranged to collaborate with a local dance troupe that agreed to use S.A.R.A. as part of its repertoire. The performance version of the S.A.R.A. App is fully interactive, generating both its own sounds and visuals from the camera video input and the movement of the device. The mobile device is complemented by a pico laser projector and mounted in a sleeve worn by each of the four dancers. S.A.R.A. becomes an extension of the dancer’s arm and allows natural movement to occur. The role of the performers is also augmented: they become gatekeepers of which sounds are made and which images are projected, deciding what live imagery and angles look most appealing to rebroadcast. Performers can choose to project images on themselves, their co-performers, or onto the architectural structures of the venue.
This format allows for a completely new interaction with wearable technology, augmenting and mediating the performance via several technological input and output mechanisms while still maintaining choreography and allowing for subjective choices during the performance. The performance setting raised additional questions. How wearable can these devices be made in their current configuration? What is the best placement on the body for these devices, one that does not impede movement but allows for maximum control of the App? What does it mean when one performer wears a device like this? Multiple performers? Does wearing this device change the role or mechanism of the performer? Does the lighting need to be thought out differently for the stage and the performers? Should additional light be placed on the dancers if they cannot be lit by traditional methods? Can other dance troupes benefit from the technology? During various beta performances it became obvious that the lighting source needed to be on the performers’ bodies rather than coming from an external source. In response we created custom LEDs to provide a light source so that the camera could pick up imagery more effectively. The LEDs were integrated into a neck cowl, and the rest of the costume is designed in white to provide a ready surface to project onto. Although the choreography is set, the performer’s role changes because their body’s interactions directly produce sounds. The human-computer interaction between the dancers and the technology as an extension of their bodies creates an altered/mediated/mitigated performance environment that is always unique to the specific performance venue. S.A.R.A. is not only an interface and an interactive software application for consumption, play, discovery and joy, but also a jumping-off point for a larger discussion of transformational strategies with regard both to S.A.R.A.
as a wearable musical/performance interface and to the open-source distribution of S.A.R.A. as a tool. The technology will be released as open source, making it possible to craft custom versions for every performance and for other dance troupes to adapt the technology to their artistic vision. Because the App runs on an existing platform device such as an iPod touch and uses a relatively inexpensive laser pico projector (less than $500), S.A.R.A. can be added relatively simply and cheaply as a versatile tool to a troupe’s performance technology toolkit. The artwork we created therefore provides a new toolset for other artists. | en_NZ |
dc.identifier.citation | Shapeshifting: A Conference on Transformative Paradigms of Fashion and Textile Design, 14-16 April 2014, Auckland, New Zealand | en_NZ |
dc.identifier.isbn | 978-1-927184-27-1 | en_NZ |
dc.identifier.uri | https://hdl.handle.net/10292/8576 | |
dc.publisher | Textile and Design Lab and Colab at Auckland University of Technology | en_NZ |
dc.subject | Media interfaces | en_NZ |
dc.subject | Digital experiences | en_NZ |
dc.subject | Wearables | en_NZ |
dc.subject | Transformational strategies | en_NZ |
dc.subject | Synesthetic mobile app | en_NZ |
dc.subject | Sonification app | en_NZ |
dc.subject | Transformative interfaces | en_NZ |
dc.subject | Open source artistic tools | en_NZ |
dc.subject | Interactive dance and technology | en_NZ |
dc.subject | Wearable music interfaces | en_NZ |
dc.title | S.A.R.A.: synesthetic augmented reality application | en_NZ |
dc.type | Conference Paper | en_NZ |
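The abstract describes translating camera imagery into "synesthetic-inspired sounds" via a sonochromatic sound mapping. The paper does not specify the mapping S.A.R.A. uses, so the following is only a minimal illustrative sketch, assuming a simple hue-to-frequency scheme in which a pixel's hue sweeps one octave above an arbitrary base pitch; all names and constants here are hypothetical, not drawn from the actual App.

```python
import colorsys

# Hypothetical sonochromatic mapping: hue (0..1) is mapped exponentially
# across one octave above a base pitch. This is an illustrative assumption,
# not the mapping S.A.R.A. actually implements.
BASE_HZ = 220.0  # A3, chosen arbitrarily for this sketch


def pixel_to_frequency(r, g, b):
    """Map an 8-bit RGB pixel to a frequency in Hz via its hue."""
    hue, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return BASE_HZ * (2.0 ** hue)  # one-octave sweep: 220 Hz .. 440 Hz


def frame_to_frequency(pixels):
    """Reduce a camera frame (iterable of RGB tuples) to a single pitch
    by averaging the per-pixel frequencies."""
    freqs = [pixel_to_frequency(r, g, b) for r, g, b in pixels]
    return sum(freqs) / len(freqs)


# Pure red has hue 0, so it lands on the base pitch.
print(round(pixel_to_frequency(255, 0, 0), 1))  # 220.0
```

A real implementation would also need to account for brightness and saturation (both discarded above) and to smooth the frequency over successive frames so the sound does not jitter with camera noise.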