S.A.R.A.: synesthetic augmented reality application

dc.contributor.author: Benitez, Margarita
dc.contributor.author: Vogl, Markus
dc.date.accessioned: 2015-04-13T04:13:18Z
dc.date.available: 2015-04-13T04:13:18Z
dc.date.copyright: 2014
dc.date.issued: 2014
dc.description.abstract: S.A.R.A. (synesthetic augmented reality application) is an App exploring the potential of using a mobile device as a unique and wearable musical interface. S.A.R.A. was originally developed as a standalone App to translate the surrounding environment into sounds on mobile devices (iPhone and Android), creating a digitally augmented synesthetic experience: imagery captured via the mobile device's onboard camera is translated into synesthetic-inspired sounds. Our interest in developing this project stemmed from the desire to explore the following research questions. Can technology be used to create a synesthetic augmented reality? What sonochromatic sound mapping should be used? Should we allow for a variety of mapping choices? Should a visual element be used as well? Investigating these questions led us to the realization that the S.A.R.A. App and interface would be best explored in a performance setting, so we collaborated with a local dance troupe that agreed to use S.A.R.A. as part of its repertoire. The performance version of S.A.R.A. is a fully interactive App that generates both its own sounds and visuals based on the camera's video input and the movement of the device. The mobile device is complemented by a pico laser projector and mounted in a sleeve worn by each of the four dancers. S.A.R.A. becomes an extension of the dancer's arm and allows natural movement to occur. The role of the performers is also augmented: they are now gatekeepers of what sounds are made, as well as what images are projected, deciding which live imagery and angles look most appealing to rebroadcast. Performers can choose to project images on themselves, on their co-performers, or onto the architectural structures of the venue.
This format allows for a completely new interaction with wearable technology, augmenting and mediating the performance via several technological input and output mechanisms while still maintaining choreography and allowing for subjective choices during the performance. The performance setting raised additional questions. How wearable can these devices be made in their current configuration? What placement on the body does not impede movement but allows for maximum control of the App? What does it mean when one performer wears such a device? Multiple performers? Does wearing this device change the role or mechanism of the performer? Does the lighting need to be thought out differently for the stage and the performers? Should additional light be placed on the dancers if they cannot be lit by traditional methods? Can other dance troupes benefit from the technology? During various beta performances it became obvious that the light source needed to be on the performers' bodies rather than external to them. In response we created custom LED lighting so the camera could pick up imagery more effectively; the LEDs were integrated into a neck cowl, and the rest of the costume was designed in white to provide a surface to project onto. Although working within a set choreography, the performer's role changes as their body's interactions directly produce sounds. The human-computer interaction between the dancers and the technology, as an extension of their bodies, creates an altered/mediated/mitigated performance environment that is always unique to the specific venue.
S.A.R.A. is not only an interface and an interactive software application for consumption, play, discovery, and joy, but also a jumping-off point for a larger discussion of transformational strategies, both in regard to S.A.R.A. as a wearable musical/performance interface and in regard to the open-source distribution of S.A.R.A. as a tool. The technology will be released open source, making it possible to craft new versions for every performance or for other dance troupes to adapt the technology to their artistic vision. Because the App runs on an existing platform device such as an iPod touch and uses a relatively inexpensive laser pico projector (less than $500), S.A.R.A. can be added relatively simply and cheaply as a versatile tool to a troupe's performance-technology toolkit. The artwork we created therefore provides a new tool set for other artists.
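The abstract leaves the sonochromatic mapping as an open research question, so the following is only a minimal sketch of one plausible approach, assuming a hue-to-pitch mapping over averaged camera pixels; the function names, the base frequency, and the octave range are illustrative assumptions, not S.A.R.A.'s actual implementation:

```python
import colorsys

def hue_to_frequency(r, g, b, base_hz=220.0, octaves=2.0):
    """Map one RGB pixel to a pitch: hue (0..1) selects a point in a
    frequency range spanning `octaves` above `base_hz`."""
    hue, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return base_hz * (2.0 ** (hue * octaves))

def frame_to_frequency(pixels):
    """Reduce a (downsampled) camera frame, given as a list of (r, g, b)
    tuples, to a single tone by averaging per-pixel pitches."""
    freqs = [hue_to_frequency(r, g, b) for (r, g, b) in pixels]
    return sum(freqs) / len(freqs)
```

Under this sketch, a pure red frame (hue 0) would sound the base pitch of 220 Hz, while a pure green frame (hue 1/3) would sound roughly a major sixth higher; any real mapping (and whether it runs per-pixel, per-region, or per-frame) would be a design choice of the kind the authors' research questions raise.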
dc.identifier.citation: Shapeshifting: A Conference on Transformative Paradigms of Fashion and Textile Design, 14-16 April 2014, Auckland, New Zealand
dc.identifier.isbn: 978-1-927184-27-1
dc.identifier.uri: https://hdl.handle.net/10292/8576
dc.publisher: Textile and Design Lab and Colab at Auckland University of Technology
dc.subject: Media interfaces
dc.subject: Digital experiences
dc.subject: Wearables
dc.subject: Transformational strategies
dc.subject: Synesthetic mobile app
dc.subject: Sonification app
dc.subject: Transformative interfaces
dc.subject: Open source artistic tools
dc.subject: Interactive dance and technology
dc.subject: Wearable music interfaces
dc.title: S.A.R.A.: synesthetic augmented reality application
dc.type: Conference Paper
Files
Original bundle
Name: SS20140Submission_03.pdf
Size: 2.52 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission