[dance-tech] Interaktionslabor: 2nd week, sensor interface research report

  • From: "Johannes Birringer" <Johannes.Birringer@xxxxxxxxxxxx>
  • To: <dance-tech@xxxxxxxxxxxxx>
  • Date: Tue, 25 Jul 2006 16:52:31 +0100

interim report:

I n t e r a k t i o n s l a b o r 2006
Coal Mine Göttelborn [Germany]


Sensordance, interactive game, webcam dramaturgy.
__________________________________________

Extending previous studies of physical camera, sensor choreography, and 
interactive design for real-time networked performance, Interaktionslabor 
Göttelborn is now in the second week of its fourth annual workshop (July 17-31, 
2006) in the former coal mine in southwest Germany.

We are offering this brief report as a communiqué, and naturally, we would be 
interested in a discussion on this list, which traditionally tends to be more 
active in the summer time when people have some free time or are inclined to 
discuss workshops and new productions. 

The discussion we'd like to propose concerns our experiments with "sensor 
dance," "sensor choreography," and real-time interaction design for performance 
with digital 3D worlds.  We invite feedback. (e.g. testbed 4:  
http://interaktionslabor.de/lab06/gase.htm )

"Sensor choreography" is perhaps not the right term.
But I would be glad to define more closely, in debate, what I see happening in 
our workshop ---  as we moved from the general daily rhythm of warm-ups, 
body-mind centering/yoga-based exercises, and somatic techniques to movement 
choreography rehearsal with three performers (dancers/actors), and on to the 
current performance with sensors.  The three performers each rehearsed 
individually with a dramaturg, which is perhaps unusual as a method for many of 
you, but I think the connection in this production, and in this research, 
builds on the following areas, and dramaturgical decisions are necessary in 
all:  

1.  Theatre / performance  --  devising with digital media   (a dramaturgical 
script has been written by the team directors: this script is a complex piece 
of sketching actions on many levels: screen projection of the triptych/video 
actions, webcam actions in the remote sites, 4 characters [Player 1,  Player 2, 
 Real Avatar,  Screen Avatar], direction of local and distributed action;  
sound direction, text/voice-over direction,  scenographic direction, 
interactivity direction, precise lighting design). 

 2.  There is scenography, as there is a local venue where the distributed 
performance will first be performed (in Athens): stage design, screen design, 
lighting design.

3.  Interaction design:     

----level A:   digital video or 3D animation on three separate screens 
creating an immersive world (the screen architecture is 12 m x 3 m; the playing 
area for the Real Avatar / live dancer is 2 m wide and 12 m long).  Hundreds if 
not thousands of film scenes have been shot on location, in a special 
triple-camera setup, many tracking shots, now edited into small clusters. The 
digital film material amounts to roughly 1 terabyte. 

     The interaction patch is written by programmers in the open-source 
software Pure Data (PD).

---- sound design/music   (not interactive; it is created by the composer and 
performed in real-time synthesis)

4.  Sensor design

---- level A:    physical sensor design for various types of sensors  
(orientation, flex, tilt, and rotation sensors; switch/clicker; 
turn/resistor-switch sensor; heat sensor, photosensor)  --- 
which are all from the same family of accelerometers and sensors, looking 
slightly different, and providing different information depending on where 
they are attached to the body, rub against the flesh in the garment, or are 
moved and pressed as you would when you touch garment fabric or skin. The 
sensors we use in rehearsal come from I-CubeX (Wi-miniSystem) and La Kitchen 
(Kroonde Gamma, a wireless, high-speed, high-precision data-acquisition system 
dedicated to real-time applications/performance using embedded sensors). 
The transmitters send the motion data to the laptop computers, via wireless 
Bluetooth (I-CubeX) or OSC over UDP (Kroonde), and on the computing side the 
data is received into PD patches (we are also using Max/MSP patches). 
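For readers curious what actually arrives on the computing side: the Kroonde's 
OSC messages are ordinary UDP datagrams in the OSC binary format (padded 
address string, type-tag string, big-endian arguments). A minimal sketch in 
Python of decoding one such message by hand --- the address path "/kroonde/1" 
and the single float argument here are invented for illustration; in the 
actual patches PD's OSC objects do this decoding for us:

```python
import struct

def _read_padded_string(data, offset):
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    offset = end + 1
    offset += (-offset) % 4  # skip the padding bytes
    return s, offset

def parse_osc_message(data):
    """Parse a single OSC message into (address, [arguments])."""
    address, offset = _read_padded_string(data, 0)
    typetags, offset = _read_padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":               # 32-bit big-endian float
            (val,) = struct.unpack_from(">f", data, offset)
        elif tag == "i":             # 32-bit big-endian int
            (val,) = struct.unpack_from(">i", data, offset)
        else:
            raise ValueError("unhandled OSC type tag: " + tag)
        offset += 4
        args.append(val)
    return address, args

# Build a sample packet: hypothetical address "/kroonde/1", one float 0.5
packet = b"/kroonde/1\x00\x00" + b",f\x00\x00" + struct.pack(">f", 0.5)
print(parse_osc_message(packet))  # → ('/kroonde/1', [0.5])
```

In a live setting one would read such packets from a UDP socket bound to the 
Kroonde's target port and route them by address into the patch logic.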

--- level B:   programming of the interfaces (motion data/sensorial data 
control what we could call, citing Manovich, the "image instrument", i.e. the 
digital images and data, and thus the game world).
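At this level the core operation is a linear scaling of raw sensor readings 
into the parameter ranges of the image instrument. A small Python sketch of 
the idea --- the 10-bit sensor range and the playback-speed range are invented 
for illustration; in our production this mapping lives inside the PD/Max 
patches themselves:

```python
def map_range(value, in_lo, in_hi, out_lo, out_hi, clamp=True):
    # Linear mapping from one range to another, as a PD [scale]-style
    # abstraction would do; clamping keeps noisy readings in bounds.
    t = (value - in_lo) / (in_hi - in_lo)
    if clamp:
        t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

# e.g. a mid-scale tilt-sensor reading (0-1023, hypothetical range)
# driving video playback speed between 0.25x and 2.0x:
print(map_range(512, 0, 1023, 0.25, 2.0))  # roughly 1.13x playback speed
```

The same pattern scales any sensor stream onto any image-instrument parameter 
(opacity, scrub position, layer selection), which is what makes the mapping 
layer a dramaturgical decision rather than a purely technical one.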


5.  Sensor Choreography;  Movement Behaviors

To come back to my initial point, the movement vocabulary, involving our Real 
Avatar (dancer) and the remote Players (streamed into our triptych video 
projection scenography), was rehearsed and developed for the characters in the 
game world (which is largely movement and behaviors). In the first week our 
engineers and programmers were still working on the interaction design, and 
12 sensors were put together (an additional set of click switches was built 
and sewn into handmade "gloves") to be eventually worn by the real avatar 
dancer Ermira Goro.

Since yesterday, Ermira Goro has been dancing the scenes of the game world 
choreography ("Walhalla") wearing 12 sensors. That is to say, she is 
re-choreographing her movement, incorporating the sensors on her body, which 
allow her an immediate, direct relationship to the virtual worlds (digital 
film and animation) in front of which, and into which, she literally moves 
and navigates. 

As one would navigate an avatar in a game, the dancer in our production 
navigates her character through the digital world, while also responding to 
the programmed interactions she does not control, such as the appearances of 
the other players streamed in from the webcams in Amsterdam and Sofia.  

For the time being, we are using the Interaktionslabor as a "testbed"; during 
the following days we are putting together the various elements of the digital 
performance, and simulating the remote Players by having them perform here 
with us, but outside, in the Coal Mine, in public squares we have lit and 
which our Wi-Fi network connection for webcasting reaches. Interestingly, here 
is the telematic component of "organization": since the dramaturgy obviously 
has a sequence of scenes and cues, these cues have to be transmitted via chat 
lines and chat operators to the remote Players/Performers, who need to hear 
the cues and enact their actions within precise range of the webcams.  So for 
the remote sites, each performance requires a network director, a chat 
operator, and a stage manager or mediator. If the remote Players are out in 
public squares, the cues will be signaled to them via mobile phones. 

 More elements of the design, choreography, film, editing, sound, etc. can be 
brought into the discussion if this is desired; I stop here with this very 
brief summary of the new work.

The 2006 lab is dedicated to "i-MAP", a one-year collaborative project 
implemented through a trans-European network of four participating media art 
organizations (amorphy.org/Athens; InterSpace/Sofia; De Waag/Amsterdam; 
Interaktionslabor/Göttelborn). We seek to test the expressive and narrative 
possibilities of digital dispositives based on sensor design, live webcams, and 
augmented reality for game-based performance.

Examining the relationship of the human body and its real-time representations 
in digital environments, and building its materializations in the 
laboratory-space of Interaktionslabor Göttelborn through visual, gestural, 
voice and sound narrative, the distributed media casting creates a world of 
adventure and surprise. A streaming, telekinetic performance ("See you in 
Walhalla") is created which follows the logic of a computer game but 
encounters real people, streets and occurrences in parallel urban realities. 
Through a collaborative inter-media creation process, developed simultaneously 
by teams of artists in three different locations, a dramaturgical structure 
for a .........[sensordance game]  [live 3D computer game]  [interactive art 
work] [   ] [   ]     is created. This telekinetic performance event will be 
simultaneously presented in three European venues (Athens, Amsterdam and 
Sofia) on September 14. These venues will have fully interactive capabilities, 
allowing for live control of media in all three locations, creating a shared 
virtual environment, guided and utilized by the participating artists. The 
teams have come here for a "testbed" phase in Göttelborn, rehearsing the new 
work and completing the collaborative programming.

The research and development process in-action at the Interaktionslabor 
Göttelborn is open to the public and can be monitored via our webcams and the 
bulletins published on the website. Theoretical articles on the research are 
posted as well.



http://interaktionslabor.de
director, Johannes Birringer

Blvd der Industriekultur, 66287 Quierschied-Göttelborn (Saarland)
Tel. +49 (0) 6825-94277-19
A list with information on the participating artists is on the website.

i-MAP is supported by the Culture 2000 Framework Program, the Ministry of 
Culture of the Saarland, Goethe Institut, Red House For Culture and Debate, IKS 
IndustrieKultur Saar GmbH, IME Research Facility, Ipsilon Production Company, 
Brunel University, i-DAT University of Plymouth. Locally, Interaktionslabor is 
supported by DeepWeb GmbH, Willi Meiser Computertechnik Göttelborn, Future Tech 
Quierschied, intraNET Systemhaus GmbH Saarbrücken, and Knecht 
VeranstaltungsTechnik.


