After a good six weeks of audiovisual work on La Fura dels Baus' Parsifal production, I am very happy to have celebrated a successful premiere last Friday. Here is an overview of the various press reviews:
Ein “Parsifal” mit lebendigem Bühnenbild, Tagesthemen, 30.3.2013
Traumhafte Phantasiewelt, WDR Lokalzeit, 30.3.2013
Parsifal, 3Sat Kulturzeit, 2.4.2013
Padrissa macht aus Menschen Deko, Kölner Stadt-Anzeiger, 2.4.2013
Magische Bilderwelt beim Parsifal, Kölnische Rundschau, 2.4.2013
Vernetzter Erlöser, klassik.com, 4.4.2013
Der reine Tor im Bühnenfeuerwerk, WAZ, 3.4.2013
Die Kölner Mammut-Oper, Express, 26.3.2013
Zwischen frommer Naivität und dekorativer Beliebigkeit, Kultur heute, Deutschlandfunk, 30.3.2013
Kölner „Parsifal“-Inszenierung bietet viele Effekte und wenig Erhellendes, Westfälische Nachrichten, 2.4.2013
Wenn Statisten zu Paparazzi werden, Bergische Landeszeitung, 2.4.2013
Parsifal und die Jedi-Ritter, koeln.de, 30.3.2013
Wir alle sind Parsifal, General Anzeiger Bonn, 26.3.2013
filed under: köln/cologne
la fura dels baus
opera
projection
videomapping
welovecode
Hi everybody, I am very happy to announce that I am currently working with my friends at Welovecode on an opera production by La Fura dels Baus' Carlus Padrissa here in Cologne!
We are doing the audiovisual and interactive projections for Richard Wagner's Parsifal, which will premiere on March 29th, 2013 at Oper Köln.
So if you are in Cologne and won’t make it to the premiere: grab some of the few remaining tickets for one of the five performances during April and enjoy the spectacle!
filed under: 3d
barcelona
köln/cologne
la fura dels baus
motion design
opera
videomapping
welovecode
For several weeks now I have been exploring the possibilities of the Kinect, and I am quite excited about the broad range of new possibilities it opens up for the future!
In this post I want to share my first experiences; I hope they help somebody!
First of all, I want to express a big thank you to all of the OpenNI / OpenKinect / PrimeSense developers who have made it relatively easy for a non-developer like me to get started with the Kinect - thank you! Also to my friends Roman and Pelayo of Welovecode for pointing me in the right direction to get things going.
I am an After Effects focused Motion Designer and I enjoy working with Trapcode Particular, Plexus and Form. That means I think and work mainly in 3D, with cameras and artefacts moving around in space. On the other hand I also like cinematography and recording moving images to combine them with motion graphics. What excites me most about the Kinect is the ability to record 3D data and to process that data further in After Effects.
Since I had learned some Processing before and I also write a lot of expressions in After Effects, it was not too difficult to understand how OpenNI and OpenKinect work. Daniel Shiffman has a great introductory article to get started with the Kinect.
So I wanted to record the depth data that comes out of the Kinect, and I hacked together this little sketch that works with the SimpleOpenNI wrapper:
import SimpleOpenNI.*;

SimpleOpenNI kinect;
boolean record = false;

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
}

void draw() {
  kinect.update();
  // show the live depth image
  image(kinect.depthImage(), 0, 0);

  if (record) {
    // save the current frame into the "frames" folder inside the sketch folder
    saveFrame("frames/depthmap-####.jpg");
    text("Recording frame " + frameCount, 10, 15);
  }
}

void keyPressed() {
  if (key == 'r') {
    // start recording and reset the frame counter
    record = true;
    frameCount = 0;
  } else if (key == 's') {
    // stop recording
    record = false;
  }
}
With 'r' you start the recording and with 's' you stop it. The image sequence is saved into a folder called 'frames' inside the sketch folder. Very basic, but it worked for me as a first step.
Then I used the resulting image sequence in After Effects as a luma map to drive Trapcode Form's z-extrusion and to be able to use custom particles etc. That turned out to work quite well. What annoyed me, though, was the amount of glitchy data caused by the suboptimal recording situation in my office.
I found an example sketch by Elie Zananiri in the openkinect library that sets a dynamic threshold to record only a part of the depth information and filter out the rest.
I did not manage to output the depth image together with the resulting threshold matte, but I figured it would be easier for a non-programmer to simply export two separate image sequences: one with the depth data and one with a "rough alpha" channel.
import org.openkinect.*;
import org.openkinect.processing.*;

Kinect kinect;

int kWidth = 640;
int kHeight = 480;
int kAngle = 15;

PImage depthImg;
int minDepth = 60;
int maxDepth = 860;

boolean record = false;

void setup() {
  size(kWidth, kHeight);
  kinect = new Kinect(this);
  kinect.start();
  kinect.enableDepth(true);
  kinect.tilt(kAngle);
  depthImg = new PImage(kWidth, kHeight);
}

void draw() {
  // draw the raw depth image and save it as the depth sequence
  image(kinect.getDepthImage(), 0, 0);
  if (record) {
    saveFrame("frames/depthmap-####.jpg");
    println("Recording frame " + frameCount);
  }

  // threshold the depth image: white inside the depth range, black outside
  int[] rawDepth = kinect.getRawDepth();
  for (int i = 0; i < kWidth * kHeight; i++) {
    if (rawDepth[i] >= minDepth && rawDepth[i] <= maxDepth) {
      depthImg.pixels[i] = 0xFFFFFFFF;
    } else {
      depthImg.pixels[i] = 0;
    }
  }

  // draw the thresholded image over it and save it as the "rough alpha" sequence
  depthImg.updatePixels();
  image(depthImg, 0, 0);
  if (record) {
    saveFrame("frames/alpha-####.jpg");
    println("Recording frame " + frameCount);
  }

  fill(0);
  println("TILT: " + kAngle);
  println("THRESHOLD: [" + minDepth + ", " + maxDepth + "]");
}

void keyPressed() {
  // UP / DOWN arrows tilt the Kinect
  if (key == CODED) {
    if (keyCode == UP) {
      kAngle++;
    } else if (keyCode == DOWN) {
      kAngle--;
    }
    kAngle = constrain(kAngle, 0, 30);
    kinect.tilt(kAngle);
  }
  // 'q' / 'w' raise and lower the near threshold
  else if (key == 'q') {
    minDepth = constrain(minDepth + 10, 0, maxDepth);
  } else if (key == 'w') {
    minDepth = constrain(minDepth - 10, 0, maxDepth);
  }
  // 'z' / 'x' raise and lower the far threshold
  else if (key == 'z') {
    maxDepth = constrain(maxDepth + 10, minDepth, 2047);
  } else if (key == 'x') {
    maxDepth = constrain(maxDepth - 10, minDepth, 2047);
  }
  // 'r' starts recording, 's' stops it
  else if (key == 'r') {
    record = true;
    frameCount = 0;
  } else if (key == 's') {
    record = false;
  }
}

void stop() {
  kinect.quit();
  super.stop();
}
Then I played around a little with camera perspectives and particles, and this is the resulting animation:
The limitation I see here is that this approach only outputs 256 grayscale levels of depth, while the Kinect's raw depth data spans 2048 steps (0-2047). In part two I will show you how I managed to access the full point cloud that comes out of the Kinect and use it further in the 3D program Cinema 4D.
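Until part two, one possible workaround to keep the full depth range would be to dump the raw depth values to plain-text files instead of 8-bit images: one value per pixel, one file per frame. The following is only a rough, untested sketch of that idea, reusing the same openkinect calls as the sketch above; the "raw" folder and the per-frame text format are just my own conventions, not part of the workflow described here.

import org.openkinect.*;
import org.openkinect.processing.*;

Kinect kinect;
boolean record = false;

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.start();
  kinect.enableDepth(true);
}

void draw() {
  // show the 8-bit preview so you can see what you are capturing
  image(kinect.getDepthImage(), 0, 0);

  if (record) {
    // one raw 11-bit value (0-2047) per pixel, 640*480 values per frame
    int[] rawDepth = kinect.getRawDepth();
    String[] lines = new String[rawDepth.length];
    for (int i = 0; i < rawDepth.length; i++) {
      lines[i] = str(rawDepth[i]);
    }
    // writes into the sketch folder, e.g. raw/depth-0001.txt
    saveStrings("raw/depth-" + nf(frameCount, 4) + ".txt", lines);
  }
}

void keyPressed() {
  if (key == 'r') {
    record = true;
    frameCount = 0;
  } else if (key == 's') {
    record = false;
  }
}

void stop() {
  kinect.quit();
  super.stop();
}

These text files get big quickly (640 x 480 values each), so this is only practical for short takes, but it preserves the full 11-bit range for later processing.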
filed under: after effects
exploration
kinect
lab
software
I am on my way to Moers, where the Shiny Toys *Space festival is taking place. The festival is a gathering of international VJs and visual artists who present their experimental audiovisual approaches and give workshops and lectures.
On Saturday I will present a two-minute videomapping onto the main building of the Moers train station; I am looking forward to this show! I will probably also show some of my Kinect and videomapping explorations.
I found some time to update my website, and I am happy to present six new clips, animations, and Kinect explorations that I created this summer.
So check out:
#40 The Sky (lab)
Techmeck (Blinkenlichten/ZDF)
My Brother Recorded On Kinect (also Kinect)
My Kinected Hand (Kinect)
GMO Freshness (lab)
We Are (lab)
filed under: 3d
after effects
animation
kinect
köln/cologne
lab
me
motion design
video
Last week I was invited to give a presentation at the Cologne-based agency Denkwerk. In their kreatifrühstück format they ask internal and external creatives to give insights into their processes.
I talked about my recent explorations of using the Kinect to record 3D depth data, and also about general insights into my creative workflow.
You can read the blog post about the talk, Kreatives Neuland betreten – und immer weiter gehen (German); it summarizes my presentation very well!
This week I am taking part with an audiovisual projection loop in the mini-festival Der Kalk vor lauter Bäumen. Janina Warnk will present a curious urban forest and Katharina Huber some of her illustrations.
The festival will take place from August 16th to 19th, daily from 5 to 10 pm, at Baustelle Kalk in Cologne.
Find more info on the Facebook event page.
There will be concerts on Thursday, a Future Shorts screening on Friday, and lectures and performances on Saturday.
filed under: festival
installation
köln/cologne
screening
video art
I am very excited to tell you that the videomapping for the TV3 show Com va la vida, on which I did a lot of design and animation work for my friends at Tigrelab last year, has won a Bronze Laus for its audiovisual design! Yessss!
filed under: award
barcelona
motion design
tv