LIPPtv: Glowing With Fire with OCTONOMY x nic.cab

Here is the full video documentation of Glowing With Fire, an audio-visual, projection-mapped performance created in the time of COVID-19.

logo2.PNG

This performance was created for LIPP.tv, the final performance project of the Live Image Processing and Performance (LIPP) class at NYU’s ITP. LIPPtv is a creative response to how code, video, networks, and art can be combined to create a new kind of live performance experience. Each student created their own short TV show influenced by video art, experimental animation, public-access TV, and more. The entire event was created and performed remotely, with students producing the website, commercials, music, and animations.

The original stream took place on Twitch on May 11th, 2020.
Please watch the full recorded program here: https://lipp.tv/


Development

For LIPPtv, I am collaborating with my roommate Heidi Lorenz, also known by her project name OCTONOMY. She will provide the audio, and I will use a combination of Max and madMapper to projection-map our backyard. This pre-recorded performance will also be shown with Cultivated Sound, a hybrid label and collective.

Here is a skeletal version of the track, which she is still working on. We discussed adding a water element: she will pour water into a bowl fitted with contact mics and distort the sound.

The first samples were very different from this.

Here is my are.na link for some inspiration / references.

Before hearing the track take its new direction, I thought I could incorporate some experiments I had been working on for Light & Interactivity: my gesture-controlled DMX moving-head lights, which also have a particle-system visualization in TouchDesigner.

I knew that securing an expensive short-throw projector that doesn’t belong to me (thanks ER) and cantilevering it off the edge of a fire escape would be a huge source of anxiety…

IMG_9821.JPG
IMG_9822.JPG

After all this, the angle was still not ideal. Back to fabricating a new attachment and remapping the layout…

References

Gesture Controlled DMX Moving Head Lights

When I registered for the Light and Interactivity course, one of my goals was to learn more about DMX lighting and how to control it. The final project was the perfect opportunity to create a DMX controller responsive to hand gestures. For this, I used TouchDesigner to parse body-tracking data from the Microsoft Kinect, and madMapper to receive OSC messages that set the channel values controlling the moving-head lights.

The video shown below is the first prototype: the hand's X coordinate controlled the Pan channel, and the Y coordinate controlled the Tilt channel.
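As a rough sketch (my own illustration with assumed function names, not the actual TouchDesigner network), the prototype's mapping boils down to scaling normalized 0..1 hand coordinates into 8-bit DMX channel values:

```python
def to_dmx(norm):
    """Clamp a normalized 0..1 value and scale it to a DMX channel's 0-255 range."""
    clamped = max(0.0, min(1.0, norm))
    return round(clamped * 255)

def pan_tilt_from_hand(hand_x, hand_y):
    """Map a tracked hand position to (Pan, Tilt) channel values:
    X drives Pan, Y drives Tilt."""
    return to_dmx(hand_x), to_dmx(hand_y)

# A hand at the center of the frame points the fixture mid-range.
print(pan_tilt_from_hand(0.5, 0.5))
```

Clamping matters because the Kinect can report joints slightly outside the tracked frame, which would otherwise overflow the 0-255 channel range.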

Bill of Materials

  • Pair of UKING 80 Watt Double Beam Moving Head Lights, $128.78 on eBay

  • Kinect Azure, borrowed from ITP Equipment Room

  • ShowJockey USB -> DMX Adapter, gifted by CHiKA; can be purchased from garageCube

  • ZOTAC ZBox QK7P3000, my NUC PC

  • Rockville RDX3M25 25 Foot 3 Pin DMX Lighting Cable, 2 pack for $10.95 on eBay

Software

  • TouchDesigner 2020.22080

  • madMapper 3.7.4

  • OSC Data Monitor (Processing Sketch), for troubleshooting

System Flow

PC -> Kinect Azure -> TouchDesigner -> madMapper -> DMX moving heads

TouchDesigner Setup

madMapper Setup

Screen Shot 2020-04-28 at 4.06.12 AM.png
Screen Shot 2020-04-28 at 1.45.26 AM.jpg



Roadblocks

ShowJockey USB DMX
TouchDesigner is a powerful tool and includes its own DMX Out CHOP, but Derivative built the TD environment with ENTTEC hardware in mind. Tom put together a workaround for DMXKing’s eDMX1 Pro using DMX Out via sACN, which would send messages to QLC+ for controlling the lights. The eDMX1 Pro uses an FTDI driver, which the QLC+ software can recognize.

I had difficulty finding the specification sheet for the ShowJockey SJ-DMX-U1 device, so I could not tell which driver it needed. I blindly installed the FTDI driver to see if the ShowJockey would then show up, but that did not work. Per Tom’s advice, I checked which serial devices my Mac recognized, using the Terminal command “ls /dev/cu.*”. The ShowJockey did not appear.

Screen Shot 2020-05-03 at 12.54.14 PM.png
Screen Shot 2020-05-03 at 12.51.35 PM.png
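The same check can be scripted. Below is a small sketch (plain Python, my own illustration rather than part of the project) that snapshots the /dev/cu.* device nodes before and after plugging in an adapter, so any new node must belong to the adapter:

```python
import glob

def serial_devices(pattern="/dev/cu.*"):
    """Return the set of serial device nodes matching the macOS naming pattern."""
    return set(glob.glob(pattern))

def new_devices(before, after):
    """Device nodes that appeared between two snapshots."""
    return after - before

# Usage: snapshot, plug in the USB-DMX adapter, snapshot again, and diff.
before = serial_devices()
# ... plug in the adapter here ...
after = serial_devices()
print(sorted(new_devices(before, after)))
```

If the diff stays empty after plugging in, the OS never enumerated the device as a serial port at all, which is exactly what the "NO DRIVER!" ShowJockey hardware turned out to mean.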

When CHiKA gifted me the ShowJockey, we used it only with madMapper, so I knew the device was functional in that environment. I assumed that this product on the garageCube site is what I have, and its description reads, “This ‘NO DRIVER!’ DMX controller is plug & play and ready to work with madMapper and Modul8 (only).” For this reason, I decided to use TouchDesigner simply to send OSC data to madMapper for the channel value changes.

OSC Connection
When establishing the link between TouchDesigner and madMapper, I knew that OSC would be straightforward: it’s a matter of matching network ports, setting the correct network/local addresses, using the appropriate protocol, and making sure the OSC message being sent is in a format the receiving software can recognize. When I did not see any changes to the channel values within MM, I used the OSC Data Monitor to confirm that I was indeed sending an OSC message from TD. Sure enough, the message was appropriately formatted.

TD-OSC_troubleshoot1.PNG

I followed a few tutorials (see references), but none of them mentioned one crucial detail. Tom pointed out: “You'll need to use the address 127.0.0.1 if you're trying to communicate between two programs on the same machine.” Duh. Thanks Tom!
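For the curious, a correctly formatted OSC message is simple enough to build by hand. The sketch below (stdlib Python, my own illustration; the /pan address and port 8010 are assumptions, since madMapper's actual OSC address and input port vary per setup) encodes a single-int32 message per the OSC 1.0 spec and sends it over UDP to the loopback address, as both programs live on the same machine:

```python
import socket
import struct

def osc_pad(raw: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC 1.0 requires."""
    return raw + b"\x00" * (4 - len(raw) % 4)

def osc_message(address: str, value: int) -> bytes:
    """Encode an OSC message carrying one int32 argument (typetag ',i')."""
    return osc_pad(address.encode()) + osc_pad(b",i") + struct.pack(">i", value)

# Loopback, because sender and receiver run on the same machine.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/pan", 128), ("127.0.0.1", 8010))
```

Every chunk of the packet is padded to four bytes, which is why tools like the OSC Data Monitor can parse it unambiguously even over lossy UDP.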

Notes

I picked the UKING 80W Double Beam moving heads because Louise had mentioned in class that UKING had decent reviews. For this project, I favored these lights for their basic functionality and value; however, I was not pleased with the color-blending quality. Once my order arrived, I used my AKAI APC40 MIDI controller to change channel values within madMapper, just to confirm that the moving-head lights were received in working condition.