Vortex

For my final Design for Digital Fabrication project, I created a prototype of a single bar of what I envision for Vortex.

In this prototype, audio-reactive LEDs driven by madMapper are embedded in an aluminum channel. The entire channel spins continuously on a NEMA23 motor that is mounted to a modular base and controlled by an Arduino.
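
The Arduino sketch itself isn't in this post, so as a rough stand-in (and to keep all the code sketches here in Python), this is the same continuous-rotation idea written for a Raspberry Pi with RPi.GPIO instead of an Arduino. It assumes a step/direction stepper driver sits between the board and the NEMA23; the pin numbers and speed are placeholders:

    # pip install RPi.GPIO  (runs on a Raspberry Pi)
    import time
    import RPi.GPIO as GPIO

    STEP_PIN = 20        # placeholder: driver STEP input
    DIR_PIN = 21         # placeholder: driver DIR input
    STEPS_PER_REV = 200  # typical 1.8 deg/step NEMA23, no microstepping
    RPM = 30             # placeholder spin speed

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(STEP_PIN, GPIO.OUT)
    GPIO.setup(DIR_PIN, GPIO.OUT)
    GPIO.output(DIR_PIN, GPIO.HIGH)  # pick a spin direction

    # Half of one step period: 60 s/min / (steps per minute) / 2
    delay = 60.0 / (RPM * STEPS_PER_REV) / 2

    try:
        while True:  # spin continuously until interrupted
            GPIO.output(STEP_PIN, GPIO.HIGH)
            time.sleep(delay)
            GPIO.output(STEP_PIN, GPIO.LOW)
            time.sleep(delay)
    except KeyboardInterrupt:
        GPIO.cleanup()

On the Arduino, the equivalent is a tight loop toggling the STEP pin with delayMicroseconds(), or a library like AccelStepper.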

Using Fusion360, I created a simulation of what five of these bars arranged in a pentagonal form would look like.

It was difficult to purchase perfectly mating parts for the mechanical connection between the LED channel and the motor. I purchased a 12mm shaft hub, a 1/2” ID slip ring, and a 1/4” coupler for the NEMA23. This called for a custom shaft, which Ben Light helped me make on the metal lathe. We started with 1” aluminum round stock and meticulously turned it down to the appropriate diameter for each component.

IMG_3212.jpg

Here is a quick time lapse of some manual fabrication necessary for the mechanical connections.

Here is the Fusion360 rendering of the project.

vortex.PNG

This project was inspired by collectifscale’s project Flux.

flux1.gif

Morphing My 3D Scan in Wrap3

Thanks Matt for another great tutorial!

I used Blender to resize and adjust my model before bringing it into Wrap3. When it was time to compute the wrapping from my model to the base mesh model, I was startled by the result: inner parts such as the mouthbag, ear sockets, eyeballs, and nostrils should not have been morphed. I added those polygroups to the selection and recomputed the wrap.

mouthbagew.gif
mouthbag.PNG

That sure fixed it. But then I realized that the female base mesh model did not include any eyes; instead, they showed up as gaping holes.

femalewrap.gif

As per Matt’s suggestion, I redid the tutorial using the male base mesh model, and got my eyes back. My fingers came out looking super creepy, but I know that can be fixed at a later stage.

malewrap.gif
finalwrap.PNG
malewrap.png

3D Scanning with itSeez3D

Although the 3D scanning process was pretty straightforward, I came across a few difficulties when using the itSeez3D app with the Structure Sensor + iPad.

Christina volunteered to help me with my very first scan, which ended up being the best of many attempts. I wasn’t quite ready, so my expression looks a bit caught off guard. I initially thought the lighting was uneven and that the left side of my face was more in shadow than the right. I’m happy that Christina was able to capture pretty much every detail - there aren’t too many holes in my texture!

1_good.gif

A little later that same day, I asked Dina to scan me so I could have an avatar with a neutral face. Even though we were in the same location with the same lighting as the first scan, the results were very inconsistent. Dina ran into alignment issues: portions that were already scanned would shift along with my body, and I now see that this shifting produced duplicates of some body parts. These scans also had many more artifacts and incomplete areas. We restarted the iPad, made sure everything was charged, adjusted all the lighting in the room, and even tried bust mode to see if it behaved differently - all to no avail.

2_badlegs.gif
3_badarmsreduce.gif
4_mehreduce.gif
5_bust.gif

I decided to import the first scan into Mixamo, and finally was able to live vicariously through my avatar - I’ve been missing nightlife so badly, I really just want to dance! I took advantage of all the dancing animations available.

7_ijustwannadance.gif
6_float.gif

Gesture Controlled DMX Moving Head Lights

When I registered for the Light and Interactivity course, one of my goals was to learn more about DMX lighting and how to control it. The final project was the perfect opportunity to create a DMX controller that would be responsive to hand gestures. For this, I used TouchDesigner to parse body tracking data from the Microsoft Kinect and madMapper to receive OSC messages that would change channel values to control the moving head lights.

The video shown below is the first prototype: the X coordinate of the tracked hand controlled the Pan channel, and the Y coordinate controlled the Tilt.
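
In code, that mapping is only a few lines. Here is a minimal sketch of the idea using python-osc rather than the actual TouchDesigner network; the OSC addresses /pan and /tilt and port 8010 are assumptions that would need to match whatever the madMapper project exposes:

    # pip install python-osc
    from pythonosc.udp_client import SimpleUDPClient

    MM_HOST = "127.0.0.1"  # madMapper on the same machine (loopback)
    MM_PORT = 8010         # assumed OSC input port; check madMapper's prefs

    client = SimpleUDPClient(MM_HOST, MM_PORT)

    def send_pan_tilt(hand_x, hand_y):
        """Map normalized hand coordinates (0..1) onto Pan/Tilt values."""
        pan = min(max(hand_x, 0.0), 1.0)   # clamp before sending
        tilt = min(max(hand_y, 0.0), 1.0)
        client.send_message("/pan", pan)   # assumed address for the Pan channel
        client.send_message("/tilt", tilt) # assumed address for the Tilt channel

    # Example: hand in the upper-right quadrant of the Kinect frame
    send_pan_tilt(0.75, 0.25)

In the actual project the OSC leaves TouchDesigner itself; this standalone sketch just makes the scaling explicit.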

Bill of Materials

  • Pair of UKING 80 Watt Double Beam Moving Head Lights, $128.78 on eBay

  • Kinect Azure, borrowed from ITP Equipment Room

  • ShowJockey USB -> DMX Adapter, a gift from CHiKA; can be purchased from garageCube

  • ZOTAC ZBox QK7P3000, my NUC PC

  • Rockville RDX3M25 25 Foot 3 Pin DMX Lighting Cable, 2 pack for $10.95 on eBay

Software

  • TouchDesigner 2020.22080

  • madMapper 3.7.4

  • OSC Data Monitor (Processing Sketch), for troubleshooting

System Flow

Kinect Azure -> TouchDesigner -> madMapper -> ShowJockey USB-DMX -> DMX moving heads (TouchDesigner and madMapper running on the PC)

TouchDesigner Setup

madMapper Setup

Screen Shot 2020-04-28 at 4.06.12 AM.png
Screen Shot 2020-04-28 at 1.45.26 AM.jpg



Roadblocks

ShowJockey USB DMX
TouchDesigner is a powerful tool and includes its own DMX Out CHOP, but Derivative built the TD environment with ENTTEC hardware in mind. Tom put together a workaround for DMXKing’s eDMX1 Pro: the DMX Out CHOP sends sACN to QLC+, which then controls the lights. The eDMX1 Pro uses an FTDI driver that the QLC+ software recognizes.
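
To make that sACN hop concrete, here is a minimal stand-in in Python using the sacn package (not Tom's TouchDesigner patch); the universe number and channel values are placeholders, and QLC+ would be configured to listen for sACN input on the same universe:

    # pip install sacn
    import time
    import sacn

    sender = sacn.sACNsender()  # E1.31/sACN sender on the default interface
    sender.start()
    sender.activate_output(1)   # placeholder: DMX universe 1
    sender[1].multicast = True  # QLC+ can listen for multicast sACN

    # Placeholder channel values, e.g. channel 1 = pan, channel 2 = tilt
    sender[1].dmx_data = (128, 64)

    time.sleep(1)  # let a few packets go out before shutting down
    sender.stop()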

I had difficulty finding the specification sheet for the ShowJockey SJ-DMX-U1 device, so I could not tell which driver it needed. I blindly downloaded the FTDI driver to see if the ShowJockey would then show up, but that did not work. Per Tom’s advice, I checked which serial devices my Mac recognized using the Terminal command “ls /dev/cu.*”. The ShowJockey did not show up.

Screen Shot 2020-05-03 at 12.54.14 PM.png
Screen Shot 2020-05-03 at 12.51.35 PM.png

When CHiKA gifted me the ShowJockey, we were using it only with madMapper, so I knew the device was functional in that environment. I assume this product on the garageCube site is what I have; its description reads: “This ‘NO DRIVER!’ DMX controller is plug & play and ready to work with madMapper and Modul8 (only).” For this reason, I decided to use TouchDesigner simply to send OSC data to madMapper for channel value changes.

OSC Connection
When establishing the link between TouchDesigner and madMapper, I knew that OSC would be straightforward: it’s a matter of matching network ports, setting the correct network / local addresses, using the appropriate protocol, and making sure the outgoing OSC message is in a format the receiving software recognizes. When I did not see any changes to the channel values within MM, I used the OSC Data Monitor to confirm that I was indeed sending an OSC message from TD. Sure enough, I was sending an appropriately formatted OSC message.

TD-OSC_troubleshoot1.PNG

I followed a few tutorials (see references), but none of them mentioned a very important detail. Tom pointed out, “You'll need to use the address 127.0.0.1 if you're trying to communicate between two programs on the same machine.” Duh. Thanks Tom!
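
For anyone debugging the same thing without Processing on hand, a few lines of python-osc can stand in for the OSC Data Monitor sketch; it prints every message that arrives, and the port here is a placeholder that must match whatever TD is sending to:

    # pip install python-osc
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def print_message(address, *args):
        # Print every incoming OSC address with its arguments
        print(address, args)

    dispatcher = Dispatcher()
    dispatcher.set_default_handler(print_message)  # catch all addresses

    # 127.0.0.1 is the loopback address: both programs on the same machine.
    # 8010 is a placeholder; match the port TouchDesigner is sending to.
    server = BlockingOSCUDPServer(("127.0.0.1", 8010), dispatcher)
    server.serve_forever()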

Notes

I picked the UKING 80W Double Beam moving heads because Louise had mentioned in class that UKING had decent reviews. I favored these lights for their basic functionality and value; however, I was not pleased with the color blending quality. Once my order arrived, I used my AKAI APC40 MIDI controller to change channel values within madMapper, just to confirm that the moving head lights were received in working condition.