Client: Class Work
Project: Devising Experiential Media Systems Studio +
Designing with New Technology Studio
This course was built around understanding what makes an experience and designing media systems that engage participants in unique and memorable environments. For my final project, I was inspired by rumors of the Spider-Man Web Slingers attraction set to open in June 2021 at Disney California Adventure. The rumors indicated that the system would let guests "sling webs" by moving their hands over an IR sensor. Armed with a Leap Motion, I was determined to create a similar gaming environment. The front end of the course focused on learning and using Isadora, a tool built specifically for theatrical applications. Unfortunately, the software's 3D capabilities were very limited, so I moved into TouchDesigner.
The system uses a Kinect-driven parallax effect to further immerse the participant in the experience. TouchDesigner for Mac doesn't support the Kinect out of the box because the Microsoft SDK isn't available on macOS. To work around this, I activated the Kinect in Isadora and streamed the head bone's skeleton position data into TouchDesigner over OSC.
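The shape of that bridge is simple: head position comes in over OSC and is inverted into a camera offset. Below is a minimal sketch of the receiving side using the python-osc library; in TouchDesigner itself an OSC In CHOP handles this, and the /head address, port 7000, and scale factor are assumptions for illustration.

```python
# Sketch of the Isadora -> TouchDesigner OSC bridge.
# Assumes Isadora sends the head bone as "/head x y z" on port 7000;
# address, port, and scale are illustrative, not the project's actual values.
from pythonosc import dispatcher, osc_server

PARALLAX_SCALE = 0.5  # how strongly head movement shifts the virtual camera

def on_head(address, x, y, z):
    # The camera moves opposite the viewer's head to fake depth
    # on a flat projection surface.
    cam_x = -x * PARALLAX_SCALE
    cam_y = -y * PARALLAX_SCALE
    print(f"camera offset: ({cam_x:.3f}, {cam_y:.3f})")

disp = dispatcher.Dispatcher()
disp.map("/head", on_head)
osc_server.BlockingOSCUDPServer(("0.0.0.0", 7000), disp).serve_forever()
```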
To launch a projectile, a user flicks their hand upward. This gesture is fed into a series of logic gates that determine which hand performed the launch gesture, then pass that hand's position and rotation values into the launch mechanism.
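The core of that gate logic is a per-frame check of each hand's upward velocity against a threshold. Here is a sketch of the idea in plain Python; the channel names and the threshold value are stand-ins for the Leap Motion CHOP data, not the project's actual numbers.

```python
# Sketch of the launch-gesture logic: a hand "flicks" when its upward
# palm velocity crosses a threshold. Field names and the threshold
# are assumed placeholders for the Leap Motion channels.
FLICK_VELOCITY = 800.0  # mm/s, assumed threshold

def detect_launch(left, right):
    """Each arg is a dict with 'vel_y', 'pos', 'rot' from the Leap data."""
    for name, hand in (("left", left), ("right", right)):
        if hand["vel_y"] > FLICK_VELOCITY:
            # This hand performed the flick; pass its transform to the launcher.
            return name, hand["pos"], hand["rot"]
    return None  # no launch this frame
```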
The entire game environment is built inside a Bullet Solver so that physics affects everything. The launch hand's data is passed into a Force operator, which propels the projectile forward. A series of timers tells the projectile when to listen to the launch force, when to be moved solely by gravity, and when to reset.
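Those timers effectively form a three-phase state machine. The sketch below shows that structure with assumed frame counts; in the project the phases are Timer CHOPs gating the Force operator inside the Bullet Solver.

```python
# Sketch of the projectile's timer-driven phases. Durations are assumed.
LAUNCH_FRAMES = 10    # frames the launch force is applied
FLIGHT_FRAMES = 180   # frames of gravity-only flight before reset

def projectile_phase(frames_since_launch):
    if frames_since_launch < LAUNCH_FRAMES:
        return "apply_launch_force"   # force operator pushes along the hand vector
    elif frames_since_launch < LAUNCH_FRAMES + FLIGHT_FRAMES:
        return "gravity_only"         # Bullet's gravity takes over
    else:
        return "reset"                # respawn at the hand for the next throw
```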
The target, randomly repositioned every round, detects collisions and uses them to increment the in-game score.
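In outline, the round logic pairs a random placement with a collision callback. This is a hedged sketch with assumed coordinate ranges and body names, not the actual solver callback API.

```python
# Sketch of the per-round target logic: random placement plus a collision
# handler that bumps the score. Ranges and body names are assumptions.
import random

score = 0

def place_target():
    # New random position inside the playfield each round.
    return (random.uniform(-5, 5), random.uniform(1, 4), random.uniform(-8, -4))

def on_collision(body_a, body_b):
    # Count only projectile-vs-target contacts toward the score.
    global score
    if {body_a, body_b} == {"projectile", "target"}:
        score += 1
```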
The purpose of this studio was to research technologies and create potential applications with them in a collaborative group setting. One of my group members was researching virtual assistants and another wearables. The prototype I created acts as a potential wearable system with voice control. The joystick serves as the user input, which in a real product would likely be a multitouch surface or something similar, and the LED grid visualization would be replaced with an LCD or a flexible LED display.
The Firmata firmware is uploaded to an Arduino. This protocol allows the Arduino to communicate with a corresponding TouchDesigner operator over serial.
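To show the shape of that serial link, here is a minimal sketch of the same handshake from plain Python using the pyfirmata library; in the project, TouchDesigner's Firmata component plays this role. The serial port path and pin assignments are assumptions.

```python
# Sketch of the Firmata link using pyfirmata. Port path and pins are assumed.
import time
from pyfirmata import Arduino, util

board = Arduino("/dev/tty.usbmodem14101")  # assumed serial port

# Start an iterator so analog readings (the joystick axes) keep updating.
it = util.Iterator(board)
it.start()

x_axis = board.get_pin("a:0:i")  # analog pin 0, input
y_axis = board.get_pin("a:1:i")  # analog pin 1, input

while True:
    print("joystick:", x_axis.read(), y_axis.read())
    time.sleep(0.05)
```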
The Siri interaction with the TouchDesigner setup is a Siri Shortcut that uses a webhook to trigger an IFTTT applet, which adds a value to an Adafruit IO data feed. Changes to that feed drive events in TouchDesigner via an MQTT client.
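The last hop of that chain is just an MQTT subscription to the Adafruit IO feed. Below is a sketch of that listener using the paho-mqtt library; in the project an MQTT Client DAT fills this role, and the username, key, and feed name are placeholders.

```python
# Sketch of the feed listener using paho-mqtt. Credentials and the
# feed key are placeholders; Adafruit IO topics follow "user/feeds/feed-key".
import paho.mqtt.client as mqtt

AIO_USER = "YOUR_USERNAME"
AIO_KEY = "YOUR_AIO_KEY"
FEED = f"{AIO_USER}/feeds/siri-commands"  # assumed feed key

def on_connect(client, userdata, flags, rc):
    client.subscribe(FEED)

def on_message(client, userdata, msg):
    # Each Siri trigger lands here as the feed's new value.
    print("Siri sent:", msg.payload.decode())

client = mqtt.Client()
client.username_pw_set(AIO_USER, AIO_KEY)
client.on_connect = on_connect
client.on_message = on_message
client.connect("io.adafruit.com", 1883)
client.loop_forever()
```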
Toggling the joystick increments counters that form the X and Y coordinates of the illuminated LED. That math feeds into a series of Python conditional statements, and the MQTT client can also change which LEDs are illuminated through the same conditionals.
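The gist of those conditionals is a clamped pair of counters with two ways to change them. This sketch assumes an 8x8 grid and a hypothetical "center" voice command; both are illustrative, not the project's actual values.

```python
# Sketch of the joystick-to-LED math on an assumed 8x8 grid: each joystick
# toggle nudges a counter, and conditionals clamp and select the lit LED.
GRID_W, GRID_H = 8, 8  # assumed grid size

x, y = 0, 0

def light_led(cx, cy):
    print(f"LED at ({cx}, {cy}) illuminated")

def on_joystick(dx, dy):
    """dx/dy are -1, 0, or 1 per toggle of the stick."""
    global x, y
    x = max(0, min(GRID_W - 1, x + dx))  # clamp to the grid
    y = max(0, min(GRID_H - 1, y + dy))
    light_led(x, y)

def on_mqtt(value):
    # The Siri/MQTT path can override the lit LED directly.
    global x, y
    if value == "center":  # hypothetical voice command
        x, y = GRID_W // 2, GRID_H // 2
        light_led(x, y)
```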