This post is a sample of the large-scale and commercial work I have done as the lead creative technologist/visualist at Fake Love in NYC.
To be clear, these are not my personal works or designs. I often work with a very talented team that helps realize each of these projects, from creative direction to design and production, and not every person involved is credited below. I am only listing projects to which I made a significant contribution in terms of technology or development (and occasionally design).
I am also indebted to the creative code community, in particular to openFrameworks, for providing the software tools behind most of these pieces.
I was the lead tech for this event, which we developed in collaboration with 7Up. We were tasked with creating special interactive elements for an EDM concert for a deaf audience, performed by DJ superstar Martin Garrix. We had haptic floor elements that vibrated with the music, cymatics pieces that vibrated water to visualize the music, fog machines, and live visuals. I oversaw much of the development of the different elements, developed a fair amount of the live visuals, and built and performed with the visuals system live during the concert.
For this one, I contributed heavily to the technical organization and physical execution. The system required to get everything up, cloned, running, talking over the network, and automated was surprisingly complex, but it went off without a hitch.
I was the team lead for this Nike project, which ran for nearly 2 weeks during the Olympic track and field trials in Eugene, Oregon. My primary programming contribution was the generative visuals shown in one of the specially designed domes.
We also had an interactive experience in the following dome that allowed 16 participants to race each other for 60 seconds on special manually powered treadmills. The treadmill data was sent back to our visualization software, which let us project every participant's race progress in real time. Matt Felsen and Mike Romeo were responsible for making the treadmill experience a flawless piece of software.
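The core of an experience like this is simple bookkeeping: each treadmill reports distance over the network, and the visualizer normalizes that into per-lane race progress. Here is a minimal sketch of that idea in Python; the class name, the 400 m race length, and the per-tick message shape are all my assumptions for illustration, not details of the actual Fake Love software.

```python
class RaceTracker:
    """Accumulates per-treadmill distance and reports normalized race progress.

    Hypothetical sketch: the real system's message format and race length
    are not documented here, so both are invented for illustration.
    """

    def __init__(self, num_lanes=16, race_length_m=400.0):
        self.distance = [0.0] * num_lanes
        self.race_length = race_length_m

    def on_treadmill_tick(self, lane, meters):
        # Each network update reports meters run since the last tick.
        self.distance[lane] += meters

    def progress(self):
        # 0.0-1.0 per lane, clamped at the finish line, ready to drive
        # the projected race visualization each frame.
        return [min(d / self.race_length, 1.0) for d in self.distance]
```

A renderer would then poll `progress()` once per frame and draw 16 markers along the projected track.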
My role: I was one of the lead technical developers for this piece, along with Dan Moore and Ekene Ijeoma. It was programmed in openFrameworks on OS X and iOS. I made the tracking system and the iOS app, designed the projector/technology system that ran it, and managed the on-site live run.
Fake Love designed this 48-screen interactive video wall for SXSW 2015. Caitlin Morris did the multi-channel video playback software development and the architectural wall design, and I was primarily in charge of the video hardware (choosing the miniature screens, wiring ideation). Matt Conlen and Naoise Boyle developed the backend/frontend for the control tablet.
Fake Love designed and installed a pop-up gallery that explored the history of wearable technology through interactive exhibits based on the 5 senses. My primary contributions were some of the sound design and programming for an interactive electronic harp. I also modified the LCD monitors that were only visible through the special polarized glasses hanging in the space (I hand-removed each LCD's polarization layer).
For this piece, designed by Fake Love, Dan Moore and Jason Levine were the primary developers of the audio-reactive visuals and sound. Mike Romeo did the sound design and circuit development for the haptic sensing element. I led the on-site installation, projection design, and hardware integration across all the sculptures.
For this piece, Aramique and Fake Love collaborated with Sonos to make 4 different art-directed rooms, each awash with an audio-reactive projection piece. We used openFrameworks: Dan Moore did the primary visual development, Jonathan Dahan did the Node backend integration with Sonos and Echonest, and I developed the audio analysis engine and logic gates that determined how the piece would react to incoming audio, based on FFT data and the Echonest analysis returned for each song title (loudness, danceability, energy, etc.).
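To give a flavor of that kind of logic: an FFT splits the incoming audio buffer into frequency bands, and a gate opens when the energy in a band (scaled by a per-song value like Echonest's energy) crosses a threshold. The sketch below shows the general technique in Python with NumPy; the actual piece was written in openFrameworks, and the band ranges, threshold, and function names here are my own illustrative choices, not the production code.

```python
import numpy as np

def band_energies(samples, sample_rate,
                  bands=((20, 250), (250, 2000), (2000, 8000))):
    """Split a mono audio buffer's magnitude spectrum into coarse band energies.

    Band edges are arbitrary illustrative choices (bass / mids / highs).
    """
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]

def reactive_gate(samples, sample_rate, bass_threshold=50.0, energy=0.5):
    """Open a visual trigger when bass energy, scaled by a per-song
    'energy' value (as Echonest-style analysis would supply), crosses
    a threshold. Threshold and scaling are hypothetical."""
    bass, mids, highs = band_energies(samples, sample_rate)
    return bool(bass * energy > bass_threshold)
```

In a real pipeline this would run once per audio callback, with several such gates driving different layers of the projection.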
Fake Love worked with an extremely talented team to realize this piece, which took older devices like typewriters and vintage cameras and granted them the ability to post to modern social networks like Twitter and Instagram. My smaller part was developing the audio recording software and effects chain for the 1950s Gibson so it could post to SoundCloud; I was basically in charge of making sure it sounded good and recorded correctly. I also did some of the initial installation of the devices onto the train itself.
I worked with Dan Moore and Charlie Whitney to create these interactive touch-screen prisms. I did some of the graphics and display programming, as well as some of the hardware design and implementation. I was also on-site for installation and support.
Dave Lublin of Vidvox made the original visuals rig and toured with Girl Talk in 2010/2011. On a second leg of the same tour, I went out with them for a week to train their visuals creator, Andrew Strasser, to run the whole rig by himself. I also worked with their LED wall designer, John Frattalone, to tighten up the visuals so they were as pixel-accurate as possible on a very unusual hexagonal wall.
My role: I was the programmer and tech lead on this project. I devised the tracking system, custom baton, software, and design. Made with openFrameworks and Max/MSP/Jitter.
My role: I was the lead tech for the installation/physical side of this project (another company did the banners and web server portion). I did the vending machine hacking, setup, and programming in New York, Cape Town, Mountain View, and Buenos Aires. This project went on to win the first Cannes Lions mobile award. Other programming and hardware hacking by Caitlin Morris, Chris Piuggi, and Brett Burton. Made with openFrameworks.
My role: Lead projection designer, programmer, and live show visualist. I designed the entire 12-projector system for this Shen Wei premiere at the Park Avenue Armory. I also programmed and maintained the playback system for the 5-night run of the show. Made with Max/MSP/Jitter and VDMX.
My role: Lead projection designer, programmer and live show visualist. I designed the playback and technology system for this new piece by choreographer Shen Wei. I also contributed heavily to some of the visual effect programming seen in some of the pre-rendered clips. Made with Max/MSP and VDMX.
My role: I was the technical designer of the hardware and projections for this audio-reactive immersive piece. Red Paper Heart designed the visuals and developed them with Cinder. Aramique was the creative director.
Expanding on the software I made for the NY Pops 2012 Gala, this was an interactive piece that used a 4K Sony camera and a 4K LG TV at their pop-up shop at the 2014 US Open. Users would swing a special tennis racquet outfitted with IR LEDs to create ribbon graphics in real time. With the help of openFrameworks, I developed the visuals and the software/hardware pipeline used in the piece.
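The tracking side of a piece like this usually reduces to finding the bright IR blob in each camera frame and appending it to a fixed-length trail that gets drawn as a ribbon. Here is a minimal sketch of that loop in Python with NumPy; the actual piece was built in openFrameworks, and the threshold, trail length, and class names here are illustrative assumptions only.

```python
import numpy as np
from collections import deque

def brightest_point(frame, threshold=200):
    """Return (x, y) of the brightest pixel in a grayscale frame if it
    exceeds a threshold (the IR LED blob), else None. A real tracker
    would use blob detection; single-pixel argmax keeps the sketch small."""
    idx = np.unravel_index(np.argmax(frame), frame.shape)
    if frame[idx] < threshold:
        return None
    y, x = idx
    return (int(x), int(y))

class Ribbon:
    """Fixed-length trail of tracked points, redrawn as a ribbon each frame."""

    def __init__(self, max_points=64):
        self.points = deque(maxlen=max_points)

    def update(self, frame):
        p = brightest_point(frame)
        if p is not None:
            self.points.append(p)
        # The renderer would sweep a stroke through these points.
        return list(self.points)
```

Each video frame drives one `update()` call, and the returned point list feeds whatever draws the ribbon geometry.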