Category: Portfolio

  • Commercial Work – Page 1

This post is a sample of the large-scale and commercial work I have done as the lead creative technologist at Fake Love in NYC.


    Lexus – Trace Your Road – Life Sized Video Game – Rome, Italy – 2013

My role: I was one of the lead technical developers and designers for this piece, along with Dan Moore and Ekene Ijeoma. Programmed in openFrameworks on OS X and iOS.

    Lexus | TRACE YOUR ROAD | Director’s Cut from Fake Love on Vimeo.

———————————

AmEx Instagram Towers – Fashion Week – Lincoln Center, NYC – 2012

My role: Lead technical architect for the hardware and interaction; additional programming by Caitlin Morris. Made with openFrameworks.

    Amex Fashion Week Instagram Towers from Fake Love on Vimeo.

    ———————————

NY Pops Gala 2012 – Interactive Conductor's Baton – Carnegie Hall, NYC – 2012

My role: I was the programmer and tech lead on this project. I devised the tracking system, custom baton, software, and design. Made with openFrameworks and Max/MSP/Jitter.

    NY Pops | Gala 2012 from Fake Love on Vimeo.

———————————

    Google Project Re:Brief Coke – Interactive Vending Machine – Worldwide – 2011

    My role: I was the lead tech for the installation/physical side of this project (another company did the banners and web server portion). I did the vending machine hacking, setup and programming in New York, Cape Town, Mountain View and Buenos Aires. This project went on to win the first Cannes Lions mobile award. Other programming and hardware hacking by Caitlin Morris, Chris Piuggi, and Brett Burton. Made with openFrameworks.

    Project Re:Brief | Coke from Fake Love on Vimeo.

———————————

    Shen Wei Dance Arts – Undivided Divided – Park Avenue Armory, NYC – 2011

My role: Lead projection designer, programmer, and live show visualist. I designed the entire 12-projector system for this Shen Wei premiere at the Park Avenue Armory. I also programmed and maintained the playback system for the five-night run of the show. Made with Max/MSP/Jitter and VDMX.

    Shen Wei | Park Avenue Armory from Fake Love on Vimeo.

———————————

    Shen Wei Dance Arts – Limited States – Premiere – 2011

My role: Lead projection designer, programmer, and live show visualist. I designed the playback and technology system for this new piece by choreographer Shen Wei. I also contributed heavily to some of the visual effect programming seen in some of the pre-rendered clips. Made with Max/MSP and VDMX.

    Shen Wei – Limited States from Fake Love on Vimeo.

———————————

    Sonos – Playground and Playground Deconstructed – SXSW and MOMI NYC – 2013

My role: I was the technical designer of the hardware and projections for this audio-reactive immersive piece. Red Paper Heart was the lead designer and developer on this project, which they made with Cinder.

    PLAYGROUND DECONSTRUCTED from Fake Love on Vimeo.

  • Sonic Prints


THIS PAGE IS OUTDATED! Check the new post here and download the software to try it for yourself!


Using openFrameworks and ofxFFT to generate 3D meshes of sound data for use in 3D printing.

This is very much a work in progress, but I wanted to share some initial results while I work on the final visual output. I have done some initial prints (see below), but I'd like to make much larger prints to really bring out the detail in the generated models. This project interests me because it lets me look at the structure of a song in a different way than a musical score or the volume-intensity graph we're used to. I can also play with the idea of making physical objects out of songs in a different way than burning a CD, pressing a vinyl record, or etching a wax cylinder.

I'll be releasing the source once I fix a few bugs and clean a few things up; I just need time to make those adjustments. Then you'll be able to tweak it and generate your own 3D meshes from your own real-time music and sound input.

The meshes are laid out like this: the left side is generally the bass/low end and the right side is the high end. Red or orange marks the loudest frequency band at that particular moment in time, and the white-to-black gradient shows the relative volume intensity at that time. I can adjust the number of frequency bands being analyzed to make the mesh coarser or finer.
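As a rough sketch of the idea (not the actual openFrameworks/ofxFFT code), the pipeline can be illustrated in plain Python: compute per-frame frequency magnitudes, stack one row of vertices per frame with low frequencies on the left, and stitch the rows into a printable OBJ mesh. The naive DFT, band count, and scaling here are illustrative stand-ins.

```python
import cmath
import math

def dft_magnitudes(frame, n_bands):
    """Naive DFT: magnitudes of the first n_bands frequency bins.
    (A real FFT library would be used in practice.)"""
    N = len(frame)
    return [abs(sum(frame[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N))) / N
            for k in range(n_bands)]

def frames_to_obj(frames, n_bands=16, height_scale=10.0):
    """One row of vertices per audio frame: x = frequency band
    (low end on the left, high end on the right), y = time,
    z = band magnitude."""
    rows = [dft_magnitudes(f, n_bands) for f in frames]
    lines = []
    for t, row in enumerate(rows):
        for b, mag in enumerate(row):
            lines.append(f"v {b} {t} {mag * height_scale:.4f}")
    # stitch adjacent rows into quad faces (OBJ indices are 1-based)
    for t in range(len(rows) - 1):
        for b in range(n_bands - 1):
            i = t * n_bands + b + 1
            lines.append(f"f {i} {i + 1} {i + 1 + n_bands} {i + n_bands}")
    return "\n".join(lines)

# toy input: eight frames of a slowly rising sine tone
frames = [[math.sin(2 * math.pi * (5 + t) * n / 64) for n in range(64)]
          for t in range(8)]
obj_text = frames_to_obj(frames, n_bands=16)
```

Writing `obj_text` to a `.obj` file produces a heightfield strip that most 3D printing slicers can import after solidifying.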

If you would like one of the 3D mesh files to look at for yourself, I can send you a sample. The individual files are about 20 MB each, so I don't want to host them here yet.

You can see some images of sample prints below; they are about the size of a bangle or bracelet.

    Sample prints now available on Thingiverse

I have been doing this as a personal project and as a side project at Fake Love, where I work: www.fakelove.tv

    3DFFT Sonic Prints – Work in Progress from blair neal on Vimeo.

(Image gallery: work-in-progress screenshots, debug views, and sample 3D prints, including meshes of Kanye West's “Power”, Annuals, and Son Lux's “All The Right Things”.)






    3DFFT _Sonic prints example – Son Lux – “Easy” from blair neal on Vimeo.

  • Demo Reel 2011

I finally put together a demo reel of a bunch of my previous work, most of which you can find in the rest of my portfolio. It includes some of my live visuals work, some of my interactive installation work, and a few of my music videos (both official and unofficial). I'll be adding to it eventually. All of the material on the reel was shot, edited, programmed, and tweaked by me.

  • The Wobbulator

I was finally able to cobble together a video for Nam June Paik's Wobbulator. It was one of my favorite pieces of equipment during my residency at the Experimental Television Center, and I was confused about why there wasn't much information about it on the web. There are a few grainy YouTube videos, but they don't show much of the exterior of the device or any of the real-time manipulations, so I wanted to make a little educational video. Most of the Wobbulator's source images in this video came either from a camera pointed out a window or from straight video feedback.

For a lot more information, check out the Experimental Television Center's website in their Video History Project area. There are tons of great articles on early analog video tools and techniques, and in particular a very detailed article on the Wobbulator. To give you some more context, here is the first paragraph of the article on the device:

    A raster manipulation unit or ‘wobbulator’ is a prepared television which permits a wide variety of treatments to be performed on video images; this is accomplished by the addition of extra yokes to a conventional black and white receiver and by the application of signals derived from audio or function generators on the yokes. The unit is a receiver modified for monitor capability; all of the distortions can thus be performed either on broadcast signals or, when the unit is used as a monitor, on images from a live or prerecorded source. Although the image manipulations cannot be recorded directly, they can be recorded by using an optical interface. The patterns displayed on the unit are rescanned; a camera is pointed directly at the picture tube surface and scans the display. The video signal from this rescan camera is then input to a videotape recorder for immediate recording or to a processing system for further image treatment. The notion of prepared television has been investigated by a number of video artists and engineers; this particular set of modifications was popularized by Nam June Paik.

I also made a quick music video with the Wobbulator as a key component; check it out here.

For more on my experience at the Experimental Television Center, check out a few of these links:
    [1] [2] [3] [4]

This video has been featured on Rhizome, Create Digital Motion, Hackaday, Makezine, Wired, and Notcot, among others.

  • Trip[tych]

    Fall 2009

In preparation for my thesis show, I began developing a series of pieces involving different uses of live visuals in relation to music. I was particularly interested in the fact that, in many cases, the visuals were led by the music; that is, they represented the music without the music ever reacting to them. This one-way process seemed broken to me, so I tried to come up with an idea that worked more like a feedback loop.

Trip[tych] treats the visualist as the conductor of the overall musical action. Three musicians sit behind see-through scrims that are projected on. Each musician is only allowed to play while their particular screen is lit up in front of them. While the musicians are improvising off each other, they are also supposed to be working off visual cues projected in front of them. The visualist also reacts live to what they are playing, so dynamic changes are seen and heard very fluidly. The use of scrims also solved a problem I was having with projecting onto a rectangular screen and how that disconnects the projection from the performers. By setting up the projection this way, I was able to create a multilayered space with the musicians in between. The content of the projections was based on a late-night cab ride I took back from a show in NYC.

    The first performance of Trip[tych] was at the West Hall Auditorium at the Fall 2009 MFA show. It featured Kyle McDonald on drums, Will Rogers on saxophone, and David Rhoderick on guitar effects pedal. It was also performed at my thesis show, Overflow.

    You can read more about Trip[tych] in my thesis.

  • Lightpainting

    Fall 2008
    Max/MSP/Jitter

I've always been interested in extended-exposure photography, but I was frustrated with the lack of real-time feedback and the fact that it took so long to make a single frame. I wanted to develop a way to achieve something like the lightpainting effect seen in stop-motion commercials, but make it live and interactive. It also isn't limited to a totally dark room and can be used in more normal lighting situations.
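The actual effect was built in Max/MSP/Jitter, but the core trick can be sketched as a per-pixel running maximum over incoming camera frames: bright pixels persist on the canvas like a long exposure, while the canvas can still be displayed after every frame for live feedback. This Python sketch uses toy nested lists of grayscale values as stand-ins for video frames.

```python
def light_accumulate(canvas, frame):
    """Per-pixel running maximum: a bright light source leaves a
    persistent trail on the canvas, like a long exposure, but the
    canvas can be shown live after every new frame."""
    return [[max(c, f) for c, f in zip(crow, frow)]
            for crow, frow in zip(canvas, frame)]

# toy 2x2 grayscale frames: a bright pixel moves across the image
frames = [
    [[200, 10], [10, 10]],
    [[10, 200], [10, 10]],
    [[10, 10], [200, 10]],
]
canvas = frames[0]
for frame in frames[1:]:
    canvas = light_accumulate(canvas, frame)
# canvas now holds the light trail: [[200, 200], [200, 10]]
```

Because only pixels brighter than what's already on the canvas are kept, the effect tolerates moderate ambient light instead of requiring total darkness.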

    The code for the example and more detail on the process is available here.

  • Thesis – A Visualist’s Practice

A-Visualists-Practice.pdf (1700 downloads) (Right-click and select “Save As”)

A Visualist's Practice is the culmination of my two years in the MFA program at Rensselaer Polytechnic Institute. My thesis advisor was Curtis Bahn. I focused on the history and practice of combining images and sound in live performance. I had already been developing my own performance interface for several years, but the MFA program allowed me to solidify some of my thoughts on the use of a visual performance instrument.

In the thesis I cover a brief history of light instruments, describe my visual instrument, and give a detailed outline of my thesis show, Overflow. I also conducted interviews with some well-known names in live visuals, including Tony Martin, Chris Allen (of The Light Surgeons), and D-Fuse. Below is the abstract; please let me know if you have any feedback on the paper.

    Abstract:

This thesis investigates the use of live video performance within a musical context. Live video performance involves projected video imagery that is digitally manipulated through the use of software in real-time. A performer of live video may use hardware interfaces that permit a level of expressiveness on par with that of a musical instrument. Artists have been creating and refining different interfaces for interacting with light and images in a musical context for centuries. Early interfaces relied on an idea of directly connecting musical elements, such as pitch, with abstract visual elements, such as color. Modern methods of interacting with video do not require these direct connections to musical elements, and instead allow the video to be an independent contributor to the experience of a new kind of multimedia performance. I give an example of an interface for live video performance by detailing my own expressive visual instrument. I also outline the use of live video in the culmination of my thesis work, a performance of five multimedia pieces that explore the various relationships between projected video and music.

  • Lonelygirl512

    Spring 2007
    Max/MSP/Jitter

This project was a collaboration between myself, Zane Van Dusen, and Kyle McDonald. Using Max/MSP and Jitter, we created a networked video loop using three laptops and a wireless router. The first laptop sent each individual channel of video to the second laptop, then on to the third, and then back to the first. This feedback loop would eventually degenerate into a mess of noisy pixels. The degraded transfer came from sending 512-pixel packets over UDP, which, unlike TCP, does not guarantee delivery, so packets were simply lost along the way.

Our source footage came from YouTube star “Lonelygirl15.” The ambient audio and regular talking audio were added in post-production. We're not sure what causes the stripes at the end, but our guess is that they result from some sort of video “harmonic.”
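The original patch was built in Max/MSP/Jitter, but the packet scheme can be sketched in Python: chunk a frame into numbered 512-pixel payloads, and on the receiving end write whatever arrives over a copy of the previous frame, so a dropped UDP packet leaves stale pixels behind. The function names and the one-packet loss simulation here are illustrative, not the original code.

```python
PIXELS_PER_PACKET = 512

def packetize(pixels):
    """Split a flat 8-bit pixel buffer into numbered 512-pixel payloads.
    Each payload = 2-byte packet index + raw pixel bytes; in a real
    patch each payload would go out via sock.sendto(payload, addr)."""
    return [i.to_bytes(2, "big") + bytes(pixels[off:off + PIXELS_PER_PACKET])
            for i, off in enumerate(range(0, len(pixels), PIXELS_PER_PACKET))]

def reassemble(packets, previous):
    """Write whichever chunks arrived into a copy of the previous frame.
    A dropped packet leaves the old pixels in place -- the source of the
    gradual degradation as the loop keeps circulating."""
    frame = list(previous)
    for p in packets:
        idx = int.from_bytes(p[:2], "big")
        chunk = list(p[2:])
        frame[idx * PIXELS_PER_PACKET: idx * PIXELS_PER_PACKET + len(chunk)] = chunk
    return frame

# simulate one hop of the loop with packet 1 lost in transit
frame = list(range(256)) * 8                 # 2048 fake grayscale pixels
received = [p for i, p in enumerate(packetize(frame)) if i != 1]
out = reassemble(received, previous=[255] * len(frame))
# pixels 512-1023 keep the previous frame's value (255); the rest arrive intact
```

Run this over three machines in a ring and the stale regions compound every lap, which is exactly the degeneration the piece was built around.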