beepHR

Video

Hired by M2 Digital Media Group to produce this short explainer video for a new software product by beepNow. I assisted with script writing and was the visual artist responsible for all animation, video, sound, and the final deliverable.

Video Playback Software for LED Advertising Truck

Video

I teamed up with Boom Digital Ads to create the custom audio and video playback software that runs the multi-screen display system rigged to their truck.

The software was created to give Boom dynamic control over its three-panel LED video system.

This custom software (created with the nodal programming application TouchDesigner) made it easy for the operator to choose which content was displayed on which of the three screens, or to span one single video across all three screens of the truck. The software also contained an integrated karaoke system, which they use for outdoor special events.
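The routing model behind that software is simple to sketch. Here is a minimal pure-Python illustration of the idea (the panel names and resolutions are my assumptions for the example, not Boom's actual specs):

```python
# Minimal sketch of the three-panel routing idea (not the actual
# TouchDesigner network): panel names and resolutions are assumptions.
PANEL_W, PANEL_H = 384, 768          # hypothetical per-panel LED resolution
PANELS = ("left", "center", "right")

def span_regions():
    """Crop rectangles (x, y, w, h) that split one wide video across all panels."""
    return {name: (i * PANEL_W, 0, PANEL_W, PANEL_H)
            for i, name in enumerate(PANELS)}

def route(assignments):
    """Give each panel its own clip; unassigned panels fall back to black."""
    return {name: assignments.get(name, "black") for name in PANELS}

print(span_regions()["right"])                 # (768, 0, 384, 768)
print(route({"center": "promo.mov"})["left"])  # black
```

Either mode reduces to the same question the operator answers per panel: which region of which source ends up on which screen.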

Wormhole Stage Visuals – Untz Festival 2019

Video

A behind-the-scenes look into my project file for The Untz Festival 2019

Projection Mapping and Live Visuals

The Untz Festival 2019 was held in Mariposa, California. I did content creation and live visuals on the Wormhole Stage, brought to you by Wormhole Entertainment. The stage was designed and built by Simple Machinists.

For this project I used Resolume and TouchDesigner for live visuals, mixing custom animations prerendered to be mapped to the stage with real-time 3D plugins that controlled texturing and lighting live.

Animations for Phutureprimitive Fall 2015 Tour

Video

Here are some of the animations I made for Phutureprimitive's Fall 2015 tour. The animations were created for the Carey Thompson stage, and Even Glantz from Third Eye Visuals VJed the clips alongside work from other animators.

You can see more work from Carey Thompson at

Galactivation
http://www.phutureprimitive.com/tour-dates

TEDxBerkeley 2015

http://tedxberkeley.org/event/wisdom-compassion-connection

I had the opportunity to work with Oaktown Productions and Viz Wrap Society for the second year in a row at TEDxBerkeley. I assisted with the setup of all video systems and programmed the multimedia presentations for the speakers.

Hand To Mouth – World Premiere Original

Hand to Mouth – A seed’s journey from hand to plate.
Filmed at Main Stage West May 2015
Featuring John Duykers, Kira Dills-Surra, and James Pelican
Director and Librettist: Missy Weaver
Composer/Performer: Miguel Frasconi
Motion Graphics, Video, Projection Mapping, and Interactive Programming by Matthew Eben Jones, Logan Johnston, and Matthew Childers
Costumes by Maria Christoff

This was a really great project, and we got to work with a lot of different mediums. We created all custom content for the 3D projection-mapped stage, which evolved cue by cue with the performance. The custom animations, projection mapping, LED strips, and stage lighting were controlled with a custom video-and-lighting application programmed in TouchDesigner. A few scenes also incorporated real-time animation using motion-tracking cameras. The tech for this show was in development for two years and is still an ongoing process, expanding with the performances of First Look Sonoma.

Here is a quick edit of some of the animations and motion graphics I designed.

Hole In Space – Connecting two Oakland neighborhoods in real-time

Maya Gurantz and Ellen Sebastian Chang came up with this incredible idea, and with financial backing from The Great Wall of Oakland they assembled the audio and visual technical team of Matthew Eben Jones, Logan Johnston, and myself, Matthew Curtis Childers.

The goal of the project was to connect two contrasting neighborhoods of Oakland through a live video portal: essentially a live Skype call that stayed on for a few hours every night. Whenever it was running, Oakland residents in the two neighborhoods could stop to see and talk with each other in real time. One of the really cool things about this project is that we barely spent any money on the software side.

We used Skype to create the video connection, Syphoner ($30.00) to get the video feed from Skype into Quartz Composer, and Quartz Composer (free for registered Apple app developers) to projection map the video feed onto the storefront windows.

We used Hamachi and VNC Viewer to log on to the two computers remotely and initiate the calls every night. We were also able to turn the projectors on remotely by connecting them to the computers through a serial cable and then executing a script.
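The projector power-on step can be sketched too. The command string below is hypothetical (every projector model defines its own RS-232 protocol), and in practice a library like pyserial would write the bytes to the serial port after logging in over Hamachi and VNC:

```python
def power_command(on: bool) -> bytes:
    """Build a hypothetical CR-terminated ASCII serial command for a projector."""
    return b"PWR ON\r" if on else b"PWR OFF\r"

# A real session (using the third-party pyserial package) would look roughly like:
#   port = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)
#   port.write(power_command(True))
print(power_command(True))   # b'PWR ON\r'
```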


Come back soon for a more in-depth description of our technical setup. Until then, please check out these articles written about the project.

http://www.eastbayexpress.com/oakland/artists-create-two-way-video-portal-for-oaklanders-to-meet-their-neighbors/Content?oid=4178751

http://www.citylab.com/design/2015/02/two-oakland-neighborhoods-connect-through-a-hole-in-space/385014/

http://eatdrinkfilms.com/2015/02/05/site-and-sight-a-hole-in-space-oakland-redux-opens-a-window-between-two-neighborhoods/

Fort Mason Center PIER 2 REOPENING CELEBRATION

Video

Fort Mason Center San Francisco

PIER 2 REOPENING CELEBRATION

This was a really unique opportunity, and I got to work with the famous Matthew Eben Jones as the lead. The mapping was really simple for this gig, but we still had to calibrate two Christie 7.5K projectors so that they would overlap seamlessly.

The most exciting thing about this gig was the content we were able to work with. We were provided maps of Fort Mason going all the way back to the late 1700s and were asked to animate them in chronological order. We overlapped every decipherable image to form an hour-long animation of the geography and architecture slowly changing over time. We also created an animated timeline so the date would be displayed with each image.




First Look Sonoma’s Visual Wizardry : H2M Preview Performance

The first look into First Look Sonoma’s Visual Wizardry

I worked with Matthew Eben Jones and Logan Johnston on this project for First Look Sonoma. Matt Jones was our creative and technical lead, and due to his expertise I also learned a lot about theater production and rigging. This was a really fun project, and we got to do a lot of experimenting for the purpose of developing the final concept for the official debut.

Matthew Eben Jones filmed most of the video content on an organic farm in Sebastopol, California. Logan Johnston and I created the animations with Cinema 4D, After Effects, Mandelbulb 3D, and TouchDesigner.

For this project we used one laptop running QLab, one laptop running Resolume, MadMapper, and Processing, and a desktop PC running TouchDesigner.

We used QLab so the stage director could just push “go” to trigger the next cue in each scene. QLab would then send an OSC message to both of the other computers simultaneously.


The laptop running Resolume and MadMapper used all three outputs of the Matrox TripleHead2Go DisplayPort edition. This laptop also ran Processing, fed by a video Syphon feed from Resolume.

We were really eager to use TouchDesigner for many reasons. We experimented with a lot of different techniques, such as using the Xbox Kinect for motion tracking, and we are still working on some ideas for the official debut.


We ended up using TouchDesigner with CamSchnappr for 3D mapping parts of our stage. We had two fake rock faces on either side of the stage, made out of scraps of spray-painted cardboard staple-gunned to the barn doors. We used the Kinect with the Skanect software to scan in 3D models of our rock faces. We cleaned up the models in Cinema 4D, imported them into TouchDesigner, used CamSchnappr to snap our geometry into perspective, textured the geometry with rendered animations, and also created real-time animations with the geometry.

Here is a video of our Augmented Reality Testing

Occidental Arts and Ecology Center in Occidental, California.

One of the really great things about building a cueing system in TouchDesigner is having uncompromising control over every parameter in each cue. Once we programmed all of the cues, we didn't have to touch anything during the performance; the only clicking was the stage director's single mouse click at the beginning of each scene. The computer running QLab would send TouchDesigner a single cue, which pointed to a specific row in a table. Each column in that row contained the values for the effects and video cues for that scene. Thank you Matthew Reagan for your help with setting up the cue list.
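That table-driven design is easy to sketch outside TouchDesigner. A minimal pure-Python version of the idea (the cue numbers, parameter names, and values are all made up for illustration):

```python
# Hypothetical cue table: one row per scene, one column per parameter.
CUE_TABLE = {
    1: {"clip": "forest.mov", "blur": 0.0, "led_level": 0.2},
    2: {"clip": "seeds.mov",  "blur": 0.4, "led_level": 0.9},
}

def fire_cue(cue: int) -> dict:
    """Look up every parameter value for a scene; an unknown cue changes nothing."""
    return CUE_TABLE.get(cue, {})

print(fire_cue(2)["clip"])   # seeds.mov
```

Because every parameter lives in the table, adding a scene is just adding a row, and the operator never touches individual effect values mid-show.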

Python in TouchDesigner.

Simple Python scripts go a long way.

Live Visuals Demo Reel 2015

Digital Introspect and Yael Braha for the Minnesota Orchestra at the Minneapolis Orchestra Hall. More
Minnesota Orchestra

The Human Experience – Santa Cruz, Ca

NYE: New Bohemia at the Armory in San Francisco, Ca.  More.

Digital Media Art by Digital Introspect

Digital Media Art by Digital Introspect

Guitar Fish 2013 – Cisco Grove, Ca

“White Ocean” Burning Man 2013 – Black Rock City, NV.

Burning Man 2013 on the White Ocean Stage.

Wormhole Wednesdays in Oakland, Ca.

BURT: Emissions concert in San Francisco, Ca.

 

Minnesota Orchestra – Live Audio Reactive Visuals

Multimedia collaboration with The Eyeo Festival and Northern Spark

The Pitch

On May 22nd, 2014 I received an email from Yael Braha (Creative Director) with the words “2D/3D Artists & Projection Mapping Artist Needed (URGENT!)” in the subject field. The rest of the email revealed that she was being commissioned to create an audio-reactive live visualization for the Minnesota Orchestra during The Northern Spark and Eyeo Festivals starting on June 14th. Once I took a peek at the attached images of the stage, I was immediately sold.

This is the stage in the Orchestra Hall in Minneapolis, Minnesota.

We would be creating one visual set designed specifically for the Minnesota Orchestra’s performance on Friday and Saturday, and one visual set for the electronic musician and performer Dosh after the orchestra on Saturday. On both Friday and Saturday after the Minnesota Orchestra’s performance there would be a kiosk set up with eight different instruments so that any member of the audience could play the instruments and watch the projection mapped visual animations react in real time.

The event would be held at the Orchestra Hall in Minneapolis, Minnesota, and we would be working with two 1920×1080 20K-lumen projectors and eight audio inputs into a Behringer Firepower FCA1616 8-channel audio mixer for audio reactivity. Resolume was our choice of media server, used in conjunction with MadMapper in order to map the complex geometry of the stage. We wanted the absolute clearest and brightest image possible, so we stacked the two projectors directly on top of each other. Both projectors had to throw from the same distance, so we decided one projector would zoom in as far as it could while still encompassing the entire center of the stage, and the other would stretch as wide as it could without bleeding into the audience. The tightly focused center projector would have its lumens and pixels more concentrated.

Kiosks

Developing the template

We only had two and a half weeks to come up with a full production, so we began working immediately. Luckily we had help creating enough content, but first we needed to create a digital 3D model of the stage in Cinema 4D that would become the 3D template with locked camera positions. This way we were able to share the 3D template with each animator and ensure that each animation would have exactly the same perspective.


Half of our animators were more comfortable working in 2D and Quartz Composer, and the other half in 3D. We knew any 2D content would either appear completely flat when mapped on the stage or become badly distorted if we tried to fake the perspective by warping the 2D images on the actual stage. Each cube had three visible faces, so if we attempted to create a third plane from a 2D image, the animations would not line up right on the edges of each cube face. We wanted to make the most of this opportunity to projection map a stage with this geometry, so using only flat 2D content was unacceptable. To fix this, we took the animations rendered by our 2D animators and used them as flat camera-mapped textures in our Cinema 4D template. This way we could still give the illusion of 3D perspective in our 2D animations by positioning lights in the 3D space and playing with shadows.

Compensating for perspective

We only had a photo taken with a cell phone camera, but at least it was from the audience's perspective. We had no idea of the actual dimensions of the stage. If we could have afforded to fly out to Minneapolis early, we would have taken a better photograph and calculated the warping of perspective from the camera lens to the projector lens, and from the projector's perspective to the audience's perspective. At that point I was not yet familiar with TouchDesigner, or I would certainly have used it. Since we wouldn't have time on site in Minneapolis and needed to start creating animations immediately, we decided to work from that one cell phone photo. This presented a couple of hurdles. We could match the model to the audience's perspective to the best of our ability, but there were too many unknowns given our limitations. There was no way we were going to be able to use Resolume's mesh-warp feature to pull our unavoidable errors of perspective into shape; that method only works with far simpler geometry than we were working with. Thankfully there is a wonderful piece of mapping software called MadMapper.

Benefits of MadMapper

With MadMapper we would be able to cut out each face of the cubes with no overlap and map them individually onto the physical stage. We originally wanted to use one 1920×1080 output from Resolume, but after discovering that we absolutely needed MadMapper, it became necessary to separate the background from the cubes in Resolume, making them two entirely different animations that had to be cued from different layers at the same time. We had to increase the output resolution in Resolume to do this, and the MacBook Pro couldn't handle it, so we downsized each animation from 1920×1080 to 1280×720. Even then, using the HDMI output and the Thunderbolt output simultaneously was still too much work for the onboard graphics card, but once we switched to the Matrox TripleHead2Go everything began to run smoothly. Our final solution used five layers in Resolume: two layers stacked for the cubes and two stacked for the background, so we had the ability to mix between them, plus one layer just for the animations mapped onto the balconies on the sides of the stage.

Back to the Template

We passed a rough Cinema 4D model, with the cell phone photo as a camera-mapped background image, off to Tim Shetz to work his 3D magic and create a template that would make it easier for our animators to render from two different cameras without breaking the perspective. Tim wrote a script for the Cinema 4D template that made this incredibly easy for our animators. This was also necessary so our animations would share the same perspective and objects could move from the faces of the cubes onto the background seamlessly, even though one of the projectors had a much wider image. The two separate animations would need to be triggered at the same time from two different layers in Resolume, but we would take care of that in the MIDI mapping.


Open Sound Control

Now that we had all of the mapping and animating figured out, we had to work out how to get eight different audio feeds into Resolume. For that we went to Chris O'Dowd. He wrote a Max/MSP patch that takes audio from a mixer and converts it to OSC (Open Sound Control). We used a separate MacBook Pro to take in data from the eight-channel mixer and run Chris's patch, and I ran OSCulator to send the OSC messages over Wi-Fi from that computer to the one running Resolume. I also used OSCulator on the MacBook running Resolume to convert the OSC signals into MIDI, which gave a more dramatic audio reaction in Resolume. A special thanks to Matthew Jones for all of the advice.
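OSCulator handled the OSC-to-MIDI conversion for us, but the scaling it performs is roughly the following sketch (the channel and controller numbers here are arbitrary examples, not our actual mapping):

```python
def level_to_cc(level: float) -> int:
    """Clamp a normalized audio level (0.0-1.0) and scale it to a 7-bit MIDI CC value."""
    level = max(0.0, min(1.0, level))
    return round(level * 127)

def cc_message(channel: int, controller: int, level: float) -> bytes:
    """Raw MIDI Control Change bytes: status 0xB0 | channel, then two data bytes."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, level_to_cc(level)])

print(list(cc_message(0, 1, 1.0)))   # [176, 1, 127]
```

Mapping those CC values onto Resolume effect parameters is what made the visuals punch with the audio rather than just shimmer.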

TEAM: Creative Director: Yael Braha, San Francisco, CA // Technical Director and 3D Artist: Matthew Childers, San Francisco, CA // Cinematography: Michael Braha, Rome, Italy // AV Programming: Jim Warrier, Berlin, Germany // Audio Programming: Chris O'Dowd, San Francisco, CA // 2D/3D Artists: Tim Shetz, Matthew Childers, Yael Braha, Jennifer McNeal, San Francisco, CA

TEDxBerkeley 2014

Contracted as a live camera operator for TEDx Berkeley. Assisted the Technical Director.
Programmed a timer so each speaker knew how much time they had left.
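The speaker timer was a small utility; a minimal sketch of the countdown formatting logic (not the original code) looks like this:

```python
import time

def countdown(total_seconds: int, tick=time.sleep):
    """Yield the remaining time as MM:SS once per second, down to 00:00."""
    for remaining in range(total_seconds, -1, -1):
        yield f"{remaining // 60:02d}:{remaining % 60:02d}"
        if remaining:
            tick(1)

# tick is injectable so the logic can be exercised without real waiting
print(list(countdown(3, tick=lambda s: None)))   # ['00:03', '00:02', '00:01', '00:00']
```

In production each yielded string would be drawn full-screen on a confidence monitor facing the speaker.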

Published on Feb 22, 2014

“Tim Shields at TEDxBerkeley 2014: “Rethink. Redefine. Recreate.” His talk is titled “Playing for Keeps.”

Seeking out the desert tortoises which have charmed him for three decades, desert biologist Tim Shields has walked a number of miles equivalent to circumnavigating the Earth. Over thousands of days’ careful observation he has helped unlock secrets of tortoise biology which will lead directly to the preservation of this bellwether species. Shields is an integral member of the scientific team that documented the tortoise’s decline and whose work led to its listing as a threatened species.

Frustrated by the mental roadblocks humans set for themselves, and looking for efficacious new means to help desert species thrive, Tim has stepped outside ordinary thought patterns about conservation. He is now engaging the brightest minds available in engineering, education, business, Earth systems and games. His company, Hardshell Labs, is developing novel combinations of emergent technologies to provide solutions to a simple but critical problem: how to make conservation not only meaningful but fun, thus attracting many more active participants.

In the spirit of ideas worth spreading, TEDx is a program of local, self-organized events that bring people together to share a TED-like experience. At a TEDx event, TEDTalks video and live speakers combine to spark deep discussion and connection in a small group. These local, self-organized events are branded TEDx, where x = independently organized TED event. The TED Conference provides general guidance for the TEDx program, but individual TEDx events are self-organized.* (*Subject to certain rules and regulations)”

Pretty Lights, The Grouch, and Eligh: “All These Lights”

Watch Pretty Lights, The Grouch, And Eligh: All These Lights

Contracted by Hubtuit Productions to create 3D animations and projection map them onto set designs for this music video. The animations were played in sync with the lyric lapsing effects: each animation was projected one frame at a time, advancing a single frame for each consecutive shot, then edited together in post to create the effect of watching the full animation.