Archive for the ‘Effects’ Category.

Promo in Stop Motion

I recently worked with Alarming Pictures on this video promoting Bill Gates’ Feb 10 2014 appearance on Reddit’s Ask Me Anything.

The paper craft artwork was shot using a Canon 5D Mark III recording raw files to a tethered Mac laptop. The style was meant to be analog and papercrafty, so no specialized stop motion animation software was used, just the Canon EOS Utility.


Production was fairly straightforward, with the team shooting the planned scenes generally in the order they would appear in the final piece. Shoot, nudge, shoot, nudge, etc. No attempt was made at lip sync with the audio recording; the mouths were simply cycled through and would be augmented in post.

When I received the CR2 files from production my first step was to import them into Lightroom 5. This gave me a chance to find obviously-bogus frames that would never be used, and then I exported everything as full-size 5760×3840 JPEG files.

The JPEGs were imported as a sequence into After Effects CC and interpreted as 10fps. The image sequence was then placed into a 1920×1080 comp running at 23.976, the same size and frame rate we would finish in, so each stop-motion frame holds for two or three video frames. These HD comps, as well as nearly-full-size 5760×3240 23.976 comps, were rendered out to ProRes 422.

With the prep work of turning the raw frames into QuickTime movies completed, I proceeded to edit just as I would any other footage. Because After Effects would be integral later on, and because it was flexible in dealing with multiple frame sizes, Premiere Pro CC was selected as the editing tool. I created subclips for different scenes and started building the structure of my sequence. Some clips were sped up for action, some slightly slowed down. On set, shots were not meticulously timed out, so I had a lot of freedom to just make it work.

I also had the ability to choose when we would cut to a closer shot, and I had plenty of resolution to do it. At first I just scaled up the 1920×1080 “wide shot”; later, the full-frame 5760×3240 ProRes clips were edited in and scaled down. Overkill? Yes, and it caused problems later.

As the edit became more refined I also turned my attention to refining the mouth movements, at least somewhat. The mouth opening and closing randomly while the character spoke was fine, but during pauses in the voice the moving mouth was distracting. In Premiere Pro I would composite a freeze frame of the “resting mouth”, which was his smile, over the shot. I used the crop effect to make a rough box around the mouth so that the animation in the rest of the frame could still be seen. It was rough, indeed, but it let me quickly place the mouth rests in the realtime NLE environment.

This is about the time the crashing started.


Over the course of the afternoon I crashed Premiere several times. With some help from Twitter I discovered that my use of the very large 5760×3240 frames in Premiere was causing the crashes, and that cropping one 5760 frame over another probably wasn’t helping.

I didn’t really need that many pixels, so I replaced the 5760×3240 renders with 4K versions and then I had no crashing problems at all.

After the timing was finalized in Premiere Pro I moved into After Effects to add the mouth pauses for real. The freeze-frame clips from Premiere helped me find where the pauses should go, but I also ran the audio file of the spoken answers, along with a text transcript, through Prelude’s Speech to Text analysis, which placed a marker for each word that I could then see in After Effects.


When the character’s mouth pauses, it is actually his whole head that I replaced; it was easier to mask the head and place it on top than to try to replace just the mouth. In a few shots there was some flicker in the moving main shot, so the frozen head would not completely match; I used the GenArts Sapphire FlickerMatch plug-in to give the head the same flicker as the rest of the shot.

When I was done I rendered a QuickTime movie and edited it into my sequence on top of the dynamically linked comp/clip. If a change was needed to the After Effects comp I could easily choose Edit Original to reopen the comp. Then when done in AE I would rerender the comp and Premiere would immediately see the new render file.

The last thing to do was deal with the flicker I mentioned before. I recently sent my friend Pete Litwinowicz from RE:Vision Effects a timelapse shot I took at Machu Picchu in Peru; he wanted to test their DE:Flicker plug-in with it. Because of that interaction I had it in my head that maybe DE:Flicker could deflicker my stop motion movie. With the default settings I wasn’t getting the result I was after, so based on a recommendation from the excellent AE-List listserv I tried Digital Anarchy’s Flicker Free plug-in. With the defaults, the flickering was gone immediately. However, I ended up using the DE:Noise plug-in from RE:Vision with some settings Pete provided, because it was a little smoother. Still, it is worth noting that DA’s Flicker Free got about 90% of the way there with just the defaults.

Here’s a screenshot of the final timeline in Premiere Pro.


It was a fun project and everyone involved was happy. I learned a lot too!

Semi-automatic title and menu creation

Introduction

These days I don’t get much time for real video production and editing, except that a couple of times a year I put my skills to use and produce recital DVDs for the parents at my daughter’s dance studio. I’m just finishing up one of these projects and I’d like to share a few ideas that might help you in your future projects.

A cornucopia of software

When I was editing full time in the late 90s I would often impress my clients with my effortless orchestration of many pieces of software. My mission wasn’t to be showy but rather to combine the powers of the different tools to better complete my project. These days this way of working is a matter of course; today’s editors are required to juggle a lot of applications and skills.

For this project the key applications I used were BBEdit, Motion 4, Automotion, Compressor, Avid Media Composer, After Effects, and DVD Studio Pro.

Don’t redo work already done

When I produce these dance recital DVDs I need to be super efficient.  I need to get the project done very quickly while still providing the kind of quality I can be proud of.  For this project the “deliverable” is a set of two DVDs, each disc approximately 90 minutes in duration with 35 performances in one, 34 in the other.  Each performance has a lower third identifying the song the kids are dancing to, the class they’re in, who their teacher is and who the dancers are.  Each DVD also has a menu system so viewers can choose individual performances to watch.  That’s a lot to do, and if I had to type all the data into the computer it would have added significant time and annoyance to the process.

So my first rule was to get the recital program as either a PDF or a Word doc.  Someone at the dance studio already went to the trouble to type it all into a computer for the program, so I wanted to leverage that effort.  This seemingly simple request made possible a lot of workflow improvements.

Specialist tools

Once I had the Word doc I was on my way. I copied and pasted the text into a super-powerful text editor called BBEdit (if you’re looking for a good free text editor check out TextWrangler from the same developer). In BBEdit I was able to change line endings into tabs, allowing me to quickly create a tab-delimited text file. The tabs separated the individual data fields of each performance (performance number, performance name, class name, instructor name, dancer names) and each performance was separated by a new line.
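If you prefer scripting to find-and-replace, the same transformation can be sketched in JavaScript (runnable with Node). This is only an illustration: it assumes each performance in the pasted program is a blank-line-separated block of five lines (number, song, class, instructor, dancers), and your program will almost certainly be laid out differently, so adjust the parsing to match.

// Sketch: turn a copied-and-pasted recital program into a tab-delimited file.
// Assumes each performance is a blank-line-separated block of five lines:
// number, song, class, instructor, dancers. Adjust to match your program.
var fs = require('fs');

var program = fs.readFileSync('recital-program.txt', 'utf8');

var rows = program
    .split(/\r?\n\s*\r?\n/)                               // one block per performance
    .filter(function (block) { return block.trim().length > 0; })
    .map(function (block) {
        return block
            .split(/\r?\n/)                               // the individual data fields
            .map(function (line) { return line.trim(); })
            .join('\t');                                  // fields separated by tabs
    })
    .join('\n');                                          // one performance per line

fs.writeFileSync('recital-program-tabbed.txt', rows);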

Next I spent some quality time in Motion 4, the motion graphics and compositing component of Final Cut Studio, creating the look of my lower-third graphics. I’m not a designer, but I can create graphics that are appropriate enough to impress dance moms. But doing this in Motion and not Adobe After Effects is key, because Motion has a secret: Motion’s project files, which can be played back in any QuickTime-savvy application (on a machine with Motion installed), are actually XML files, which makes some interesting workflows possible.

One app that fantastically leverages Motion’s openness is Automotion from Digital Heaven. Automotion combines a Motion template with a tab-delimited text file to quickly (very quickly) create many variations of a title from the text file’s data. It is really fantastic, and afterward you can update your template with a new idea and then, within seconds, reproduce all of the graphics with the new look.

A few tips for using Automotion:

  • to include quotes ( ” ) in your titles, double them in your text file ( ”” )
  • you can create a box of paragraph text in Motion so lines of varying length will wrap
  • include a backslash character ( \ ) in your text to force a new line

Motion is a great companion to MC too?

The output of Automotion is many Motion project files (.motn extension), which you can open in QuickTime Player or edit directly into your Final Cut Pro timeline. What may surprise you is that Media Composer can also import .motn files just like any other QuickTime movie!

It isn’t entirely smiles and rainbows, though. If you have ever seen Motion demoed you know that it does a lot of coolness in real time, harnessing the power of your video card. But if another application wants to incorporate the Motion project, the .motn must be rendered in software. What is nice is that the rendering happens automatically; what isn’t nice is that it can be rather slow. In other words, if you import the .motn file into Media Composer, part of the import time is consumed by rendering each frame of the .motn file.

In my case the titles were rather simple, just an animating blur as the characters faded up, and each took approximately 1 minute 15 seconds to import into Media Composer. I had 69 titles to import, so that time added up quickly.

I admit that I didn’t do it exactly this way, because I don’t entirely trust Media Composer with importing .motn files directly. Instead I ran my Motion projects through Compressor to make QuickTime movies with an alpha channel, then imported those movies into MC. It was slower to do it this way, but I have more faith in Media Composer doing a straightforward .mov import. I have no reason or experience to justify that mistrust, mind you, so try it both ways and decide what works best in your situation.

Either way, this is a good time to give yourself a break.  Your computer is going to be tied up for a while, so go for a walk or a drink or lunch or run some errands.  You’ve earned it.

Title revisions

Revising graphics is inevitable, but Automotion makes it a breeze.  Change your data or tweak your template then re-run Automotion and in no time at all your .motn files will be brought up to date.

Though not as immediate as if you were using the .motns in a Final Cut Pro timeline, it is still very easy to make the adjustments and then update your sequence with the new Automotion output.  You can delete the media for the imported clips then use Media Composer’s Batch Import command to reimport all of the Motion projects or QuickTime movies.  Even if your titles are edited and cut up throughout your timeline the sequence will relink to the new versions of the media files.

It is easy, but will be as time consuming as it was to bring the files in the first time, so be ready for a snooze or a bubble bath or whatever.

Bottom line: Automotion is fantastically useful, and while it is primarily marketed to Final Cut Pro users it should also be considered by more Avid editors.  Definitely check out the trial version, tell them I sent you.

Just a note to those of you wondering why I haven’t brought up AMA: I couldn’t bring in my movies via AMA because they had an alpha channel. If they had been full-screen and lacking transparency I could have saved a lot of time by relinking via AMA.

Smart DVD menu creation from text file

Eventually I had my edit finalized and I was ready for DVD authoring. As the time came to create the DVD menu systems, it occurred to me to again leverage the already-typed-in state of the text.  I would have a set of chapter selection screens, and the name of each performance would be onscreen along with an animated thumbnail of the dance. I didn’t want to start typing all those names in now!

I used After Effects to create the animations that would become my menus, and my hope was that I could put together an expression to read each of the song names from a text file. As I worked on it I decided to go a step further and tie together all of the layers of the menu comp with expressions, so that every element that needed to change for each menu screen could change together automagically via one control.

If you’re scared of expressions I completely sympathize. I don’t know much about coding myself, but thankfully there are loads of expressions and threads about them on the interwebs. I managed to find a few threads that talked about reading a text file and putting the data into a text layer; the trickiest part was that I needed to format my text file as JavaScript. Dan Ebberts is the main genius behind this magical code, and you’ll quickly find as you search the web for After Effects expressions that Dan is one of the foremost authorities. I again used BBEdit to do some text replacement, and soon I had a JavaScript-flavored text file that After Effects could read; the text layers sprang to life.

The following expression was put into my text layers’ Source Text control:

x = comp("130 Chapter LOOPing").layer("CONTROL LAYER").effect("NUMBER OF FIRST SONG")("Slider"); // number of first song on page (1, 7, 13, 20 etc.)
y = x + 0; // increment the number for songs 2, 3, 4, 5, 6 etc. per page
if (y < 10) {"" + "0" + y} else {y}; // add leading zero to single digit
myPath = "/Volumes/G-SPEED Q_1/PACWEST JAN 2011/DVD MENUS/WINTER SHOWCASE list of songs.txt";
try {
    $.evalFile(myPath);
    song = y;
    txt[Math.min(song, txt.length - 1)]
} catch (err) {
    "file not found"
}
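For reference, the “JavaScript-flavored” text file is just a plain text file that defines the txt array the expression reads via $.evalFile(). Mine listed the real song names from the program; the titles below are made-up placeholders. Because the slider gives 1-based song numbers and JavaScript arrays are 0-based, a dummy entry at index 0 (or a -1 in the expression) keeps everything lined up.

// WINTER SHOWCASE list of songs.txt (the song titles here are placeholders)
txt = [
    "placeholder so song number 1 lands at txt[1]",
    "01 Opening Number",
    "02 Second Song Title",
    "03 Third Song Title",
    "04 Fourth Song Title"
];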

I used similar expressions to change the chapter number for each performance. The thumbnail was controlled by layers in a nested comp setting their opacity based on the song number.  It was pretty neat to change one slider in the control layer and have all of the elements update accordingly.
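The chapter-number expression isn’t shown above, but it was a simpler variation built from the same pieces, along these lines (reconstructed here as an illustration rather than copied from the project):

// Rough reconstruction of a chapter-number Source Text expression (not verbatim)
x = comp("130 Chapter LOOPing").layer("CONTROL LAYER").effect("NUMBER OF FIRST SONG")("Slider");
y = x + 0; // change the 0 to 1, 2, 3, 4, 5 on the layers for the other songs on the page
if (y < 10) {"0" + y} else {"" + y} // zero-pad single digits so the chapter numbers line up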

There was one significant downside: because my precomps and the control layer lived in the main parent comp, I couldn’t set up six different menu arrangements and render them all together (I had six pages of chapter selection menus, each page pointing to six chapters). I had to render the comp once, change the slider value, render again, change the slider value, render again, etc. Not the end of the world, but it made me wish I’d planned better.

Avid locators to chapters

In the past I have decided to start and complete an edit in Final Cut Pro because the output was DVD and it is so nice to have FCP sequence markers exported as chapter markers for DVD Studio Pro. When I made the decision to cut this project in Media Composer I knew I was setting myself up for a hassle when it came time to create DVD chapters, but I decided to deal with that problem later.

Apparently I’m not the only one who has considered this challenge and I’m thrilled to see that the solution I arrived at was also innovated by the fantastic editor Steve Cohen.

From the Avid’s Tools menu choose Locators; this opens a window displaying a list of the sequence’s locators. You’ll see in my screenshot that I have my locators on the M1 track; I put them there for my own mental convenience, to differentiate them from any locators used for editorial purposes on other tracks. With the Locators window open choose Export Locators from the File menu and the result is a nice tab-delimited text file. Open it in OpenOffice or Excel (or the like) and remove the columns other than timecode and comment. Now you can import the text file into DVD Studio Pro.
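If you do this regularly, the column cleanup is also easy to script instead of opening a spreadsheet. Here is a small JavaScript sketch (runnable with Node); the column positions are assumptions on my part, so look at a line of your own export and adjust the indexes before trusting it.

// Sketch: reduce an exported Avid locator list to timecode + comment pairs.
// Column positions are assumptions; inspect your export and adjust the indexes.
var fs = require('fs');

var lines = fs.readFileSync('locators.txt', 'utf8').split(/\r?\n/);

var chapters = lines
    .filter(function (line) { return line.trim().length > 0; })
    .map(function (line) {
        var cols = line.split('\t');
        var timecode = cols[1];              // assumed: timecode in the second column
        var comment = cols[cols.length - 1]; // assumed: comment in the last column
        return timecode + '\t' + comment;
    })
    .join('\n');

fs.writeFileSync('chapters.txt', chapters);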

The end

That’s it this time. I am really pleased with how well these techniques worked for me on this project; I was able to save myself a lot of time.

Playing VJ

Last weekend I had a special experience: one of my favorite musical artists came to my house and performed a living room concert. Helios is the name of the artist; the music can generally be described as electronic and ambient. If you’ve ever seen electronic music performed you will know that visuals are commonly performed alongside the music, and since I had been curious about the VJ experience I decided I would provide visuals for this show.

Overall the show went very well. We had 50 guests attend the show, we tied in a Haitian charity donation opportunity for our guests, and together we raised $415.

In this post I’ll talk about some of the logistics I had to tackle in order to bring the visual aspect to the show.

Several friends at the show asked what software I was using for playback of the visuals.  I tried a few different VJ applications and the one I settled on was Resolume Avenue.  Applications such as this are intended for live performance, with the ability to mix different source movies, even different sources such as live cameras, plus application of effects to the video streams in real time.  These apps typically accept MIDI input for the triggering and adjustment of clips and effects, so I got an inexpensive Korg nanoKONTROL device to give me some knobs to twirl and sliders to push along with buttons to press.

With the software and the hardware interface I set off to learn how I could be a “VJ”.  I’ve been making video for more than twenty years so it was fun to learn a way to deliver the results.  But it became clear to me pretty quickly that if I was going to combine multiple visual elements in real time it was going to a) be a lot of work, b) require a lot of concentration and as a result probably c) reduce my enjoyment of the show.  So I decided that I would instead largely pre-build my performance and use the VJ software to play back my videos in sequence.

I shot most of the elements I ended up using and I also sourced some stock footage, plus I had a friend send me some video he shot on his sailboat, which I thought might come in handy.

Production of elements was fun. I used Final Cut Pro to arrange clips in time, then treated everything in After Effects. I layered on and experimented with effects and kept myself entertained late into the night. Overall I spent about a month working on my clips, and before long the 2TB of drive space I had set aside for this project was nearly gone and I had to add another disk.

The next major consideration was how I was going to project my show.  I knew where the “stage” was going to be in our house, but there wasn’t going to be room to put a screen behind the performer and have the projector somewhere in the room.

Since we were going to have Helios play in front of the large windows in our living room I thought it might be cool to project the video from the rear, meaning the projector would be outside shining into the house.  This way the audience would see the imagery behind the musician and it would all be very natural.

I built some screens using PVC pipe as a frame and spandex material from a local fabric shop stretched across, affixed with Velcro.  They were very lightweight and it was easy to hang them on the outside of the house with hooks and eye bolts.

I felt very makey-makey as I was building the screens. I used two different kinds of Velcro, one with adhesive backing and the other an iron-on kind meant for fabric. So one night I cut PVC sections, attached Velcro to the assembled frames and ironed the other side of the Velcro to the spandex fabric. Only once did I leave the iron in one place for too long and slightly burn/melt the fabric, but it was minor and didn’t have any real effect.

My next problem was where to put the projector outside so that it could cast its light onto, and through, my fancy screens.  The windows are on the second story as viewed from the back yard so I decided to build a platform for the projector to sit on.

I’m not much of a carpenter, and maybe because of that I managed to build something that was later described as a lifeguard tower, a trebuchet, even a guillotine. I was proud of it, though, especially the top platform that was able to tilt so that I would be able to adjust the up and down throw of the projector.

Finally I had to consider protecting the projector from the elements. I knew that the chances of rain or other wetness on a February evening in the Pacific Northwest were pretty high, so I needed a weatherproof box, and that box needed to be ventilated because projectors make a lot of heat.

I considered building a box from scratch but, recognizing my limitations, knew I was unlikely to build something waterproof and useful. So I chose instead to use a Rubbermaid storage container as the basis for my projector housing and modified it to become a projector protector. I cut a window in one side, covered it with clear plexiglass, then duct-taped the heck out of it, adding some caulking for good measure.

I cut a hole in the bottom for cool air to enter the box (through a matching hole in the stand’s platform) and cut holes in the sides for air to be exhausted.  I mounted two 120v fans from Radio Shack against these holes and then for extra credit I attached vent covers on the outside of the box to protect the fans and projector from overly enthusiastic rain drops.

The entire thing was way over-thought but in the end it all worked very well.  The visuals shined through the windows behind Helios as he played and it felt incredibly natural.  And thankfully I was able to enjoy the show even as I had to pay attention just enough to the software to trigger a clip and to fade out at the end of a song.

It was a very fun experience and I look forward to hosting another show in the future.

I have posted the individual movies of my visuals over at wesplate.com.

Since the show I’ve found this writeup on the evening from The Stranger. The writer was so complimentary about the event I won’t make a big deal of his 20-person underestimation of the number of guests in attendance. 🙂

UPDATE: This performance is now available on DVD through the Unseen Music shop… http://www.unseen-music.com/live_snohomish.html

Goodbye Jump Cut

For a long time my Dad misunderstood when people talked about jump cuts, he thought they were saying “junk cuts”. This anecdote doesn’t mean anything, I was just reminded of it as I started typing this post.

I will first credit the person who inspired me to even think that what I’m about to describe was possible. This man has actually influenced me a lot over the years, and my last blog post about Avid’s ScriptSync wouldn’t have happened had I not seen him give a presentation about ScriptSync some years back. Anyway, I first met Steve Audette in 1998 at one of Avid’s Master Editor Workshops. I was blown away by the work Steve showed at that presentation eleven years ago, and I’m very happy to say we became and are still good friends.

One of the things that Steve demoed back then was using Elastic Reality to morph between the clips in a jump cut, making what was two clips into one, hiding the cut.  This blew my mind. But I never used this particular idea until last week.

Original clip with jump cut


On the project I was cutting I ended up with a jump cut that I couldn’t cover with b-roll. I thought about modifying the script so I could do away with one of the shots in the bad edit, but the words as they were were just too good. Since the interview subject didn’t move much across the jump cut, I was reminded of Steve Audette’s morphing maneuver.

I turned to RE:Vision Effects and their RE:Flex plug-in for After Effects to see if I could accomplish what I had in mind. RE:Vision’s effects are powerful but do require some learning, so after I brought my clips into After Effects via Pro Import AE I sat myself down, read through the user guide and reviewed how the plug-in works.

Essentially you set up a series of mask pairs: a mask shape that follows the “from” geometry and, directly beneath it, a mask shape that follows the corresponding “to” geometry. Clearly, the more masks you create to control the morph the better the results you will get, and I tried to walk that fine line of doing enough to make the effect work but few enough that I could quickly move on to other editing tasks.

Series of alternating From and To masks


Thankfully it didn’t take all that long, and it wasn’t really all that hard.  In fact when I started previewing the morph I had to double-check that my eyes weren’t deceiving me because the morph was actually working.

Someone with real morphing and warping experience could give you better advice, but I found parts of the face and body that had easily defined areas and made my mask outlines there.  Like the eyes, the top of the blouse, the sides of the face, and of course the lips.  I previewed the effect to see where warping artifacts appeared or where the effect wasn’t seamless then added mask outlines in those areas.  If there were warping problems that I couldn’t see, well I didn’t consider those to be problems I needed to solve.

Another consideration for me was that this final video was going to be viewed primarily on the web, so a little bit of imperfection could be tolerated; it might end up masked by a compression artifact, or a viewer might simply blame the defect on their internet connection.

Now with 100% more morphing


The end result was definitely worth the time I spent, which was probably only about an hour. I was able to keep my interviewee saying the phrase I needed, and on camera too.

A big thank you to RE:Vision Effects for hooking me up with their powerful plug-in and another big thank you to Steve Audette for providing the inspiration.