I first met Sean a couple of years ago, and I remember how much I enjoyed listening to him tell stories from the VFX world on the other side of the ocean. He’s a good storyteller too, so pretty quickly he had a captivated audience around the table.
The story he is telling today is about his career so far, and I’m really glad he agreed to share it on our blog. If you work in the VFX industry or are considering a job in this field, read on; I believe you’ll find it interesting.
I started professionally in visual effects in 2002. A friend of mine worked at a small visual effects studio whenever they needed help, and they were hiring for Austin Powers in Goldmember. I had dabbled a little with Adobe After Effects at home, and he asked me if I wanted to come work at this studio with their small team. They used After Effects for all of their film work. This was way out of my league, however, and I told him I was not really that good at it. I barely knew how to set keyframes, and I think I had tried to rotoscope something once (badly). They hired me anyway.
At this point, obviously, I didn’t really know much about visual effects. I had taken a Kinetix 3dsMax (v1.0) class at school in ’96, had dabbled with Adobe Photoshop (v4.0), and somehow got into an Alias PowerAnimator class that wasn’t part of my course of study. PowerAnimator was the program that would eventually become Maya. After graduation, I packed up and moved to California, where I worked on miniature models, animatronic creatures, and special make-up effects. That’s what I really went to school for – practical effects – and the 3dsMax class had been tacked on at the end so we’d at least have an introduction to this new digital effects stuff.
The whole time we worked on Goldmember, I was convinced that I was going to get fired if someone actually watched me work for a few minutes. As it turned out, I learned quickly and got pretty good at getting shots done as fast as possible. In fact, because I knew a little 3dsMax, I even ended up doing some 3D work on that show. For a year or so, I drifted between physical effects jobs and visual effects work for that small studio, until there was enough work for me to just stick with the digital stuff. They taught me almost all the techniques I still use today.
During this time, when I would talk to people about work, they would always ask what program we used. Many professionals would look confused when I mentioned After Effects. “Isn’t that just for motion graphics?” My standard response was to ask if they had noticed the thirty-odd greenscreen shots in Meet the Fockers. “No,” they would always say. “Then I guess After Effects can do more than motion graphics.” I would try to say it without sounding like a smart-ass.
Then, in 2005, another friend invited me to come work at Cafe FX on Sin City. They used Fusion for compositing, which at the time was developed by eyeon (today it’s developed by Blackmagic Design). At my interview, when they asked if I knew Fusion, of course I had to say no, but I said I knew how to composite, and it was just a matter of learning where the tools were and what they were called in this particular program. Once I was hired, I sat down with the lead compositor so he could show me the software. The first two things I asked him were: one, where are the tracking tools, and two, where are the rotoscoping tools? With those skills, along with color correction, I knew I could pretty much fumble my way through any kind of shot.
The first two weeks, once again, I was paranoid that I would be fired at any moment. As soon as anyone watched me work for a minute or two, they’d realize what a mistake they had made and get rid of me. Working with nodes was so frustrating that once or twice I nearly walked in and quit on the spot. I just wasn’t getting it. After years of using After Effects, nodes seemed like a completely foreign concept. Then one day, I can’t even explain how or why, it just clicked. It made sense. From that moment on, I was fine. At the end of Sin City, I was asked to stay at the company and work on the next project. I chose not to, however, because I didn’t live in the town where the studio was and had been living out of a hotel. Back to the small studio for me.
The next year, 2006, the same friend who had brought me into Cafe FX was now working at Rhythm & Hues, a studio I had sent my reel and resume to before I even moved to California in 1997. Back then, I had received their standard postcard reply: “Thanks for your resume, we’re not hiring, we’ll keep it on file, etc.” Now, nine years later, with a friend working there and plenty of experience under my belt, I was hired for Garfield: A Tail of Two Kitties. What software would I have to learn now?
At each of these studios, not only did I use different compositing software, but I also had to use many different pipeline programs. At the small studio where I began, the staff was tiny, sometimes only three of us, so I got to learn and use matchmoving software (Andersson Technologies SynthEyes), warping tools (Avid Elastic Reality), and different tools for keying (Discreet Combustion) and painting (Pinnacle Commotion), and I had to learn their custom way of creating movies for reviews, approvals, and final delivery. At Cafe FX, I had to learn a different way of doing all those same tasks. And since Cafe FX was a much bigger company, I also learned to use render management software (Thinkbox Deadline).
Now, having finally made it to a major visual effects studio, I was very excited to see what kind of tools they used. I couldn’t wait to get started! Once again, I had to learn new compositing software – except this time, it was their own proprietary tool, Icy. Just as before, it took me a little while to get comfortable with the program. And it wasn’t just Icy. R&H had many custom tools: playback software, publishing assets, subscribing to those assets, render farm tools, job tracking programs, even their own 3D program they had written (I didn’t really have to learn that one, being hired only as a compositor). Also, they were a Linux studio, so I had to learn enough Linux commands to get by in the shell (no more drag-and-drop copying of files?!). On top of all that, once again I had to learn how they preferred to review and approve things. In fact, when I started there, dailies were still done on film. They were actually printing film of our shots every single day.
I struggled to keep up for a while (again, constantly worrying about losing my job), but by the end of that first show, it all felt natural. I was asked to stick around for the next show, and with R&H actually being in Los Angeles, I accepted. The learning never stopped. New tools would be added to our existing software (such as stereoscopic 3D), and completely new programs would be introduced (Unibrowser, for asset management).
During the eight years I was at R&H, Nothing Real’s Shake (which I had never learned) was discontinued and The Foundry’s Nuke (which I had also never learned) began to take over the compositing industry. Programs like Imagineer Systems’ Mocha appeared, while Pixologic ZBrush, Autodesk Mudbox, and Maxon Cinema 4D became more and more popular. I stayed on top of the latest stuff by downloading trial versions. I also worked on freelance jobs constantly. Sometimes I would agree to do a job that I didn’t even know how to do, which would force me to learn new things. I also made my own short films on the side, and got some experience on the film festival circuit.
At R&H, I was becoming slightly frustrated with the departmentalization of a big studio, especially now that I was a lead compositor and wanted to make things easier for the teams I was on. For much of my career, I had been used to “owning” shots, meaning I would do everything a shot needed. At a big studio, different departments handled the different aspects of a single shot; one shot could have many, many people working on it. And even when I did want to take it upon myself to create small 3D elements for myself or my team, R&H didn’t have 3dsMax or After Effects available. Those two programs were still my favorites to work in. So I would have to go home, make whatever elements I needed, and email the renders to myself at work so I could use them or distribute them to the team.
Then I found Blender. R&H had v2.48 installed on our machines, a version with the older interface. I had tried to learn it a few times, but the UI was just too different. I had learned many UIs in my career, but for some reason this one just wasn’t clicking with me. Then 2.5 came out, and the Blender Institute had overhauled the UI. Much better! This I could work with! I downloaded it on my machine at work and started learning.
Flash forward to 2013, when R&H declared bankruptcy and eventually restructured itself as a much smaller company. When I left R&H in April of that year, I had secured a full-time freelance gig doing visual effects for an independent feature film. I’d be working from home for the next two years. It also meant that after years of working in other people’s pipelines, I was finally going to have to create my own at home. By this time, I hadn’t opened 3dsMax in about three years. Everything I would normally use 3dsMax for, I was able to do better and faster in Blender. It’s important to note that I am not saying Blender is better than 3dsMax; Blender’s way of working just fits how I like to do things.
Blender and After Effects became the foundation of my home studio, and I discovered many other free and open source tools to incorporate into my little pipeline. And I’m still learning! Natron is a newer open source program that is to compositors what Blender is to 3D artists. And Krita – wow! The feature set in that program is just amazing. I’m so looking forward to being much more comfortable with it. And naturally, since I’ve been back at a visual effects studio, I finally got around to learning Nuke.
In fact, I’ve discovered so many great free and open source tools that I am starting a blog to share what I’ve learned. Some of these little helpful programs are very obscure, and I’d love to be able to introduce them to a much wider audience. Of course I’ll also be focusing on the heavy hitters of the open source world. I’ve got plenty of great ideas for courses and tutorials, and I’d also love to help other professional artists start to see these free tools as something they can jump right into and use at the high level to which they’re accustomed. If you’re interested, it will be coming soon at www.openvisualfx.com.
What is the moral of this story? I guess it’s to keep in mind that the software you work so hard to master is just a tool. It’s a hammer. No one ever looks at a house and says, “I wonder what brand of hammer they used to build that house? That’s how I’ll know if it’s a good house or not.” For people to look at a movie and think that the visual effects in it are solely the result of a piece of software is just as absurd as judging that house by the brand of hammer used to pound in the nails. The truth is, every program is going to have things you like and things you don’t. They are all missing some feature you will need someday. There is always some other piece of software out there that can do one thing or another better than yours.
A house’s quality comes down to who built it: the designer, the contractors. It’s the same in visual effects. That software you love so much and worked so hard to learn so you could get a job in the industry? Stop obsessing over it. Stop thinking it’s the only piece of software that can do the job. Stop thinking projects done in other software aren’t as good. All the software does the same thing. Using compositing as an example, every compositing program can put footage A over footage B, and any of them can make that look good or bad. It’s the person pushing the mouse that makes the difference. Don’t become a software snob. Work on learning techniques, understanding what makes a realistic image, matching lighting and color, etc. These are the things that make the art of visual effects valuable. And these are the things that make you, the artist, valuable. Be willing to learn whatever software they throw at you. Don’t ever stop being fascinated by what this software can do in the right hands. Work hard to make those hands yours.
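To make that last point concrete: the “over” operation that every compositing package implements is just a couple of lines of math. Here’s a minimal sketch in Python with NumPy – my own illustration, not code from any of the packages mentioned above – assuming premultiplied RGBA images stored as float arrays:

import numpy as np

def over(fg, bg):
    # Porter-Duff "over": composite premultiplied RGBA footage A (fg) onto footage B (bg).
    # Both are float arrays of shape (height, width, 4) with values in 0..1.
    fg_alpha = fg[..., 3:4]          # slicing keeps the alpha channel broadcastable
    return fg + bg * (1.0 - fg_alpha)

# Example: a half-transparent red card over an opaque blue background.
fg = np.zeros((4, 4, 4))
fg[..., 0] = 0.5                     # red, premultiplied by alpha
fg[..., 3] = 0.5                     # alpha
bg = np.zeros((4, 4, 4))
bg[..., 2] = 1.0                     # blue
bg[..., 3] = 1.0                     # fully opaque
result = over(fg, bg)                # (0.5, 0.0, 0.5, 1.0) everywhere

That formula is identical in every package. Everything that makes a composite look good or bad happens around it, in the hands of the artist.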
Sean Kennedy is a visual effects artist living and working in Los Angeles, California. He composited on two Academy Award-winning films – The Golden Compass and Life of Pi. See all his work on his IMDb profile or look him up on Facebook. Stay tuned for Sean’s visual effects blog, coming soon, filled with tutorials, tips, and tools.