
The making of the 360 Wales Green Party broadcast


Technology is opening up ways of communicating that we couldn’t imagine before. The 360° view gives us the freedom to look over our shoulder and immerse ourselves in the story, and virtual reality offers the next level in delivering the message and making the experience more powerful.

Chris McFall is one of the pioneers using Blender to create virtual environments. Here he reveals the detailed making-of of the first 360° broadcast for a political party in Wales.

The Green Party approached me to create a video that reflected the party’s values; different, creative and supportive of new technologies were some of the qualities they underlined. I suggested a 360 video: as the first ever 360 political party broadcast, and part of a new technological movement, I felt it would reflect how different and forward-thinking the party is.

I scripted the piece around the power of a single vote, as many people in Wales like the Greens but fear wasting a vote on them when the majority parties are so strong. I thought a Rube Goldberg machine would carry the message nicely: a voting ballot outweighs a bowling ball, which sets off a knock-on effect through different sections, each reflecting one of the party’s policies. There were a number of setbacks and last-minute changes to the script, which meant not all the original ideas could be fully realised in the short time and on the small budget, but overall I feel the core of the video is still strong.

I filmed the party leader (Alice Hooker-Stroud) in a hired room in Chapter Arts Centre, where my offices are based, using six GoPro Session cameras on a 3D-printed rig. The Sessions were more reliable during testing than the Hero4s (which occasionally failed to trigger), and the printed rig meant we could align the stitch seams with less critical points in the scene. The cameras, and a great deal of help, were provided by the wonderful 4Pi Productions, who are based nearby but work internationally on 360 projects. They even stitched the footage for me, outputting ultra-high-res image sequences. A note about 360 stitching: we deliberately planned on a very loose stitch to save time, instead covering the problem areas with CGI elements like scaffolding so that things look nice and seamless. Patching is WAY quicker than stitching.

Prior to filming I built a precisely scaled mock-up of the room in Blender and set about laying out the machine. This gave us floor plans showing where we needed Alice to walk and where she should be at each point in her dialogue.
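
As an aside, a to-scale blockout like that takes only a few lines of Blender Python. A minimal sketch, assuming made-up room dimensions rather than the real room’s measurements:

```python
import bpy

# Hypothetical room dimensions in metres (1 Blender unit = 1 m)
ROOM_W, ROOM_D, ROOM_H = 8.0, 6.0, 3.2

bpy.context.scene.unit_settings.system = 'METRIC'

# A scaled cube stands in for the room shell
bpy.ops.mesh.primitive_cube_add(location=(0.0, 0.0, ROOM_H / 2))
room = bpy.context.object
room.name = "RoomMockup"
room.scale = (ROOM_W / 2, ROOM_D / 2, ROOM_H / 2)

# An empty marks where the 360 camera rig sits, at roughly head height
bpy.ops.object.empty_add(location=(0.0, 0.0, 1.6))
bpy.context.object.name = "RigMarker"
```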

I also used the Virtual Reality viewport add-on with the Oculus Rift DK2 (or the “Occy Wokky Woo”, as I like to call it) and ran multiple tests out to the Gear VR to get a better feel for the final product.

From there I built the models from scratch in Blender, largely using procedural textures to save on GPU RAM and heavy unwrapping tasks. This was the most time-consuming part of the project, and I worked many late nights to get things finished on time. After that I rigged each of the seven sections of the Rube Goldberg machine. Certain parts of the animation couldn’t be reliably simulated, so they were hand-animated. The final spinner on the rails that flips the light switch was animated manually and is one of my favourite bits; having the spinner jump between multiple rails was tricky but satisfying.
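
To illustrate the procedural approach: a Cycles material built entirely from generated textures needs no UV unwrap and uploads no image data to the GPU. A minimal sketch (the node choice and noise scale are arbitrary for the example):

```python
import bpy

mat = bpy.data.materials.new("ProceduralGrunge")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
nodes.clear()

output = nodes.new("ShaderNodeOutputMaterial")
diffuse = nodes.new("ShaderNodeBsdfDiffuse")
ramp = nodes.new("ShaderNodeValToRGB")   # tints the noise pattern
noise = nodes.new("ShaderNodeTexNoise")  # no image, no UVs needed

noise.inputs["Scale"].default_value = 25.0
links.new(noise.outputs["Fac"], ramp.inputs["Fac"])
links.new(ramp.outputs["Color"], diffuse.inputs["Color"])
links.new(diffuse.outputs["BSDF"], output.inputs["Surface"])
```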

I used Scorpion81’s Fracture Modifier build of Blender to destroy the piggy bank and simulate the old ball and chain. Simulations were baked to keyframes and MDD files to keep them locked down and to allow for external rendering.
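
The Fracture Modifier build has its own tools, but the general lock-down workflow in stock Blender looks roughly like this; the frame range and path are placeholders, and the MDD exporter is the bundled “NewTek MDD format” add-on, which needs to be enabled first:

```python
import bpy

# Turn the rigid-body simulation into ordinary keyframes on the
# selected objects, so nothing re-simulates on the render farm
bpy.ops.rigidbody.bake_to_keyframes(frame_start=1, frame_end=250, step=1)

# Cache a deforming mesh (e.g. the ball and chain) as per-vertex
# positions; "//" makes the path relative to the .blend file
bpy.ops.export_shape.mdd(filepath="//cache/ball_chain.mdd",
                         fps=25, frame_start=1, frame_end=250)
```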

One very disappointing aspect of this project was that I had originally intended to use the 360 video itself as the HDRI environment, allowing more realistic lighting of the CGI elements. However, Blender really struggled with this and eventually I resigned myself to using a still image, which also made timing and aligning the video to the CGI a HUGE chore. It’s something I hope will be fixed in later releases of Blender.
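
For what it’s worth, wiring a still frame from the stitch into the world as an equirectangular environment looks like this (the image path is illustrative):

```python
import bpy

world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links
nodes.clear()

output = nodes.new("ShaderNodeOutputWorld")
background = nodes.new("ShaderNodeBackground")
env = nodes.new("ShaderNodeTexEnvironment")

# A single frame grabbed from the stitched footage lights the scene
env.image = bpy.data.images.load("//stitch/frame_0420.png")
env.projection = 'EQUIRECTANGULAR'

links.new(env.outputs["Color"], background.inputs["Color"])
links.new(background.outputs["Background"], output.inputs["Surface"])
```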

Whilst I’m moaning: another real problem with working in 360 in Blender is that you can’t viewport-render equirectangular footage (I think it’s not a true view, but a post effect), which makes previewing the thing damned difficult. Again, a solution to this would be most welcome, because rendering things out at expense only to find they are poorly timed is galling!
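
The workaround I was left with was quick, cheap test renders from the panoramic camera. A sketch along these lines (the resolution and sample count are just illustrative preview values):

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Equirectangular output only exists as a Cycles panoramic camera
# render; the viewport won't show it
cam = scene.camera.data
cam.type = 'PANO'
cam.cycles.panorama_type = 'EQUIRECTANGULAR'

# Tiny, noisy preview: enough to judge timing, fast enough to iterate
scene.render.resolution_x = 1024
scene.render.resolution_y = 512  # 2:1 aspect for equirectangular
scene.cycles.samples = 32
scene.render.filepath = "//preview/check"
bpy.ops.render.render(write_still=True)
```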

Each section was uploaded to RenderStreet for rendering, and Marius and the guys there were AMAZING! The project, which had started a month later than scheduled because of complications with the party, had come down to its last week and I needed the renders FAST! With all the caching, procedural textures and timing dependencies, this was never going to be a cakewalk, but the RenderStreet gang guided me through every technical hitch and we were rendering before I knew it. As CGI is so much sharper than video, I had originally planned to blur the renders slightly to soften them, but a more cost-effective approach was to render at a smaller size and then scale back up. This saved time and money and got us the result we wanted without having to expend extra processing power blurring things in After Effects.
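
In Blender terms, the render-small-and-upscale trick boils down to the resolution percentage: half the size means a quarter of the pixels per frame. The delivery size below is an assumption for illustration, not our actual spec:

```python
import bpy

render = bpy.context.scene.render
render.resolution_x = 3840   # hypothetical 2:1 delivery target
render.resolution_y = 1920
render.resolution_percentage = 50  # renders 1920x960: 1/4 the pixels
```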

A really difficult part of this project was the HUGE file sizes and how unwieldy it is to work in 360. I had to produce the video at double the necessary resolution, because we needed a version of sufficient quality for broadcast. Thank God I upgraded my machine before I started! Actually, we had to make two broadcast versions (one in English, one in Welsh) and, of course, one in 360.

The video footage was brought into Mettle’s amazing SkyBox suite (a plugin suite for After Effects), which allowed me to key out two sections of green screen and replace them with CGI backdrops (the solarium light and the wall with the text on it). Sadly, because of rapidly changing light conditions, the keys were tricky and required heavy rotoscoping, which again was made possible by SkyBox.

After getting the renders back, each section’s image sequences were checked for corruption before being comped, noise-reduced and resized in After Effects and SkyBox, then re-rendered to more efficient file sizes.
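
Eyeballing thousands of frames isn’t practical, so a small script helps. A sketch of the idea in Python with Pillow (the directory name is made up; this illustrates the approach rather than the exact tool we used):

```python
import os
from PIL import Image  # pip install Pillow

SEQ_DIR = "renders/section_01"  # illustrative path

bad = []
for name in sorted(os.listdir(SEQ_DIR)):
    try:
        with Image.open(os.path.join(SEQ_DIR, name)) as frame:
            frame.load()  # force a full decode; corrupt files raise here
    except Exception as exc:
        bad.append((name, exc))

print("corrupt frames:", bad or "none")
```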

Each section was extended out as a still image to fill the parts of the timeline where it wasn’t playing, so the elements didn’t just disappear when their part was done; lots of complex masking was necessary to make it all believable.

Converting the equirectangular footage to a cubemap (with SkyBox) allowed me to add Video Copilot’s Optical Flares to really make the lights come alive, as well as to comp out lights and rigging that weren’t supposed to be in shot. Sadly, due to time, I wasn’t able to fix all the shadow issues on the floor, though SkyBox certainly would have made that possible (as would a little more planning during the filming, but hey, nobody’s perfect and I missed it). After a 20-something-hour render, the final composite went into Premiere Pro where (having been let down badly by a freelance composer) I had to throw together a foley track (primarily using material from freesound.org), some stock music (courtesy of Music Vine) and a rough mix. The broadcast versions got a sprinkle of light leaks on top, and I quickly added captions at the beginning and end. Render again. Drive over to the lovely folks at Gorilla Post Production, transfer to tape, drop off at the BBC, ITV and S4C. Go home with the intention of sleeping for a week; instead, wind up being used as a climbing frame by my two-year-old daughter.

So it’s far from a perfect piece, but I’m happy with it. For a low-budget piece of work with severe limitations, I think everyone shone and really came through: the wonderful Green Party for their patience and belief, and the Blender Foundation for providing awesome software that is amazing for working in 360, despite some quirks.

Mettle.com for SkyBox, which is such a great piece of kit, particularly when you need to fix stuff fast; if you have AE and want to do 360 work, you kinda have to get it! Also, the Denoiser II effect from Red Giant is super powerful, allowing us to fix occasional render noise and smooth things out nicely. 4Pi Productions were utter legends, helping with the kit and really helping me craft the process. And last but not least, RenderStreet were true heroes, pulling my spotty arse from the flames by making the renders possible and actually allowing me to get the film out on time!

A version of this post originally appeared at http://unitedfilmdom.com

Chris McFall
For the past six years, Chris McFall has been running United Filmdom, a creative studio based in Cardiff (UK) producing video, motion graphics, animation, films and video marketing. See his work from the past year!

Chris is also a Blender Foundation Certified Trainer and the author of the Animation Fundamentals series on CG Cookie.