Peter Rabbit 2: The Runaway
I was invited back to Animal Logic to work on a second Peter Rabbit film, from the latter half of 2019 into early 2020. Headed up by the enigmatic Lighting Supervisor Francesco Sansoni, the team had a lot of familiar faces from the first Peter Rabbit. On the creative side I was tasked with a handful of sets to keylight, a few of them super moody, such as the underground lair of Barnabas and his crew, and a night-time London stone alley/square complete with steam pouring out of a restaurant kitchen back door.
To step up my game I decided to look under the hood of the automated latlong stitching process set up by the lighting TDs (an ingenious Nuke script), as I had started to notice re-projection alignment inconsistencies on buildings and walls when the source material was shot in small, confined areas. Understanding how to override and manipulate settings inside the script, re-process and publish allowed me to really tighten the lighting setup. It took a bit longer up front, but once shots started rolling the character lighting fell into place easily with minimal adjustments, which was a relief since some animals were literally all over the set.
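The studio script itself is proprietary, but the mapping it rests on is standard equirectangular projection. As a minimal sketch of the core conversion (the axis convention here is one of several and purely illustrative):

```python
import math

def dir_to_latlong_uv(x, y, z):
    """Map a normalized 3D direction to equirectangular (latlong) UVs.

    Assumes +Y up, one common convention among several. A real stitcher
    also handles camera orientation, lens distortion and exposure
    blending, all of which this sketch omits.
    """
    theta = math.atan2(x, -z)                    # longitude, -pi..pi
    phi = math.asin(max(-1.0, min(1.0, y)))      # latitude, -pi/2..pi/2
    u = 0.5 + theta / (2.0 * math.pi)
    v = 0.5 - phi / math.pi
    return u, v
```

A latlong map assumes everything it captures is effectively at infinity, so when the camera is boxed in by nearby walls, parallax between the bracketed source views breaks that assumption; that is exactly where re-projection alignment starts to drift and where overriding the script's settings paid off.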
The other helpful contribution I made during this project was devising an alternative method for refining eye pings. Peter Rabbit was a VFX show, but the characters were a bit of a feature-animation hybrid: the eyes contained realistic reflections from the environment, but also needed a clear primary eye ping consistent with the lighting world. When I arrived a few people had been trying to create a solution in 3D; however, I found it a bit unwieldy, and a lot of render sampling was needed just to see the eye pings on a single frame. I recalled a Nuke environment relighting setup from Deke Kincaid (a Foundry tech guy) based on normals and a pref pass, so I repurposed it as a tool that took just the eyes of the character and the main HDR latlong map being used in lighting; with a RotoPaint node you could simply paint out or move existing light sources and see the spatial result on the eyes in real time across the whole shot. Once happy with the placement, the rotational data and new map were exported back to the lighting setup for the eye passes and rendered out properly with Glimpse.
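At its core the tool was an environment lookup driven by the normals pass. Here is a minimal numpy sketch of that idea; the real version lived in Nuke and also used the pref pass and RotoPaint edits, so the function, the fixed view direction and the sampling convention below are all illustrative assumptions:

```python
import numpy as np

def relight_eyes(normals, hdr_latlong, yaw_offset=0.0, view=(0.0, 0.0, 1.0)):
    """Preview eye highlights by sampling an HDR latlong along reflection
    vectors derived from a world-space normals pass.

    normals:     (H, W, 3) float array, world-space normals of the eye region
    hdr_latlong: (Hm, Wm, 3) float array, the lighting HDR (equirectangular)
    yaw_offset:  radians; rotating the map stands in for moving a light
    """
    n = normals / np.maximum(np.linalg.norm(normals, axis=-1, keepdims=True), 1e-8)
    v = np.asarray(view, dtype=np.float32)

    # Reflect the view direction about the normal: r = 2(n.v)n - v
    ndotv = (n * v).sum(axis=-1, keepdims=True)
    r = 2.0 * ndotv * n - v

    # Convert reflection directions to latlong UVs (same convention as above)
    theta = np.arctan2(r[..., 0], -r[..., 2]) + yaw_offset
    phi = np.arcsin(np.clip(r[..., 1], -1.0, 1.0))
    u = (0.5 + theta / (2.0 * np.pi)) % 1.0
    vcoord = 0.5 - phi / np.pi

    # Nearest-neighbour sample of the HDR; plenty for an interactive preview
    hm, wm = hdr_latlong.shape[:2]
    xi = np.clip((u * wm).astype(int), 0, wm - 1)
    yi = np.clip((vcoord * hm).astype(int), 0, hm - 1)
    return hdr_latlong[yi, xi]
```

Because the lookup is just an image operation, painting on the map or nudging its rotation updates the ping's position on every frame at once, which is what made placement feel interactive compared with re-rendering in 3D.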
The other big mention is that this show was the first production at Animal to use their proprietary node-based GUI workflow (Filament), which would tie in with their in-house renderer Glimpse and embrace a full USD pipeline. As it was actively in early development it was rife with issues and bugs, but even in its raw state it was functional and, better yet, a clean canvas on which to make something amazing. More so than normal, artists were mandated to give feedback, submit ideas and offer suggestions for new tools/GUI design or improvements/variations on existing ones, and it was great to see it evolving day by day. For me the project was so interesting because it combined VFX feature film production with something different: an overlay of software development, and an active role in defining its creation.
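Filament and Glimpse are proprietary so I can't show them here, but the USD layer underneath is open source. As a taste of what a full USD pipeline means for a lighting artist, here is a minimal sketch of a lighting override authored into its own sublayer; every path and name below is hypothetical:

```python
from pxr import Usd, UsdLux, Sdf

# Open a (hypothetical) shot stage and author lighting edits into a
# dedicated sublayer, so the upstream asset layers stay untouched.
stage = Usd.Stage.Open("shots/sq010_sh0100.usda")

lighting = Sdf.Layer.CreateNew("shots/sq010_sh0100_lighting.usda")
stage.GetRootLayer().subLayerPaths.insert(0, lighting.identifier)
stage.SetEditTarget(Usd.EditTarget(lighting))

# A dome light driven by the shot's HDR latlong (paths illustrative)
dome = UsdLux.DomeLight.Define(stage, "/World/lights/key_dome")
dome.CreateTextureFileAttr("hdr/alley_night_latlong.exr")
dome.CreateIntensityAttr(1.2)

lighting.Save()
```

The point of the layering is that lighting edits compose over the upstream scene rather than modify it, which is the kind of non-destructive workflow a node-based front end can expose to artists.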