Ms. Understanding VFX
An overview of the visual effects development and work done on the short film, Ms. Understanding.
I began work on Damien Krzesinski's short film Ms. Understanding around two weeks before production was scheduled. To help match the aesthetic of the film's setting, the year 2100, I suggested having the characters interact with futuristic holographic phones. We also had some very ambitious goals in mind which, as we learned later, were perhaps a bit overambitious for the time we were allotted in post-production. This article details the development process of the VFX shots that made it into the premiere.
Virtual production testing
During preproduction, I tested a few different ideas for visual effects. First, with the help of The Mix at George Mason University, I created a virtual production demo using only the hardware we had on hand. The goal of this test was to see if using a virtual environment could produce photorealistic results. Unfortunately, due to the time constraints, I was unable to put a viable demo together. However, I do plan on exploring virtual production much more heavily with better hardware.
Working with the director, we instead settled on the concept of the actors using futuristic-looking transparent phones. This was my first attempt at such a concept, but during my preproduction research I estimated that, with the help of modern tools including AI, the effect could safely be implemented into the film prior to the festival.
I knew the best chance for this effect to work was to give the actors a prop they could physically interact with. We settled on screen protectors, since they are transparent and roughly the same size as a phone.
I did additional pre-production planning to find the best way to track the screen so that an image could be overlaid on it. I stumbled upon the newly released (out of beta) GeoTracker for After Effects by KeenTools. The software looked incredibly promising and worked well in a brief test, albeit not with footage of the transparent screen protector.
As both cinematographer and VFX supervisor, I unfortunately stretched myself thin and ended up making decisions that did not help me down the road. For one, we drew Xs in marker on laminated paper as tracking marks. This was a "we'll fix it in post" moment, but due to time constraints we weren't able to fix it before the festival. I now also realize these marks would not benefit me later, since the shots were static; they ended up serving only as an odd visual quirk on screen.
Another problem I only noticed in post was that Marcus, the actor playing Boris, used both hands when interacting with the screen protector. Had I been focused on the post-production workload, I would have asked him to use only one hand, eliminating much of the rotoscoping work that had to be done later.
There was no pre-viz for what the screen would look like or the animation so the actors were left guessing what to do. In the future, I would work with the pre-production team to create a plan for the motions the actor would perform.
IV. Starting VFX
No one on the post team had worked with RED footage before, myself included, which brought a whole new set of challenges to overcome. The footage was shot on the RED Komodo 6K in 2x anamorphic mode at the MQ quality setting, netting ~1.3 TB of footage that had to be distributed among the various members of the post team. Proxies were created, but in hindsight I am not sure why I did not compress them further. Reduced to ~700 GB, the footage was put on a few hard drives and provided to the team.
If I were to start over, I would have worked with the editor immediately after the shoot to create slimmer proxies. Additionally, SSDs, though more costly, are such an improvement to the workflow thanks to their speed that I would stress their use, especially with high-bitrate RED RAW and similar codecs.
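The slimmer-proxy workflow I wish we had used can be sketched in a short script. This is a hypothetical illustration, not the pipeline we actually ran: it assumes the clips have already been exported to a format ffmpeg can read (ffmpeg cannot decode .r3d RED RAW natively; REDCINE-X or REDline would handle that step first), and all paths and settings are placeholders.

```python
# Sketch: batch-build ffmpeg commands that transcode source clips into
# lightweight 1080p ProRes Proxy files for editorial. Dry-run by default
# so the commands can be inspected before anything is executed.
import subprocess
from pathlib import Path

def proxy_cmd(src: Path, dst: Path, height: int = 1080) -> list[str]:
    """Build one ffmpeg command for a ProRes Proxy at the given height."""
    return [
        "ffmpeg", "-i", str(src),
        "-vf", f"scale=-2:{height}",            # keep aspect ratio, even width
        "-c:v", "prores_ks", "-profile:v", "0", # profile 0 = ProRes Proxy
        "-c:a", "aac", "-b:a", "192k",
        str(dst),
    ]

def make_proxies(src_dir: str, out_dir: str, dry_run: bool = True) -> list[list[str]]:
    """Collect (and optionally run) a proxy command per clip in src_dir."""
    cmds = []
    for clip in sorted(Path(src_dir).glob("*.mov")):
        dst = Path(out_dir) / f"{clip.stem}_proxy.mov"
        cmd = proxy_cmd(clip, dst)
        cmds.append(cmd)
        if not dry_run:
            subprocess.run(cmd, check=True)
    return cmds
```

Dropping to 1080p ProRes Proxy typically shrinks footage by an order of magnitude, which would have made distributing drives to the post team far less painful than the ~700 GB we ended up with.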
As we had less than two months for post, I started working on the VFX I could, based on the shots I knew the editor had essentially solidified. Prior to Ms. Understanding I had only worked in Adobe After Effects, so I had not taken any other options into account. As it turns out, this was not the right call, and I ended up wasting a great deal of time trying to coax After Effects into working.
The major issue I had with After Effects was performance. As many AE users know, it will use all the memory you throw at it. Since many of the shots I was working on were only a few seconds long and I was working with the raw video, timeline performance was atrocious. Hardware did not matter either; I tried first with my M1 MacBook Pro, then the GMU FAVS Mac Pro (12-core Xeon W, Radeon Pro W5700X 16 GB, and 48 GB DDR4). Neither computer could handle the footage efficiently, and performance continued to hold me back.
The second issue was that the AE GeoTracker plugin, although an amazing piece of software, has to reanalyze the footage every time you perform an action on it. This led to hours of waiting for analysis, a process required before you can adjust the track of an object, in this case the transparent screen.
Believe it or not, I ended up spending almost an entire month wrestling with After Effects. I had made very little progress and the deadline to submit to the GMU film festival was fast approaching. After another weekend of frustration with AE, I decided to give Nuke, the industry-standard compositing software, a try.
Nuke is a popular VFX compositing software which I had previously known very little about. After watching Hugo's Desk, a YouTube channel run by director and VFX supervisor Hugo Guerra, I was able to get a basic grasp on how I could use the software to accomplish what I needed.
Nuke was an immediate improvement in both workflow and performance, and I now had access to the same toolset Hollywood uses. Even better, GeoTracker by KeenTools was originally built for Nuke, and unlike the AE version it lets you save the analysis file, which meant the footage only had to be analyzed once.
There were still a few hurdles I had to solve:
Remote Workstation: With limited time remaining, it made more sense to use the cloud VM I had previously set up for After Effects, since it could be upgraded instantly for Nuke. Using Parsec, a remote desktop application, the performance was amazing; I had no trouble with latency or delays. As for the cloud provider, Paperspace was the most affordable option I could find: you pay only for storage plus an hourly rate while the machine is running.
Footage in the Cloud: I debated how to get the footage onto the cloud machine. The easiest way I found was to upload it to a Cloudflare R2 bucket over George Mason's gigabit wired connection; it still took a few hours. Cloudflare R2 charges no egress fees, which makes it especially useful when you know the footage will be downloaded many times.
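Because R2 exposes an S3-compatible API, the upload step can be scripted with a standard S3 client such as boto3. The sketch below is illustrative only: the account ID, bucket name, and directory are placeholders, and credentials would come from an R2 API token supplied via the usual boto3 credential mechanisms.

```python
# Sketch: uploading clips to a Cloudflare R2 bucket via its S3-compatible
# endpoint. R2 endpoints follow the pattern
# https://<account_id>.r2.cloudflarestorage.com
from pathlib import Path

def r2_client_kwargs(account_id: str) -> dict:
    """Arguments for boto3.client() pointing at an R2 account endpoint."""
    return {
        "service_name": "s3",
        "endpoint_url": f"https://{account_id}.r2.cloudflarestorage.com",
        # aws_access_key_id / aws_secret_access_key come from an R2 API token,
        # typically loaded from the environment or a credentials file.
    }

def upload_clips(src_dir: str, bucket: str, account_id: str) -> None:
    """Upload every .mov in src_dir, keyed by filename."""
    import boto3  # imported here so the helper above has no dependencies
    s3 = boto3.client(**r2_client_kwargs(account_id))
    for clip in sorted(Path(src_dir).glob("*.mov")):
        s3.upload_file(str(clip), bucket, clip.name)
```

Keying objects by filename keeps the bucket layout mirroring the local drive, so pulling the footage down on the Paperspace machine is a straightforward sync in the other direction.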
Licensing: Software licenses can be expensive and Nuke is no exception; a yearly license typically runs around $3,299. I started with the free 30-day trial, then applied for an educational license, which was granted but only lasts a year. I also licensed GeoTracker.
Now, it was finally time to begin working with Nuke.
VI. Screen Replacement Breakdown
Screengrab of the in-progress Nuke comp
3D model of screen protector made in AutoCAD
Here's how I comped the holographic display in this shot:
After analyzing the movement the actress made with the phone, I realized the shot would look better if the motion was reduced.
I froze two frames where the actress held the phone before a jumpy movement, then extended them out.
Since she was moving her thumb during these frame holds, I rotoscoped the thumb and merged it back in, so the thumb kept moving while the rest of the hand stayed still.
With the stabilized movement now in the pipeline, I used the GeoTracker node to automatically track the screen protector.
To improve the track, I created a 3D model of the screen protector to scale. After importing the geometry, GeoTracker did an amazing job and required almost no keyframe adjustments.
Using my iPhone, I created a screen recording matching the action. I set my background to green and keyed it out in Nuke.
I added the keyed footage to a Card3D node and applied the transform data from GeoTracker.
I rotoscoped each finger that passed over the screen; this was by far the most time-consuming part.
Finally, I merged the fingers over the iPhone footage, added some basic motion blur to the screen, and voilà, the finished shot.
VII. Cost Breakdown
Paperspace Remote Workstation: $41.78 for around 62 hours of usage
Paperspace Shared Drive (250 GB): $5.84
Cloudflare R2 Storage Bucket: $15.45
GeoTracker by KeenTools Freelancer License: $18.00
In total, $81.07 to rent a powerful machine; considering the time savings, not bad. Keep in mind this figure is without the Nuke license, so after the one-year educational license is up, I will most likely need to invest in a Nuke Indie license at $500 a year. More palatable than $3,299 but still expensive; however, the versatility of Nuke might make it worth it for some.
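As a quick sanity check on the figures above, the line items do sum to the stated total, and the workstation bill works out to roughly 67 cents per hour of compute:

```python
# Verify the cost breakdown listed above.
costs = {
    "Paperspace workstation": 41.78,  # ~62 hours of usage
    "Paperspace shared drive": 5.84,
    "Cloudflare R2 storage": 15.45,
    "GeoTracker license": 18.00,
}

total = round(sum(costs.values()), 2)
hourly = round(costs["Paperspace workstation"] / 62, 2)

print(total)   # 81.07
print(hourly)  # 0.67
```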
Working on Ms. Understanding was a great learning experience for me and, I hope, for everyone else involved. I'm looking forward to finishing the shots that couldn't be completed in time for the festival.
I will be working toward improving my visual effects skills with Nuke and hopefully getting into more advanced techniques like set extension using projection.
Special thanks to:
Damien Krzesinski (director) and Emilee Hayward (producer) for the opportunity to work on this short.
The instructors and staff at the George Mason Film department.
Andrew Jorgensen and Evan Bowen at GMU film who provided help and guidance throughout all the stages of this film during the semester. They allowed me to work outside the box and gave valuable feedback and advice.
Professor Russell Santos, who introduced me to node based VFX compositing and provided guidance during his Spring 2023 session of FAVS 399, Visual Effects at George Mason.