Introduction

“AfterVice is a gripping crime drama set in 1980s Miami that explores the haunting cycle of generational trauma. Alejandro, once a frightened boy abandoned by his drug-addicted mother, has grown into a powerful yet emotionally scarred crime boss. His mother's broken promises to do better have left deep scars, and now, as an adult, Alejandro unknowingly repeats the same cycle with his own son.

Instead of using drugs, Alejandro deals them, but the damage remains. When betrayal from a close associate puts Alejandro on the run, he makes one last promise to his family that things will change. In a final showdown with the police at an airstrip, Alejandro's life comes to a violent end, leaving his son to inherit a legacy of broken promises. The film explores how the failure to break free from the past affects not just individuals, but entire generations.”

- Andres Hermida and Alessandra Hernandez Dolan

[Still from Unit 1: Colombia]

Pre-production

2.1 Concept and Requirements

When Andres approached me to work as a virtual production artist on the film, I was made aware that most of the shots to be filmed on the XR Stage [Unit 3: XR] were car chase scenes, and the requirement was moving city streets and Ocean Drive in Miami. The timeline on the project was two weeks to create the environment, with two load tests before the final shoot day.

[Load Test Day 1]

Production

3.2 Challenges and Their Solutions

Scale

The biggest issue was the scale of the city. During the shoot we realised that one of the scenes was a three-minute monologue, while the environment ran for no more than 30 seconds, which was 720 frames. We had to find a fast solution to make the environment longer without a drop in performance, so I asked our technician Conner Write to pause the stage for us to extend the environment. Thinking quickly, I decided to make the entire current environment a level instance and duplicate it five times.

720 frames at 24 frames per second plays for 30 seconds.

Three minutes is 180 seconds.

180 / 30 = 6, so six segments of the environment were needed.

But upon noticing frame drops, we reduced the speed of the root camera and went ahead with five, making the final frame rate 26 frames per second, which was pushing the limits of the stage.
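For reference, this kind of segment duplication can also be scripted instead of placed by hand. The sketch below uses the Unreal Editor Python API; the actor label CityBlock_LI and the block length are hypothetical stand-ins for the actual level instance, and the exact subsystem calls may vary by engine version.

```python
import unreal

# Editor actor subsystem (UE 5.x scripting entry point for level actors).
actor_sub = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)

# Find the Level Instance actor holding the city block.
# "CityBlock_LI" is a hypothetical label used for illustration.
source = next(
    a for a in actor_sub.get_all_level_actors()
    if a.get_actor_label() == "CityBlock_LI"
)

BLOCK_LENGTH_CM = 50000.0  # assumed length of one segment along the road
NUM_COPIES = 5  # five duplicates gives the six segments the arithmetic calls for

# Duplicate the block, laying the copies end to end along the driving axis.
for i in range(1, NUM_COPIES + 1):
    actor_sub.duplicate_actor(
        source, offset=unreal.Vector(BLOCK_LENGTH_CM * i, 0.0, 0.0)
    )
```

Dropping a segment then just means lowering the copy count and slowing the root camera proportionally: five segments cover the same three minutes at 5/6 of the original camera speed.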

Bokeh

As mentioned earlier, Sten Olsen, our DP, was able to achieve great anamorphic bokeh for most shots, but there was one where the depth of field felt fake due to the size of the bokeh in proportion to the distance of the objects in the virtual space. On the stage the best way to mitigate this issue was to add depth of field and widen the iris in the post process volume. It made the bokeh look slightly better but did not completely remove the issue; given the tight schedule, we had to leave that issue open to interpretation.
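For context, that on-stage adjustment amounts to overriding the depth-of-field settings on the level's post process volume. A minimal Editor Python sketch of the tweak follows; the f-stop and focal distance values are illustrative, not the numbers used on the day.

```python
import unreal

actor_sub = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)

# Grab the first PostProcessVolume in the level.
ppv = next(
    a for a in actor_sub.get_all_level_actors()
    if isinstance(a, unreal.PostProcessVolume)
)

settings = ppv.get_editor_property("settings")

# Each post-process field needs its override flag set before the value
# takes effect. A lower f-stop widens the iris and enlarges the bokeh.
settings.set_editor_property("override_depth_of_field_fstop", True)
settings.set_editor_property("depth_of_field_fstop", 1.2)
settings.set_editor_property("override_depth_of_field_focal_distance", True)
settings.set_editor_property("depth_of_field_focal_distance", 350.0)  # cm

ppv.set_editor_property("settings", settings)
```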

2.2 Planning Solution

My team and I came up with multiple creative solutions and approaches for the environment, and we chose the one that required the least amount of time. We started the project by inspecting the real-life environment on Google Earth, trying to understand the infrastructure of downtown Miami. Most of the team, myself included, had already been to downtown Miami, so we had some trained eyes on the job.

Eliezer Garcia Gazaui, our project lead, came up with several Miami asset packs. I went through these asset packs' reviews and documentation to figure out which ones matched our requirements. One of the biggest issues we faced was that the LED Volume stage had recently updated to Unreal Engine 5.4, which caused migration issues with almost all the asset packs on Fab. We found two environments that matched our use case: one for an average downtown environment and one for Ocean Drive, Miami.

Eliezer spent his time setting up blueprints of these city blocks for both environments, with roads and existing local lighting, while Donnie Abrahamson, our VFX artist and level designer, worked on lighting for performance on the LED volume. Donnie found that emissive light used at extreme values works great: it not only lights the scene, it also helps keep the real-time performance cost down. Meanwhile Ben Jones, another level designer, was testing camera movements and how motion blur was perceived in-camera.
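The emissive approach boils down to pushing a material's emissive output far above 1.0 so surfaces light the scene without any dynamic lights. A small Editor Python sketch of the idea, where the asset path and parameter names are hypothetical placeholders for whatever the asset pack exposes:

```python
import unreal

# Load a material instance for a storefront sign; the path and parameter
# names are placeholders for the asset pack's actual materials.
mi = unreal.EditorAssetLibrary.load_asset("/Game/Miami/Materials/MI_NeonSign")

# Push the emissive colour well above 1.0 so the surface lights nearby
# geometry on its own, with no dynamic light (and its render cost) needed.
unreal.MaterialEditingLibrary.set_material_instance_vector_parameter_value(
    mi, "EmissiveColor", unreal.LinearColor(25.0, 2.0, 18.0, 1.0)
)
unreal.MaterialEditingLibrary.set_material_instance_scalar_parameter_value(
    mi, "EmissiveStrength", 50.0
)
unreal.EditorAssetLibrary.save_loaded_asset(mi)
```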

2.3 Initial Reflection

From the initial renders of the environment, I realised performance could be immediately saved by using Google Earth 3D geometry as part of the set extension; its low polygon count and lighter textures saved frame time while still allowing the streets to feel more immersive on the stage. We used a storyboard provided by the production team to set up certain cameras for our load test.

In our first load test we tested all aspects. We were performing well at 80 frames per second, which gave us hope of adding more assets to push the realism of the environment; adding crowds and more cars was on the table. In the following load test we found that crowds were tough on the system and noticed multiple frame drops.
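For anyone repeating these load tests, the frame numbers we watched can be pulled up with Unreal's built-in stat commands. A minimal Editor Python sketch that toggles them on:

```python
import unreal

# Toggle on-screen performance readouts while scrubbing the load-test
# cameras; "stat fps" and "stat unit" are standard Unreal console commands.
world = unreal.EditorLevelLibrary.get_editor_world()
unreal.SystemLibrary.execute_console_command(world, "stat fps")
unreal.SystemLibrary.execute_console_command(world, "stat unit")

# "stat gpu" breaks the frame down per render pass, which is how a cost
# like crowd skinning or an extra light shows up as a specific line item.
unreal.SystemLibrary.execute_console_command(world, "stat gpu")
```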

3.1 On-Set Setup 

We were able to migrate the project onto a new level to reduce the file size, and after placing the disguise plugins we were ready to load onto the LED Volume stage. The Director of Photography, Sten Olsen, and Director Andres Hermida decided to remove the camera tracker and use a handheld camera instead. The camera was an Arri Alexa Mini shooting with a 50 mm anamorphic lens; even in the initial meeting Sten had mentioned the need for bokeh.

Performance

During the shoot we did a top-down shot of the hood of the car, which showed the reflections of the environment, and Sten, our DP, noticed that we could sell the realism better if the car had headlights. After discussing it with the team, Donnie and Ben placed two spot lights following Sten's directions. When we turned the stage back on, the spotlights took all our performance and we were down to 12 frames per second, i.e. barely moving. We decided to remove one of the spotlights and art direct the remaining one to resemble two lights.
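A sketch of that single-light stand-in, again in Editor Python; the transform and light values are placeholders for what was eyeballed on set, not recorded settings:

```python
import unreal

actor_sub = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)

# One spot light roughly where the car's headlights sit (placeholder values).
light = actor_sub.spawn_actor_from_class(
    unreal.SpotLight,
    unreal.Vector(0.0, 0.0, 80.0),
    unreal.Rotator(0.0, -5.0, 0.0),  # slight downward pitch
)

comp = light.get_component_by_class(unreal.SpotLightComponent)

# A broad outer cone with a soft falloff lets one light read as a merged
# pair of headlight beams, at half the render cost of two shadowed lights.
comp.set_intensity(8000.0)
comp.set_inner_cone_angle(20.0)
comp.set_outer_cone_angle(55.0)
comp.set_attenuation_radius(3000.0)
```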

Quality 

For the same top-down shot we realised that, since the car and camera were so close to the stage, the quality of the display was taking a hit. Initially we believed it was a texture limitation, but upon close analysis we found that the pixels from the individual panels of the XR stage were starting to become visible. There was no particular solution for this other than to place a green screen and composite the shot in post-production.
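This is a known constraint of LED volumes: close enough to the wall, a single LED pixel spans several sensor photosites and the panel grid resolves in-camera. A back-of-the-envelope check, assuming a 2.8 mm panel pitch (a common spec for film volumes; our stage's exact pitch wasn't recorded) and the Alexa Mini's roughly 8.25 µm photosite, with the 50 mm focal length from the shoot:

```python
# Back-of-the-envelope check for when LED panel pixels resolve in-camera.
# Assumed values: 2.8 mm panel pixel pitch, ~8.25 um photosite; the 50 mm
# lens is the one used on the shoot.
PANEL_PITCH_MM = 2.8
FOCAL_LENGTH_MM = 50.0
SENSOR_PIXEL_MM = 0.00825

def led_pixels_on_sensor(distance_mm: float) -> float:
    """How many sensor pixels one LED pixel spans at a given wall distance.

    Thin-lens magnification: image size = object size * focal / distance.
    """
    image_size_mm = PANEL_PITCH_MM * FOCAL_LENGTH_MM / distance_mm
    return image_size_mm / SENSOR_PIXEL_MM

for metres in (2, 4, 8):
    span = led_pixels_on_sensor(metres * 1000.0)
    print(f"{metres} m from the wall: one LED pixel ~ {span:.1f} sensor px")
```

At the short distances a hood-level shot implies, one LED pixel covers several sensor pixels when in focus, which matches the grid we saw; only more distance, defocus, or a green screen can hide it.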