
VIRTUAL PRODUCTION

Bringing the impossible to life

Early in the development of Fallen Star, it became apparent that shooting this film practically was going to be beyond our budget. The surest way to tell this story and visually capture the set pieces we wanted to create was to use a virtual production system. While these setups generally cost millions of dollars, in recent years the ability to do it at home has become a game changer for independent films, so it was time for us to take on the challenge.

The first step in this process was to build a studio. Today, major studios use what's called "the volume": a stage with massive LED walls that display the background of a scene. Unfortunately, that option is still well outside our price range, so we went back to basics: green screens. The process we used is very similar to what was used to film the Star Wars prequel trilogy and the first Avatar movie. A two-bay garage was converted into a green screen studio, with overhead lighting and green foam flooring.

20231124_111703.jpg

Next up is the camera system. What makes virtual production different from standard green screen filming is live camera tracking and the ability to see your results in real time. As the physical camera moves in the real world, the virtual camera in the computer makes the exact same movements. This not only makes your shots look more lifelike, but also lets you do more complicated shots that you normally couldn't. For example, we were able to live-track the focus of our real-world camera and send that data to the virtual camera, so that as you adjusted focus in the real world, the focus in the virtual world changed as well. That effect would be extremely difficult to make look real without a virtual setup.


The system consists of four HTC Vive base stations, one mounted in each corner of the room. An HTC Vive tracker is placed on top of the camera, and a second tracker is attached to the lens's focus wheel. All of this information is sent to our computer, which loads the data in real time into Unreal Engine.
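To make the idea concrete, here is a simplified sketch of what that data mapping looks like. The offset value, the encoder range, and the linear focus mapping are all hypothetical placeholders for illustration; a real rig uses full quaternion rotations from the tracker and a calibrated, nonlinear lens curve.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    # position in meters, rotation in degrees (yaw only here for brevity;
    # a real tracker reports a full 3D orientation)
    x: float; y: float; z: float
    yaw: float; pitch: float; roll: float

# Hypothetical hand-measured offset from the tracker (mounted on top of
# the camera) down to the lens's optical center, in the tracker's frame.
TRACKER_TO_LENS_OFFSET = (0.0, -0.12, 0.02)

def virtual_camera_pose(tracker: Pose) -> Pose:
    """Map the tracker pose to the virtual camera pose by applying the
    rigid offset, rotated by the tracker's yaw so it follows the camera."""
    ox, oy, oz = TRACKER_TO_LENS_OFFSET
    rad = math.radians(tracker.yaw)
    wx = tracker.x + ox * math.cos(rad) - oz * math.sin(rad)
    wz = tracker.z + ox * math.sin(rad) + oz * math.cos(rad)
    return Pose(wx, tracker.y + oy, wz, tracker.yaw, tracker.pitch, tracker.roll)

def focus_distance_m(encoder_value: int, near_m: float = 0.45,
                     far_m: float = 10.0, steps: int = 4096) -> float:
    """Map a raw focus-wheel encoder reading to a focus distance for the
    virtual camera (linear mapping for simplicity)."""
    t = max(0, min(encoder_value, steps)) / steps
    return near_m + t * (far_m - near_m)
```

Every frame, the engine would receive a fresh tracker pose and encoder value, run them through functions like these, and drive the virtual camera's transform and focus with the results.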

20240409_163109.jpg
20240409_163057.jpg

From there we enter Unreal Engine. This free software lets you build just about any environment you can think of. To help judge the scale of a set, I like to place mannequins that represent average human height. Once the set is built, you add a camera to the virtual world. It's very important that the virtual camera's settings match your real-world camera: the sensor size, the focal length of the lens, and its f-stop.
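The reason those settings matter is that together they determine the camera's field of view; if the virtual FOV differs from the real lens, the background will never line up with the green-screen plate. A small sketch of the standard pinhole relationship (the sensor width used is an example value, not our actual camera):

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal field of view implied by a sensor width and focal length.
    The virtual camera must be given the same sensor size and lens so it
    computes the same FOV as the physical camera."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Example: a Super 35-style sensor (~24.89 mm wide) with a 35 mm lens
fov = horizontal_fov_deg(24.89, 35.0)
```

Changing either number shifts the FOV, which is why sensor size and lens choice have to be entered exactly rather than eyeballed.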

Picture5.jpg

The next step is to set up a pipeline so the footage you are filming can be seen composited in Unreal Engine in real time, then sent back out to a monitor on set for the cinematographer. To do this, we create a Composure setup that stacks the live footage, a chroma key, and the Unreal camera, layering the elements just as you would in post-production in After Effects. This is only a visual aid, however: the chroma key in Unreal Engine is not nearly as good as the key you can get from After Effects. In our case we are also shooting in RAW format to give us the most flexibility in color grading, and the live key will not work with that level of footage.
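Conceptually, the Composure stack boils down to two per-pixel operations: generate a matte from the green channel, then blend the foreground over the virtual background. This toy sketch is only meant to show that logic; it is not the engine's actual keyer, and the threshold value is an arbitrary example.

```python
def chroma_key_alpha(r: float, g: float, b: float,
                     threshold: float = 0.3) -> float:
    """Crude green-screen matte: fully transparent where green clearly
    dominates the other channels, fully opaque elsewhere. Real keyers
    produce soft, spill-suppressed edges instead of a hard cut."""
    return 0.0 if (g - max(r, b)) > threshold else 1.0

def composite_over(fg, bg, alpha: float):
    """Standard 'over' blend of a foreground pixel onto a background pixel."""
    return tuple(f * alpha + k * (1.0 - alpha) for f, k in zip(fg, bg))

# A strongly green pixel is keyed out, so the virtual background shows through
pixel = composite_over(fg=(0.1, 0.9, 0.1), bg=(0.5, 0.3, 0.2),
                       alpha=chroma_key_alpha(0.1, 0.9, 0.1))
```

This rough quality is exactly why the on-set key is just a framing aid, with the final key done on the RAW plates in After Effects.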

Picture4.jpg
057A9276.jpg

Once all those steps are complete, it's time to do some test shots to work out any bugs in the system. Here we use the Take Recorder in Unreal Engine. Be sure to match your scene and take numbers with what you are shooting on the physical camera, to make organization that much easier. This is also the point where things can get a little tricky. If your camera supports a timecode generator, you can sync that timecode with Unreal Engine, which makes matching your starts and cuts between the two sets of footage very easy. If it doesn't, a simple trick is to move the physical camera side to side before you call action; that gives you a movement to match up in post.
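To see why shared timecode makes syncing trivial, here is a minimal sketch of the arithmetic involved, assuming non-drop-frame HH:MM:SS:FF timecode and a 24 fps project:

```python
def timecode_to_frames(tc: str, fps: int = 24) -> int:
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count,
    so the camera take and the Unreal take can be compared directly."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def sync_offset_frames(camera_start: str, unreal_start: str,
                       fps: int = 24) -> int:
    """How many frames to shift the Unreal render so it lines up with
    the start of the camera footage."""
    return timecode_to_frames(camera_start, fps) - timecode_to_frames(unreal_start, fps)
```

With both recorders stamped from the same generator, the offset between any two takes is just this subtraction; without it, you are back to eyeballing the side-to-side camera wiggle.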

Picture2.jpg

Once you've taken your shot, you can go into the cinematics section of Unreal Engine, watch the take play back, and render it out. That is the basic way to get a clean shot: you then line the two pieces of footage up in After Effects and color grade. Sometimes, though, we want to take things a step further. This film has a forest fire that our characters are caught in, and to make it look as real as possible we need the embers to interact with the actors. Here, you take your keyed and color-graded footage from After Effects and export it as an EXR image sequence with an alpha channel. You then load that footage into Unreal Engine as a media plane. Place the plane in front of the camera you used for the take, at a distance where it matches the camera's field of view perfectly, and attach the plane to the camera so the two stay locked together. Unreal Engine now understands there is an object in the world that just happens to be the shape of, and move like, your actors. When elements like embers are simulated, they will interact with the actors, drifting past them, and effects like heat distortion and real-time shadows can interact with them too, adding an extra layer of reality to your virtual shot.
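Getting the plane to "match up perfectly" with the camera's field of view is just pinhole geometry: at a chosen distance in front of the lens, the plane must be exactly as wide and tall as the camera's frustum at that distance. A small sketch of that calculation (the full-frame sensor dimensions here are example values):

```python
def media_plane_size(distance_m: float, sensor_w_mm: float,
                     sensor_h_mm: float, focal_mm: float):
    """Width and height (in meters) a media plane must have at
    `distance_m` in front of the camera to exactly fill its frustum,
    so the keyed actor plate overlays the original take pixel-for-pixel."""
    width = distance_m * sensor_w_mm / focal_mm
    height = distance_m * sensor_h_mm / focal_mm
    return width, height

# Example: a 36 x 24 mm sensor with a 36 mm lens, plane 2 m from the camera
w, h = media_plane_size(2.0, 36.0, 24.0, 36.0)
```

Pick the distance so the plane sits roughly where the actors stood, then the simulated embers, shadows, and heat distortion pass in front of and behind them at believable depths.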

Virtual Production 1.JPG
Fallen Star Trailer Pictures 19.jpg

So that's the basics of virtual production! Without it, this film would never have been possible; set pieces like the forest fire would need an enormous budget to do practically. Now we can do it from the comfort of our homes! At the moment, you need a very powerful computer to pull this off. It's not a cheap system, but it's far cheaper than the alternative, and on the grand scale of filmmaking it's really not that bad. HTC Vive trackers are around $150, base stations around $200, and Unreal Engine is free; green screen costs will vary with how big a setup you need. As for computer specs, you need at least an RTX 3080 graphics card, a minimum of 64GB of RAM, and the most powerful processor you can get your hands on. If money isn't a concern, a Threadripper with an NVIDIA A6000 graphics card makes for the smoothest experience, but it's not required. The possibilities this system allows for are endless, and we are already looking at ways to refine and improve the process for our next project!

Fallen Star Trailer Pictures 44.jpg