Learn Camera Tracking For Free Using Blender 5.0


In this tutorial, we're going to learn a super easy, basic camera tracking process using our free camera tracking footage here on VFX Oasis. Check out the product here.

Blender 5.0 is currently in beta, but it already includes some massive changes that greatly improve the VFX experience in the program. In this video tutorial we'll be tracking the camera as well as transcoding our footage to work with ACES in Blender. ACES is a brand-new color workflow added to Blender, so we have to set it up properly for camera tracking to work correctly. I'll be creating a separate, more in-depth video tutorial soon about how to use ACES in Blender, so stay tuned for that.

Below you can find the tutorial, and don't forget to follow along using our free asset!

Now you should be able to camera track any footage that comes your way! Of course, make sure you apply the proper lens distortion to any CGI you add to your scene so that it looks like it sticks to the floor of your 3D scene. Many tutorials skip that step, so pay special attention to it and you won't run into the same problem.

My professional VFX workflow for clients actually uses another method of camera tracking. I track in SynthEyes, then export the camera data as a Python script to open in Blender. That way I get better tracking results, as SynthEyes is a more industry-standard VFX program.
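To give you an idea of what that exported Python script boils down to, here's a minimal sketch: a per-frame list of solved camera transforms that gets keyframed onto a camera in Blender. The frame values below are made up for illustration — a real SynthEyes export contains the actual solved positions, rotations, and matching focal length/sensor settings, and the Blender API calls only run inside Blender itself.

```python
# Hypothetical shape of a tracker-exported camera script.
# (frame, location xyz, euler rotation xyz in radians) -- illustrative values only.
solved_camera = [
    (1, (0.00, -5.00, 1.60), (1.57, 0.0, 0.00)),
    (2, (0.02, -4.98, 1.60), (1.57, 0.0, 0.01)),
    (3, (0.05, -4.95, 1.61), (1.56, 0.0, 0.02)),
]

def keyframe_camera(frames):
    """Create a camera and keyframe each solved frame -- Blender only."""
    import bpy  # available only when run inside Blender
    cam_data = bpy.data.cameras.new("TrackedCam")
    cam_obj = bpy.data.objects.new("TrackedCam", cam_data)
    bpy.context.collection.objects.link(cam_obj)
    for frame, loc, rot in frames:
        cam_obj.location = loc
        cam_obj.rotation_euler = rot
        cam_obj.keyframe_insert("location", frame=frame)
        cam_obj.keyframe_insert("rotation_euler", frame=frame)
    return cam_obj

if __name__ == "__main__":
    print(f"{len(solved_camera)} frames of solved camera data")
```

Running a script like this from Blender's Text Editor (or `blender --python script.py`) rebuilds the solved camera move inside your scene.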

For the lens distortion workflow, Blender needs the lens distortion data to be zeroed out in SynthEyes. To get around this, I export a Nuke script with the lens distortion data baked into it. Then, inside of Nuke, I write out a proxy version of the footage with the distortion removed, and I use that proxy as the background image of the camera inside Blender. This way, when I model and render in Blender, I know the CGI will line up with the undistorted plate. Finally, in compositing, I use the distortion node provided in the Nuke script to add the lens distortion back into just the CGI, and I composite it over the original distorted version of the footage. That way both the CGI and the footage are the distorted version, so the camera movement matches again!
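The reason this round trip works is worth seeing in miniature. Here's a small sketch using a one-term radial distortion model — the coefficient and coordinates are illustrative stand-ins, not real lens data from a tracker. Undistorting the plate, rendering CGI against that undistorted version, and then re-distorting the CGI lands the pixels right back where the original distorted footage has them:

```python
# One-term radial (Brown-Conrady-style) distortion on normalized coordinates.
def distort(x, y, k1):
    """Apply radial distortion: each point is scaled by 1 + k1 * r^2."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale

def undistort(x, y, k1, iterations=20):
    """Numerically invert the radial model via fixed-point iteration."""
    xu, yu = x, y
    for _ in range(iterations):
        r2 = xu * xu + yu * yu
        xu, yu = x / (1.0 + k1 * r2), y / (1.0 + k1 * r2)
    return xu, yu

# Round trip: undistort the plate coordinate, then re-distort it.
k1 = -0.1                      # illustrative distortion coefficient
xd, yd = 0.4, -0.3             # a point on the distorted plate
xu, yu = undistort(xd, yd, k1) # where that point sits on the undistorted proxy
xr, yr = distort(xu, yu, k1)   # re-distorting brings it back
print(abs(xr - xd) < 1e-6 and abs(yr - yd) < 1e-6)  # True
```

Real lens solves use more coefficients (and the actual math lives inside the Nuke script SynthEyes writes out), but the principle is the same: distortion and undistortion are inverses, so CGI rendered undistorted and then re-distorted sits perfectly on the original footage.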

I'll be creating a specialized tutorial going over exactly how I do this at a later date, so let me know what you'd like me to cover in it!

Anyways, that's enough talk about camera tracking for one day. These are my methods, but there are countless ways to camera track, so experiment on your end and find what works for you. Thanks for watching, and I'll see you in the next tutorial!
