Hey all, I’m considering buying Mocha Pro, but I have a few questions and want to make sure I understand the process.

I want to track footage of someone’s face and body and sync it up with 3D objects in Blender. The footage is a locked-camera shot where the face and body are moving, and it wasn’t filmed with tracking in mind, so there are no markers. The body moves a little, not too much, but the face does rotate from 3/4 left to 3/4 right and back a few times.

As I understand it, Mocha’s camera solver will export FBX data to Blender, but once it’s in Blender it will be moving the camera while the 3D models stay stationary. The process, as I understand it so far, is: track in Mocha Pro > export FBX to Blender (2.79) > render out the sequence from Blender > composite the original footage with the rendered footage in AE (2018).

If that’s right, how do I know whether the sync is working? Can I import simple 3D objects into Mocha to see if it’s syncing correctly, or do I need to render a quick draft of the sequence in Blender and composite it with the original footage in AE?

There’s also a problem on the AE side: whenever I paste the tracked eye shapes (and a couple of similar ones) onto a solid with a white tint, I get the eye mask shape, but there seems to be no tracking data — everything is flat and stays in the same position. If I press U I do get the load of keyframes, but they are all identical, so there is no change and no actual movement tracking. What am I doing wrong?
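For the "how do I know if the sync is working" question, one common approach is a quick test inside Blender itself before doing a full render: import the FBX, make the solved camera the active scene camera, drop a simple test object roughly where a tracked feature sits, and scrub the timeline with the footage loaded as a background. Below is a minimal sketch for Blender 2.79's Python console under stated assumptions — the file path `mocha_solve.fbx` is hypothetical, and it assumes the Mocha export contains a single solved camera. (This is a sanity-check sketch, not the official Mocha-to-Blender workflow.)

```python
# Run inside Blender 2.79's Python console (requires the bpy module,
# which only exists inside Blender -- this will not run standalone).
import bpy

# Import the Mocha camera solve (hypothetical path).
bpy.ops.import_scene.fbx(filepath="/path/to/mocha_solve.fbx")

# Find the imported camera and make it the active scene camera.
cam = next(o for o in bpy.context.scene.objects if o.type == 'CAMERA')
bpy.context.scene.camera = cam

# Add a test cube roughly where a tracked feature sits; adjust the
# location by eye to land it on the face in the first frame.
bpy.ops.mesh.primitive_cube_add(location=(0.0, 0.0, 0.0))

# Now load the original footage as a background image in the 3D View
# (in 2.79 this is per-viewport, via the N-panel > Background Images),
# look through the camera (Numpad 0), and scrub the timeline: if the
# solve is good, the cube should "stick" to the footage.
```

If the cube stays locked to the feature while scrubbing, the solve is usable and a full draft render is worth doing; if it drifts, it's cheaper to catch that here than after a render-and-composite round trip.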