Hi, I'm developing a show premised on mixing reality with fiction that would feature real interviews with real people. I'm relying almost exclusively on pre-recorded audio, and the lip syncing feature in ACA is incredible. I was wondering if it would be possible to do the same thing with the physical movements of my subjects: I could import the video of their interview and ACA would sync their basic movements to the puppet. I could then go back and fine-tune their facial expressions, etc. I realize this is a really lazy request (lol), but it would be like magic for my workflow. Just a thought. Thanks guys! What you're doing here is revolutionary and very much appreciated!