Other face-animation apps tend to use selectable, draggable tracking dots. These are used to define the facial features, e.g., clicking on the corners of the mouth and the corners of the eyes. These points are then used for subsequent animation.
Ch's Face Tracking is very good and a big plus over other programs, but I think it could be a whole lot better if the tracking dots could be selected and moved, and their positions then confirmed against both the Puppet and your own face in the Camera view.
The Tracking Dots seem to be grouped: eyes, nose, mouth, head, etc. If the Tracking Dots could be selected and confirmed in position as described above, it could significantly improve the quality of the tracking and, in turn, the use of that tracking to trigger animations.
I have two suggestions:
1. Use Auto Face Tracking to detect the facial features, then give the user the ability to adjust the Tracking Dots as described above.
2. Remove Auto Face Tracking and have users define the facial features manually first, as many other programs do. Then allow Ch to animate using the user-set Tracking Dots.