Hi, I have a feature request, if the feature doesn't already exist. Previously when animating, I'd carefully draw multiple frames for each sound, so that if my character makes an "ahh" sound, for example, there are three to five slightly different frames of the mouth and jaw moving just right for a very smooth look. I was hoping I could create an animated PNG file in Premiere, place it on the character's body in Photoshop, and then run it through Character Animator to add the subtle body movements of a live person while the animated face plays its already lip-synced speech, really enhancing the overall effect. Unfortunately, when I tried that, Character Animator just takes whatever expression is currently shown in the Photoshop timeline and discards the actual animation.

Is it possible for animated components in Photoshop to retain their animation when brought into Character Animator? Or do I need to bring my dozens upon dozens of different facial expressions into Character Animator and sync them up manually there? I suppose that would produce the same result; I'm just much more proficient doing my syncing in Premiere, and I already have multiple speeches animated for a project I'm working on that I'd like to use in Character Animator rather than having to animate them again.