
Data Centric Face help

Hey! I was wondering if anyone has any insight on this topic out here in 2023. I’ve been following the facial rigging workshop and other mgear content to create rigs for my animation school thesis. I used what I learned in the workshop to make my own custom steps for the eyes, brows, etc., and I’m very happy with my face rig. But like some others, I have had a lot of trouble connecting everything to the body. :broken_heart: I’ve been very invested in learning this wonderful framework for the last several weeks, but I’ve really hit a wall trying to connect the head/face to the body. Any advice to get me unstuck would be deeply appreciated, even if just a general direction to keep looking in. Thank you all for what you do. I have had a blast learning this so far and hope to continue. :smile: :green_heart:

2 Likes

Hi there, I moved your question to a new topic, so it’s easier to follow.

Can you describe what trouble you’re having, or what you’ve tried so far? I’ll edit your account so you can paste screenshots here too.

Hi, thanks so much for the reply! I have mostly run into trouble when attempting to build after combining my body guides with the guides from my face rig into one hierarchy. I tried to parent things where they would make sense. I moved the face root under the neck, but I am having a hard time conceptualizing where the mouth slide guides should go. In the facial rigging workshop, they were just under the main guide group, so I did the same thing here too. Here is my guide hierarchy (selected are the guides from the face rig)


When I build, I get this:

It looks like something happened with the guides, but I’m not sure how to identify the exact issue. My python knowledge is still quite surface level (learning as we speak :sweat_smile:).
I suppose I’m actually not so sure how else to approach this, or what the process of connecting the head and body normally entails. Embarrassingly enough, I haven’t tried much else, as I’m honestly unsure of what to try or how to go about it. I haven’t worked much with local rigs prior to this workshop, so I’m realizing this is probably a bit more complicated than I originally expected, haha.
Happy to provide more info- I’m sure you are very busy, so even a general direction to think toward would help- any expertise is deeply appreciated (but of course the more the merrier)!!

So, I don’t exactly know. I don’t use that neck component, and I use a different method for my faces.

That error doesn’t seem to specifically indicate which guide was failing. There might be multiple problems though. This might not be the only one.

Did you delete any parts of the guide? That neck component should have some other components that don’t seem to be there. Deleting parts of a guide would definitely cause errors similar to that.

You might have to unparent any custom face stuff. Delete the neck. Re-make the neck guide. And then re-parent the other stuff again. The face should likely be parented under “neck_C0_head” (or if not, maybe “neck_C0_eff”).

(You can also rename the neck to neckOLD. Make a new neck. Use neckOLD to snap your new guide to it. And then delete neckOLD. Then you’ll know you got your placement matching the old version.)
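If it helps, here is roughly what that re-snap and re-parent could look like as a script (PyMEL). All of the node names below are just placeholders for a typical setup, not exact names from your scene, so swap in whatever your guides are actually called:

import pymel.core as pm

# NOTE: all node names here are placeholders -- use your own guide names.

# Snap the new neck guide root onto the old (renamed) one
pm.matchTransform("neck_C0_root", "neckOLD_C0_root", pos=True, rot=True)

# Re-parent the custom face guide root under the rebuilt neck's head guide
pm.parent("faceRoot_C0_root", "neck_C0_head")

# Once the inner guide locators are also matched up, the old guide can go
pm.delete("neckOLD_C0_root")

(You can of course also do all of this by hand in the Outliner.)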

1 Like

Silly me- you’re right, I had somehow deleted some guides :grimacing:. I went back to a backup version of my body guide setup and parented the facial root to the “neck_C0_head” guide (the one I had deleted in my newer file), and things are building now. It seems to be almost all working- I just have to make ghost controls for the other parts of the face I added myself (eyes, brows, etc). Thank you so much for the help so far!!

2 Likes

Hey @GingerBeeLuna, have you been able to find a way to connect to the main body rig? I’m also following the Data Centric Facial workshop, and I’m kind of stuck on this. Miguel mentions a matrix multiplication in the main topic, but I’m unsure where this would fit. Since all the joints are constrained to their controllers, you can’t just offset the root joint of the face rig, as the others won’t follow.
I could make ghost controls for every controller in the facial volume parts, but in the other topic people mention that this breaks the mouth rig.

Hi, welcome AlexV,

There is no one “way” to connect things together. It’s a general question, but it really depends on your rig and how you set it up. What kind of experience do you have with rigging before starting the data-centric tutorials?

From the way you worded the question, it sounds like ghost controls would be a complicated solution. You shouldn’t need to do that. You should just be able to find which nodes you need to constrain or which ones need to not follow. (I know that’s not always easy. But it always depends on your rig.)

“Breaks” also depends on your rig. What specifically is wrong? For that, you’d need to show screenshots, or show your hierarchy.

Otherwise the most general possible way to answer is: If it doesn’t follow, constrain it somehow. If it double-transforms, make it not follow somehow. (Change the hierarchy, turn off inherit transform, or constrain it to something static, etc.)
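In script form, those two generic fixes look something like this (PyMEL). The node names are only placeholders to show the idea, not anything from your rig:

import pymel.core as pm

# NOTE: both node names below are placeholders -- substitute your own nodes.

# "It doesn't follow": constrain the stray group to the head control
pm.parentConstraint("neck_C0_head_ctl", "face_widgets_grp", maintainOffset=True)

# "It double-transforms": stop the group from inheriting its parent's motion
pm.PyNode("face_joints_grp").inheritsTransform.set(False)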

1 Like

Hey @chrislesage , thanks for the detailed reply !

I have pretty solid technical experience but rarely rig, so I was wondering what the ideal solution would be in this specific case.
But I have found a solution that works great for my needs. The situation with the Data Centric Face workshop is the following:

  • The facial volume controllers and joints are built outside of the biped rig
  • The facial volumes are then blendshaped onto the main geo
  • The controllers should follow the head/neck, so they need to be parented under the head.
  • This introduces double transforms, since the joints are matrix-constrained to the controllers and therefore use the controllers’ world-space transforms, not just their local transforms.
  • The matrix constraint is fast and elegant, and it ensures that the bones follow the mGear mouth component properly, as that component has special logic in its controllers.

My solution is the following, in case anybody needs it:

  • Detect the controller the joint is constrained to
  • Given the parent head joint, compute the necessary inverse matrix to remove the worldspace head transform from the worldspace controller transform
  • Inject this matrix before the matrix constraint.

Here’s the snippet from my custom step (in plain terms, the new driver matrix is the controller’s world matrix with the live head transform removed, then placed back in the head’s neutral frame):

import pymel.core as pm  # module-level import in the custom step file


def compensate_joint_offset(self, joint, offset_joint):
    # World matrix of the offset (head) joint at this point in the build,
    # i.e. its neutral pose
    neutral_mtx = offset_joint.attr("worldMatrix[0]").get()

    # Find the mgear_matrixConstraint currently driving this joint
    inputs = pm.listConnections(joint.attr("translate"))
    if len(inputs) == 1 and inputs[0].nodeType() == "mgear_matrixConstraint":
        drivers = pm.listConnections(inputs[0].attr("driverMatrix"), plugs=False)
        if len(drivers) == 1 and drivers[0].nodeType() == "transform":
            driver = drivers[0]
            pm.delete(inputs[0])

            # Controller world matrix relative to the live head transform
            offset_to_driver = pm.createNode("mgear_mulMatrix")
            pm.connectAttr(driver.attr("worldMatrix[0]"), offset_to_driver.attr("matrixA"))
            pm.connectAttr(offset_joint.attr("worldInverseMatrix[0]"), offset_to_driver.attr("matrixB"))

            # Put that relative transform back into the head's neutral frame
            world_to_offset = pm.createNode("mgear_mulMatrix")
            pm.connectAttr(offset_to_driver.attr("output"), world_to_offset.attr("matrixA"))
            world_to_offset.attr("matrixB").set(neutral_mtx)

            # Rebuild the matrix constraint with the compensated driver matrix
            decomp = pm.createNode("mgear_matrixConstraint")
            pm.connectAttr(world_to_offset.attr("output"), decomp.attr("driverMatrix"))
            pm.connectAttr(joint.attr("parentInverseMatrix[0]"), decomp.attr("drivenParentInverseMatrix"))
            pm.connectAttr(decomp.attr("translate"), joint.attr("translate"))
            pm.connectAttr(decomp.attr("rotate"), joint.attr("rotate"))
            pm.connectAttr(decomp.attr("scale"), joint.attr("scale"))
            pm.connectAttr(decomp.attr("shear"), joint.attr("shear"))

            # Zero the joint orients so the constraint's rotation is applied as-is
            joint.attr("jointOrientX").set(0)
            joint.attr("jointOrientY").set(0)
            joint.attr("jointOrientZ").set(0)

In the context of the facial workshop, this method can be called like this in the facial proportions custom step:

    neck_jnt = pm.PyNode("neck_C0_head_jnt")
    headBend_C0_0_jnt = pm.PyNode("headBend_C0_0_jnt")
    headBend_C2_0_jnt = pm.PyNode("headBend_C2_0_jnt")

    self.compensate_joint_offset(headBend_C0_0_jnt, neck_jnt)
    for child in pm.listRelatives(headBend_C0_0_jnt, ad=True, type="joint"):
        self.compensate_joint_offset(child, neck_jnt)

    self.compensate_joint_offset(headBend_C2_0_jnt, neck_jnt)
    for child in pm.listRelatives(headBend_C2_0_jnt, ad=True, type="joint"):
        self.compensate_joint_offset(child, neck_jnt)

Hope this helps, and thanks again :slight_smile:

4 Likes