
Eye Rigger problem

Hi @Krzym,

This is a super late reply, but this week I am looking into the eye code a bit more. What did you mean by “sorted”? I’m guessing sorting by edge or vert number will not work, since you can’t necessarily rely on those being in a nice order. (Or can you?)

I am testing an algorithm that will either orient itself to the axis of the corner vertices, or march along the edge loop and choose the two middle points for the upper and lower vertices, so even 45-degree rotated eyes should work. This works for finding the vertices when the user chooses the corner vertices manually.

But I’m still trying to figure out why the right side sometimes loops all the way around the eye. I haven’t figured out where that is happening yet.

Edit: Actually, I have a promising result that seems to fix it when I override the upPos and lowPos verts in the eyes, but I am double-checking. I had a mesh that was failing on the right side, and it now works because I find upPos and lowPos by edge-marching to the middle of the upper and lower eyelid edges. I’ll run this by the devs later this week and make a pull request if it seems solid.
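Roughly, the middle-point search looks something like this (a simplified sketch of the idea, not the actual patch; mesh, corner_a, corner_b and loop_indices are placeholder names):

import pymel.core as pm

def march_to_middle(mesh, corner_a, corner_b, loop_indices):
    """Walk from corner_a toward corner_b along the eyelid loop (given as
    vertex indices) and return the index roughly halfway along the path."""
    loop_set = set(loop_indices)
    path = [corner_a]
    visited = {corner_a}
    current = corner_a
    while current != corner_b:
        # neighbours of the current vertex that are on the loop and not visited yet
        connected = [v.index() for v in mesh.vtx[current].connectedVertices()]
        next_verts = [i for i in connected if i in loop_set and i not in visited]
        if not next_verts:
            break  # open loop or bad corner picks; give up gracefully
        current = next_verts[0]
        visited.add(current)
        path.append(current)
    # the middle of the ordered path is the up/low vertex, no matter how
    # the eye is rotated in world space
    return path[len(path) // 2]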

[image: eyes45degrees]


Hi @chrislesage. I’m not sure exactly what is wrong, but sometimes it helped me to create temporary geo, and then even 45-degree rotated eyes worked. The only downside was that the controls weren’t rotated correctly, as in your preview. Do you think you could quickly fix that too? :wink:

What I was testing was sorting edges/verts by world position, or from one corner to another. Sorry, I don’t have any examples right now, just some code I pasted together quickly:

import pymel.core as pm

# sort vertices from X+ to X- (descending world-space X)
def sortVerts(points):
    length = len(points)
    for i in range(length):
        for j in range(i + 1, length):
            pos_i = points[i].getPosition(space='world')[0]
            pos_j = points[j].getPosition(space='world')[0]
            # keep the vertex with the larger X earlier in the list
            if pos_i < pos_j:
                points[i], points[j] = points[j], points[i]
    return points


# center loop between the two inner corners, excluding the edges that are
# already in the left and right loops
c_Loop = [pm.PyNode(e) for e in pm.polySelect(geoTransform,
                                              edgeLoopPath=(l_inner.indices()[0],
                                                            r_inner.indices()[0]),
                                              ass=True, ns=True)
          if pm.PyNode(e) not in l_Loop and pm.PyNode(e) not in r_Loop]

I’m sure this is not a clever solution, but it worked that time :wink:
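(For what it’s worth, the same X+ to X- ordering could also be done with Python’s built-in sorted(), assuming the same list of PyMEL vertices:)

# same descending world-space X sort, using the built-in sort
sorted_points = sorted(points,
                       key=lambda v: v.getPosition(space='world')[0],
                       reverse=True)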


Thanks! Yeah that makes sense.

And yeah, I’ll see if I can fix the orientation of the controls too, maybe by calculating a sane orientation space between the vertices.
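One way to build such a space (a rough sketch only, not tested code; inner_pos, outer_pos, up_pos and low_pos are placeholders for the world-space positions of the detected vertices):

import pymel.core.datatypes as dt

def eyelid_orientation(inner_pos, outer_pos, up_pos, low_pos):
    """Build a rotation matrix whose X axis runs corner to corner and whose
    Y axis points from the lower lid position toward the upper one."""
    x_axis = (dt.Vector(outer_pos) - dt.Vector(inner_pos)).normal()
    rough_y = (dt.Vector(up_pos) - dt.Vector(low_pos)).normal()
    z_axis = x_axis.cross(rough_y).normal()   # rough "forward" of the eye
    y_axis = z_axis.cross(x_axis).normal()    # re-orthogonalized up vector
    return dt.Matrix([
        [x_axis.x, x_axis.y, x_axis.z, 0.0],
        [y_axis.x, y_axis.y, y_axis.z, 0.0],
        [z_axis.x, z_axis.y, z_axis.z, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ])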


This is a bit old, but it’s the thread closest to the issue I’m having. What ended up being the thing that defines the rig orientation? I’m working on an eye rig with extreme protrusion, and while the rig works, the orientation is pretty wonky. Any recommendations for what I could do to get a more traditional XY plane orientation?

@Saveremreve Is your eyeball geometry separate? Is your eyeball a perfect sphere, or is it a partial sphere? Is the eye skinned or constrained?

There was another thread where someone had a parentConstraint node under the eye, and the transform of that constraint sat outside the eye. That made the bounding box of the eyeball huge, so the center was detected halfway down the character. In a case like that, the orientation of the controls could come from an offset center position of the eyeball.
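(A quick way to check for that kind of problem is to compare the bounding box of the eyeball transform, which includes anything parented under it, with the bounding box of the shape alone; a rough sketch, with eye_geo as a placeholder name:)

import pymel.core as pm

eye = pm.PyNode('eye_geo')          # placeholder name for the eyeball transform
shape = eye.getShape()

# the transform's box includes child nodes (constraints, locators, etc.),
# while the shape's box is just the eyeball itself
print('transform bbox center:', eye.getBoundingBox(space='world').center())
print('shape bbox center:    ', shape.getBoundingBox(space='world').center())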

If that doesn’t seem to be related to your problem, share a file and I’d be willing to take a quick look.

The thread I’m referring to: Eye Rig orientation...again

I’m a VFX rigger, so this is a small prototype that’s part of a larger system and it’s intentionally stripped down and very basic. I left my failed mGear eye setup in there, but I basically only want the eye rig and nothing else.

So there is no intentional use of constraints. I watched the original walkthrough video and made a temporary sphere for a separate, symmetrical, pivot-centered eye that I can retarget trivially later.

https://www.dropbox.com/s/sx78zyb864cqyz2/EyeFace_External.ma?dl=0

Here’s another angle that shows the unusual protruding look. My use case is placing copies of the eye on another surface so they must protrude. I intend to use this in a real-time application so this is a lower resolution proxy.

I’ve figured out the problem. It’s in the way mGear tries to discover the upper and lower eyelid positions.

Because of the shape of your tear duct, the lowest vertex sits inside the tear duct. The eye rigger was written with the assumption of simple, football-shaped cartoon eyes.

In this image, the two outer positions and the upper position are correct, but the tear duct is causing the lowest position to land on the inside corner. mGear then averages those 4 positions, which results in the red locator. That location is what the controllers AIM at, from the center of your eyeball. That is where the offset comes from.
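In other words, roughly this happens (a simplified illustration with made-up numbers, not the actual mGear code):

import pymel.core as pm

eye_center = pm.dt.Vector(0.0, 150.0, 5.0)        # made-up eyeball center
positions = [
    pm.dt.Vector(-1.5, 150.0, 6.0),               # inner corner
    pm.dt.Vector(1.5, 150.0, 6.0),                # outer corner
    pm.dt.Vector(0.0, 150.8, 6.2),                # upper lid
    pm.dt.Vector(-1.3, 149.6, 5.8),               # "lowest" vert, found inside the tear duct
]

# the four positions get averaged, and the controls aim at that average
# from the eyeball center; the tear duct vert drags the average inward
aim_target = sum(positions, pm.dt.Vector()) / len(positions)
aim_vector = (aim_target - eye_center).normal()
print(aim_vector)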

If your design permits it, I suggest raising the tear duct slightly so it sits above the lower eyelid, or opening the eyelid a bit wider in your model.

I’d also suggest evening out your edge loops so you have a center edge loop on the lower eyelid (and again, make sure that is the lowest point in translateY).


Thanks so much Chris. As this is an effects model, it’s no trouble to rebalance the neutral state. Now that I understand how the algorithm works, this totally makes sense. Are there any other key items about how it behaves that you think would be illuminating? Also, is there any documentation I should be educating myself with besides the two eye rigger videos on the main YouTube channel? I hunted around and that was the best I could find. If there’s anything I could do to contribute, I’d like to.

Those are the key items. The way that they are calculated has room for improvement.

I actually wrote a new algorithm that marches along the edge loop rather than just looking for the lowest point, but I haven’t had time to test it as much as I would like before submitting it as a PR.

If you are comfortable with Python, feel free to replace scripts/mgear/rigbits/facial_rigger/eye_rigger.py and test this out.

The new function is march_vertex(), and it overrides upPos and lowPos. My algorithm isn’t perfect either: it gets the middle point by taking the middle entry of the vertex list between the corner points. I’m going to change it to measure which vertex is closest to the middle in space.
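Something along these lines (a sketch of the planned change, not what is in the file yet; corner_a, corner_b and loop_verts would be PyMEL vertex components):

import pymel.core as pm

def closest_to_spatial_middle(corner_a, corner_b, loop_verts):
    """Pick the loop vertex nearest to the spatial midpoint of the corners,
    instead of the middle entry of the vertex list."""
    pos_a = pm.dt.Vector(corner_a.getPosition(space='world'))
    pos_b = pm.dt.Vector(corner_b.getPosition(space='world'))
    mid = (pos_a + pos_b) / 2.0
    return min(loop_verts,
               key=lambda v: (pm.dt.Vector(v.getPosition(space='world')) - mid).length())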

And ultimately the orientation comes from the way t_arrow is calculated using transform.getTransformLookingAt. A better algorithm could likely be found that would make the eye orient to a better plane, or that would use the centroid of the 4 points instead of the average. Also, if you specify an aim controller in the eye_rigger GUI tool, it doesn’t use that as an orientation reference, but it could.

In your specific example, t_arrow could be calculated simply by using a point exactly ahead of the center of the eyeball.
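For example, something like this (just a sketch; the eye center value is made up, and I’m assuming getTransformLookingAt’s usual (pos, lookat, normal, axis) arguments):

import pymel.core as pm
from mgear.core import transform

eye_center = pm.dt.Vector(0.0, 150.0, 5.0)          # made-up eyeball center
ahead = eye_center + pm.dt.Vector(0.0, 0.0, 10.0)   # a point straight ahead in world +Z
up = pm.dt.Vector(0.0, 1.0, 0.0)                    # world up as the normal

# aim the eye frame at the point in front of the eyeball instead of at the
# average of the four eyelid positions
t_arrow = transform.getTransformLookingAt(eye_center, ahead, up, axis="zy")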
