For my initial modeling, I set my body mesh up with two modifiers: mirror, and subdivision. When it was time to weld the body to the hair, I applied the mirror modifier, but kept the subdivision modifier, and used the same subdivision modifier on the hair. I maintain the subdivision modifier forever and ever, because editing subdivision surfaces is way easier than editing meshes. This does mean there's somewhat of a precision loss for rigging, UV mapping, and weight painting, but it isn't so bad that it makes me wish I had a higher-resolution mesh to reference, and the advantages of staying in subdivision mode are well worth it. The downside is that it complicates the FBX export slightly, although it isn't too bad.

FBX exporting

A lot of folks swear by the Cats Blender Plugin, but I've found it to cause more problems than it helps with. Instead, I use ApplyModifierForObjectWithShapeKeys to “bake” my subdivision mesh into triangles, and then when I export the FBX I use the following options:

- Limit to: whatever you feel is appropriate (I just export my whole scene, personally)
- Apply Scalings: FBX scale (Update: this turns out to be incredibly important; see below!)
- Add leaf bones (this seems to be necessary for PhysBones to work right)

The “Apply Scalings” setting is pretty important; otherwise, PhysBones' scale will be off substantially, among other things. Anyway, after doing the export, I can then undo the ApplyModifierForObjectWithShapeKeys and my mesh is back to a lovely subdivision surface, perfect for continued tweaking and fiddling.

Note that for most people it probably makes more sense to use Cats, especially if you're using Blender to set up the materials. However, I'm doing some overly fiddly/complicated texturing and normal mapping stuff externally, I'm not making use of normal maps on Quest, and my base mesh is low-poly enough that it doesn't require decimation on desktop. Cats does have some very nice options for exporting, though, including baking normal maps as part of the decimation process, and it generally makes for a better export workflow when you're doing as much as possible (especially texturing) within Blender itself. I'm probably making things way more difficult than they need to be, really.

The VRChat avatar system wants to adjust the sizes and positions of things to match a reference skeleton which it builds based on where it thinks your physical limbs are. There is no way to override this directly in general, and there's no way to determine where it thinks those limbs are, either. This is why VRChat provides “height” and “shoulder width” measurement options, but both of them are still pretty much secret sauce. One of the more obnoxious side-effects of this system is that it also wants to move your eyes to where it thinks they physically are, which means that if you have your avatar's eyes mapped, they might be adjusted incorrectly. However, it seems that correcting my FBX scale has also corrected the eye positioning.

Update: So, it turns out that this was actually due to the armature scale being wildly off, and fixing the FBX export settings seems to have gotten the game to actually honor my authored eye positions.

PumkinAvatarTools has a rather nice pose editor, which makes it much easier to edit hand poses than using Unity's built-in animation editor. Unfortunately, it doesn't have any options to automatically mirror poses between the left and right hands, but it isn't too hard to copy-paste the values manually. (It's also what identified that my FBX scale was set incorrectly.)

The hand armature is especially fiddly when it comes to the finger and thumb positioning, due to the hidden/unspecified measurements mentioned above. This is especially problematic when your avatar's hands don't map cleanly onto a standard human hand, such as with a critter's mitts.

Fortunately, if you don't need individual finger tracking (as is available on some controllers, such as the Valve Index), there's a cheat to get past this, which I learned from the official Avatar 3 Walkthrough. Somewhat buried in that article, when talking about AFK animations, is this little gem: in order for the AFK animation to “take over” the character, you have to disable input from things like IK. You don't want to track your hands when you're AFK. For that, we'll need to use another State Behavior. This one is called VRCAnimatorTrackingControl.

So what you need to do is open your gesture controller (which handles the hand animations) and, for every hand gesture, add an appropriate VRC Animator Tracking Control behaviour script, setting “left fingers” or “right fingers” to Animation as appropriate. This can be used to override the hand animation for any gesture! This makes the authored gesture pose completely override the tracking data from the controller, rather than adding on top.
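If you'd rather script the export than click through the dialog every time, the options above map onto arguments of Blender's built-in FBX exporter. A minimal sketch, with the caveat that the output path is made up and `'FBX_SCALE_ALL'` is my guess at which dropdown choice the “FBX scale” setting corresponds to; double-check it against your Blender version:

```python
# Sketch: export with roughly the settings described above.
# Assumptions: "avatar.fbx" is a placeholder path, and
# 'FBX_SCALE_ALL' is assumed to match the "Apply Scalings" choice.
import bpy

bpy.ops.export_scene.fbx(
    filepath="avatar.fbx",
    use_selection=False,                  # "Limit to": export the whole scene
    apply_scale_options='FBX_SCALE_ALL',  # "Apply Scalings" (see warning above!)
    add_leaf_bones=True,                  # seems necessary for PhysBones
)
```

Run from Blender's scripting tab or via `blender --background --python export.py`, so the baked mesh is what gets exported.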
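Since Pumkin's pose editor can't mirror poses automatically, it may help to see what the manual copy-paste amounts to. For an armature mirrored across the X axis with Blender's usual .L/.R conventions and symmetric bone rolls (an assumption; `mirror_rotation` is an illustrative helper, not part of any tool mentioned here), finger curl (the local X rotation) carries over unchanged, while the Y and Z components flip sign:

```python
from typing import Tuple

def mirror_rotation(euler_xyz: Tuple[float, float, float]) -> Tuple[float, float, float]:
    """Mirror a bone's local XYZ Euler rotation from one hand to the
    other, assuming an X-mirrored armature with symmetric rest poses:
    X (curl) is preserved; Y and Z (twist/splay) are negated."""
    x, y, z = euler_xyz
    return (x, -y, -z)

# e.g. copying a left index-finger rotation to the right hand:
print(mirror_rotation((0.8, 0.1, -0.05)))  # (0.8, -0.1, 0.05)
```

The same sign rule applies to the x/y/z components of a quaternion, which is handy if your pose editor shows quaternions instead of Euler angles.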
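As a conceptual sketch of the tracking-control trick described earlier (not VRChat SDK code; the real thing is a Unity state behaviour, and the dictionary here is purely illustrative), each tracked body part is a three-way switch, and a gesture state flips the finger entries to the animation side:

```python
from enum import Enum

class TrackingType(Enum):
    NO_CHANGE = 0  # leave whatever mode is currently active
    TRACKING = 1   # follow controller/IK input
    ANIMATION = 2  # authored animation fully overrides tracking

# A hypothetical "fist" gesture state: setting the fingers to ANIMATION
# makes the authored hand pose override controller finger tracking
# entirely, rather than being added on top of it.
fist_state = {
    "left fingers": TrackingType.ANIMATION,
    "right fingers": TrackingType.ANIMATION,
}
```

An AFK state would do the same thing for the head, hands, and hips, which is why the walkthrough brings it up there.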