Avatars 3.0


Under Construction

This documentation is preliminary and will change over time.



What is Avatars 3.0?

Avatars 3.0 is a huge collection of features for avatars in VRChat. You can think of it as a new version of our avatar feature framework. Previously, you've been using Avatars 2.0 (in SDK2). Avatars 3.0 will be usable with VRCSDK3. Both can exist side-by-side in VRChat at the same time, but AV3 avatars will be far more powerful.

AV3's features are focused on improving expression, performance, and the abilities of avatars in VRChat. Moreover, we've learned a lot from years of watching users find ways to do cool stuff with Avatars 2.0. A lot of those methods are considered "hacky", and it is hard for us to support those. We want to formalize the process so you can do the things you want, access them more easily, and use them in a system that is officially supported.

Avatars 3.0 is heavily integrated with the Action Menu for controlling and interacting with the avatar you're wearing. It's probably best if you hop in and try out the Action Menu before building an AV3 avatar!

So, all that out of the way, let's get to some basic concepts.

Understanding the Concepts

In order to understand and use Avatars 3.0, you need to know a few concepts. These concepts will help you understand the construction of avatars, how best to assemble them, and the intended use of various systems.

Unity Systems

This document is written with the assumption that you know a bit about Unity Animators. In particular, you should ensure you have basic working knowledge of:

  • Animators and animations
  • Animator layers
  • Animator layer weights and blending
  • Animator states
  • Animator transitions
  • Animator parameters
  • State behaviors
  • Avatar masks

It also helps to know about things like:

  • Animation/State Exit Time
  • Loop Time for animations
  • Time Sync between layers
  • Blend trees (advanced)

Those are all good to start with.


Basic Avatar Setup

With Avatars 3.0, you can create a basic avatar with simulated eye movement and SDK2-like visemes very quickly.

  1. Import your avatar and rig it as a Humanoid. Set up your materials, etc.
  2. Add the Avatar Descriptor component.
  3. Define the eye bones, if you want simulated eye movement.
  4. Define the viseme type, if you want visemes. Assign the jaw-flap bone in the Rigging Configuration screen, or define your visemes by blendshapes, same as in Avatars 2.0.
  5. Set your viewpoint.
  6. Build and upload!

You're done! This will create a basic avatar with default gestures and actions. There are some built-in behaviors you can take advantage of, so even if someone slaps in an avatar with blendshapes and bones named a certain way, they'll get some basic Avatars 3.0 features.

Even with this basic upgraded setup, there are some new features available.

Local Avatar Testing

Ever wanted to iterate and test an avatar without uploading it? Well, with Avatars 3.0, now you can!

In the "Builder" tab of the VRChat SDK control panel, you can now select "Build & Test" in the "Offline Testing" section. When you click this button, your avatar will be built and copied into a local folder.

When you launch VRChat, you'll be able to access this avatar locally by looking in the "Other" section of the Avatar menu! Only you will be able to see it, but you can make changes to your avatar, click "Build & Test" again, and after a short build, your avatar will be updated. Simply re-select the avatar in your menu and click "Change" again, and you'll swap into the new testing avatar.

This avatar is only visible to you! To everyone else, you'll look like you're wearing the last avatar you were wearing before swapping into the local test avatar. For our AV3 testers, this made iteration a TON faster. We hope you like it!

To delete a copied local test avatar, go to the "Content Manager" tab of the VRChat SDK control panel. You will see your avatar in the "Test Avatars" section at the bottom. Click "Delete", and the avatar will disappear from the "Other" section of the Avatar menu the next time you open it.

Simulated Eye Movement

Simulated eye movement is where your eyes will move around, looking at things around you. This isn't eye tracking-- as in, we don't have a way for you to input data from eye tracking devices-- but it is a pretty good way of making your avatar look more "alive".

This was present in Avatars 2.0, but we've improved it overall in AV3. The movement is more natural, it syncs with blinking if you have that set up, and we've made various other tweaks. These improvements also apply to Avatars 2.0 avatars that use eye tracking!

However, in AV3 you get more control. The eye-look setup is a lot easier, and you can set limits on how far your eyes can rotate. In addition, you can preview these limits in-editor! No more tabbing back and forth between Blender and Unity to tweak your eye bones.

As an aside, the eye bone setup is very similar to AV2, and you won't have to change anything in your rig if you've already got it set up for AV2 eye tracking.

You will have to change the way you do blinking, though. In addition to blendshapes, you can now use bones to move your eyelids! This may work better for a variety of avatars, so we included it.

Blinking is no longer defined by left/right blink and lowerlid blendshapes; it is now defined by three blendshapes, described below:

  • Blink - Both eyes blinking
  • LookUp - Blendshape used when looking up-- use this to tweak eye/iris/lid/eyebrow positioning
  • LookDown - Blendshape used when looking down-- use this similarly to LookUp

You can set LookUp and LookDown to -none- if you don't want to use them.
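To make the relationship between the three blendshapes concrete, here is a small conceptual sketch. The function name, the linear falloff, and the default rotation limits are all our assumptions for illustration; this is not engine code, and the actual blending VRChat performs may differ.

```python
def eyelid_weights(pitch_deg, up_limit=25.0, down_limit=25.0, blink=0.0):
    """Return illustrative weights for the Blink / LookUp / LookDown blendshapes.

    pitch_deg: current look pitch, positive = looking up, negative = looking down.
    up_limit / down_limit: the rotation limits (in degrees) set in the descriptor.
    blink: 0.0 (eyes open) to 1.0 (fully blinking); Blink drives both eyes at once.
    """
    # LookUp ramps in as the gaze approaches the upper rotation limit...
    look_up = max(0.0, min(1.0, pitch_deg / up_limit))
    # ...and LookDown ramps in symmetrically toward the lower limit.
    look_down = max(0.0, min(1.0, -pitch_deg / down_limit))
    return {"Blink": blink, "LookUp": look_up, "LookDown": look_down}

print(eyelid_weights(12.5))    # halfway to the upper limit
print(eyelid_weights(-25.0))   # fully at the lower limit
```

If you set LookUp and LookDown to -none-, only the Blink weight applies and the eyelids stay put as the gaze moves.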

In addition, you'll notice two sliders-- one goes from Calm to Excited, and the other goes from Shy to Confident. Calm / Excited affects how often you blink. Shy / Confident affects how often you look at other players, and how long your gaze remains on other players' faces before you look away.

You'll learn more about this when we talk about state behaviors, but you can configure states in your animator to disable eye animations when you reach that state. That's right-- you no longer have to worry about your blendshapes being overdriven because your "happy" mood closes your eyes while your blinking is still firing off. 🙌

Blendshape / Bone-based Visemes

If you just want to stick with the standard jaw-flap bone or blendshape-based visemes, we've got you covered. Both are still present and work just fine.

In addition, you can now configure the angle of the jaw-flap bone viseme for some additional customization!

However, in Avatars 3.0 you can also access an animator parameter that indicates which viseme should currently be playing! This means that if you can animate it, you can use it in a viseme. No more trickery for 2D mouths, robots, whatever-- you can just animate whatever you like for your visemes.

The Viseme animator parameter is updated in all viseme modes.
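As a sketch of how you might interpret that integer parameter, the list below follows the standard Oculus Lipsync viseme set that blendshape visemes are based on. Treat the exact ordering as an assumption and check the SDK for the authoritative list; the helper function is purely illustrative.

```python
# Assumed viseme order (Oculus Lipsync convention); verify against the SDK.
VISEMES = ["sil", "PP", "FF", "TH", "DD", "kk", "CH", "SS",
           "nn", "RR", "aa", "E", "ih", "oh", "ou"]

def viseme_name(viseme_param):
    """Map the integer 'Viseme' animator parameter to a viseme name."""
    if 0 <= viseme_param < len(VISEMES):
        return VISEMES[viseme_param]
    raise ValueError(f"unexpected viseme index: {viseme_param}")

print(viseme_name(0))   # "sil" -- silence, i.e. not speaking
print(viseme_name(10))  # "aa"
```

In an animator, you would branch on this parameter with transitions (or a blend tree) to play whatever animation you like per viseme.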

Proxy Animations

You'll probably notice that the SDK includes a bunch of animations named proxy_animationName. These animations are "placeholders" for a variety of default VRChat animations. If you use an animation that starts with proxy_, VRChat will attempt to replace it with a built-in animation. This can be done in any playable layer.

Although we will not replace an animation with a proxy_ prefix if the suffix does not match one of our built-in animations, it is probably best practice to avoid naming any of your animations with the prefix proxy_.
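The replacement rule described above can be sketched as a simple name match. The built-in animation names listed here are placeholders of our own invention, not the SDK's actual list; only the prefix-plus-known-suffix logic reflects the behavior described in this section.

```python
# Placeholder set standing in for VRChat's built-in animation names (assumed).
BUILT_IN = {"stand_still", "run_forward", "fall_long"}

def resolve_clip(clip_name):
    """Return the built-in clip VRChat would substitute, or the clip itself."""
    prefix = "proxy_"
    if clip_name.startswith(prefix):
        suffix = clip_name[len(prefix):]
        if suffix in BUILT_IN:
            # Prefix present and suffix matches a built-in: replaced at runtime.
            return f"<built-in:{suffix}>"
    # No prefix, or an unrecognized suffix: the authored clip is used as-is.
    return clip_name

print(resolve_clip("proxy_run_forward"))  # substituted with the built-in clip
print(resolve_clip("proxy_my_dance"))     # unknown suffix: used as authored
print(resolve_clip("wave"))               # no prefix: used as authored
```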

Use Auto Footstep

This is an option in the AV3 Avatar Descriptor. It is on by default.

"Use Auto Footstep" only applies to 3-point or 4-point tracking. Turning it off means you're disabling the procedural lower body animation for room-scale movement. This procedural animation is what plays when you move around in room-space while in 3 or 4-point tracking.

Leaving Auto Footsteps on (which is the default state) will still allow you to enable/disable tracking via the Tracking Control state behavior.

If Auto Footsteps is off, enabling/disabling tracking on your legs and hips won't do anything, and you're relying on your animations to drive your lower body at all times.

Force Locomotion Animations

This is an option in the AV3 Avatar Descriptor. It is on by default.

"Force Locomotion Animations" only applies to 6-point tracking (Full-Body Tracking, or FBT). When it is on, locomoting in FBT (as in, moving with your joysticks) will play a walking/running animation as determined by your Base playable layer.

When "Locomotion Animations" is off, locomoting in FBT will NOT play the walking/running animation. This is useful if you wish to "mime" your walking with your full-body tracking movement. If you are turning off "Locomotion Animations", do not use the default Base and Additive layers. You're expected to make your own!

Write Defaults on States

Write Defaults is an option available on states in Animators in Unity. The documentation Unity provides is sparse, and the actual behavior of this option is a bit cryptic: when Write Defaults is enabled, a state also writes the default values of any animatable properties that its own motion does not animate. This can cause some very strange interactions if you don't plan for it.

Mixing states with different Write Defaults settings causes some significant issues. Normally, when you're working with other game developers on a project, you agree on a standard. In VRChat, we're all game developers, so let's set the standard here.

VRChat does not use "Write Defaults" in our built-in and example animators. This means that only the actual properties that are in animations get played by any one animation node. We recommend that creators also follow this workflow, as it is easier to keep track of what properties will be animated through any specific layer.

The Write Defaults value defaults to on when you create a new state, so creators must be aware that they will have to uncheck it. If you do want to use Write Defaults, you will have to keep track of all the possible properties that may be written by any state with it enabled.

We recommend keeping Write Defaults off and explicitly animating any parameter that needs to be set by the animation. Note that this may require adding "reset" animations or adding properties to the animation to "initialize" transforms in a specific orientation.
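To see why the "reset" animations matter, here is a small conceptual model of the behavior (our simplification, not Unity source). With Write Defaults off, a property a clip doesn't animate keeps whatever value the last state left it at; with it on, un-animated properties snap back to their defaults.

```python
# Assumed property set and defaults, for illustration only.
DEFAULTS = {"Smile": 0.0, "Blink": 0.0}

def evaluate_state(current_values, clip, write_defaults):
    """Model one animator state's output for a single frame.

    clip: dict of property -> value animated by this state's motion.
    """
    result = dict(current_values)
    result.update(clip)  # properties the clip animates are always written
    if write_defaults:
        # Write Defaults on: properties NOT in the clip reset to defaults.
        for prop, default in DEFAULTS.items():
            if prop not in clip:
                result[prop] = default
    return result

# A "happy" state that animates only Smile:
happy_clip = {"Smile": 1.0}

# Suppose a previous state left Blink at 1.0 (eyes closed).
values = {"Smile": 0.0, "Blink": 1.0}

# WD off: Blink stays stuck at 1.0 unless some clip explicitly resets it.
print(evaluate_state(values, happy_clip, write_defaults=False))

# WD on: Blink snaps back to its default of 0.0 automatically.
print(evaluate_state(values, happy_clip, write_defaults=True))
```

With Write Defaults off, the fix for the "stuck" Blink is exactly what the paragraph above recommends: add Blink at 0 to the happy clip (or a reset animation) so the value is written explicitly.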
