Unity Knowledge Required
This document is written with the assumption that you know a bit about Unity Animators.
This is a list of parameters (case-sensitive) that can be added to any Playable Layer (animation controller). Their values change across all Playable Layers that include the parameter. User-created parameters that are not in this list exist only locally within that animation controller and are not currently changeable by the avatar.
You'll need to add these to your Playable Layer animators to use them. They are case-sensitive!
You should assume that parameter values may change. If you "dead-end" your animators-- as in, you don't have an "exit" in any particular branch-- you may end up with a broken avatar.
| Name | Description | Type | Sync |
| --- | --- | --- | --- |
| IsLocal | True if the avatar is being worn locally, false otherwise | Bool | None |
| Viseme | Oculus viseme index (0-14) | Int | Speech |
| Voice | Microphone volume (0.0-1.0) | Float | Speech |
| GestureLeft | Gesture from L hand control (0-7) | Int | IK |
| GestureRight | Gesture from R hand control (0-7) | Int | IK |
| GestureLeftWeight | Analog trigger L (0.0-1.0)† | Float | Playable |
| GestureRightWeight | Analog trigger R (0.0-1.0)† | Float | Playable |
| AngularY | Angular velocity on the Y axis | Float | IK |
| VelocityX | Lateral move speed in m/s | Float | IK |
| VelocityY | Vertical move speed in m/s | Float | IK |
| VelocityZ | Forward move speed in m/s | Float | IK |
| Upright | How "upright" you are. 0 is prone, 1 is standing straight up | Float | IK |
| Grounded | True if the player is touching the ground | Bool | IK |
| Seated | True if the player is in a station | Bool | IK |
| AFK | Is the player unavailable (HMD proximity sensor / End key) | Bool | IK |
| Expression1 - Expression16 | User-defined parameter, Int (0-255) or Float (-1.0-1.0) | Int / Float | IK or Playable |
| TrackingType | See description below | Int | Playable |
"Supine" and "GroundProximity" are visible in the Debug display, but are not implemented yet. They currently do nothing and never change values.
† GestureLeftWeight and GestureRightWeight go from 0.0 to 1.0 in various gestures depending on the trigger pull. For example, if you make a fist but don't pull the trigger on the left hand, GestureLeft will be 1, but GestureLeftWeight will be 0.0. When you start pulling the trigger, it will climb from 0.0 towards 1.0. This can be used to create "analog" gestures or conditionally detect various things.
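The conditional detection described above can be sketched as a small check. This is illustrative Python pseudocode, not VRChat API: the Fist gesture index of 1 is the standard gesture value, but the 0.5 threshold is an arbitrary choice.

```python
FIST = 1  # GestureLeft/GestureRight value for the Fist gesture

def analog_fist(gesture: int, weight: float, threshold: float = 0.5) -> bool:
    """True once a fist is made AND the trigger is pulled past `threshold`.

    `threshold` is an illustrative value, not anything VRChat-defined.
    """
    return gesture == FIST and weight > threshold
```

With this sketch, `analog_fist(1, 0.0)` is `False` (fist formed, trigger untouched), while `analog_fist(1, 0.8)` is `True`.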
You have access to three types of variable when defining your parameters in your Parameters object.
You can use up to a total of 256 bits of "memory". This isn't strictly memory in the sense of the avatar's memory usage; rather, it relates to the network bandwidth used when syncing parameters.
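As an illustration of that budget, here is a hedged sketch that totals the sync cost of a hypothetical parameter list, assuming the per-type bit costs from the table below (the parameter names are made up):

```python
# Assumed sync cost in bits per parameter type (see the table below).
BIT_COST = {"Int": 8, "Float": 8, "Bool": 1}
MEMORY_BUDGET_BITS = 256

def used_bits(params):
    """Total sync-memory cost of a list of (name, type) parameters."""
    return sum(BIT_COST[ptype] for _name, ptype in params)

# Hypothetical parameter list
params = [("Toggles/Hat", "Bool"), ("OutfitIndex", "Int"), ("TailWag", "Float")]
assert used_bits(params) <= MEMORY_BUDGET_BITS  # 17 of 256 bits used
```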
| Parameter Type | Range | Memory Usage | Notes |
| --- | --- | --- | --- |
| Int | 0-255 | 8 bits | Unsigned 8-bit int. |
| Float | -1.0 to 1.0 | 8 bits | Signed 8-bit fixed-point decimal†. |
| Bool | True/False | 1 bit | |
† Remotely synced float values have 255 possible values, giving a precision of 1/127 over the network, and can store 1.0 precisely. When updated locally, such as with OSC, float values are stored as native (32-bit) floating-point values in animators.
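A rough sketch of that network quantization: the exact encoding is an assumption here, and only the 1/127 step and the exactly-representable 1.0 come from the text above.

```python
def quantize_synced_float(value: float) -> float:
    """Approximate what a remotely synced float becomes: a signed
    8-bit fixed-point value with a step of 1/127, so 1.0 and -1.0
    are representable exactly. (Assumed encoding, for illustration.)"""
    value = max(-1.0, min(1.0, value))  # clamp to the synced range
    return round(value * 127) / 127
```

Under this sketch, `quantize_synced_float(1.0)` returns exactly `1.0`, and any in-range value lands within half a step (1/254) of its original.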
GestureLeft and GestureRight use these as their values:

| Value | Gesture |
| --- | --- |
| 0 | Neutral |
| 1 | Fist |
| 2 | HandOpen |
| 3 | FingerPoint |
| 4 | Victory |
| 5 | RockNRoll |
| 6 | HandGun |
| 7 | ThumbsUp |

For visemes, we use the Oculus viseme index, top to bottom, where sil is 0. For reference: sil (0), PP, FF, TH, DD, kk, CH, SS, nn, RR, aa, E, ih, oh, ou (14).
- Speech - Only used for visemes. Driven by the Oculus Lipsync output parameters depending on your speech. Updated locally, not directly synced (because it's driven by audio).
- Playable - A slower sync mode meant to synchronize longer-running animation states. Updates every 0.1 to 1 seconds as needed based on parameter changes (1 to 10 updates per second), but you shouldn't rely on it for fast sync.
- IK - A faster sync mode meant to synchronize frequently-changing values. Updates continuously every 0.1 seconds (10 updates per second) and interpolates float values locally for remote users. Depending on the parameter, this may also be calculated from the avatar's locally rendered IK state.
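The IK interpolation for remote users can be pictured as a simple lerp between 10 Hz updates. This is a sketch under the assumption of linear interpolation; VRChat's actual smoothing algorithm isn't specified here.

```python
def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation between a and b by fraction t."""
    return a + (b - a) * t

def remote_float(prev: float, curr: float, elapsed_s: float,
                 interval_s: float = 0.1) -> float:
    """Estimate what a remote viewer sees between two IK-sync updates,
    assuming linear interpolation over the 0.1 s update interval."""
    t = min(elapsed_s / interval_s, 1.0)
    return lerp(prev, curr, t)
```

Halfway between two updates (`elapsed_s=0.05`), a value stepping from 0.0 to 1.0 would display as 0.5 to remote users.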
When an Expression Parameter is in-use in a Puppet menu, it automatically swaps from Playable to IK sync so you get the continuous update rate and smooth interpolation. When the menu is closed, it returns to Playable sync.
In addition, Expression parameters can be "driven" to a value via State Behaviors. They can be set using the Avatar Parameter Driver State Behavior on a state in an animator.
The AFK state is triggered by:
- The user removing the headset, causing the HMD proximity sensor to report that the headset is not being worn
- A system menu being open. This depends on how the platform you're using delivers data when system menus are up-- for example, the Oculus Dash doesn't register as AFK, but SteamVR's menu does. This is a side effect, not a designed behavior.
- The user has pressed the End key, toggling the AFK state.
TrackingType indicates a few pieces of information.
If the value is 3, 4, or 6 while VRMode is 1, the value indicates how many tracked points the avatar's wearer has enabled and currently tracked. This value can change! If a user in 6-point tracking removes their three extra points of tracking, they will go from a value of 6 to a value of 3. Take this into account when you design your animator.
If the value is 0, 1, or 2 while VRMode is 1, the avatar is still initializing. You should not design animators to branch based on this combination of values; instead, wait for a "valid" value of 3, 4, or 6.
Account for Changes
During avatar initialization, this value may change! Ensure that your animator accounts for possible changes, and that it doesn't "dead-end" into any branch.
| Value | Meaning |
| --- | --- |
| 0 | Uninitialized. Usually only occurs when the user is switching avatars and their IK isn't sending yet. |
| 1 | Generic rig. The user might have tracking of any kind on, but the avatar is rigged as Generic, so tracking is ignored. Might be a desktop user if VRMode is 0. |
| 2 | Hands-only tracking with no fingers. This will only occur in states that are transitions-- as in, you should expect it to be transient. Only occurs with AV2, and therefore isn't a state you should expect to be in for very long for AV3 controllers on avatars. May still occur with SDK3 stations. |
| 3 | Head and hands tracking. If VRMode is 1, this is a standard VR user with two controllers. If VRMode is 0, this is a desktop user. |
| 4 | 4-point VR user. Head, hands, and hip. |
| 6 | Full-body tracking VR user. Head, hands, hip, and feet tracked. |
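Putting the value table and the initialization caveat together, a branch on TrackingType might look like this sketch (the string labels are illustrative, not VRChat names):

```python
def tracking_state(tracking_type: int, vr_mode: int) -> str:
    """Interpret TrackingType alongside VRMode (illustrative labels)."""
    if vr_mode == 1 and tracking_type in (0, 1, 2):
        return "initializing"  # wait for a valid value of 3, 4, or 6
    labels = {
        0: "uninitialized",
        1: "generic-rig",
        2: "hands-only",
        3: "3-point",   # head and hands
        4: "4-point",   # head, hands, hip
        6: "6-point",   # full-body tracking
    }
    return labels.get(tracking_type, "unknown")
```

Note how `tracking_state(1, 1)` reports `"initializing"` rather than `"generic-rig"`, matching the advice above to wait rather than branch on 0-2 in VR.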
You must create names (or "aliases") for Expression parameters. You cannot (and shouldn't!) use the default ExpressionN name for the parameters.
Once you have created names for any Expression parameter you want to use, you can use that name directly in your Controller. This means you can come up with your own standard naming for your parameters. It also means that Menu definitions and Controllers can be mixed and matched as long as they use the same names. You can get prefab controllers from others and create your own menu styles based on your preferences, without worrying about Expression parameter conflicts.
When naming your own parameters, using forward slashes (/) will cause parameters to organize automatically in various selection dropdowns. For example, naming a parameter Toggles/Hat will make the menu selection show up as Toggles -> Hat when selecting parameters for things like Animator transitions and Expression Menus, while keeping the underlying parameter name the same. This doesn't change how parameters behave; it just makes large parameter lists easier to work with.
There are a few "defaults" in use by the template AV3 VRChat controllers that you can use if you don't want to build out your own controllers. These won't collide with your own use (as long as you don't name your parameters the same thing) thanks to aliasing.
In particular, the default Action and FX layers use aliasing. You don't need to worry about using an Expression parameter that is in these layers.
Actions use an aliased parameter named VRCEmote, which is an Int with a range of 1 to 16.
FX uses aliased Float parameters called VRCFaceBlendH (-1,1) and VRCFaceBlendV (-1,1), if you want to try out your own menus to use them. The default FX layer requires that your avatar has a skinned mesh named Body with blendshapes named mood_happy, mood_sad, mood_suprised, and mood_angry.
To restate: if you upload an avatar as an Avatars 3.0 avatar without any custom Playable Layers, you'll be able to use some built-in emotes with it as long as you've got the above-named blendshapes.
If you also have an eyes_closed blendshape, it'll close your avatar's eyes when you use the default Die emote or go AFK.
When using an avatar that has both Quest and PC versions uploaded, parameters are synced by their position in the parameters list and their parameter type, not by the names of the parameters. For a given parameter to sync between PC and Quest, it has to be in the same position in the parameter list, and have the same parameter type.
Given this, it can be a good idea to use the same Expression Parameters asset for both the PC and Quest versions of an avatar, even if one version doesn't make use of all the parameters.
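A quick way to sanity-check that two parameter lists will sync across platforms is to compare them by position and type, as described above. This is a sketch; the (name, type) tuple format is made up for illustration.

```python
def sync_mismatches(pc_params, quest_params):
    """Report positions where the PC and Quest parameter lists disagree.
    Sync pairs parameters by list position and type, never by name."""
    issues = []
    if len(pc_params) != len(quest_params):
        issues.append("length mismatch")
    for i, ((_, pc_t), (_, q_t)) in enumerate(zip(pc_params, quest_params)):
        if pc_t != q_t:
            issues.append(f"position {i}: {pc_t} vs {q_t}")
    return issues

# Hypothetical lists: same names, but position 1 differs in type,
# so "Outfit" would not sync between the two uploads.
pc    = [("Hat", "Bool"), ("Outfit", "Int")]
quest = [("Hat", "Bool"), ("Outfit", "Float")]
```

Running `sync_mismatches(pc, quest)` on the example lists flags position 1, while identical lists return no issues.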