Unity Knowledge Required
This document is written with the assumption that you know a bit about Unity Animators.
This is a list of parameters (case-sensitive) that can be added to any Playable Layer (animation controller). Their values are shared across all Playable Layers that include them. User-created parameters that are not in this list exist only locally within that animation controller and are not currently changeable by the avatar.
You'll need to add these to your Playable Layer animators to use them. They are case-sensitive!
You should assume that parameter values may change. If you "dead-end" your animators-- as in, some branch has no "exit"-- you may end up with a broken avatar.
| Name | Description | Type | Sync |
| --- | --- | --- | --- |
| `IsLocal` | True if the avatar is being worn locally, false otherwise | Bool | None |
| `Viseme` | Oculus viseme index (0-14) | Int | Speech |
| `GestureLeft` | Gesture from L hand control (0-7) | Int | IK |
| `GestureRight` | Gesture from R hand control (0-7) | Int | IK |
| `GestureLeftWeight` | Analog trigger L (0.0-1.0)† | Float | IK |
| `GestureRightWeight` | Analog trigger R (0.0-1.0)† | Float | IK |
| `AngularY` | Angular velocity on the Y axis | Float | IK |
| `VelocityX` | Lateral move speed in m/s | Float | IK |
| `VelocityY` | Vertical move speed in m/s | Float | IK |
| `VelocityZ` | Forward move speed in m/s | Float | IK |
| `Upright` | How "upright" you are; 0 is prone, 1 is standing straight up | Float | IK |
| `Grounded` | True if the player is touching the ground | Bool | IK |
| `Seated` | True if the player is in a station | Bool | IK |
| `AFK` | Is the player unavailable (HMD proximity sensor / End key) | Bool | IK |
| `Expression1` - `Expression16` | User-defined param, Int (0-255) or Float (-1.0 to 1.0) | Int / Float | IK or Playable |
| `TrackingType` | See description below | Int | Playable |
`Supine` and `GroundProximity` are visible in the Debug display, but are not implemented yet. They currently do nothing and never change values.
† GestureLeftWeight and GestureRightWeight go from 0.0 to 1.0 in various gestures depending on the trigger pull. For example, if you make a fist but don't pull the trigger on the left hand, GestureLeft will be 1, but GestureLeftWeight will be 0.0. When you start pulling the trigger, it will climb from 0.0 towards 1.0. This can be used to create "analog" gestures or conditionally detect various things.
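The "analog gesture" idea can be sketched outside Unity as a small Python helper. This is illustrative only, not VRChat API; it assumes the standard mapping where gesture index 1 is Fist, and the function name is made up:

```python
# Illustrative sketch, not VRChat API: combining GestureLeft with
# GestureLeftWeight to build an "analog" gesture value.
# Assumes gesture index 1 is Fist (the standard mapping).
FIST = 1

def fist_blend(gesture_left, gesture_left_weight):
    """Return a 0.0-1.0 blend value: trigger pull only counts while fisting."""
    if gesture_left != FIST:
        return 0.0
    # Clamp, since analog triggers report 0.0-1.0
    return max(0.0, min(1.0, gesture_left_weight))

print(fist_blend(1, 0.0))   # fist held, trigger untouched -> 0.0
print(fist_blend(1, 0.75))  # fist held, trigger partly pulled -> 0.75
print(fist_blend(2, 0.5))   # open hand -> weight ignored, 0.0
```

In an actual avatar you'd express the same gating with a blend tree driven by `GestureLeftWeight`, entered only when `GestureLeft` equals 1.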
You have access to three types of variable when defining your parameters in your Parameters object.

You can use up to a total of 128 bits of "memory". This isn't strictly memory in the sense of the avatar's memory usage, but relates to the bandwidth you use when syncing parameters.

- **Int**: unsigned 8-bit int (0 to 255), costing 8 bits.
- **Float**: signed 8-bit minifloat (-1.0 to 1.0), costing 8 bits.
- **Bool**: true or false, costing 1 bit.
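As a rough illustration of the 128-bit budget, here's a small Python sketch. It assumes Ints and Floats cost 8 bits each and Bools cost 1 bit; the parameter names are made up:

```python
# Hypothetical sync-memory calculator: Ints and Floats cost 8 bits,
# Bools cost 1 bit, against the 128-bit total budget.
BIT_COST = {"Int": 8, "Float": 8, "Bool": 1}
BUDGET = 128

def bits_used(params):
    """Sum the sync cost of a list of (name, type) parameter pairs."""
    return sum(BIT_COST[ptype] for _name, ptype in params)

# Made-up example parameter set
params = [("OutfitIndex", "Int"), ("BlushAmount", "Float"), ("HatToggle", "Bool")]
used = bits_used(params)
print(used)           # 17
print(BUDGET - used)  # 111 bits left for other synced parameters
```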
GestureLeft and GestureRight use these as their values:

- 0 = Neutral
- 1 = Fist
- 2 = HandOpen
- 3 = FingerPoint
- 4 = Victory
- 5 = RockNRoll
- 6 = HandGun
- 7 = ThumbsUp
We use the Oculus viseme index, top to bottom, where `sil` is 0 and the index runs from 0 to 14.
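For reference, the standard Oculus Lipsync viseme order (this list comes from Oculus Lipsync itself) can be written as a simple lookup:

```python
# Oculus Lipsync viseme order, top to bottom; index 0 is "sil".
VISEMES = ["sil", "PP", "FF", "TH", "DD", "kk", "CH", "SS",
           "nn", "RR", "aa", "E", "ih", "oh", "ou"]

def viseme_name(index):
    """Map the Viseme parameter's Int value to its Oculus viseme name."""
    return VISEMES[index]

print(viseme_name(0))   # sil
print(viseme_name(14))  # ou
print(len(VISEMES))     # 15 visemes, indices 0-14
```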
- Speech - Only used for visemes; driven by the Oculus Lipsync output parameters depending on your speech. Updated locally, not directly synced (because it's driven by audio).
- Playable - A slower sync meant to synchronize longer-running animation states. Updates every 0.2 to 1 seconds, but you shouldn't rely on it for fast sync.
- IK - Syncs much faster, with the exact same frequency as Networked IK. Depending on the parameter, this may also just be calculated based on the avatar's locally rendered IK state.
When an Expression parameter is in use in a Puppet menu, it automatically swaps from Playable to IK sync so you get the faster sync.
In addition, Expression parameters can be "driven" to a value via State Behaviors. They can be set using the Avatar Parameter Driver State Behavior on a state in an animator.
The AFK state is triggered by:
- The user removing the headset, causing the HMD proximity sensor to report that the headset is not being worn
- A system menu being open. Whether this registers depends on how your platform delivers data while system menus are up-- for example, the Oculus Dash doesn't register as AFK, but SteamVR's menu does. This is a knock-on effect rather than designed behavior.
- The user has pressed the End key, toggling the AFK state.
TrackingType indicates a few pieces of information.
If the value is 3, 4, or 6 while `VRMode` is 1, the value indicates how many tracked points the wearer of the avatar has enabled and is currently tracking. This value can change! If a user in 6-point tracking removes their extra three points of tracking, they will go from a value of 6 to a value of 3. Take this into account when you design your animator.
If the value is 0, 1, or 2 while `VRMode` is 1, the avatar is still initializing. You should not design animators to branch based on this combination of values; instead, wait for a "valid" value of 3, 4, or 6.
Account for Changes
During avatar initialization, this value may change! Ensure that your animator accounts for possible changes, and that it doesn't "dead-end" into any branch.
| Value | Meaning |
| --- | --- |
| 0 | Uninitialized. Usually only occurs when the user is switching avatars and their IK isn't sending yet. |
| 1 | Generic rig. The user might have tracking of any kind on, but the avatar is rigged as Generic, so tracking is ignored. Might be a desktop user if `VRMode` is 0. |
| 2 | Hands-only tracking with no fingers. This will only occur in states that are transitions-- as in, you should expect it to change shortly. Only occurs with AV2, and therefore isn't a state you should expect to be in for very long for AV3 controllers on avatars. May still occur with SDK3 stations. |
| 3 | Head and hands tracking. If `VRMode` is 1, this is a 3-point VR user. If `VRMode` is 0, this is a desktop user with a humanoid avatar. |
| 4 | 4-point VR user. Head, hands, and hip. |
| 6 | Full Body Tracking VR user. Head, hands, hip, and feet tracked. |
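Putting the combinations together, here's an illustrative Python helper (not VRChat API) that treats 0-2 while `VRMode` is 1 as "still initializing", as advised above:

```python
# Illustrative interpretation of TrackingType alongside VRMode.
# Not VRChat API; follows the value meanings described above.
VALID_VR_POINTS = {3: "3-point", 4: "4-point", 6: "full-body"}

def tracking_state(tracking_type, vr_mode):
    """Classify a (TrackingType, VRMode) pair."""
    if vr_mode == 1:
        if tracking_type in VALID_VR_POINTS:
            return VALID_VR_POINTS[tracking_type]
        return "initializing"  # 0, 1, or 2: don't branch animator logic here
    return "desktop-or-generic"

print(tracking_state(6, 1))  # full-body
print(tracking_state(3, 1))  # 3-point (e.g. after FBT points are removed)
print(tracking_state(2, 1))  # initializing
```

Because a user can drop from 6 to 3 mid-session, an animator should re-evaluate this value continuously rather than latching onto the first one it sees.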
You must create names (or "aliases") for Expression parameters. You cannot (and shouldn't!) use the default `ExpressionN` name for the parameters.
Once you have created names for any Expression parameter you want to use, you can use that name directly in your Controller. This means you can come up with your own standard naming for your parameters. It also means that Menu definitions and Controllers can be mixed and matched as long as they use the same names. You can get prefab controllers from others and create your own menu styles based on your preferences, without worrying about Expression parameter conflicts.
There are a few "defaults" in use by the template AV3 VRChat controllers that you can use if you don't want to build out your own controllers. These won't collide with your own use (as long as you don't name them the same thing) thanks to aliasing.
In particular, the Default Action and FX layers use aliasing. You don't need to worry about using an Expression that is in these layers.
Actions use an aliased Int parameter named `VRCEmote`, which has a range of 1 to 16.
FX uses aliased Float parameters called `VRCFaceBlendH` (-1,1) and `VRCFaceBlendV` (-1,1), if you want to try out your own menus to use them. The default FX layer requires that you have a skinned mesh with appropriately named mood blendshapes, including `mood_suprised`.
To restate: if you upload an avatar as an Avatar3 avatar without any custom Playable layers, you'll be able to use some built-in emotes with it as long as it has the above-named blendshapes.
If you also have an `eyes_closed` blendshape, it'll close the avatar's eyes when you use the default Die emote or go AFK.