VRChat now offers support for receiving eye tracking data (eyelook and eyelid) via OSC.
Please note: This is an advanced feature! It is NOT plug-and-play. You must create your own program to transmit this data to VRChat using OSC.
Hardware manufacturers may provide an application that will send this data for you. You may also use a community-created program.
/tracking/eye/EyesClosedAmount
/tracking/eye/CenterPitchYaw
/tracking/eye/CenterPitchYawDist
/tracking/eye/CenterVec
/tracking/eye/CenterVecFull
/tracking/eye/LeftRightPitchYaw
/tracking/eye/LeftRightVec
Addresses are case-sensitive.
After 10 seconds without receiving data, eye tracking reverts to its default behavior (auto look / auto blink). This timeout is tracked separately for eyelids and eye-look.
/tracking/eye/EyesClosedAmount: A value from 0 to 1 indicating how closed the eyes are. Currently, a single value simultaneously controls the blink of both eyes; support for separate per-eye winking will be added in the future.
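As a sketch of what a sending app does at the wire level, the snippet below hand-encodes an OSC message for this address and sends it to VRChat's default OSC input port (9000 on localhost). The encoding follows the OSC 1.0 spec; a real tracker would typically use an OSC library and stream values continuously, so treat this as illustration only.

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes.
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    # Minimal OSC message: padded address, padded type tag string (",f..."),
    # then big-endian float32 arguments.
    type_tags = "," + "f" * len(floats)
    return (osc_pad(address.encode())
            + osc_pad(type_tags.encode())
            + struct.pack(">%df" % len(floats), *floats))

# Eyes half closed. A real tracker would resend this at its native update rate.
packet = osc_message("/tracking/eye/EyesClosedAmount", 0.5)
socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 9000))
```

The multi-value addresses below work the same way, just with more float arguments in one message.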
Only one eye-look address at a time
In addition to the EyesClosedAmount, you can send data to one of the addresses below depending on the format you'd like to send. You probably only need to implement one (or just a few) of the possible types of eye-look data in your sending app.
For demonstration purposes, and to help you debug coordinate and axis issues, we'll provide example data for each address using the following configuration: an eye target roughly 15 degrees up, 20 degrees to the user's right, and 50 cm away, relative to the user's "eye root" or HMD. The user's IPD is set at 64 mm. See the image below:
In general, positive pitch rotates the gaze down and positive yaw rotates it to the right. Vectors use the standard Unity convention: +x right, +y up, +z forward.
/tracking/eye/CenterPitchYaw: Pitch and yaw values in degrees for a single "center" eye-look direction. Because no distance is defined, this mode always uses an in-world raycast to find the convergence distance. Example data: -15.252, 20.128
/tracking/eye/CenterPitchYawDist: Same as above, but with an added distance value in meters defining the convergence distance. The format is pitch, yaw, distance. Example data: -15.252, 20.128, 0.503
/tracking/eye/CenterVec: "Center" eye x, y, z directional vector, normalized and local to the HMD. Because the vector is normalized, this mode always uses a raycast to find the convergence distance. Example data: 0.332, 0.263, 0.905
/tracking/eye/CenterVecFull: "Center" eye x, y, z directional vector local to the HMD. The length of this vector (in meters) determines the convergence distance. Example data: 0.167, 0.132, 0.456
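Since CenterPitchYaw, CenterVec, and CenterVecFull describe the same gaze in different encodings, converting between them is a useful sanity check for your axis conventions. The sketch below is a hypothetical helper (not VRChat code) that converts the pitch/yaw/distance example into the two vector forms and reproduces the example data above.

```python
import math

def center_vectors(pitch_deg: float, yaw_deg: float, distance_m: float):
    """Convert a center pitch/yaw (degrees, +pitch down, +yaw right) and a
    convergence distance into the normalized CenterVec and length-encoded
    CenterVecFull formats (Unity convention: +x right, +y up, +z forward)."""
    pitch = math.radians(pitch_deg)
    yaw = math.radians(yaw_deg)
    x = math.cos(pitch) * math.sin(yaw)
    y = -math.sin(pitch)              # +pitch rotates the gaze downward
    z = math.cos(pitch) * math.cos(yaw)
    return (x, y, z), (x * distance_m, y * distance_m, z * distance_m)

# CenterPitchYawDist example data in, CenterVec / CenterVecFull data out.
vec, vec_full = center_vectors(-15.252, 20.128, 0.503)
```

Running this yields approximately (0.332, 0.263, 0.906) for CenterVec and (0.167, 0.132, 0.456) for CenterVecFull, matching the example data.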
/tracking/eye/LeftRightPitchYaw: Left pitch, left yaw, right pitch, right yaw, in degrees. Example data: -14.903, 23.592, -15.560, 16.503
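To illustrate where those four angles come from, here is a hypothetical helper (not an official implementation) that derives per-eye pitch and yaw from the example target position and the 64 mm IPD; the results land within about 0.05 degrees of the example data.

```python
import math

def per_eye_pitch_yaw(target, ipd_m=0.064):
    """Return (left_pitch, left_yaw, right_pitch, right_yaw) in degrees for a
    gaze target given in HMD-local meters (+x right, +y up, +z forward).
    +pitch looks down, +yaw looks right; eyes sit at +/- half the IPD on x."""
    half = ipd_m / 2.0
    angles = []
    for eye_x in (-half, +half):      # left eye first, then right
        dx, dy, dz = target[0] - eye_x, target[1], target[2]
        norm = math.sqrt(dx * dx + dy * dy + dz * dz)
        angles.append(-math.degrees(math.asin(dy / norm)))  # pitch
        angles.append(math.degrees(math.atan2(dx, dz)))     # yaw
    return angles

# Target from the example: ~15 deg up, ~20 deg right, ~50 cm away.
lp, ly, rp, ry = per_eye_pitch_yaw((0.167, 0.132, 0.456))
```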
/tracking/eye/LeftRightVec: HMD-local normalized directional vectors for each eye (left x, y, z, then right x, y, z). Example data: 0.387, 0.257, 0.886, 0.274, 0.268, 0.923
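The same geometry yields the per-eye vectors: offset each eye by half the IPD along x, point it at the target, and normalize. A sketch under the same assumptions (example target, 64 mm IPD; the helper is hypothetical, not VRChat code):

```python
import math

def per_eye_vectors(target, ipd_m=0.064):
    """Return normalized [left, right] eye direction vectors, as sent to
    /tracking/eye/LeftRightVec, for a gaze target in HMD-local meters
    (+x right, +y up, +z forward). Eyes sit at +/- half the IPD on x."""
    half = ipd_m / 2.0
    out = []
    for eye_x in (-half, +half):      # left eye first, then right
        dx, dy, dz = target[0] - eye_x, target[1], target[2]
        norm = math.sqrt(dx * dx + dy * dy + dz * dz)
        out.append((dx / norm, dy / norm, dz / norm))
    return out

# Target from the example: ~15 deg up, ~20 deg right, ~50 cm away.
left, right = per_eye_vectors((0.167, 0.132, 0.456))
```

This reproduces the example data above: roughly (0.387, 0.256, 0.886) for the left eye and (0.274, 0.267, 0.924) for the right.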
You can see an example script here that could be used to send eye tracking data from a Unity project. The script assumes your project contains an "eye root" and an "eye target" transform to slot in, and you must also set the user's interpupillary distance. This script isn't meant to be used as-is; it's intended only to help get a prototype project up and running. Please read through the code and use it for educational purposes. It is just one example of a way to send OSC eye tracking data to VRChat.