This is an Avatars 3.0 feature request.
As an avatar creator, I would like to be able to measure one or more distances, each between two transforms of my choosing.
Those distances would be calculated each frame and would be exposed to the playable layer animators.
Optionally, the measured distance in Unity units could be inverse-lerped into a normalized distance between 0.0 and 1.0, based on a minimum and maximum distance that I would set in advance for that measurement.
Additionally, if at least one of the two transforms is disabled, then the measured distance is always 0.0.
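To make the two rules above concrete, here is a minimal sketch of the intended behavior in Python. Everything in it is a hypothetical illustration of the request, not an existing VRChat or Unity API; Transform and measure_distance are made-up names.

```python
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class Transform:
    # Hypothetical stand-in for a transform: a world position plus
    # whether the object it sits on is currently enabled.
    x: float
    y: float
    z: float
    enabled: bool = True

def measure_distance(a: Transform, b: Transform,
                     min_dist: Optional[float] = None,
                     max_dist: Optional[float] = None) -> float:
    # If at least one of the two transforms is disabled, the measured
    # distance is always 0.0, as requested above.
    if not (a.enabled and b.enabled):
        return 0.0
    d = math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))  # Unity units
    if min_dist is None or max_dist is None:
        return d  # no normalization requested: raw distance
    # Inverse lerp: map [min_dist, max_dist] onto [0.0, 1.0], clamped.
    t = (d - min_dist) / (max_dist - min_dist)
    return max(0.0, min(1.0, t))
```

For example, measure_distance(hand, pocket, 0.2, 2.0) would return 0.0 at or below 0.2 units of separation and 1.0 at or beyond 2 units.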
By exposing a way to measure one or more distances on the avatar, I could make:
A: Pocket items. Measure the distance between the hand and a pocket. When the hand makes a fist gesture near that pocket (detected by comparing the measured distance against a threshold), the animator drives a synced boolean that puts an item in that hand, making it look like an object was just picked up from the pocket.
B: Clap hands particle without collider. Measure the distance between the two hands. In the FX layer, add a condition so that whenever the measured distance falls below 0.3 units, a hand clap particle plays.
C: Distance-based emissive material. Measure the distance between the hand and another object on the avatar, with a minimum distance of 0.2 and a maximum distance of 2 (so, for example, a measured distance of 1.1 units normalizes to 0.5). An animation with Normalized Time reads that normalized distance and animates the color of a material, making it increasingly emissive as the hand approaches that object.
D: Velocity measurement. Every tenth of a second, two transforms are alternately snapped to the hand position. By measuring the distance between the two transforms, we can deduce the speed of the hand and play an effect based on it (see the sketch after this list).
E: Scale an object based on distance. In the Gesture layer, an animation with Normalized Time or a Blend Tree reads the normalized distance between the two hands to scale an object linearly.
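For use case D, the deduction is simple arithmetic: since one of the two transforms is snapped to the hand every 0.1 seconds, the distance between them is roughly the distance the hand covered over the last interval, so speed ≈ distance / 0.1. A rough sketch building on the hypothetical measure_distance above (update_markers and hand_speed are likewise made-up names):

```python
INTERVAL = 0.1  # seconds between transform updates

def update_markers(markers, hand_position, tick):
    # Every tenth of a second, snap one of the two transforms to the
    # current hand position, alternating between them.
    m = markers[tick % 2]
    m.x, m.y, m.z = hand_position

def hand_speed(markers):
    # The gap between the two markers is the distance the hand moved
    # during the last interval, so speed = distance / time.
    return measure_distance(markers[0], markers[1]) / INTERVAL
```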