Synced Float Parameters on Avatars don't sync correctly (off by 0.01)
Eremite
Replication:
Create an animator state that requires precise values in a synced float. In my case this was for a card deck where each frame of the animation is a different card face: the state's speed is set to 0, with my float parameter driving the animation's Motion Time.
Try randomizing this float and syncing it. It works most of the time, but in some cases remote clients see the value as 0.01 higher than the value assigned locally. The behavior is consistent across all remote clients: they all see a value 0.01 higher than mine.
I see that the float is randomized with two decimals of precision (0.00, 0.01, ..., 0.99, 1.00). I'm not sure why this is consistently failing, but my theory is that a rounding error happens like this:
The Parameter Driver picks a random value (0.0-1.0) at the animator's full float precision (e.g. 0.777777). For networking purposes, that is rounded to two decimals of precision (e.g. 0.78). My local parameter then animates based on the true value while remote clients animate based on the rounded, synced value.
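This theory can be sketched in a few lines of Python. The 0.01 step and the `quantize` helper are my guesses at what the network layer does, not actual VRChat code:

```python
def quantize(value: float, step: float) -> float:
    """Round a value to the nearest multiple of step."""
    return round(value / step) * step

local_value = 0.777777                      # raw value the Parameter Driver picked
remote_value = quantize(local_value, 0.01)  # hypothetical 2-decimal sync -> ~0.78

# My client animates on 0.777777, remote clients on 0.78:
# the two sides land on different card faces.
```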
If that is the case, can we ensure that the float value assigned to the local animator is also the rounded value so that we get consistently synced values?
If that's not the case, then I dunno. :P
StormRel marked this post as tracked
Hackebein
Eremite it's not rounded to 2 decimals of precision. Remotely synced floats are 8-bit values with 255 possible steps, giving a network step size of 1/127 (~0.007874), and they can store -1.0, 0.0, and 1.0 exactly.
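A rough sketch of an encoding with those properties, assuming 255 codes mapped linearly onto [-1.0, 1.0] so that -1.0, 0.0, and 1.0 round-trip exactly. This is one plausible scheme consistent with the numbers above, not VRChat's actual wire format:

```python
STEP = 1.0 / 127.0  # 255 codes spread over [-1.0, 1.0]

def encode(value: float) -> int:
    """Clamp to [-1, 1] and map to a signed code in [-127, 127]."""
    clamped = max(-1.0, min(1.0, value))
    return round(clamped * 127)

def decode(code: int) -> float:
    return code * STEP

# A value like 0.777777 survives the round trip only to the nearest 1/127:
recovered = decode(encode(0.777777))  # 99/127, roughly 0.7795
```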
Eremite
At any rate, the value set locally via a Parameter Driver should animate locally the same way it is synced over the network. The only thing affecting its value is the VRC Parameter Driver script in my animator; if the network can only store 255 steps, then the driver should set the local value in those same 255 steps.
Eremite
Of note, I tried adding a new Int to the animator and converting the float to an int by mapping the 0.0-1.0 range onto 0-100, then converting that int back to a float the same way.
When doing this, an int of 30 (range 0-100) converted to a float value of 0.31 (range 0.0-1.0). Remote clients now see 0.30 while I see 0.31, effectively reversing the problem.
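Checking the arithmetic on that round trip. Both helpers here are my assumptions about what the driver and the network do, not their actual implementations:

```python
def int_to_float(i: int, lo: int = 0, hi: int = 100) -> float:
    """Linear remap of an int range onto 0.0-1.0 (assumed driver behavior)."""
    return (i - lo) / (hi - lo)

def network_round_trip(value: float) -> float:
    """Snap to the assumed 8-bit grid of 1/127 steps."""
    return round(value * 127) / 127

as_float = int_to_float(30)            # exactly 0.30, not the 0.31 I observe
synced = network_round_trip(as_float)  # 38/127, roughly 0.2992, shown as 0.30
```

A plain linear remap gives exactly 0.30, so wherever the local 0.31 comes from, it isn't a simple linear conversion.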