Avatar Bugs & Feature Requests

Post about current Avatar bugs and Avatar Feature Requests. One item per post!
Non-constructive and off-topic posts will be moved or deleted.
Graphics.Blit Scripts for avatars
(Edit: Added "for avatars" to the post title. VRCGraphics.Blit is available for worlds, but not avatars.) The Shader Community's Request for Graphics.Blit What is Graphics.Blit? Graphics.Blit allows you to copy a source texture into a destination render texture using a shader. Being able to do this allows for shaders to write to render textures in a manner that does not require cameras ---- Why do we want Graphics.Blit? Currently shader creators are using cameras and render textures to emulate what Graphics.Blit does. Cameras were not made for this purpose and have a relatively high amount of overhead. On top of this, in order to effectively mimic Graphics.Blit we need two cameras to achieve the same effect. This doesn’t mean Graphics.Blit replaces cameras though, it allows for most use cases to run much quicker. ---- What uses this camera workaround currently? GPU particles . Anyone who's experimented with GPU particles knows how much faster these particles run compared to Unity's built in particle system. GPU particles in their current form can run a million particles in 1.43ms on the GPU. That's amazing already, but because we have to use cameras, ~0.7ms of CPU time(camera0 and camera1) is spent on updating the cameras. Graphics.Blit can bring the CPU time from ~0.7ms to ~0.07ms. GPU particles are not the only shader which will perform better using Graphics.Blit. Any shader which has uses a 2 camera and render texture setup will be faster using Graphics.Blit. Other use potential cases for Graphics.Blit * GPU dynamic bones * Cheap Gaussian blur * Fluid Simulation * Grass/Hair Simulation * Procedurally generating Data * Virtual Keyboard for mutes * AI for Shader pets following you around * Simple games and puzzles. Pong, Flappy bird, and the beginnings of Doom have been done with cameras already * Worlds with snow dynamic, rain puddles, and water ripples ---- Why should time be spent on Graphics.Blit when only shader creators will be using it? Even though this is going to mostly only effect shader creators, everyone benefits from this change. GPU particles have been released to the public for a while and has been the only way for VRChat users to get millions of particle without completely destroying FPS. Graphics.Blit would increase the room for innovation. ---- What would a Graphics.Blit component from the VRC SDK look like? We would like to see two scripts added to the SDK, a Blit Controller, and a Blit Component. The Blit Controller should be a component which can be added to game objects on avatars and worlds. The Blit Controller should also have one input, an array of Blit Components. The purpose of this controller is to handle the order in which each blit is executed. In order for a shader to store stateful information, there need to be two render textures and two blits. The order in which these two blits are run is important, and this controller handle that. The Blit Controller component should run Graphics.Blit for every Blit Component in the array, in order. The Blit Component should hold two inputs, a source texture and a destination texture. Where is the material at? Well in order to allow shaders to be animated in VRC we think that using the shared material from the mesh renderer on the game object would be ideal. The blit component should have a RunBlit function which takes the mesh renderer’s shared material and runs Graphics.Blit with the source,destination, and shared material. Also the RunBlit function should pass a couple extra bits of information to the shader. 
We have included a GitHub link which contains a mock implementation of what we think would be the best way to implement this for the use cases we have. It is under the MIT licence, so feel free to use anything from our implementation; we want to make this as easy as possible on development time.

Our mock implementation: https://github.com/Merlin-san/Blit-Component

We aren't sure if components update on both PlayerLocal and Player for the local player, but if that's the case, updating more than once per frame should be avoided.

----

Safety System

Currently, the render texture/camera workaround that we use requires that you are friends with another user in order for that user to see the render texture. This was implemented before the safety system. If Graphics.Blit is implemented, render textures should either be part of the safety system under "shaders", or get their own category, "render textures".

The shader community: Toocanzs, Merlin, Elixir, Xiexe, Blake447, Okano, snail, Tykesha, .Captain., Claw, Quantum, Nulliel, Naelstrof, Wakamu, 1001, Nave, Neitri, SCRN, Silent
Components that take Transform values and output them to Custom Shader properties/Animator parameters?
This is something that has felt lacking for a good while: components that could take the vectors of a transform and output them to a custom string naming either a material property or an animator parameter. The implementation would be similar to how PhysBones let creators use custom strings to name parameters that the animator reads, outputting specific animation reactions depending on the value.

Examples for the animator:

Rotation offset: There's one feature Blender has that Unity lacks, and that is corrective shape keys. Corrective shape keys let Blender users drive shape keys from bone rotation angles to correct mesh deformation at a given angle. You could bend your elbow in a specific way without needing to redo your weight painting or mesh topology, and within the driver itself you could limit how far the rotation influences the blendshape amount.

Positional offset: Similar to the above, you could use a child of an empty game object to control how blendshapes look based on the child object's positional offset relative to the parent. If I, say, dragged a ball in world space to the left, I could make the face look angry, or make a character on the screen of a prop on my model move to the left.

Examples for shader properties:

Vectors, floats, and colors: Genshin Impact, love it or hate it, is a decent game with an interesting shading style, especially in how it does facial shadows. HoYoverse uses SDFs to achieve this effect, using the forward and right vectors of the head bone to determine the lighting angle along a gradient, and flipping the gradient under specific conditions. The problem is that the only way to do this within Unity is to change the root bone of the skinned mesh renderer from the hip bone to the head bone, which is bad for those who want a consistent bounding box across all meshes, and for shaders that manipulate the vertices of the skinned mesh. A component that could output the forward, right, and even upward vectors to a custom string for a material property to read would solve this. Alternatively, you could output transform positions for proper dissolve effects, for motion-capture-based systems that output through colors, or to give a liquid shader a good wobble effect, get the volume of a glass bottle, or have it magically fill up.

If possible, this could be an extension of Avatar Dynamics, and it could also be built into PhysBones for the transforms, but extending it to other use cases like the ones mentioned here would be extremely helpful, and possibly more optimized than using lights + depth tricks, CRTs + cameras on our avatars, or whatever other janky, unoptimized methods we use for our creations.
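As a rough sketch (the component and property names here are just placeholders, not an actual SDK API), something along these lines would cover the head-bone and rotation examples above:

```csharp
using UnityEngine;

// Placeholder name; not an existing VRChat SDK component.
public class TransformToShaderParams : MonoBehaviour
{
    public Transform sourceBone;        // e.g. the head bone for SDF face shadows
    public Renderer targetRenderer;     // renderer whose material reads the values

    // Custom strings naming the shader properties, PhysBone-style.
    public string forwardProperty = "_HeadForward";
    public string rightProperty = "_HeadRight";
    public string upProperty = "_HeadUp";
    public string positionProperty = "_HeadPosition";

    // Optional animator output, e.g. for rotation-driven corrective blendshapes.
    public Animator targetAnimator;
    public string angleParameter = "ElbowAngle";

    void Update()
    {
        Material mat = targetRenderer.sharedMaterial;

        // Feed the bone's basis vectors and position to the shader without
        // reparenting the skinned mesh's root bone to the head.
        mat.SetVector(forwardProperty, sourceBone.forward);
        mat.SetVector(rightProperty, sourceBone.right);
        mat.SetVector(upProperty, sourceBone.up);
        mat.SetVector(positionProperty, sourceBone.position);

        // Mirror a rotation angle into the animator, similar to how a Blender
        // driver feeds a corrective shape key from a bone angle.
        if (targetAnimator != null)
        {
            targetAnimator.SetFloat(angleParameter, sourceBone.localEulerAngles.x);
        }
    }
}
```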