World/Udon Bugs & Feature Requests

Post about current World or Udon bugs and feature requests. One item per post!
Non-constructive and off-topic posts will be moved or deleted.
Allow worlds/Udon to officially control the local listening position and orientation (consistent on Desktop and VR)
Current problem:
Some worlds use an AudioListener-based workaround to simulate moving the user's ears to another point in the world, for example for dummy-head ASMR style effects. However, on Desktop, when the listening position is moved this way, distance and volume follow the moved listening point, while directional localization (left/right panning) remains tied to the player's original viewpoint. As a result, the gimmick does not work correctly on Desktop; the same gimmick works as intended in VR. I have also confirmed a difference in PPC° between VR and Desktop in the Audio Sources Debug View. I can provide screenshots and other supporting material if needed.

Use cases:
- ASMR / dummy-head style "ear-level" audio presentation using a listening point placed in the world
- Observer camera / special viewpoint gimmicks where the visual viewpoint and listening point should be separated
- Audio-driven experiences where distance and direction should be designed around a specific in-world ear position

Requested change:
Instead of relying on unsupported AudioListener-based workarounds, I would like an official and safe way for worlds/Udon to control the local user's listening position and orientation (listener transform).
Requirements:
- It should work consistently on both Desktop and VR
- Directional localization should update correctly, not just distance attenuation / volume
- It only needs to affect the local user; network synchronization is not required

Why the current workaround is insufficient:
- AudioListener is not on the allowlist of world components, so it is not an officially supported component for worlds
- Because of that, platform differences (such as Desktop vs VR) and future breakage are difficult to avoid
- VRChat already has official functionality, such as Audio from Camera, that changes where audio is heard from
- For that reason, a similarly official and safe way for worlds to specify a local listening point would be very helpful

Scope / non-goals:
- This is not a request to bypass normal audible range, occlusion, or existing voice distance rules
- This is not a request to force other players' listening positions to change; a local-only solution is sufficient
- This is not intended for eavesdropping or expanding what a player is allowed to hear beyond normal rules

Possible safety constraints would be fine, for example:
- Explicit per-world opt-in
- Local-only behavior
- Existing distance attenuation and audibility rules remain unchanged

Minimal repro:
Minimal repro world: https://vrchat.com/home/world/wrld_c84d1e3b-d5bb-410d-b407-064b49fb56db

Repro steps:
1. Join in VR → the gimmick works as intended, with both distance and direction matching the moved ear/listening position
2. Join on Desktop → distance / volume follows the moved listening position, but directional localization remains based on the original player viewpoint

If needed, I can also provide the following on request:
- A VR vs Desktop comparison video
- Debug View screenshots (Audio Sources: PPC°, Dist, lG/rG dB, d)
- VRChat version / build number
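To make the Desktop symptom concrete, here is a minimal numeric sketch (not VRChat or Unity code) of the reported behavior, under simplifying assumptions: linear distance rolloff and a plain dot-product left/right pan. It shows how a source 2 m in front of a moved listener can end up near full volume yet panned hard to one side when panning is still computed from the original head position.

```python
import math

def distance_gain(source, listener, max_dist=25.0):
    """Linear rolloff: gain 1.0 at the listener, 0.0 at max_dist (an assumption)."""
    d = math.dist(source, listener)
    return max(0.0, 1.0 - d / max_dist)

def pan(source, head_pos, head_right):
    """Left/right pan in [-1, 1]: projection of the source direction
    onto the head's unit right vector."""
    dx = [s - h for s, h in zip(source, head_pos)]
    n = math.hypot(*dx)
    if n == 0.0:
        return 0.0
    return sum(r * d for r, d in zip(head_right, dx)) / n

# World-space setup: player at the origin facing +Z, "ear" moved 10 m to the left.
player_head = (0.0, 1.6, 0.0)
player_right = (1.0, 0.0, 0.0)
moved_listener = (-10.0, 1.6, 0.0)
source = (-10.0, 1.6, 2.0)  # 2 m in front of the moved listener

# Desktop behavior reported above: volume follows the moved listener...
gain = distance_gain(source, moved_listener)
# ...but localization still uses the original head, so the source pans hard left.
p = pan(source, player_head, player_right)
```

With these numbers, `gain` is 0.92 (source is close to the moved ear) while `p` is about -0.98 (hard left relative to the original head), which is the distance/direction mismatch described in the repro steps.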
Reboot development of Udon UI
Recently, the number of sophisticated world gimmicks has increased, and with it the number of controls and settings. This has forced world creators to design their own UI menus and implement unique methods for invoking them. Especially in large worlds, users need to be able to summon the UI menu from anywhere: in VR, for example, by double-clicking a trigger, long-pressing down on the right stick, or performing a gesture; on desktop, by pressing the E or Esc keys.

These methods can even interfere with the controls of the VRChat client, which has gained more functionality in recent years. For example, pressing up on the right stick makes Vive Wand players jump, and the E key interferes with camera control. (One reason for this is that Udon cannot receive menu button inputs.)

Meanwhile, users are often confused by, or simply fail to notice, the different ways to invoke the UI menu in each world, so world creators must always provide guidance on how to open it.

I know that development of a Udon UI accessible from the Quick Menu was underway in the past, but that work was suspended:
https://ask.vrchat.com/t/developer-update-16-february-2023/16474#p-33374-udon-ui-in-the-quick-menu-14
https://ask.vrchat.com/t/developer-update-16-march-2023/16950#p-34625-update-on-udon-ui-9
https://ask.vrchat.com/t/vrchat-sdk-roadmap-may-2024/24243

From my perspective, a significant factor was the development time spent supporting smartphone platforms. The smartphone platform has now been released on both Android and iOS. Now is the time to reboot development of the Udon UI!
Stage Instance Crash Recovery and Scheduled Access for Large VRChat Events
Summary:
Large VRChat productions such as festivals, concerts, award shows, and panels increasingly operate like live broadcast environments. In these events a single world instance functions as the stage, while moderators, camera operators, DJs, presenters, and performers coordinate the show inside that instance. If a staff member crashes during a show, quickly returning to the stage instance is critical. Additionally, many performers and presenters join the stage hours after the event begins for their scheduled segment.

Currently, reconnecting requires the normal world join flow, which depends on UI navigation and multiple API calls. During degraded platform conditions (API latency, matchmaking issues, networking outages), this process can become slow or unreliable.

This proposal suggests a Stage Instance Recovery and Scheduled Access system that allows authorized users to securely rejoin or join a stage instance using tightly scoped tokens, while preserving existing permissions and API validation.

Problem:
Large VRChat events have operational requirements similar to live broadcast productions. A single stage instance may run for many hours while operators and performers must maintain continuous access. If a crash occurs, recovery currently requires:
- Restarting the client
- Resolving the world and instance through APIs
- Navigating UI or launch links
- Rejoining through normal matchmaking

If services are degraded, this process may fail or take too long for live production needs. In addition, scheduled participants (performers, presenters, panelists, etc.) may need to join the stage instance hours after the event begins.

Proposed Solution:
Introduce a Stage Instance Recovery and Scheduled Access system for designated event stage instances, using two token types.

Stage Recovery Tokens:
Issued when an authorized user joins the stage instance.
The token is tied to:
- user ID
- world ID
- instance ID
- group ID
- group role
- session timestamp

If the client crashes and restarts within a short recovery window, the client can present "Resume Stage Session" and attempt reconnection using the token before falling back to the normal join flow.

Stage Access Tokens:
Issued to scheduled participants before their segment begins. These tokens are tied to:
- user ID
- world ID
- instance ID
- group ID
- participant role
- time window

This allows performers and presenters to join the stage instance when their segment begins, even if the event has been running for hours.

API Behavior:
Under normal conditions, reconnect attempts still perform standard API validation. During a short recovery window, recovery tokens could allow the client to skip certain non-critical lookups and attempt direct reconnection to the existing stage session. If validation fails, the system falls back to the normal join flow.

Role-Based Access:

Operator Recovery:
- group admins
- moderators
- stage technicians
- camera operators
- DJs / event hosts

Participant Access:
- performers
- presenters
- panelists
- speakers
- award recipients

Audience:
Audience members would not receive tokens.

Security:
Recovery and access tokens remain tightly scoped:
- short expiration windows
- token system available only for designated event-type main stage instances
- bound to one user and one instance
- role validation on reconnect/join
- server-side revocation
- rate-limited reconnect attempts

Tokens do not grant new permissions; they only allow users to resume or join sessions they are already authorized for.

Benefits:
- faster crash recovery for event operators
- reliable entry for scheduled participants
- improved resilience during partial service outages
- better support for large festivals, concerts, and live shows

As VRChat continues hosting larger productions, stage-oriented recovery and scheduled access would help the platform support professional-scale events more reliably.
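The token binding and validation rules above could be sketched as follows. This is a hypothetical illustration using Python's stdlib `hmac`, not an existing VRChat API: the names (`issue_recovery_token`, `validate`, `SERVER_KEY`) and the claim layout are assumptions chosen to mirror the proposal's scoping (one user, one instance, short expiry, role carried along).

```python
import hashlib
import hmac
import json
import time

SERVER_KEY = b"server-side secret"  # hypothetical: held only by the platform's API

def issue_recovery_token(user_id, world_id, instance_id, role, ttl=600):
    """Bind a token to one user, one instance, and a short expiry window."""
    claims = {
        "user": user_id,
        "world": world_id,
        "instance": instance_id,
        "role": role,
        "exp": int(time.time()) + ttl,
    }
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}

def validate(token, user_id, instance_id, now=None):
    """Reconnect check: signature, expiry, and user/instance binding.
    Tokens grant nothing new; they only confirm an existing authorization."""
    payload = json.dumps(token["claims"], sort_keys=True).encode()
    good_sig = hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(good_sig, token["sig"]):
        return False
    claims = token["claims"]
    if (now if now is not None else time.time()) > claims["exp"]:
        return False
    return claims["user"] == user_id and claims["instance"] == instance_id
```

A Stage Access Token would add a start-of-window claim alongside `exp`; revocation and rate limiting would live server-side next to this check.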