Age Verification Feedback

Share feedback about Age Verification. DO NOT POST YOUR ID OR PERSONAL INFORMATION HERE.
Allow AV (age verification) to be accessed via Udon
Hey Team! Lately we are seeing more worlds add "age checking" at their entrances (even with the option of AV-only instances available). It would be very useful if world creators could call on AV through Udon to help groups run their lobbies more smoothly and give them that extra level of security. Exposing AV through Udon would let world creators do things like the following, for example (a rough UdonSharp sketch follows this list):

- Auto-admit AV users into the world while stopping non-AV users at the door for a manual check.
- Show an icon above AV users' heads (with a toggle, hopefully).
- Let AV users toggle on game modes that are targeted more toward 18+ users.
- Restrict certain areas of a world.
- Restrict users from entering worlds whose main theme is adult/"NSFW" or otherwise not suitable for under-18s. (Age Gating should cover this already, but you currently don't need to prove your ID when making a VRChat account, so this would solidify it.)
- Make automated content adjustments for different age groups: instead of fully blocking younger or non-verified users, worlds could dynamically adjust gameplay, visuals, and mechanics.
- Strengthen overall moderation tools for admins/groups.
- Enhance trust in community worlds, create a more trustworthy environment, and improve VRChat's reputation as a safer social platform.

I am sure there are many other examples/features world creators could build that would let groups run their lobbies more smoothly and safely; these were just a handful of ideas submitted by the community. Thanks!
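For illustration, here is a minimal UdonSharp sketch of what gating an entrance could look like if AV status were exposed to Udon. The `IsAgeVerified()` call on `VRCPlayerApi` is hypothetical (it is the capability being requested here, not an existing SDK API); everything else uses the current SDK.

```csharp
using UdonSharp;
using UnityEngine;
using VRC.SDKBase;

public class AgeVerifiedEntranceGate : UdonSharpBehaviour
{
    // Spawn point for age-verified players and a holding area where staff can check everyone else.
    public Transform verifiedSpawn;
    public Transform manualCheckArea;

    public override void OnPlayerJoined(VRCPlayerApi player)
    {
        // Teleporting only works on the local player, so each client gates itself.
        if (!player.isLocal) return;

        // Hypothetical API: this call does not exist in the SDK today;
        // it is the feature being requested in this post.
        bool verified = player.IsAgeVerified();

        Transform target = verified ? verifiedSpawn : manualCheckArea;
        player.TeleportTo(target.position, target.rotation);
    }
}
```

The same boolean could just as easily drive a head-icon toggle or unlock an 18+ game-mode switch.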
VRChat needs a real NSFW / age-gate system instead of ban waves
I want to give some constructive feedback regarding the recent ban waves related to "NSFW geometry / textures" on avatars. Right now, the NSFW tag technically already exists, but in practice it doesn't really protect creators. We are still getting banned when an avatar is uploaded as public, even though there is no clear or usable NSFW option for public avatars. Here's what I think would actually fix the issue:

1) The NSFW tag should be clearly defined and enforced by the system. At the moment, NSFW tagging seems to function mainly for private use, but creators are still punished for public uploads. In my opinion, the NSFW option should either:
- be explicitly limited to private avatars only, OR
- come with automatic restrictions when used on public content.

2) Public avatars should not even be uploadable as NSFW without restrictions. If an avatar is uploaded as public, the creator should not be able to accidentally upload NSFW content. Either the NSFW option should be disabled for public avatars, or the system should automatically force:
- age verification,
- avatar fallback / hidden mode for non-verified users,
- or blocking access entirely.

3) Add automatic NSFW detection BEFORE upload. There should be a detection step during upload that scans mesh geometry, textures, and obvious NSFW markers. If NSFW content is detected, the uploader should get a clear warning like: "NSFW content detected. Please set this avatar to private or apply proper age restrictions." This would protect both minors and creators who did not intentionally add NSFW content (third-party assets, old models, etc.). A sketch of what such a pre-upload check could look like is below.

Right now, the current system creates unnecessary ban waves that could be avoided with one technical feature. Creators are being punished after the fact instead of being warned or guided during upload. A proper NSFW tag + age-gate + upload-detection system would reduce moderation load, prevent false bans, make the rules clear, and improve trust between VRChat and its creator community. I really hope VRChat considers solving this at a system level instead of continuing reactive moderation.
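As a rough illustration of point 3, here is a minimal Unity-side sketch of an "obvious markers" pre-upload check. This is not VRChat's actual validation pipeline; the keyword list, the idea of scanning blendshape and material names, and the `NsfwMarkerScan` name are all assumptions used purely for illustration.

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class NsfwMarkerScan
{
    // Illustrative keyword list; a real system would maintain a curated set of markers.
    static readonly string[] Markers = { "nsfw", "nude", "18+" };

    // Scans blendshape and material/texture names on the avatar for obvious NSFW markers
    // and returns the matches so the uploader can be warned before the upload completes.
    public static List<string> FindMarkers(GameObject avatarRoot)
    {
        var hits = new List<string>();

        foreach (var smr in avatarRoot.GetComponentsInChildren<SkinnedMeshRenderer>(true))
        {
            var mesh = smr.sharedMesh;
            if (mesh != null)
            {
                for (int i = 0; i < mesh.blendShapeCount; i++)
                {
                    string shape = mesh.GetBlendShapeName(i);
                    if (ContainsMarker(shape)) hits.Add($"Blendshape: {shape}");
                }
            }

            foreach (var mat in smr.sharedMaterials)
            {
                if (mat == null) continue;
                if (ContainsMarker(mat.name)) hits.Add($"Material: {mat.name}");
                if (mat.mainTexture != null && ContainsMarker(mat.mainTexture.name))
                    hits.Add($"Texture: {mat.mainTexture.name}");
            }
        }
        return hits;
    }

    static bool ContainsMarker(string name)
    {
        string lower = name.ToLowerInvariant();
        foreach (var marker in Markers)
            if (lower.Contains(marker)) return true;
        return false;
    }
}
```

If a scan like this returns anything, the SDK could show the warning quoted above and refuse to upload the avatar as public until it is set to private or age-gated.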
Proposal for a Systematic NSFW Age-Gating System to Replace Ban Waves

To the VRChat Team,
I am writing to provide constructive feedback regarding the recent moderation actions concerning "NSFW geometry/textures" on avatars. Currently, the NSFW tag exists but fails to adequately protect creators. Users are frequently banned for public uploads even when the distinction between allowed content and bannable offenses is ambiguous in the UI. I believe the following system-level changes would solve this issue more effectively than reactive ban waves:

1) Explicit definition and enforcement of the NSFW tag. The NSFW tag currently appears to be for private use only, yet creators are punished for public uploads. The rule should be binary: either the NSFW option is strictly limited to private avatars, or, if it is allowed on public avatars, it must trigger automatic viewing restrictions (age-gating).

2) Prevention of public NSFW uploads. It should not be technically possible to upload an avatar as "Public" with NSFW characteristics and no safeguards. If an avatar is public, the NSFW toggle should either be disabled, or checking it should automatically enforce age verification and force a "fallback" avatar for non-verified users. (A sketch of the resulting upload-time decision logic is below.)

3) Pre-upload detection system. Instead of banning users retroactively, please consider implementing a detection step during the upload process (scanning mesh geometry or specific texture patterns). If potential NSFW content is detected, the uploader should receive a warning: "NSFW content detected. Please set to Private or apply restrictions." This protects minors from exposure and protects creators from accidental bans due to third-party assets.

Moving from reactive moderation to a proactive system (age-gate + upload detection) would reduce the moderation workload, prevent false bans, and rebuild trust with the creator community. Thank you for considering this feedback.
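To illustrate points 1 and 2, here is a minimal sketch of the upload-time decision logic this proposal implies. The enum names, the `ResolveUploadAction` function, and the exact rules are assumptions drawn from the proposal above, not an existing VRChat SDK API.

```csharp
public enum UploadAction
{
    Allow,           // Upload proceeds as-is.
    RequireAgeGate,  // Upload proceeds, but viewers must be age-verified; others see a fallback avatar.
    WarnSetPrivate,  // Hold the upload and show the "please set to Private or apply restrictions" warning.
    Block            // Refuse the upload entirely.
}

public static class NsfwUploadPolicy
{
    // Maps the avatar's visibility, its NSFW tag, and the pre-upload detection result
    // to the action proposed above.
    public static UploadAction ResolveUploadAction(bool isPublic, bool nsfwTagged, bool nsfwDetected)
    {
        // Private avatars: tagging or detection is fine; there is no public exposure to gate.
        if (!isPublic)
            return UploadAction.Allow;

        // Public + explicitly tagged NSFW: never uploadable without safeguards.
        if (nsfwTagged)
            return UploadAction.RequireAgeGate;

        // Public, not tagged, but detection found likely NSFW content:
        // warn the uploader up front instead of banning them after the fact.
        if (nsfwDetected)
            return UploadAction.WarnSetPrivate;

        return UploadAction.Allow;
    }
}
```

Whether public NSFW uploads should map to RequireAgeGate or Block is exactly the policy choice this proposal asks VRChat to make explicit.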