The most practical solution, one that would not require VRChat to overhaul large parts of the platform, is to introduce an “NSFW” tag for avatar uploads. This tag would only be selectable by age-verified users. By default, all users would have “View NSFW Avatars” turned off; those who wish to see NSFW-tagged avatars could manually enable the option in their settings.

In 18+ instances, users who enable NSFW viewing would knowingly opt in to seeing lewd content. If someone enables the setting and then reports an avatar for being lewd, VRChat could clearly see that the report is invalid: the content was properly tagged, and the reporting user intentionally enabled it.

This system would protect users who don’t want to see NSFW content, give adults consensual access to lewd avatars, reduce false or bad-faith reports, and require minimal changes compared to a full moderation overhaul. Overall, it creates clear consent, accountability, and better moderation outcomes without placing an excessive burden on VRChat’s development team.
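To show how little logic this actually requires, here is a rough sketch in TypeScript. None of these types, fields, or functions are part of VRChat’s real systems; names like `nsfw`, `viewNsfwAvatars`, and `shouldShowAvatar` are invented purely for illustration of the opt-in and report-triage checks described above.

```typescript
// Hypothetical sketch only. These types and functions are not VRChat's API;
// they just illustrate the proposed tag, setting, and report-triage checks.

interface Avatar {
  id: string;
  nsfw: boolean;            // set at upload time, only by age-verified creators
}

interface User {
  id: string;
  ageVerified: boolean;
  viewNsfwAvatars: boolean; // defaults to false for every account
}

interface InstanceInfo {
  ageGated: boolean;        // true for 18+ instances
}

// Only age-verified uploaders may mark an avatar as NSFW.
function canTagAsNsfw(uploader: User): boolean {
  return uploader.ageVerified;
}

// Whether an avatar should be visible to a given viewer in a given instance.
function shouldShowAvatar(avatar: Avatar, viewer: User, instance: InstanceInfo): boolean {
  if (!avatar.nsfw) return true;   // non-NSFW avatars are unaffected
  return instance.ageGated         // only in 18+ instances
    && viewer.ageVerified          // only for verified adults
    && viewer.viewNsfwAvatars;     // who have explicitly opted in
}

// Triage for "this avatar is lewd" reports: if the avatar was correctly tagged
// and the reporter had opted in to seeing NSFW content, the report can be
// deprioritized as invalid or bad-faith.
function isLikelyInvalidReport(avatar: Avatar, reporter: User): boolean {
  return avatar.nsfw && reporter.viewNsfwAvatars;
}
```

The point of the sketch is that everything needed for both visibility and report triage is metadata the platform would already have: the tag on the avatar and the reporter’s own opt-in setting.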