Definition:
Content moderation is the practice of reviewing, filtering, or removing user-generated content according to platform rules, legal requirements, or community standards.
Usage Context:
Seen across social media platforms, forums, marketplaces, comment sections, and other digital communities.
Critical Note:
Content moderation is often presented as neutral safety enforcement. In practice, it reflects platform power, cultural norms, and risk-management priorities, and is applied unevenly, with limited accountability and few avenues for appeal.
Related Terms:
Platform Power, Authority Framing, Behavioural Governance, Brand Safety Doctrine
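Illustrative Sketch:
A minimal, hypothetical sketch of the rule-based filtering the definition describes. The rule set, actions, and substring matching here are illustrative assumptions, not any platform's actual system; real pipelines layer machine-learning classifiers, human review queues, and appeals processes on top.

```python
# Hypothetical rule-based moderation sketch. The rules and actions are
# invented for illustration; real platforms use far more complex systems.
from dataclasses import dataclass
from enum import Enum


class Action(Enum):
    ALLOW = "allow"
    FLAG_FOR_REVIEW = "flag_for_review"
    REMOVE = "remove"


@dataclass
class Rule:
    keyword: str    # term the rule matches (assumed simple substring match)
    action: Action  # what the platform does when the rule fires


# Hypothetical rule set: the choice of rules encodes platform priorities.
RULES = [
    Rule("buy followers", Action.REMOVE),          # spam / platform integrity
    Rule("miracle cure", Action.FLAG_FOR_REVIEW),  # health-misinformation risk
]


def moderate(post: str) -> Action:
    """Return the first matching rule's action; default is ALLOW."""
    text = post.lower()
    for rule in RULES:
        if rule.keyword in text:
            return rule.action
    return Action.ALLOW


if __name__ == "__main__":
    print(moderate("Buy followers cheap!"))   # Action.REMOVE
    print(moderate("Lovely weather today."))  # Action.ALLOW
```

Even this toy example makes the Critical Note concrete: who writes the rule set, and what happens to flagged content downstream, are platform decisions that the affected user never sees.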
