Reporting tools let users flag problematic or hateful content or behavior for review by the company's content moderation teams. When a content item is reported, it should immediately disappear from the reporting user's view, even if it has not yet been removed from the platform. Likewise, when a person is reported for problematic behavior, that person's content and activity should be hidden from the reporter, even if the reported person has not yet been warned or removed from the service.
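The per-reporter hiding rule described above can be sketched in a few lines. This is a minimal illustration, not any platform's actual implementation; all names (`Post`, `ReportLedger`, `visible_feed`) are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: str
    author_id: str
    text: str

@dataclass
class ReportLedger:
    # Per-viewer sets: items and users this viewer has reported.
    reported_posts: dict = field(default_factory=dict)  # viewer_id -> {post_id}
    reported_users: dict = field(default_factory=dict)  # viewer_id -> {author_id}

    def report_post(self, viewer_id: str, post: Post) -> None:
        # Hide the item from the reporter immediately,
        # before moderators have reviewed it.
        self.reported_posts.setdefault(viewer_id, set()).add(post.post_id)

    def report_user(self, viewer_id: str, author_id: str) -> None:
        # Hide all of the reported person's content and activity
        # from the reporter, even before any warning or removal.
        self.reported_users.setdefault(viewer_id, set()).add(author_id)

    def visible_feed(self, viewer_id: str, feed: list) -> list:
        # Filter this viewer's feed only; other users still see the content
        # until moderation removes it platform-wide.
        hidden_posts = self.reported_posts.get(viewer_id, set())
        hidden_users = self.reported_users.get(viewer_id, set())
        return [p for p in feed
                if p.post_id not in hidden_posts
                and p.author_id not in hidden_users]
```

The key design point is that the filter is applied per viewer: reporting changes what the reporter sees right away, while the platform-wide decision waits for moderator review.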