
AI Image Moderation
ultimate

This plugin automatically blocks low-quality or inappropriate photos before they ever appear in your app.
When a user uploads an image — during registration or later — the system instantly sends it to AWS Rekognition, Amazon’s industry-leading image analysis service, and rejects anything that violates your content rules.
The plugin detects and blocks:
- Pornographic or explicit content
- Suggestive or sexualized images
- Violence, weapons, gore, or disturbing scenes
- Celebrity photos and impersonation attempts
- Any other categories you configure
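A minimal sketch of this flow, assuming the plugin calls Rekognition's real `detect_moderation_labels` API via boto3. The `BLOCKED` set below uses genuine Rekognition top-level moderation categories, but which categories are blocked, and the helper names, are illustrative; celebrity detection uses Rekognition's separate `recognize_celebrities` API and is omitted here.

```python
# Categories below are real Rekognition top-level moderation labels;
# which ones to block is configurable and shown here as an example.
BLOCKED = {
    "Explicit Nudity",      # pornographic or explicit content
    "Suggestive",           # sexualized images
    "Violence",             # weapons, gore
    "Visually Disturbing",  # disturbing scenes
}

def is_rejected(moderation_labels: list) -> bool:
    """True if any detected label (or its parent category) is blocked."""
    for label in moderation_labels:
        if label["Name"] in BLOCKED or label.get("ParentName") in BLOCKED:
            return True
    return False

def moderate_upload(image_bytes: bytes) -> bool:
    """Send an uploaded image to Rekognition and decide at upload time.

    Requires boto3 and AWS credentials; MinConfidence=60 is an example
    value, not the plugin's actual default.
    """
    import boto3  # imported lazily so the decision logic stays testable
    client = boto3.client("rekognition")
    resp = client.detect_moderation_labels(
        Image={"Bytes": image_bytes}, MinConfidence=60
    )
    return is_rejected(resp["ModerationLabels"])
```

Keeping the accept/reject decision in a pure function (`is_rejected`) separates it from the AWS call, which makes the rule set easy to test against sample Rekognition responses.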
Customizable Sensitivity (Threshold Settings)
You can adjust the sensitivity threshold for each moderation category.
This determines how strict the system should be:
- Lower sensitivity → only clear violations are blocked
- Higher sensitivity → even borderline or subtle violations are rejected
This flexibility allows you to fine-tune moderation to fit your community: from very strict (e.g., marriage-oriented apps) to more relaxed (e.g., lifestyle or casual dating apps).
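One way this per-category tuning could work, as a sketch: map each category's sensitivity setting (0 = lenient, 100 = strict) to the minimum Rekognition confidence required to reject. The mapping formula, the `SENSITIVITY` table, and the function name are assumptions for illustration, not the plugin's actual configuration.

```python
# Illustrative per-category sensitivity settings (0-100).
# A stricter category needs less confidence from Rekognition to reject.
SENSITIVITY = {
    "Explicit Nudity": 90,  # very strict
    "Suggestive": 40,       # relaxed, for casual apps
    "Violence": 80,
}

def should_reject(label_name: str, confidence: float) -> bool:
    """Reject when a label's confidence clears the category's bar.

    The linear mapping (required = 100 - sensitivity) is an assumed
    example: sensitivity 90 rejects anything above 10% confidence,
    while sensitivity 40 ignores labels below 60% confidence.
    """
    sensitivity = SENSITIVITY.get(label_name)
    if sensitivity is None:
        return False  # category not moderated at all
    required_confidence = 100.0 - sensitivity
    return confidence >= required_confidence
```

For example, a marriage-oriented app might set every category near 90, while a casual dating app could lower "Suggestive" so that only high-confidence detections are rejected.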
All images are analyzed in real time, and unsafe photos are automatically rejected at upload.
This keeps your platform clean, reduces moderator workload, and ensures new users only see authentic, high-quality profile photos.